
Accelrys Blog

19 Posts tagged with the nanotechnology tag

Materials Studio 7.0: More Science, More Applications…

 

And so another year of development comes to a close and we have released Materials Studio 7.0! From my perspective, this is a pretty exciting release because after Materials Studio 6.1 - which was mostly focused on re-architecting Materials Visualizer - the focus for this release is back on the science. We also used milestones in this release to gather very early feedback from a small set of customers, which worked really well (a big thanks to those who contributed their time!).

 

So what did we deliver in this release? Well, a year is a long time and there are lots of interesting new pieces of functionality and performance and usability enhancements.

 

Way back in late 2012, we had a Materials Science and Technology Forum in India. This was the first meeting of its kind in India and was a great opportunity to meet face to face with our customers. One message that came up again and again was performance. The main focus was on DMol3, with customers saying “it’s fast on one core but just doesn’t scale well enough”. Modern computer architectures have changed a lot since DMol3 was last parallelized, so it was time for a re-think. Our DMol3 expert already had many ideas about how to improve things, and he succeeded spectacularly: we now have good scaling up to 64 cores. So check out the scaling, throw more cores at your calculations and get the results faster. It might go higher than 64 cores, but that is all we have in house!
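As a rough illustration of why re-working the parallelization matters (these are generic Amdahl's-law numbers, not Accelrys's measured DMol3 figures), the serial fraction of a code caps its achievable speedup, and shrinking that fraction is exactly what a re-parallelization buys you:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Ideal speedup on `cores` cores when `parallel_fraction` of the runtime parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A code that is only 95% parallel tops out well below 64x on 64 cores...
old_style = amdahl_speedup(0.95, 64)
# ...while pushing the parallel fraction to 99.5% recovers most of the hardware.
reworked = amdahl_speedup(0.995, 64)
```

Even a small residual serial fraction dominates at high core counts, which is why "fast on one core" and "scales to 64 cores" are such different engineering problems.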

 

There was a similar message on the performance of Forcite Plus. Whilst we could handle charge-neutral molecules quickly using charge groups, for charged systems like surfactants and ionic liquids you had to use Ewald summation, which scaled well but was slow. After quite a bit of deliberation, we have added particle-particle particle-mesh Ewald (P3M). This is much faster than standard Ewald for larger systems and gives access to calculations on hundreds of thousands of atoms.
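To see why the mesh method wins at scale, compare the textbook asymptotic costs: optimally tuned Ewald summation scales as roughly N^1.5, while mesh methods such as P3M scale as N log N. A toy sketch (relative operation counts only; real prefactors and crossover points depend on the implementation):

```python
import math

def ewald_cost(n_atoms):
    """Relative cost of optimally tuned classical Ewald summation, ~O(N^1.5)."""
    return n_atoms ** 1.5

def p3m_cost(n_atoms):
    """Relative cost of particle-particle particle-mesh Ewald, ~O(N log N)."""
    return n_atoms * math.log(n_atoms)
```

At a few hundred thousand atoms the asymptotic gap is well over an order of magnitude, which is what opens up simulations of that size in practice.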

 

Also in 2012, we sent out a large survey, which many of you completed (in fact, over 400 people filled it out, which was brilliant!) We asked you to rate different properties by their level of importance to you. Solubility came out as the number one property of interest, linked closely with free energy calculations. In response, in Materials Studio 7.0 we have added a new Solvation Free Energy task to Forcite Plus, enabling you to use Thermodynamic Integration or the Bennett Acceptance Ratio to calculate the free energy of solvation.
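As a minimal sketch of the thermodynamic-integration idea (a generic illustration of the method, not the Forcite Plus implementation): sample the ensemble average ⟨dU/dλ⟩ in a series of λ windows that couple and decouple the solute from the solvent, then integrate numerically over λ:

```python
def ti_free_energy(lambdas, dudl_means):
    """dG ~ integral of <dU/dlambda> from 0 to 1, trapezoidal rule over sampled windows."""
    dg = 0.0
    for i in range(len(lambdas) - 1):
        dg += 0.5 * (dudl_means[i] + dudl_means[i + 1]) * (lambdas[i + 1] - lambdas[i])
    return dg

# Hypothetical per-window averages (kcal/mol) from independent MD runs:
dg_solv = ti_free_energy([0.0, 0.25, 0.5, 0.75, 1.0],
                         [-40.1, -31.5, -20.2, -9.8, -2.1])
```

In practice the hard part is converging each window's average, not the final quadrature; BAR attacks the same problem by comparing energy differences between neighboring windows instead.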

 

Another important area from the survey was improving the prediction of properties for electronic materials. In collaboration with TiberLab and the University of Bremen, we have added non-equilibrium Green's functions to DFTB+, enabling the calculation of electron transport. This addition also meant extending Materials Visualizer with a set of tools for building electrodes and transport devices. Coupled with the new DFTB+ task, you can calculate transmission and current-voltage curves for a range of molecular electronic devices.
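The transmission function such a calculation produces feeds the Landauer formula, I = (2e/h) ∫ T(E) [f_L(E) − f_R(E)] dE. A self-contained numerical sketch of that final step (with a toy transmission function, not DFTB+ output):

```python
import math

KT = 0.025      # thermal energy at ~room temperature, eV
G0 = 7.748e-5   # conductance quantum 2e^2/h, siemens

def fermi(energy, mu):
    """Fermi-Dirac occupation at chemical potential mu (energies in eV)."""
    return 1.0 / (1.0 + math.exp((energy - mu) / KT))

def landauer_current(transmission, bias_v, e_min=-2.0, e_max=2.0, n=4000):
    """Landauer current in amperes; the bias (volts) is split symmetrically across the junction."""
    mu_left, mu_right = 0.5 * bias_v, -0.5 * bias_v
    de = (e_max - e_min) / n
    integral = 0.0
    for i in range(n):
        e = e_min + (i + 0.5) * de  # midpoint rule over the energy window
        integral += transmission(e) * (fermi(e, mu_left) - fermi(e, mu_right)) * de
    return G0 * integral
```

For a perfectly transmitting channel, T(E) = 1, this reduces to I = G0·V, the conductance-quantum limit; sweeping `bias_v` traces out the current-voltage curve.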

 

These are just some of the highlights of the release. I don’t even have space to mention new forcefields, barostats, mechanical properties from DMol3, optimization of excited states in CASTEP or the uber-usability enhancement of “copy script”. I guess you will just have to listen to the webinar and find out more about how Materials Studio 7.0 can accelerate your modeling and impact your work.

1,240 Views Permalink Categories: Modeling & Simulation Tags: materials_studio, materials_visualizer, forcite, classical_simulation, dmol3, polymer, catalysis, materials, nanotechnology, atomic-scale-modeling, innovation, quantum-chemistry

"Necessity is the mother of invention," so the saying goes. While I don't personally hold with sweeping generalizations, it has to be said that many of the materials innovations that drive and enable our modern world have been driven by demand, need or circumstance. Examples range from the glass that protects our phones' dazzling displays, to the batteries that power our mobile computers, to the pacemakers and surgical implants that give people a good quality of life. Add to that non-stick cooking pans, energy-efficient compact fluorescent and LED lights, artificial sweeteners, advanced anti-irritant or skin-whitening cosmetics and creams, and new medical treatments.

 

Sometimes I think we take for granted all the advanced science and materials that enable us, for example, to cross almost half the world in relative comfort and safety. To land in a cold location with appropriate clothing to keep us comfortable and safe. To get there in a reasonable amount of time, at a reasonable price, in safety. To have a car waiting that starts and keeps running, and to be able to call home to our families.

 

I was musing on this as I drank my cup of coffee and considered instant coffee. I am, I admit, a coffee snob. I like my coffee, I like it a certain way, and so for many years while travelling I have kept a constant vigil, hunting for coffee that is to my liking. With the spread of coffee chains globally this is an easier problem, but I still really like taking my coffee with me. This can be done with certain brands, and I remembered how instant coffee was developed by Dr. Max Morgenthaler for Nestlé in 1938, but only became popularised by the American GIs in the Second World War, where weight and space were at a premium. The same issues of space and weight apply to so many things in the modern world, and as I planned my trip to winter-bound Europe, I wondered about these innovations. Of course, the fact that I can now have decaffeinated coffee with caramel, in hot and cold varieties as well as a myriad range of flavors and specialty choices, is all due to technical and formulation advances and the advent of new packages, processes and capabilities.

 

For my journey to Europe, I first needed a warm coat. Historically, when people explored cold places they used large bulky coats and heavy woolen gloves. Now, with the advent of high-performance fibers such as those used in GoreTex, where the body is kept dry and sweat-free yet the material can breathe without any snow or rain entering, I can have a lightweight jacket that is wearable, comes in the color I want and lasts well when out hiking or mountain biking. So in went my hiking jacket, a smart choice because the UK has recently had one of its wettest periods of weather ever. Next, it was which bag to pack my gear in. Suitcases and rolling bags have changed out of all recognition in the last decade thanks to polymers, plastics and composites. They have become lighter and more resistant to abrasion and damage. They have become more colorful and easier to open thanks to advanced plastic zippers and expandable sections. In addition, the new complex materials allow the wheels to roll more smoothly and, to be honest, not break with the frequency that the older ones did. The materials technology that resists such impacts with flexible deformation and includes a smoothly lubricated bearing is really quite remarkable.

 

The next stage in my thoughts was which route to take. This, as anyone who travels in winter knows, is not a simple choice. It involves juggling which airports get snowed in or have bad weather, which (if any) can cope, and what de-icing equipment and provisioning each has and at what cost. De-icing, which is basically removing the ice and coating the airplane with a complex formulation to prevent the onset of ice crystals, is far from simple. It is necessary because the buildup of ice crystals can rob a plane of its lift and control, which is not a desired state. The coating, however, has a whole series of design constraints. It must be deployed easily and at low temperatures, and it must not damage the plane (for modern composite aircraft such as the Dreamliner this is harder than it would appear). It must be non-toxic, not affect passengers with allergies, be environmentally benign and low cost. These are not easily balanced requirements for a formulated product that has to remain stable for extended periods and, like many advanced chemical products, be produced or deployed in many different locations with high frequency.

 

Of course I also wondered about the composite materials used in the airplane itself. Those made by advanced manufacturing and materials companies such as Boeing have many diverse requirements. They need to withstand the intense cold at altitude and function for extended periods; airplanes spend more time flying than they do on the ground. They need to survive lightning strikes, rain, hail, de-icing fluids and many sources of external impact. And their whole raison d'etre, lighter weight for better fuel-per-passenger performance, must still be delivered. The fuel for airplanes (Jet A1) needs to be of high purity and consistent quality, and unaffected by the swing in temperature between 35,000 feet and the ground. This involves very careful chemical process design and administration. Companies such as BP Aviation and other aircraft fuel providers spend a lot of time managing changing feed and input streams to yield consistent products, tracking lot-to-lot quality, and meeting stringent product safety and testing requirements.

 

Once I got to my destination, I expected to find a rental car that started and ran, and that would transport me to my hotel of choice, where I would be warm and safe in my well-lit and heated room. Those simple, common travel steps, I realised, are all triumphs of materials science, innovation and design. The fact that my car has a fuel that flows and burns consistently at low temperatures is amazing. The petrochemical companies actually adjust the ratios of different chemistries between winter and summer fuels to aid burning and volatility at low temperatures. In some cases, such as diesel fuel, which can gel or almost freeze at low temperatures, they add specific pour-point depressants and viscosity-improvement additives. The same goes for engine lubricating oil, which thanks to modern synthetic materials can span such a wide range of viscosities that people no longer need to switch from a winter to a summer oil and back again.

 

The battery of my rental car converts electrochemical potential into the current that drives the starter motor. It must function at high load when the whole car is probably at its coldest, so again it’s a complex piece of chemical, materials and electrical engineering. In hybrid or electric vehicles, this is even more critical. Finally, I considered the tires, which hopefully would keep me on the road. These materials, which actually have a very small area in contact with the ground, whether asphalt, slush, snow or gravel, need to function in all seasons and across a whole range of temperatures, irrespective of load and speed. Tires, a significant contributor to a vehicle's overall energy use, have to be engineered for the right performance characteristics while maintaining safety and keeping cost and noise within bounds.

 

At the hotel, I hoped to find a warm, well-lit room in the evenings with plenty of hot water. This requires the ability to insulate the building and manage its heat loss, and to distribute electricity in very cold or warm conditions for heating, cooling and lighting. The ability to distribute natural gas is itself a triumph of modern science and materials development: natural gas is hard to find and harder to produce. Another critical development for a growing planet, from a sustainability perspective, is reducing the energy burden of housing. Building with advanced materials such as self-cleaning glass and lightweight insulation removes the need for excessive air conditioning or heating. Low-energy bulbs are another amazing use of advanced materials technology: they use less energy, provide excellent light for reading, and can be easily recycled.

 

As I finished planning the trip at the beginning of the year, I wondered what would change in 2013, and how travellers of the future would reflect on all the things that go unnoticed but make their travelling so much easier.

3,456 Views 0 References Permalink Categories: Materials Informatics, Executive Insights, Modeling & Simulation, Electronic Lab Notebook, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, forcite, dmol3, catalysis, materials, nanotechnology, atomic-scale-modeling, materials-studio, computational-chemistry, green-chemistry, quantum-chemistry

So it's January, the month of snow, cold, broken resolutions and CES, the Consumer Electronics Show. This show, a wonderland and mecca for techies, takes place in Las Vegas and features most of the world's leading electronics and gadget manufacturers. A number of their major suppliers are there too, launching products at the show.

 

One of the most talked-about products comes from Corning and is called Gorilla Glass 3 (http://www.engadget.com/2013/01/03/corning-gorilla-glass-3/). This glass material is present in a whole series of devices such as phones (iPhones), tablet computers, televisions and notebooks. In fact, it is currently available in 975 different models or devices and has sold more than 1 billion units to date. Gorilla Glass was actually developed way back in 1960 but mothballed until Steve Jobs needed something scratch-proof for the screen of the original iPhone. Jobs had to convince Corning that the glass could be manufactured in sufficient quantities for the iPhone’s launch. It’s now used in almost all high-end smartphones, regardless of brand. The key point of these glass surfaces is to be scratch, crack and damage resistant, of amazing optical quality, and of low weight. Corning set the performance standards for Gorilla Glass 3 at a three-fold reduction of scratches in use, a 40% reduction in the number of visible scratches, and a 50% boost in strength retained after a scratch occurs on the surface, which means that partially scratched devices no longer spiderweb or crack in use after the initial scratch. In short, Corning's aim was to "shatter the status quo" of glass, enabling larger, more durable, less scratched or shattered devices.

 

This process of demanding more from materials is at the heart of the technical advances that drive many advanced companies to investigate the chemical and structural nature of the materials at the heart of their new products. For example, Corning, a long-time user of Materials Studio technology from Accelrys (http://accelrys.com/resource-center/case-studies/pdf/corning.pdf), has used the technology to drive product differentiation in many different areas, including nanotechnology, glass-like materials, and a number of other applications such as glass coatings, glass bonding and adhesion: http://accelrys.com/resource-center/case-studies/pdf/corning-case-study.pdf.

 

So it is very interesting to see in statements before the show (from David Velasquez, Gorilla Glass' director of marketing and commercial operations) that Corning's team, by studying glass' atomic structure and bonding properties, was able to "invent another kind of glass" that made the substance less brittle and less prone to scratches. Such innovations, powered by scientific understanding and insight, enable market-disrupting change and give companies a competitive edge in an increasingly challenging and evolving market. Here is the introductory slide set that Corning used for the previous generation of Gorilla Glass (http://www.corninggorillaglass.com/), and we all look forward to their announcements and product launch at CES this week.

 

If you'd like to learn more about Accelrys Materials Studio, view our on-demand webinars that cover topics from new formulations approaches, materials degradation and much more.

3,011 Views 0 References Permalink Categories: Modeling & Simulation, Materials Informatics, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, analysis, visualization, dmol3, simulation, materials, nanotechnology, conferences, atomic-scale-modeling, materials-studio, computational-chemistry, quantum-chemistry

Today at the Materials Research Society meeting in Boston, Cyrus Wadia, the Assistant Director for Clean Energy and Materials Research and Development at the White House Office of Science and Technology Policy challenged the audience to “think as a network” of collaborators, not individual researchers, and to “learn from colleagues” how to better handle large data sets.

 

The recently publicized presidential Materials Genome Initiative aims to deliver new materials that give US industry a production and technology advantage. This initiative mirrors exactly the focus Accelrys has on integrated multi-scale modeling and laboratory-to-plant optimization. At the heart of these initiatives is improving the estimation and management of materials innovation and design, from the nano scale right up to the process and macro production scale.

 

As many Accelrys users know, multi-scale modeling, where the results of atomistic and micro-scale simulation are combined with larger-scale processes, high-throughput screening and testing, and laboratory information, yields valuable insights into materials design. The combination of computational and real analytical instruments, alongside the organization of laboratory data, allows the use of models at all length scales and can be considered a driver of the innovation cycle at many companies.

 

The MGI initiative clearly recognizes this, along with the idea of developing a framework and ontologies through which information, models and data can be exchanged at different levels and time scales. Making information more widely available to engineers and materials scientists for collaboration will help drive more accurate and systematic materials development and usage. Accelrys is very excited about this initiative and confident its technology will help accelerate the innovation lifecycle of new materials, which is a clearly stated goal of the initiative.

 

How do you feel about this initiative?

759 Views 0 References Permalink Categories: Trend Watch, Data Mining & Knowledge Discovery, Modeling & Simulation, Materials Informatics Tags: materials, nanotechnology, pipeline-pilot, atomic-scale-modeling, high-throughput, materials-studio, cloud-computing

What a pleasure to hear the news this morning that the 2010 Nobel Prize in physics has been awarded to Andre Geim and Konstantin Novoselov. My colleague Johan Carlsson referred to these researchers in his informative post yesterday on graphene. The Nobel press release highlights the extraordinarily simple technique that Geim and Novoselov used to extract graphene from ordinary graphite:

 

Using regular adhesive tape they managed to obtain a flake of carbon with a thickness of just one atom. This at a time when many believed it was impossible for such thin crystalline materials to be stable.

 

All of us in materials science at Accelrys congratulate Geim and Novoselov on their achievement. Would it be pompous to refer to our webinar Thursday on graphene as the “Nobel Webinar”? Probably. Nevertheless, we hope you’ll attend to see the role modelling and simulation have played in helping us understand this extraordinary material.

545 Views 0 References Permalink Categories: Materials Informatics Tags: graphene, materials, nanotechnology, nobel-prize

 

Dr Johan Carlsson, Accelrys Contract Research Scientist

 


The final presentation in our recent contract research webinar series features work on graphene, which has been described recently as the “ultimate” material for next-generation nanoelectronics development. I asked Johan Carlsson from the Accelrys contract research team to provide some background on this nanomaterial. Attend the webinar to see how simulation methods have helped and are helping unravel some of graphene’s secrets.

 

Graphene is one of the most interesting nano-materials at the moment [1,2]. With all the hype, it might seem as though graphene is a completely new material discovered just recently. And in a sense this is correct.

 

When I started to work on carbon materials some eight years ago, graphene was just an imaginary model system. Back then the excitement about the fullerenes was being replaced by what I call the “nanohype.” All the fashionable forms of carbon suddenly needed to have names including the buzzword “nano.” So we had nanohorns, nanofoams, nanoporous carbon, and, of course, nanotubes.

 

These fashion changes indicate the cyclic nature of carbon science. During the last 25 years, a new form of carbon has been discovered every five to six years. The fullerenes were discovered in 1985 [3], the nanotubes in 1991 [4], and, in 1998, graphynes were predicted [5]. That same year, thin graphite layers were grown on top of SiC wafers—what might today be referred to as the first few layers of graphene [6]. However, it took another cycle of carbon science before the graphene hype really took off, when researchers at the University of Manchester managed to extract individual graphene sheets from a graphite crystal in 2004 [7]. It’s actually about time for another carbon allotrope to emerge on the scene. Graphanes (graphene sheets fully saturated by hydrogens) have potential, but they are not strictly pure carbon structures [8]. We’ll have to see if some other new carbon material will emerge soon.

 

Yet while humans have only discovered the potential of graphene recently, graphene sheets have always been available in nature as the two-dimensional building blocks of graphite. They’ve just not been accessible, as individual graphene sheets have been very difficult to isolate. The barrier to graphene synthesis was perhaps more mental than technical, as the method that finally succeeded was incredibly simple. The researchers in Manchester had the ingenious idea to use adhesive scotch tape to lift the graphene sheets off a graphite crystal! [7] Of course, this method doesn’t scale industrially, and since this breakthrough a number of alternative methods have been quickly developed.

 

Graphene is now a mature material. Perhaps the maturation of graphene has represented the next step in the cyclic carbon evolution? Anyway, the progress in this field has been rich because graphene science straddles two different communities: the nanotube community and the low dimensional semiconductor community. Graphene has been proposed for a variety of applications in diverse fields, including field effect transistors with graphene channels [9], gas sensors [10], and solar cells [11]. It’s even attracted interest in life science as an active membrane that can separate different molecules out of a solution [12].

 

Interestingly, the progress in graphene science is to a large extent driven by theoretical predictions and results from simulations. Graphene has been the model system of choice for theoretical investigations of sp2-bonded carbon materials. As such, it has been the starting point for studies of graphite, fullerenes and nanotubes. Many properties of graphene were thus known from a theoretical point of view before it was possible to perform actual measurements. The electronic structure of graphene, for instance, is known from theoretical calculations performed more than 60 years ago [13]. Only recently has it been possible to actually measure the band structure of individual graphene sheets. Similarly, a substantial amount of our knowledge about the structure and properties of point defects and edges in graphene was first obtained by simulations and later confirmed by experiments. This shows that simulations and experimentation go hand in hand, and theoretical methods will continue to play a major role in the further development of graphene and other materials.

Literature


[1] A. K. Geim and K. S. Novoselov, Nature Materials 6, 183 (2007).
[2] A. K. Geim, Science 324, 1530 (2009).
[3] H. W. Kroto, J. R. Heath, S. C. O'Brien, R. F. Curl and R. E. Smalley, Nature 318, 162 (1985).
[4] S. Iijima, Nature 354, 56 (1991).
[5] N. Narita, S. Nagai, S. Suzuki, and K. Nakao, Phys. Rev. B 58, 11009 (1998).
[6] I. Forbeaux, J.-M. Themlin, and J.-M. Debever, Phys. Rev. B 58, 16396 (1998).
[7] K. S. Novoselov, A. K. Geim, S. V. Morozov, D. Jiang, Y. Zhang, S. V. Dubonos, I. V. Grigorieva, and A. A. Firsov, Science 306, 666 (2004).
[8] J. O. Sofo, A. S. Chaudhari, and G. D. Barber, Phys. Rev. B 75, 153401 (2007).
[9] Yu-Ming Lin, Keith A. Jenkins, Alberto Valdes-Garcia, Joshua P. Small, Damon B. Farmer, and Phaedon Avouris, Nano Lett. 9, 422 (2009).
[10] F. Schedin, A. K. Geim, S. V. Morozov, E. W. Hill, P. Blake, M. I. Katsnelson, and K. S. Novoselov, Nature Materials 6, 652 (2007).
[11] X. Wang, L. Zhi, and K. Mullen, Nano Lett. 8, 323 (2008).
[12] S. Garaj, W. Hubbard, A. Reina, J. Kong, D. Branton, and J. A. Golovchenko, Nature 467, 190 (2010).
[13] P. R. Wallace, Phys. Rev. 71, 622 (1947).

835 Views 1 References Permalink Categories: Materials Informatics, Modeling & Simulation, Trend Watch Tags: graphene, materials, nanotechnology, alternative-energy, guest-authors, semiconductor
Accelrys is exhibiting at booth #536 at the TechConnect expo & conference in Anaheim, and the conference is teeming with researchers in the Nanotech, Microtech, Cleantech and BioNanotech areas. With separate tracks in these domains, scientists and leaders in various areas of chemistry, physics and biology have gathered to network, share and create partnerships for innovation and technology breakthroughs.

One of the invited talks stressed how crucial it is to use masterbatch products for maximizing the performance of carbon nanostructure composite materials; the speaker showed critical differences in the extent of dispersion in such products. Another presentation dealt with nanotechnology applied to drug and biologics delivery. Other talks covered formulation, imaging agents, and carbon nanotube ink technology. Feeling at home in the world of nanotech… signing off for now.
491 Views 0 References Permalink Categories: Trend Watch Tags: chemistry, imaging, biology, materials, nanotechnology, conferences, biologics, cleantech, formulations, microtech, nanostructure, physics
Christopher Kane, of Pfizer Global Research and Development, presented his work on targeted drug delivery yesterday at Informa's Nuclear Receptor conference in Berlin. Kane highlighted the importance of a drug's therapeutic index: TI = efficacy/toxicity. He showed that his team has utilized folic acid (vitamin B9, which occurs naturally as folate) to carry a drug to specific cells in the body by way of the folate receptor. The folate-targeted nanoparticles (FTNPs) were preferentially taken up by activated macrophages, which are known to have a high level of folate receptors. Activated macrophages are concentrated in areas of disease such as lesions and atherosclerosis. Studies showed promise of decreasing systemic drug exposure (toxicity) while increasing targeted uptake (efficacy), both in vitro and in vivo.

Kane's presentation highlighted the use of multiple imaging modalities in carrying out his research. These included fluorescent cell microscopy (HCS), confocal microscopy, TEM, and bright field on tissue samples. His work developing safe and effective nanoparticles for drug delivery supports the notion that multimodal imaging is advancing our understanding of biochemical interactions in vitro and in vivo. Effective and efficient integration of these disparate data sources for a holistic understanding of the research may best be supported by an image informatics platform. Moreover, this work was an excellent example of how a scientific informatics platform could enable an organization’s entire scientific community by automating and delivering relevant scientific information previously held in silos created by different acquisition modalities. Such a platform can dramatically improve collaboration, innovation, decision making and productivity.
448 Views 0 References Permalink Categories: Trend Watch Tags: nanotechnology, pipeline-pilot, nanoparticles, image-informatics, drug-delivery, folate, nuclear-receptor, vitamin-b
It was indeed very pleasant to visit the Stanford campus last week; I had a chance to see familiar faces, as well as new ones, amongst the attendees at the workshop, “Bridging the Gap Between Theory and Experiment: Which Theoretical Approaches Are Best Suited To Solve Real Problems In Nanotechnology and Biology”.

There were several invited talks on semiconductors and catalyst nanoparticles, apart from my talk on alternative energy. Many of the speakers discussed the suitability of a particular simulation approach for specific applications, while others discussed the most recent state-of-the-art theoretical advances for tackling real problems at several timescales. It is particularly challenging when simulations are used not just for gaining insight into a system but as a predictive tool and for virtual screening. While virtual screening is a well-studied art in the world of small-molecule drug discovery, it is only now gaining traction in the materials world.

For further insight into virtual screening in materials, check out George Fitzgerald's webinar on High-throughput Quantum Chemistry and Virtual Screening for Lithium Ion Battery Electrolyte Materials, next Wednesday, March 16.
436 Views 0 References Permalink Categories: Materials Informatics, Trend Watch Tags: biology, catalysis, materials, nanotechnology, high-throughput, alternative-energy, semiconductors, virtual-screening

Calling DFT to Order

Posted by gxf Feb 15, 2010

One of the most interesting developments in density functional theory (DFT) in recent years is the emergence of the so-called "Order-N" methods. What's that mean? Quantum chemists and physicists classify the computational cost of a method by how rapidly it scales with the number of electrons (or the number of molecular orbitals). This gets deep into computational chemistry jargon, but here are some examples: conventional DFT scales roughly as N3, Hartree-Fock formally as N4, MP2 as N5, and coupled-cluster CCSD(T) as N7.

 

 

ONETEP gets its speed by using localized molecular orbitals (MOs). Top: a conventional MO is spatially delocalized, hence it interacts with many other MOs. Bottom: localized MOs do not interact, hence less computational effort is required to evaluate matrix elements.

 

 

Consider the N4 case as an example. This means that if you double the size of the system you're modeling, say from a single amino acid to a DNA base pair, the cost (i.e., CPU time) goes up by roughly 16x. That makes many of these approaches prohibitive for systems with a large number of atoms. The good news is that it doesn't really need to cost this much. The atomic orbitals that constitute the molecular orbitals have finite ranges, so clever implementations can hold down the scaling. The holy grail is to develop methods that scale as N1, or simply N, hence the expression "Order-N" or "linear scaling." Using such a method, doubling the size of the system simply doubles the amount of CPU time.
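The arithmetic behind these doubling factors is easy to sketch (scaling exponents only; real prefactors and crossover points vary between implementations):

```python
def cost_ratio(scaling_exponent, size_factor=2.0):
    """CPU-time multiplier when the system grows by `size_factor` under O(N^exponent) scaling."""
    return size_factor ** scaling_exponent

# Doubling the system under N^4 scaling costs 16x more CPU time;
# under linear (Order-N) scaling it costs just 2x.
n4_penalty = cost_ratio(4)
linear_penalty = cost_ratio(1)
```

The same one-liner shows why an N^7 method like CCSD(T) hits a wall so quickly: doubling the system multiplies the cost by 128.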

 

My favorite Order-N method is ONETEP (not surprising, considering that it's distributed by Accelrys). As explained in their publications, this approach achieves its speed by using orbitals that are far more spatially localized than conventional molecular orbitals. As a result of the localization, there's a lot of sparsity in the DFT calculation, meaning that many terms go to zero and don't need to be evaluated. Consequently, it's possible to perform DFT calculations on systems with thousands of atoms. Because of its ability to treat systems of this size, it's ideally suited to nanotechnology applications. Recent examples include silicon nanorods (Si₇₆₆H₄₆₂) and building quasicrystals (Penrose tiles) from 10,5-coronene.
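The benefit of localization is easy to sketch with a toy model: if each localized orbital couples only to a fixed number of neighbors, the number of Hamiltonian matrix elements you must store and evaluate grows linearly with system size, whereas with fully delocalized MOs every pair interacts. A minimal pure-Python illustration (my own toy model, not ONETEP's actual data structures):

```python
def banded_nonzeros(n, bandwidth):
    """Count nonzero matrix elements if each of n localized orbitals
    couples only to orbitals within `bandwidth` neighbors on either side."""
    return sum(
        1
        for i in range(n)
        for j in range(max(0, i - bandwidth), min(n, i + bandwidth + 1))
    )

def dense_nonzeros(n):
    """Delocalized MOs: every pair of n orbitals interacts."""
    return n * n

for n in (100, 200, 400):
    print(n, banded_nonzeros(n, 3), dense_nonzeros(n))
# The localized count grows roughly linearly in n; the dense count grows as n**2.
```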

 

Why bring this up now? CECAM (Centre Européen de Calcul Atomique et Moléculaire) is hosting a workshop on linear-scaling DFT with ONETEP, April 13-16 in Cambridge, UK. This is a chance for experienced modelers and newcomers alike to learn from the experts. Plus, they'll have access to the Cambridge Darwin supercomputer cluster, so attendees will have fun running some really big calculations. What kind of materials would you want to study if you had access to this sort of technology?

I am really excited that Prof Robert Langer is going to join the Accelrys UGM in Boston to deliver a plenary address. Though he probably needs no introduction, Langer received the Charles Stark Draper Prize, considered the equivalent of the Nobel Prize for engineers, and the 2008 Millennium Prize, the world's largest technology prize. Check out a recent video to hear more about his journey from a chemical engineering degree to the forefront of medical research on delivery systems and tissue engineering.

As the UGM sets out to discuss the latest advances in materials and pharmaceuticals research, Prof Langer brings it all together. Drugs on their own are powerful, but simply putting them into pills seems a bit like putting a powerful engine in a car without a steering wheel. Langer has pioneered ways in which materials, especially polymers, can be used to steer drug delivery to much greater effect in curing disease. With the development of nanotechnology over the last decade, drugs and materials can work together with ever greater sophistication, and I really look forward to the materials and life science interaction at the UGM.

In the last of my series of blogs on Materials Studio 5.0 functionality, I will be writing about new capabilities in the mesoscale area. Back in Materials Studio 4.4, we developed a new module called Mesocite, which performs coarse-grained molecular dynamics where the elementary particle is a bead. In coarse-grained molecular dynamics, a bead typically represents several heavy atoms. This has advantages over classical molecular dynamics tools such as Forcite Plus, as you can access longer time and length scales.

 

In Materials Studio 5.0, we added the capability to do Dissipative Particle Dynamics (DPD) in Mesocite. DPD is a very coarse-grained dynamics approach in which a bead can represent many atoms or even whole molecules. Materials Studio already had a module that could do DPD, but it had limited scope for extension. By developing the new DPD within Mesocite, we could take advantage of the underlying MatServer environment to, among other things, easily extend DPD to run in parallel and work with MaterialsScript.
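For readers new to the method, the conservative (repulsive) interaction between two DPD beads is a soft, linearly decaying force that vanishes at a cutoff radius. A quick sketch of this textbook Groot-Warren form (my own illustration, not a transcription of the Mesocite implementation):

```python
def dpd_conservative_force(a_ij, r, r_c=1.0):
    """Magnitude of the soft repulsive DPD force between two beads a
    distance r apart: a_ij * (1 - r/r_c) inside the cutoff, 0 outside."""
    if r >= r_c:
        return 0.0
    return a_ij * (1.0 - r / r_c)

print(dpd_conservative_force(25.0, 0.5))  # halfway to the cutoff: 12.5
print(dpd_conservative_force(25.0, 1.2))  # beyond the cutoff: 0.0
```

The soft (finite-at-contact) repulsion is what lets DPD take much larger timesteps than atomistic molecular dynamics.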

 

One issue we faced is that the legacy DPD tool works in reduced units, whereas MatServer requires physical units. The use of reduced units is fairly standard in DPD, but it makes it harder to relate results back to experiments, so we thought switching to physical units would be a good idea. However, there were still questions about how customers would work with a DPD code in physical units. Very early in the release, we asked a small focus group of customers how they would like to parameterize DPD calculations. All agreed that getting results in physical units was preferable, but they still wanted to set up calculations in reduced units because they have lots of historical data they want to re-use. So, we built a new user interface that allows setup in either reduced or physical units and then converts to physical units for the calculation!
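The conversion itself is mechanical once three base scales are fixed: a bead mass m, the cutoff length r_c, and the thermal energy kBT; the reduced time unit is then r_c * sqrt(m / kBT). A hedged sketch (the helper function and the example numbers are my own illustration, not the Materials Studio implementation):

```python
import math

def dpd_physical_scales(bead_mass_kg, r_c_m, kBT_J):
    """Map DPD reduced units onto physical units, given the three base
    scales: bead mass, cutoff length r_c, and thermal energy kBT.
    Returns the physical length, time, and energy per reduced unit."""
    tau = r_c_m * math.sqrt(bead_mass_kg / kBT_J)  # reduced time unit
    return {"length_m": r_c_m, "time_s": tau, "energy_J": kBT_J}

# Example: a water-like bead lumping three H2O molecules at room temperature
scales = dpd_physical_scales(
    bead_mass_kg=3 * 2.99e-26,   # roughly three water molecules
    r_c_m=6.46e-10,              # a commonly quoted cutoff for this mapping
    kBT_J=4.11e-21,              # kB * 298 K
)
print(scales)  # the reduced time unit comes out at a few picoseconds
```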

 

When a new piece of functionality is added to Materials Studio, I like to add a tutorial showing how to apply the software, what sort of values customers should use, and how to get the most out of it. For the new DPD functionality, I looked at several papers before settling on an application by Groot and Rabone examining the effect of non-ionic surfactants on membrane properties. This interested me because it demonstrates the strength of mesoscale modeling in studying varying concentrations of different components and their effect on morphology. I also realized that I could use the Mesocite analysis tools to extract properties such as concentration profiles and the diffusivity of beads across the membrane. The results mapped really well onto the original paper and produced what I hope is an interesting tutorial.
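The diffusivity analysis boils down to the Einstein relation, D = MSD / (2 d t) in d dimensions. A bare-bones sketch of the idea (my own illustration; a real analysis such as Mesocite's averages over beads and time origins):

```python
def diffusion_coefficient(positions, dt, dim=3):
    """Crude Einstein-relation estimate of D from one bead trajectory:
    squared displacement of the final frame from the first, divided by
    2 * dim * total_time. `positions` is a list of (x, y, z) tuples
    sampled every `dt`."""
    first, last = positions[0], positions[-1]
    t_total = dt * (len(positions) - 1)
    msd = sum((a - b) ** 2 for a, b in zip(last, first))
    return msd / (2 * dim * t_total)

# A bead moving 1 length unit per step along x for 10 steps:
traj = [(float(i), 0.0, 0.0) for i in range(11)]
print(diffusion_coefficient(traj, dt=1.0))
```

This crude version uses only the end-to-end displacement; in practice you fit the slope of the time-averaged MSD instead.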

 

There was another reason I chose this paper: Groot and Rabone also looked at the effect of strain on the membrane. This wasn't possible with the old DPD module but, using some MaterialsScript, I could strain the system and then calculate the surface tension. As I like to dabble with MaterialsScript, the lure was irresistible, and the script I made is available from the Accelrys Community website.
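The underlying formula is the standard mechanical route to surface tension for a planar interface normal to z: gamma = L_z * (P_zz - (P_xx + P_yy) / 2) / 2, where the final division by 2 accounts for the two interfaces in a periodic box. A sketch of the bookkeeping (illustrative Python, not the MaterialsScript script itself):

```python
def surface_tension(p_xx, p_yy, p_zz, box_length_z, n_interfaces=2):
    """Surface tension of a planar interface normal to z, from time-averaged
    diagonal pressure-tensor components (normal minus tangential pressure)."""
    p_tangential = 0.5 * (p_xx + p_yy)
    return box_length_z * (p_zz - p_tangential) / n_interfaces

# Tangential pressure below the normal pressure -> positive tension:
print(surface_tension(p_xx=0.8, p_yy=0.8, p_zz=1.0, box_length_z=20.0))
```

The pressure-tensor components would come from a time average over the strained dynamics run.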

 

One minor issue was that the system sizes and relative amounts were not clear from the paper. Luckily, there are several ex-Unilever people at Accelrys, so one of my colleagues, Dr Neil Spenley, contacted one of the authors, Dr Robert D. Groot. I was also lucky that Dr Groot clearly hoards his old research work and, a day later, I had the original DPD input file in my hands!

 

The results from the strain calculations also show the same trends as those reported by Groot and Rabone, so I was pretty happy with this work.

 

So that wraps up my blogs on Materials Studio 5.0. I hope to have given you an insight into some of the processes that go into making a new version of Materials Studio.


Falling towards MRS

Posted by AccelrysTeam Nov 25, 2009

As we make our way to the MRS Fall Meeting at the John B. Hynes Convention Center in Boston, MA from November 30 to December 4, we find ourselves looking forward to the many wonderful things in store for us; not the least of which is the opportunity to visit such a great city.

 

We eagerly anticipate the plenary session on Monday as Andre Geim from the University of Manchester, UK will present “an overview of [his] work on graphene, concentrating on its fascinating electronic and optical properties, and speculating about future applications.”

 

At the exhibit in booth #508, Accelrys materials modeling experts will showcase the new features and enhancements found in Materials Studio 5.0.

 

On Wednesday, December 2 at 12:00 pm, Dr. George Fitzgerald of Accelrys will host a luncheon workshop, “Data Pipelining and Workflow Management for Materials Science Applications,” that will demonstrate how to combine materials modeling with workflow management tools to improve productivity. The workshop will present examples in polymers, catalysts, and nanotechnology. To register, please visit: http://webrsvp.mrs.org/rsvp.aspx?meeting_id=55

 

We hope to see you there!

Accelrys will be attending the 2009 Chemical and Biological Science and Technology Conference in Dallas, TX, November 16-20.

Two Accelrys posters have been selected to be presented during this conference:

Dr. Nick Reynolds will be presenting a poster on “Applications of nanoscale simulations methods for understanding the structure and mechanisms of chemical sensors." The poster presentation will be on Wednesday, November 18 and will be part of the “Novel Material Science Approaches for CB Defense” session.

Nancy Latimer will be presenting a poster on “Using Data Pipelining to Analyze Biological Threats: A Biomarker Case Study." The poster presentation will be on Wednesday, November 18 and will be part of the “Integrative Informatics, Systems, and Synthetic Biology Approaches for CB Defense”.

As research and development for new drugs and vaccines continues to be important to the general health of our country, we need to be mindful that there is always the possibility of emergent threats requiring rapid response. September 11, 2001 was a painful reminder of this, and since then there has been significant progress in our capability to counter such threats.

Accelrys is proud of its contribution to enabling innovative discovery and will continue to work closely with both commercial and federal agencies to ensure that the United States remains on the leading edge.

Theory Meets Industry In Nagoya

Posted by gxf Nov 12, 2009

I've enjoyed two days so far in Nagoya attending the "Theory Meets Industry" conference. There is some amazing work going on by both developers of computational methods and those who apply them. We've heard from developers like Bernard Delley on his recent work on TDDFT in DMol3, which will enable excited-state calculations and UV spectra. We've also heard from Georg Kresse about his recent work on the Random Phase Approximation (RPA), which offers a way to improve not just DFT band gaps but total energies as well.

 

There's been an emphasis on alternative energy from the industrial participants. Applications are really diverse:

  • Radiation damage in reactor containment materials by Christophe Domain of EDF
  • Improved solar cells by Ryoji Asahi of Toyota Central R&D Labs
  • Fischer-Tropsch catalysis by Werner Janse van Rensburg of Sasol Technologies
  • Hydrogen storage materials by Pascal Raybaud of IFP

 

This list also reflects the true international spirit of the conference.

 

I've also heard some interesting new approaches to doing calculations fast while not sacrificing accuracy. Gabor Csanyi of Cambridge University presented his Gaussian Approximation Potentials (GAP), an alternative to force fields that spans more of the potential energy surface. And Isao Tanaka of Kyoto University showed how he uses an improved Cluster Expansion method to study phase transitions. Keep your eye on these methods for future developments.

 

Today I make my own small contribution by presenting my work on high-throughput computation. Look for details on that in a future blog.
