
Accelrys Blog

26 Posts tagged with the atomic-scale-modeling tag

Materials Studio 7.0: More Science, More Applications…

 

And so another year of development comes to a close and we have released Materials Studio 7.0! From my perspective, this is a pretty exciting release because after Materials Studio 6.1 - which was mostly focused on re-architecting Materials Visualizer - the focus for this release is back on the science. We also used milestones in this release to gather very early feedback from a small set of customers, which worked really well (a big thanks to those who contributed their time!).

 

So what did we deliver in this release? Well, a year is a long time, and there are lots of interesting new pieces of functionality along with performance and usability enhancements.

 

Way back in late 2012, we had a Materials Science and Technology Forum in India. This was the first meeting of its kind in India and was a great opportunity to meet face to face with our customers. One message that kept coming up again and again was performance. The main focus was on DMol3, with customers saying “it’s fast on one core but just doesn’t scale well enough”. Modern computer architectures have changed a lot since DMol3 was last parallelized, so it was time for a re-think. Our DMol3 expert already had many ideas about how to improve things, and he succeeded spectacularly: we now have good scaling up to 64 cores. So check out the scaling, throw more cores at your calculations and get the results faster. It might go higher than 64 cores, but that is all we have in house!
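As a purely illustrative aside (these are not DMol3 benchmark figures), Amdahl's law is a handy way to see why the serial fraction of a code caps the speedup you get from throwing more cores at a job. A minimal sketch, assuming a hypothetical 5% serial fraction:

```python
# Illustrative only: Amdahl's-law speedup for a hypothetical code that is
# 5% serial and 95% parallelizable. Made-up numbers, not measured DMol3 data.

def amdahl_speedup(cores, serial_fraction=0.05):
    """Ideal speedup on `cores` cores for a given serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for n in (1, 8, 16, 32, 64):
    print(f"{n:3d} cores -> speedup ~{amdahl_speedup(n):.1f}x")
```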

 

There was a similar message on the performance of Forcite Plus. Whilst we could handle charge-neutral molecules quickly using charge groups, for charged systems like surfactants and ionic liquids you had to use Ewald summation, which scaled well but was slow. After quite a bit of deep deliberation, we have added particle-particle particle-mesh Ewald (P3M). This is much faster than Ewald for larger systems and gives access to calculations on hundreds of thousands of atoms.

 

Also in 2012, we sent out a large survey which many of you completed (in fact, over 400 people filled it out, which was brilliant!). We asked you to rate different properties by their level of importance to you. From this, solubility came out as the number one property of interest, linked closely with free energy calculations. In response, in Materials Studio 7.0 we have added a new Solvation Free Energy task to Forcite Plus, enabling you to use Thermodynamic Integration or the Bennett Acceptance Ratio to calculate the free energy of solvation.
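For readers who haven't met thermodynamic integration before, the final step is simply to integrate the ensemble average ⟨dU/dλ⟩ over the coupling parameter λ that switches the solute-solvent interactions on. Here is a minimal sketch of that step with invented window averages; it is not the Forcite Plus implementation, just the underlying arithmetic:

```python
# Sketch of the thermodynamic-integration step: integrate the ensemble
# average <dU/dlambda> over the coupling parameter lambda.
# The per-window values below are invented for illustration.
import numpy as np

lambdas = np.linspace(0.0, 1.0, 11)   # coupling-parameter windows
dU_dlambda = np.array([-10.2, -9.1, -7.8, -6.5, -5.3,
                       -4.1, -3.0, -2.0, -1.2, -0.6, -0.3])  # kcal/mol, hypothetical

delta_G_solv = np.trapz(dU_dlambda, lambdas)  # trapezoidal integration over lambda
print(f"Solvation free energy ~ {delta_G_solv:.1f} kcal/mol")
```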

 

Another important theme from the survey was improving the prediction of properties for electronic materials. In collaboration with TiberLab and the University of Bremen, we have added non-equilibrium Green's functions to DFTB+, enabling the calculation of electron transport. This addition has also meant extending Materials Visualizer with a set of tools for building electrodes and transport devices. Coupled with the new DFTB+ task, you can calculate transmission and current-voltage curves for a range of molecular electronic devices.
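To make the link between a transmission function and a current-voltage curve concrete, the quantity such a transport calculation feeds into is the Landauer expression, I(V) = (2e/h) ∫ T(E) [f_L(E) - f_R(E)] dE. Below is a hedged, self-contained sketch of that final step using an invented Lorentzian transmission function; nothing here calls DFTB+ or Materials Studio:

```python
# Sketch of the Landauer formula for current through a molecular junction:
#   I(V) = (2e/h) * integral of T(E) * [f_L(E) - f_R(E)] dE
# The single-resonance transmission function is invented for illustration.
import numpy as np

E_CHARGE = 1.602e-19   # C
H_PLANCK = 6.626e-34   # J*s
KT = 0.025             # room-temperature thermal energy, eV

def fermi(E, mu):
    return 1.0 / (1.0 + np.exp((E - mu) / KT))

def transmission(E):
    # hypothetical resonance 0.5 eV above the Fermi level
    return 0.01 / ((E - 0.5) ** 2 + 0.01)

def current(V, n_grid=2000):
    E = np.linspace(-2.0, 2.0, n_grid)   # energy grid, eV
    integrand = transmission(E) * (fermi(E, +V / 2) - fermi(E, -V / 2))
    # the trailing E_CHARGE converts the eV energy integral to joules
    return 2 * E_CHARGE / H_PLANCK * np.trapz(integrand, E) * E_CHARGE

for V in (0.1, 0.5, 1.0):
    print(f"V = {V:.1f} V  ->  I ~ {current(V):.3e} A")
```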

 

These are just some of the highlights of the release. I don’t even have space to mention new forcefields, barostats, mechanical properties from DMol3, optimization of excited states in CASTEP or the uber-usability enhancement of “copy script”. I guess you will just have to listen to the webinar and find out more about how Materials Studio 7.0 can accelerate your modeling and impact your work.

1,231 Views Permalink Categories: Modeling & Simulation Tags: materials_studio, materials_visualizer, forcite, classical_simulation, dmol3, polymer, catalysis, materials, nanotechnology, atomic-scale-modeling, innovation, quantum-chemistry

"Necessity is the mother of invention," so the saying goes. While I don't personally hold with broad sweeping generalizations, it has to be said that many of the materials innovations that drive and enable our modern world have been born of demand, need or circumstance. Examples range from the glass that protects our phones and gives them dazzling displays, to the batteries that power our mobile computers, to the pacemakers and surgical implants that give people a good quality of life. Then there are non-stick cooking pans, energy-efficient compact fluorescent and LED lights, artificial sweeteners, advanced anti-irritant or skin-whitening cosmetics and creams, and new medical treatments.

 

Sometimes I think we take for granted all the advanced science and materials that enable us, for example, to cross almost half the world in relative comfort and safety; to land in a cold location with appropriate clothing to keep us comfortable and safe; to get there in a reasonable amount of time and at a reasonable price; to have a car available which starts and keeps running; and to be able to call home to our families.

 

I was musing on this as I drank my cup of coffee and thought about instant coffee. I am, I admit it, a coffee snob. I like my coffee, I like it a certain way, and so for many years while travelling I have kept a constant vigil, hunting for coffee that is to my liking. With the spread of coffee chains globally this has become an easier problem, but I still really like taking my coffee with me. This can be done with certain brands, and I remembered how instant coffee was actually invented by Dr. Max Morgenthaler for Nestle in 1938, but only became popularised by the American GIs in the Second World War, where weight and space were at a premium. The same issues of space and weight apply to so many things in the modern world, and as I planned my trip to winter-bound Europe, I wondered about these innovations. Of course, the fact that I can now have decaffeinated coffee with caramel, and that it can come in hot and cold varieties as well as a myriad of flavors and specialty choices, is all down to technical and formulation advances and the advent of new packages, processes and capabilities.

 

For my journey to Europe, I first needed a warm coat. Historically, when people explored cold places they used big bulky coats and heavy woolen gloves. Now, with the advent of high-performance fabrics such as Gore-Tex, which keep the body dry and sweat-free yet let the material breathe without any snow or rain entering, I can have a lightweight jacket that is wearable, comes in the color I want, and lasts well when out hiking or mountain biking. So in went my hiking jacket, a smart choice because the UK has recently had one of the wettest periods of weather ever. Next, it was which bag to choose to pack my gear in. Suitcases and rolling bags have changed out of all recognition in the last decade thanks to polymers, plastics and composites. They have become lighter and more resistant to abrasion and damage. They have become more colorful and easier to open thanks to advanced plastic zippers and expandable sections. In addition, the new complex materials allow the wheels to roll more smoothly and, to be honest, not break with the frequency that the older ones did. Again, the materials technology that resists such impacts with flexible deformation and includes a smoothly lubricated bearing is really quite remarkable.

 

The next stage in my thoughts was which route to take for my journey. This, as anyone knows, is not a simple choice in winter. It involved juggling which airports get snowed in or have bad weather, which (if any) can cope, and what de-icing equipment and provisioning each has and what it might cost. De-icing, which basically means removing the ice and coating the airplane with a formulated mixture to prevent the onset of ice crystals, is a complex business. It is necessary because a buildup of ice crystals can rob a plane of its lift and control, which is not a desired state. The coating, however, has a whole series of design constraints that govern it. For example, it must be deployed easily and in low temperatures, and it must not damage the plane (for modern composite aircraft such as the Dreamliner this is harder than it would appear). It must be non-toxic, not affect passengers with allergies, be environmentally benign and of low cost. These are not easily balanced requirements for a formulated product that has to remain stable for extended periods of time and, like many advanced chemical products, must be produced or deployed in many different locations with high frequency.

Of course, I also wondered about the composite materials used in the airplane. Those made by advanced manufacturing and materials companies such as Boeing have many different and diverse requirements. They need to withstand the intense cold at altitude and function for extended periods of time; airplanes spend more time flying than they do on the ground. They need to survive lightning strikes, rain, hail, de-icing fluids and many different sources of external impact. In addition, their whole raison d'être, lighter weight for better fuel-per-passenger performance, has to be delivered. The fuel for airplanes (Jet A-1) needs to be of high purity and consistent quality, and must not be affected by the swing in temperature between 35,000 feet and the ground. This involves very careful chemical process design and administration. Companies such as BP Aviation and other aircraft fuel providers spend a lot of time managing changing feed and input streams to yield consistent products and tracking lot-to-lot quality, and they must also meet stringent product safety and testing requirements.

 

Once I got to my destination, I expected to find a rental car that started and ran and would transport me to my hotel of choice, where I would be warm and safe in my well-lit and heated room. Those simple, common travel steps, I realised, are all triumphs of materials science, innovation and design. The fact that my car has a fuel that can flow and burn consistently in low temperatures is amazing. The petrochemical companies actually adjust the ratios of different chemistries in winter and summer fuels to aid burning and volatility in low temperatures. In some cases, such as diesel fuel, which can gel or almost freeze in low temperatures, they add specific pour-point depressants and viscosity-improving additives. The same of course goes for engine lubricating oil, which thanks to modern synthetic materials can cover such a wide range of viscosities that people no longer need to switch from a winter to a summer oil and back again.

 

The battery of my rental car feeds a unit which converts its electrochemical potential into the current that drives the starter motor. It must function at high load when the whole car is probably at its coldest, so again it is a complex piece of chemical, materials and electrical engineering. In hybrid or electric vehicles this is even more critical. Finally, I considered the tires, which I hoped would keep me on the road. These materials, which actually have a very small area in contact with the ground, whether asphalt, slush, snow or gravel, need to function in all seasons and across a whole range of temperatures, irrespective of load and speed. Tires, which are a significant contributor to a vehicle's overall energy use, have to be engineered to have the right performance characteristics while maintaining safety and keeping cost and noise within bounds.

 

At the hotel, I hoped to find a warm, well-lit room in the evenings with plenty of hot water. This requires the ability to insulate the building and manage its heat loss, and to distribute electricity in very cold or warm weather for cooling or lighting. The ability to distribute natural gas is likewise a triumph of modern science and materials development – natural gas is hard to find and harder to produce. Another critical development for a growing planet, from a sustainability perspective, is reducing the energy burden of housing. Building with advanced materials, such as self-cleaning glass and lightweight insulation, removes the need for excessive air conditioning or heating. Other impressive uses of advanced materials technology are low-energy bulbs that provide excellent light for reading and can be easily recycled.

 

As I finished planning the trip at the beginning of the year, I wondered what would change in 2013, and how travellers of the future will see things when they too reflect on all that goes unnoticed but makes their travelling so much easier.

3,454 Views 0 References Permalink Categories: Materials Informatics, Executive Insights, Modeling & Simulation, Electronic Lab Notebook, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, forcite, dmol3, catalysis, materials, nanotechnology, atomic-scale-modeling, materials-studio, computational-chemistry, green-chemistry, quantum-chemistry

So it's January, the month of snow, cold, broken resolutions and CES, the Consumer Electronics Show. This show, a wonderland and mecca for techies, takes place in Las Vegas and features most of the world's leading electronics and gadget manufacturers. A number of their major suppliers are also there and launch products at the show.

 

One of the most talked-about products comes from Corning and is called Gorilla Glass 3 (http://www.engadget.com/2013/01/03/corning-gorilla-glass-3/). This glass material is present in a whole series of devices such as phones (iPhones among them), tablet computers, televisions and notebooks. In fact, it is currently available in 975 different models or devices and has sold more than 1 billion units to date. Gorilla Glass was actually developed way back in 1960 but mothballed until Steve Jobs needed something scratch-proof for the screen of the original iPhone. Jobs had to convince Corning that the glass could be manufactured in sufficient quantities for the iPhone's launch. It's now used in almost all high-end smartphones, regardless of brand. The key point of these glass surfaces is to be scratch, crack and damage resistant while offering amazing optical quality and low weight. Corning set the performance targets for Gorilla Glass 3 at a threefold reduction in scratches in use, a 40% reduction in the number of visible scratches, and a 50% boost in the strength retained after a scratch occurs on the surface, which means that partially scratched devices no longer spiderweb or crack after the initial scratch. In short, Corning's aim was to "shatter the status quo" of glass, enabling larger, more durable, less scratched or shattered devices.

 

This process of demanding more from materials is at the heart of the technical advances that drive many advanced companies to investigate the chemical and structural nature of the materials at the core of their new products. For example, Corning, a long-time user of Materials Studio technology from Accelrys (http://accelrys.com/resource-center/case-studies/pdf/corning.pdf), has used the technology to drive product differentiation in many different areas, including nanotechnology, glass-like materials, glass coatings, and glass bonding and adhesion: http://accelrys.com/resource-center/case-studies/pdf/corning-case-study.pdf.

 

So it is very interesting to see, in statements before the show from David Velasquez, Gorilla Glass' director of marketing and commercial operations, that Corning's team, by studying glass' atomic structure and bonding properties, was able to "invent another kind of glass" that made the substance less brittle and less prone to scratches. Such innovations, powered by scientific understanding and insight, enable market-disrupting change and give companies a competitive edge in an increasingly challenging and evolving market. Here is the introductory slide set that Corning used for the previous generation of Gorilla Glass (http://www.corninggorillaglass.com/), and we all look forward to their announcements and product launch at CES this week.

 

If you'd like to learn more about Accelrys Materials Studio, view our on-demand webinars, which cover topics from new formulation approaches to materials degradation and much more.

3,009 Views 0 References Permalink Categories: Modeling & Simulation, Materials Informatics, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, analysis, visualization, dmol3, simulation, materials, nanotechnology, conferences, atomic-scale-modeling, materials-studio, computational-chemistry, quantum-chemistry

Today at the Materials Research Society meeting in Boston, Cyrus Wadia, the Assistant Director for Clean Energy and Materials Research and Development at the White House Office of Science and Technology Policy challenged the audience to “think as a network” of collaborators, not individual researchers, and to “learn from colleagues” how to better handle large data sets.

 

The recently publicized presidential Materials Genome Initiative aims to deliver new materials that give US industry a production and technology advantage. This initiative mirrors exactly the focus Accelrys has on integrated multi-scale modeling and laboratory-to-plant optimization. At the heart of both is better management of the innovation and design of materials, from the nano scale right up to the process and macro production scale.

 

As many Accelrys users know, multi-scale modeling, in which the results of atomistic and micro-scale simulation are combined with larger-scale process models, high-throughput screening and testing, and laboratory information, yields valuable insights into materials design. The combination of computational and real analytical instruments, alongside well-organized laboratory data, allows models to be used at all length scales and can be seen as driving the innovation cycle of many companies.

 

The MGI clearly recognizes this function, along with the idea of developing a framework and ontologies through which information, models and data can be exchanged at different levels and different time scales. Making information more widely available to engineers and materials scientists for collaboration will help drive more accurate and systematic materials development and usage. Accelrys is very excited about this initiative and is confident its technology will help accelerate the innovation lifecycle of new materials, which is a clearly stated goal of the initiative.

 

How do you feel about this initiative?

757 Views 0 References Permalink Categories: Trend Watch, Data Mining & Knowledge Discovery, Modeling & Simulation, Materials Informatics Tags: materials, nanotechnology, pipeline-pilot, atomic-scale-modeling, high-throughput, materials-studio, cloud-computing

Music-ular Dynamics

Posted by stodd Nov 25, 2011

Ever wondered what molecular vibrations would sound like? JC Jones and Andy Shipway in Israel pondered this question and decided to use molecular dynamics to investigate. You can find more information about their project here and listen to the ethereal sounds of a journey along the myelin protein.

 

I always like to hear about people taking software like Materials Studio and using it to create something completely outside of the original intent. When we introduced MaterialsScript to control parts of Materials Studio, I played my own games around the theme of Winter resulting in the Season's Greetings set of scripts. However, the sort of work linked above takes abstraction to a totally new level!

 

The question is, what else do you use Materials Studio for that isn't in our user stories?

723 Views 0 References Permalink Categories: Modeling & Simulation Tags: atomic-scale-modeling, materials-studio, computational-chemistry

Catalysis is one of my favorite topics. Catalysts are challenging - but not impossible - to model; the chemistry is fascinating; and they are really, really useful. The Vision Catalysis 2020 Report from the U.S. chemical industry estimates that catalysis-based chemical syntheses account for 60 percent of today's chemical products and 90 percent of current chemical processes. Development of new catalysts is motivated by the need to reduce energy costs, increase yields, and utilize alternative feedstocks. Every two years, the leaders in catalysis R&D come together at the North American Catalysis Society meeting (incongruously abbreviated NAM). NAM 22 was held in Detroit, 6-9 June. There were some excellent talks there, as well as opportunities to rub elbows with catalyst researchers.

 

The theme of the meeting was "Driving Catalyst Innovation." As one might expect in the current climate, there was a lot of focus on alternative energy, including electrocatalysis for energy conversion, syngas production, and CO2 capture. If you share my interest in catalysis, check out the NAM 22 technical program to see all the hot topics; the extended abstracts provide quite a bit of good background for the talks. The program is far too lengthy to summarize in this blog, but here are two of my favorite presentations:

  • Catalysts Live and Up Close: Insights from In-situ Micro- and Nano-Spectroscopy Studies, by Prof. B.M. Weckhuysen, Utrecht University, discussed the spatiotemporal characterization of individual catalyst particles at the micron and nano scale. A detailed understanding of a reaction mechanism requires analysis at the molecular level, and this presentation showed how advanced characterization techniques are making that possible.
  • Catalysis for Sustainable Energy, by Prof. J.K. Norskov, Stanford, presented approaches to molecular-level catalyst design. Prof. Norskov is well recognized for his work modeling heterogeneous catalysts. In this presentation he drew on examples including carbon dioxide reduction and biomass transformation reactions.

[Image] Two Co/TiO2 samples used in the study. IWI = preparation by Incipient Wetness Impregnation; HDP = preparation by Homogeneous Deposition Precipitation.

In quite different ways, both of these underscore the importance of understanding catalysts at the atomic level, an area where modeling is indispensable.

I was delighted that this year I had the opportunity to give a presentation of my own, on work that elucidated the structure of Co Fischer-Tropsch catalysts. The Fischer-Tropsch process is instrumental in converting biomass into fuel, and by understanding the process in detail we hope to create more efficient catalysts. Imagine one day dumping your yard waste in the top of a chemical reactor and draining fuel out the bottom. Realizing that vision is still a ways off, but we're working on it.

680 Views 0 References Permalink Categories: Modeling & Simulation Tags: catalysis, conferences, atomic-scale-modeling, alternative-energy

What it takes to model a gecko

Posted by gxf Jun 13, 2011

PBS ran a great series on materials science a few months ago called Making Stuff. There were episodes on making stuff Smaller, Stronger, Cleaner, and Smarter. One of my favorite segments was in Making Stuff: Smarter (see chapter 3, around 10 minutes in). The gecko climbs walls with ease because of millions of tiny hairs in contact with the surface (see the image taken from the PBS video). Van der Waals forces between the hairs and the wall provide the adhesion. Prof. Kellar Autumn of Lewis & Clark College tells us that about half of the gecko's foot is in molecular contact with the wall. Compare that to a human hand, only about 10^-4 of which contacts a surface at the molecular level.

[Image: gecko foot hairs, from the PBS video]

What amazes me about this is that the van der Waals (VDW) forces are extremely weak - but they add up to something significant. On the macroscopic level we see this in the gecko's feet. More subtly, we can observe this in the trends of boiling points of alkanes: heavier alkanes have higher boiling points because the contact surface area between larger molecules provides greater VDW forces. And the crystal structure of organic molecules is determined in large part by the VDW forces among molecules.

 

So what are VDW energies and forces? Wikipedia has a nice description, so I won't go into detail. Suffice it to say that these forces arise from dipole-dipole interactions between molecules, due either to permanent or to induced dipole moments. Commonly, the VDW terms are included in molecular force fields with an expression like ε_ij (σ_ij / r_ij)^6, where the subscripts i and j refer to atoms, r_ij is the distance between the two atoms, and ε and σ are parameters based on the atom types. People invest a lot of effort trying to generate parameters that can reproduce experimental results. To avoid parameterization, you'd like to be able to do this with a quantum mechanics-based method. Unfortunately, my favorite method, Density Functional Theory (DFT), doesn't capture these dipole-dipole interactions. Higher-level quantum mechanics methods do include these terms, but they take too long to run.

 

Fortunately, people are working on this. There exists a variety of hybrid semi-empirical solutions that introduce damped atom-pairwise dispersion corrections of the form C6·R^-6. The best-known schemes were introduced by Grimme; by Jurecka et al.; by Ortmann, Bechstedt, and Schmidt; and most recently by Tkatchenko and Scheffler. These semi-empirical corrections to DFT were implemented in the programs DMol3 and CASTEP by Erik McNellis, Joerg Meyer, and Karsten Reuter of the Fritz-Haber-Institut der Max-Planck-Gesellschaft. Thanks to their work, we can perform more accurate calculations where weak forces are important. We're a long way from modeling an entire gecko, but now we can do a good job of modeling the tip-toes of their tiny feet.
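To make the idea concrete, here is a minimal sketch of a damped pairwise C6·R^-6 correction in the general spirit of these schemes; the combination rules, damping function and parameter values are illustrative placeholders, not the published coefficients used in DMol3 or CASTEP:

```python
# Sketch of a damped pairwise dispersion correction,
#   E_disp = -s6 * sum_over_pairs C6_ij / r_ij**6 * f_damp(r_ij),
# with a Fermi-type damping function. All parameters are placeholders.
import itertools
import numpy as np

def f_damp(r, r0, d=20.0):
    """Fermi damping: switches the correction off at short range."""
    return 1.0 / (1.0 + np.exp(-d * (r / r0 - 1.0)))

def dispersion_energy(coords, c6, r0, s6=0.75):
    e = 0.0
    for i, j in itertools.combinations(range(len(coords)), 2):
        r = np.linalg.norm(coords[i] - coords[j])
        c6_ij = np.sqrt(c6[i] * c6[j])      # simple geometric combination rule
        r0_ij = 0.5 * (r0[i] + r0[j])
        e -= s6 * c6_ij / r ** 6 * f_damp(r, r0_ij)
    return e

coords = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 3.5]])  # two atoms, 3.5 apart
print(dispersion_energy(coords, c6=[1.75, 1.75], r0=[1.45, 1.45]))
```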

625 Views 0 References Permalink Categories: Modeling & Simulation Tags: materials_studio, atomic-scale-modeling, quantum-chemistry, van_der_waals

DFT Redux Again

Posted by gxf Jan 4, 2011

In what is becoming an annual tradition, my first blog of the year is a quick summary of Density Functional Theory (DFT) applications for the past twelve months. Readers of Accelrys' blogs will be familiar with this modeling method. With this approach you can study heterogeneous catalysis on an extended (periodic) surface; predict crystal structures; or calculate elastic constants. Life science applications include small-molecule design and solvation energies. Using the modest computers that I have access to, I routinely run DMol3 calculations on systems with up to 300 atoms or so. With ONETEP and larger computers, it's easy to do computations on around 1000 atoms with DFT. This is amazing when you consider that the largest calculation I did in graduate school was around 10 atoms.

 

Some of my favorite publications from 2010:

  • Catlow, et al., "Advances in computational studies of energy materials," Phil. Trans. R. Soc. A 368, 3379 (2010). (full text). This is an excellent review of the ways that materials modeling is being applied to problems in alternative energy research. My co-bloggers and I have written quite a lot about sustainability, alternative energy, and green chemistry, so this article fits well with that very timely and topical theme.
  • Delley, "Time dependent density functional theory with DMol3," J. Phys.: Cond. Matter 22, 465501 (2010) (abstract). I wrote last year about the importance of being able to predict spectra. This long-anticipated development adds UV/visible spectrum capability to DMol3. The predictions are good and the calculations are fast. We've applied this to a bunch of cases including structure determination of zeolite catalysts as you can see here.
  • Pickard and Needs, "Ab Initio Random Structure Searching," Psi-k Newsletter 100, 42 (2010). (Full text). Usually, geometry optimization starts with an approximate structure and refines it, but how do you determine the structure of a solid when you have absolutely no idea where to start? Computers and DFT methods are now fast enough that you can actually use random structures and evolutionary algorithms to determine reasonable geometries (a toy sketch of the idea follows this list). And on top of that, they found new classes of molecular structures for N2 under pressure.
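To give a flavor of the random-search idea in the Pickard and Needs paper, here is a toy sketch: generate random candidate geometries, score each with a stand-in pair potential (a real study would relax every candidate with DFT), and keep the lowest-energy ones. Everything here is invented for illustration:

```python
# Toy random-structure search: random candidates, a placeholder energy
# function, keep the best. Not a real DFT workflow.
import numpy as np

rng = np.random.default_rng(0)

def toy_energy(coords):
    """Stand-in scorer: a simple Lennard-Jones-like pair sum in reduced units."""
    e = 0.0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            r = np.linalg.norm(coords[i] - coords[j]) + 1e-9
            e += 1.0 / r**12 - 1.0 / r**6
    return e

def random_structure(n_atoms=8, box=4.0):
    """Place n_atoms at random positions inside a cubic box."""
    return rng.uniform(0.0, box, size=(n_atoms, 3))

candidates = [random_structure() for _ in range(200)]
best = sorted(candidates, key=toy_energy)[:3]
print("Best toy energies:", [round(toy_energy(c), 2) for c in best])
```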

Be sure to check out the Accelrys web site for a list of all of the DMol3 and CASTEP titles of 2010. And be sure to let me know what your favorite stories of 2010 were.

 


768 Views 0 References Permalink Categories: Modeling & Simulation Tags: materials, atomic-scale-modeling, quantum-chemistry

What are critical problems in alternative energy research? How does modeling play a role in bringing us closer to answers?

 

A recent review article on this topic by long-time associate Prof. Richard Catlow, et al., caught my attention. Readers of this blog will be familiar with our many posts pertaining to 'green chemistry,' sustainable solutions, and the like. Last month, Dr. Misbah Sarwar of Johnson Matthey was featured in a blog and delivered a webinar on the development of improved fuel cell catalysts. Dr. Michael Doyle has written a series on sustainability. Drs. Subramanian and Goldbeck-Wood have also blogged on these topics, as have I. All of us share a desire to use resources more responsibly and to ensure the long-term viability of our ecosphere. This will require the development of energy sources that are inexpensive, renewable, non-polluting, and CO2 neutral. Prof. Catlow provides an excellent overview of the applications of molecular modeling to R&D in this area. Read the paper for a very comprehensive set of research problems and case studies, but here are a few of the high points.

 


  • Hydrogen production. We hear a lot about the "hydrogen economy," but where is all this hydrogen going to come from? Catlow's review discusses the generation of hydrogen from water. Research challenges include developing photocatalysts capable of splitting water using sunlight.

  • Hydrogen storage. Once you've created the hydrogen, you need to carry it around. Transporting H2 as a compressed gas is risky, so most solutions involve storing it intercalated in a solid material. LiBH4 is a prototypical example of a material that can reversibly store and release H2, but the process is too slow to be practical.

  • Light absorption and emission. Solar cells hold particular appeal, because they produce electricity while just sitting there (at least in a place like San Diego; I'm not so sure about Seattle). One still needs to improve conversion efficiency and worry about manufacturing cost, ease of deployment, and stability (with respect to weathering, defects, aging, and so forth).

  • Energy storage and conversion. Fuel cells and batteries provide mobile electrical power for items as small as hand-held devices or as large as automobiles. Catlow and co-workers discussed solid oxide fuel cells (SOFC) in their paper.

 

The basic idea with modeling, remember, is that we can test a lot of materials for less cost and in less time than with experiment alone. Modeling can help you find materials with the optimal band gaps for the capture and generation of photoelectric energy. It can tell us the thermodynamic stability of these new materials: can we actually make them, and will they stick around before decomposing?

 

Simulation might not hit a home run every time, but if you can screen out, say, 70% of the bad leads, you've saved a lot of time and money. And if you're interested in saving the planet, isn't it great if you can do it using fewer resources?

 

Check out some of my favorite resources on alternative energy, green chemistry, and climate change.

 

506 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: catalysis, materials, atomic-scale-modeling, alternative-energy, green-chemistry, fuel-cells, hydrogen-storage, solar-cells

Talkin’ about a revolution...

Posted by stodd Jul 6, 2010

As I touched on when I was musing about copy and paste, it never ceases to amaze me how much compute power has increased since I started doing modeling 15 years ago. Back then, I could minimize a few hundred atoms with classical mechanics and watch the geometry update every few seconds after each iteration. Now, I could do a few thousand molecules of the same size in a few seconds.

 

Of course, as computers get faster, the sorts of calculations that customers want to run also get larger and more "realistic." The explosive growth of multi-core architectures has led to an increased focus on parallelization strategies that make the most efficient use of the CPU power. Traditionally, the main focus with high-performance codes has been on getting them to run in fine-grained parallel as efficiently as possible, scaling a single job over hundreds of CPUs. The issue here is that, if you are only studying a medium-sized system, you quickly reach a point where communication between nodes costs far more than the time in the CPU, leading to poor performance.

 

The solution, if you are doing multiple calculations, is to run them in a coarse-grained, or embarrassingly parallel, approach. In this mode, you run each individual calculation on a single CPU but occupy all your CPUs with multiple calculations. Materials Studio will do this, but it can be monotonous, although queuing systems get you part of the way there. The other issue is that when you start doing many calculations, you also get lots of data back to analyze. Now you not only need a way to submit and control all the jobs, but also a way to automatically extract the results you want.
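For anyone who wants to see the coarse-grained pattern in its barest form, here is a sketch using nothing more than Python's standard multiprocessing module; run_job is a placeholder for whatever calculation launcher you actually use, not a Materials Studio or Pipeline Pilot call:

```python
# Coarse-grained ("embarrassingly parallel") pattern: one independent job per
# worker process, then gather the results for a report. run_job is a dummy.
from multiprocessing import Pool

def run_job(structure_id):
    # In reality this would launch a simulation for the given structure
    # and parse its output; here we just return a placeholder result.
    return {"structure": structure_id, "energy": -100.0 - structure_id}

if __name__ == "__main__":
    structure_ids = list(range(16))        # e.g. 16 independent cells
    with Pool(processes=4) as pool:        # four jobs running at any one time
        results = pool.map(run_job, structure_ids)
    for r in results:
        print(r)
```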

 

Luckily, the Materials Studio Collection in Pipeline Pilot has both of these functionalities. You can use the internal architecture to submit jobs in both fine and coarse grained parallel (or a mixture of the two if you have a nice cluster!). You can also use the reporting tools to extract the data you want and display it in a report.

 

A great example of this is working with classical simulations of polymers for calculation of solubility parameters. Here, you need to build several different cells to sample phase space and then run identical calculations on each cell. You can quickly develop the workflow and then use the coarse-grained parallel options to submit a calculation to each CPU. When the calculations are finished, a report of the solubility parameter can be easily generated automatically.

 

So, whilst there is a definite need for an expert modeling client like Materials Studio, the Materials Studio Collection really does free you up to focus on problem solving (using Materials Studio!), not waiting for calculations to finish and then having to monotonously create reports!

511 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: atomic-scale-modeling, materials-studio, classical-simulations, polymers, materials-studio-collection, multicore, parallelization
3D Pareto Surface

3D Pareto surface shows the tradeoffs among target properties: dipole moment, chemical hardness, electron affinity. The optimal leads are colored red, poor leads blue.

How do you search through 10^6 materials to find just the one you want? In my very first blog post, "High-Throughput - What's a Researcher to Do?", I discussed some ideas. The recent ACS meeting had a session devoted to doing just that for materials related to alternative energy, as I wrote here and here.

My own contribution was work done with Dr. Ken Tasaki (of Mitsubishi Chemicals) and Dr. Mat Halls on high-throughput approaches for lithium ion battery electrolytes. This presentation is available now on Slideshare (a really terrific tool for sharing professional presentations).

We used high-throughput computation and semi-empirical quantum mechanical methods to screen a family of compounds for use in lithium-ion batteries. I won't repeat the whole story here; you can read the slides for yourselves, but here are a couple of take-away points:


  • Automation makes a big difference. Obviously, automation tools make it a lot easier to run a few thousand calculations. But the real payoff comes when you do the analysis. When you can screen this many materials, you can start to perform interesting statistical analyses and observe trends. The 3D Pareto surface in the accompanying image shows that you can't optimize all the properties simultaneously - you need to make tradeoffs. Charts like this one help you to understand the tradeoffs and make recommendations (a toy sketch of picking out the Pareto-optimal leads follows this list).

  • Don't work any harder than you need to. I'm a QM guy and I like to do calculations as accurately as possible. That isn't always possible when you want to study thousands of molecules. Simply looking through the literature let us know that we could get away with semi-empirical methods.
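As promised above, here is a toy sketch of how Pareto-optimal leads can be picked out of a multi-property screen. The property values are random numbers standing in for real screening results, and "better" simply means larger in every property:

```python
# Toy Pareto-front selection for multi-property screening results.
# The candidate property values are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
# columns could be, e.g., dipole moment, chemical hardness, electron affinity
properties = rng.random((50, 3))

def pareto_mask(scores):
    """True for rows not dominated by any other row (maximization)."""
    mask = np.ones(len(scores), dtype=bool)
    for i in range(len(scores)):
        dominated_by = np.all(scores >= scores[i], axis=1) & np.any(scores > scores[i], axis=1)
        if dominated_by.any():
            mask[i] = False
    return mask

front = properties[pareto_mask(properties)]
print(f"{len(front)} Pareto-optimal leads out of {len(properties)} candidates")
```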

Enjoy the Slideshare, watch for more applications of automation and high-throughput computation, and let me know about your applications, too.

658 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: materials, atomic-scale-modeling, high-throughput, lithium-ion-batteries, materials-studio, alternative-energy, green-chemistry, virtual-screening
As I wrote last week, "Chemistry for a Sustainable World" was the theme of the ACS spring meeting. In this blog, I report on the research of three other speakers from the Monday morning CINF session. All of these authors are trying to find ways to discover optimal materials from the huge selection of possibilities.

Heard of genomics? Proteomics? These methods generate a lot of data in the analysis of genes or proteins, and information management tools are needed to handle it all. The goal, ultimately, is to be able to design new ones (genes or proteins) on the basis of the available information. The number of materials to search (the 'design space') is truly enormous. Searching for optimal materials efficiently requires tools like those discussed by three of my fellow speakers in the session.

Prof. Krishna Rajan (Iowa State) discussed his "omics" approach to materials. In his presentation he described "a new alternative strategy, based on statistical learning. It systematically integrates diverse attributes of chemical and electronic structure descriptors of atoms with descriptors ...  to capture complexity in crystal geometry and bonding. ... we have been able to discover ... the chemical design rules governing the stability of these compounds..." To me, this is one of the key objectives for computational materials science: the development of these design rules. Design rules, empirical evidence, atomic-level insight - call it what you will, this sort of approach is necessary to make custom-designed materials feasible.

Prof. Geoffrey Hutchison (U. Pittsburgh) really did talk about "finding a needle through the haystack." He discussed the "reverse design problem": sure, we can predict the properties of any material we can think up, but what we really want to know is which material will give us the properties we want. His group uses a combination of database searching, computation, and genetic-algorithm optimization to search the haystack. It's a very efficient way to explore these huge design spaces.

Dr. Berend Rinderspacher (US Army Research Lab) also discussed the reverse design problem. He pointed out that there are around 10^200 compounds of a size typical for, e.g., electro-optical chromophores. He unveiled a general optimization algorithm based on an interpolation of property values, which has the additional advantage of handling multiple constraints, and showed applications to optimizing electro-optic chromophores and organo-metallic clusters.

Terrific work by all the speakers in this session, who are using all methods at their disposal - whether based on informatics or atomistic modeling - to come up with better ways of looking for better materials. Next blog: a summary of my own contribution to the session.
473 Views 1 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: materials, conferences, atomic-scale-modeling, combinatorial-chemistry, data-mining

“Chemistry for a Sustainable World” was the theme of the ACS spring meeting. My own interests, of course, are in modeling, and more recently in high-throughput computation. The Monday morning CINF session was devoted to applying this to electronic and optical materials. The presentations included both methodological approaches to materials discovery and applications to specific materials, with a focus on materials for alternative energy.


Steven Kaye of Wildcat Discovery Technologies (located only about 4 km from Accelrys) talked about high-throughput experimental approaches for applications such as batteries, hydrogen storage, carbon capture, and more. The company can screen thousands of materials a week and produce them in gram or kilogram quantities for scale-up. Impressive work.

 

Alan Aspuru-Guzik of Harvard's Clean Energy Project talked about the search for improved photovoltaics. He uses a combination of CHARMm and QCHEM software to help look for the best possible molecules for organic photovoltaics (to provide inexpensive solar cells) and for the polymer membranes used in fuel cells for electricity generation, and to work out how best to assemble the molecules to make those devices.


Uniquely, the project uses the IBM World Community Grid to sort through the myriad materials and perform these long, tedious calculations. You may remember this approach from SETI@home, which was among the first to try it. This gives everyone the chance to contribute to these research projects: you download software from their site, and it runs on your home computer like a screensaver; when your machine is idle, it's contributing to the project. Prof. Aspuru-Guzik said that some folks are so enthusiastic that they actually purchase computers just to contribute resources to the project. It's great to see this sort of commitment from folks who can probably never be individually recognized for their efforts.

 

I don’t want to make this blog too long, so great talks by Prof. Krishna Rajan (Iowa State), Prof. Geoffrey Hutchison (U of Pittsburgh), and Dr. Berend Rinderspacher (Army Research Labs) will be covered in the next blog.

 

I was also quite happy to see that some of the themes in my presentation were echoed by the others – so I'm not out in left field after all! I'll blog about my own talk later on, but here's a quick summary: like Prof. Aspuru-Guzik's work, we used high-throughput computation to explore new materials, but we were searching for improved Li-ion battery electrolytes. We developed a combinatorial library of leads, set up automated computations using Pipeline Pilot and semi-empirical VAMP calculations, and examined the results for the best leads. Stay tuned for a detailed explanation and a link to the slides.

 

And keep an eye out, too, for the 2nd part of Michael Doyle's blog on sustainability.

433 Views 1 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: materials, conferences, atomic-scale-modeling, high-throughput, lithium-ion-batteries, alternative-energy, virtual-screening, clean-energy, photovoltaics

How many modelers?

Posted by AccelrysTeam Mar 8, 2010

How many of “us” are out there? I mean, how many people are doing modeling and simulation? I’d really like to know, ideally broken down by discipline, such as materials science vs. life science, and by quantum, classical and mesoscale methods.

 

Alas, there are precious few statistics on that, so when I read in the Monthly Update (Feb 2010) of the Psi-k network that they had conducted a study on the size of the ab initio simulation community, it got my immediate attention.

 

Representing a network of people from the quantum mechanics field, Peter Dederichs, Volker Heine and their colleagues Phivos Mavropoulos and Dirk Tunger from Research Center Jülich searched publications for keywords such as ‘ab initio’, taking care not to double-count authors. In fact, they tend to underestimate by assuming that people with the same surname and first initial are the same person. As Prof. Dederichs, the chair of the network, tells me, checks were also made to ensure that papers from completely different fields were not included. They also estimate that their keyword range misses about 10% of the relevant papers. Of course, there are those who didn’t publish a paper in 2008, the year for which the study was done. Moreover, Dederichs says, there are those who published papers without keywords like “ab initio” or “first principles” in the abstract or title, so they are not found in the search. All of that is likely to compensate for counting co-authors who are not actually modelers.

 

All in all, they come up with about 23,000 people! And the number of publications in the field indicates a linear rise year on year.

 

That’s quite a lot more than they expected, and I agree. The global distribution was also surprising, with about 11,000 in Europe, about 5,600 in America, and 5,700 in East Asia (China, Japan, Korea, Taiwan and Singapore). That’s a lot of QM people, especially here in Europe. Now, I guess there will be a response from the US on that one?

 

 

I wonder how many classical modelers there are. I’d hazard a guess that the number of classical modelers is about half those in the QM community, at least in the Materials Science field. Assuming that the mesoscale modeling community is quite small, that would make for a total of at least 30,000 modelers worldwide.

 

What is your view, or informed opinion? Does anybody else know about, or has anyone done, similar studies? I am going to open a poll in the right sidebar on the number of people involved in quantum, classical and mesoscale modeling in total. It would be great to hear how you came up with your selection, too.

422 Views 0 References Permalink Categories: Modeling & Simulation Tags: atomic-scale-modeling, quantum-mechanics, computational-chemistry, ab-initio-modeling

Calling DFT to Order

Posted by gxf Feb 15, 2010

One of the most interesting developments in density functional theory (DFT) in recent years is the emergence of the so-called "Order-N" methods. What does that mean? Quantum chemists and physicists classify the computational cost of a method by how rapidly it scales with the number of electrons (or the number of molecular orbitals). This can get deep into computational chemistry jargon, but here are some examples:

 

 

ONETEP gets its speed by using localized molecular orbitals (MOs). Top: a conventional MO is spatially delocalized, hence it interacts with many other MOs. Bottom: localized MOs do not interact, hence less computational effort is required to evaluate matrix elements.

 

 

Consider the N^4 case as an example. This means that if you double the size of the system you're modeling, say from a single amino acid to a DNA base pair, the cost (i.e., CPU time) goes up by roughly 16x. That makes many of these approaches prohibitive for systems with a large number of atoms. The good news is that it doesn't really need to cost this much. The atomic orbitals that constitute the molecular orbitals have finite ranges, so clever implementations can hold down the scaling. The holy grail is to develop methods that scale as N^1, or simply N, hence the expression "Order-N" or "linear scaling." Using such a method, doubling the size of the system simply doubles the amount of CPU time.
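The arithmetic behind that statement is simple enough to show directly; the exponents below are generic textbook scalings, not benchmarks of any particular code:

```python
# Cost multiplier when the system size grows by `growth` for a method that
# scales as N**p. Generic arithmetic, not benchmarks of any particular code.
def cost_multiplier(growth, p):
    return growth ** p

for p in (1, 3, 4):   # linear scaling, a typical DFT N^3, and N^4
    print(f"N^{p}: doubling the system costs ~{cost_multiplier(2, p):.0f}x the CPU time")
```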

 

My favorite Order-N method is ONETEP (not surprising, considering that it's distributed by Accelrys). As explained in their publications, this approach achieves its speed by using orbitals that are more spatially localized than conventional molecular orbitals. As a result of this localization, there is a lot of sparsity in the DFT calculation, meaning many terms go to zero and don't need to be evaluated. Consequently, it's possible to perform DFT calculations on systems with thousands of atoms. Because of its ability to treat systems of this size, it's ideally suited for nanotechnology applications. Some recent examples include silicon nanorods (Si766H462) and building quasicrystals (Penrose tiles) with 10,5-coronene.

 

Why bring this up now? CECAM (Centre Européen de Calcul Atomique et Moléculaire) is hosting a workshop on linear-scaling DFT with ONETEP, April 13-16 in Cambridge, UK. This is a chance for experienced modelers and newcomers alike to learn from the experts. Plus, they'll have access to the Cambridge Darwin supercomputer cluster, so attendees will have fun running some really big calculations. What kind of materials would you want to study if you had access to this sort of technology?

445 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: nanotechnology, atomic-scale-modeling, linear-scaling, onetep, order-n