
Accelrys Blog

32 Posts authored by: gxf

One of the things I really enjoy about doing science is seeing it have an impact on everyday life. With Halloween upon us I thought I'd write a bit about how molecular modeling has helped make treats yummier.

[Image: scary_bloom.png]

 

Chocolate Bloom - YUCK

 

Though less common these days, chocolate bloom can still strike the unwary candy bar. Bloom is an unsightly (some would even say 'scary') white crust that appears on chocolate. While decidedly not a health risk, it discourages people from eating the chocolate and can give a candy bar a bad name. Don't be afraid: it's only triglycerides recrystallizing on the surface of the chocolate. Triglycerides can crystallize in several forms, and the form that's ideal for chocolate, with the best texture and appearance, may not be the most thermodynamically stable. Let the chocolate sit too long, or at an elevated temperature, and the crystals can undergo a phase transformation into the spine-chilling form you see pictured here.

 

So how can modeling help? Glad you asked.

 

Applications like Materials Studio Polymorph Predictor can identify the lowest-energy crystal structures. Is the creamy, chocolaty form of your triglyceride the most stable? Huzzah: your chocolate is unlikely to bloom. Oh, it's not? Well, then you can predict the energy barrier to the phase change and estimate the temperature conditions under which it will happen. That tells you how stable the chocolate really is.
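
To put rough numbers on that last step, here is a back-of-the-envelope sketch using ordinary Arrhenius kinetics. This is not the Polymorph Predictor algorithm, and the 110 kJ/mol barrier and prefactor below are illustrative numbers, not real triglyceride data.

import math

R = 8.314  # gas constant, J/(mol K)

def transformation_half_life_days(Ea_kJ_per_mol, T_celsius, prefactor_per_s=1.0e12):
    """Rough Arrhenius estimate of the half-life (in days) of a first-order
    solid-state phase transformation with activation barrier Ea."""
    T = T_celsius + 273.15
    k = prefactor_per_s * math.exp(-Ea_kJ_per_mol * 1000.0 / (R * T))  # rate constant, 1/s
    return math.log(2.0) / k / 86400.0

# The same barrier looks very different in the pantry and in a hot car:
for T in (20.0, 35.0):
    print(f"{T:4.0f} C: half-life ~ {transformation_half_life_days(110.0, T):6.0f} days")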

 

[Image: emulsifier.png]

More exciting, I think, is the opportunity to engineer a better chocolate. There are lots of different triglycerides that could work. What if you could add functional groups or create a mixture that was more stable? Modeling lets you test lots & lots of combinations rapidly. What about additives? Clearly there is a lot of support for 'natural' products, but a lot of us will tolerate an additive or two. An emulsifier like the one pictured here can improve the stability of particular crystal forms. Modeling, again, can help you screen a myriad of combinations of triglycerides, functional groups, and emulsifiers quickly.

 

Of course, predicting the taste and consumer appeal of the finished product is a complex process and quite beyond molecular modeling, so experimentation, taste tests, and consumer advisory panels are not going away any time soon. But with modeling, a better product can be produced faster - and for less money.

 

But wait! There's more

 

Recently I gave a webinar in which I discussed chocolate bloom and two other ways that modeling has helped with food: designing odorants, and keeping down calories. We'll post the link to the recording as soon as it's available. And be sure to share your stories about scary food this Halloween.

1,205 Views 0 References Permalink Categories: Materials Informatics, Data Mining & Knowledge Discovery, Modeling & Simulation Tags: materials_studio, classical_simulation, crystal_structure_prediction, atomic_scale_modeling

Over the past couple of years I've become a big fan of using 'workflow automation' to help me with my modeling projects. What exactly does that mean? Like many of my quantum mechanics colleagues, I had trouble getting my head wrapped around this idea. What I mean by 'workflow automation' is creating some kind of little computer "program" that helps me do my job faster and easier. I put "program" in quotes because you no longer need to be a programmer to create one of these. Instead you use a 'drag & drop' tool that makes it so easy that (as GEICO says) a caveman could do it. This terrific capability has been added to the commercial version of MS 6.1, and it should make life easier for lots of modelers.

 

[Image: workflow1.png]

What happens when you need to do the same DFT calculation over & over & over again? You'll probably also need to extract one particular datum from the output files, e.g., the HOMO-LUMO gap, the total energy, or a particular C-O bond length. With workflow automation you drag in a component that reads the molecular structure, another to do the DFT calculation, and a third to display the results. This is pictured in the figure. (I cheated a bit and added a fourth component to convert the total energy into Hartrees.)
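
In Pipeline Pilot the components are dragged and dropped rather than typed, but the logic of that little protocol reads roughly like the Python sketch below. The helper functions and the input folder are hypothetical stand-ins; the only real constant is the eV-to-Hartree conversion factor.

import glob

EV_PER_HARTREE = 27.211386  # CODATA conversion factor

def read_structure(path):
    """Component 1: load one molecular structure (format left abstract)."""
    return {"name": path}

def run_dft(structure):
    """Component 2: stand-in for the DFT engine the protocol would call;
    here it just returns a dummy total energy in eV so the sketch runs end to end."""
    return -1234.567

def report(name, energy_ev):
    """Components 3 and 4: convert the total energy to Hartrees and display it."""
    print(f"{name}: E_total = {energy_ev / EV_PER_HARTREE:.6f} Ha")

for path in glob.glob("inputs/*.xyz"):  # hypothetical input folder
    structure = read_structure(path)
    report(structure["name"], run_dft(structure))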

 

Not impressed? Let's look at a more complex case. A colleague of mine was studying organic light-emitting diodes (OLEDs) based on ALQ3. He created a combinatorial library of 8,436 structures. In order to characterize these he needed to compute a bunch of stuff:

  • total energy, HOMO, and LUMO
  • Vertical ionization potential (IP)
  • Vertical electron affinity (EA)
  • Adiabatic IP
  • Adiabatic EA

[Image: OLED2.png]

This requires a total of three geometry optimizations (neutral, cation, and anion) plus a few extra single-point energy calculations. You then need to extract the total energies from the results and combine them to compute the IPs and EAs. Doing one calculation like that is no big deal, but how do you do it 8,436 times? And how do you sort through the results? Using automation, of course! Even a novice can set up the protocol in under an hour, about the time it would take to process 4 or 5 of the structures manually. Plus you can display your results in a really nice HTML report like the one pictured.
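
For reference, here is how those total energies combine into the quantities in the list above, sketched in plain Python with made-up numbers (not real Alq3 data). E[q][geom] means the total energy, in eV, of the species with charge q at the geometry optimized for 'geom'.

def ip_and_ea(E):
    """Vertical quantities use the neutral geometry for every charge state;
    adiabatic quantities use each species at its own relaxed geometry."""
    vertical_ip  = E[+1]["neutral"] - E[0]["neutral"]
    adiabatic_ip = E[+1]["cation"]  - E[0]["neutral"]
    vertical_ea  = E[0]["neutral"]  - E[-1]["neutral"]
    adiabatic_ea = E[0]["neutral"]  - E[-1]["anion"]
    return vertical_ip, adiabatic_ip, vertical_ea, adiabatic_ea

E = {  # illustrative total energies in eV
     0: {"neutral": -1000.00},
    +1: {"neutral":  -993.80, "cation": -994.10},
    -1: {"neutral": -1001.20, "anion":  -1001.50},
}
vip, aip, vea, aea = ip_and_ea(E)
print(f"IP: vertical {vip:.2f} eV, adiabatic {aip:.2f} eV; "
      f"EA: vertical {vea:.2f} eV, adiabatic {aea:.2f} eV")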

 

Using this approach makes it easy to combine calculations in new ways. One protocol I wrote, for example, uses MS Polymorph Predictor to determine the lowest-energy crystal structures of a molecule, but before starting the calculation it determines the atomic partial charges with density functional theory (DFT). And once it's done it can use DFT again to refine the predictions. Another combines MS Amorphous Cell and MS Forcite to automate the computation of the solubility parameter of polymers.
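
To give a flavor of the arithmetic that last protocol automates: the Hildebrand solubility parameter is just the square root of the cohesive energy density, which you can get from the difference between the energy of the isolated chains and the energy of the bulk amorphous cell. A minimal sketch with made-up, roughly polyethylene-like inputs:

import math

def hildebrand_delta(E_isolated_kJ_mol, E_bulk_kJ_mol, molar_volume_cm3):
    """Solubility parameter in MPa**0.5 from the cohesive energy per mole of
    repeat units (isolated chains minus bulk cell) and the repeat-unit molar volume."""
    cohesive_energy_J = (E_isolated_kJ_mol - E_bulk_kJ_mol) * 1000.0
    ced_Pa = cohesive_energy_J / (molar_volume_cm3 * 1.0e-6)  # J/m^3 is the same as Pa
    return math.sqrt(ced_Pa) / 1.0e3                          # Pa**0.5 -> MPa**0.5

# Illustrative inputs; prints a value in the high teens, the right ballpark for polyethylene.
print(f"delta ~ {hildebrand_delta(10.0, 1.7, 30.0):.1f} MPa^0.5")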

 

You've been able to generate these sorts of workflows for quite some time using Pipeline Pilot. So what's new? Now Materials Studio customers will get a license to these tools, and they'll be able to call their protocols from within the MS GUI. The combination of Pipeline Pilot and Materials Studio means you can create sophisticated workflows easily - without the complexity of perl scripting - and you can launch the protocols from and get the results back to MS.

 

[Image: ProtocolDialog.png]

Share and share alike

The final point I'd like to make about these protocols is how easy they are to share. Simply place the protocol in a shared folder on the server and anybody can run jobs. There's no need for them to download a copy or configure anything. If they have an MS Visualizer then they can use your protocol. This lets experienced modelers create reliable, repeatable workflows that they can share with non-modelers. Perhaps more interesting to expert modelers, we can learn from each other and share advanced workflows.

 

I think it'd be awesome if we had a section on the Materials Studio community pages where we exchange protocols. I'll kick off the process by loading my Polymorph Predictor protocol. I'll let you know when that's ready. Stay tuned.

 

In the meantime, let me know what types of protocols you'd like to see. What are the things that you spend the most time on? What things could be streamlined, automated? Don't be shy: post a note to this blog and let everybody know what you think.

808 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: pipeline_pilot, materials_studio, dft, materials, polymorphs, automation

DFT by the Numbers: 2011

Posted by gxf Jan 12, 2012

[Image: DFT_2011.png]

This blog post continues my annual tradition of reviewing the DFT literature for the previous year. Most users of Accelrys Materials Studio have at least heard of this method. DFT = Density Functional Theory, a quantum mechanical method for solving the electronic Schroedinger equation of molecules and crystals using a formalism based on the charge density rather than the wavefunction. You can see the Kohn-Sham equations at a number of sites on the web. These are what have been implemented in software packages such as Accelrys MS DMol3 and MS CASTEP, as well as in packages developed by other research groups.

 

In my annual informal survey, the year 2011 saw a 22% increase in the number of publications with the term "DFT" in the text (as determined by full-text searches at the ACS and ScienceDirect web sites). The total count was 12,080. DFT has been outrageously successful because it combines relatively high accuracy with low computational cost - at least low by QM standards. In addition, it works very well for periodic systems, so it has extended routine QM calculations to areas such as heterogeneous catalysis.

 

This year, I'd like to highlight some applications in one of my favorite areas: alternative energy. I found a total of 2,255 citations that included DFT along with 'fuel cells,' 'batteries,' 'biofuels,' 'solar cell,' or 'photovoltaic'. That's almost 20%: Just about 1 DFT paper in 5 addressed some topic in alternative energy! Let's hear it for green modelers.

 

Some of my favorite papers from 2011:

  • Gao, et al., Dynamic characterization of Co/TiO2 Fischer-Tropsch catalysis with IR spectroscopy and DFT calculations, presentation at the North American Catalysis Society meeting, Detroit, 5-10 June 2011, available on SlideShare.
  • Castrucci, et al., Light harvesting with multiwall carbon nanotube/silicon heterojunctions Nanotechnology 22 (2011) 115701. doi:10.1088/0957-4484/22/11/115701
  • Shin et al, Effect of the alkyl chain length of C70-PCBX acceptors on the device performance of P3HT : C70-PCBX polymer solar cells, J. Mater. Chem., 2011, 21, 960–967. DOI: 10.1039/C0JM02459G
  • Gaetches, et al., Ab Initio Transition State Searching in Complex Systems: Fatty Acid Decarboxylation in Minerals, J. Phys. Chem. A 2011, 115, 2658–2667. dx.doi.org/10.1021/jp200106x
  • Di Noto et al, Structure–property interplay of proton conducting membranes based on PBI5N, SiO2–Im and H3PO4 for high temperature fuel cells, Phys. Chem. Chem. Phys., 2011, 13, 12146–12154. DOI: 10.1039/c1cp20902g

 

Full disclosure: my favorites were done with Accelrys tools. You can see all the 2011 scientific publication lists for DMol3 and CASTEP on the Accelrys web site. Be sure to let me know what your favorites are.

1,751 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: materials-studio, alternative-energy, green-chemistry, quantum-chemistry

As many readers know, I have a personal interest in alternative energy research. The past year, 2011, brought good news not just for research but for practical implementation as well. A comprehensive analysis of the deployment of alternative energy technologies has been collected in the Renewables 2011 Global Status Report. This report covers historical growth in many areas of renewable energy, as well as year-on-year growth from 2009 to 2010.

 

[Image: PV.png]

The report estimates that 194 gigawatts (GW) of electric generating capacity were added globally in 2010. It's great to see that about half the new capacity comes from renewables. Including hydroelectricity, renewables account for about a quarter of total capacity: 1,320 out of 4,940 GW. Photovoltaics, in particular, did well, increasing generating capacity by 73% year-on-year!

 

A lot of research articles were published in 2011. The research that I follow focused on improving the materials and processes used to generate or store energy and fuels. In biomass conversion, for example, people want to convert more of the plant to fuel, convert it to higher energy compounds, and do it cheaper and faster. In the area of batteries, we need materials that store more energy per kg, deliver it more rapidly, and last longer.

 

These kinds of topics may seem far removed from the practical engineering aspects of building a photovoltaic power station, but fundamental research is critical to improving efficiency and reducing cost. For 2011, ScienceDirect turns up over 6,500 citations on batteries, 5,000 on photovoltaics, 6,000 on fuel cells, and 3,000 on biofuels. That's about a 30% increase over 2010. Let's hear it for fundamental research!

 

As a scientist, I’m thrilled at the technical achievements. As a concerned citizen, I’m pleased to see that alternative energy is making significant inroads in replacing fossil fuels.

 

What will next year bring? What do you readers think is the most significant alternative energy development we’ll see in 2012?

 

I’m hoping for a new battery I can swap into my Prius that will double my mileage to 100 mpg (42 km/L).

1,188 Views 0 References Permalink Categories: Materials Informatics, Trend Watch Tags: alternative-energy, green-chemistry

Do We Really Need Solar Power?

Posted by gxf Aug 4, 2011

Ok, before you start with the hate mail, the answer to this rhetorical question is YES. Photovoltaic solar currently provides only a small percent of our electrical needs, but it grew 53% in 2009. And advances like these printed panels mean that it's going to become less expensive and more widespread.

 

So why the provocative question? I recently saw a couple of postings where people ask, "Hey, why don't we put solar panels on electric and hybrid cars to make them even more efficient?" I found an answer to this question from as early as 2008 in the Washington Times - and the analysis was probably carried out even earlier: the energy from a solar panel is a drop in the bucket compared to the needs of a commercial vehicle. The question bugged me enough that I thought I'd go through the analysis again in detail. The bottom line: you need about 70 m² worth of panels to drive at 60 km/h.

 

There's a terrific analysis of vehicle power consumption by Hideaki Horie in Lithium Ion Rechargeable Batteries: Materials, Technology, and New Applications, edited by Kazunori Ozawa. The force required to overcome rolling friction and air resistance in a vehicle is

F = μT·g·M + ½·ρ·CD·A·V²

where

  • μT is the coefficient of rolling friction, ~0.025
  • g is the acceleration of gravity, ~9.8 m/s²
  • M is the vehicle mass, ~1500 kg
  • ρ is the density of air, ~1.29 kg/m³
  • CD is the drag coefficient, ~0.35
  • A is the exposed vehicle area, ~2 m²
  • V is the velocity, taken here as 60 km/h = 16.7 m/s

Power consumption, P, is force times velocity,

P = F · V

Put this all together and you get a requirement of about 8 kW to drive a vehicle at 60 km/h.
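
If you want to check the arithmetic, here is the whole estimate in a few lines of Python, using the same values as above:

# Power needed to hold 60 km/h: rolling friction plus aerodynamic drag.
mu_T, g, M  = 0.025, 9.8, 1500.0   # friction coefficient, gravity (m/s^2), mass (kg)
rho, C_D, A = 1.29, 0.35, 2.0      # air density (kg/m^3), drag coefficient, area (m^2)
V = 60.0 / 3.6                     # 60 km/h in m/s

F = mu_T * g * M + 0.5 * rho * C_D * A * V**2   # force in newtons
P = F * V                                       # power in watts
print(f"F = {F:.0f} N, P = {P / 1000:.1f} kW")  # about 490 N and 8.2 kW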

 

[Image: 300px-Solar_Car_Tokai_Challenger.jpg]

How does this compare to solar? TLC tells us that a home solar panel produces 70 mW/in², or about 110 W/m². Hence you need over 70 m² worth of solar panels to drive at 60 km/h. I haven't measured the surface area of my Prius, but it's definitely less than that.

 

Am I being negative? Not at all. I just want people to think critically about alternative energy. The same TLC site calculates that 26 m² of solar panels will meet the needs of a typical home, and that's quite feasible to put on a house or in a back yard. How about using those panels to charge your car battery? Driving the car at 60 km/h for one hour consumes 29 MJ of energy. 26 m² worth of solar panels will generate about 10 MJ/h, so you'd need to charge for roughly 3 hours to drive that far. This approach might actually work for small, efficient cars (think low M, A, and CD) on short trips.
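
The solar side of the argument is the same kind of arithmetic, continuing from the sketch above and using the TLC panel figure quoted here:

P_drive = 8.2e3        # W needed to hold 60 km/h, from the previous sketch
panel_output = 110.0   # W per m^2, the TLC home-panel figure

area_to_drive = P_drive / panel_output                 # m^2 of panels to power the car directly
MJ_per_hour   = P_drive * 3600.0 / 1.0e6               # energy used per hour of driving
MJ_from_26m2  = 26.0 * panel_output * 3600.0 / 1.0e6   # energy a 26 m^2 home array makes per hour

print(f"panels to drive directly: {area_to_drive:.0f} m^2")    # about 75 m^2
print(f"energy per hour of driving: {MJ_per_hour:.0f} MJ")     # about 30 MJ
print(f"charging hours per hour of driving: {MJ_per_hour / MJ_from_26m2:.1f}")  # about 3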

 

What can we do to make solar power more relevant? As a scientist, I am fascinated by potential new photovoltaic technology like GaAs nanopillars or carbon nanotubes. As a concerned scientist, I can write my elected representatives and urge them to continue funding research like that. But as a consumer, I'll do more for the environment if I take the train to work than if I put solar panels on my car.

881 Views 0 References Permalink Categories: Materials Informatics Tags: alternative-energy, solar-energy

Catalysis is one of my favorite topics. Catalysts are challenging - but not impossible - to model; the chemistry is fascinating; and they are really, really useful. The Vision Catalysis 2020 Report from the U.S. Chemical Industry estimates that catalysis-based chemical syntheses account for 60 percent of today's chemical products and 90 percent of current chemical processes. Development of new catalysts is motivated by the need to reduce energy costs, increase yields, and utilize alternative feedstocks. Every two years, the leaders in catalysis R&D come together at the North American Catalysis Society meeting (incongruously abbreviated NAM). NAM 22 was held in Detroit 5-10 June. There were some excellent talks there, as well as opportunities to rub elbows with catalyst researchers.

 

The theme of the meeting was "Driving Catalyst Innovation." As one might expect in the current climate, there was a lot of focus on alternative energy including electrocatalysis for energy conversion, syngas production, and CO2 capture. If you share my interest in catalysis, then check out the NAM 22 technical program to see all the hot topics. The extended abstracts provide quite a bit of good background for the talks. The program is quite lengthy, no way I can summarize it in this blog, but here are two of my favorite presentations:

  • Catalysts Live and Up Close: Insights from In-situ Micro- and Nano-Spectroscopy Studies, by Prof. B.M. Weckhuysen, Utrecht University, discussed the spatiotemporal characterization of individual catalyst particles at the micron and nano scale. A detailed understanding of reaction mechanism requires analysis at the molecular level, and this presentation showed how advanced characterization techniques are making that possible.
  • Catalysis for Sustainable Energy, by Prof. J.K. Norskov, Stanford, presented approaches to molecular-level catalyst design. Prof. Norskov is well recognized for his work modeling heterogeneous catalysts. In this presentation he drew on examples including carbon dioxide reduction and biomass transformation reactions.

[Image: CoCatalyst.png]

Images of two Co/TiO2 samples used in the study. IWI = preparation by Incipient Wetness Impregnation. HDP = preparation by Homogeneous Deposition Precipitation.

In quite different ways, both of these underscore the importance of understanding catalysts at the atomic level, an area where modeling is indispensable.

I was delighted that this year I had the opportunity to give a presentation of my own on work that elucidated the structure of Co Fischer-Tropsch catalysts. The Fischer-Tropsch process is instrumental in converting biomass into fuel. By understanding the process in detail, we hope to create more efficient catalysts. Imagine one day dumping your yard waste in the top of a chemical reactor and draining fuel out the bottom. Realizing that vision is still a ways off, but we're working on it.

680 Views 0 References Permalink Categories: Modeling & Simulation Tags: catalysis, conferences, atomic-scale-modeling, alternative-energy

What it takes to model a gecko

Posted by gxf Jun 13, 2011

PBS ran a great series on materials science a few months ago called Making Stuff. There were episodes on making stuff Smaller, Stronger, Cleaner, and Smarter. One of my favorite segments was in Making Stuff: Smarter (see chapter 3, around 10 minutes in). The gecko climbs walls with ease because of millions of tiny hairs in contact with the surface (see the image taken from the PBS video). Van der Waals forces between the hairs and the wall provide the adhesion. Prof. Kellar Autumn of Lewis & Clark College tells us that about half of the gecko's foot is in molecular contact with the wall. Compare that to only about 10⁻⁴ of a human hand which contacts a surface at the molecular level.

[Image: PBS_Gecko.png]

What amazes me about this is that the van der Waals (VDW) forces are extremely weak - but they add up to something significant. On the macroscopic level we see this in the gecko's feet. More subtly, we can observe this in the trends of boiling points of alkanes: heavier alkanes have higher boiling points because the contact surface area between larger molecules provides greater VDW forces. And the crystal structure of organic molecules is determined in large part by the VDW forces among molecules.

 

So what are VDW energies and forces? Wikipedia has a nice description, so I won't go into detail. Suffice it to say that these forces arise from dipole-dipole interactions between molecules, due either to permanent dipole moments or to induced dipole moments. Commonly, the VDW terms are included in molecular force fields with an expression like εij(σij/rij)⁶, where the subscripts i and j refer to atoms, r is the distance between the two atoms, and ε and σ are parameters based on the atom types. People invest a lot of effort trying to generate parameters that can reproduce experimental results. To avoid parameterization, you'd like to be able to do this with a quantum mechanics-based method. Unfortunately, my favorite method, Density Functional Theory (DFT), doesn't capture these dipole-dipole interactions. Higher-level quantum mechanics methods do include these terms, but they take too long to run.

 

Fortunately, people are working on this. A variety of hybrid semi-empirical corrections exist that introduce damped atom-pairwise dispersion terms of the form C6R⁻⁶. The best-known schemes were introduced by Grimme; by Jurecka, et al.; by Ortmann, Bechstedt, and Schmidt; and most recently by Tkatchenko and Scheffler. These semi-empirical corrections to DFT were implemented in the programs DMol3 and CASTEP by Erik McNellis, Joerg Meyer, and Karsten Reuter of the Fritz-Haber-Institut der Max-Planck-Gesellschaft. Thanks to their work, we can perform more accurate calculations where weak forces are important. We're a long way from modeling an entire gecko, but now we can do a good job of modeling the tip toes of their tiny feet.
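
To make the C6R⁻⁶ idea concrete, here is a minimal sketch of a Grimme-style damped pairwise term for a single pair of atoms. The C6 coefficient, van der Waals radius, scaling factor, and damping steepness below are illustrative stand-ins, not values taken from any of the published parameter sets.

import math

def damped_dispersion(r, C6, R_vdw, s6=0.75, d=20.0):
    """Pairwise correction E = -s6 * C6 / r**6 * f_damp(r). The Fermi-type damping
    function switches the term off at short range, where the density functional
    already describes the interaction reasonably well.
    Units: r and R_vdw in Angstrom, C6 in eV*Angstrom^6, energy in eV."""
    f_damp = 1.0 / (1.0 + math.exp(-d * (r / R_vdw - 1.0)))
    return -s6 * C6 / r**6 * f_damp

C6_CC, R_CC = 30.0, 2.9  # illustrative carbon-carbon parameters
for r in (3.0, 3.5, 4.0, 5.0):
    print(f"r = {r:.1f} A   E_disp = {1000.0 * damped_dispersion(r, C6_CC, R_CC):7.2f} meV")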

625 Views 0 References Permalink Categories: Modeling & Simulation Tags: materials_studio, atomic-scale-modeling, quantum-chemistry, van_der_waals

Natural Gas to the Rescue

Posted by gxf Jan 20, 2011

A recent article in C&E News caught my attention: "Ethylene from Methane." This discusses some very interesting technology that can indirectly help ease the energy crunch.

 

Here's a little background. We do a lot with petrochemicals besides putting them into our automobiles. Most modern polymers are derived from petroleum, so as the oil wells dry up, there go our Barbie (TM) dolls, cell phone cases, and stretchy polyester trousers. The principal starting material for most of these polymers is ethylene, H2C=CH2. This is commonly formed from petroleum feedstock by steam cracking, which makes smaller molecules out of larger ones like naphtha or ethane. How much longer can we keep this up? According to SEPM (Society for Sedimentary Geology), current world-wide proven reserves are estimated to be over 1,000 billion barrels of crude oil; at the current rate of oil consumption, there are only 32 years of reserves left. According to the US Energy Information Administration, the reserves of natural gas are 6,609 trillion cubic feet, and the C&EN article claims that we have 'multiple centuries' of this left.

 

The problem is that natural gas is primarily methane (CH4), and it's been harder to build ethylene up from this small molecule than to crack it from larger ones... until now. According to the C&EN article, a new process of oxidative coupling of methane (OCM) holds the promise of tapping this new source of raw material and making ethylene for a lower energy cost. The approach to this problem was really neat. Based on the work of Angela Belcher of MIT, scientists at Siluria Technologies grow catalysts by mineralizing the surface of a virus. By genetically engineering the viruses, they can generate an almost limitless number of different surfaces and hence explore a huge range of catalysts. An efficient OCM catalyst means that it becomes commercially feasible to use cheap natural gas as a feedstock to create raw materials for sophisticated products.

 

This sort of creative solution to problems of energy and natural resource utilization is an excellent example of why government must continue funding fundamental research. This work draws on the fields of molecular biology, catalysis, high-throughput screening, and petrochemistry — disciplines that you might not think would play well together. To find the best solutions to our energy challenges, researchers need the freedom to explore lots of alternatives, and we need many scientists engaged in this. As scientists and citizens, let's make sure that our elected officials know how important fundamental research is to our future.

699 Views 0 References Permalink Categories: Materials Informatics Tags: catalysis, alternative-energy, green-chemistry

DFT Redux Again

Posted by gxf Jan 4, 2011

In what is becoming an annual tradition, my first blog of the year is a quick summary of Density Functional Theory (DFT) applications for the past twelve months. Readers of Accelrys' blogs will be familiar with this modeling method. With this approach you can study heterogeneous catalysis on an extended (periodic) surface; predict crystal structures; or calculate elastic constants. Life science applications include small-molecule design and solvation energies. Using the modest computers that I have access to, I routinely run DMol3 calculations on systems with up to 300 atoms or so. With ONETEP and larger computers, it's easy to do computations on around 1000 atoms with DFT. This is amazing when you consider that the largest calculation I did in graduate school was around 10 atoms.

 

Some of my favorite publications from 2010:

  • Catlow, et al., "Advances in computational studies of energy materials," Phil. Trans. R. Soc. A 368, 3379 (2010). (full text). This is an excellent review of the ways that materials modeling is being applied to problems in alternative energy research. My co-bloggers and I have written quite a lot about sustainability, alternative energy, and green chemistry, so this article fits well with that very timely and topical theme.
  • Delley, "Time dependent density functional theory with DMol3," J. Phys.: Cond. Matter 22, 465501 (2010) (abstract). I wrote last year about the importance of being able to predict spectra. This long-anticipated development adds UV/visible spectrum capability to DMol3. The predictions are good and the calculations are fast. We've applied this to a bunch of cases including structure determination of zeolite catalysts as you can see here.
  • Pickard and Needs, "Ab Initio Random Structure Searching," Psi-k Newsletter 100, 42 (2010). (Full text). Usually, geometry optimization starts with an approximate structure and refines it, but how do you determine the structure of a solid when you have absolutely no idea where to start? Computers and DFT methods are fast enough now that you can actually use random starting structures and evolutionary algorithms to determine reasonable geometries. And on top of that, they found new classes of molecular structures for N2 under pressure.

Be sure to check out the Accelrys web site for a list of all of the DMol3 and CASTEP titles of 2010. And be sure to let me know what your favorite stories of 2010 were.

 


769 Views 0 References Permalink Categories: Modeling & Simulation Tags: materials, atomic-scale-modeling, quantum-chemistry

Sustainability continues to grow in importance as an issue, both ecologically and sociologically. One source of resistance, I think, is the perceived cost of implementing sustainable practices vs the status quo. A recent article by Rosemary Grabowski in Consumer Goods Technology (CGT) outlines a number of practices and shows - among other things - why cost shouldn't be an issue. Sustainability isn't just good for the environment: it's good for the bottom line. Grabowski lists 10 requirements for sustainable strategies that also serve as drivers of business. I won't repeat them all here since you can read them for yourselves, but I'd like to highlight a couple that fit with the themes we've already explored on this web site over the past year or so.

 

We have written a lot on sustainability on this web site, with examples such as Michael Doyle's blogs on sustainability, my own blog on methanol from biomass, or Michael Kopach's entry on incorporating green features into electronic lab notebooks (ELNs).

 

As a modeler I just love these topics because there's so much that software can contribute - in fact all 10 of Grabowski's requirements can be met this way. Open Collaboration, for example. This is an idea that P&G pioneered some years ago. Of course anyone can dump a new idea into the system, but you need a way to keep track of it, share it, add to it, and feed it back again. This is where software tools like ELNs come in. They also deliver centralized data, data continuity, and tracking, all of which are mentioned in the CGT article.

 

Another key requirement is Virtual Design. (You guessed it: as a modeler this is my absolute favorite topic.) Software has reached the stage where a lot of work can be done computationally rather than in the lab. No messy chemicals to clean up, no animals to experiment on, no toxic by-products. The capabilities include alternative energy, more efficient use of resources, and the prediction of toxicology (see SlideShare for specific examples).

 

Sustainability shouldn't be a buzz word in 2011: it should be a serious consideration for all corporations and individuals concerned about the future. Software offers inexpensive and non-polluting solutions. I've touched on only a few here, but I'm really interested to hear from you folks out there. What are your challenges in sustainability and how are you meeting them? Whether with software or otherwise, let me know.

723 Views 1 References Permalink Categories: Electronic Lab Notebook, Modeling & Simulation Tags: alternative-energy, green-chemistry, sustainability

What Happens When DFT Gets Excited?

Posted by gxf Nov 29, 2010

At the start of this year I wrote about the importance of spectroscopy in my blog Spectroscopy: Where Theory Meets the Real World. Experimentally, spectroscopy is essential for the characterization of compounds, identification of unknowns, tracking of contaminants, in situ monitoring of chemical processes, and more. Modeling has worked hard to develop reliable ways to predict spectra from first principles.

 

Since I wrote in February 2010, Materials Studio  has added Time-Dependent DFT (TD-DFT) to DMol3. This provides a means to predict optical properties like UV/visible spectra and hyperpolarizabilities, adding to the IR, THz, and other methods that I discussed earlier. My recent webinar discussed some of the theory behind this and gave some examples. (You can download a recording, or view a pdf copy.)

 

An interesting question came up during Q&A, and I wanted to highlight it here since I think it's of general interest and serves to underscore some of the less intuitive aspects of DFT (density functional theory). There are several different approximations that one can use to predict the energy of excited states. The simplest is to use the differences in the orbital eigenvalues. The question is why this is such a poor approximation in DFT, especially given that it's a reasonably good approximation in molecular orbital (MO) approaches like Hartree-Fock or AM1.

 

The answer lies in the difference between an MO-based approach and a density-based approach. The theory of DFT is based on the charge density, not the MOs. While the eigenvalues of Hartree-Fock theory are related to the energies of the orbitals, the eigenvalues in DFT are related to, well, something else. Write down the Hartree-Fock energy expression for a molecule and for the cation and then take the difference: you get the formula for the orbital eigenvalue. This is Koopmans' theorem: εi = E(neutral) - E(cation) (and you can read more about it here). When you do the same thing for the DFT energy, you discover Janak's theorem: εi = ∂E/∂ni. If you approximate this with finite differences, εi ≈ ΔE/Δni, then it might look a bit like Koopmans' theorem, but the finite difference turns out to be a poor approximation to the infinitesimal derivative. You can read just how bad the approximation is in my 2008 paper.
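
Restating the two relations side by side (nothing new here, just the equations above written out in display form):

% Koopmans' theorem (Hartree-Fock): the occupied eigenvalue is the full energy difference
\varepsilon_i^{\mathrm{HF}} = E(\mathrm{neutral}) - E_i(\mathrm{cation})
% Janak's theorem (DFT): the eigenvalue is only the derivative of the energy with
% respect to occupation number, and replacing that derivative by the finite
% difference over a whole electron turns out to be a poor approximation
\varepsilon_i^{\mathrm{DFT}} = \frac{\partial E}{\partial n_i},
\qquad
\frac{\Delta E}{\Delta n_i} = E(\mathrm{neutral}) - E_i(\mathrm{cation}) \not\approx \varepsilon_i^{\mathrm{DFT}}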

 

Fortunately for us, we now have TD-DFT. As illustrated in Delley's original paper and in my own modest tests, this is a dramatic improvement over what was possible before. It increases the usefulness of this essential modeling tool and provides another way for theory to connect to the real world of observables. Stay tuned for more results in the coming weeks.

555 Views 0 References Permalink Categories: Modeling & Simulation Tags: dft, materials-studio, quantum-chemistry, spectroscopy

Everything old is new again. It’s interesting to see ideas in the high-performance computation community coming back around after many years. In the late 1980’s I used an array processor from FPS to speed up coupled-cluster calculations (very expensive and accurate electronic structure simulations). The array processor was bolted onto a VAX. The most compute-intensive parts of the application were shipped off to the FPS machine where they ran a lot – maybe 50-100x – faster.

 

A couple months ago I forwarded a tweet from HPC Wire, GPGPU Finds Its Groove in HPC, but a more recent article in Annual Reviews of Computational Chemistry, “Quantum Chemistry on Graphics Processing Units”, is far less optimistic about this new hardware. Authors Götz, Wölfle, and Walker discuss Hartree-Fock and density functional theory and “outline the underlying problems and present the approaches which aim at exploiting the performance of the massively parallel GPU hardware.” So far quantum chemistry results on the GPU have been less than stunning.

 

 

China's Tianhe-1A supercomputer uses 7,168 Nvidia Tesla M2050 GPUs and 14,336 CPUs.

 

 

For those not in the know, the GPU, or graphical processing unit, "is a specialized microprocessor that offloads and accelerates 3D or 2D graphics rendering." The idea behind general-purpose GPUs (GPGPUs) is that this power can be harnessed to do something other than graphical rendering.

 

The GPGPU approach is similar to the one we used in 1988 in Rob Bartlett’s research group. Routine calculations are done on the main microprocessor, but the most time-consuming computations are shipped off to specialized hardware (array processor then, GPU today). The programmer is required to insert directives into the code that tell the computer which hardware to use for which parts of the calculation. Usually a lot of code rewriting is required to achieve the speedups touted by the hardware suppliers. One of the major differences between then and now is the cost: our FPS cost on the order of $250 000, but a GPU board goes for around $300. And that’s one of the big reasons that people are so excited.

 

According to the article in HPC Wire, applications such as weather modeling and engineering simulation (ANSYS) have been ported, though ANSYS reported only a modest 2x speedup. The report in Ann. Rev. Comp. Chem., by contrast, points out the many challenges in getting quantum chemical (QC) calculations ported. In the case of the coupled-cluster calculations, we could formulate much of the problem as matrix-matrix and matrix-vector multiplications, which really fly on an array processor.  Hartree-Fock or DFT programs – the QC methods most commonly in use by commercial molecular modelers – need to evaluate integrals over a basis set and perform matrix diagonalizations, and these operations are less easy to port to the GPU.
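
As a toy illustration of why those coupled-cluster kernels mapped so well onto that kind of hardware (this is not code from any production package): a typical tensor contraction can be reshaped into one large matrix-matrix multiply, which is exactly the operation array processors then, and GPUs now, are built to do fast.

import numpy as np

# Toy contraction T[a,b,i,j] = sum over c,d of W[a,b,c,d] * t[c,d,i,j],
# the shape of work that dominates coupled-cluster calculations.
nv, no = 20, 10  # deliberately tiny "virtual" and "occupied" dimensions
rng = np.random.default_rng(0)
W = rng.random((nv, nv, nv, nv))
t = rng.random((nv, nv, no, no))

# Reshape the four-index tensors into matrices and do a single matmul:
T = (W.reshape(nv * nv, nv * nv) @ t.reshape(nv * nv, no * no)).reshape(nv, nv, no, no)

# Same result written the way the equation reads:
assert np.allclose(T, np.einsum('abcd,cdij->abij', W, t))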

 

GPUs hold tremendous potential to reduce the cost of high-performance computation, but it’s going to be a while before commercial QC products are widely available. In the meantime, stay hopeful, but at the same time stay realistic. Maybe the approach will work this time around, but it’s still too early to say.

995 Views 0 References Permalink Categories: Modeling & Simulation Tags: computational-chemistry, quantum-chemistry, ab-initio-modeling, graphical-processing-unit

Consumer packaged goods (cosmetics, cleaning products and the like) are something that we can all relate to. Who doesn't want cleaner clothes, softer skin, or shinier hair (at least those of you who have hair)? The predilection among consumers for a better look has made this a billion-dollar industry. As I wrote last year (Cosmetics Got Chemistry) following my trip to the Society of Cosmetic Chemists conference, the development of goods such as personal cleaning products has progressed from the barbaric (i.e., fatty acid salts) to the sophisticated, e.g., adding silicones to combine shampoo and conditioner.

 

The new trend is ‘cosmeceuticals,’ the combining of cosmetics with active pharmaceutical ingredients to deliver added benefits like moisturizers or anti-aging creams. This was discussed in a recent article Improve Your Product’s Image with Smarter Scientific Informatics by my colleague Tim Moran.

 

He focuses on the use of image analysis and data integration to streamline the development of cosmeceuticals.  Modelers generally consider things like molecular structure, analytical results (IR spectra, etc.), synthesis pathways, and of course product performance. But substantiating the performance is time-consuming and often subjective: does this cream really remove wrinkles, or can another genuinely remove age spots?

 

 

The lips on the left belong to an older individual, the ones on the right to a younger. The ones on the left are clearly more wrinkled, but by how much?

 

 

Image processing provides a new dimension to the types of data and models that R&D teams can process, making quantifiable analysis possible far more efficiently than before. Consider the two images in this blog. One set of lips clearly has more wrinkles, but how many more? How could you substantiate the difference if the results were less obvious? You could measure wrinkle lines one at a time with a ruler, slowly and tediously, or use image analysis to count the total number of wrinkles in seconds. This provides immediate feedback to the R&D team so they know when they've got a better product - and how much better it is. It also provides the marketing (and legal) teams with objective data that can be used to substantiate product claims.
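
As a toy illustration of the idea (this is not the workflow Tim describes, which runs on Pipeline Pilot's image-analysis components), counting line-like features in a photo takes only a few lines with an open-source library such as scikit-image; the filename below is a placeholder.

from skimage import io, color, feature, measure

# Load a photo, convert to grayscale, find edge pixels, and count connected
# edge segments as a crude stand-in for "number of wrinkle lines".
img = color.rgb2gray(io.imread("lips.png"))        # placeholder filename
edges = feature.canny(img, sigma=2.0)              # boolean edge map
labels = measure.label(edges)                      # connected-component labelling
print(f"edge segments found: {labels.max()}")
print(f"edge-pixel fraction: {edges.mean():.3f}")  # a second, even cruder roughness score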

 

Measurable, quantitative, verifiable information. That's at the heart of the scientific method, and now even unstructured data like images can deliver it.

453 Views 0 References Permalink Categories: Materials Informatics, Trend Watch Tags: publications, image-informatics, consumer-packaged-goods

Molecular and materials modeling has long been successful in areas such as catalysis, polymers, coatings, adhesives, and semiconductors. But did you know about all the applications in areas such as perfumes, chocolate, or cosmetics? These are just different types of materials after all, so it shouldn't  surprise anyone that Accelrys announced that it’s making inroads into the consumer packaged goods sector.

Over the past year several of my fellow scientists and I have been investigating the ways that both modeling and informatics solutions can be applied to CPG R&D. There are some great write-ups out there:

Also note Dr. Felix Grant's article Material Values in Scientific Computing World. And check out the great image on the cover.

 

What's fueling this recent surge of scientific informatics and modeling into CPG? Like so many other R&D-based sectors, CPG has been challenged to remain competitive while holding down costs — and nothing says "take my money" like a wrinkle cream that really works and is sold over the counter at the local pharmacy! As the articles I've cited illustrate, predictive analytics can help R&D teams find answers faster and for less $$ than experimentation alone – a lesson that sectors such as pharmaceuticals learned quite some time ago. Add to this the fact that predictive (molecular) modeling methods have become easier to use and are increasingly merged with informatics, and you gain unprecedented R&D capabilities.

 

Modern pharmaceuticals are extensively modeled before chemicals are ever mixed in the lab. Maybe the next great perfume will come out of a computer simulation, too.

353 Views 0 References Permalink Categories: Materials Informatics Tags: cosmetics, publications, consumer-packaged-goods, fragrances, beverages, flavors, food

What are critical problems in alternative energy research? How does modeling play a role in bringing us closer to answers?

 

A recent review article on this topic by long-time associate Prof. Richard Catlow, et al., caught my attention. Readers of this blog will be familiar with our many posts pertaining to 'green chemistry,' sustainable solutions, and the like. Last month, Dr. Misbah Sarwar of Johnson Matthey was featured in a blog and delivered a webinar on the development of improved fuel cell catalysts. Dr. Michael Doyle has written a series on sustainability. Drs. Subramanian and Goldbeck-Wood have also blogged on these topics, as have I. All of us share a desire to use resources more responsibly and to ensure the long-term viability of our ecosphere. This will require the development of energy sources that are inexpensive, renewable, non-polluting, and CO2 neutral. Prof. Catlow provides an excellent overview of the applications of molecular modeling to R&D in this area. Read the paper for a very comprehensive set of research problems and case studies, but here are a few of the high points.

 


  • Hydrogen production. We hear a lot about the "hydrogen economy," but where is all this hydrogen going to come from? Catlow's review discusses the generation of hydrogen from water. Research challenges include developing photocatalysts capable of splitting water using sunlight.

  • Hydrogen storage. Once you've created the hydrogen, you need to carry it around. Transporting H2 as a compressed gas is risky, so most solutions involve storing it intercalated in a solid material. LiBH4 is a prototypical example of a material that can reversibly store and release H2, but the process is too slow to be practical.

  • Light absorption and emission. Solar cells hold particular appeal, because they produce electricity while just sitting there (at least in a place like San Diego; I'm not so sure about Seattle). One still needs to improve conversion efficiency and worry about manufacturing cost, ease of deployment, and stability (with respect to weathering, defects, aging, and so forth).

  • Energy storage and conversion. Fuel cells and batteries provide mobile electrical power for items as small as hand-held devices or as large as automobiles. Catlow and co-workers discussed solid oxide fuel cells (SOFC) in their paper.

 

The basic idea with modeling, remember, is that we can test a lot of materials for less cost and in less time than with experiment alone. Modeling can help you find materials with the optimal band gap for the capture of light and generation of photoelectric energy. It can tell us the thermodynamic stability of these new materials: can we actually make them, and will they stick around before decomposing?

 

Simulation might not hit a home run every time, but if you can screen out, say, 70% of the bad leads, you've saved a lot of time and money. And if you're interested in saving the planet, isn't it great if you can do it using less resources?

 

Check out some of my favorite resources on alternative energy, green chemistry, and climate change.

 

506 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: catalysis, materials, atomic-scale-modeling, alternative-energy, green-chemistry, fuel-cells, hydrogen-storage, solar-cells