This blog post continues my annual tradition of reviewing the previous year's DFT literature. Most users of Accelrys Materials Studio have at least heard of the method. DFT (Density Functional Theory) is a quantum mechanical method for solving the electronic Schroedinger equation of molecules and crystals using a formalism based on the charge density rather than the wavefunction. You can find the Kohn-Sham equations at a number of sites on the web. These are what have been implemented in software packages such as Accelrys MS DMol3 and MS CASTEP, as well as in packages developed by other research groups.
In my annual informal survey, the year 2011 saw a 22% increase in the number of publications with the term "DFT" in the text (as determined by full-text searches at the ACS and ScienceDirect web sites). The total count was 12,080. DFT has been outrageously successful because it combines relatively high accuracy with low computational cost - at least low by QM standards. In addition, it works very well for periodic systems, so it has extended routine QM calculations to areas such as heterogeneous catalysis.
This year, I'd like to highlight some applications in one of my favorite areas: alternative energy. I found a total of 2,255 citations that included DFT along with 'fuel cells,' 'batteries,' 'biofuels,' 'solar cell,' or 'photovoltaic'. That's almost 20%: just about 1 DFT paper in 5 addressed some topic in alternative energy! Let's hear it for green modelers.
Some of my favorite papers from 2011:
Gao et al., Dynamic characterization of Co/TiO2 Fischer-Tropsch catalysis with IR spectroscopy and DFT calculations, presentation at the North American Catalysis Society meeting, Detroit, 5-10 June 2011, available on SlideShare.
Shin et al., Effect of the alkyl chain length of C70-PCBX acceptors on the device performance of P3HT:C70-PCBX polymer solar cells, J. Mater. Chem., 2011, 21, 960–967. DOI: 10.1039/C0JM02459G
Gaetches et al., Ab Initio Transition State Searching in Complex Systems: Fatty Acid Decarboxylation in Minerals, J. Phys. Chem. A, 2011, 115, 2658–2667. DOI: 10.1021/jp200106x
Di Noto et al., Structure–property interplay of proton conducting membranes based on PBI5N, SiO2–Im and H3PO4 for high temperature fuel cells, Phys. Chem. Chem. Phys., 2011, 13, 12146–12154. DOI: 10.1039/c1cp20902g
Full disclosure: my favorites were done with Accelrys tools. You can see all the 2011 scientific publication lists for DMol3 and CASTEP on the Accelrys web site. Be sure to let me know what your favorites are.
As many readers know, I have a personal interest in alternative energy research. The past year, 2011, brought good news not just for research but for practical implementation as well. A comprehensive analysis of the deployment of alternative energy technologies has been collected in the Renewables 2011 Global Status Report. This report covers historical growth in many areas of renewable energy, as well as year-on-year growth from 2009 to 2010.
The report estimates that 194 gigawatts (GW) of electric generating capacity were added globally in 2010. It’s great to see that about half the new capacity comes from renewables. Including hydroelectricity, renewables account for about a quarter of total capacity: 1,320 out of 4,940 GW. Photovoltaics, in particular, did well, increasing generating capacity by 73% year-on-year!
A lot of research articles were published in 2011. The research that I follow focused on improving the materials and processes used to generate or store energy and fuels. In biomass conversion, for example, people want to convert more of the plant to fuel, convert it to higher energy compounds, and do it cheaper and faster. In the area of batteries, we need materials that store more energy per kg, deliver it more rapidly, and last longer.
These kinds of topics may seem far removed from the practical engineering aspects of building a photovoltaic power station, but fundamental research is critical to improving efficiency and reducing cost. On ScienceDirect one can find for 2011 over 6500 citations on batteries, 5000 on photovoltaics, 6000 on fuel cells, and 3000 on biofuels. That’s about a 30% increase over 2010. Let’s hear it for fundamental research!
As a scientist, I’m thrilled at the technical achievements. As a concerned citizen, I’m pleased to see that alternative energy is making significant inroads in replacing fossil fuels.
What will next year bring? What do you readers think is the most significant alternative energy development we’ll see in 2012?
I’m hoping for a new battery I can swap into my Prius that will double my mileage to 100 mpg (42 km/L).
Ok, before you start with the hate mail, the answer to this rhetorical question is YES. Photovoltaic solar currently provides only a small percentage of our electrical needs, but it grew 53% in 2009. And advances like these printed panels mean that it's going to become less expensive and more widespread.
So why the provocative question? I recently saw a couple of postings where people ask, "Hey, why don't we put solar panels on electric and hybrid cars to make them even more efficient?" I found an answer to this question from as early as 2008 in the Washington Times - and this analysis was probably carried out even earlier: the energy from solar panels is a drop in the bucket compared to the needs of a commercial vehicle. The question bugged me enough that I thought I'd go through the analysis again in detail for folks. The bottom line: you need 70 m² worth of panels to drive at 60 km/h.
The force, F, needed to keep the car moving comes from rolling resistance (proportional to the mass, M) plus aerodynamic drag (proportional to the frontal area, A, the drag coefficient, CD, and the square of the velocity). Power consumption, P, is force times velocity,

P = F * V

where V is the velocity, taken here as 60 km/h = 16.7 m/s. Put this all together and you get a requirement of about 8 kW to drive a vehicle at 60 km/h.
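If you want to check the arithmetic yourself, here's a minimal Python sketch of the estimate. All the vehicle parameters below (mass, rolling-resistance coefficient, drag coefficient, frontal area, drivetrain efficiency) are my own round-number assumptions for a mid-size sedan, not figures from the original analysis, but they land in the same ballpark.

```python
def driving_power(v, mass=1400.0, c_rr=0.010, rho=1.2,
                  cd=0.30, area=2.2, g=9.81, efficiency=0.5):
    """Power (W) consumed to cruise at speed v (m/s).

    Force = rolling resistance (c_rr * M * g)
          + aerodynamic drag (0.5 * rho * CD * A * v^2).
    Power at the wheels is F * v; dividing by an assumed drivetrain
    efficiency gives the power actually drawn.
    """
    f_roll = c_rr * mass * g                  # rolling resistance, N
    f_drag = 0.5 * rho * cd * area * v ** 2   # aerodynamic drag, N
    return (f_roll + f_drag) * v / efficiency

v = 60e3 / 3600.0          # 60 km/h in m/s (~16.7 m/s)
p = driving_power(v)
print(f"{p / 1000:.1f} kW")  # lands in the ballpark of the 8 kW quoted
```

Note that the drag term grows with the square of the speed, which is why highway driving is so much more expensive than this 60 km/h estimate suggests.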
How does this compare to solar? TLC tells us that a home solar panel produces 70 mW/in², or 110 W/m². Hence you need over 70 m² worth of solar panels to drive at 60 km/h. I haven't measured the surface area of my Prius, but it's definitely less than that.
Am I being negative? Not at all. I just want people to think critically about alternative energy. The same TLC site calculates that 26 m² of solar panels will meet the needs of a typical home, and that's quite feasible to put on a house or in a back yard. How about using those panels to charge your car battery? Driving the car at 60 km/h for 1 hour consumes 29 MJ of energy. 26 m² worth of solar panels will generate 10 MJ/h, so you'd need to charge for 3 hours to drive that far. This approach might actually work for small, efficient cars (think low M, A, and CD) on short trips.
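The arithmetic in that paragraph is easy to re-run. The 110 W/m² panel output and the 8 kW driving power are the figures quoted above; everything else follows from them.

```python
panel_output = 110.0   # W per m^2 (the TLC figure quoted above)
power_needed = 8000.0  # W to drive at 60 km/h (the estimate above)

# Panel area needed to power the car directly while driving.
area_to_drive = power_needed / panel_output            # m^2

# Energy consumed per hour of driving, in MJ.
energy_per_hour_driving = power_needed * 3600 / 1e6    # 28.8 MJ

# Charging instead: a 26 m^2 home array (TLC's estimate) generates
# this many MJ per hour of sunshine.
home_array = 26.0                                      # m^2
charge_rate = home_array * panel_output * 3600 / 1e6   # ~10.3 MJ/h

hours_to_charge = energy_per_hour_driving / charge_rate
print(f"{area_to_drive:.0f} m^2 to drive; "
      f"{hours_to_charge:.1f} h of charging per hour of driving")
```

So the "charge for 3 hours to drive for 1" conclusion drops straight out of the two quoted numbers.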
What can we do to make solar power more relevant? As a scientist, I am fascinated by potential new photovoltaic technology like GaAs nanopillars or carbon nanotubes. As a concerned scientist, I can write my elected representatives and urge them to continue funding research like that. But as a consumer, I'll do more for the environment if I take the train to work than if I put solar panels on my car.
Catalysis is one of my favorite topics. Catalysts are challenging - but not impossible - to model; the chemistry is fascinating; and they are really, really useful. The Vision Catalysis 2020 Report from the U.S. Chemical Industry estimates that catalysis-based chemical syntheses account for 60 percent of today’s chemical products and 90 percent of current chemical processes. Development of new catalysts is motivated by the need to reduce energy costs, increase yields, and utilize alternative feedstocks. Every two years, the leaders in catalysis R&D come together at the North American Catalysis Society meeting (incongruously abbreviated NAM). NAM 22 was held in Detroit 5-10 June. There were some excellent talks there, as well as opportunities to rub elbows with catalyst researchers.
The theme of the meeting was "Driving Catalyst Innovation." As one might expect in the current climate, there was a lot of focus on alternative energy including electrocatalysis for energy conversion, syngas production, and CO2 capture. If you share my interest in catalysis, then check out the NAM 22 technical program to see all the hot topics. The extended abstracts provide quite a bit of good background for the talks. The program is quite lengthy, no way I can summarize it in this blog, but here are two of my favorite presentations:
Catalysts Live and Up Close: Insights from In-situ Micro- and Nano-Spectroscopy Studies, by Prof. B.M. Weckhuysen, Utrecht University, discussed the spatiotemporal characterization of individual catalyst particles at the micron and nano scale. A detailed understanding of reaction mechanism requires an analysis at the molecular level, and the techniques described in this presentation show how advanced characterization is making this possible.
Catalysis for Sustainable Energy, by Prof. J.K. Norskov, Stanford, presented approaches to molecular-level catalyst design. Prof. Norskov is well recognized for his work modeling heterogeneous catalysts. In this presentation he drew from examples including carbon dioxide reduction and biomass transformation reactions.
Images of two Co/TiO2 samples used in the study. IWI = preparation by Incipient Wetness Impregnation. HDP = preparation by Homogeneous Deposition Precipitation.
In quite different ways, both of these underscore the importance of understanding catalysts at the atomic level, an area where modeling is indispensable.
I was delighted that this year I had the opportunity to give a presentation of my own on work that elucidated the structure of Co Fischer-Tropsch catalysts. The Fischer-Tropsch process is instrumental in converting biomass into fuel. By understanding the process in detail, we hope to create more efficient catalysts. Imagine one day dumping your yard waste in the top of a chemical reactor and draining fuel out the bottom. Realizing that vision is still a ways off, but we're working on it.
I was recently chatting with a customer about studying phase morphology of fuel cell membranes between enclosed surfaces. He pointed me to this paper where Dorenbos et al. used mesoscale methods to model a surface interacting with a polymer. They varied the interaction potential between a coarse-grained representation of a Nafion® 1200 polymer membrane and the surface to mimic different levels of hydrophobicity. However, it seems that the surface of the carbon catalyst support would not necessarily have a single homogeneous interaction.
I started to think about whether it would be easy to build a mesoscale model where I had some idealized surfaces but with different interactions between a diblock polymer and the surface and whether this would affect the phase morphology.
In a simple example, I used a diblock copolymer with a fairly strong repulsive interaction between the two blocks. Initially, I performed Mesocite calculations in the bulk phase to check that I would get the expected formation of a lamellar phase (Fig 1a). I then introduced a surface which had no net interaction with the diblock copolymer (Fig 1b) and one where one of the blocks has a repulsive interaction with the surface (Fig 1c). As expected, the introduction of the surface disrupts the lamellar structure, and a perforated lamellar phase forms. When the surface is modified so that one block repels the surface, the lamellar phase forms again, but this time aligned along the surface.
Figure 1: Bulk (a), non-interacting surface (b), and surface with repulsion of one block (c). Only one of the blocks from the diblock is displayed, with the other hidden to see the shape of the phase.
I also introduced two different interactions on the surface. In this case, one of the diblock components is repelled by the blue parts of the surface and the other block by the green parts of the surface. When the surface is split into two strips, a perforated lamellar forms with the blocks in the diblock aligning with the surfaces (Fig 2a). However, when the surface is patterned with four strips, the diblock still forms a lamellar phase but this time the blocks align perpendicular to the strips (Fig 2b). Finally, when a circular interaction template is used, another perforated lamellar forms but this time the blocks try to align with the circles (Fig 2c).
Figure 2: Two strips (a), four strips (b), and a single circular interaction site (c). Only one of the blocks from the diblock is displayed, with the other hidden to see the shape of the phase
To attempt to characterize the effects of the surface, I calculated the order parameter for each system using a script from the Community. I found that as the surfaces were introduced, the order dropped from 0.19 to between 0.16 and 0.18. Although the change is relatively small, it does indicate that the surfaces are having an effect on the order of the system. You can see the evolution with time of the different morphologies in the video below.
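For readers who want to compute something similar outside Materials Studio, here's a minimal sketch of a generic orientational order parameter: the average second Legendre polynomial, P2(cos θ), of each bond vector measured against a chosen director (here the lamellar normal). This is my own simplified formulation for illustration, not the Community script itself.

```python
import math

def order_parameter(bonds, director):
    """S = < (3 cos^2 theta - 1) / 2 > over all bond vectors.

    S = 1 for bonds parallel to the director, -0.5 for bonds
    perpendicular to it, and ~0 for an isotropic distribution.
    """
    dnorm = math.sqrt(sum(d * d for d in director))
    total = 0.0
    for b in bonds:
        bnorm = math.sqrt(sum(x * x for x in b))
        cos_t = sum(x * d for x, d in zip(b, director)) / (bnorm * dnorm)
        total += (3.0 * cos_t ** 2 - 1.0) / 2.0
    return total / len(bonds)

aligned = [(0.0, 0.0, 1.0)] * 10   # bonds along the lamellar normal
tilted = [(0.0, 1.0, 1.0)] * 10    # bonds 45 degrees off the normal

print(order_parameter(aligned, (0, 0, 1)))  # 1.0
print(order_parameter(tilted, (0, 0, 1)))   # 0.25
```

Small drops like the 0.19 → 0.16 change reported above correspond to a modest broadening of the bond-orientation distribution, which is exactly what a disrupted lamellar phase should show.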
If you want to try calculations like this, I have posted more technical detail on how to generate the input structures, including templates for the structures used above, into the Materials Studio Community.
This rather superficial study demonstrates that the effect of surface templating or lithography can be simulated using mesoscale structures. Recent changes in the mesostructure building tools in Materials Studio have made it even simpler to construct complex patterned structures. This is a very simple study but what sort of surfaces and systems would you like to simulate?
A recent article in C&E News caught my attention: "Ethylene from Methane." This discusses some very interesting technology that can indirectly help ease the energy crunch.
Here's a little background. We do a lot with petrochemicals besides putting them into our automobiles. Most modern polymers are derived from petroleum, so as the oil wells dry up, there go our Barbie (TM) dolls, cell phone cases, and stretchy polyester trousers. The principal starting material for most of these polymers is ethylene, H2C=CH2. This is commonly formed from petroleum feedstock by steam cracking, which makes smaller molecules out of larger ones like naphtha or ethane. How much longer can we keep this up? According to SEPM (Society for Sedimentary Geology), the current worldwide proven reserves are estimated to be over 1,000 billion barrels of crude oil; at the current rate of oil consumption there are only 32 years of reserves left. According to the US Energy Information Administration, the reserves of natural gas are 6,609 trillion cubic feet, and the C&EN article claims that we have 'multiple centuries' of this left.
The problem is that natural gas is primarily methane (CH4), and it's been harder to build ethylene up from this small molecule than to crack it from larger ones... until now. According to the C&EN article, a new process of oxidative coupling of methane (OCM) holds the promise of tapping this new source of raw material and making ethylene for a lower energy cost. The approach to this problem was really neat. Based on the work of Angela Belcher of MIT, scientists at Siluria Technologies grow catalysts by mineralizing the surface of a virus. By genetically engineering the viruses, they can generate an almost limitless number of different surfaces and hence explore a huge range of catalysts. An efficient OCM catalyst means that it becomes commercially feasible to use cheap natural gas as a feedstock to create raw materials for sophisticated products.
This sort of creative solution to problems of energy and natural resource utilization is an excellent example of why government must continue funding fundamental research. This work draws on the fields of molecular biology, catalysis, high-throughput screening, and petrochemistry — disciplines that you might not think would play well together. To find the best solutions to our energy challenges, researchers need the freedom to explore lots of alternative solutions, and we need to have many scientists engaged in this. As scientists and citizens, let's make sure that our elected officials know how important fundamental research is to our future.
Sustainability continues to grow in importance as an issue, both ecologically and sociologically. One source of resistance, I think, is the perceived cost of implementing sustainable practices vs. the status quo. A recent article by Rosemary Grabowski in Consumer Goods Technology (CGT) outlines a number of practices and shows - among other things - why cost shouldn't be an issue. Sustainability isn't just good for the environment: it's good for the bottom line. Grabowski lists 10 requirements for sustainable strategies that also serve as drivers of business. I won't repeat them all here since you can read them for yourselves, but I'd like to highlight a couple that fit with the themes that we've already explored on this web site over the past year or so.
As a modeler I just love these topics because there's so much that software can contribute - in fact all 10 of Grabowski's requirements can be met this way. Open Collaboration, for example. This is an idea that P&G pioneered some years ago. Of course anyone can dump a new idea into the system, but you need a way to keep track of it, share it, add to it, and feed it back again. This is where software tools like ELNs come in. They also deliver centralized data, data continuity, and tracking, all of which are mentioned in the CGT article.
Another key requirement is Virtual Design. (You guessed it: as a modeler this is my absolute favorite topic.) Software has reached the stage where a lot of work can be done computationally rather than in the lab. No messy chemicals to clean up, no animals to experiment on, no toxic by products. The capabilities include alternative energy, more efficient use of resources, and the prediction of toxicology (see SlideShare for specific examples).
Sustainability shouldn't be a buzzword in 2011: it should be a serious consideration for all corporations and individuals concerned about the future. Software offers inexpensive and non-polluting solutions. I've touched on only a few here, but I'm really interested to hear from you folks out there. What are your challenges in sustainability and how are you meeting them? Whether with software or otherwise, let me know.
What do energy materials and pharmaceuticals research have in common? They both stand to benefit from the major improvements in quantum tools now available in Materials Studio 5.5.
Supporting the energy and functional materials field is a new method in Materials Studio’s DMol3 that’s all about light: excited states, UV/Vis spectra, and non-linear optical properties. The measurement of these properties is essential in the design of new energy-efficient lighting as well as alternative energy generation, such as photovoltaics. DMol3 optical properties are based on time-dependent density functional theory (TD-DFT). Some of you may know, or perhaps vaguely remember, that the Schroedinger equation is time dependent, so that’s the equation being solved here, rather than the usual static approximation. As you can imagine, this can be costly, but I am happy to say that the implementation is actually quite fast (keeping computing costs down) and accurate (making experimentation faster and more efficient).
There are many more applications of DMol3, for example materials for holographic discs, which have simply amazing storage capacity. Another application area which I find intriguing is in life sciences, for example helping to understand vision better, so I’ll be interested to see where DMol3 optical properties will be used next.
Talking about life sciences, another new quantum feature I am really excited about could have big implications for the development of drug molecule candidates into dosage forms. The crystallization and polymorphism of drug molecules needs to be well understood, and modelling has been used for a while to support that work. Polymorph prediction remains tricky however, for a number of reasons. Sampling all the possible structures is a tall order, particularly if the molecules are flexible, as many new drug molecules are. Also, it may not be sufficient to consider the thermodynamics of the structures, as crystallization kinetics can play a major role.
However, these considerations are in a way academic if one cannot even optimize structures and rank them by energy with high accuracy. One of the key reasons why this has been so difficult is that both classical and quantum methods have significant shortcomings in this type of application. The force-fields used in classical methods either neglect or at best approximate electronic effects such as polarization, which are often important in drug crystals. Density functional quantum methods on the other hand do that well, but are less effective at capturing more long range, dispersion interactions. Now in Materials Studio 5.5, DMol3 and CASTEP have dispersion correction terms to overcome this issue. With some pioneering work in the literature that demonstrated the power of combining density functional and dispersion methods, I look forward to many great applications of this technology.
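To give a flavor of how pairwise dispersion corrections work, here's a sketch of the form popularized by Grimme (DFT-D2): an attractive -C6/r^6 term per atom pair, switched off at short range by a damping function so it doesn't interfere with the density functional's own short-range description. The C6 coefficient and radius below are illustrative placeholders, not the tabulated parameters used by DMol3 or CASTEP.

```python
import math

def damping(r, r0, d=20.0):
    """Fermi-type damping function: ~0 at short range (where DFT
    already describes the interaction), ~1 at long range."""
    return 1.0 / (1.0 + math.exp(-d * (r / r0 - 1.0)))

def e_disp_pair(r, c6, r0, s6=1.0):
    """Dispersion energy for one atom pair (attractive, so negative).

    E = -s6 * C6 / r^6 * f_damp(r); s6 is a functional-dependent
    global scale factor.
    """
    return -s6 * c6 / r ** 6 * damping(r, r0)

# Illustrative pair: C6 and r0 are made-up round numbers.
e = e_disp_pair(r=4.0, c6=30.0, r0=3.0)
print(e)  # small and negative: a weak long-range attraction
```

The total correction is just this term summed over all atom pairs, which is why it adds essentially nothing to the cost of a DFT calculation while restoring the long-range attraction that standard functionals miss.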
Fermi surface of yttrium, a rare-earth material used in lasers, displays, etc.
Dr Johan Carlsson, Accelrys Contract Research Scientist
The final presentation in our recent contract research webinar series features work on graphene, which has been described recently as the “ultimate” material for next-generation nanoelectronics development. I asked Johan Carlsson from the Accelrys contract research team to provide some background on this nanomaterial. Attend the webinar to see how simulation methods have helped and are helping unravel some of graphene’s secrets.
Graphene is one of the most interesting nano-materials at the moment [1,2]. With all the hype, it might seem as though graphene is a completely new material discovered just recently. And in a sense this is correct.
When I started to work on carbon materials some eight years ago, graphene was just an imaginary model system. Back then the excitement about the fullerenes was being replaced by what I call the “nanohype.” All the fashionable forms of carbon suddenly needed to have names including the buzzword “nano.” So we had nanohorns, nanofoams, nanoporous carbon, and, of course, nanotubes.
These fashion changes indicate the cyclic nature of carbon science. During the last 25 years, a new form of carbon has been discovered every five to six years. The fullerenes were discovered in 1985, the nanotubes in 1991, and, in 1998, graphynes were predicted. That same year, thin graphite layers were grown on top of SiC wafers—what might today be referred to as the first few layers of graphene. However, it took another cycle of carbon science before the graphene hype really took off, when researchers at the University of Manchester managed to extract individual graphene sheets from a graphite crystal in 2004. It’s actually about time for another carbon allotrope to emerge on the scene. Graphanes (graphene sheets fully saturated by hydrogen) have potential, but they are not strictly pure carbon structures. We’ll have to see if some other new carbon material will emerge soon.
Yet while humans have only discovered the potential of graphene recently, graphene sheets have always been available in nature as the two-dimensional layers that are the building blocks of graphite. They’ve just not been accessible, as individual graphene sheets have been very difficult to isolate. The barrier to graphene synthesis was perhaps more mental than technical, as the method that finally succeeded was incredibly simple. The researchers in Manchester had the ingenious idea to use adhesive scotch tape to lift off the graphene sheets from a graphite crystal! Of course, this method didn’t scale industrially, and since this breakthrough a number of alternative methods have been quickly developed.
Graphene is now a mature material. Perhaps the maturation of graphene has represented the next step in the cyclic carbon evolution? Anyway, the progress in this field has been rich because graphene science straddles two different communities: the nanotube community and the low dimensional semiconductor community. Graphene has been proposed for a variety of applications in diverse fields, including field effect transistors with graphene channels , gas sensors , and solar cells . It’s even attracted interest in life science as an active membrane that can separate different molecules out of a solution .
Interestingly, the progress in graphene science is to a large extent driven by theoretical predictions and results from simulations. Graphene has been the model system of choice for theoretical investigations of sp2-bonded carbon materials. As such it has been the starting point for studies of graphite, fullerenes, and nanotubes. Many properties of graphene were known from a theoretical point of view before it was possible to perform actual measurements. The electronic structure of graphene, for instance, has been known from theoretical calculations performed more than 60 years ago. Only recently has it been possible to actually measure the band structure of individual graphene sheets. Similarly, a substantial amount of our knowledge about the structure and properties of point defects and edges of graphene was first obtained by simulations and later confirmed by experiments. This shows that simulation and experiment go hand in hand, and theoretical methods will continue to play a major role in the further development of graphene and other materials.
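That 60-year-old result is remarkably easy to reproduce today. Below is a minimal sketch of the nearest-neighbor tight-binding model from Wallace's 1947 paper: the pi-band energy is E(k) = ±t|f(k)|, where f(k) sums phase factors over the three nearest-neighbor vectors. The hopping value t = 2.7 eV is a commonly quoted figure, and the C-C distance is set to 1 for simplicity.

```python
import cmath
import math

a = 1.0   # C-C distance (arbitrary units)
t = 2.7   # eV; a commonly quoted nearest-neighbour hopping value

# The three vectors from a carbon atom to its nearest neighbours.
deltas = [(a, 0.0),
          (-a / 2, a * math.sqrt(3) / 2),
          (-a / 2, -a * math.sqrt(3) / 2)]

def band_energy(kx, ky):
    """Magnitude of the pi-band energy at wavevector (kx, ky)."""
    f = sum(cmath.exp(1j * (kx * dx + ky * dy)) for dx, dy in deltas)
    return t * abs(f)

gamma = band_energy(0.0, 0.0)   # 3t at the zone centre
dirac = band_energy(2 * math.pi / (3 * a),
                    2 * math.pi / (3 * math.sqrt(3) * a))  # ~0 at K
```

Evaluating the model at the K point gives zero energy: the famous Dirac point where valence and conduction bands touch, predicted on paper decades before anyone could measure it on an isolated sheet.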
1. A. K. Geim and K. S. Novoselov, Nature Materials 6, 183 (2007).
2. A. K. Geim, Science 324, 1530 (2009).
3. H. W. Kroto, J. R. Heath, S. C. O'Brien, R. F. Curl, and R. E. Smalley, Nature 318, 162 (1985).
4. S. Iijima, Nature 354, 56 (1991).
5. N. Narita, S. Nagai, S. Suzuki, and K. Nakao, Phys. Rev. B 58, 11009 (1998).
6. I. Forbeaux, J.-M. Themlin, and J.-M. Debever, Phys. Rev. B 58, 16396 (1998).
7. K. S. Novoselov, A. K. Geim, S. V. Morozov, D. Jiang, Y. Zhang, S. V. Dubonos, I. V. Grigorieva, and A. A. Firsov, Science 306, 666 (2004).
8. J. O. Sofo, A. S. Chaudhari, and G. D. Barber, Phys. Rev. B 75, 153401 (2007).
9. Y.-M. Lin, K. A. Jenkins, A. Valdes-Garcia, J. P. Small, D. B. Farmer, and P. Avouris, Nano Lett. 9, 422 (2009).
10. F. Schedin, A. K. Geim, S. V. Morozov, E. W. Hill, P. Blake, M. I. Katsnelson, and K. S. Novoselov, Nature Materials 6, 652 (2007).
11. X. Wang, L. Zhi, and K. Mullen, Nano Lett. 8, 323 (2008).
12. S. Garaj, W. Hubbard, A. Reina, J. Kong, D. Branton, and J. A. Golovchenko, Nature 467, 190 (2010).
13. P. R. Wallace, Phys. Rev. 71, 622 (1947).
What are critical problems in alternative energy research? How does modeling play a role in bringing us closer to answers?
A recent review article on this topic by long-time associate Prof. Richard Catlow et al. caught my attention. Readers of this blog will be familiar with our many posts pertaining to 'green chemistry,' sustainable solutions, and the like. Last month, Dr. Misbah Sarwar of Johnson Matthey was featured in a blog and delivered a webinar on the development of improved fuel cell catalysts. Dr. Michael Doyle has written a series on sustainability. Drs. Subramanian and Goldbeck-Wood have also blogged on these topics, as have I. All of us share a desire to use resources more responsibly and to ensure the long-term viability of our ecosphere. This will require the development of energy sources that are inexpensive, renewable, non-polluting, and CO2 neutral. Prof. Catlow provides an excellent overview of the applications of molecular modeling to R&D in this area. Read the paper for a very comprehensive set of research problems and case studies, but here are a few of the high points.
Hydrogen production. We hear a lot about the "hydrogen economy," but where is all this hydrogen going to come from? Catlow's review discusses the generation of hydrogen from water. Research challenges include developing photocatalysts capable of splitting water using sunlight.
Hydrogen storage. Once you've created the hydrogen, you need to carry it around. Transporting H2 as a compressed gas is risky, so most solutions involve storing it intercalated in a solid material. LiBH4 is a prototypical example of a material that can reversibly store and release H2, but the process is too slow to be practical.
Light absorption and emission. Solar cells hold particular appeal, because they produce electricity while just sitting there (at least in a place like San Diego; I'm not so sure about Seattle). One still needs to improve conversion efficiency and worry about manufacturing cost, ease of deployment, and stability (with respect to weathering, defects, aging, and so forth).
Energy storage and conversion. Fuel cells and batteries provide mobile electrical power for items as small as hand-held devices or as large as automobiles. Catlow and co-workers discussed solid oxide fuel cells (SOFC) in their paper.
The basic idea with modeling, remember, is that we can test a lot of materials for less cost and in less time than with experiment alone. Modeling can help you find materials with the optimal band gaps for the capture and generation of photoelectric energy. It can tell us the thermodynamic stability of these new materials: can we actually make them, and will they stick around before decomposing?
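As a toy illustration of that kind of screen, the sketch below keeps only candidates whose band gap falls in a photovoltaic-friendly window and whose formation energy suggests they can actually be made. The materials, property values, and thresholds are all invented for illustration; a real screen would pull these numbers from DFT calculations.

```python
# Hypothetical candidate materials with computed properties.
candidates = [
    {"name": "A", "band_gap_eV": 1.4, "formation_energy_eV": -0.8},
    {"name": "B", "band_gap_eV": 3.2, "formation_energy_eV": -1.1},
    {"name": "C", "band_gap_eV": 1.2, "formation_energy_eV": +0.3},
    {"name": "D", "band_gap_eV": 1.6, "formation_energy_eV": -0.5},
]

def screen(materials, gap_window=(1.1, 1.8), max_formation=0.0):
    """Keep candidates with a band gap inside the target window and a
    formation energy low enough to suggest they are makeable."""
    lo, hi = gap_window
    return [m for m in materials
            if lo <= m["band_gap_eV"] <= hi
            and m["formation_energy_eV"] <= max_formation]

leads = screen(candidates)
print([m["name"] for m in leads])  # only A and D pass both filters
```

Here B fails on its gap (too wide to absorb much sunlight) and C fails on stability, which is exactly the double filter described above: right electronics, and actually synthesizable.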
Simulation might not hit a home run every time, but if you can screen out, say, 70% of the bad leads, you've saved a lot of time and money. And if you're interested in saving the planet, isn't it great if you can do it using fewer resources?
Check out some of my favorite resources on alternative energy, green chemistry, and climate change.
3D Pareto surface shows the tradeoffs among target properties: dipole moment, chemical hardness, electron affinity. The optimal leads are colored red, poor leads blue.
How do you search through 10^6 materials to find just the one you want? In my very first blog post, "High-Throughput - What's a Researcher to Do?", I discussed some ideas. The recent ACS meeting had a session devoted to doing just that for materials related to alternative energy, as I wrote here and here.
My own contribution was work done with Dr. Ken Tasaki (of Mitsubishi Chemicals) and Dr. Mat Halls on high-throughput approaches for lithium ion battery electrolytes. This presentation is available now on Slideshare (a really terrific tool for sharing professional presentations).
We used high-throughput computation and semi-empirical quantum mechanical methods to screen a family of compounds for use in lithium ion batteries. I won't repeat the whole story here; you can read the slides for yourselves, but here are a couple of take-away points:
Automation makes a big difference. Obviously automation tools make it a lot easier to run a few thousand calculations. But the real payoff comes when you do the analysis. When you can screen this many materials, you can start to perform interesting statistical analyses and observe trends. The 3D Pareto surface in the accompanying image shows that you can't optimize all the properties simultaneously - you need to make tradeoffs. Charts like this one help you understand the tradeoffs and make recommendations.
Don't work any harder than you need to. I'm a QM guy and I like to do calculations as accurately as possible. That isn't always possible when you want to study 1000s of molecules. Simply looking through the literature let us know that we could get away with semi-empirical methods.
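For the curious, the Pareto surface mentioned above boils down to a non-dominated filter: a lead sits on the surface if no other lead beats it on every property at once. Here's a minimal sketch with invented property values (the real analysis was done with Pipeline Pilot, not this snippet); all three properties are treated as maximize for simplicity.

```python
def dominates(p, q):
    """True if p is at least as good as q on every property and
    strictly better on at least one (all properties maximized)."""
    return (all(a >= b for a, b in zip(p, q))
            and any(a > b for a, b in zip(p, q)))

def pareto_front(points):
    """Keep the points that no other point dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Hypothetical leads: (property1, property2, property3), all maximize.
leads = [
    (0.9, 0.2, 0.5),
    (0.5, 0.5, 0.5),
    (0.4, 0.4, 0.4),   # strictly worse than the point above -> dropped
    (0.1, 0.9, 0.6),
]
front = pareto_front(leads)
print(front)  # the three non-dominated leads
```

Coloring the `front` points red and the rest blue is all it takes to reproduce the kind of picture shown in the accompanying image; the shape of the surface then tells you directly how much of one property you must trade for another.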
Enjoy the Slideshare, watch for more applications of automation and high-throughput computation, and let me know about your applications, too.
“Chemistry for a Sustainable World” was the theme of the ACS spring meeting. My own interests, of course, are in modeling, and more recently in high-throughput computation. The Monday morning session of CINF was devoted to the application of this to electronic and optical materials. The presentations included both methodological approaches to materials discovery and applications to specific materials, with a focus on materials for alternative energy.
Prof. Aspuru-Guzik's project uses high-throughput computation to help find the best possible molecules for organic photovoltaics (to provide inexpensive solar cells) and for the polymer membranes used in fuel cells for electricity generation, and to determine how best to assemble those molecules into working devices.
Uniquely, the project uses the IBM World Community Grid to sort through the myriad materials and perform these long, tedious calculations. You may remember this approach from SETI@home, which was among the first to try it. This gives everyone the chance to contribute to these research projects: you download software from the project site, and it runs on your home computer like a screensaver; when your machine is idle, it’s contributing to the project. Prof. Aspuru-Guzik said that some folks are so enthusiastic that they actually purchase computers just to contribute resources to the project. It’s great to see this sort of commitment from folks who can probably never be individually recognized for their efforts.
I don’t want to make this blog too long, so great talks by Prof. Krishna Rajan (Iowa State), Prof. Geoffrey Hutchison (U of Pittsburgh), and Dr. Berend Rinderspacher (Army Research Labs) will be covered in the next blog.
I was also quite happy to see that some of the themes in my presentation were echoed by the others – so I’m not out in left field after all! I’ll blog about my own talk later on, but here’s a quick summary: Like Prof. Aspuru-Guzik’s work, we used high-throughput computation to explore new materials, but we were searching for improved Li-ion battery electrolytes. We developed a combinatorial library of leads, set up automated computations using Pipeline Pilot and semi-empirical VAMP calculations, and examined the results for the best leads. Stay tuned for a detailed explanation and a link to the slides.
And keep an eye out, too, for the second part of Michael Doyle's blog on Sustainability.
There were several invited talks on semiconductors and catalyst nanoparticles, in addition to my talk on alternative energy. Many of the speakers discussed the suitability of particular simulation approaches for specific applications, while others discussed the most recent state-of-the-art theoretical advances for tackling real problems at several timescales. It is particularly challenging when simulations are used not just to gain insight into a system but also as a predictive tool for virtual screening. While virtual screening is a well-established art in the world of small-molecule drug discovery, it is only now gaining traction in the materials world.
I've enjoyed 2 days so far in Nagoya attending the "Theory Meets Industry" conference. There is some amazing work going on by both developers of computational methods and those who apply them. We've heard from developers like Bernard Delley on his recent work on TDDFT in DMol3, which will enable excited-state calculations and UV spectra. We've also heard from Georg Kresse about his recent work on the Random Phase Approximation (RPA), which offers a way to improve not just DFT band gaps but total energies as well.
There's been an emphasis on alternative energy from the industrial participants. The applications are really diverse:
Radiation damage in reactor containment materials by Christophe Domain of EDF
Hydrogen storage materials by Pascal Raybaud of IFP
This list also reflects the true international spirit of the conference.
I've also heard some interesting new approaches to doing calculations fast while not sacrificing accuracy. Gabor Csanyi of Cambridge University presented his Gaussian Approximation Potentials (GAP), an alternative to force fields that spans more of the potential energy surface. And Isao Tanaka of Kyoto University showed how he uses an improved Cluster Expansion method to study phase transitions. Keep your eye on these methods for future developments.
Today I make my own small contribution by presenting my work on high-throughput computation. Look for details on that in a future blog.
A number of recent (and not-so-recent) initiatives in Congress are designed to encourage production of ethanol as an alternative fuel, but how much is really feasible and how much is catering to eco-hype? A search of congressional bills for the 111th Congress turns up an astonishing 1,592 bills relating to “energy” and 150 to “renewable energy.” These bills do everything from providing tax credits for growing corn, to funding development of production facilities, to providing tax credits for consumers. But which options really make sense?
The debate over ethanol from corn has been going on for a while, and judging from the available information, there’s plenty to be concerned about. Patzek and Pimentel have published numbers suggesting that such production is a net energy loss. This is supported by an EPA report summarized in Chemical & Engineering News (C&EN), May 11, 2009 [sorry, you’ll need a subscription to read that]. A 2007 energy law set a production target of 36 billion gallons of biofuels by 2022. As reported in C&EN, the law requires a full life-cycle analysis that “reflects a growing concern that ethanol may result in higher CO2 emissions due to land-use practices, such as clearing rain forest…” And another recent C&EN article discussed the potholes on the road to commercial biofuels. According to that article, of the six cellulosic ethanol projects to receive DOE grants in 2007, none has been built, although one is under construction.
Despite the differences between the optimists and the pessimists, I think they agree on one thing: the need for higher efficiency. Given the current efficiencies of biofuel production, internal combustion engines, and fuel cells, biofuels can’t reach the goals that we’ve set for them (e.g., 10% of electricity from renewable sources by 2012, and 25% by 2025). What is unquestionably needed is more fundamental research. Here are some of my favorite recent high points:
Similar work was presented by Robert Andersson at the 21st North American Catalysis Society meeting earlier this year. That work converted biomass to syngas (H2/CO) and subsequently to alcohols using heterogeneous catalysis.
These are just a few examples of the many fundamental advances that will be required to make biofuel sustainable and commercially viable.
Scientists regularly cry out for more fundamental research funding at the start of each federal budget cycle. The American Recovery and Reinvestment Act (ARRA) of 2009 provides $4.6 billion in DOE grants for basic R&D. The latest congressional omnibus bill provides $151.1 billion in federal R&D, an increase of $6.8 billion, or 4.7 percent, above the FY 2008 level. This is a really good start. Let’s make sure we use the money wisely.