
Accelrys Blog

22 Posts tagged with the materials tag

Materials Studio 7.0: More Science, More Applications…

 

And so another year of development comes to a close and we have released Materials Studio 7.0! From my perspective, this is a pretty exciting release because, after Materials Studio 6.1 - which was mostly focused on re-architecting Materials Visualizer - the focus for this release is back on the science. We also used milestones in this release to gather very early feedback from a small set of customers, which worked really well (a big thanks to those who contributed their time!).

 

So what did we deliver in this release? Well, a year is a long time, and there are lots of interesting new pieces of functionality alongside performance and usability enhancements.

 

Way back in late 2012, we held a Materials Science and Technology Forum in India. This was the first meeting of its kind in India and was a great opportunity to meet face to face with our customers. One message that came up again and again concerned performance. The main focus was DMol3, with customers saying "it's fast on one core but just doesn't scale well enough". Modern computer architectures have changed a lot since DMol3 was last parallelized, so it was time for a re-think. Our DMol3 expert already had many ideas about how to improve things, and he succeeded spectacularly: we now have good scaling up to 64 cores. So check out the scaling, throw more cores at your calculations and get the results faster. It might scale beyond 64 cores, but that is all we have in house!

 

There was a similar message on the performance of Forcite Plus. While we could handle charge-neutral molecules quickly using charge groups, for charged systems like surfactants and ionic liquids you had to use the Ewald summation, which scaled well but was slow. After quite a bit of deep deliberation, we have added particle-particle particle-mesh Ewald (P3M). This is much faster than standard Ewald for larger systems and gives access to calculations on hundreds of thousands of atoms.
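As a rough rule of thumb (these are the textbook scaling arguments, not benchmark numbers from this release), the advantage comes from how the two methods scale with the number of atoms N:

    T_{Ewald} \sim O(N^{3/2}), \qquad T_{P3M} \sim O(N \log N)

so by the time you reach hundreds of thousands of atoms, the mesh-based method wins by a wide margin.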

 

Also, in 2012, we sent out a large survey which many of you completed (in fact, over 400 people filled it out, which was brilliant!). We asked you to rate different properties by their level of importance to you. From this, solubility came out as the number one property of interest, linked closely with free energy calculations. In response, in Materials Studio 7.0 we have added a new Solvation Free Energy task to Forcite Plus, enabling you to use Thermodynamic Integration or the Bennett Acceptance Ratio to calculate free energies of solvation.
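For those who haven't met these methods: thermodynamic integration couples the solute to the solvent through a parameter \lambda and integrates the ensemble-averaged derivative of the Hamiltonian (the standard textbook expression, not a description of the Forcite Plus internals):

    \Delta G_{solv} = \int_0^1 \left\langle \frac{\partial H(\lambda)}{\partial \lambda} \right\rangle_\lambda d\lambda

The Bennett Acceptance Ratio estimates the same quantity by optimally combining forward and reverse perturbation data between neighboring \lambda windows, typically with lower variance for the same amount of sampling.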

 

Another important area from the survey was improving the prediction of properties for electronic materials. In collaboration with TiberLab and the University of Bremen, we have added non-equilibrium Green's functions (NEGF) to DFTB+, enabling the calculation of electron transport. This addition has also meant extending Materials Visualizer with a set of tools for building electrodes and transport devices. Coupled with the new DFTB+ task, you can calculate transmission spectra and current-voltage curves for a range of molecular electronic devices.
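In the NEGF picture, the current-voltage curve comes from a Landauer-type expression (quoted here in its textbook form; this is standard transport theory rather than anything implementation-specific):

    I(V) = \frac{2e}{h} \int T(E,V) \left[ f_L(E) - f_R(E) \right] dE

where T(E,V) is the transmission through the device and f_L, f_R are the Fermi functions of the left and right electrodes, so the transmission spectra and current-voltage curves mentioned above are two sides of the same calculation.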

 

These are just some of the highlights of the release. I don’t even have space to mention new forcefields, barostats, mechanical properties from DMol3, optimization of excited states in CASTEP or the uber-usability enhancement of “copy script”. I guess you will just have to listen to the webinar and find out more about how Materials Studio 7.0 can accelerate your modeling and impact your work.

1,249 Views Permalink Categories: Modeling & Simulation Tags: materials_studio, materials_visualizer, forcite, classical_simulation, dmol3, polymer, catalysis, materials, nanotechnology, atomic-scale-modeling, innovation, quantum-chemistry

As a chemist, organometallic chemistry was one of my favorite areas of science because it combined the synthetic elegance of traditional organic chemistry with the rational ordering and geometric beauty of inorganic systems. In a way, it's the best of both worlds. Plus, there is a lot of unknown and uncharted chemical space in this branch of chemistry - from novel catalysts for stereochemical control to metal-organic medical and chemotherapeutic reagents.

 

Many of these types of complexes feature coordination bonds between a metal and organic ligands. The organic ligands often bind the metal through a heteroatom such as oxygen, nitrogen or sulphur, in which case such compounds are considered coordination compounds. Where the ligand forms a direct metal-carbon bond, the system is usually termed organometallic. These compounds are not new; many organic coordination compounds occur naturally. For example, hemoglobin and myoglobin contain an iron center coordinated to the nitrogen atoms of a porphyrin ring. Magnesium is the center of chlorophyll. In the bioinorganic area, vitamin B12 has a cobalt-methyl bond. The metal-carbon bond in organometallic compounds is generally intermediate in character between ionic and covalent.

 

One of the reasons this area of chemistry has been studied in such detail is that organometallic compounds are very important in industry: they are stable enough to handle in solution, yet polar enough to undergo useful reactions. Two obvious examples are organolithium and Grignard reagents.

 

Organometallics also find practical uses in stoichiometric and catalytic processes, especially processes involving carbon monoxide and alkene-derived polymers. Essentially all the world's polypropylene and most of its polyethylene are produced via organometallic catalysts, usually heterogeneously via Ziegler-Natta catalysis.

 

Some classes of semiconductor materials are produced from trimethylgallium, trimethylindium, trimethylaluminum and related nitrogen, phosphorus, arsenic and antimony compounds. These volatile compounds are decomposed along with ammonia, arsine, phosphine and related hydrides on a heated substrate via metal-organic vapor phase epitaxy (MOVPE), atomic layer deposition (ALD) or related chemical vapor deposition processes, and are used to produce materials for applications such as the fabrication of light-emitting diodes (LEDs).

 

So why did I choose my title about these materials? Well, scientists in the UK recently made a dramatic breakthrough in the fight against global warming, and it involves sea urchins. A chance find by Newcastle University physicist Dr. Lidija Siller as she studied the sea creature is being hailed as a potential answer to one of the most important environmental issues of modern times. She found that sea urchins use nickel particles to harness carbon dioxide from the sea to grow their "exoskeleton": they naturally complex CO2 out of their environment and, with natural enzymes, use it to produce harmless calcium carbonate materials. This was not historically deemed practical or possible, so the breakthrough has the potential to revolutionize the way carbon, or CO2, is captured and stored, while at the same time producing a useful material that is strong, light and water resistant.

 

In other words, this process, or the ability to mimic it in industrial environments, would not only reduce carbon output but also produce a useful by-product. The discovery could significantly reduce emissions of CO2, the key greenhouse gas responsible for climate change.

 

Where Accelrys technology plays a key part is in understanding the organometallic structure: its geometry, energy barriers, electronic properties, and fluxional or dynamic behavior. Accelrys technology also allows the screening of options, choices and chemical modifications that could be made to the framework, changes that could enhance, stabilize or industrialize the natural system behind this recent discovery.

 

In the future, the application of virtual and predictive chemistry will have an elegant symmetry: the research and development of the catalysts that produce plastics and materials will be complemented by research and development in the remediation and end-of-life of those materials. Although plastics contribute to our carbon footprint, the catalysts that create them can and should be used to reduce or sequester the byproducts of those processes and materials. This would clearly increase the sustainability of the complete process.

1,240 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation, Cheminformatics Tags: chemistry, materials, doyle, catalyst, global_warming

"Necessity is the father of invention," so the saying goes. While I don't personally hold with broad sweeping generalizations, it has to be said that many of the materials innovations that drive and enable our modern world have been driven by demand, need or circumstance. Examples of this range from the glass that protects and allows our phones to have dazzling displays to the batteries that power our mobile computers, pacemakers and surgical implants that enable people to have a good quality of life. Also there are non-stick cooking pans, energy efficient compact fluorescent and led lights, artificial sweeteners, advanced anti-irritant or skin-whitening cosmetics and creams, and new medical treatments.

 

Sometimes I think that we take for granted all the advanced science and materials that enable us, for example, to cross almost half the world in relative comfort and safety; to land in a cold location with appropriate clothing to keep us comfortable and safe; to get there in a reasonable amount of time, at a reasonable price and in safety; to have a car which starts and keeps running; and to be able to call home to our families.

 

I was musing on this as I drank my cup of coffee and considered instant coffee. I am, and I admit it, a coffee snob. I like my coffee, I like it a certain way, and so for many years while travelling I have kept a constant vigil, hunting for coffee that is to my liking. With the spread of coffee chains globally this has become easier, but I still really like taking my coffee with me. This can be done with certain brands, and I remembered how instant coffee was developed for Nestle by Dr. Max Morgenthaler and launched in 1938, but only became popularised by the American GIs in the Second World War, where weight and space were at a premium. The same issues of space and weight apply to so many things in the modern world, and as I planned my trip to winter-bound Europe, I wondered about these innovations. Of course, the fact that I can now have decaffeinated coffee with caramel, in hot and cold varieties as well as a myriad range of flavors and specialty choices, is all due to technical and formulation advances and the advent of new packages, processes and capabilities.

 

For my journey to Europe, I first needed a warm coat. Historically, when people explored cold places they used large bulky coats and heavy woolen gloves. Now, with the advent of high-performance fibers such as those used in GoreTex, where the body is kept dry and sweat-free because the material breathes, letting perspiration out without letting any snow or rain in, I can have a lightweight jacket which is wearable, in the color I want, and that lasts well when out hiking or mountain biking. So in went my hiking jacket, a smart choice because the UK has recently had one of its wettest periods of weather ever. Next, it was which bag to choose to pack my gear in. Suitcases and rolling bags have changed out of all recognition in the last decade thanks to polymers, plastics and composites. They have become lighter and more resistant to abrasion and damage. They have become more colorful and easier to open thanks to advanced plastic zippers and expandable sections. In addition, the new complex materials allow the wheels to roll more smoothly and, to be honest, not break with the frequency that the older ones did. Again, the materials technology that resists such impacts with flexible deformation and includes a smoothly lubricated bearing is really quite remarkable.

 

The next stage in my thoughts was about which route to take for my journey. This, as anyone knows, is not a simple choice in winter. It involves juggling which airports get snowed in or have bad weather, which (if any) can cope, and what de-icing equipment and provisioning each has and at what cost. De-icing, which basically means removing the ice and then coating the airplane with a complex formulated mixture to prevent the onset of ice crystals, is a surprisingly hard problem. It is necessary because a buildup of ice crystals can rob a plane of its lift and control, which is not a desired state. The coating, however, has a whole series of design constraints that govern it. For example, it must be deployed easily and at low temperatures, and it must not damage the plane (for modern composite aircraft such as the Dreamliner this is harder than it would appear). It must be non-toxic, must not affect passengers with allergies, and must be environmentally benign and of low cost. These are not easily balanced requirements for a formulated product that has to remain stable for extended periods of time and, like many advanced chemical products, must be produced and deployed in many different locations with high frequency.

 
Of course I also wondered about the composite materials used in the airplane. Those made by advanced manufacturing and materials companies such as Boeing have many different and diverse requirements. They need to withstand the intense cold at altitude and function for extended periods of time; airplanes spend more time flying than they do on the ground. They need to survive lightning strikes, rain, hail, de-icing fluids and many different sources of external impact. In addition, their whole raison d'etre (lighter weight and hence better fuel-per-passenger performance) needs to be delivered. The fuel for airplanes (Jet A1) needs to be of high purity and consistent quality, and must not be affected by the fluctuation in temperatures from 35,000 feet in the air to the ground. This involves very careful chemical process design and administration. Companies such as BP Aviation and other aircraft fuel providers spend a lot of time managing changing feed and input streams to yield consistent and constant products, as well as tracking lot-to-lot quality, and they must meet stringent product safety and testing requirements.

 

Once I got to my destination, I expected to find a rental car that started and ran, and that would transport me to my hotel of choice, where I would be warm and safe in my well-lit and heated room. Those simple, common travel steps, I realised, are all triumphs of materials science, innovation and design. The fact that my car has a fuel that can flow and burn consistently at low temperatures is amazing. The petrochemical companies actually adjust the ratios of different chemistries in winter and summer fuels to aid burning and volatility at low temperatures. In some cases, such as for diesel fuel, which can gel or almost freeze at low temperatures, they add specific pour-point depressant additives and viscosity improvers. The same of course goes for engine lubricating oil, which thanks to modern synthetic materials can cover such a wide range of viscosities that people no longer need to switch from a winter to a summer oil and back again.

 

The battery of my rental car has a unit which converts the electrochemical potential of the battery into the current that drives the starter motor. It must function at high load when the whole car is probably at its coldest, so again it is a complex piece of chemical, materials and electrical engineering. In hybrid or electric vehicles, this is even more critical. Finally, I considered the tires, which hopefully would keep me on the road. These materials, which actually have a very small area in contact with the ground, whether asphalt, slush, snow or gravel, need to function in all seasons and across a whole range of temperatures, irrespective of load and speed. Tires, which are a significant contributor to a vehicle's overall energy use, have to be engineered to have the right performance characteristics while maintaining safety and keeping costs and noise within bounds.

 

At the hotel, I hoped to find a warm, well-lit room in the evenings with plenty of hot water. This requires the ability to insulate the building and manage its heat loss, and to distribute electricity for cooling or lighting in very cold or warm conditions. The ability to distribute natural gas is itself a triumph of modern science and materials development; natural gas is hard to find and harder to produce. Another critical development for the growing planet, from a sustainability perspective, is reducing the energy burden of housing. Building with advanced materials, such as self-cleaning glass and lightweight insulation, removes the need for excessive air conditioning or heating. Other amazing uses of advanced materials technology are low-energy bulbs that use less energy, provide excellent light for reading, and can be easily recycled.

 

As I finished my planning for the trip at the beginning of the year, I wondered what would change in 2013 and how travellers in the future would see things when they too reflect on the things that go unnoticed but make all of their travelling so much easier.

3,460 Views 0 References Permalink Categories: Materials Informatics, Executive Insights, Modeling & Simulation, Electronic Lab Notebook, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, forcite, dmol3, catalysis, materials, nanotechnology, atomic-scale-modeling, materials-studio, computational-chemistry, green-chemistry, quantum-chemistry

So it's January, the month of snow, cold, broken resolutions and CES, the Consumer Electronics Show. This show, a wonderland and mecca for techies, takes place in Las Vegas and features most of the world's leading electronics and gadget manufacturers. A number of their major suppliers are also there, launching products at the show.

 

One of the most talked-about products comes from Corning and is called Gorilla Glass 3 (http://www.engadget.com/2013/01/03/corning-gorilla-glass-3/). This glass material is present in a whole series of devices such as phones (including iPhones), tablet computers, televisions and notebooks. In fact, it is currently available in 975 different models or devices and has sold more than 1 billion units to date. Gorilla Glass was actually developed way back in the 1960s but mothballed until Steve Jobs needed something scratch-proof for the screen of the original iPhone. Jobs had to convince Corning that the glass could be manufactured in sufficient quantities for the iPhone's launch. It's now used in almost all high-end smartphones, regardless of brand. The key point of these glass surfaces is to be scratch, crack and damage resistant while offering amazing optical quality and low weight. Corning set the performance standards for Gorilla Glass 3 at a three-fold reduction in scratches in use, a 40% reduction in the number of visible scratches, and a 50% boost in the strength retained after a scratch occurs on the surface, which means that partially scratched devices no longer spiderweb or crack in use after the initial scratch. In short, Corning's aim was to "shatter the status quo" of glass, enabling larger, more durable, less scratched or shattered devices.

 

This process of demanding more from materials is at the heart of the technical advances that drive many advanced companies to investigate the chemical and structural nature of the materials at the heart of their new products. For example, Corning, a long-time user of Materials Studio technology from Accelrys (http://accelrys.com/resource-center/case-studies/pdf/corning.pdf), has used the technology to drive product differentiation in many different areas including nanotechnology, glass-like materials, and a number of other applications including glass coatings, glass bonding and adhesion: http://accelrys.com/resource-center/case-studies/pdf/corning-case-study.pdf.

 

So it is very interesting to see in statements before the show from David Velasquez, Gorilla Glass' director of marketing and commercial operations, that Corning's team, by studying glass' atomic structure and bonding properties, was able to "invent another kind of glass" that made the substance less brittle and less prone to scratches. Such innovation, powered by scientific understanding and insight, enables market-disrupting change and gives companies a competitive edge in an increasingly challenging and evolving market. Here is the introductory slide set that Corning used for the previous generation of Gorilla Glass (http://www.corninggorillaglass.com/), and we all look forward to their announcements and product launch at CES this week.

 

If you'd like to learn more about Accelrys Materials Studio, view our on-demand webinars that cover topics from new formulations approaches, materials degradation and much more.

3,017 Views 0 References Permalink Categories: Modeling & Simulation, Materials Informatics, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, analysis, visualization, dmol3, simulation, materials, nanotechnology, conferences, atomic-scale-modeling, materials-studio, computational-chemistry, quantum-chemistry

Over the past couple of years I've become a big fan of using 'workflow automation' to help me with my modeling projects. What exactly does that mean? Like many of my quantum mechanics colleagues, I had trouble getting my head wrapped around this idea. What I mean by 'workflow automation' is creating some kind of little computer "program" that helps me do my job faster and easier. I put "program" in quotes because you no longer need to be a programmer to create one of these. Instead you use a 'drag & drop' tool that makes it so easy that (as GEICO says) a caveman could do it. This terrific capability has been added to the commercial version of MS 6.1, and it should make life easier for lots of modelers.

 

[Figure: workflow1.png - a simple automated DFT workflow]

What happens when you need to do the same DFT calculation over & over & over again? You'll also probably need to extract one particular datum from the output files, e.g., HOMO-LUMO gap, or total energy, or a particular C-O bond length. With workflow automation you drag in a component that reads the molecular structure, another to do the DFT calculation, and a third to display the results. This is pictured in the figure. (I cheated a bit and added a 4th component to convert the total energy into Hartrees.)
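For the curious, here is the same idea spelled out in plain Python rather than drag-and-drop components; note that the "Total energy" and "HOMO-LUMO gap" line formats below are invented for illustration, so the regular expressions would need adjusting to whatever your DFT code actually writes:

    import csv, glob, re

    # Hypothetical output-file format, for illustration only; adapt the
    # patterns to the layout of your code's real output files.
    ENERGY_RE = re.compile(r"Total energy\s*=\s*(-?\d+\.\d+)\s*eV")
    GAP_RE = re.compile(r"HOMO-LUMO gap\s*=\s*(-?\d+\.\d+)\s*eV")
    EV_TO_HARTREE = 1.0 / 27.211386  # 1 Hartree = 27.211386 eV

    rows = []
    for path in glob.glob("outputs/*.out"):        # the "read structures" component
        text = open(path).read()
        energy, gap = ENERGY_RE.search(text), GAP_RE.search(text)
        if energy and gap:                         # the "extract one datum" step
            rows.append({"structure": path,
                         "energy_hartree": float(energy.group(1)) * EV_TO_HARTREE,
                         "gap_eV": float(gap.group(1))})

    with open("report.csv", "w", newline="") as f:  # the "display results" component
        writer = csv.DictWriter(f, fieldnames=["structure", "energy_hartree", "gap_eV"])
        writer.writeheader()
        writer.writerows(rows)

The point of the drag-and-drop tool is that you get all of this, plus job scheduling and error handling, without writing or maintaining any of it.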

 

Not impressed? Let's look at a more complex case. A colleague of mine was studying organic light-emitting diodes (OLEDs) based on Alq3. He created a combinatorial library of 8,436 structures. In order to characterize these he needed to compute a bunch of stuff:

  • Total energy, HOMO, and LUMO
  • Vertical ionization potential (IP)
  • Vertical electron affinity (EA)
  • Adiabatic IP
  • Adiabatic EA

This requires a total of 3 geometry optimizations (neutral, cation, and anion) plus a few extra single-point energy calculations. You then need to extract the total energies from the results and combine them to compute the IPs and EAs. Doing 1 calculation like that is no big deal, but how do you do it 8,436 times? And how do you sort through the results? Using automation, of course! Even a novice can set up the protocol in under an hour, about the time it would take to process 4 or 5 of the structures manually. Plus you can display your results in a really nice HTML report like the one pictured.
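The energy bookkeeping itself is simple; in case it helps, here it is as a little Python function (these are the standard definitions; the function and argument names are mine, not the protocol's):

    HARTREE_TO_EV = 27.211386

    def ips_and_eas(e_neutral, e_cation_vert, e_cation_opt, e_anion_vert, e_anion_opt):
        """Combine total energies (Hartree) into IPs and EAs (eV).

        '_vert' energies are single points on the ions at the neutral's
        optimized geometry; '_opt' energies are the ions at their own
        optimized geometries (hence adiabatic)."""
        return {"IP_vertical":  (e_cation_vert - e_neutral) * HARTREE_TO_EV,
                "IP_adiabatic": (e_cation_opt - e_neutral) * HARTREE_TO_EV,
                "EA_vertical":  (e_neutral - e_anion_vert) * HARTREE_TO_EV,
                "EA_adiabatic": (e_neutral - e_anion_opt) * HARTREE_TO_EV}

Multiply that by 8,436 structures and you can see why you want a protocol doing the extraction and arithmetic for you.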

 

Using this approach makes it easy to combine calculations in new ways. One protocol I wrote, for example, uses MS Polymorph Predictor to determine the lowest energy crystal structures of a molecule, but before starting the calculation it determines the atomic partial charges with density functional theory (DFT). And once it's done, it can use DFT again to refine the predictions. Another combines MS Amorphous Cell and MS Forcite to automate the computation of the solubility parameters of polymers.

 

You've been able to generate these sorts of workflows for quite some time using Pipeline Pilot. So what's new? Now Materials Studio customers get a license to these tools, and they can call their protocols from within the MS GUI. The combination of Pipeline Pilot and Materials Studio means you can create sophisticated workflows easily, without the complexity of Perl scripting, and you can launch the protocols from MS and get the results back into it.

 

[Image: ProtocolDialog.png]

Share and share alike

The final point I'd like to make about these protocols is how easy they are to share. Simply place the protocol in a shared folder on the server and anybody can run jobs. There's no need for them to download a copy or configure anything. If they have an MS Visualizer then they can use your protocol. This lets experienced modelers create reliable, repeatable workflows that they can share with non-modelers. Perhaps more interesting to expert modelers, we can learn from each other and share advanced workflows.

 

I think it'd be awesome if we had a section on the Materials Studio community pages where we exchange protocols. I'll kick off the process by loading my Polymorph Predictor protocol. I'll let you know when that's ready. Stay tuned.

 

In the meantime, let me know what types of protocols you'd like to see. What are the things that you spend the most time on? What things could be streamlined, automated? Don't be shy: post a note to this blog and let everybody know what you think.

812 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: pipeline_pilot, materials_studio, dft, materials, polymorphs, automation

Today at the Materials Research Society meeting in Boston, Cyrus Wadia, the Assistant Director for Clean Energy and Materials Research and Development at the White House Office of Science and Technology Policy challenged the audience to “think as a network” of collaborators, not individual researchers, and to “learn from colleagues” how to better handle large data sets.

 

The recently publicized presidential initiative for a Materials Genome aims to provide new materials for the production and technology advantage of US industry. The initiative mirrors the focus Accelrys has on integrated multi-scale modeling and laboratory-to-plant optimization. At its heart is improving how we predict and manage the innovation and design of materials, from the nano-scale right up to the process and macro production scale.

 

As many Accelrys users know, multi-scale modeling, in which the results of atomistic and micro-scale simulation are combined with larger-scale process models, high-throughput screening and testing, and laboratory information, yields valuable insights into materials design. The combination of computational and real analytical instruments, alongside well-organized laboratory data, allows models to be used at all length scales and can fairly be said to drive the innovation cycle of many companies.

 

The MGI clearly recognizes this, including the idea of developing a framework and ontologies through which information, models and data can be exchanged at different levels and on different time scales. Making information more widely available to engineers and materials scientists for collaboration will help drive more accurate and systematic materials development and usage. Accelrys is very excited about this initiative and is confident its technology will aid in accelerating the innovation lifecycle of new materials, which is a clearly stated goal of the initiative.

 

How do you feel about this initiative?

763 Views 0 References Permalink Categories: Trend Watch, Data Mining & Knowledge Discovery, Modeling & Simulation, Materials Informatics Tags: materials, nanotechnology, pipeline-pilot, atomic-scale-modeling, high-throughput, materials-studio, cloud-computing

In a recent article, David Axe of Wired Science rightfully points out many of the pros and cons and bids farewell to the NASA shuttle program, fondly calling it "the most visually impressive high-tech boondoggle in American history".

 
The space shuttle, like any iconic and classic piece of engineering and science, has a finite life and operating span. That it flew so well, for so long and in such amazing situations is a tribute to the legions of aerodynamics, space-engineering, production, aeronautical-materials and chemical engineers and scientists behind it. Just consider for one second the conditions the shuttles endured on re-entry: airflow, i.e. drag, across the shuttle's surface at over Mach 5, creating incredible forces that stress the surface, the tiles and the materials.

 
Further, this flow challenge for the shuttle's materials happens simultaneously with other challenges, such as plasma temperatures over 5000 degrees Celsius and the transition from vacuum to atmosphere. A pretty tough environment for anything to cope with, let alone fly afterwards. The fact that we as a species can conceive, design and build the system and the materials that constitute it is a tour de force of science.

 
This is just part of the amazing heritage the shuttle leaves. There is also the incredible materials science that goes into the leading-edge coatings, the tiles, the thrusters, the propellant and yes, even the solid rocket boosters. Just remember: each tile is individually made, tested, coated and affixed in place. Consider the heat cycle that the leading edge of the shuttle goes through, and how amazing our chemical understanding of heat-mediated grain segregation and zone diffusion is. These parts display a visible rainbow of beautiful colors due to the effects that extreme heat and pressure have on the alloy structure. Being able to predict and understand the performance of these parts within stringent space-flight safety limits is again a triumph of materials science and engineering.

 
So the shuttle, rather than representing a failure or waste of engineering time, truly represents the triumph of our materials science and engineering innovation.

 

As we look forward to the next decade and the new multi-purpose crew vehicle (MPCV) derived from NASA's Orion project, there are many new challenges for longer-duration missions.

 
First is thermal protection for a larger re-entry vehicle. With more astronauts the vehicle is enlarged and consequently, on atmospheric re-entry, requires more deceleration and hence more heat to be dissipated. Next, of course, is the radiation encountered on a long-haul or extended-duration mission to Mars, which requires advanced shielding to protect both the crew members and the sensitive command and control subsystems in the MPCV. Further, the attitude control and motor systems have to keep functioning after long idle periods at temperatures ranging from 100 degrees below zero to baking in the sun. And finally, the MPCV will have different missions, in different orbits and of different durations, which requires a degree of flexibility without compromising the system capabilities or its fundamental safety profile and performance.

 
For example, one might consider the use of advanced graphene-based or buckytube-based composites, such as those used in the new generation of commercial aircraft. The tradeoff is that the use of composites in exposed areas is complicated by the atomic oxygen present in certain orbits. This atomic oxygen is known, as Dr. Lackritz of Lockheed Martin said, to "chew up Bucky tubes and graphene" at an amazing rate, which only increases the design challenges for these materials.

 
If you then think about the next generation of solid rocket boosters and attitude control systems, these too need higher performance. They need to be smaller, operate for longer, survive longer (i.e., resist space radiation for longer) and have a lower environmental impact: a long list of somewhat contradictory or opposing challenges.

 
So the demand and need for advanced materials will inevitably continue; I would even be so bold as to say it will increase as our appetite for faster low-orbit communications, quicker travel and even space tourism takes off. This is the true nature of the challenge we face: increasing our expertise and knowledge of materials and advanced processing capabilities, while simultaneously increasing our knowledge capital and skill in designing and optimizing these materials, from their inception in the lab, to their use in a design system (conceptual and practical), to their integration in a production plant, to their use in space.

 
I feel that the closing of the space shuttle program is not a failure or a negative event at all. It merely marks the next chapter, the first step in the continuation of good old-fashioned scientific innovation. This innovation, I believe, will move us as a society forward to greater achievements and greater accomplishments, just as developments like penicillin and the light bulb did.

 
What do you think?

678 Views 0 References Permalink Categories: Trend Watch Tags: materials, materials-studio, materials-science, aerospace

DFT Redux Again

Posted by gxf Jan 4, 2011

In what is becoming an annual tradition, my first blog of the year is a quick summary of Density Functional Theory (DFT) applications for the past twelve months. Readers of Accelrys' blogs will be familiar with this modeling method. With this approach you can study heterogeneous catalysis on an extended (periodic) surface; predict crystal structures; or calculate elastic constants. Life science applications include small-molecule design and solvation energies. Using the modest computers that I have access to, I routinely run DMol3 calculations on systems with up to 300 atoms or so. With ONETEP and larger computers, it's easy to do computations on around 1000 atoms with DFT. This is amazing when you consider that the largest calculation I did in graduate school was around 10 atoms.

 

Some of my favorite publications from 2010:

  • Catlow, et al., "Advances in computational studies of energy materials," Phil. Trans. R. Soc. A 368, 3379 (2010) (full text). This is an excellent review of the ways that materials modeling is being applied to problems in alternative energy research. My co-bloggers and I have written quite a lot about sustainability, alternative energy, and green chemistry, so this article fits well with that very timely and topical theme.
  • Delley, "Time dependent density functional theory with DMol3," J. Phys.: Cond. Matter 22, 465501 (2010) (abstract). I wrote last year about the importance of being able to predict spectra. This long-anticipated development adds UV/visible spectrum capability to DMol3. The predictions are good and the calculations are fast. We've applied this to a bunch of cases including structure determination of zeolite catalysts as you can see here.
  • Pickard and Needs, "Ab Initio Random Structure Searching," Psi-k Newsletter 100, 42 (2010) (full text). Usually, geometry optimization starts with an approximate structure and refines it, but how do you determine the structure of a solid when you have absolutely no idea where to start? Computers and DFT methods are fast enough now that you can actually use random structures and evolutionary algorithms to determine reasonable geometries. And on top of that, they found new classes of molecular structures for N2 under pressure.

Be sure to check out the Accelrys web site for a list of all of the DMol3 and CASTEP titles of 2010. And be sure to let me know what your favorite stories of 2010 were.

 


773 Views 0 References Permalink Categories: Modeling & Simulation Tags: materials, atomic-scale-modeling, quantum-chemistry

Let it snow, let it snow...

Posted by stodd Dec 19, 2010

As the winter vacation looms nearer, I have been thinking about how I am going to get to Scotland for Christmas. At the same time, a colleague asked if I was going to write a seasonal script again this year. Merging those two streams of thought, I came up with a new season's greetings script which you can download, or see as a movie, here. Note the script requires Materials Studio 5.5 to run.

 

I wondered initially about how to make snow fall down the screen. I started off using shearing in DPD in Mesocite but that didn't really give me the flexibility I needed. One of my colleagues, Neil Spenley, recommended trying out the new electric fields that we added in Materials Studio 5.5. This was a great idea as it enabled me to switch the field directions with a lot of flexibility; hence the "wind" blows across the screen.

 

The next issue was how to keep the snow falling. As I only have a fixed number of snow beads in my structure, I had to make them fall from the sky repeatedly. To do this, I used a soft, short-range potential which, thanks to the magic of periodic boundary conditions, enables the beads to fall through the ground and back in through the top of the screen. In the last part of the script, I used close contacts to dock the snow onto the ground.
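If you haven't met periodic boundary conditions before, the trick is just modular arithmetic on the coordinates. A toy Python sketch of the idea (mine, for illustration; the real script is MaterialsScript, not Python):

    BOX_HEIGHT = 50.0  # height of the periodic cell, arbitrary units

    def fall_step(y, dy=-0.5):
        """Move a bead down by dy; beads leaving the bottom re-enter at the top."""
        return (y + dy) % BOX_HEIGHT

    y = 0.2                    # a bead just above the ground...
    for _ in range(3):
        y = fall_step(y)
        print(round(y, 2))     # 49.7, 49.2, 48.7: fresh snow from the sky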

 

The final challenge was to make the greeting appear. Initially, I fixed the bead positions, but this makes the beads appear all the time, which was a bit of an issue! Finally, I realised that I could use the restraints energy, also introduced in Materials Studio 5.5, to move the beads to anchor positions that I created from a pre-defined set of coordinates.

 

After this, it was just a matter of tuning the parameters. I experimented with a lower electric field strength, which allowed the beads to move side to side a little. However, when they hit the Lennard-Jones potential on the roof beads, they bounced back up into space – not realistic!

 

After much playing, I finally got something I was happy with – so Season's Greetings everyone! Now I have to think about next year...

861 Views 0 References Permalink Categories: Modeling & Simulation Tags: classical_simulation, materials, materials-studio, computational-chemistry

What do energy materials and pharmaceuticals research have in common? They both stand to benefit from the major improvements in quantum tools now available in Materials Studio 5.5.

 

Supporting the energy and functional materials field is a new method in Materials Studio's DMol3 that's all about light: excited states, UV/Vis spectra, and non-linear optical properties. The prediction of these properties is essential in the design of new energy-efficient lighting as well as alternative energy generation, such as photovoltaics. DMol3 optical properties are based on time-dependent density functional theory (TD-DFT). Some of you may know, or perhaps vaguely remember, that the Schroedinger equation is time dependent, so that is what is essentially solved here, rather than the usual static approximation. As you can imagine, this can be costly, but I am happy to say that the implementation is actually quite fast (keeping computing costs down) and accurate (making experimentation faster and more efficient).
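For the record, the equation in question (the textbook form, nothing specific to the DMol3 implementation) is

    i\hbar \frac{\partial}{\partial t} \Psi(\mathbf{r},t) = \hat{H}(t) \Psi(\mathbf{r},t)

and in TD-DFT it is the response of the electron density to a time-dependent perturbation, such as light, that yields the excitation energies and spectra.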

 

There are many more applications of DMol3 optical properties, for example materials for holographic discs, which have simply amazing storage capacity. Another application area which I find intriguing is in the life sciences, for example helping to understand vision better, so I'll be interested to see where DMol3 optical properties are used next.

 

Talking about life sciences, another new quantum feature I am really excited about could have big implications for the development of drug molecule candidates into dosage forms. The crystallization and polymorphism of drug molecules needs to be well understood, and modelling has been used for a while to support that work. Polymorph prediction remains tricky however, for a number of reasons. Sampling all the possible structures is a tall order, particularly if the molecules are flexible, as many new drug molecules are. Also, it may not be sufficient to consider the thermodynamics of the structures, as crystallization kinetics can play a major role.

 

However, these considerations are in a way academic if one cannot even optimize structures and rank them by energy with high accuracy. One of the key reasons this has been so difficult is that both classical and quantum methods have significant shortcomings in this type of application. The force fields used in classical methods either neglect or at best approximate electronic effects such as polarization, which are often important in drug crystals. Density functional quantum methods handle those effects well, but are less effective at capturing longer-range dispersion interactions. Now in Materials Studio 5.5, DMol3 and CASTEP have dispersion correction terms to overcome this issue. With some pioneering work in the literature already demonstrating the power of combining density functional and dispersion methods, I look forward to many great applications of this technology.
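Schematically, such corrections follow the dispersion-corrected DFT recipe from the literature (I am sketching the generic form here, not the specific parameterization shipped in the products):

    E_{total} = E_{DFT} + E_{disp}, \qquad E_{disp} = -\sum_{i<j} f_{damp}(R_{ij}) \frac{C_6^{ij}}{R_{ij}^6}

where the damping function f_{damp} switches the pairwise R^{-6} attraction off at short range, where the density functional already performs well.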

 

 

Fermi surface of yttrium, a rare-earth material used in lasers, displays, etc.

 
There is much more in the release (see the picture above, for example) and you can hear all about it at our upcoming webinars, or see it live at the European User Group Meeting later this month.

712 Views 0 References Permalink Categories: Modeling & Simulation Tags: materials, quantum-mechanics, materials-studio, alternative-energy, quantum-chemistry, spectroscopy, ab-initio-modeling, polymorphs, optics

What a pleasure to hear the news this morning that the 2010 Nobel Prize in physics has been awarded to Andre Geim and Konstantin Novoselov. My colleague Johan Carlsson referred to these researchers in his informative post yesterday on graphene. The Nobel press release highlights the extraordinarily simple technique that Geim and Novoselov used to extract graphene from ordinary graphite:

 

Using regular adhesive tape they managed to obtain a flake of carbon with a thickness of just one atom. This at a time when many believed it was impossible for such thin crystalline materials to be stable.

 

All of us in materials science at Accelrys congratulate Geim and Novoselov on their achievement. Would it be pompous to refer to our webinar Thursday on graphene as the “Nobel Webinar”? Probably. Nevertheless, we hope you’ll attend to see the role modelling and simulation have played in helping us understand this extraordinary material.

548 Views 0 References Permalink Categories: Materials Informatics Tags: graphene, materials, nanotechnology, nobel-prize

 

Dr Johan Carlsson, Accelrys Contract Research Scientist

 


The final presentation in our recent contract research webinar series features work on graphene, which has been described recently as the “ultimate” material for next-generation nanoelectronics development. I asked Johan Carlsson from the Accelrys contract research team to provide some background on this nanomaterial. Attend the webinar to see how simulation methods have helped and are helping unravel some of graphene’s secrets.

 

Graphene is one of the most interesting nano-materials at the moment [1,2]. With all the hype, it might seem as though graphene is a completely new material discovered just recently. And in a sense this is correct.

 

When I started to work on carbon materials some eight years ago, graphene was just an imaginary model system. Back then the excitement about the fullerenes was being replaced by what I call the “nanohype.” All the fashionable forms of carbon suddenly needed to have names including the buzzword “nano.” So we had nanohorns, nanofoams, nanoporous carbon, and, of course, nanotubes.

 

These fashion changes indicate the cyclic nature of carbon science. During the last 25 years, a new form of carbon has been discovered every five to six years. The fullerenes were discovered in 1985 [3], the nanotubes in 1991 [4], and, in 1998, graphynes were predicted [5]. That same year, thin graphite layers were grown on top of SiC wafers—what might today be referred to as the first few layers of graphene [6]. However, it took another cycle of carbon science before the graphene hype really took off, when researchers at the University of Manchester managed to extract individual graphene sheets from a graphite crystal in 2004 [7]. It’s actually about time for another carbon allotrope to emerge on the scene. Graphanes (graphene sheets fully saturated by hydrogens) have potential, but they are not strictly pure carbon structures [8]. We’ll have to see if some other new carbon material will emerge soon.

 

Yet while humans have only discovered the potential of graphene recently, graphene sheets have always been available in nature as the two-dimensional layers that are the building blocks of graphite. They've just not been accessible, as individual graphene sheets have been very difficult to isolate. The barrier to graphene synthesis was perhaps more mental than technical, as the method that finally succeeded was incredibly simple. The researchers in Manchester had the ingenious idea to use adhesive scotch tape to lift off the graphene sheets from a graphite crystal! [7] Of course, this method didn't scale industrially, and since this breakthrough a number of alternative methods have been quickly developed.

 

Graphene is now a mature material. Perhaps the maturation of graphene has represented the next step in the cyclic carbon evolution? Anyway, the progress in this field has been rich because graphene science straddles two different communities: the nanotube community and the low dimensional semiconductor community. Graphene has been proposed for a variety of applications in diverse fields, including field effect transistors with graphene channels [9], gas sensors [10], and solar cells [11]. It’s even attracted interest in life science as an active membrane that can separate different molecules out of a solution [12].

 

Interestingly, the progress in graphene science is to a large extent driven by theoretical predictions and results from simulations. Graphene has been the model system of choice for theoretical investigations of sp2-bonded carbon materials, and as such has been the starting point for studying graphite, fullerenes and nanotubes. Many properties of graphene were thus known from a theoretical point of view before it was possible to perform actual measurements. The electronic structure of graphene, for instance, has been known from theoretical calculations performed more than 60 years ago [13]; only recently has it become possible to actually measure the band structure of individual graphene sheets. Similarly, a substantial amount of our knowledge about the structure and properties of point defects and edges of graphene was first obtained by simulations and later confirmed by experiments. This shows that simulation and experimentation go hand in hand, and theoretical methods will continue to play a major role in the further development of graphene and other materials.
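That 60-year-old result [13] is worth writing down. Near the corners of the Brillouin zone, the tight-binding band structure worked out by Wallace becomes linear, giving the now-famous Dirac cones (standard graphene theory, quoted from the literature rather than from the webinar):

    E_{\pm}(\mathbf{k}) \approx \pm \hbar v_F |\mathbf{k} - \mathbf{K}|, \qquad v_F \approx 10^6 \text{ m/s}

so the charge carriers behave like massless relativistic particles, which underlies many of graphene's unusual electronic properties.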

Literature


[1] A. K. Geim and K. S. Novoselov, Nature Materials 6, 183 (2007).
[2] A. K. Geim, Science 324, 1530 (2009).
[3] H. W. Kroto, J. R. Heath, S. C. O'Brien, R. F. Curl and R. E. Smalley, Nature 318, 162 (1985).
[4] S. Iijima, Nature 354, 56 (1991).
[5] N. Narita, S. Nagai, S. Suzuki, and K. Nakao, Phys. Rev. B 58, 11009 (1998).
[6] I. Forbeaux, J.-M. Themlin, and J.-M. Debever, Phys. Rev. B 58, 16396 (1998).
[7] K. S. Novoselov, A. K. Geim, S. V. Morozov, D. Jiang, Y. Zhang, S. V. Dubonos, I. V. Grigorieva, and A. A. Firsov, Science 306, 666 (2004).
[8] J. O. Sofo, A. S. Chaudhari, and G. D. Barber, Phys. Rev. B 75, 153401 (2007).
[9] Yu-Ming Lin, Keith A. Jenkins, Alberto Valdes-Garcia, Joshua P. Small, Damon B. Farmer and Phaedon Avouris, Nano Lett. 9, 422 (2009).
[10] F. Schedin, A. K. Geim, S. V. Morozov, E. W. Hill, P. Blake, M. I. Katsnelson, and K. S. Novoselov, Nature Materials 6, 652 (2007).
[11] X. Wang, L. Zhi, and K. Mullen, Nano Lett. 8, 323 (2008).
[12] S. Garaj, W. Hubbard, A. Reina, J. Kong, D. Branton, and J. A. Golovchenko, Nature 467, 190 (2010).
[13] P. R. Wallace, Phys. Rev. 71, 622 (1947).

840 Views 1 References Permalink Categories: Materials Informatics, Modeling & Simulation, Trend Watch Tags: graphene, materials, nanotechnology, alternative-energy, guest-authors, semiconductor

What are critical problems in alternative energy research? How does modeling play a role in bringing us closer to answers?

 

A recent review article on this topic by long-time associate Prof. Richard Catlow et al. caught my attention. Readers of this blog will be familiar with our many posts pertaining to 'green chemistry,' sustainable solutions, and the like. Last month, Dr. Misbah Sarwar of Johnson Matthey was featured in a blog and delivered a webinar on the development of improved fuel cell catalysts. Dr. Michael Doyle has written a series on sustainability. Drs. Subramanian and Goldbeck-Wood have also blogged on these topics, as have I. All of us share a desire to use resources more responsibly and to ensure the long-term viability of our ecosphere. This will require the development of energy sources that are inexpensive, renewable, non-polluting, and CO2 neutral. Prof. Catlow provides an excellent overview of the applications of molecular modeling to R&D in this area. Read the paper for a very comprehensive set of research problems and case studies, but here are a few of the high points.

 


  • Hydrogen production. We hear a lot about the "hydrogen economy," but where is all this hydrogen going to come from? Catlow's review discusses the generation of hydrogen from water. Research challenges include developing photocatalysts capable of splitting water using sunlight.

  • Hydrogen storage. Once you've created the hydrogen, you need to carry it around. Transporting H2 as a compressed gas is risky, so most solutions involve storing it intercalated in a solid material. LiBH4 is a prototypical example of a material that can reversibly store and release H2, but the process is too slow to be practical.

  • Light absorption and emission. Solar cells hold particular appeal because they produce electricity while just sitting there (at least in a place like San Diego; I'm not so sure about Seattle). One still needs to improve conversion efficiency and worry about manufacturing cost, ease of deployment, and stability (with respect to weathering, defects, aging, and so forth).

  • Energy storage and conversion. Fuel cells and batteries provide mobile electrical power for items as small as hand-held devices or as large as automobiles. Catlow and co-workers discussed solid oxide fuel cells (SOFC) in their paper.

 

The basic idea with modeling, remember, is that we can test a lot of materials for less cost and in less time than with experiment alone. Modeling can help you find materials with the optimal band gaps for the capture and generation of photoelectric energy. It can tell us the thermodynamic stability of these new materials: can we actually make them, and will they stick around before decomposing?
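As a cartoon of what such screening looks like in practice, here is a toy Python pass with made-up numbers (not data from the review; the ~1.1-1.7 eV window is a commonly quoted sweet spot for single-junction photovoltaic absorbers):

    # Toy screening pass: keep candidates whose band gap sits in a target
    # window and whose formation energy suggests they can actually be made.
    candidates = [
        {"name": "A", "gap_eV": 0.4, "formation_eV_per_atom": -0.9},
        {"name": "B", "gap_eV": 1.3, "formation_eV_per_atom": -0.7},
        {"name": "C", "gap_eV": 1.5, "formation_eV_per_atom": +0.2},
        {"name": "D", "gap_eV": 1.6, "formation_eV_per_atom": -1.1},
    ]

    leads = [c for c in candidates
             if 1.1 <= c["gap_eV"] <= 1.7           # absorbs usefully in the solar spectrum
             and c["formation_eV_per_atom"] < 0.0]  # thermodynamically plausible

    print([c["name"] for c in leads])  # ['B', 'D']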

 

Simulation might not hit a home run every time, but if you can screen out, say, 70% of the bad leads, you've saved a lot of time and money. And if you're interested in saving the planet, isn't it great if you can do it using fewer resources?

 

Check out some of my favorite resources on alternative energy, green chemistry, and climate change.

 

514 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: catalysis, materials, atomic-scale-modeling, alternative-energy, green-chemistry, fuel-cells, hydrogen-storage, solar-cells
Accelrys is exhibiting at booth #536 at the TechConnect expo and conference in Anaheim, and the conference is teeming with researchers in the Nanotech, Microtech, Cleantech and BioNanotech areas. With separate tracks in these domains, scientists and leaders in various areas of chemistry, physics and biology have gathered to network, share and create partnerships for innovation and breakthroughs in technology.

One of the invited talks stressed how crucial it is to use masterbatch products for maximizing the performance of carbon nanostructure composite materials; the speaker showed critical differences in the extent of dispersion in such products. Another presentation dealt with nanotechnology applied to drug and biologics delivery. Other talks were about formulation, imaging agents, and carbon nanotube ink technology. Feeling at home in the world of nanotech… signing off for now.
495 Views 0 References Permalink Categories: Trend Watch Tags: chemistry, imaging, biology, materials, nanotechnology, conferences, biologics, cleantech, formulations, microtech, nanostructure, physics
3D Pareto Surface

3D Pareto surface shows the tradeoffs among target properties: dipole moment, chemical hardness, electron affinity. The optimal leads are colored red, poor leads blue.

How do you search through 10^6 materials to find just the one you want? In my very first blog post, "High-Throughput - What's a Researcher to Do?", I discussed some ideas. The recent ACS meeting had a session devoted to doing just that for materials related to alternative energy, as I wrote here and here.

My own contribution was work done with Dr. Ken Tasaki (of Mitsubishi Chemicals) and Dr. Mat Halls on high-throughput approaches for lithium ion battery electrolytes. This presentation is available now on Slideshare (a really terrific tool for sharing professional presentations).

We used high-throughput computation and semi-empirical quantum mechanical methods to screen a family of compounds for use in lithium ion batteries. I won't repeat the whole story here; you can read the slides for yourselves, but here are a couple of take-away points:


  • Automation makes a big difference. Obviously automation tools make it a lot easier to run a few thousand calculations. But the real payoff comes when you do the analysis. When you can screen this many materials, you can start to perform interesting statistical analyses and observe trends. The 3D Pareto surface in the accompanying image shows that you can't optimize all the properties simultaneously - you need to make tradeoffs. Charts like this one help you to understand the tradeoffs and make recommendations (see the sketch just after this list for how such a Pareto set is identified).

  • Don't work any harder than you need to. I'm a QM guy and I like to do calculations as accurately as possible. That isn't always possible when you want to study thousands of molecules. Simply looking through the literature told us that we could get away with semi-empirical methods.
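Here, roughly, is how the "optimal leads" in a Pareto analysis are identified: a candidate is kept if no other candidate beats it on every property at once. A minimal sketch with toy data (my own illustration, not the screening code from the talk):

    def dominates(a, b):
        """True if a is at least as good as b everywhere and strictly better
        somewhere (properties rescaled so that higher = better)."""
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

    def pareto_front(points):
        """Return the non-dominated subset: the 'red' optimal leads."""
        return [p for p in points
                if not any(dominates(q, p) for q in points if q != p)]

    # Toy property vectors: (dipole moment, chemical hardness, electron affinity).
    screened = [(0.9, 0.2, 0.5), (0.4, 0.8, 0.6), (0.3, 0.1, 0.2), (0.9, 0.3, 0.5)]
    print(pareto_front(screened))  # [(0.4, 0.8, 0.6), (0.9, 0.3, 0.5)]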

Enjoy the Slideshare, watch for more applications of automation and high-throughput computation, and let me know about your applications, too.

669 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: materials, atomic-scale-modeling, high-throughput, lithium-ion-batteries, materials-studio, alternative-energy, green-chemistry, virtual-screening