
Accelrys Blog

38 Posts tagged with the materials-studio tag

"Necessity is the father of invention," so the saying goes. While I don't personally hold with broad sweeping generalizations, it has to be said that many of the materials innovations that drive and enable our modern world have been driven by demand, need or circumstance. Examples of this range from the glass that protects and allows our phones to have dazzling displays to the batteries that power our mobile computers, pacemakers and surgical implants that enable people to have a good quality of life. Also there are non-stick cooking pans, energy efficient compact fluorescent and led lights, artificial sweeteners, advanced anti-irritant or skin-whitening cosmetics and creams, and new medical treatments.

 

Sometimes I think we take for granted all the advanced science and materials that enable us, for example, to cross almost half the world in relative comfort and safety; to land in a cold location with clothing that keeps us comfortable and safe; to get there in a reasonable time, at a reasonable price; to find a car that starts and keeps running; and to call home to our families.

 

I was musing on this as I drank my cup of coffee and considered instant coffee. I am, I admit, a coffee snob. I like my coffee, I like it a certain way, and so for many years while travelling I have kept a constant vigil, hunting for coffee that is to my liking. With the global spread of coffee chains this has become an easier problem, but I still really like taking my coffee with me. Certain brands make this possible, and I remembered how instant coffee was invented by Dr. Max Morgenthaler for Nestle in 1938, but only became popularised by American GIs in the Second World War, when weight and space were at a premium. The same issues of space and weight apply to so many things in the modern world, and as I planned my trip to winter-bound Europe, I wondered about these innovations. Of course, the fact that I can now have decaffeinated coffee with caramel, hot or cold, in a myriad of flavors and specialty choices, is all due to technical and formulation advances and the advent of new packages, processes and capabilities.

 

For my journey to Europe, I first needed a warm coat. Historically, when people explored cold places they relied on bulky coats and heavy woolen gloves. Now, with the advent of high-performance fabrics such as Gore-Tex, which keep the body dry and sweat-free by letting moisture escape while blocking snow and rain, I can have a lightweight, wearable jacket, in the color I want, that lasts well when out hiking or mountain biking. So in went my hiking jacket, a smart choice because the UK has recently had one of its wettest periods on record. Next came the question of which bag to pack my gear in. Suitcases and rolling bags have changed out of all recognition in the last decade thanks to polymers, plastics and composites. They have become lighter and more resistant to abrasion and damage, more colorful, and easier to open thanks to advanced plastic zippers and expandable sections. The new complex materials also let the wheels roll more smoothly and, to be honest, they don't break with the frequency the older ones did. Again, the materials technology that absorbs such impacts through flexible deformation, and includes a smoothly lubricated bearing, is really quite remarkable.

 

The next stage in my thoughts was which route to take. This, as anyone who travels in winter knows, is not a simple choice. It involves juggling which airports get snowed in or suffer bad weather; which (if any) can cope; what de-icing equipment and provisioning each has; and what the cost might be. De-icing, which basically means removing the ice and coating the airplane with a formulated mixture that prevents ice crystals from forming, is a surprisingly complex business. It is necessary because a buildup of ice crystals can rob a plane of its lift and control, which is not a desired state. The coating, however, is governed by a whole series of design constraints. It must be easy to deploy at low temperatures, and it must not damage the plane (for modern composite aircraft such as the Dreamliner this is harder than it would appear). It must be non-toxic, not affect passengers with allergies, be environmentally benign and cost little. These are not easily balanced requirements for a formulated product that must remain stable for extended periods and, like many advanced chemical products, be produced and deployed in many different locations with high frequency.

 

Of course I also wondered about the composite materials used in the airplane itself. Those made by advanced manufacturing and materials companies such as Boeing have many diverse requirements. They need to withstand the intense cold at altitude and function for extended periods; airplanes spend more time flying than they do on the ground. They need to survive lightning strikes, rain, hail, de-icing fluids and many sources of external impact. And they must deliver their whole raison d'etre: lighter weight and better fuel burn per passenger. The fuel itself (Jet A-1) needs to be of high purity and consistent quality, unaffected by the temperature swings between the ground and 35,000 feet. That takes very careful chemical process design and management. Companies such as BP Aviation and other aircraft fuel providers spend a lot of time managing changing feed and input streams to yield a consistent, constant product, tracking lot-to-lot quality, and meeting stringent product safety and testing requirements.

 

Once I got to my destination, I expected to find a rental car that started and ran, and that would carry me to my hotel of choice, where I would be warm and safe in a well-lit, heated room. Those simple, common travel steps, I realised, are all triumphs of materials science, innovation and design. The fact that my car has a fuel that flows and burns consistently at low temperatures is amazing. The petrochemical companies actually adjust the ratios of different chemistries between winter and summer fuels to aid volatility and combustion in the cold. In some cases, such as diesel fuel, which can gel or almost freeze at low temperatures, they add specific pour-point depressants and viscosity improvers. The same goes for engine lubricating oil, which thanks to modern synthetic materials spans such a wide range of viscosities that people no longer need to switch between winter and summer oils.

 

The battery of my rental must convert its stored electrochemical potential into the current that drives the starter motor, at high load, when the whole car is probably at its coldest; again, a complex piece of chemical, materials and electrical engineering. In hybrid or electric vehicles, the demands are greater still. Finally, I considered the tires, which would hopefully keep me on the road. These materials, which actually have only a very small area in contact with the ground, whether asphalt, slush, snow or gravel, need to function in all seasons and across a wide range of temperatures, irrespective of load and speed. Tires, a significant contributor to a vehicle's overall energy use, have to be engineered for the right performance characteristics while maintaining safety and keeping cost and noise within bounds.

 

At the hotel, I hoped to find a warm, well-lit room in the evenings with plenty of hot water. This requires the ability to insulate the building and manage its heat loss, and to distribute electricity in very cold or warm weather for heating, cooling and lighting. The ability to distribute natural gas is a triumph of modern science and materials development; natural gas is hard to find and harder to produce. Another critical development for a growing planet, from a sustainability perspective, is reducing the energy burden of housing. Building with advanced materials, such as self-cleaning glass and lightweight insulation, removes the need for excessive air conditioning or heating. Other impressive uses of advanced materials technology are low-energy bulbs that provide excellent light for reading and can be easily recycled.

 

As I finished planning the trip at the beginning of the year, I wondered what would change in 2013, and how future travellers would see things when they, too, reflect on all that goes unnoticed but makes travelling so much easier.

3,453 Views 0 References Permalink Categories: Materials Informatics, Executive Insights, Modeling & Simulation, Electronic Lab Notebook, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, forcite, dmol3, catalysis, materials, nanotechnology, atomic-scale-modeling, materials-studio, computational-chemistry, green-chemistry, quantum-chemistry

So it's January, the month of snow, cold, broken resolutions and CES, the Consumer Electronics Show. This show, a wonderland and mecca for techies, takes place in Las Vegas and features most of the world's leading electronics and gadget manufacturers, along with many of their major suppliers, who launch products at the show.

 

One of the most talked-about products comes from Corning and is called Gorilla Glass 3 (http://www.engadget.com/2013/01/03/corning-gorilla-glass-3/). This glass is present in a whole series of devices such as phones (including iPhones), tablets, televisions and notebooks. In fact, it is currently available in 975 different models or devices and has shipped more than 1 billion units to date. Gorilla Glass was actually developed back in the early 1960s but mothballed until Steve Jobs needed something scratch-proof for the screen of the original iPhone. Jobs had to convince Corning that the glass could be manufactured in sufficient quantities for the iPhone's launch. It's now used in almost all high-end smartphones, regardless of brand. The key point of these glass surfaces is to be scratch-, crack- and damage-resistant while offering amazing optical quality at low weight. Corning set the performance targets for Gorilla Glass 3 at a threefold reduction in scratches in use, a 40% reduction in the number of visible scratches, and a 50% boost in the strength retained after a scratch occurs, which means that partially scratched devices no longer spiderweb or crack after the initial scratch. In short, Corning's aim was to "shatter the status quo" of glass, enabling larger, more durable, less scratched or shattered devices.

 

This process of demanding more from materials is at the heart of the technical advances that drive many companies to investigate the chemical and structural nature of the materials at the heart of their new products. For example, Corning, a long-time user of Materials Studio technology from Accelrys (http://accelrys.com/resource-center/case-studies/pdf/corning.pdf), has used the technology to drive product differentiation in many different areas, including nanotechnology, glass-like materials, glass coatings, and glass bonding and adhesion: http://accelrys.com/resource-center/case-studies/pdf/corning-case-study.pdf.

 

So it is very interesting to read in statements before the show, from David Velasquez, Gorilla Glass' director of marketing and commercial operations, that Corning's team, by studying the glass' atomic structure and bonding properties, was able to "invent another kind of glass" that made the substance less brittle and less prone to scratches. Such innovations, powered by scientific understanding and insight, enable market-disrupting change and give companies a competitive edge in an increasingly challenging and evolving market. Here is the introductory slide set that Corning used for the previous generation of Gorilla Glass (http://www.corninggorillaglass.com/), and we all look forward to their announcements and product launch at CES this week.

 

If you'd like to learn more about Accelrys Materials Studio, view our on-demand webinars, which cover topics from new formulation approaches to materials degradation and much more.

3,009 Views 0 References Permalink Categories: Modeling & Simulation, Materials Informatics, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, analysis, visualization, dmol3, simulation, materials, nanotechnology, conferences, atomic-scale-modeling, materials-studio, computational-chemistry, quantum-chemistry

DFT by the Numbers: 2011

Posted by gxf Jan 12, 2012

This blog post continues my annual tradition of reviewing the DFT literature for the previous year. Most users of Accelrys Materials Studio have at least heard of this method. DFT = Density Functional Theory, a quantum mechanical method for solving the electronic Schroedinger equation of molecules and crystals using a formalism based on the charge density rather than the wavefunction. You can see the Kohn-Sham equations at a number of sites on the web. These are what have been implemented in software packages such as Accelrys MS DMol3 and MS CASTEP, as well as in packages developed by other research groups.
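
For reference, here is a minimal statement of the Kohn-Sham equations (in atomic units) that such packages solve self-consistently:

\left[-\tfrac{1}{2}\nabla^{2} + v_{\mathrm{ext}}(\mathbf{r}) + \int \frac{\rho(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\,d\mathbf{r}' + v_{\mathrm{xc}}(\mathbf{r})\right]\psi_{i}(\mathbf{r}) = \varepsilon_{i}\,\psi_{i}(\mathbf{r}), \qquad \rho(\mathbf{r}) = \sum_{i}^{\mathrm{occ}}|\psi_{i}(\mathbf{r})|^{2}

All of the many-body physics hides in the exchange-correlation potential v_xc, which is the piece that each functional approximates.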

 

In my annual informal survey, the year 2011 saw a 22% increase in the number of publications with the term "DFT" in the text (as determined by full-text searches at the ACS and ScienceDirect web sites), for a total count of 12,080. DFT has been outrageously successful because it combines relatively high accuracy with low computational cost, at least low by QM standards. In addition, it works very well for periodic systems, which has extended routine QM calculations to areas such as heterogeneous catalysis.

 

This year, I'd like to highlight some applications in one of my favorite areas: alternative energy. I found a total of 2,255 citations that included DFT along with 'fuel cells,' 'batteries,' 'biofuels,' 'solar cell,' or 'photovoltaic'. That's almost 20%: Just about 1 DFT paper in 5 addressed some topic in alternative energy! Let's hear it for green modelers.

 

Some of my favorite papers from 2011:

  • Gao et al., Dynamic characterization of Co/TiO2 Fischer-Tropsch catalysis with IR spectroscopy and DFT calculations, presentation at the North American Catalysis Society meeting, Detroit, 5-10 June 2011, available on SlideShare.
  • Castrucci et al., Light harvesting with multiwall carbon nanotube/silicon heterojunctions, Nanotechnology 22 (2011) 115701. doi:10.1088/0957-4484/22/11/115701
  • Shin et al., Effect of the alkyl chain length of C70-PCBX acceptors on the device performance of P3HT:C70-PCBX polymer solar cells, J. Mater. Chem., 2011, 21, 960–967. DOI: 10.1039/C0JM02459G
  • Gaetches et al., Ab Initio Transition State Searching in Complex Systems: Fatty Acid Decarboxylation in Minerals, J. Phys. Chem. A 2011, 115, 2658–2667. DOI: 10.1021/jp200106x
  • Di Noto et al., Structure–property interplay of proton conducting membranes based on PBI5N, SiO2–Im and H3PO4 for high temperature fuel cells, Phys. Chem. Chem. Phys., 2011, 13, 12146–12154. DOI: 10.1039/c1cp20902g

 

Full disclosure: my favorites were done with Accelrys tools. You can see all the 2011 scientific publication lists for DMol3 and CASTEP on the Accelrys web site. Be sure to let me know what your favorites are.

1,751 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: materials-studio, alternative-energy, green-chemistry, quantum-chemistry

Today at the Materials Research Society meeting in Boston, Cyrus Wadia, the Assistant Director for Clean Energy and Materials Research and Development at the White House Office of Science and Technology Policy, challenged the audience to “think as a network” of collaborators, not individual researchers, and to “learn from colleagues” how to better handle large data sets.

 

The recently publicized presidential Materials Genome Initiative aims to speed the delivery of new materials that give US industry a production and technology advantage. This initiative mirrors exactly the focus Accelrys has on integrated multi-scale modeling and laboratory-to-plant optimization. At the heart of such initiatives is better management of the innovation and design of materials, from the nano scale right up to the process and macro production scale.

 

As many Accelrys users know, multi-scale modeling, in which the results of atomistic and micro-scale simulation are combined with larger-scale processes, high-throughput screening and testing, and laboratory information, yields valuable insights into materials design. The combination of computational and real analytical instruments, alongside well-organized laboratory data, puts models to work at all length scales and can be considered a driver of the innovation cycle at many companies.

 

The MGI initiative clearly recognizes this function, along with the idea of developing a framework and ontologies through which information, models and data can be exchanged at different levels and time scales. Making information more widely available to engineers and materials scientists for collaboration will help drive more accurate and systematic materials development and usage. Accelrys is very excited about this initiative and is confident its technology will help accelerate the innovation lifecycle of new materials, a clearly stated goal of the initiative.

 

How do you feel about this initiative?

757 Views 0 References Permalink Categories: Trend Watch, Data Mining & Knowledge Discovery, Modeling & Simulation, Materials Informatics Tags: materials, nanotechnology, pipeline-pilot, atomic-scale-modeling, high-throughput, materials-studio, cloud-computing

Music-ular Dynamics

Posted by stodd Nov 25, 2011

Ever wondered what molecular vibrations would sound like? JC Jones and Andy Shipway in Israel pondered this question and decided to use molecular dynamics to investigate. You can find more information about their project here and listen to the ethereal sounds of a journey along the myelin protein.

 

I always like to hear about people taking software like Materials Studio and using it to create something completely outside its original intent. When we introduced MaterialsScript to control parts of Materials Studio, I played my own games around the theme of winter, resulting in the Season's Greetings set of scripts. However, the work linked above takes abstraction to a totally new level!

 

The question is, what else do you use Materials Studio for that isn't in our user stories?

723 Views 0 References Permalink Categories: Modeling & Simulation Tags: atomic-scale-modeling, materials-studio, computational-chemistry

Turning Over Rocks to Find Oil

Posted by mdoyle Oct 13, 2011

As many of you have noticed, the price of petrol or gasoline is increasing. This is because we have reached or passed the peak of easily available and exploitable oil reserves. Of course, there are fields that become economically producible at different oil prices; however, the majority of gasoline and other petrochemical products now comes from increasingly costly and difficult reserves. Finding, quantifying, producing and operating oil reserves is a complex, multi-disciplinary process. A huge amount of science and technology is devoted to the processing and analysis of seismic and geo-seismic data. Here, however, I am focusing on the drilling, evaluation, completion and production stages of extracting oil and gas from shale.

 

Shale is the most abundant of the rock types in which oil is found, and the most abundant source of hydrocarbons for oil and gas fields. It is commonly considered the source mineral from which the majority of the oil and gas in conventional reservoirs originated. Shale is a sedimentary (layered) rock built of thin layers of clay or mud. There are many variations in shale geology, geochemistry and production mechanisms; these variations occur between wells, and even within the same shale area. No two shales are alike, and each deposit offers its own unique technical challenges and learning curves. That said, shale gas was recently characterized by Bayer Corp. CEO Gregory Babe as a "once-in-a-lifetime opportunity" for U.S. manufacturing.

 

Shale is unique in that it often contains both free and adsorbed gas. This results in initially high production rates, quick decline curves, and then long-term steady production. Formed millions of years ago from silt and organic debris that accumulated on lake beds and sea bottoms, the oil substances in oil shale are solid and cannot be pumped directly out of the ground. To be produced effectively, oil shale is first mined and then heated to a high temperature. An alternative, currently experimental process known as in situ retorting involves heating the oil shale while it is still underground and pumping the resulting liquid to the surface. When producing gas from a shale deposit, the challenge is gaining and enhancing access to the gas stored within the shale's natural fracture system.

 

The complexity of the materials' interactions that occur sub-surface, and the challenges posed by the system, require that formulators and product designers at the service companies providing drilling fluids, muds and completion fluids investigate all the complex interactions and trade-offs in formulation, recipe and ingredient space.

 

761 Views 0 References Permalink Categories: Materials Informatics, Data Mining & Knowledge Discovery, Modeling & Simulation, Electronic Lab Notebook, Trend Watch Tags: pipeline_pilot, materials-studio

In a recent article, David Axe of Wired Science rightfully points out many of the pros and cons as he bids farewell to the NASA shuttle program, fondly calling it "the most visually impressive high-tech boondoggle in American history".

 

The space shuttle, like any iconic piece of engineering and science, has a finite operating life. That the shuttle flew so well, for so long, and in such extreme situations is a tribute to legions of aerodynamicists, space engineers, production staff, and aeronautical materials and chemical engineers and scientists. Just consider for one second the conditions the shuttles endured on re-entry: airflow across the shuttle's surface at over Mach 5, creating incredible drag forces that stress the surface, the tiles and the materials.

 

Further, this flow challenge for the shuttle's materials happens simultaneously with other challenges, such as plasma temperatures over 5,000 degrees Celsius and the transition from vacuum to atmosphere. A pretty tough environment for anything to cope with, let alone fly again afterwards. The fact that we as a species can conceive, design and build such a system and the materials that constitute it is a tour de force of science.

 

This is just part of the amazing heritage the shuttle leaves. There is also the incredible materials science that went into the leading-edge coatings, the tiles, the thrusters, the propellant and, yes, even the solid rocket boosters. Just remember that each tile is individually made, tested, coated and affixed in place. Consider the heat cycle that the leading edge of the shuttle goes through, and how remarkable our chemical understanding of heat-mediated grain segregation and zone diffusion is. These parts display a visible rainbow of beautiful colors due to the effect extreme heat and pressure have on the alloy structure. Being able to predict and understand the performance of these parts within stringent space flight safety limits is again a triumph of materials science and engineering.

 

So the shuttle, rather than representing a failure or waste of engineering time, truly represents the triumph of our materials science and engineering innovation.

 

As we look forward to the next decade and the new Multi-Purpose Crew Vehicle (MPCV), derived from NASA's Orion design, there are many new challenges for longer-duration missions.

 

First is thermal protection for a larger re-entry vehicle. With more astronauts the vehicle grows, and consequently atmospheric re-entry requires more deceleration and hence more heat to be dissipated. Next, of course, is the radiation encountered on a long-haul or extended-duration mission to Mars, which requires advanced shielding to protect both the crew and the sensitive command and control subsystems in the MPCV. Further, the attitude control and motor systems have to keep working after long idle periods at temperatures ranging from 100 degrees below zero to baking in the sun. And finally, the MPCV will fly different missions, in different orbits and of different durations, which requires a degree of flexibility that must not compromise the system's capabilities or its fundamental safety profile and performance.

 

For example, one might consider the use of advanced graphene-based or buckytube-based composites, such as those used in the new generation of commercial aircraft. The tradeoff is that the use of composites in exposed areas is complicated by the atomic oxygen present in certain orbits. This atomic oxygen is known, as Dr. Lackritz of Lockheed Martin said, to "chew up Bucky tubes and graphene" at an amazing rate, which only increases the design challenge for these materials.

 

If you then think about the next generation of solid rocket boosters and attitude control systems, these too need higher performance. They need to be smaller, operate for longer, survive longer (i.e., resist space radiation better) and have a lower environmental impact: a long list of somewhat contradictory challenges.

 

So the demand and need for advanced materials will inevitably continue; I would be so bold as to say it is going to increase as our appetite for faster low-orbit communications, quicker travel and even space tourism takes off. This is the true nature of the challenge we face: increasing our expertise in materials and advanced processing while simultaneously increasing our knowledge capital and skill in designing and optimizing these materials, from their inception in the lab, to their use in a design system (conceptual and practical), to their integration in a production plant, to their use in space.

 

I feel that the closing of the space shuttle program is not a failure or a negative event at all. It merely marks the next chapter, the first step in the continuation of good old-fashioned scientific innovation. This innovation, I believe, will move us forward as a society to greater achievements, just as developments like penicillin and the light bulb did.

 

What do you think?

672 Views 0 References Permalink Categories: Trend Watch Tags: materials, materials-studio, materials-science, aerospace

In my last blog, I talked about how improved global collaboration in the Cloud is not only improving Neglected Diseases research but also the "exuberance quotient" of science. Today our current economic woes tied to the sovereign debt crisis have got me thinking about the darker cloud hovering over researchers today, one that may very well threaten "exuberant" science in the months ahead, especially in university labs.

 

There’s no way you can look at today’s economic situation and postulate that government funding of scientific research in academic labs is going anywhere but down. It stands to reason that this will change behavior and push academia to find funding alternatives for its research.

 

First, university labs will need new ways to collaborate externally, not only with colleagues at other institutions but with those at the many commercial companies that will likely end up funding more and more academic research as government sources dry up. Second, they will need viable channels for commercializing the technology they develop, so that new applications, protocols and processes emerging from university labs become readily available to the wider scientific community (while also providing a return revenue stream supporting university research). Last but not least, with university researchers under increasing pressure to publish results, secure patents and acquire grants in the face of shrinking budgets and resources, they need simplified access to affordable software and services -- and we just took steps towards that end with our recently announced academic program.

 

This new academic paradigm and resulting wish list become much more achievable when university researchers deploy their technology on a scientific informatics platform that’s already widely used in the commercial world. This provides a built-in installed base and ready market for workflows and protocols. A widely deployed platform with the ability to capture a protocol as a set of XML definitions enables scientists working in the same environment to replicate an experiment or calculation with drag-and-drop simplicity and precision. If you start with the same data set, you end with the same results. Experiments are more reproducible, academic papers more credible and, most importantly, non-experts can advance their research using robust, expert workflows.

 

Academic researchers drive innovation that impacts the larger scientific community, but getting the innovation out there is still a challenge. In this regard, an industry-standard platform can also serve as the basis for an innovative new marketplace, a kind of scientific application exchange, where academics and their partners can expose their breakthrough technologies to a wider audience—and even charge a fee for using them. In the present economy, this new channel could provide much needed additional funding and a feedback loop for academic groups, enabling them to continue their vital research.

 

What are your thoughts on surviving—and perhaps even thriving—in today’s down economy?

685 Views 0 References Permalink Categories: Executive Insights, Trend Watch Tags: discovery_studio, pipeline-pilot, materials-studio, innovation, cloud-computing, academic-research-funding

As a Product Manager, I am always interested in how we can use new technologies to improve the experience of working with Materials Studio.  After the Nintendo Wii was released, I pondered whether we could use the Wii Remote to do things like rotate and zoom into the structure. However, there were several reasons why I didn't pursue this – mainly that Nintendo does not really support using the Wii remote in this way.  Also, I thought it would be fairly limited in terms of what it could do.

 

However, recently I saw that Microsoft had released the Kinect for Windows SDK for download. Although only in beta at the moment, and limited to non-commercial use, it does provide the capability to interface a Kinect with Windows to control applications. Gestures and voice broaden the scope for interaction with the software. You could still do simple tasks like zooming, panning and rotating structures – that would be pretty cool, and some smart people have already worked with Jmol to enable this using an open-source implementation. But how about taking it further and using a gesture like touch to select different atoms, swipe to delete objects, create bonds by dragging between two atoms, etc.? What sort of tasks do you see being useful in a gesture-based world?

 

Of course, I can always get carried away with cool new ideas but back in the real world, I wonder how useful this would actually be. Do you think there is a future in gesture based control for molecular modeling applications?

768 Views 0 References Permalink Categories: Modeling & Simulation Tags: materials-studio, computational-chemistry

I was recently chatting with a customer about studying phase morphology of fuel cell membranes between enclosed surfaces. He pointed me to this paper where Dorenbos et al. used mesoscale methods to model a surface interacting with a polymer. They varied the interaction potential between a coarse-grained representation of a Nafion® 1200 polymer membrane and the surface to mimic different levels of hydrophobicity. However, it seems that the surface of the carbon catalyst support would not necessarily have a single homogeneous interaction.

 

I started to think about whether it would be easy to build a mesoscale model with idealized surfaces that interact differently with a diblock copolymer, and whether those interactions would affect the phase morphology.

 

In a simple example, I used a diblock copolymer with a fairly strong repulsive interaction between the two blocks. Initially, I performed Mesocite calculations in the bulk phase to check that I would get the expected lamellar phase (Fig 1a). I then introduced a surface with no net interaction with the diblock copolymer (Fig 1b), and one where one of the blocks has a repulsive interaction with the surface (Fig 1c). As expected, introducing the surface disrupts the lamellar structure, and a perforated lamellar phase forms. When the surface is modified so that one block repels it, the lamellar phase forms again, but this time aligned along the surface.

 


Figure 1: Bulk (a), non-interacting surface (b), and surface with repulsion of one block (c). Only one of the blocks from the diblock is displayed, with the other hidden to see the shape of the phase.

 

I also introduced two different interactions on the surface. In this case, one of the diblock components is repelled by the blue parts of the surface and the other block by the green parts. When the surface is split into two strips, a perforated lamellar phase forms with the blocks of the diblock aligning with the surfaces (Fig 2a). However, when the surface is patterned with four strips, the diblock still forms a lamellar phase, but this time the blocks align perpendicular to the strips (Fig 2b). Finally, when a circular interaction template is used, another perforated lamellar phase forms, but this time the blocks try to align with the circles (Fig 2c).

 


Figure 2: Two strips (a), four strips (b), and a single circular interaction site (c). Only one of the blocks from the diblock is displayed, with the other hidden to see the shape of the phase.

 

To attempt to characterize the effect of the surfaces, I calculated the order parameter for each system using a script from the Community. I found that as the surfaces were introduced, the order dropped from 0.19 to between 0.16 and 0.18. Although the change is relatively small, it does indicate that the surfaces affect the order of the system. You can see the evolution of the different morphologies with time in the video below.
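
As an aside for readers who want to try something similar outside Materials Studio: one common way to quantify this kind of order is the nematic order parameter, the largest eigenvalue of the Q tensor built from unit bond vectors. The Python sketch below is illustrative only; it is not the Community script used above, which may define its order parameter differently.

```python
import numpy as np

def order_parameter(vectors):
    """Nematic order parameter: largest eigenvalue of the Q tensor.

    Returns ~1 for vectors aligned along a common director,
    ~0 for an isotropic distribution.
    """
    u = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    # Q = < (3/2) u u^T - (1/2) I >, averaged over all vectors
    q = 1.5 * np.einsum("ni,nj->ij", u, u) / len(u) - 0.5 * np.eye(3)
    return np.linalg.eigvalsh(q).max()

rng = np.random.default_rng(0)
print(order_parameter(rng.normal(size=(1000, 3))))          # near 0 (isotropic)
print(order_parameter(np.tile([0.0, 0.0, 1.0], (100, 1))))  # 1.0 (fully aligned)
```

Absolute values depend strongly on which vectors you feed in, so it is the relative drop on introducing the surfaces, rather than the raw numbers, that carries the message.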

 

If you want to try calculations like this, I have posted more technical detail on how to generate the input structures, including templates for the structures used above, into the Materials Studio Community.

 

This rather superficial study demonstrates that the effect of surface templating or lithography can be simulated using mesoscale methods. Recent changes to the mesostructure building tools in Materials Studio have made it even simpler to construct complex patterned structures. What sort of surfaces and systems would you like to simulate?

796 Views 0 References Permalink Categories: Modeling & Simulation Tags: mesoscale, case-studies, materials-studio, computational-chemistry, alternative-energy, polymers

Let it snow, let it snow...

Posted by stodd Dec 19, 2010

As the winter vacation looms nearer, I have been thinking about how I am going to get to Scotland for Christmas. At the same time, a colleague asked if I was going to write a seasonal script again this year. Merging those two streams of thought, I came up with a new season's greetings script, which you can download or see as a movie here. Note that the script requires Materials Studio 5.5 to run.

 

I wondered initially how to make snow fall down the screen. I started off using shearing in DPD in Mesocite, but that didn't really give me the flexibility I needed. One of my colleagues, Neil Spenley, recommended trying out the new electric fields that we added in Materials Studio 5.5. This was a great idea, as it let me switch the field directions with a lot of flexibility; hence the "wind" blows across the screen.

 

The next issue was how to keep the snow falling. As I have only a fixed number of snow beads in my structure, I had to make them fall from the sky repeatedly. To do this, I used a soft, short-range potential which, thanks to the magic of periodic boundary conditions, lets the beads fall through the ground and back in through the top of the screen. In the last part of the script, I used close contacts to dock the snow onto the ground.
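
In spirit (and only in spirit; the real thing is MaterialsScript driving a Mesocite simulation, and the numbers here are made up), the trick is just a constant drift plus a periodic wrap:

```python
import numpy as np

box_height = 50.0
heights = np.random.uniform(0.0, box_height, size=100)  # bead y-coordinates

for step in range(1000):
    heights -= 0.1          # steady downward drift imposed by the "field"
    heights %= box_height   # periodic wrap: fall through the floor, re-enter at the top
```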

 

The final challenge was to make the greeting appear. Initially, I fixed the bead positions, but that made the beads visible all the time, which was a bit of an issue! Then I realised that I could use the restraints energy, also introduced in Materials Studio 5.5, to move the beads to anchor positions that I created from a pre-defined set of coordinates.

 

After this, it was just a matter of tuning the parameters. I experimented with a lower electric field strength, which allowed the beads to move from side to side a little. However, when they hit the Lennard-Jones potential on the roof beads, they bounced back up into space; not realistic!

 

After much playing, I finally got something I was happy with. So Season's Greetings, everyone! Now I have to think about next year...

855 Views 0 References Permalink Categories: Modeling & Simulation Tags: classical_simulation, materials, materials-studio, computational-chemistry

What Happens When DFT Gets Excited?

Posted by gxf Nov 29, 2010

At the start of this year I wrote about the importance of spectroscopy in my blog Spectroscopy: Where Theory Meets the Real World. Experimentally, spectroscopy is essential for the characterization of compounds, identification of unknowns, tracking of contaminants, in situ monitoring of chemical processes, and more. Modelers have worked hard to develop reliable ways to predict spectra from first principles.

 

Since I wrote in February 2010, Materials Studio has added Time-Dependent DFT (TD-DFT) to DMol3. This provides a means to predict optical properties such as UV/visible spectra and hyperpolarizabilities, adding to the IR, THz, and other methods that I discussed earlier. My recent webinar discussed some of the theory behind this and gave some examples. (You can download a recording, or view a PDF copy.)

 

An interesting question came up during Q&A, and I wanted to highlight it here since I think it's of general interest and serves to underscore some of the less intuitive aspects of DFT (density functional theory). There are several different approximations that one can use to predict the energy of excited states. The simplest is to use the differences in the orbital eigenvalues. The question is why this is such a poor approximation in DFT, especially given that it's a reasonably good approximation in molecular orbital (MO) approaches like Hartree-Fock or AM1.

 

The answer lies in the difference between an MO-based approach and a density-based approach. The theory of DFT is built on the charge density, not the MOs. While the eigenvalues of Hartree-Fock theory are related to the energies of the orbitals, the eigenvalues in DFT are related to, well, something else. Write down the Hartree-Fock energy expression for a molecule and for its cation, and take the difference: you get the formula for the orbital eigenvalue. This is Koopmans' theorem: ε_i = E(neutral) - E(cation) (and you can read more about it here). When you do the same thing for the DFT energy, you discover Janak's theorem: ε_i = ∂E/∂n_i. If you approximate this with finite differences, ε_i = ΔE/Δn_i, it might look a bit like Koopmans' theorem, but the finite difference turns out to be a poor approximation to the infinitesimal derivative. You can read just how bad the approximation is in my 2008 paper.
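
Side by side, the two results are:

\varepsilon_{i}^{\mathrm{HF}} = E(\mathrm{neutral}) - E(\mathrm{cation}) \quad (\text{Koopmans}) \qquad\qquad \varepsilon_{i}^{\mathrm{DFT}} = \frac{\partial E}{\partial n_{i}} \quad (\text{Janak})

and approximating Janak's infinitesimal derivative by the finite difference ΔE/Δn_i with Δn_i = 1 is precisely the step that fails badly in DFT.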

 

Fortunately for us, we now have TD-DFT. As illustrated in Delley's original paper and in my own modest tests, this is a dramatic improvement over what was possible before. It increases the usefulness of this essential modeling tool and provides another way for theory to connect to the real world of observables. Stay tuned for more results in the coming weeks.

555 Views 0 References Permalink Categories: Modeling & Simulation Tags: dft, materials-studio, quantum-chemistry, spectroscopy

What do energy materials and pharmaceuticals research have in common? They both stand to benefit from the major improvements in the quantum tools now available in Materials Studio 5.5.

 

Supporting the energy and functional materials field is a new method in Materials Studio's DMol3 that is all about light: excited states, UV/Vis spectra, and non-linear optical properties. Predicting these properties is essential in the design of new energy-efficient lighting as well as alternative energy generation, such as photovoltaics. DMol3 optical properties are based on time-dependent density functional theory (TD-DFT). Some of you may know, or perhaps vaguely remember, that the Schroedinger equation is time dependent, so that is essentially the equation solved here, rather than the usual static approximation. As you can imagine, this can be costly, but I am happy to say that the implementation is actually quite fast (keeping computing costs down) and accurate (making experimentation faster and more efficient).
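
Schematically, instead of the static eigenvalue problem, TD-DFT is rooted in the time-dependent Kohn-Sham equation (atomic units):

i\,\frac{\partial\psi_{i}(\mathbf{r},t)}{\partial t} = \left[-\tfrac{1}{2}\nabla^{2} + v_{\mathrm{eff}}[\rho](\mathbf{r},t)\right]\psi_{i}(\mathbf{r},t)

In practice, linear-response formulations extract excitation energies from the response of the density to a small perturbation rather than by explicit time propagation, which is part of why the method can be kept fast.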

 

There are many more applications of DMol3, for example materials for holographic discs, which have simply amazing storage capacity. Another application area I find intriguing is in the life sciences, for example helping us understand vision better, so I'll be interested to see where DMol3 optical properties are used next.

 

Talking of life sciences, another new quantum feature I am really excited about could have big implications for the development of drug molecule candidates into dosage forms. The crystallization and polymorphism of drug molecules need to be well understood, and modelling has been used for a while to support that work. Polymorph prediction remains tricky, however, for a number of reasons. Sampling all the possible structures is a tall order, particularly if the molecules are flexible, as many new drug molecules are. Also, it may not be sufficient to consider only the thermodynamics of the structures, as crystallization kinetics can play a major role.

 

However, these considerations are in a way academic if one cannot even optimize structures and rank them by energy with high accuracy. One key reason this has been so difficult is that both classical and quantum methods have significant shortcomings in this type of application. The force fields used in classical methods either neglect or at best approximate electronic effects such as polarization, which are often important in drug crystals. Density functional methods handle those effects well but are less effective at capturing longer-range dispersion interactions. Now, in Materials Studio 5.5, DMol3 and CASTEP have dispersion correction terms to overcome this issue. With pioneering work in the literature already demonstrating the power of combining density functionals with dispersion corrections, I look forward to many great applications of this technology.
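
Schematically, dispersion corrections of this kind add a damped pairwise term to the total energy; the precise C6 coefficients and damping function depend on the particular scheme used:

E_{\mathrm{total}} = E_{\mathrm{DFT}} - \sum_{i<j} f_{\mathrm{damp}}(r_{ij})\,\frac{C_{6}^{ij}}{r_{ij}^{6}}

The damping function switches the correction off at short range, where the density functional already describes the interaction well.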

 

[Figure: Fermi surface of yttrium, a rare-earth metal used in lasers, displays, etc.]

 

There is much more in the release (see the above picture for example) and you can hear all about it at our upcoming webinars, or see it live at the European User Group Meeting later this month.

707 Views 0 References Permalink Categories: Modeling & Simulation Tags: materials, quantum-mechanics, materials-studio, alternative-energy, quantum-chemistry, spectroscopy, ab-initio-modeling, polymorphs, optics
Offering insight from the perspective of a Pipeline Pilot and Materials Studio user, Accelrys is pleased to host a posting written by guest blogger Dr. Misbah Sarwar, Research Scientist at Johnson Matthey. Dr. Sarwar recently completed a collaboration project focused on fuel cell catalyst discovery and will share her results in an upcoming webinar. This post provides a sneak peek into her findings...   

“In recent years there has been a lot of interest in fuel cells as a ‘green’ power source of the future, particularly for use in cars, which could revolutionize the way we travel. A (Proton Exchange Membrane) fuel cell uses hydrogen as a fuel and oxygen (from air), which react to produce water and electricity. However, we are still some time away from driving fuel cell cars, as many issues must be overcome before this technology becomes commercially viable. These include improving the stability and reactivity of the catalysts as well as lowering their cost, which can potentially be achieved by alloying; but identifying the correct combinations and ratios of metals is key. This is a huge task, as there are potentially thousands of different combinations, and one where modeling can play a crucial role.

As part of the iCatDesign project, a three-year collaboration with Accelrys and CMR Fuel Cells funded by the UK Technology Strategy Board, we screened hundreds of metal combinations using plane wave CASTEP calculations.

In terms of stability, understanding the surface composition in the fuel cell environment is key. Predicting activity usually involves calculating barriers for each step of the reaction, which is extremely time consuming and not really suited to a screening approach. Could we avoid these calculations and predict the activity of a catalyst from adsorption energies or some fundamental surface property? Of course, these predictions would have to be validated, and alongside the modeling work an experimental team at JM worked on synthesizing, characterizing and testing the catalysts for stability and activity.

The prospect of setting up hundreds of calculations, monitoring them and then analyzing the results seemed quite daunting, and it was clear that some automation was required both to set up the calculations and to process the results quickly. Using Pipeline Pilot technology (now part of the Materials Studio Collection), protocols were developed to process the calculations, and statistical analysis tools were developed to establish correlations between materials composition, stability and reactivity. The results are available to all partners through a customized web interface.

The protocols have been invaluable as data can be processed at the click of a button and customized charts produced in seconds. The timesaving is immense, saving days of endless copying, pasting and manipulating data in spreadsheets, not to mention minimizing human error, leaving us to do the more interesting task of thinking about the science behind the results. I look forward to sharing these results and describing the tools used to obtain them in more detail in the webinar, Fuel Cell Catalyst Discovery with the Materials Studio Collection, on 21st July.”
488 Views 1 References Permalink Categories: Modeling & Simulation Tags: castep, catalysis, pipeline-pilot, materials-studio, webinars, fuel-cells, guest-authors, materials-studio-collection

Talkin’ about a revolution...

Posted by stodd Jul 6, 2010

As I touched on when I was musing about copy and paste, it never ceases to amaze me how much compute power has increased since I started doing modeling 15 years ago. Back then, I could minimize a few hundred atoms with classical mechanics and watch the geometry update every few seconds after each iteration. Now, I would do a few thousand molecules of the same size in a few seconds.

 

Of course, as computers get faster, the calculations that customers want to run also get larger and more “realistic.” The explosive growth of multi-core architectures has led to an increased focus on parallelization strategies that make the most efficient use of the CPU power. Traditionally, the main focus with high-performance codes has been on getting them to run in fine-grained parallel as efficiently as possible, scaling a single job over hundreds of CPUs. The issue here is that, if you are only studying a medium-sized system, you quickly reach a point where communication between nodes costs far more than the time spent in the CPUs, leading to poor performance.

 

The solution, if you are doing multiple calculations, is to run them in a coarse-grained, or embarrassingly parallel, fashion: each individual calculation runs on a single CPU, but you keep all your CPUs busy with multiple calculations. Materials Studio will do this, but it can be monotonous, although queuing systems get you part of the way. The other issue is that when you run many calculations, you also get a lot of data back to analyze. Now you need not only a way to submit and control all the jobs, but also a way to extract the results you want automatically.
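
As a generic sketch of the idea (plain Python, not the Materials Studio Collection API; the job script and file names are placeholders):

```python
import subprocess
from concurrent.futures import ProcessPoolExecutor

# Each input structure gets its own independent, serial calculation.
cells = [f"cell_{i}.xsd" for i in range(8)]

def run_one(cell):
    # Placeholder launcher for one serial job on one input file.
    subprocess.run(["./run_job.sh", cell], check=True)
    return cell

if __name__ == "__main__":
    # Coarse-grained parallelism: one job per core, no communication
    # between jobs, so throughput scales almost perfectly with cores.
    with ProcessPoolExecutor(max_workers=8) as pool:
        for finished in pool.map(run_one, cells):
            print(f"done: {finished}")
```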

 

Luckily, the Materials Studio Collection in Pipeline Pilot has both of these capabilities. You can use the internal architecture to submit jobs in both fine- and coarse-grained parallel (or a mixture of the two if you have a nice cluster!). You can also use the reporting tools to extract the data you want and display it in a report.

 

A great example is working with classical simulations of polymers to calculate solubility parameters. Here, you need to build several different cells to sample phase space and then run identical calculations on each cell. You can quickly develop the workflow and then use the coarse-grained parallel options to submit a calculation to each CPU. When the calculations finish, a report of the solubility parameter can be generated automatically.

 

So, whilst there is a definite need for an expert modeling client like Materials Studio, the Materials Studio Collection really does free you up to focus on problem solving (using Materials Studio!), not waiting for calculations to finish and then having to monotonously create reports!

511 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: atomic-scale-modeling, materials-studio, classical-simulations, polymers, materials-studio-collection, multicore, parallelization