
Accelrys Blog

16 Posts tagged with the computational-chemistry tag

"Necessity is the father of invention," so the saying goes. While I don't personally hold with broad sweeping generalizations, it has to be said that many of the materials innovations that drive and enable our modern world have been driven by demand, need or circumstance. Examples of this range from the glass that protects and allows our phones to have dazzling displays to the batteries that power our mobile computers, pacemakers and surgical implants that enable people to have a good quality of life. Also there are non-stick cooking pans, energy efficient compact fluorescent and led lights, artificial sweeteners, advanced anti-irritant or skin-whitening cosmetics and creams, and new medical treatments.

 

Sometimes I think we take for granted all the advanced science and materials that enable us, for example, to cross almost half the world in relative comfort and safety; to land in a cold location with clothing that keeps us comfortable and safe; to get there in a reasonable time, at a reasonable price; to find a car that starts and keeps running; and to be able to call home to our families.

 

I was musing on this as I drank my cup of coffee and considered instant coffee. I am, I admit, a coffee snob. I like my coffee a certain way, and so for many years while travelling I have kept a constant vigil for coffee that is to my liking. With the global spread of coffee chains this has become easier, but I still really like taking my coffee with me. Certain brands make this possible, and I remembered how instant coffee was actually invented by Dr. Max Morgenthaler for Nestle in 1938, but only became popularised by American GIs in the Second World War, when weight and space were at a premium. The same issues of space and weight apply to so many things in the modern world, and as I planned my trip to winter-bound Europe, I wondered about these innovations. Of course, the fact that I can now have decaffeinated coffee with caramel, in hot and cold varieties and a myriad of flavors and specialty choices, is all due to technical and formulation advances and the advent of new packages, processes and capabilities.

 

For my journey to Europe, I first needed a warm coat. Historically, when people explored cold places they relied on large bulky coats and heavy woolen gloves. Now, with high-performance fabrics such as Gore-Tex, which keep the body dry and sweat-free by letting perspiration escape while blocking snow and rain, I can have a lightweight, wearable jacket in the color I want that lasts well when out hiking or mountain biking. So in went my hiking jacket, a smart choice because the UK has recently had one of its wettest periods of weather ever. Next came the choice of bag to pack my gear in. Thanks to polymers, plastics and composites, suitcases and rolling bags have changed out of all recognition in the last decade. They have become lighter and more resistant to abrasion and damage, more colorful, and easier to open thanks to advanced plastic zippers and expandable sections. The new complex materials also let the wheels roll more smoothly and, to be honest, they don't break with the frequency the older ones did. Again, the materials technology that resists such impacts with flexible deformation and includes a smoothly lubricated bearing is really quite remarkable.

 

The next stage in my thoughts was which route to take. In winter, as anyone knows, this is not a simple choice. It involves juggling which airports get snowed in or suffer bad weather, which (if any) can cope, and what de-icing equipment and provisioning each has and at what cost. De-icing, which basically means removing the ice and coating the airplane with a complex formulation to prevent ice crystals from forming, is far from simple. It is necessary because a buildup of ice crystals can rob a plane of its lift and control, which is not a desired state. The coating, however, must satisfy a whole series of design constraints. It must be deployed easily at low temperatures, and it must not damage the plane (for modern composite aircraft such as the Dreamliner this is harder than it would appear). It must be non-toxic, must not affect passengers with allergies, and must be environmentally benign and of low cost. These are not easily balanced requirements for a formulated product that has to remain stable for extended periods and, like many advanced chemical products, must be produced and deployed in many different locations with high frequency.

 


Of course, I also wondered about the composite materials used in the airplane itself. Those made by advanced manufacturing and materials companies such as Boeing have many diverse requirements. They need to withstand the intense cold at altitude and function for extended periods; airplanes spend more time flying than they do on the ground. They need to survive lightning strikes, rain, hail, de-icing fluids and many different sources of external impact. And they must deliver their whole raison d'etre: lighter weight and better fuel-per-passenger performance. The fuel itself (Jet A1) needs to be of high purity and consistent quality, unaffected by the swing in temperatures between 35,000 feet and the ground. This demands very careful chemical process design and administration. Companies such as BP Aviation and other aircraft fuel providers spend a lot of time managing changing feed and input streams to yield consistent products and tracking lot-to-lot quality, and they must meet stringent product safety and testing requirements.

 

Once I got to my destination, I expected to find a rental car that started and ran, and that would transport me to my hotel of choice, where I would be warm and safe in a well-lit and heated room. Those simple, common travel steps, I realised, are all triumphs of materials science, innovation and design. The fact that my car has a fuel that flows and burns consistently at low temperatures is amazing. The petrochemical companies actually adjust the ratios of different chemistries in winter and summer fuels to aid burning and volatility at low temperatures. In some cases, such as diesel fuel, which can gel or almost freeze when cold, they add specific pour-point depressants and viscosity-improving additives. The same goes for engine lubricating oil, which thanks to modern synthetic materials can span such a wide range of viscosities that people no longer need to switch between winter and summer oils.

 

The battery of my rental feeds a unit which converts its electrochemical potential into the current that drives the starter motor. It must function at high load when the whole car is probably at its coldest, so it is again a complex piece of chemical, materials and electrical engineering; in hybrid or electric vehicles, even more so. Finally, I considered the tires, which hopefully would keep me on the road. These materials, which actually have a very small area in contact with the ground, whether asphalt, slush, snow or gravel, need to function in all seasons and across a whole range of temperatures, irrespective of load and speed. Tires are a significant contributor to a vehicle's overall energy use, and have to be engineered for the right performance characteristics while maintaining safety and keeping cost and noise within bounds.

 

At the hotel, I hoped to find a warm, well-lit room in the evenings with plenty of hot water. This requires the ability to insulate the building and manage its heat loss, and to distribute electricity in very cold or warm weather for heating, cooling and lighting. The ability to distribute natural gas is itself a triumph of modern science and materials development; natural gas is hard to find and harder to produce. Another critical development for a growing planet, from a sustainability perspective, is reducing the energy burden of housing. Building with advanced materials, such as self-cleaning glass and lightweight insulation, removes the need for excessive air conditioning or heating. Other impressive applications of advanced materials technology are low-energy bulbs that provide excellent light for reading and can be easily recycled.

 

As I finished my planning for the trip at the beginning of the year, I wondered what would change in 2013, and how future travellers would see things when they too reflect on what goes unnoticed but makes all of their travelling so much easier.

3,460 Views 0 References Permalink Categories: Materials Informatics, Executive Insights, Modeling & Simulation, Electronic Lab Notebook, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, forcite, dmol3, catalysis, materials, nanotechnology, atomic-scale-modeling, materials-studio, computational-chemistry, green-chemistry, quantum-chemistry

So it's January, the month of snow, cold, broken resolutions and CES, the Consumer Electronics Show. This show, a wonderland and mecca for techies, takes place in Las Vegas and features most of the world's leading electronics and gadget manufacturers. A number of their major suppliers are there too, launching products at the show.

 

One of the most talked-about products comes from Corning and is called Gorilla Glass 3 (http://www.engadget.com/2013/01/03/corning-gorilla-glass-3/). This glass material is present in a whole series of devices such as phones (iPhones), tablet computers, televisions and notebooks. In fact, it's currently available in 975 different models or devices and has sold more than 1 billion units to date. Gorilla Glass was actually developed way back in 1960 but mothballed until Steve Jobs needed something scratch-proof for the screen of the original iPhone. Jobs had to convince Corning that the glass could be manufactured in sufficient quantities for the iPhone's launch. It's now used in almost all high-end smartphones, regardless of brand. The key point of these glass surfaces is to be scratch, crack and damage resistant while offering amazing optical quality and low weight. Corning set the performance standards for Gorilla Glass 3 at a threefold reduction in scratches in use, a 40% reduction in the number of visible scratches, and a 50% boost in the strength retained after a scratch occurs, which means that partially scratched devices no longer spiderweb or crack in use after the initial scratch. In short, Corning's aim was to "shatter the status quo" of glass, enabling larger, more durable, less scratched or shattered devices.

 

This process of demanding more from materials is at the heart of the technical advances that drive many companies to investigate the chemical and structural nature of the materials at the core of their new products. For example, Corning, a long-time user of Materials Studio technology from Accelrys (http://accelrys.com/resource-center/case-studies/pdf/corning.pdf), has used the technology to drive product differentiation in many different areas, including nanotechnology, glass-like materials, glass coatings, and glass bonding and adhesion: http://accelrys.com/resource-center/case-studies/pdf/corning-case-study.pdf.

 

So it is very interesting to read in statements before the show (from David Velasquez, Gorilla Glass' director of marketing and commercial operations) that Corning's team, by studying the glass' atomic structure and bonding properties, was able to "invent another kind of glass" that made the substance less brittle and less prone to scratches. Such innovation, powered by scientific understanding and insight, enables market-disrupting change and gives companies a competitive edge in an increasingly challenging and evolving market. The introductory slide set that Corning used for the previous generation of Gorilla Glass is here (http://www.corninggorillaglass.com/), and we all look forward to their announcements and product launch at CES this week.

 

If you'd like to learn more about Accelrys Materials Studio, view our on-demand webinars covering topics from new formulation approaches to materials degradation and much more.

3,017 Views 0 References Permalink Categories: Modeling & Simulation, Materials Informatics, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, analysis, visualization, dmol3, simulation, materials, nanotechnology, conferences, atomic-scale-modeling, materials-studio, computational-chemistry, quantum-chemistry

Music-ular Dynamics

Posted by stodd Nov 25, 2011

Ever wondered what molecular vibrations would sound like? JC Jones and Andy Shipway in Israel pondered this question and decided to use molecular dynamics to investigate. You can find more information about their project here and listen to the ethereal sounds of a journey along the myelin protein.
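For a flavor of how a sonification like this can work, here is a toy sketch of my own, not the authors' method: it scales a few made-up low-frequency vibrational modes from wavenumbers down into the audible range and writes the result to a WAV file using only the Python standard library.

```python
import math
import struct
import wave

SPEED_OF_LIGHT = 2.998e10   # cm/s, converts wavenumbers (cm^-1) to Hz
SCALE = 1.0e-10             # arbitrary factor: ~100 cm^-1 becomes ~300 Hz
modes_cm1 = [50.0, 120.0, 300.0]   # made-up low-frequency modes

rate, seconds = 44100, 3
frames = []
for i in range(rate * seconds):
    t = i / rate
    # sum the modes as sine waves at their scaled audio frequencies
    s = sum(math.sin(2 * math.pi * w * SPEED_OF_LIGHT * SCALE * t)
            for w in modes_cm1) / len(modes_cm1)
    frames.append(struct.pack('<h', int(s * 32000)))

with wave.open('modes.wav', 'wb') as f:
    f.setnchannels(1)      # mono
    f.setsampwidth(2)      # 16-bit samples
    f.setframerate(rate)
    f.writeframes(b''.join(frames))
```

A real project would presumably drive the frequencies from an actual MD trajectory; this just shows the mapping from molecular frequencies to sound.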

 

I always like to hear about people taking software like Materials Studio and using it to create something completely outside of the original intent. When we introduced MaterialsScript to control parts of Materials Studio, I played my own games around the theme of winter, resulting in the Season's Greetings set of scripts. However, the sort of work linked above takes abstraction to a totally new level!

 

The question is, what else do you use Materials Studio for that isn't in our user stories?

728 Views 0 References Permalink Categories: Modeling & Simulation Tags: atomic-scale-modeling, materials-studio, computational-chemistry

Neglected Diseases like malaria, Chagas, schistosomiasis and human African trypanosomiasis (sleeping sickness) affect millions of people in the developing world. Drugs currently used to treat these diseases are of limited availability and efficacy. They're also costly, often based on old molecules, and some have severe toxic effects. Even more worrying, drug resistance is emerging in several infectious diseases. The bottom line: a coordinated, global campaign investigating therapeutics for Neglected Diseases is a critical imperative.

 

When SCYNEXIS approached us to donate software licenses and be a part of the cure, the Accelrys executive team readily agreed, and I’m so happy and quite proud to be with an organization that’s a part of this worthy effort.

 

A collaboration involving Accelrys, SCYNEXIS and Tibco Software is now providing a way for scientists around the world to work together on Neglected Diseases. It’s already helping to change the way science is done today and also creating the possibility for new economic opportunities for under-resourced labs in developing countries. 

 

The scientific collaboration consists of SCYNEXIS' SaaS-based platform for drug discovery—the Hit Explorer Operating System (HEOS®)—which provides hosted data for several not-for-profit, public-private partnerships (PPPs) that are leading the charge against Neglected Diseases. Accelrys' contribution includes Pipeline Pilot for moving data around, running calculations and assembling reports, while our chemical registration software provides the chemical registry. Completing the system, Tibco Spotfire Analytics provides visual analysis tools enabling scientists to interact with their data in real time. The collaborations are truly global in nature, and HEOS® allows real-time sharing of data.


 

Our Neglected Diseases collaboration has resulted in at least two “Eureka!” moments for me. First, I’m intrigued by the geographical distribution of the scientists using the system. Thanks to the hosted HEOS® platform, Principal Investigators (PIs) in Brazil, Ivory Coast, the Philippines, South Africa, Zimbabwe and many other countries have come together in a vibrant virtual research community. Like other social and professional networks today, this virtual community is empowering isolated researchers as never before, making them part of a larger team, much like big pharma. The hosted system is also increasing the importance of these researchers’ work by making it widely available to their colleagues around the world. A participating researcher recently told me: “My molecules matter, now that they’re part of the larger collection. So what if I only contribute a few… one of them could be a winner someday, which means my work is important now.”

 

The other thing I find interesting is the remarkably diverse chemistry emerging from the project. With so many disparate molecules from so many different places now available for testing against screens, it's easier for scientists to "jump the chasm" when assessing activity because they're not locked into only a couple of series. The number of compounds, data points and disease targets is growing every year (see figure).

 

One of the major benefits of a global project like this is the untainted perspective it provides: the ability to move beyond fixed ideas and preconceptions to fresh insights. The far-flung researchers now contributing to the HEOS® database bring an unabashed passion to their search for answers. Let's face it; the diseases they're researching are endemic to their locality, often touching neighbors, friends and family. When motivated scientists are empowered to make a difference, everything becomes possible — everything from important scientific breakthroughs resulting from better sharing of data to improved viability for the labs providing the data. For example, hosted environments like HEOS® can simplify the process of registering molecules in industry-standard sourcing databases like the Available Chemicals Directory. Under-resourced labs in economically challenged regions can become more sustainable by selling the molecules they discover.

 

The Neglected Diseases project demonstrates that scientific data can be stored securely and shared globally on a thin client. However, the real takeaway message is more compelling than this technical accomplishment. The real value is: Improved global collaboration in the cloud is empowering researchers in developing regions by making their work available to—and important to—the wider research community. This is not only changing the way we do science; it’s increasing the exuberance quotient of science for many of us.

 

What’s your vision for scientific collaboration in the cloud?

1,242 Views 1 References Permalink Categories: Data Mining & Knowledge Discovery, Trend Watch Tags: visualization, simulation, pipeline-pilot, computational-chemistry, cloud-computing, neglected-diseases

As a Product Manager, I am always interested in how we can use new technologies to improve the experience of working with Materials Studio.  After the Nintendo Wii was released, I pondered whether we could use the Wii Remote to do things like rotate and zoom into the structure. However, there were several reasons why I didn't pursue this – mainly that Nintendo does not really support using the Wii remote in this way.  Also, I thought it would be fairly limited in terms of what it could do.

 

However, recently I saw that Microsoft had released the Kinect for Windows SDK for download. Although only in beta at the moment, and limited to non-commercial use, it does provide the capability to interface a Kinect with Windows to control applications. With gestures and voice, this broadens the capacity for  interaction with the software. You could still do simple tasks like zoom, pan and rotate structures – that would be pretty cool and some smart people have already worked with Jmol to enable this using an open source implementation. But how about taking it further and using a gesture like touch to select different atoms, swipe to delete objects, create bonds by dragging between two atoms, etc.? What sort of tasks do you see being useful in a gesture-based world?

 

Of course, I can always get carried away with cool new ideas but back in the real world, I wonder how useful this would actually be. Do you think there is a future in gesture based control for molecular modeling applications?

773 Views 0 References Permalink Categories: Modeling & Simulation Tags: materials-studio, computational-chemistry

I was recently chatting with a customer about studying phase morphology of fuel cell membranes between enclosed surfaces. He pointed me to this paper where Dorenbos et al. used mesoscale methods to model a surface interacting with a polymer. They varied the interaction potential between a coarse-grained representation of a Nafion® 1200 polymer membrane and the surface to mimic different levels of hydrophobicity. However, it seems that the surface of the carbon catalyst support would not necessarily have a single homogeneous interaction.

 

I started to think about whether it would be easy to build a mesoscale model with idealized surfaces, vary the interactions between a diblock copolymer and the surface, and see whether this affected the phase morphology.

 

In a simple example, I used a diblock copolymer with a fairly strong repulsive interaction between the two blocks. Initially, I performed Mesocite calculations in the bulk phase to check that I would get the expected formation of a lamellar phase (Fig 1a). I then introduced a surface which had no net interaction with the diblock copolymer (Fig 1b) and one where one of the blocks has a repulsive interaction with the surface (Fig 1c). As expected, the introduction of the surface disrupts the lamellar structure and it forms a perforated lamellar phase. When the surface is modified so that one block repels the surface, the lamellar phase forms again, but this time aligned along the surface.

 


Figure 1: Bulk (a), non-interacting surface (b), and surface with repulsion of one block (c). Only one of the blocks from the diblock is displayed, with the other hidden to see the shape of the phase.

 

I also introduced two different interactions on the surface. In this case, one of the diblock components is repelled by the blue parts of the surface and the other block by the green parts. When the surface is split into two strips, a perforated lamellar phase forms with the blocks in the diblock aligning with the surfaces (Fig 2a). However, when the surface is patterned with four strips, the diblock still forms a lamellar phase, but this time the blocks align perpendicular to the strips (Fig 2b). Finally, when a circular interaction template is used, another perforated lamellar phase forms, but this time the blocks try to align with the circles (Fig 2c).

 


Figure 2: Two strips (a), four strips (b), and a single circular interaction site (c). Only one of the blocks from the diblock is displayed, with the other hidden to see the shape of the phase.

 

To attempt to characterize the effects of the surface, I calculated the order parameter for each system using a script from the Community. I found that as the surfaces were introduced, the order parameter dropped from 0.19 to between 0.16 and 0.18. Although the change is relatively small, it does indicate that the surfaces are having an effect on the order of the system. You can see the evolution with time of the different morphologies in the video below.

 

 

If you want to try calculations like this, I have posted more technical detail on how to generate the input structures, including templates for the structures used above, in the Materials Studio Community.
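For a flavor of what such an order-parameter calculation boils down to, here is a minimal sketch of one common definition, the P2 orientational order parameter measured against a fixed axis. This is a generic illustration on my part, an assumption rather than the actual Community script:

```python
import numpy as np

# A generic orientational order parameter, P2 = <(3 cos^2(theta) - 1) / 2>,
# measured against a fixed axis. One common definition among several --
# not the specific script referenced in the post.
def order_parameter(bond_vectors, axis=(0.0, 0.0, 1.0)):
    v = np.asarray(bond_vectors, dtype=float)
    v /= np.linalg.norm(v, axis=1, keepdims=True)   # normalize each bond
    n = np.asarray(axis, dtype=float)
    n /= np.linalg.norm(n)
    cos2 = (v @ n) ** 2
    return float(np.mean(1.5 * cos2 - 0.5))

# Perfectly aligned bonds give 1.0; an isotropic melt tends toward 0.0.
print(order_parameter([[0, 0, 1], [0, 0.1, 1], [0.1, 0, 1]]))
```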

 

This rather superficial study demonstrates that the effect of surface templating or lithography can be simulated using mesoscale methods. Recent changes in the mesostructure building tools in Materials Studio have made it even simpler to construct complex patterned structures. This is a very simple study, but what sort of surfaces and systems would you like to simulate?

803 Views 0 References Permalink Categories: Modeling & Simulation Tags: mesoscale, case-studies, materials-studio, computational-chemistry, alternative-energy, polymers

Let it snow, let it snow...

Posted by stodd Dec 19, 2010

As the winter vacation looms nearer, I have been thinking about how I am going to get to Scotland for Christmas. At the same time, a colleague asked if I was going to write a seasonal script again this year. Merging those two streams of thought, I came up with a new season's greetings script which you can download, or see as a movie, from here. Note the script requires Materials Studio 5.5 to run.

 

I wondered initially about how to make snow fall down the screen. I started off using shearing in DPD in Mesocite but that didn't really give me the flexibility I needed. One of my colleagues, Neil Spenley, recommended trying out the new electric fields that we added in Materials Studio 5.5. This was a great idea as it enabled me to switch the field directions with a lot of flexibility – hence the "wind" blows across the screen.

 

The next issue was how to keep the snow falling. As I only have a fixed number of snow beads in my structure, I had to make them fall from the sky repeatedly. To do this, I used a soft, short-range potential which, thanks to the magic of periodic boundary conditions, enables the beads to fall through the ground and back in through the top of the screen. In the last part of the script, I used close contacts to dock the snow onto the ground.
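The periodic-boundary trick is simple enough to show in a few lines. This is a hypothetical Python sketch of the idea, not the MaterialsScript used for the actual card:

```python
import numpy as np

# Toy illustration of the periodic-boundary trick described above:
# beads that drift below z = 0 re-enter at the top of the box.
box_z = 50.0
z = np.array([3.0, 0.5, 42.0])     # bead heights, arbitrary units
vz = -1.2                          # "falling" velocity
dt = 1.0

for step in range(5):
    z += vz * dt
    z %= box_z                     # wrap: below the floor -> back in at the top
    print(step, z)
```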

 

The final challenge was to make the greeting appear. Initially, I fixed the bead positions, but this made the beads appear all the time, which was a bit of an issue! Finally, I realised that I could use the restraints energy, also introduced in Materials Studio 5.5, to move the beads to anchor positions that I created from a pre-defined set of coordinates.

 

After this, it was just a matter of tuning the parameters. I experimented with a lower electric field strength which allowed the beads to move from side to side a little. However, when they hit the Lennard-Jones potential on the roof beads, they bounced back up into space – not realistic!

 

After much playing, I finally got something I was happy with – so Season's Greetings everyone! Now I have to think about next year...

861 Views 0 References Permalink Categories: Modeling & Simulation Tags: classical_simulation, materials, materials-studio, computational-chemistry

Everything old is new again. It's interesting to see ideas in the high-performance computing community coming back around after many years. In the late 1980s I used an array processor from FPS to speed up coupled-cluster calculations (very expensive and accurate electronic structure simulations). The array processor was bolted onto a VAX. The most compute-intensive parts of the application were shipped off to the FPS machine, where they ran a lot – maybe 50-100x – faster.

 

A couple of months ago I forwarded a tweet from HPC Wire, GPGPU Finds Its Groove in HPC, but a more recent article in Annual Reports in Computational Chemistry, "Quantum Chemistry on Graphics Processing Units," is far less optimistic about this new hardware. Authors Götz, Wölfle, and Walker discuss Hartree-Fock and density functional theory and "outline the underlying problems and present the approaches which aim at exploiting the performance of the massively parallel GPU hardware." So far, quantum chemistry results on the GPU have been less than stunning.

 

 

China's Tianhe-1A supercomputer uses 7,168 Nvidia Tesla M2050 GPUs and 14,336 CPUs

 

 

For those not in the know, the GPU, or graphics processing unit, "is a specialized microprocessor that offloads and accelerates 3D or 2D graphics rendering." The idea behind general-purpose GPU computing (GPGPU) is that this power can be harnessed to do something other than graphics rendering.

 

The GPGPU approach is similar to the one we used in 1988 in Rod Bartlett's research group. Routine calculations are done on the main microprocessor, but the most time-consuming computations are shipped off to specialized hardware (an array processor then, a GPU today). The programmer is required to insert directives into the code that tell the computer which hardware to use for which parts of the calculation. Usually a lot of code rewriting is required to achieve the speedups touted by the hardware suppliers. One of the major differences between then and now is the cost: our FPS machine cost on the order of $250,000, but a GPU board goes for around $300. And that's one of the big reasons people are so excited.

 

According to the article in HPC Wire, applications such as weather modeling and engineering simulation (ANSYS) have been ported, though ANSYS reported only a modest 2x speedup. The report in Ann. Rep. Comp. Chem., by contrast, points out the many challenges in porting quantum chemical (QC) calculations. In the case of our coupled-cluster work, we could formulate much of the problem as matrix-matrix and matrix-vector multiplications, which really fly on an array processor. Hartree-Fock and DFT programs – the QC methods most commonly used by commercial molecular modelers – need to evaluate integrals over a basis set and perform matrix diagonalizations, and these operations are much harder to port to the GPU.
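For the operations that do port well, the offload pattern itself is now remarkably simple. Here is a minimal sketch of a matrix-matrix multiplication shipped to a GPU, assuming a CUDA-capable card and the open-source CuPy library (my choice for illustration, not a tool discussed in the article):

```python
import numpy as np
import cupy as cp   # assumes a CUDA-capable GPU with CuPy installed

# The offload pattern described above: build data on the host CPU,
# ship only the compute-heavy matrix product to the GPU, copy back.
a = np.random.rand(2048, 2048)
b = np.random.rand(2048, 2048)

a_gpu = cp.asarray(a)          # host -> device transfer
b_gpu = cp.asarray(b)
c_gpu = a_gpu @ b_gpu          # runs on the GPU
c = cp.asnumpy(c_gpu)          # device -> host transfer

print(np.allclose(c, a @ b))   # sanity check against the CPU result
```

The transfers in and out are exactly the overhead that makes integral evaluation and diagonalization, with their less regular data access, so much harder to accelerate.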

 

GPUs hold tremendous potential to reduce the cost of high-performance computation, but it’s going to be a while before commercial QC products are widely available. In the meantime, stay hopeful, but at the same time stay realistic. Maybe the approach will work this time around, but it’s still too early to say.

999 Views 0 References Permalink Categories: Modeling & Simulation Tags: computational-chemistry, quantum-chemistry, ab-initio-modeling, graphical-processing-unit

How many modelers?

Posted by AccelrysTeam Mar 8, 2010

How many of "us" are out there? I mean, how many people are doing modeling and simulation? I'd really like to know, ideally broken down by discipline, such as Materials Science vs Life Science, and by method: quantum, classical and mesoscale.

 

Alas, there are precious few statistics on that, so when I read in the Monthly Update (Feb 2010) of the Psi-k network that they had conducted a study on the size of the ab initio simulation community, it got my immediate attention.

 

Representing a network of people from the quantum mechanics field, Peter Dederichs, Volker Heine and colleagues Phivos Mavropoulos and Dirk Tunger from Research Center Jülich searched publications by keywords such as 'ab initio' and made sure not to double-count authors. In fact, they tend to underestimate by assuming that people with the same surname and first initial are the same person. As Prof. Dederichs, the chair of the network, tells me, checks were also made to ensure that papers from completely different fields were not included. They also estimate that their keyword range misses about 10% of papers. Of course, there are those who didn't publish a paper in 2008, the year for which the study was done. Moreover, Dederichs says, there are those whose papers don't have keywords like "ab initio" or "first principles" in the abstract or title, so they are not found in the search. All of that is likely to compensate for counting co-authors who are not actually modelers.
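The de-duplication rule is easy to picture in code. A toy sketch with made-up names, keying each author on surname plus first initial as the study did:

```python
# Sketch of the counting rule described above: authors are keyed by
# (surname, first initial), so "V. Heine" and "Volker Heine" collapse
# into one person. The paper lists here are invented for illustration.
papers = [
    ["P. Dederichs", "V. Heine"],
    ["Volker Heine", "D. Tunger"],
    ["P. Mavropoulos", "Peter Dederichs"],
]

def author_key(name):
    parts = name.split()
    return (parts[-1].lower(), parts[0][0].lower())

unique = {author_key(n) for paper in papers for n in paper}
print(len(unique))   # 4 distinct people under this rule
```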

 

All in all, they come up with about 23,000 people! And the number of publications in the field indicates a linear rise year on year.

 

That’s quite a lot more than they expected, and I agree. The global distribution was also surprising, with about 11,000 in Europe, about 5,600 in America, and 5,700 in East Asia (China, Japan, Korea, Taiwan and Singapore). That’s a lot of QM guys, especially here in Europe. Now, there will be a response from the US on that one I guess?

 

 

I wonder how many classical modelers there are. I'd hazard a guess that the number is about half that of the QM community, at least in the Materials Science field. Assuming that the mesoscale modeling community is quite small, that would make for a total of at least 30,000 modelers worldwide.

 

What is your view, or informed opinion? Does anybody else know of, or has anyone done, similar studies? I am going to open a poll in the right sidebar on the number of people involved in quantum, classical and mesoscale modeling in total. It would be great to hear how you came up with your answer.

426 Views 0 References Permalink Categories: Modeling & Simulation Tags: atomic-scale-modeling, quantum-mechanics, computational-chemistry, ab-initio-modeling
One of the most successful uses of quantum mechanical modeling methods is to predict spectra. These methods are capable of yielding good predictions of UV/Visible, NMR, infrared, Raman, THz and EELS (electron energy loss spectroscopy) spectra, to name just a few. Spectroscopy (according to Wikipedia) is the "study of the interaction between radiation and matter as a function of wavelength ... or frequency." How does this help chemists? We can use spectra to determine the structure of new molecules or materials, to determine the composition of mixtures, or to follow the course of a chemical reaction in situ. How does modeling help with this? In a number of ways, but I'll cover just two.

One way modeling comes into play is by working with experimental results to remove ambiguities. When a chemist is trying to determine the structure of a new material, he or she takes a spectrum, or two, or three. His or her knowledge of the ingredients, together with the spectra, gives a pretty good idea of what the chemical or crystal structure is. In a lot of cases, though, the data are sufficient only to narrow this down to 3-4 possible structures. Molecular modeling resolves the ambiguity by predicting the spectrum of each possibility; the spectrum that matches the experimental one presumably corresponds to the "right" structure. Modeling is even more valuable when investigating defect structures, as in this work on Mg2.5VMoO8.
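The matching step can be as simple as broadening each candidate's computed "stick" spectrum and scoring its overlap with the measured curve. A minimal sketch of that idea with invented numbers; this is a generic illustration, not any particular Accelrys tool:

```python
import numpy as np

# Broaden a computed "stick" spectrum (frequencies + intensities) with
# Lorentzians, then score each candidate against a measured curve.
def broaden(freqs, intens, grid, width=8.0):
    f = np.asarray(freqs)[:, None]
    i = np.asarray(intens)[:, None]
    return np.sum(i * width**2 / ((grid - f)**2 + width**2), axis=0)

grid = np.linspace(400.0, 1800.0, 1400)          # wavenumber grid, cm-1
measured = broaden([720.0, 1150.0, 1600.0], [1.0, 0.6, 0.9], grid)

candidates = {
    "structure A": ([715.0, 1145.0, 1610.0], [0.9, 0.7, 0.8]),
    "structure B": ([650.0, 1300.0, 1500.0], [1.0, 1.0, 0.2]),
}
for name, (f, i) in candidates.items():
    calc = broaden(f, i, grid)
    # normalized overlap: 1.0 means identical band shapes
    score = np.dot(calc, measured) / (np.linalg.norm(calc) * np.linalg.norm(measured))
    print(name, round(float(score), 3))
```

Structure A, whose peaks sit close to the measured ones, scores markedly higher, which is exactly the disambiguation described above.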

Another use is telling experimentalists where to look for the spectral peaks of a new compound. This can be especially important when trying to detect the spectra of new, novel or poorly characterized materials. Experimental terahertz (THz) spectroscopy, for example, examines the spectral range of 3-120 cm-1 and can be used for the detection and identification of a wide assortment of compounds, including explosives like HMX. It's a lot safer to investigate these materials by modeling than in the lab.

A recent blog post by Dr. Damian Allis highlights the importance of doing the simulations correctly. (By the way, Damian, congrats on getting to page 1000.) A lot of work over the past 40-odd years has gone into predicting the spectra of isolated - or gas-phase - molecules. But materials like HMX are crystalline, and calculations on isolated molecules make for poor comparison with crystals. The recent work underscores how important it is to simulate crystals as crystals. And it's not just THz spectra: recent work on NMR leads to the same conclusion. A couple of programs can do this. Damian's blog focuses on DMol3 and Crystal06, but we should also mention CASTEP and Gaussian as other applications capable of predicting a wide variety of properties for solids.

Let's keep modeling - but be careful out there: shortcuts will lead to poor results, and molecular modeling will end up taking the rap for user error.
635 Views 1 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: materials-studio, computational-chemistry, quantum-chemistry, spectroscopy

DFT Redux

Posted by gxf Jan 14, 2010

I thought I'd start the year with an easy blog post, simply following up on my earlier ramblings of 25 October 2009: DFT Goes (Even More) Mainstream. In that article I discussed the success of Density Functional Theory (DFT) and used the annual number of publications as a metric. The numbers showed that publications grew by over 25% per annum, but the results for 2009 were naturally incomplete.

Happily, the trend continued through 2009, for a total of 4,621 DFT references in ACS journals. Here are a few of my favorite publications, though not all are drawn from the ACS citations. Yes, of course, these use Accelrys DFT packages, but they are still pretty cool articles:

Let me and my readers know what you think are the most interesting DFT articles from 2009.

 

†Strictly speaking, this was not QSAR (Quantitative Structure-Activity Relationship), because they didn't actually base predictions on the structure. I use the term here more generally to refer to relationships that predict complex properties, like catalytic activity, on the basis of simpler properties, like work function.

410 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: materials-studio, computational-chemistry, density-functional-theory, molecular-modeling

In the last of my series of blog posts on Materials Studio 5.0 functionality, I will be writing about what's new in the mesoscale area. Back in Materials Studio 4.4, we developed a new module called Mesocite, a module for coarse-grained molecular dynamics in which the elementary particle is a bead. In coarse-grained molecular dynamics, a bead typically represents several heavy atoms. This has advantages over classical molecular dynamics tools such as Forcite Plus, as you can access longer time and length scales.

 

In Materials Studio 5.0, we added the capability to do Dissipative Particle Dynamics (DPD) to Mesocite. DPD is a very coarse-grained dynamics approach in which a bead can represent many atoms or even whole molecules. We already had a DPD module in Materials Studio, but it had limited ability to be extended. By developing the new DPD in Mesocite, we could take advantage of the underlying MatServer environment to easily extend DPD to run in parallel and work with MaterialsScript, amongst other things.

 

One issue we faced is that the legacy DPD tool works in reduced units, whereas MatServer requires physical units. The use of reduced units is fairly standard in DPD; however, it makes it harder to relate results back to experimentalists. We therefore thought that switching to physical units would be a good idea, but there were still questions about how customers would work with DPD in physical units. We asked a small focus group of customers very early in the release how they would like to parameterize DPD calculations. All agreed that getting results in physical units was preferable, but they still wanted to set up calculations in reduced units, as they have lots of historical data they want to re-use. So we have a new user interface which allows setup in either reduced or physical units, but then converts to physical units for the calculation!
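To give a feel for what such a conversion involves, here is a sketch of one standard Groot-style mapping from DPD reduced units to physical units. The bead volume and mass are assumptions for illustration (roughly three water molecules per bead), not Materials Studio's internal values:

```python
import math

# One standard way to map DPD reduced units onto physical units:
# length from the bead density and volume, energy from kT, and time
# from the two. The bead parameters below are illustrative assumptions.
kB = 1.380649e-23              # J/K
T = 298.0                      # K
rho_star = 3.0                 # reduced bead number density (common choice)
v_bead = 9.0e-29               # m^3, assumed bead volume (~3 water molecules)
m_bead = 3 * 18 * 1.6605e-27   # kg, bead mass for 3 coarse-grained waters

r_c = (rho_star * v_bead) ** (1.0 / 3.0)   # physical length unit
energy = kB * T                            # physical energy unit
tau = r_c * math.sqrt(m_bead / energy)     # physical time unit

print(f"length unit r_c = {r_c * 1e9:.3f} nm")   # ~0.65 nm
print(f"time unit tau   = {tau * 1e12:.2f} ps")  # ~3 ps
```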

 

When a new piece of functionality is added to Materials Studio, I like to add a tutorial on how to apply the software, what sort of values customers should use, and how to get the most out of it. For the new DPD functionality, I looked at several papers before settling on an application by Groot and Rabone on the effect of non-ionic surfactants on membrane properties. This interested me as it demonstrates the strength of mesoscale modeling in looking at varying concentrations of different components and seeing the effect on morphology. I also realized that I could use some of the Mesocite analysis tools to examine the system for properties such as concentration profiles and the diffusivity of beads across the membrane. This mapped really well to the original results and produced what I hope is an interesting tutorial.

 

There was another reason I chose this paper: Groot and Rabone also looked at the effect of strain on the membrane. This wasn't possible with the old DPD module but, using some MaterialsScript, I could strain the system and then calculate the surface tension. As I like to dabble with MaterialsScript, the lure of this was irresistible, and the script I wrote is available from the Accelrys Community website.
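For those curious, the surface tension of a slab with two interfaces normal to z is commonly obtained from the time-averaged pressure tensor. A sketch of the formula with made-up numbers, not the actual Community script:

```python
# Surface tension from the pressure tensor of a slab with two interfaces
# normal to z: gamma = (Lz / 2) * (Pzz - (Pxx + Pyy) / 2).
# The values below are invented, purely to show the arithmetic.
Lz = 20.0e-9                                 # box length along z, m
Pxx, Pyy, Pzz = 1.013e5, 1.020e5, 1.500e5    # time-averaged pressures, Pa

gamma = 0.5 * Lz * (Pzz - 0.5 * (Pxx + Pyy))
print(f"gamma = {gamma * 1e3:.3f} mN/m")
```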

 

One minor issue was that the system sizes and relative amounts were not clear from the paper. Luckily, there are several ex-Unilever people at Accelrys, so one of my colleagues, Dr Neil Spenley, contacted one of the authors, Dr Robert D. Groot. I was also lucky that Dr Groot obviously hoards his old research work and, a day later, I had the original DPD input file in my hands!

 

The results from the strain calculations also show the same trends as those reported by Groot and Rabone, so I was pretty happy with this work.

 

So that wraps up my blogs on Materials Studio 5.0. I hope to have given you an insight into some of the processes that go into making a new version of Materials Studio.

617 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: mesoscale, nanotechnology, materials-studio, multiscale-modeling, computational-chemistry, polymers

So long AIChE and Tennessee

Posted by mdoyle Nov 12, 2009
The show was excellent and we saw many people: old friends and new, and young users of modeling tools. The level of interest in a range of approaches, from chemical process development to solar energy and biomass, points to an exciting future and to the opportunities that may arise as we face the challenges of the next decade and beyond.
441 Views 0 References Permalink Categories: Cheminformatics, Modeling & Simulation Tags: conferences, combinatorial-chemistry, computational-chemistry

Headin' to AIChE...

Posted by mdoyle Nov 9, 2009
Sat here in my seat en route to the 2009 AIChE meeting, and although it was a weekend flight, I had plenty of time to catch up on the back issues of my journals. So it was with interest that I read Derek Lowe's column on pharmaceutical production and formulation, as well as Anne Thayer's on sustainable chemical synthesis and Rick Mullin's on process IT. I wonder how these themes of better formulation design, greener synthesis and better reaction solvent use, and linking process chemistry into business decision-making, will be reflected in the talks over the next four days.

The location sounds very exciting and - I am shocked to say after traveling on business to 45 countries - it's my first time in Nashville. I am excited and hope to have some excellent discussions with scientists and engineers.

This meeting is a milestone for those of us in the modeling and simulation area, since we are now releasing Materials Studio 5.0, which represents over 15 years of continuous development and which, on a simple PC platform, encompasses quantum, atomistic, meso and data-scale simulations.

There are, as I say above, many challenges in chemistry now, perhaps more than ever, and I am certain these tools have a growing place in the chemist's toolset for facing them, and a growing applicability across all areas of science and materials development.
403 Views 0 References Permalink Categories: Cheminformatics, Modeling & Simulation Tags: conferences, combinatorial-chemistry, computational-chemistry

A Man with 5 Rules

Posted by AccelrysTeam Aug 12, 2009

Chatting with Christopher Lipinski at Drug Discovery & Development Week


Some ten years ago, I first “met” the Lipinski rules in a software project.  That was my last direct “hands-on” encounter with chemistry.  At Accelrys I am the senior product manager for the Biosciences and Analytics Collections for Pipeline Pilot.  Think genomics, proteomics, sequencing, and ontologies and not chemistry!  This week I was at the DDDW show in Boston – don’t think “booth babe”.

The conference was not as busy this year as it has been in the past, and it was the afternoon of the last day. A distinguished gentleman wearing a name tag reading "Christopher A. Lipinski" walked up to our booth, happy to see a fellow booth dweller. Half in jest, I asked if he might be the man with five rules. It turns out he was and, boy, was I in for an intellectual treat. That Lipinski filter came to life in a new way over the next hour or so. I was spellbound by Dr. Lipinski's breadth of knowledge, passion for science, and out-of-the-box thinking. What I didn't anticipate were his insights into the importance of chemistry for the biomarker and translational research space.

He was saying some really awesome things, so I started writing them down. It was hard to focus on note-taking because Dr. Lipinski is an excellent speaker and very animated. Below are a few items I am willing to share, in no particular order:


  • Translational research must have good chemistry married to good biology.

  • Your company (Accelrys) combines chemistry and biology in one software application.  If biologists are using your software to look at high throughput screening (assay) data that has associated chemical structures, they could better filter out results for poor compounds.

  • When faced with people problems (like chemistry—biology conflicts) versus technical problems—the people problems are always much more difficult to solve.

  • The people side is the most important.

  • NIH is making good strides in the dialog between chemists and biologists.

  • As soon as the biologist has an assay for a small molecule they should probe/stress test the assay with compounds known historically to cause assay problems.

  • In software for the (bench) biologist – it needs to be dead easy.  Too many peer-reviewed publications have great biology but rotten chemistry.

  • Biologically active compounds are tightly clustered in chemical space.  It is always best to look for new activity in areas of chemical space where you previously found activity.

  • It takes 10 years to "mature" a medicinal chemist. He then becomes an expert in pattern recognition, even if he can't articulate why certain structures look better than others.

  • Areas of interest

    • Stem cell (non-embryonic source) derived screening applications

    • Many previously proprietary databases are now in the public domain (see PMID: 17897036). These provide a great starting point for the discovery of drugs for rare diseases.


Dr. Lipinski's long and prestigious career in medicinal chemistry, assay development and computational chemistry - and now in consulting, lecturing and serving as an expert witness - does not look anything like retirement. That is good news for me.

 

 

Dr. Lipinski is shown here with his rapt audience.

 

Note: Lipinski's rules actually number four. They are known as the "Rule of Five" because each of them incorporates the number 5 in some way. For all you literalists out there, "5 Rules" should be interpreted accordingly.
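For anyone who wants to try the filter themselves, here is a minimal Rule-of-Five check sketched with the open-source RDKit toolkit. This is my illustration, not the software discussed at the booth:

```python
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

# Lipinski's "Rule of Five": a compound is commonly flagged as drug-like
# if it violates at most one of the four criteria, each built around 5.
def rule_of_five(smiles):
    mol = Chem.MolFromSmiles(smiles)
    violations = sum([
        Descriptors.MolWt(mol) > 500,        # molecular weight
        Descriptors.MolLogP(mol) > 5,        # octanol-water logP
        Lipinski.NumHDonors(mol) > 5,        # H-bond donors
        Lipinski.NumHAcceptors(mol) > 10,    # H-bond acceptors
    ])
    return violations <= 1

print(rule_of_five("CC(=O)Oc1ccccc1C(=O)O"))   # aspirin -> True
```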

445 Views 0 References Permalink Categories: Bioinformatics Tags: biomarkers, high-throughput, computational-chemistry, drug-discovery, pattern-recognition, rule-of-five, translational-research