
Accelrys Blog

October 2009

DTC Genetic Testing: Take 2

Posted by AccelrysTeam Oct 29, 2009
I am trying to concentrate on creating a “Biomarkers” poster for the Chemical and Biological Defense Science and Technology Conference that I am attending next month in Dallas. However, I have a hard time resisting my email. Just now, I received the GenomeWeb Daily News, which contained this blurb: “Amway to Sell Interleukin Genetics Health Tests,” October 29, 2009.

I thought, "Is this really the Amway that my neighbor tried to get me to sell 30 years ago by telling me how great their laundry powder was?"  Yes, it is.  Can anyone have any doubt that the genomic era has arrived?

An excerpt:
“The Weight Management Genetic Test is used in a program to determine if an individual is likely to lose weight more from low-calorie or balanced diets, or from increased exercise based on genotype.

The Heart Health Genetic Test uses variations in the IL1 gene in order to determine predisposition for inflammation, which has been implicated as a risk factor for heart disease, the company said.

The Nutritional Needs Genetic Test uses variations in genes related to B-vitamin metabolism and potential cell damage due to oxidative stress, and the Bone Health Genetic Test, which is expected to be available by the end of 2009, identifies susceptibilities to spine fractures and low bone mineral density associated with osteoporosis.”

This is not necessarily a new phenomenon, and there are lots of folks who feel they need to protect the public from spending money on these DTC tests. I find it interesting, however, that no one feels compelled to press the government or FDA to legislate the height of my red-spike high heels or how much my husband should be allowed to pay for them. We know these shoes wreak havoc on my back and knees, yet my husband will happily pay hundreds of dollars if he can only get me to wear them! And what about all those promises about the face cream that will make me look 10 years younger?

I am all for DTC genetic tests. I am still waiting on a few specific SNPs to be incorporated in the report before I send my spit to 23andMe. Amway’s tests are very simple and, to me, are a new twist on DTC genetic testing. It is not necessarily about medicine but about choices that I as a consumer should be allowed to make. I do want to know how much these tests will cost. I don’t gamble, but I am certainly into recreational genetic tests. Call me weird, call me Harriet, just make sure you call me eXXcited! Bring on the soap, baby. I’m ready.
318 Views 0 References Permalink Categories: Bioinformatics Tags: biomarkers, translational-medicine, discovery-studio, genetic-testing

My AIChE, Breaky Heart

Posted by AccelrysTeam Oct 29, 2009
Calling All Chemical Engineering Professionals…

AIChE, the world's leading organization for chemical engineering professionals, is holding its annual meeting at the Gaylord Opryland Hotel in Nashville, Tennessee, from November 8 to 13. At booth #18, we will be showcasing the new features and enhancements found in Materials Studio 5.0. We will also be presenting at the conference:

- Nov. 9 at 5:35 p.m.: “High-Throughput Catalysis and Information Management,” Michael Doyle
- Nov. 12 at 10:35 a.m.: “Virtual Screening for Lithium Ion Battery Electrolyte Additives,” Nick Reynolds (Bayou B)

In addition, we’re proud to note that two organizations currently using Accelrys solutions will be presenting at this year’s AIChE:

- Nov. 11 at 3:40 p.m.: “Chemical Erosion of Silicon Nitride Ceramics,” Jenny Synowczynski, U.S. Army Research Laboratory (Delta Ballroom B)
- Nov. 12 at 8:30 a.m.: “Utilization of Molecular Simulations in Aerospace Materials: Simulation of Thermoset Resin/Graphite Interactions,” Andrea Browning, Boeing (Jackson A)

If you will be at AIChE, please stop by the booth and attend the presentations. We would love to see you there!
394 Views 0 References Permalink Categories: Materials Informatics Tags: conferences, chemical-engineering, materials-studio

DFT Goes (Even More) Mainstream

Posted by gxf Oct 25, 2009

When I did my graduate work in quantum chemistry, doing a calculation on something the size of, say, hexatriene was a huge deal. In those days, the calculations were limited to people with expertise and perseverance - and a lot of patience. GUI? We didn't need no stinkin' GUI. We set up input files by hand, even had to work out the Cartesian coordinates of the atoms on a hand calculator. Those times have changed.

A combination of factors helped to make quantum mechanical calculations accessible to a wider range of users:


  • Computers got faster. These calculations take a long time, but computers today are around 1,000-10,000x faster than when I was in grad school.

  • Modeling methods got faster. Besides simply optimizing software performance, there were breakthrough approaches like density functional theory (DFT), which delivered reliable results with less CPU time.

  • Graphical User Interfaces (GUIs). Yes, it turns out that we do need those stinkin' GUIs. It's just not feasible to set up calculations for anything more than around a dozen atoms without a sketcher.

DFT has done particularly well over the past 10 years or so. On top of everything else, the methods have been extended to include systems with periodic boundary conditions, so chemists have started getting into solid state calculations. With this approach you can study heterogeneous catalysis on an extended (periodic) surface; predict crystal structures; or calculate elastic constants. Searching ACS journals for "density functional theory" shows a staggering increase, from 37 publications in 1989 to almost 4,000 so far this year.

[Chart: the number of occurrences of "density functional theory" in ACS journals has grown by about 25% each year since 1990.]
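As an aside for the curious: here is what a bare-bones periodic DFT calculation looks like in code these days. This is a minimal sketch using the open-source ASE and GPAW packages (not our own tools), and the settings shown are illustrative starting values, not converged choices.

    # Minimal periodic DFT sketch with the open-source ASE + GPAW packages.
    # Settings are typical starting values, shown for illustration only.
    from ase.build import bulk
    from gpaw import GPAW, PW

    si = bulk("Si", "diamond", a=5.43)   # silicon crystal, periodic in 3D
    si.calc = GPAW(mode=PW(300),         # 300 eV plane-wave cutoff
                   xc="PBE",             # a standard exchange-correlation functional
                   kpts=(4, 4, 4))       # Brillouin-zone sampling for the periodic cell
    print("Total energy: %.3f eV" % si.get_potential_energy())  # runs the SCF cycle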

 

Among the interesting research directions are efforts to compute solid-state spectra, including NMR, Raman, and EELS (electron energy loss spectra), all part of the drive to make DFT more relevant by connecting theory with experiment. Raman is used, for example, to characterize reactions in situ. NMR can be used to discriminate among crystal polymorphs. EELS has "enabled detailed measurements of the atomic and electronic properties of single columns of atoms, and in a few cases, of single atoms." Earlier this year, there was even a workshop sponsored by the Oxford University Department of Materials to promote these computational approaches specifically to the experimental community. This combination of theory and experiment facilitates the identification of unknown compounds, the elucidation of reaction mechanisms, and the characterization of molecular and crystal structure.

One of the most gratifying measures of success is the appearance of articles not directed at other specialists. New Scientist recently ran an article about using DFT to predict crystal structures entirely from first principles - a sort of "DFT for the common man" article.

As a developer and practitioner of DFT, I'm really pleased to see how far the awareness of theory in general and DFT in particular has spread. Who knows: someday there may even be an iPhone application for DFT.

521 Views 1 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: atomic-scale-modeling, quantum-chemistry, density-functional-theory, spectroscopy
We are now rolling into the last few days of the release of Materials Studio 5.0. It has been a pretty long but very exciting and challenging release, as we really started to use sprint releases and agile techniques. From a customer perspective, this meant that from February onwards, early versions of the software were going out to some customers for feedback.

During the 5.0 release cycle, we have been focusing on several large projects covering the range of technologies from quantum mechanics through classical simulations to mesoscale modelling. There is far too much to blog about in one post, so I will break this up into a few posts.

First, let’s turn back the clock to the start of the previous release! During the development of Materials Studio 4.4, we began work on the parallelization of Forcite Plus. There was no intention to release this functionality in 4.4, but we got tantalisingly close! However, we didn’t have time to include core functionality like COMPASS support, and more validation was needed. During the 5.0 release we finished off the functionality, worked closely with HP to test Forcite Plus scaling, and found some interesting “features”. One of these was that the Ewald summation method did not scale well above 16 CPUs, which at first we couldn’t understand. After some head scratching, the classical simulations guys worked out that we had optimized Ewald so much for scalar calculations that this had a negative effect on scaling in parallel. We didn’t want to lose the scalar performance, so we added a switch that changes the algorithm to get good parallel scaling. One of our beta test customers has now seen pretty linear scaling for Ewald sums up to 128 CPUs!

Another big project in 5.0 has been the development of a new Amorphous Cell. Amorphous Cell is a core tool for polymer simulations, enabling the generation of bulk amorphous structures, and has been available in Materials Studio since 1.0. However, it needed a revamp, as it had some serious limitations, including failing to build some pretty basic structures because it was tied to a specific forcefield. In fact, a few years back a customer memorably asked me to build a box of HCl and, unknowingly, I fell into the trap. It failed because the system was not parameterized, which they obviously knew!

Before starting the development, we had long discussions about the best approach to creating this essential tool and decided to re-use the packing technology from Sorption. This has some great advantages, as we could do really interesting things like pack into isosurfaces. It also gave us the most-requested enhancement - removing the dependency on the CVFF forcefield - so now you can build any amorphous material. Early feedback also suggested that displaying the weight percent for each component was really useful, so this was added in. Our beta testers were equally happy: they could use the new packing task to build materials that were previously impossible, and others could pack polymers with side chains at real densities rather than having to build at a low density and compress. Of course, the first structure I built was a box of HCl - successfully this time!

Enough for now; later I’ll blog about some of the fantastic new quantum mechanics functionality.
456 Views 1 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: atomic-scale-modeling, materials-studio, amorphous-cell, classical-simulations, polymers

Bio-IT World Europe

Posted by tmoran Oct 16, 2009
Cloud nein oder Cloud ja? (Cloud no, or cloud yes?) Are clouds just for the birds?

This year’s first-ever Bio-IT World Europe was hosted by the charming city of Hanover. The city’s exhibition center, with its high-tech architectural engineering, was a fitting venue for the IT experts who gathered from across the globe. The conference was held in conjunction with BioTechnica, and a multitude of vendors exhibited their goods.

Emerging conference themes included large-data-volume storage and (externally hosted) cloud computing. There was general agreement that some enhancements to the GUI, perhaps through a platform like Pipeline Pilot, would bring faster adoption; some of the live demos presented difficult-to-use command-line interfaces. Chris Dagdigian of BioTeam says Amazon looks strong as an emerging leader among cloud providers. Many attendees seem to be taking a "wait and see" attitude as early adopters begin to pave the way. Perhaps, as we have seen in other domains such as High Content Screening, informaticians will take advantage of large data volumes for predictive modeling and begin to entice others out of the nest and into the clouds.
354 Views 0 References Permalink Categories: News from Accelrys, The IT Perspective Tags: pipeline-pilot, conferences, high-content-screening, cloud-computing

Just back from the EUGM and Nanotech Consortium Meeting, a week of lively discussions (and foosball matches ;-) and of course our announcement of the release of Materials Studio 5.0. It’s been great finally to talk about and demo all the new features, which we are all so excited about. We’re already getting requests for shipment of the new version ... well, it won’t be long.

You can read more about Materials Studio 5.0 at a high level in our Press Release, or in more detail in our ‘What’s New’ document. Perhaps you have noticed the ‘Transforming Materials Modeling’ tag line in there; imagine the discussions we’ve had about that: “Is it really?” “What is transforming?” and so on. But honestly, that is what we are aiming to do with Materials Studio, and there are many things in the 5.0 release that make a real difference.

My take right now from the discussions at the Consortium and User Group Meetings is that the efficiency you gain from the integration and flexibility this new release provides is quite a step change. The new Amorphous Cell, for example, got some wows from Materials Science and Life Science folks alike. It’s really a kind of universal structure builder. Want to build a nanocomposite, for example with nanotubes and polymers around them? Not a problem. And perhaps there is some small molecule inside the tube? Easy. And what about a protein soaked in a solution? Consider it done!

For the second ‘transforming’ example, for me it’s Kinetix, the new kinetic Monte Carlo module we built for the Nanotech Consortium. I alluded to kinetic Monte Carlo development earlier; thanks to a great collaboration with Tonek Jansen and Johan Lukkien from TU Eindhoven, you can now simulate processes such as a fuel cell cathode reaction in Materials Studio over real time scales of minutes. Considering we start at femtoseconds, that’s quite a leap.
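For the curious, the core of a kinetic Monte Carlo simulation is surprisingly compact. Here is a toy Gillespie-style loop - a generic illustration of the algorithm, not the Kinetix implementation, and the rates are made up - showing how each step advances the clock by a stochastic waiting time set by the total rate, which is what lets KMC reach long simulated times:

    # Toy kinetic Monte Carlo (Gillespie) loop. Each step picks an event with
    # probability proportional to its rate, then advances the clock by an
    # exponentially distributed waiting time. Rates are illustrative only.
    import math
    import random

    rates = {"adsorb": 5.0, "react": 2.0, "desorb": 1.0}  # events per second (made up)
    t, t_end = 0.0, 60.0                                  # simulate one minute of real time
    counts = {name: 0 for name in rates}

    while t < t_end:
        total = sum(rates.values())
        pick = random.uniform(0.0, total)     # choose the next event by rate
        for name, k in rates.items():
            pick -= k
            if pick <= 0.0:
                counts[name] += 1
                break
        t += -math.log(1.0 - random.random()) / total  # exponential waiting time

    print(counts)  # roughly {'adsorb': ~300, 'react': ~120, 'desorb': ~60}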

427 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: nanotechnology, user-group-meeting, atomic-scale-modeling, quantum-mechanics, consortia, kinetics, materials-studio, multiscale-modeling

The RNA World

Posted by AccelrysTeam Oct 15, 2009
This year’s Nobel Prize in Chemistry brought back fond memories of why I chose to study RNA for my thesis work in the first place. It was in 1989 that Tom Cech, a professor at the University of Colorado, Boulder, won a Nobel Prize in Chemistry for his discovery of ribozymes. His work, along with that of many others, clearly redefined the central dogma: genetic information flow is not necessarily from DNA to RNA to proteins. RNA isn’t simply a passive carrier of information; it plays an important role in catalysis.

I had picked up a book called “The RNA World” (1st edition, 1993) a few days before I was faced with a decision to pick a project for my dissertation back in 1995. Initially, I had joined Wayne State University with the intention of studying bioorganic chemistry under Mobashery, but I soon realized that I needed to know more about this RNA world, which had left me intrigued and interested. Little did I know that the years to follow would have me solving bits of rRNA structure at high resolution using NMR spectroscopy: the 790 loop of 16S rRNA, to be exact. It was intriguing to learn that the sequence of this relatively tiny, solvent-exposed hairpin loop is so highly conserved across phylogeny that any mutation in the sequence basically shuts down protein synthesis. Why does nature prefer only those nucleotides, and what role do they play in protein synthesis? Only the 3D structure of that loop could unravel the mystery.

Thus began my first application of translational research: our lab collaborated with a biologist who designed an “instant evolution” experiment that allowed us to understand the structure-activity relationship of the tolerated mutations. Structural and biophysical characterization of the hairpin loop and its variants was a long and arduous task, but it revealed exciting results that helped us understand why nature preferred this sequence and its conservation across phylogeny. Biophysical characterization, base pairing, and thermodynamic stability explained the mismatched mutations.

Feeling quite proud of this year’s winners, Venki Ramakrishnan (UK), Tom Steitz (US), and Ada Yonath (Israel), I am certain that the recognition of their efforts in understanding the structure and function of the ribosome is shared by many across the globe who have dedicated their lives to studying such a marvelous piece of machinery that nature has created! Venki’s parting comment in his Nature video interview sums it all up pretty well: it’s this kind of fundamental research that will eventually lead to therapies and drugs worth billions of dollars.

Congratulations to the winners of 2009 Nobel Prize in Chemistry!
300 Views 0 References Permalink Categories: Modeling & Simulation Tags: life-science, translational-research, computational-biology, structural-biology
The Cambridge office has been in pre-EUGM fever this week, with printers (not to mention the Marcoms folks) running overtime and towers of boxes stacking up for shipment. We are excited at the registration numbers, which beat previous years’ despite the tough budget restrictions on travel for many, and of course at how nicely the agenda has shaped up.

In addition to the à la carte menu of talks, I am looking forward to the Product Gallery, where we’ll be able to show some cool stuff. As a ‘materials’ guy, I’m dying to show off the latest in Materials Studio, as we are so close to the release of Materials Studio 5.0 now. And Life Science folks, don’t run away: take a look at how Materials Studio models coarse-grained proteins and drug delivery systems. So watch this space.

In the tracks, we’ve had one last-minute change in the Materials track: the bad news is that Evelyn Moreno cannot make it to give the talk on polymorphism and energy storage applications. The good news is that Marc Amkreutz from Fraunhofer IFAM stepped in with a really hot presentation about just-published work on multiscale modeling of shrinkage in adhesives. This is so important because pretty much everything we use today (or even fly in) is stuck together rather than screwed and riveted, basically as it saves cost, weight, etc. However, we don’t want the plane to drop out of the sky because the glue shrinks and develops stresses that lead to failure. Making all of that work is actually quite a hard design problem, so I am looking forward to hearing how Marc and his colleagues have addressed the issue.
334 Views 0 References Permalink Categories: News from Accelrys

It wasn’t until my career steered towards marketing that I was diagnosed with gephyrophobia - the pathological fear of bridges. Certainly, years of riding motorcycles in Southern California’s backcountry never triggered this anxiety disorder, but the realization that a simple query on istockphoto (http://www.istockphoto.com) for “bridge” and “puzzle” returns 56 variations of chasm-crossing stereotypisms did. My guess is that if you are gainfully employed and work outside the shelter of academic or government institutions, you have been exposed: the “bridge” as a symbol. To make you work better. With your colleagues. With other departments. In general: to cross a chasm.

[Image: a car about to cross a chasm. Or is it?]

Working as a product manager for Pipeline Pilot, I had no choice but to face my fears. There are no two ways to put it: Pipeline Pilot is a “bridge”; it helps scientists and departments work together - better. Now, don’t get me wrong - I have nothing against teamwork; I think it makes it worthwhile commuting 60 miles to work every day and wondering what frequent flyer status comes after “Platinum.” What I fear are empty marketing promises that create customer reactions ranging anywhere from confusion to disappointment.

The customers I mainly interact with work in analytical labs. Last time I checked, they are concerned with analyzing boatloads of data in the shortest time possible. Sure, they could use a bridge when it comes time to toss the results over the fence to the chemists who originally requested the work. And after quite a bit of effort from my colleagues in Cambridge R&D, we can now process NMR data in Pipeline Pilot. Although I have no delusions of grandeur that Accelrys would invest in this project simply to address my exotic condition, I still hope that the next time I drive over the Bay Bridge to see my favorite customer, my heart will be beating at its customary 120 bpm.

I would also like to thank all the great people at Modgraph for helping us integrate their first-rate NMR prediction engine, NMRPredict, into Pipeline Pilot. Not only would our NMR release have been significantly less complete without them, but the integration also makes Accelrys’ vision come alive: Pipeline Pilot as an open integration platform for scientific applications.

418 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation, Cheminformatics Tags: atomic-scale-modeling, toxicology, government, application-integration, nmr

HighContent

CHI’s High Content Analysis East conference was a great success. The conference had overriding themes of data informatics, supervised and unsupervised learning, and predictive modeling.

Anne Carpenter shared a user-friendly interface in CellProfiler for an interactive, semi-supervised cell-level learning tool. Neil Carragher of AstraZeneca showed some really interesting work on predicting mechanism of action through learning on multivariate image descriptors. And CMU’s Robert Murphy presented the use of learning and modeling on protein patterns in the cell. All in all, we are really beginning to see the “high content” promise of the industry come to fruition. It is finally becoming more common to see research combining hundreds of variable readouts from cells to extract more sophisticated knowledge. Hopefully we will see these themes advancing at the upcoming HCA event in January, running January 11-15 at the Fairmont Hotel in San Francisco.
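To give a flavor of the kind of supervised learning on multivariate readouts described above, here is a minimal sketch using the open-source scikit-learn library, with synthetic data standing in for image-derived cell features (a generic illustration, not any presenter's actual pipeline):

    # Toy example: predict a "mechanism of action" class from multivariate
    # image descriptors using a random forest. Synthetic data stand in for
    # real high-content readouts.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # 2,000 "cells", 100 image-derived features, 4 mechanism-of-action classes
    X, y = make_classification(n_samples=2000, n_features=100,
                               n_informative=20, n_classes=4, random_state=0)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
    print("Mean accuracy: %.2f" % scores.mean())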

 

Also of interest were several advances in managing High Content Screening data. Present were Michael Sjaastad of Molecular Devices, Martin Daffertshofer of PerkinElmer (Evotec), and Karol Kozak of LMC-RISC, Institute for Biochemistry. They all presented promising offerings for managing HCS data. It was interesting to note that Thermo (Cellomics) did not introduce any new tools for managing their HCS data, perhaps telling of the great foresight Mark Collins had when originally designing Store, which has stood the test of many years.

330 Views 0 References Permalink Categories: Data Mining & Knowledge Discovery, Trend Watch Tags: conferences, high-content-screening, predictive-science
Business Intelligence has been around for nearly 30 years. So that means businesses have pretty much mastered all of their data mining and management issues. Right? Well, then why do R&D enterprises still struggle with integrating and fully leveraging their scientific data for the knowledge it contains? Accelrys’ VP of Marketing, Bill Stevens, was recently interviewed by Mary Jo Nott, Executive Editor of the BeyeNETWORK, on their Executive Spotlight program. Bill, a BI industry veteran, explains the unique characteristics of scientific data and why it has eluded the BI umbrella of solutions for so long.

BeyeNETWORK Spotlight - Bill Stevens, Accelrys
366 Views 0 References Permalink Categories: News from Accelrys, Executive Insights, Data Mining & Knowledge Discovery Tags: data-mining, business-intelligence, publications

In my last posting, I touted ROC analysis as one of the best ways to evaluate and compare different methods for building classification models. To do a true apples-to-apples comparison, it also helps to have a good reference data set. In this regard, Katja Hansen et al. have done data modelers a favor by publishing a "Benchmark Data Set for in Silico Prediction of Ames Mutagenicity." Not only did they vet and make available the data, but they also provided data splits for cross-validation to help modelers ensure that their method comparisons have a common basis.

The authors compare several techniques, including the Bayesian classifier in Pipeline Pilot. Data junkie that I am, I couldn't resist throwing the Ames data at this and a few other Pipeline Pilot learners. Here are the results I got using the ECFP_4 molecular fingerprint as the descriptor:

Method       ROC Score
Bayesian     0.82
RP Tree      0.78
RP Forest    0.82
R SVM        0.72
kNN          0.84


These results show a few things. The best ROC scores in the table are comparable to those reported by Hansen et al. for the various classifiers they investigated. (The best score they obtained was 0.86, for an SVM model.) The results confirm the widely known fact that forest models give better predictive performance than single-tree models. Finally, they confirm that molecular fingerprints are good descriptors for building classification models.
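If you'd like to try something similar outside of Pipeline Pilot, here is a minimal sketch using the open-source RDKit and scikit-learn libraries. It is a rough analogue of the Bayesian run above, not the Pipeline Pilot component itself, and the input file name and column names are assumptions:

    # Sketch of a fingerprint-based Ames classifier with ROC scoring, using
    # open-source RDKit + scikit-learn in place of the Pipeline Pilot learners.
    # Assumes a CSV file with a 'smiles' column and a 0/1 'label' column.
    import numpy as np
    import pandas as pd
    from rdkit import Chem
    from rdkit.Chem import AllChem
    from sklearn.naive_bayes import BernoulliNB
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import roc_auc_score

    def ecfp4(smiles, n_bits=2048):
        # Morgan fingerprint of radius 2 is the RDKit analogue of ECFP_4
        mol = Chem.MolFromSmiles(smiles)
        if mol is None:
            return None
        return np.array(AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits))

    data = pd.read_csv("ames.csv")             # hypothetical file name
    fps = [ecfp4(s) for s in data["smiles"]]
    keep = np.array([fp is not None for fp in fps])   # drop unparsable SMILES
    X = np.array([fp for fp in fps if fp is not None])
    y = data["label"].values[keep]

    # Cross-validated class probabilities from a Bernoulli naive Bayes model
    probs = cross_val_predict(BernoulliNB(), X, y, cv=5,
                              method="predict_proba")[:, 1]
    print("ROC AUC: %.2f" % roc_auc_score(y, probs))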

 

If you want more of the statistical details, I provide them in a posting on the Pipeline Pilot Forum at the Accelrys Community site. (Registration is free.)

445 Views 2 References Permalink Categories: Data Mining & Knowledge Discovery Tags: qsar, statistics, pipeline-pilot, data-modeling, toxicology, classification-models, machine-learning, roc-analysis