Just 10 years ago, ELN was all about going paperless—getting rid of those paper notebooks Da Vinci used and recording research electronically. Today, though, ELN is much more, helping to foster collaboration while speeding up and standardizing research processes. Here’s a short video that summarizes the value of ELN in the modern electronic lab. We'd love to hear what you think of it!

407 Views 0 References Permalink Categories: Electronic Lab Notebook Tags: eln, symyx-notebook-by-accelrys, videos

The March/April issue of Scientific Computing includes an “Industry Insights” Q&A article on security issues for today’s ELNs. Summarized below are the responses we provided. Check back Monday for an answer NOT included in Scientific Computing for space reasons.

 

SC query: Time stamping ELN records is critical to any legal challenges, but can be especially tricky when operating across multiple locations and time zones. What steps do you recommend to minimize the risks of alterability of records and to avoid potential legal challenges?

 

Response: It is critical to use a centralized, common system for all ELN record versioning and time stamps. Time stamps should be generated at the database, reconciled to Coordinated Universal Time (UTC), and recorded with the client’s local time zone offset from UTC. This ensures consistent time stamps while preserving the ability to record the local time the action was taken. Signatures should be required when committing an ELN record to the database. The signature should require the user’s credentials, the reason for the change, and optional comments. This ensures that the correct person is committing the ELN record to the system.
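A minimal sketch of that scheme may make it concrete; the names below (CommitStamp, stamp_commit) are hypothetical, and a production system would take the UTC time from the central database server rather than the client clock.

    # Hypothetical sketch of the time-stamping scheme described above. The
    # authoritative stamp is UTC; the client's offset from UTC is stored
    # alongside it so the local time of the action can be reconstructed.
    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone
    import getpass


    @dataclass
    class CommitStamp:
        utc_time: datetime          # authoritative time, ideally database-generated
        local_offset_minutes: int   # client offset from UTC at commit time
        user: str                   # credentials of the committing user
        reason: str                 # required reason for the change
        comment: str = ""           # optional comment


    def stamp_commit(reason: str, comment: str = "") -> CommitStamp:
        """Capture who committed an ELN record, when (UTC plus offset), and why."""
        now_utc = datetime.now(timezone.utc)              # in production: server time
        offset = datetime.now().astimezone().utcoffset() or timedelta(0)
        return CommitStamp(
            utc_time=now_utc,
            local_offset_minutes=int(offset.total_seconds() // 60),
            user=getpass.getuser(),
            reason=reason,
            comment=comment,
        )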

 

SC query: When working with ELNs, it is essential to maintain the integrity of and track alterations to documents, including associated files created in external applications. What methods do you recommend to create and track these documents while maintaining their version integrity? Are there potential tradeoffs between methods?

 

Response: ELN records must never be overwritten, and committing a new version of a record cannot obscure prior data. An excellent approach to this problem is to store a new version each time the record is committed; all prior versions therefore remain available for review if needed. Storage can be a concern, but it can be managed with data compression and archiving. When external files are incorporated into a record, they should be copied into the ELN and associated with the record. This ensures the data used to form the scientist’s conclusions is preserved and not subject to change from outside the system. While it is possible to reference data externally without including it in the ELN, this can cause data integrity problems if the external data is unavailable at a later date or is updated independently of the ELN entry.
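To make the versioning idea concrete, here is a toy, in-memory sketch; the RecordStore class and its methods are hypothetical rather than an actual ELN API. Committing appends a new version instead of overwriting, and external files are copied in at commit time.

    # Toy append-only version store: prior versions are never overwritten, and
    # external files are copied into the store when a version is committed.
    from pathlib import Path


    class RecordStore:
        def __init__(self):
            self._versions = {}   # record_id -> list of (content, attachments)

        def commit(self, record_id, content, external_files=()):
            """Append a new version; all prior versions remain readable."""
            attachments = {}
            for path in external_files:
                attachments[Path(path).name] = Path(path).read_bytes()  # copy file in
            self._versions.setdefault(record_id, []).append((content, attachments))
            return len(self._versions[record_id])     # 1-based version number

        def get(self, record_id, version):
            """Retrieve any prior version for review."""
            return self._versions[record_id][version - 1]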

326 Views 0 References Permalink Categories: Electronic Lab Notebook Tags: publications, eln
3D Pareto Surface

The 3D Pareto surface shows the tradeoffs among the target properties: dipole moment, chemical hardness, and electron affinity. The optimal leads are colored red, poor leads blue.

How do you search through 10^6 materials to find just the one you want? In my very first blog post, "High-Throughput- What's a Researcher to Do?", I discussed some ideas. The recent ACS meeting had a session devoted to doing just that for materials related to alternative energy, as I wrote here and here.

My own contribution was work done with Dr. Ken Tasaki (of Mitsubishi Chemicals) and Dr. Mat Halls on high-throughput approaches for lithium ion battery electrolytes. This presentation is available now on Slideshare (a really terrific tool for sharing professional presentations).

We used high-throughput computation and semi-empirical quantum mechanical methods to screen a family of compounds for use in lithium ion batteries. I won't repeat the whole story here; you can read the slides for yourselves, but here are a couple of take-away points:


  • Automation makes a big difference. Obviously, automation tools make it a lot easier to run a few thousand calculations. But the real payoff comes when you do the analysis. When you can screen this many materials, you can start to perform interesting statistical analyses and observe trends. The 3D Pareto surface in the accompanying image shows that you can't optimize all the properties simultaneously - you need to make tradeoffs. Charts like this one help you to understand the tradeoffs and make recommendations (see the sketch after this list).

  • Don't work any harder than you need to. I'm a QM guy and I like to do calculations as accurately as possible. That isn't always practical when you want to study thousands of molecules. Simply looking through the literature told us that we could get away with semi-empirical methods.
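For readers who want to see what extracting such a Pareto set looks like, here is a small, generic Python sketch. The property names and the convention that every property is maximized are illustrative assumptions on my part, not the settings of the actual study.

    # Generic sketch: extract the Pareto (non-dominated) set from screening
    # results. Property names and "higher is better" are illustrative only.
    def pareto_front(candidates):
        """candidates: list of dicts of numeric properties, all to be maximized."""
        keys = [k for k in candidates[0] if isinstance(candidates[0][k], (int, float))]

        def dominates(a, b):
            # a dominates b if it is at least as good everywhere and better somewhere
            return (all(a[k] >= b[k] for k in keys)
                    and any(a[k] > b[k] for k in keys))

        return [c for c in candidates
                if not any(dominates(other, c) for other in candidates if other is not c)]


    screened = [
        {"name": "EC-1", "stability": 0.9, "conductivity": 0.4},
        {"name": "EC-2", "stability": 0.7, "conductivity": 0.8},
        {"name": "EC-3", "stability": 0.6, "conductivity": 0.3},  # dominated by EC-2
    ]
    print([c["name"] for c in pareto_front(screened)])  # -> ['EC-1', 'EC-2']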

Enjoy the Slideshare, watch for more applications of automation and high-throughput computation, and let me know about your applications, too.

658 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: materials, atomic-scale-modeling, high-throughput, lithium-ion-batteries, materials-studio, alternative-energy, green-chemistry, virtual-screening

Scientists and SharePoint

Posted by Dominic.John Apr 27, 2010
Microsoft SharePoint is poised to become a predominant collaboration tool in the electronic lab according to a recent survey conducted by Symyx. [Editor's note: In July 2010 Symyx merged with Accelrys, Inc.] In a community that is typically slower to adopt new informatics technology, SharePoint is gaining traction aided by awareness, corporate Microsoft deployment, and better integration with research tools.

Symyx researchers contacted 3,756 current software customers earlier this year. Of the 80 respondents, about a quarter reported having access to SharePoint at their sites. Scientists reported usage at less than half the rate of research managers (16% versus 41%), with librarians at 40% and administrators at 33%.

A full third of SharePoint users reported using the system for document management, but just 10% reported using SharePoint for sharing experimental information. The release earlier this month of Isentris for SharePoint should further accelerate the use of SharePoint; fully 45% of Isentris users in the survey reported also using SharePoint. Isentris for SharePoint is a collection of SharePoint web parts and supporting components that lets scientists use Isentris search, browse, and decision-support capabilities within SharePoint. Together, the combined systems provide the type of collaborative “research portal” that many distributed project teams crave.

Here are two videos showing the benefits of the integration. The video below shows how to query and browse scientific data in SharePoint. 


And this next one shows how to generate live, interactive reports to share the latest project information with colleagues.

 
Visit this website to see more demos and learn more about Isentris for SharePoint.
391 Views 0 References Permalink Categories: Lab Operations & Workflows Tags: reporting, isentris, demos, videos, decision-support
Why roll out a hefty application for searching and browsing multimillion compound databases when the lightest of apps can do the trick?

“We were looking for a simple way for our scientists to get up-to-date compound information—from any client, browser, or platform, anywhere in the world,” says Plamen Petrov, principal scientist at AstraZeneca, who implemented the Symyx JDraw structure renderer/editor applet in AstraZeneca's web-based Chemistry Connect. Chemistry Connect provides uncomplicated web access to 10 million compounds from internal and external sources.

[Editor's note: In July 2010 Symyx merged with Accelrys, Inc. Symyx JDraw is now Accelrys JDraw.]

The Symyx JDraw Applet provides all the chemical representation and querying capabilities that customers would expect from Symyx, but in a lightweight Java applet that can call out to any number of remote services through its flexible API. “It took us just five weeks to switch to JDraw from our previous structure-drawing solution that combined Chime Pro and ISIS/Draw,” says Plamen.

Check out this video that Plamen recorded to see the AstraZeneca applet in action!

658 Views 0 References Permalink Categories: Cheminformatics, Lab Operations & Workflows Tags: case-studies, web-apps, demos, applets, videos

Accelrys has announced the commercial availability of the award-winning Accelrys Biological Registration, the world's first multi-entity, fully integrated, flexible, and extensible database for biological entities. The enterprise-scalable system supports eight major biological entity types: yeast, cell lines, DNA, protein, plasmid, vaccine, antibody, and siRNA.

 

Although registration has been standard in small molecule research for many years, it is relatively new to the biological sciences. With the advent of Accelrys Biological Registration, companies innovating in biology can now:

 

  • Implement and automate the process of biological entity registration
  • Capture, secure and protect corporate intellectual property
  • Search, retrieve and utilize biological information across the enterprise
  • Enable scientists to collaborate and share biological information
  • Increase operational efficiencies and reduce costs through registering biological entities
  • Reduce the risks associated with failing to register biological entities

 

This product was developed in a pre-competitive environment with several of the world's leading biopharmaceutical companies, including Abbott Laboratories and Merck & Co., Inc. With the release of Accelrys Biological Registration, we are further demonstrating our commitment to ongoing innovation in the scientific informatics market.

 

We invite you to learn more about Accelrys Biological Registration:

 

Register for our Accelrys Biological Registration webinar series

 

Download the datasheet

424 Views 0 References Permalink Categories: Bioinformatics, News from Accelrys Tags: biologics, biological-registration

This post continues my conversation with Nick Lynch about the Pistoia Alliance. I’ll leave his thoughts on running Pistoia to the next entry. Here, we learn more about Pistoia’s mission and the role vendors can play in the organization.

 

 

From the start, Nick corrects the use of the word “vendor.” “We have been trying not to use that word because it almost creates an ‘us and them’ culture,” he says. Pistoia prefers the word “service provider.”


Nick then explains that in its role as a standards group, Pistoia aims to deliver working standards that benefit both the consumers of information services and the companies delivering those services. But the harder part of the organization’s job is changing what Nick describes as an underlying business model around software implementation.

 

“Life science companies no longer want products—they want a service,” Nick says. “They want ways to bring together a number of information systems rather than vertical software products. And this means Pistoia is both a lightning rod and a change agent” as it works to help organizations and service providers effect these sometimes painful changes in the business of life science research informatics.

 

Pistoia formed when its founders realized that rather than complaining whenever they got together, they could work together formally and accomplish more than they could singly. The founders quickly recognized the importance of welcoming both consumers and providers into the group’s membership. Precompetitive collaboration of this type benefits everyone. “The cost of producing software and maintaining it over the long term will be reduced, because key requirements are developed collaboratively and there is less variation among the software customers use,” Nick said. The service-based model Pistoia promotes also lets service providers reuse components while still giving customers the opportunity to customize.

 

Pistoia encourages members affiliated with service providers and those affiliated with customers to sit on the board, engage in working groups, and drive Pistoia activities. One of the real benefits of Pistoia is the opportunity for everyone across the informatics supply chain to gain inspiration. “Often great ideas come out of small companies or even individual customer sites, and we hope that through Pistoia some of these ideas will gain broader exposure and may even be picked up by service providers as they look to differentiate themselves,” says Lynch.

 

Pistoia has spent a lot of time since its January meeting considering “what’s in it for me” points for different membership audiences. But ultimately, Lynch says this responsibility lies with the member. “Use your own internal drivers to determine which projects to get involved in,” he says. “We don’t expect everyone to be involved in everything, but rather to get interested in those things that matter most to you. Make Pistoia relevant to you!”

349 Views 2 References Permalink Categories: Electronic Lab Notebook, Trend Watch Tags: eln, pistoia-alliance
Biology is undergoing a revolution, becoming a more analytical science with the advent of omics, high-content screening, next-generation sequencing, and other methods. These methods lead to a more in-depth understanding of systems biology and the discovery of new biomarkers. This greater understanding can be used to fill in our knowledge of pathways, to the point of building mathematical models of the multiple processes involved in any response to stimuli. Taken together, all of this should increase the odds of success by providing better information on which to base decisions.

The other area of biology with great opportunities is the use of biologics as drug entities. These entities range in complexity from antibodies and vaccines to siRNA. Their value to the marketplace is in the hundreds of billions of dollars, and intellectual property (IP) protection is essential. Some of the largest patent infringement awards ever granted have involved biologics.

In the process of building out these analytical biology systems and developing biologics as drug entities, many biological innovations and inventions arise: new stem cell lines, antibodies generated as tools or as drug entities, plasmids, algae strains, and so on. There is also far more inventory, and far more reagents and data to track, than ever before. The best way to track data across multiple sources is to use a consistent and meaningful key. In the chemical space, this is handled with a registration system that uniquely identifies an entity and assigns it a unique integer representing that entity in every data system.

Until recently, a biologist might track a bar code for a cell line or antibody in a notebook, along with a possible location in a simple, lab-based inventory system. But that type of system only tells the same researcher where the cell line is, not uniquely what it is. For the entire company to benefit from the inventory, and to protect its IP, the biological entity must be described uniquely. This is a rather new concept for biologists, and one that needs to be carefully considered going forward to better protect IP, manage expensive reagents, implement safety systems and, most importantly, ensure that data can be queried and aggregated. All of this has been implemented in chemistry and shown to be of great value; now it is biology’s turn.
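As a purely illustrative sketch (the Registry class and the use of a hashed, canonicalized description as the lookup key are my own assumptions, not a description of any actual registration product), the core idea looks something like this:

    # Toy registration sketch: derive a canonical key for an entity, check
    # whether it has been seen before, and return the same integer ID every time.
    import hashlib


    class Registry:
        def __init__(self):
            self._ids = {}      # canonical key -> registry ID
            self._next_id = 1

        def register(self, entity_description: str) -> int:
            """Return the unique integer ID for an entity, assigning one if new."""
            key = hashlib.sha256(entity_description.strip().lower().encode()).hexdigest()
            if key not in self._ids:
                self._ids[key] = self._next_id
                self._next_id += 1
            return self._ids[key]


    reg = Registry()
    print(reg.register("anti-TNF antibody, clone A7"))   # -> 1
    print(reg.register("Anti-TNF antibody, clone A7"))   # -> 1 (same entity, same ID)
    print(reg.register("CHO cell line, batch 12"))       # -> 2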

Does your organization have a Biological Registration system?  How could such a system add value to your organization?
522 Views 1 References Permalink Categories: Bioinformatics, Executive Insights Tags: biomarkers, biologics, antibodies, biological-registration, cell-line

BioIT 2010 – Join Us!

Posted by AccelrysTeam Apr 15, 2010
Learn how Accelrys is on the forefront of scientific innovation by being one of the first to preview our BioIT World award-winning application, Accelrys Biologics Registration.  Developed with leading pharmaceutical companies, the application was designed to address the challenges posed by the dynamic nature of biological entities.

Get a glimpse into the latest release of Pipeline Pilot and the Imaging Collection; or hear how Accelrys products are being used to address next-generation sequencing analysis challenges by attending “Pipelining Your Next Generation Sequencing Data,” on Wednesday, April 21, 12:00pm in Track 3.

Visit us at booth #301-303 to learn more about our leading scientific informatics solutions.

Accelrys is the official Twitter sponsor for BioIT World Conference & Expo ’10; follow us (#BioIT10) for your chance to win an Apple iPad.
444 Views 0 References Permalink Categories: News from Accelrys Tags: next-generation-sequencing, pipeline-pilot, conferences, biologics, image-informatics, biological-registration

On the heels of our earlier musings about academic ELNs inspired by comments over at In the Pipeline, Derek Lowe, In the Pipeline’s author, has now publicly stated that technology, namely an ELN, has changed his life.

Lowe’s confession appears in his April Chemistry World column. I encourage you to read the whole thing, but here are a few highlights:


  • “The ELN has made me into what I never would have gotten around to becoming on my own: an organized scientist.”

  • “I’d strongly consider ingesting a sand sandwich rather than go back to using paper now.”

  • “It’s a lot to ask, but what the scientific world could use would be a ‘common notebook standard,’ a minimal but usable format that every ELN could export to in a pinch.” Care to comment on this, Nick?

368 Views 0 References Permalink Categories: Electronic Lab Notebook Tags: eln, academic-labs

Talk to Nick Lynch, president of the Pistoia Alliance, and it’s immediately clear that Pistoia isn’t your typical standards group. I called Nick to follow up on the Bio-IT World article that prefaced Pistoia’s first meeting last January. In addition to discussing where Pistoia is headed, we also chatted about running a virtual, collaborative, project-driven organization in this age of enabling technology and social media. Sounds a lot like life in the electronic lab…

 

This entry focuses on how Pistoia has defined itself following the January meeting. Next week, we'll take a look at the challenge of running Pistoia.

Like the head of any standards group, Nick uses the words “deliverables” and “delivery” a lot. But the resemblance ends when he apologizes for this repetition about ¾ of the way through our call. Pistoia has a bit of a split identity. Originally conceived in 2007 by a group of informatics executives from AstraZeneca, GlaxoSmithKline (GSK), Novartis, and Pfizer while they met in the Italian city of Pistoia, the organization is “partly a standards group and partly working to push or drag the industry into different ways of thinking,” Nick says.

 

The other word Nick uses a lot is “energy” and for Pistoia, energy may be a more critical resource than money. “Money is nice, but you can’t do much with it without energy from your members,” Nick says. “We formed Pistoia because we realized that rather than complaining when we got together, we could work together to effect real change that would benefit all of our organizations.” The first meeting aimed to formally define how Pistoia would operate and channel the grassroots energy of the group’s members into concrete deliverables.

 

Pistoia plans to deliver projects through working groups. Three were identified at the January meeting:

 

  • ELN query services aims to develop standards that can be used across different notebooks serving different domains (initial attention will be on chemistry data types). Note: This is the working group in which Accelrys is actively participating as a Pistoia member.
  • Sequence services is exploring whether a hosted service for storing sequence data would appropriately cater to security and scalability needs in life science.
  • SESL will demonstrate the feasibility of an open knowledge-brokering framework standard that will reduce the costs of integrating disparate data sources. This working group is funded as a partnership between Pistoia and the European Bioinformatics Institute.

Higher level domains focused on business workflows and supply chains ensure that the working groups have enabling information such as common vocabularies, ontologies, and workflows. Use cases, in particular, are important to Pistoia’s effort, and working groups comprise industry members, affiliated vendors, and partners such as publishers and academic sites in order to fully describe the environment in which these standards will be deployed.

 

“We’ve kept Pistoia’s initial program deliberately small, focusing on only a few projects,” said Nick. “We don’t want to overextend and end up providing little of use.”

374 Views 4 References Permalink Categories: Electronic Lab Notebook, Trend Watch Tags: eln, pistoia-alliance, standards
Accelrys has recently concluded a series of meetings with a specially convened Biological Registration Special Interest Group (SIG) formed between several major pharmaceutical companies and Accelrys. The objective of this forum was to understand some of the critical market and product requirements needed to build a state-of-the-art biologics registration system.

The success of the SIG can be attributed to the customer members being very open with one another, in spite of being competitors, and to the tremendous diligence each company put into specifying user requirements. This open and collaborative approach to software development has become an innovative way to introduce first-of-a-kind technology into the market.

First-of-a-kind software is usually developed as a bespoke project for a single company and then modified over time to meet the needs of the wider market. This can create disadvantages for early adopters as the product functionality evolves and improves with subsequent releases. This situation can be avoided by gathering a wider set of requirements through a collaborative SIG formed from a diverse and representative sample of interested parties.

The ability to capture and prioritize a wider set of requirements, with leading companies discussing and debating the relative merits and benefits of proposed features, is a more efficient and effective way of understanding market requirements than more traditional methods. The approach also enables the development team to capture feedback and more rapidly create a product that should be attractive to the wider market. The anticipated result is the timely delivery of a product well positioned to capture both broad interest and market share.

Have you innovated through collaborative work groups?  If so, we would welcome the chance to learn from your experience.
460 Views 0 References Permalink Categories: Bioinformatics, Executive Insights Tags: consortia, biologics, biological-registration

In Part 1, I was just getting to the potential of a software application to reliably screen, computationally, the possible polymorphic crystal arrangements of a blockbuster drug.

 

Accelrys’ Polymorph Predictor, dating back about 15 years and based on research at Novartis, was probably the first commercial tool to achieve a reasonable success rate in identifying the likely polymorphs that organic compounds could form. Following substantial further improvements over the years, it is still widely used to help researchers better understand the polymorphic variability of a given compound.

 

But at Polymorphism and Crystallization 2010, much interest was generated by work from Marcus Neumann and co-workers, which demonstrates conclusively that computationally ranking the internal energy of polymorphs using high-accuracy, dispersion-corrected quantum calculations (as opposed to traditional force-field-based simulations) can go a long way towards making computational polymorph screening truly predictive, rather than simply assisting in exploring the range of potential polymorphs. The only downside is the heavy, sometimes prohibitive, demand on computational resources that such high-accuracy calculations entail. But with increasing computer power and continuous algorithmic improvements, it is conceivable that even molecular crystals of flexible molecules, or salts and co-crystals, could be subject to routine computational polymorph screening within a few years’ time.

 

Improved techniques for estimating free energy contributions to polymorphic phase transitions (Sally Price) were also discussed at length, as were crystal design approaches based on hydrogen-bonding “motifs” to classify likely packing arrangements – which is rather like detecting the rhythm in the seemingly white noise of organic crystal structures. Automatic assignment of and searching for hydrogen-bonding motifs are realised within the Motif software developed by the Cambridge Crystallographic Data Centre (CCDC) – also interfaced through the Materials Studio Motif module.

 

It’s heartening to see that the polymorph problem continues to inspire new generations of researchers – driven also by very competitive crystal structure prediction blind tests organized by CCDC. It remains one of the great challenges of computational modelling, and thanks to the organizers of an excellent “Polymorphism & Crystallisation” meeting, the music of the polymorphs was heard loud and clear by all who attended.

 

To read Part 1, please click here

450 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: materials-studio, drug-discovery, crystallization, polymorphs, hydrogen-bonding-motifs

Deb Ausman, the editor of our customer newsmagazine, Molecular Connection, attended IQPC Laboratory Informatics 2010 along with me, and I asked her to share her thoughts on the conference.

This time last week, I was listening to the last talks at IQPC Lab Informatics. In my traditional communications role, I’ve depended on the true subject-matter experts to attend meetings and report back on what customers and competitors are saying. But to manage and edit Molecular Connection, I need to participate more directly in those conversations—and my colleagues, like Dominic, agree.

 

Laboratory Informatics was a nice opportunity to do some listening in. The key message was moving beyond paperless to a truly integrated lab. Main topics included reporting, CDS and SDMS integration, and deploying ELNs in analytical workflows from R&D through to development, validation, and execution. (Note: We have two articles in the Spring 2010 Molecular Connection dealing with these same topics).

 

Some of my personal observations follow, and I’m hoping to publish some tidbits from a few of the speakers/attendees in the near future.

 

  • I’m still amazed (as I was in 2008 when I returned to the R&D informatics space after a five-year hiatus) how much power customers now wield. Technology is no longer the stumbling block it was when I started with MDL in 1990. Even at this small conference, we saw solutions ranging from enterprise LIMS/ELN integration to cloud-based systems for next-gen sequencing to semantic web technology in a pharmaceutical neuroscience department. The important question today isn’t whether we can manage something electronically, but whether and how to manage it.
  • Michael Elliott, Day 1 chairperson at this meeting, has described the convergence in R&D informatics, specifically in systems such as ELN, LIMS, and SDMS. We saw examples that touted each of these applications as able to take on the duties of the others. I don’t dismiss the motivation to consolidate, but I question those presentations that seemed to be promoting standardization on a single app rather than a “best of breed,” integrated approach.
  • Many ELN/LIMS presenters had questions about archival. Speakers were confident in their ability to capture and store data. But they weren’t sure how long to keep it—and neither were their lawyers. Funny that after all the legal concerns about whether to capture electronic records at all, legal now hopes it won’t have to keep all of them forever.
  • Vendors were plentiful. In fact, most of the attendees either had an ELN or were selling one. So why, then, did so many talks describe how to select or implement ELNs? It seems that particularly at a smaller conference of this type, it would be possible to engage the attendees (vendors and customers) in more fruitful conversations. Have any of you seen this done well?
473 Views 0 References Permalink Categories: Electronic Lab Notebook Tags: conferences, eln, guest-authors

Accelrys, Inc. and Symyx Technologies, Inc. today announced that they have signed a merger agreement.

Details of the merger include:

  • Combined company creates a new leader in scientific informatics software
  • Deep expertise in chemistry, biology and materials science and broad scientific solutions for research, development and manufacturing
  • Positioned to meet the rapidly changing needs of scientific R&D environments
  • Combined solution uniquely delivers agile, flexible and open scientific R&D environments by managing adaptive end-to-end workflow solutions
  • Merger expected to be completed by end of June 2010
  • Max Carnecchia, chief executive officer of Accelrys, will serve as chief executive officer of the combined company

 

The combined company will have more than 1,350 customers, including 29 of the top 30 biopharmaceutical companies, all five top chemical companies, all five top aerospace companies, three of the five top consumer packaged goods companies, a number of top US Federal Government Agencies, as well as many top academic institutions.

 

For more information about the merger, please visit http://accelrys.com/about/news-pr/0410-announcement.html.

348 Views 0 References Permalink Categories: News from Accelrys Tags: merger, symyx

ACS members love to come to San Francisco and this spring was no exception. While I have not seen a report of the final numbers, attendance was expected to be near the all-time high, with almost 18,000 people registered to attend.

 

We had very good turnout for the CINF sessions and presented several of them virtually for those who were unable to attend. Of particular personal interest were many of the papers in the Wednesday morning session titled "The Future of Scholarly Publishing." Microsoft gave a couple of talks in which, among other things, they introduced the Chem4Word add-in for Microsoft Word, and Peter Murray-Rust was not struck by lightning after he predicted the demise of all abstracting and indexing services.

 

Our terrific sponsors ensured that CINF continued its tradition of excellent networking opportunities, including three receptions and a luncheon. Now that I am CINF Chair these activities are not as carefree for me as they used to be. I worry that we’ll run out of food or Anchor Steam Beer or that the microphone won’t be on. But I still had a wonderful time catching up with old colleagues and meeting new ones. In keeping with the green chemistry theme of the overall ACS meeting, we opted for recycled/recyclable materials at our receptions.

 

Symyx was pleased to be able to once again offer two scholarships to students pursuing advanced degrees in chemical information and related sciences. [Editor's note: In July 2010 Symyx merged with Accelrys, Inc.] We had a record number of applicants; we’ll tell you more about the winners in future entries. Strolling through the poster session at the Sunday reception, I was very impressed with all the student submissions.

 

For those who held down the forts at home, I urge you to visit the CINF Web site to catch up on what you missed.

425 Views 0 References Permalink Categories: Cheminformatics, Scientific Databases Tags: conferences
As I wrote last week, "Chemistry for a Sustainable World" was the theme of the ACS spring meeting. In this blog, I report on the research of three other speakers from the Monday morning CINF session. All of these authors are trying to find ways to discover optimal materials from a huge selection of possibilities.

Heard of genomics? Proteomics? These methods generate a lot of data in the analysis of genes or proteins, and information management tools are needed to handle it. The goal, ultimately, is to be able to design new ones (genes or proteins) on the basis of the available information. The number of materials to search (the 'design space') is truly enormous. Searching for optimal materials efficiently requires tools like those discussed by three of my fellow speakers in the session.

Prof. Krishna Rajan (Iowa State) discussed his "omics" approach to materials. In his presentation he described "a new alternative strategy, based on statistical learning. It systematically integrates diverse attributes of chemical and electronic structure descriptors of atoms with descriptors ... to capture complexity in crystal geometry and bonding. ... we have been able to discover ... the chemical design rules governing the stability of these compounds..." To me, this is one of the key objectives for computational materials science: the development of these design rules. Design rules, empirical evidence, atomic-level insight - call it what you will, this sort of approach is necessary to make custom-designed materials feasible.

Prof. Geoffrey Hutchison (U. Pittsburgh) really did talk about "Finding a needle through the haystack." He discussed the "reverse design problem": sure, we can predict the properties of any material that we can think up, but what we really want to know is which material will give us the properties we want. His group uses a combination of database searching, computation, and genetic algorithm optimization to search the haystack. It's a very efficient way to explore these huge design spaces.
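For readers unfamiliar with that kind of search, here is a generic genetic-algorithm sketch over a discrete design space. The fragment alphabet and the scoring function are stand-ins of my own; nothing here reproduces the group's actual workflow.

    # Generic genetic-algorithm search over a discrete design space: candidates
    # are fragment choices per site, scored by a stand-in objective function.
    import random

    FRAGMENTS = ["A", "B", "C", "D"]   # hypothetical building blocks per site
    N_SITES = 6


    def score(candidate):
        # Stand-in objective: reward alternating fragments. A real workflow
        # would call a property calculation or a trained model here.
        return sum(1 for a, b in zip(candidate, candidate[1:]) if a != b)


    def evolve(pop_size=20, generations=30, mutation_rate=0.1):
        pop = [[random.choice(FRAGMENTS) for _ in range(N_SITES)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=score, reverse=True)
            parents = pop[: pop_size // 2]                # keep the fitter half
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, N_SITES)
                child = a[:cut] + b[cut:]                 # one-point crossover
                if random.random() < mutation_rate:
                    child[random.randrange(N_SITES)] = random.choice(FRAGMENTS)
                children.append(child)
            pop = parents + children
        return max(pop, key=score)


    print(evolve())   # e.g. ['A', 'B', 'A', 'C', 'D', 'A']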

Dr. Berend Rinderspacher (US Army Research Lab) also discussed the reverse design problem. He pointed out that there are around 10^200 compounds of a size typical for, e.g., electro-optical chromophores. He unveiled a general optimization algorithm based on an interpolation of property values, which has the additional advantage of handling multiple constraints, and showed applications to optimizing electro-optic chromophores and organo-metallic clusters.

Terrific work by all the speakers in this session, who are using all methods at their disposal - whether based on informatics or atomistic modeling - to come up with better ways of looking for better materials. Next blog: a summary of my own contribution to the session.
473 Views 1 References Permalink Categories: Materials Informatics, Modeling & Simulation Tags: materials, conferences, atomic-scale-modeling, combinatorial-chemistry, data-mining