The cover story of the May 24 issue of Chemical and Engineering News highlights cloud-based scientific computing applications. While the article focuses on software-as-a-service (SaaS) delivery of LIMS (specifically our partner Thermo Fisher Scientific's debut of LIMS-on-Demand at Pittcon earlier this year), John McCarthy, vice president of product management strategy, was interviewed about Symyx's hosted ELN. I've excerpted John's portion of the article below.

****************
The advance of cloud computing in the laboratory is also likely to impact electronic laboratory notebooks (ELNs). Symyx, a leading supplier of notebooks, introduced a cloud-hosted version of its ELN (Symyx Notebook) in September, according to John McCarthy, vice president for product management strategy.

As with LIMS, cloud-hosted ELNs eliminate the need for on-site databanks, servers, and software. Applications are accessed from a computer hosting service called Switch Communications. According to McCarthy, one of the biggest benefits to researchers, especially in pharmaceutical and biotech companies, is the facilitation of partnerships with contract research organizations (CROs) and others. "If a company wants to work with a CRO in China or India, it doesn't need to punch a hole in the firewall," he says. "Symyx will allow partners access for the duration of the project."

Symyx is initially targeting small companies for cloud-hosted ELNs, McCarthy says. "Our first customers are small biopharma and biotech companies that want to bring in the IT ability without the burden on infrastructure." He adds that Symyx is in discussions with major drug companies as well, noting that big pharma also can benefit from the system's collaborative features as well as the cost savings.
567 Views 0 References Permalink Categories: The IT Perspective, Electronic Lab Notebook, News from Accelrys Tags: publications, eln, symyx-notebook-by-accelrys, hosted-eln

Green Chemistry

Posted by lsubramanian May 27, 2010
Green chemistry is a concept best applied right at the beginning of a project; it is hard to change a process once it is set in stone. Increasingly, companies choose a “benign by design” approach to assure the sustainability of new products, taking sustainability into account across the complete product lifecycle. Though product development strives for environmentally acceptable chemical processes and products, the application of green chemistry is fragmented and still represents only a small fraction of actual chemistry. Reductions in water and energy consumption and in toxic waste, with the economic, environmental and social benefits they bring, can only be attained through the full commitment of all parties involved in the supply chain.

Whether it is producing cellulose-based biopolymers or redesigning a catalytic reaction with zeolites, the challenge lies in maintaining the price-performance balance. Mainstream consumers need to change their attitudes and be ready to sacrifice either price or performance in order for green chemistry to grow. Since its birth in 1991, green chemistry has come a long way, growing from a small grassroots idea into a new approach to scientifically based environmental protection. Several commercial and government organizations are working to transform the economy into a sustainable enterprise. Accelrys’ contribution to green chemistry is small but concrete, as shown in this example. In our published work, we report on the nitration of toluene, an important precursor used in dyes, pharmaceuticals, perfumes, plastics, and more.

What are your thoughts on the evolution of green chemistry and its effects on the economy?
521 Views 0 References Permalink Categories: Materials Informatics, Trend Watch Tags: chemistry, catalysis, green-chemistry, polymers, sustainability, toluene, zeolites

This is the second post in an open letter responding to this blog entry by Keith Robison at Infinity Pharmaceuticals. Keith eloquently writes about his frustration in dealing with scientific software sales reps at a recent meeting. Pierre Allemand posted an initial reply explaining one vendor's general philosophy in selling software. In this entry, he offers some tips for dealing with overlapping software solutions and touts the importance of well-architected software.

 

Dear Keith,

 

Another key factor in evaluating software and cutting costs is to look at products that are architected properly. There is a lot of software out there that looks really great in a demo but is built with a single workflow in mind and is hard to adapt. Get your IT staff to spend time with the vendors you contact—they can tell you a lot about what’s under the hood. If you can see yourself working with the software presented, and your IT staff confirms that it will work the way it is advertised, you are in good shape.

 

Regarding the overlap you are seeing in software solutions, you will probably not be able to avoid it. Convergence is the word of the day in scientific software. ELNs, for instance, aren't just for medicinal chemists anymore, and now it seems that everybody from LIMS to CDS to SDMS vendors is selling ELNs.

 

But you can minimize the impact of overlaps. First, consider creative licensing agreements that ensure that you don’t pay twice for the same functionality. Vendors won’t want to discuss carving out a little piece here and a little piece there, but on larger overlaps, there is a chance you can do something.

 

Second, consider the actual usage of the systems. To avoid confusion and make data management more efficient, select which of the overlapping pieces you want to use for what, and then stick to it (unless there is a major reason for change). The worst outcome (well, worse than the license costs) is ending up with data scattered across different systems because you didn't have a clear definition of what goes where. Stan Piper of Pfizer discusses some of the headaches associated with managing convergence in the article titled "Informatics Strategies for the Convergent Analytical Lab" from our recent newsmagazine (registration required).

 

I hope those few points will help carry this discussion further. Thanks again for expressing so clearly the issues you are facing, and that we are facing too. The dialogue is what will help. I’m sure you can understand that software vendors will not develop things they cannot sell. But if there is a trend, we will listen. Because honestly, it is simply too expensive to develop something that is not selling :-)

489 Views 0 References Permalink Categories: Electronic Lab Notebook Tags: eln, workflow-integration, buying-software, guest-authors

A few weeks ago, one of our sales reps came across this blog entry by Keith Robison at Infinity Pharmaceuticals. In his write up, Keith describes attending a roundtable discussion in which several software sales reps kept asking, “Why aren’t you guys buying from us?” Robison’s entry brilliantly expresses what I think is a common frustration among scientific software customers. So I asked some of our sales reps to respond to his post. Here’s a reply from Pierre Allemand, VP, life sciences, Europe:

 

Dear Keith,

 

I am one of those software vendor sales guys that you mention in your post, and I really appreciate your post because it expresses perfectly the situation we, as providers, are facing. I see three main points in your post. First, you have to choose what to spend your money on, and software competes with other things (like CRO work). Second, when purchasing software, some of your investment will be lost because you cannot avoid buying overlapping functionality. And finally, implementing a vision will require a lot of customization (because the software vendor's vision will not be exactly yours).

 

When we develop software, we try to build an application that covers a specific area of needs in the best possible way, taking into account the needs of different people, which sometimes conflict. So we will never manage to develop YOUR solution out of the box—that would make it a custom project, and full custom projects are too expensive to sustain. When we can deploy an application that will be used by several customers at once, we can share development costs and everyone benefits.

 

Still, there is a way to spend your money and get a good return on the investment.

 

Most companies that approach us have developed a vision and then look for software that fits it. My recommendation would be to approach vendors before you have a complete vision. Software vendors will have developed a product with a vision built from many inputs, and that vision could bring a new perspective you might not arrive at on your own. Of course, you will need to approach several vendors to assess this. I offer more tips on how to work with vendors on large software purchases such as ELNs in this entry.

 

I would then stick with the vendor that has the most interesting vision--the one you can best identify with--and see if that vision could be implemented in your research context. Maybe some of the changes required in how you work will not be possible (and this is where consulting should come in), but maybe some will give you a real gain.

 

My reply is getting a little long, so in the next entry I'll address your points about overlapping software solutions, and also mention the importance of checking how the products you are considering are architected.

507 Views 0 References Permalink Categories: Electronic Lab Notebook Tags: eln, workflow-integration, buying-software, guest-authors

ELN in Regulated Environments

Posted by dcurran May 25, 2010
Implementing an ELN in a regulated environment is uniquely challenging, because the paper-based workflows and processes are not just familiar, but associated with specific actions demanded by regulatory requirements. This codifies paper processes in ways that must be addressed if an ELN is going to be successfully accepted by an organization's scientists, quality teams, and compliance and legal experts.

Two Symposium presentations discussed implementing Symyx Notebook in regulated settings. We interviewed Adele Patterson of Bristol-Myers Squibb and John Leonard of AstraZeneca about the particular roadblocks they encountered in their efforts to improve the efficiency of validated workflows for, respectively, GxP analytical R&D and GMP API manufacture. In both organizations, working with all stakeholders to gain clarity about what the regulations demand, and how to satisfy those demands in an electronic world, led to successful implementations.

499 Views 1 References Permalink Categories: Electronic Lab Notebook Tags: case-studies, eln, symyx-notebook-by-accelrys, symposium, videos, validated-workflows

The inherently complex nature of cell lines, plasmids, proteins, antibodies and vaccines makes a biological registration system challenging. Yet such systems are needed so that researchers and companies can track these entities and their relationships, creating critical intellectual property positions as well as connections to past research and manufacturing processes.

 

Patterned on the services of registration systems for chemical entities, which are well-known and entrenched in the drug discovery process, the Accelrys Biological Registration system is an "intelligent" solution for registering, associating, searching and retrieving data for entities such as siRNA, plasmids, cell lines, proteins, antibodies, vaccines and future biological entities.

 

Join us on Wednesday, May 26 for our live webinar, “Intro to Accelrys Biological Registration,” the first in a series on biological registration.  To register or learn more, please click here.

496 Views 0 References Permalink Categories: Bioinformatics Tags: biologics, webinars, proteins, antibodies, biological-registration, cell-line, plasmids, sirna

Sustainability: Part 3

Posted by mdoyle May 21, 2010

Products evolve! OK, a large number of you will be surprised by my conjecture right now. But let's examine it for a second and, while we are at it, consider that there are aspects of intelligent design in products as well.

 

Case #1: the car. Can it not be argued that from the first "horseless carriage" through the Model T, the Edsel, and the '69 Camaro (a personal favorite) to the Prius and Volt, the automobile has evolved, i.e. changed in an incremental fashion in response to external stimuli? Is this not similar to the successive modifications organisms undergo in response to predators or other environmental changes?

 

Also, before you say there are no genes or DNA to pass on from one model year to the next, think of the various complex modes of existence in nature where forms change: the caterpillar to the butterfly, or the Amazonian ant bacterium whose life cycle runs from spore to infected ant, forcing the ant (entirely involuntarily) to climb to the top of a tree, sit there, die, and decompose, thereby spreading the organism more widely. An added benefit for the bacillus comes if the whole colony is infected. OK, so from ants to a Model T; what is the connection?

 

Well, the ideas, concepts, and processes involved in producing a car live on through many organisms and states, i.e. designers, design teams, and versions. What's interesting about car evolution, as a concept, is that cars tend to incrementally fill certain environmental niches: trucks, SUVs, compacts, and family wagons or estates. They also have a wide variety of plumages, or variants, that survive or die out depending on their fitness for a consumer function.

 

Case #2 is computation, or computers. If you follow the line from the Chinese abacus to Babbage's Difference Engine to the ICL 3000 series, and on to the PC-XT, the Commodore PET, and the iPad and BlackBerry on which I am writing this note, you will see many dead ends and unfit products, but also consistent moves toward filling a diverse series of functional niches. A further point is that these products are, to a degree, bound by the reality of what can be made, what the market can distribute, service and support, and what people want; the tradeoffs that occur are complex and lead to a diverse set of solutions that is nonetheless bounded and limited. It is rather like a Mandelbrot product space, with a range of solutions within a finite boundary.

 

The point I am making here is that product design, ideation, and development are complex, interrelated processes subject to many competing constraints and qualitative issues; they involve many people, and they are usually cumulative but can involve sudden and dramatic change. Now, on the intelligent design side of the coin: when individuals are asked to design a new product, they do blue-sky analysis, current market analysis, opportunity analysis, and financial analysis. They then factor it all through a series of human (or sometimes computer) models and metrics and arrive at an intelligent design step. Occasionally these designs are not so good and die out in the market; sometimes, like the cell phone, they evolve rapidly and spread globally. What is common to both of these perspectives is the need for information, knowledge, and understanding. What scientists try to do is take all of this information and make sense of it, to understand how and why things occur.

 

We’ll continue the discussion in Part 4 of this blog series.

 

“Sustainability: Part 3” is one posting within a “Sustainability” series by Michael Doyle.  To view all posts in this series, please click here.

382 Views 0 References Permalink Categories: Materials Informatics, Trend Watch Tags: sustainability, product-evolution

I arrived in London via Heathrow and took the express train to Paddington. What followed was an interesting, but far from efficient, taxi ride to London City Airport, taken on the advice of an online travel forum. Fifty-some-odd pounds later (not from the chocolate cake and excessive wine at a family gathering in Vegas last weekend, but rather the British pounds that withered the muscle of my American dollars), I found myself waiting in the terminal, in conversation with a medical doctor and a systems engineer, discussing our generation's understanding of the human genome. I was excited to hear both the doctor and the systems engineer agree that information technology, if applied correctly, will bring us massive systems biology insight in a very short time as we start to look at patterns in sequencing data and link them to other scientific and clinical data.

 

Systems biology will play an important role in providing a deeper understanding. No longer is it sufficient to outfit our scientific research organizations with software built for specific cameras, microscopes, or other related hardware platforms. Software systems for chemistry, imaging, and biologics, including second- and even third-generation sequencing systems, will need to reach far beyond their current myopic data processing capabilities to enable researchers to make better-qualified decisions.

 

IT experts continue to explore ways to equip their teams with best-in-class software while reducing costs, maximizing the value of existing technology, streamlining workflows, and supporting collaboration across scientific domains that have historically not been able to join forces. Best practices include solutions that easily integrate image data with a range of other scientific data types from diverse areas, including life science research, chemistry, materials, electronics, energy, consumer packaged goods, pharmaceuticals, and aerospace. At the enterprise level, imaging solutions need to integrate with other enterprise applications: commercial and open-source imaging software, as well as enterprise data management systems and corporate portals such as Oracle and SharePoint.

Informaticians, researchers, and other decision makers need a facile solution that brings together images and other associated data, such as chemistry, biological sequences, text, and numeric data, in a unified data structure. Associating images, or image regions, with specific reported data increases researcher productivity and cross-disciplinary understanding. Automating error-prone manual tasks, such as gathering images and associated data, processing, analyzing, preparing and importing data, generating reports, and distributing results, has often required custom-built applications. New solutions will need to bypass lengthy coding cycles with "on-the-fly" debugging and immediate deployment of high-quality solutions.

 

Perhaps on my way from Heathrow to London City, if I had had a proper tool that let me look at the available data, giving me high-quality results with facile links to images (maps) so that I could verify I was making the best choices, I could have arrived faster with less of a burden on my company's budget.

493 Views 0 References Permalink Categories: Bioinformatics, Trend Watch Tags: sharepoint, microsoft, pipeline-pilot, life-science, pattern-recognition, genomics, machine-learning, image-informatics, systems-biology

Hey Pistoia Alliance! Were your ears burning two weeks ago? Maybe I’m just hyper-aware of Pistoia’s mission given that we've posted three articles on this group, but talk after talk at the Symyx Symposium pointed out that without standards and partnership, life-science companies will have difficulty realizing the step-change innovation they are counting on to create the products of the future. [Editor's note: In July 2010 Symyx merged with Accelrys, Inc.]

 

Ashley George, Ulf Fuchslueger, Paul McKenzie, and Jason Bronfeld each pointed out that all life science research companies work with the same basic materials, equipment, and procedures.

 

“Sourcing,” George said by way of example, “isn’t competitive, it’s just something we need to do.” McKenzie was more explicit. “I get no proprietary advantage in running a centrifuge, unless I can run that important compound that I’ve discovered tomorrow rather than next year.”

 

But the problem articulated by Bronfeld is that many organizations get caught up in optimizing local, domain workflows and neglect the overall information “journey” that each workflow ultimately supports. This journey leads to “predictive control,” where information is abundant and processes are what Fuchslueger described as “self-documenting.” Reaching this state, however, requires organizations to “get past the bag of applications that we all have and think about how to thread them together,” said Bronfeld.

 

McKenzie went into the most detail about how myopic, domain-centered attempts to enhance R&D productivity distract organizations from innovating. He distilled every type of process—including the most complicated analytical experimentation in R&D—to baking a cake. He proposed a system-independent recipe model that represents a process through standard descriptions of raw materials, quantities of material, process parameters, procedures, and equipment. Common methods and vocabularies for the things everyone does—running a centrifuge, obtaining a Kf, or weighing a sample—will enable companies to focus their innovation on what they are studying and why, which is ultimately the thinking and work that sets them apart competitively.
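A minimal sketch of what such a system-independent recipe might look like as a data structure follows. This is my own interpretation of the description above, not McKenzie's actual model, and all the field names are illustrative.

```python
# Illustrative interpretation of a system-independent "recipe" record:
# standard descriptions of materials, quantities, parameters, procedure
# steps, and equipment, independent of any one lab system.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Material:
    name: str
    quantity: float
    unit: str

@dataclass
class Step:
    action: str                 # e.g., "charge", "stir", "centrifuge"
    equipment: str
    parameters: dict            # e.g., {"speed_rpm": 3000, "time_min": 10}

@dataclass
class Recipe:
    name: str
    materials: List[Material] = field(default_factory=list)
    procedure: List[Step] = field(default_factory=list)

cake = Recipe(
    name="analytical cake bake",
    materials=[Material("flour", 500, "g"), Material("water", 250, "mL")],
    procedure=[Step("mix", "planetary mixer", {"speed_rpm": 120, "time_min": 5}),
               Step("bake", "oven", {"temp_C": 180, "time_min": 40})],
)
print(len(cake.procedure), "steps;", len(cake.materials), "materials")
```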

 

George, a member of Pistoia, was the only speaker to mention the organization by name. But the end-games described by so many other speakers require pre-competitive partnership—the type of activity that Pistoia has been set up to support. I’d love to hear from the speakers, blog readers, and Pistoia reps about how you think these necessary collaborations can get underway.

378 Views 0 References Permalink Categories: News from Accelrys, Trend Watch Tags: symposium, pistoia-alliance, standards

The recent oil spill in the Gulf of Mexico is a catastrophe whose consequences will be felt for years to come. Prince William Sound, Alaska, still shows the presence of oil from the Exxon Valdez spill in 1989. Nevertheless, my intent in this blog entry is not to castigate the parties responsible or to lament the damage to the environment, but rather to discuss the chemistry that led to the initial explosion and subsequent tragedy. The problems began at 10 PM on April 20, when a surge of gas (apparently CH4) and oil shot up the drill pipe. Much mention has been made of hydrates, which seem to have contributed to this initial surge and which have subsequently impeded several attempts to cap the well.

 

 

[Image: molecular model of a gas hydrate, from the Leibniz Institut fuer Meereswissenschaften]

 

 

Hydrates (as explained by the Leibniz Institut fuer Meereswissenschaften) are "non-stoichiometric compounds. Water molecules form cage-like structures in which gas molecules are enclosed as guest molecules" (see image). These structures occur naturally in many deep ocean sites. Any number of gases can be found in the lattice, but the one of interest here is methane, which is produced by "zymotic decomposition of organic components or by bacterial reduction of CO2 in sediments." Hydrates are stable over a specific range of temperatures and pressures, as described here and shown in the image below. There is a limited zone of stability: hydrates exist at high pressures and relatively low temperatures. The weight of the ocean adds roughly 1 atm of pressure for every 10 meters of depth. Temperature initially drops with ocean depth, but then increases beneath the sea floor. Once you get deep enough, and hot enough, the hydrates are no longer stable. Consequently, their occurrence is limited to a relatively narrow band beneath the ocean floor. The Deepwater Horizon operation was taking place in about 1,500 m (5,000 ft) of water and drilling to a depth of about 5,500 m (18,000 ft).
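As a rough back-of-the-envelope check on that stability band, here is a minimal Python sketch. The 1 atm per 10 m rule of thumb comes from the paragraph above; the hydrate equilibrium curve is an assumed, purely illustrative exponential with placeholder coefficients, not fitted data.

```python
# Rough hydrate-stability check at the well site.
# The hydrostatic rule of thumb (1 atm per 10 m) comes from the post;
# the equilibrium curve below is an ASSUMED illustrative form, not fitted data.
import math

def hydrostatic_pressure_atm(depth_m, surface_atm=1.0):
    """Approximate absolute pressure (atm) at a given water depth."""
    return surface_atm + depth_m / 10.0

def hydrate_equilibrium_pressure_atm(temp_c):
    """Illustrative dissociation pressure for methane hydrate (atm).

    Assumes a simple exponential (Clausius-Clapeyron-like) form with
    placeholder coefficients chosen only to reproduce the qualitative
    trend: a few tens of atm near 0 C, rising steeply with temperature.
    """
    return 26.0 * math.exp(0.11 * temp_c)  # placeholder coefficients

depth_m = 1500.0          # ~5,000 ft of water at the well site
seafloor_temp_c = 4.0     # typical deep-sea temperature (assumed)

p = hydrostatic_pressure_atm(depth_m)
p_eq = hydrate_equilibrium_pressure_atm(seafloor_temp_c)
print(f"Pressure at {depth_m:.0f} m: ~{p:.0f} atm")
print(f"Assumed equilibrium pressure at {seafloor_temp_c:.0f} C: ~{p_eq:.0f} atm")
print("Hydrate stable?", p > p_eq)
```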

 

 

[Image: stability diagram for gas hydrates in the marine environment, from the Leibniz Institut]

 

As reported in the Financial Times energysource blog, "It has been suggested that they [hydrates] may have been responsible for the leakage of gas into the Deepwater Horizon’s drill riser..." Why did the gas appear then? The crew of the Deepwater Horizon was in the process of capping off the well, which involves pumping concrete into the top to seal it (there is a nice graphic of that process in the FT). It is known that this process can destabilize hydrates: a 2009 report by Halliburton, cited again in the FT's energysource blog, warned that "gas flow may occur after a cement job in deep-water environments that contain major hydrate zones." As the FT blog summarizes, gas might stop flowing from the hydrates in a few hours or days, or - if you're unlucky - it might not stop. The chemistry of concrete is explained on this site maintained by WHD Microanalysis Consultants Ltd, who mention that the curing of concrete is an exothermic process, with the period of maximum heat evolution typically occurring between about 10 and 20 hours after mixing. I can't help wondering whether the heat released by the setting concrete could contribute to destabilization of the gas hydrates. Anybody got any thoughts on that?


Subsequently, gas hydrates played a role in hindering attempts to stop the flow of oil. Remember the 100-ton steel and concrete box they tried to place over the leak? As reported in the FT (again): 'When gas leaks out [from the well], its pressure drops and it cools... In the presence of water, light hydrocarbon liquids can react with water to form a ... hydrate. This happens quite often in gas pipelines, but the circumstances 5000 ft down on the sea bed ... make this very difficult to control.' In other words, the leak from the well is creating even more hydrates. The hydrates clogged the containment box and forced a halt to the operation.

 

Incidentally, methane hydrates would be a great source of energy. Unfortunately, burning that methane is not carbon-neutral: a tremendous amount of CO2 would be released. Still, it's really fascinating to see ice burn as the CH4 is released.

 

Fascinating chemistry. As a theoretical chemist, I've been thinking about how modeling could help. Modeling could predict (T,P) phase diagrams for CH4 in H2O lattices. Monte Carlo simulations can predict loading curves for these structures, while molecular dynamics or DFT could predict the thermodynamic and kinetic stability of methane absorption. Ultimately, you'd like to be able to use such approaches to identify ways to stabilize these structures, or to destabilize them in a controlled manner: imagine pumping in a chemical that causes the CH4 to be released sloooowly. The advantage of computational methods, of course, is that models won't blow up no matter how much pressure you apply to them or how much methane 'escapes.'
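One standard relation for connecting such simulated (or measured) equilibrium points is the Clausius-Clapeyron equation as commonly applied to gas hydrates: fitting ln P against 1/T along the dissociation curve yields an apparent dissociation enthalpy. This is a textbook relation, not something specific to the work described in this post:

```latex
% Clausius-Clapeyron form commonly used for gas-hydrate equilibria:
% z is the gas compressibility factor, R the gas constant.
\frac{d\,\ln P}{d\,(1/T)} = -\frac{\Delta H_{\mathrm{diss}}}{zR}
```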

 

Such a study might help prevent future tragedies, but the main focus of scientists & engineers now needs to be on the cleanup. More on that in a future blog.

483 Views 0 References Permalink Categories: Materials Informatics, Trend Watch Tags: green-chemistry, gas-hydrates, oil-spill, petroleum
Last week Accelrys held its annual North American User Group Meeting (UGM) in Boston, coinciding with the launch of Pipeline Pilot 8.0, the latest release of our scientific informatics platform. The meeting was well attended and continued its tradition of very active interaction between those of us from Accelrys and our users. The parts of the meeting I enjoyed the most, and which I find most valuable as a product manager, were the impromptu discussions during coffee and meal breaks, or after a particularly interesting presentation. One key topic that came up regularly was the level of interest in the integration of Microsoft SharePoint and Pipeline Pilot. This work was started more than two years ago, with Pfizer playing a critical role in driving the development of the Pipeline Pilot Bridge for SharePoint, which allows Pipeline Pilot protocols to be run from SharePoint and the results to be deployed back as Web Parts in a SharePoint portal page. One of the key anticipated benefits of this integration was enabling SharePoint collaboration capabilities with scientific content and intelligence from Pipeline Pilot.

In discussions over the last year, including several at the UGM last week, this benefit has been borne out. SharePoint is well recognized for its enterprise-scale collaboration, document management, and portal-based information presentation capabilities. However, scientific researchers have apparently been somewhat reluctant to fully commit to SharePoint because it doesn’t inherently contain the sorts of highly advanced scientific data analysis and presentation capabilities that they use routinely in their research. According to the discussions I had, this is what Pipeline Pilot is now providing.

It’s clear that the importance of SharePoint is ever-increasing in the life and materials sciences, as it is throughout many other industries. In Pipeline Pilot 8.0 we have extended the integration with SharePoint so that Pipeline Pilot protocols can read from, and write to, the SharePoint document repository. We believe this is a crucial extension, since SharePoint is increasingly being used as a corporate document repository. Allowing Pipeline Pilot to read from the repository and extract information and data from the documents opens up exciting possibilities, including the potential to mine scientific data (including live chemical objects, biological entities, sequences, etc.) from the documents and save this information to a scientific index, complementing SharePoint’s own enterprise search with scientifically enabled search capabilities. For example, the ability for a scientist to search for documents with a chemical structure, or a corporate compound ID, and find documents containing similar chemical structures opens up new opportunities for information retrieval and for using existing organizational knowledge. The ability to write the results of a Pipeline Pilot protocol to the SharePoint document repository provides further benefits for information management and later retrieval.
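To make the flow concrete, here is a minimal sketch of kicking off such a document-indexing protocol from a script over HTTP. The server URL, endpoint path, protocol path, and parameter name are all assumptions for illustration, not the documented Pipeline Pilot web-service API; consult the product documentation for the real interface.

```python
# Illustrative only: launching a (hypothetical) document-indexing protocol on a
# Pipeline Pilot server over HTTP. The URL path, parameter names, and protocol
# path below are ASSUMPTIONS, not the documented Pipeline Pilot web-service API.
import requests

PP_SERVER = "https://pp-server.example.com:9943"            # hypothetical server
PROTOCOL = "Protocols/Examples/Index SharePoint Documents"  # hypothetical protocol

def run_indexing_job(library_url, user, password):
    """Ask the server to run the protocol against a SharePoint document library."""
    response = requests.post(
        f"{PP_SERVER}/launchjob",                 # assumed endpoint name
        auth=(user, password),
        data={
            "protocol": PROTOCOL,
            "SharePointLibraryURL": library_url,  # assumed protocol parameter
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.text  # e.g., a job ID to poll for status

if __name__ == "__main__":
    print(run_indexing_job(
        "https://sharepoint.example.com/sites/research/Shared Documents",
        "scientist", "secret"))
```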

Of course, as always, our users will find many more ingenious ways to use Pipeline Pilot, and its integration with SharePoint, than we at Accelrys can ever predict. And if my discussions with many of them last week are any indication, we’ll be seeing some of those ideas implemented over the next year, and perhaps presented at the 2011 UGM. One thing seems certain: the combination of SharePoint and Pipeline Pilot will go a long way toward really providing researchers with science at their fingertips.

Should you have any questions about scientifically enabling SharePoint with Pipeline Pilot, contact Andrew LeBeau at alebeau@accelrys.com
667 Views 0 References Permalink Categories: Lab Operations & Workflows Tags: sharepoint, pipeline-pilot

Wendy Warr provided a series of reviews of the first day of Symposium. Here is her take on Day Two. Read on for her comments on the keynote talks, the impact that "picky customers" have on product development, and the Symyx/Accelrys merger. [Editor's note: In July 2010 Symyx merged with Accelrys, Inc.] More reviews of Symposium will be published next week. In the meantime, thanks for attending the meeting, Wendy!


The keynote presentation on Day Two was given by Paul McKenzie of J&J (and formerly of BMS). His theme was “lab-to-patient” development. His staff, he said, are experts in arts and crafts (i.e., cutting and pasting), but he gave that up after kindergarten. “Homo sapiens data integration” does not work: we need a recipe-based approach, and XML, underpinned by S88 standards. With a harmonized data model, recipe structures can be exchanged among systems. In the final plenary presentation, on strategy-driven informatics, Jason Bronfeld of BMS also commented on this sort of standardization.

 

Although the meeting was fairly heavily weighted toward development rather than drug discovery, there were some highlights for those of us who have no background in process chemistry, analytical work, and other development issues. On Day One, Symyx made it clear how much the company actually welcomes complaints from picky customers, because that’s the way Symyx products get ever better. Since this is a family-friendly blog I won’t repeat the exact expression that was used, but Ulrik Nicolai de Lichtenberg of Leo repeated it on Day Two in his talk about the ISIS-to-Isentris migration. Early versions of Isentris clearly did not meet his company’s requirements, and he gave a nice analysis of which parts of the current system at Leo could be tweaked to address the “performance, performance, performance” issue. Elke Hofmann of Merck KGaA also gave a very honest appraisal of Isentris. Trevor Heritage of Symyx spoke up to assure her that Pipeline Pilot will not sweep away all the good work that has been done with Isentris.

 

The question on everyone’s minds, of course, was the future of SymyxAccelrys, AccelrysSymyx, or whatever they call it. (Remember Axle Grease?) The way the market is right now, the merger was probably inevitable, but I still believe that this will be a win-win situation for the partners (and, I hope, for their customers). It is not such good news for some of the competitors.

476 Views 0 References Permalink Categories: Cheminformatics, Lab Operations & Workflows Tags: user-group-meeting, isentris, guest-authors, symposium

This is the final entry in a series of write-ups on the Pistoia Alliance. We were proud to discover last week that our entries have received wide play: the article imaged below, from a supplement to The Times of London, was drawn primarily from our blog series! We're glad to assist Pistoia in publicizing their efforts.

 

[Image: article from a supplement to The Times of London]

This entry explains the challenge of recruiting and retaining members in a virtual organization. Thanks to Nick for speaking to us so candidly about the group. And by the way, Pistoia is now looking for a part-time executive director. If you would be interested in this position, you can find a job description here. The deadline is Friday!!

 

The first face-to-face meeting of Pistoia members in January marked a turning point for the organization in many respects. Until that point, the organization had conducted all of its business virtually and electronically. And while the group still intends to maintain most of its activities electronically, it realized at that meeting the need for a formal structure to keep things on track.

 

“In this new, social-media driven environment, there’s a temptation to let things go and happen more organically,” says Nick Lynch, Pistoia’s president. “You think you’ll just get this great community that you’ll be able to mine for project ideas. But the truth is the supporting technology is still very immature. You don’t know which will be the right hub or where the commentary or innovation will come from. We’re trying to find the right balance to enable people to get involved and make connections that will lead to projects and, eventually, deliverables.”

 

Pistoia’s decision to divide members into working groups was one way to collect innovation and, as Nick says, “spread bets a bit by having different stuff in our early portfolio to see what takes off.” The organization is also experimenting with different methods of fostering communication. Nick points out that many pharmas don’t do a good job of social interaction internally, relying mainly on file-based communication. His organization, for instance, maintains a wiki, but ultimately, you have about 1% of people writing for it and 99% reading. “That really isn’t sustainable,” Nick says.

 

The organization maintains both a public website and a private, members-only site (the latter was migrated in late March from Ning to Basecamp). The Basecamp site includes functionality to enable dialogue between members as well as space to post project statuses and plans. The organization will also publish updates quarterly and is hiring an executive director to ensure that projects stay on track.

 

“Seeing results will keep our members engaged and energetic,” says Nick. “And it is energy—along with delivery and showing the benefits of our work—that keeps us going.”

371 Views 1 References Permalink Categories: Electronic Lab Notebook, Trend Watch Tags: pistoia-alliance, standards
Check out this presentation on SlideShare to learn more about four key bioactivity databases. The presentation provides general information on what each database contains, information about what questions they can answer, and details about how they help scientists. [Editor's note: In July 2010 Symyx merged with Accelrys, Inc. Symyx Bioactivity Databases are now Accelrys Bioactivity Databases.]
384 Views 0 References Permalink Categories: Scientific Databases Tags: toxicology, presentations, content-in-context, bioactivity, metabolite

Sustainability: Part 2

Posted by mdoyle May 12, 2010

Ah, the joys of long-haul flights: trying to remain hydrated and getting some rest. I digress, and wonder whether someone will someday develop a skin cream to do just that!

 

Anyway, I notice with great interest that the lights over my seat are low-power light-emitting diodes, that I can see where we are on a large in-seat liquid crystal display supported by advanced in-seat wiring, and that advanced materials chemistry is all over my conveyance in the sky. The fabric in the seat I am sitting on has a long life and hence is a low-carbon-footprint product. The flame retardants used in it, and in many of the composites, cables, and panels, to name a few, are designed to maximize performance and minimize aircraft downtime; they are sustainable because they contribute to product longevity and have end-of-life or reuse plans factored and designed into them.

 

I am sure you are thinking that flying consumes a lot of fossil fuel and so is the antithesis of my theme here; well, perhaps in the future it will not be so! First, people really do want to travel, so demand is an inevitable fact, as with the kind lady next to me going to see her new grandchild. It is therefore incumbent on us technologists to enable this in the most efficient and sustainable way. Boeing's Dreamliner, which shows upwards of an 18% improvement in fuel efficiency per passenger mile, and the recent successful FAA-Continental 757 test flight on BioAvgas (only one turbine ran on the biofuel) show the way to satisfying an inevitable human need while improving system efficiency and contributing to sustainable technology development.

 

Now, if we follow this technology pathway into the materials that constitute our systems, we see that advanced chemistry, and an understanding of the interactions of additives, polymers, fibers, pigments, and oxides, is the core, fundamental design space we have to work with. Think of this dream or goal: if we could engineer materials at the atomic or nano scale, just as the IBM researchers did when they wrote their corporate logo on a surface with individual xenon atoms, we could design materials that deliver the optimum of form, function, and usage over the maximum life span. How do we get from assembling a few atoms to building sustainable planes and products? Through demand and pressure, yes, but also through an understanding of leading-edge materials science and engineering. Virtual chemistry, that is, simulation at the quantum, atomistic, and meso scales, is essential in this regard because it allows us to design, develop, and someday test materials in a sustainable way, in virtuo, where we waste no material. The scientists at Yokohama recently started developing a “greener” car tire; to maintain their processes, they looked for compatible non-petroleum-based oils and found a by-product orange oil from the food industry. With suitable compatibilizers, this allowed them to produce a high-performance product while managing all the competing factors of cost, process, and ply and fiber/sidewall interactions. There are over a dozen individual parts in a car tire, and yes, the tire companies are significant users of virtual polymer technology.

 

Also consider this: looking at the food tray and packaging I was just brought, I can see how the demand for plastics has been reduced over the last five years and how the design of attractive and functional products supports the reduce, reuse, recycle initiatives. So I think I will go refill my 30%-thinner plastic water bottle from the galley, enjoy my meal, and continue to wonder where we, an inquisitive, technology-motivated species, will travel to next.

 

"Sustainability: Part 2" is one posting within a "Sustainability" series by Michael Doyle.  To view all posts in this series, please click here.

381 Views 0 References Permalink Categories: Materials Informatics, Trend Watch Tags: virtual-chemistry, biofuel, polymers, sustainability, carbon-footprint, fossil-fuel, virtual-polymer-technology

Wondering what to make of Wendy Warr's comments on the All New DiscoveryGate? Take it for a test drive yourself. The Beta2 version of the All New DiscoveryGate is available now. If you'd like to become a tester, comment here and I'll get you hooked up.

374 Views 0 References Permalink Categories: Scientific Databases Tags: discoverygate

Wendy Warr has provided two reviews so far of Day 1 of Symposium. Read on for her impressions of the talk by Randy Haldeman and Russ Hillard on the All New DiscoveryGate. [Editor's note: In July 2010 Symyx merged with Accelrys, Inc.]

 

And so to my comments on the "All New DiscoveryGate." My imagination was at first fired by Randy Haldeman, the new senior vice president of content at Symyx. His background is not chemistry: he comes from the Internet and mobile space, and he really believes in “leveraging” the ideas of the Concept Web Alliance. Like lots of folk at Symyx, alas, he thinks that “leverage” is a verb, but I’ll forgive him that because he presented some interesting futuristic ideas: for example, putting a great big “Buy” button into the All New DiscoveryGate version of ACD, building a (free!) collaborative site for scientists, and adding a high-end chemistry calculator.

 

My good friend Russ Hillard then showed off the All New DiscoveryGate (a beta version to be released next week). It is based on the DiscoveryGate Web Service released in 2008 and has a really up-to-date look and feel. Search results appear mid-page (table, grid, and detail views are options) and filters appear on the left. The beta version will deliver “only” ACD, but there is a “molecule as product” option, which presents a reaction in a reaction tab (with access to full text) and links to sourcing data in the molecules tab. The next step will be to add reactions: 5.5 million of them. ('Could it be that this field has recently become rather competitive?' I ask myself.) E-commerce will follow (the big “Buy” button), then SCD, bioactivity content, indexed links to PubChem and ChemSpider, and DiscoveryGate forums. A potential user asked, 'What if you could drop reactions into an ELN?' 'Absolutely!' is the answer--not to mention reaction planning.

501 Views 2 References Permalink Categories: Scientific Databases Tags: discoverygate, guest-authors, symposium

A few months ago, I posted on model applicability domains (MAD) and why they are important, with a promise to say more in a future posting. Now that Pipeline Pilot 8.0 is out, the future is here, and I can say what I have been chomping at the bit to say for the past three months: we now have extensive MAD support for all model types in Pipeline Pilot, including R models.

 

At last week's Accelrys North American User Group Meeting, I presented research results on using some of these MAD measures to quantify the performance of both regression and binary classification models. Specifically, we can use measures of distance from test data to training data to compute error bars for regression models. For classification models, we can compute performance measures such as ROC scores, sensitivity, and specificity. The key point is that the model performance metrics calculated with the aid of the distance measures vary from test sample to test sample according to how close each sample is to the training data. The metrics are not just averages over the entire test set. (See the following picture.)

 

 

[Figure: RMS error versus distance quartile for regression models of 4 different data sets (automobile, LogP, fathead minnow toxicity, hERG). Results averaged over 1000 training/test resamplings; error bars show standard deviation.]
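Here is a minimal sketch of that kind of distance-quartile analysis using NumPy and scikit-learn. It is only an illustration of the idea, not the Pipeline Pilot implementation, and it assumes the training and test arrays are already loaded.

```python
# Illustration only: bin test-set regression error by distance to the training
# data, as in the figure above. Not the Pipeline Pilot implementation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import NearestNeighbors

def rmse_by_distance_quartile(X_train, y_train, X_test, y_test):
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    errors = model.predict(X_test) - y_test

    # Distance of each test sample to its nearest training neighbor.
    nn = NearestNeighbors(n_neighbors=1).fit(X_train)
    dist, _ = nn.kneighbors(X_test)
    dist = dist.ravel()

    # Assign each test sample to a distance quartile and report RMSE per bin.
    edges = np.quantile(dist, [0.25, 0.5, 0.75])
    quartile = np.digitize(dist, edges)          # 0..3
    return [np.sqrt(np.mean(errors[quartile == q] ** 2)) for q in range(4)]
```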

 

 

The basic idea and qualitative conclusions of this research are not new and have been reported by other researchers (see my previous posting for links). But some of the details are indeed new.

 

For binary classification models, one such interesting detail is that not only can the distance measures indicate the expected model performance for particular samples, but it may be possible in some cases to use these measures to improve the predictive performance of the model. The way we do this is by varying—according to the distance from a sample to the training data—the score cutoff that we apply to the model prediction to distinguish between the two classes. For some models, this gives better combined sensitivity and specificity than we get from applying a single cutoff value to all test samples. (The improvement was seen for one balanced data set but not for two imbalanced ones, so more work needs to be done to see whether the results for the balanced data were a fluke.)
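A rough sketch of that idea follows; again this is only an illustration, not the method as implemented in Pipeline Pilot. A separate score cutoff is chosen per distance quartile on a validation set by maximizing the sum of sensitivity and specificity within each bin; new samples would then be classified with the cutoff of their own quartile.

```python
# Illustration only: choose a per-distance-quartile score cutoff on a
# validation set by maximizing sensitivity + specificity in each bin.
import numpy as np

def per_quartile_cutoffs(scores, labels, dist, edges):
    """scores/labels/dist are validation-set arrays; edges split dist into 4 bins."""
    quartile = np.digitize(dist, edges)
    cutoffs = []
    for q in range(4):
        s, y = scores[quartile == q], labels[quartile == q]
        candidates = np.unique(s)
        best = max(
            candidates,
            key=lambda c: ((s >= c) & (y == 1)).sum() / max((y == 1).sum(), 1)   # sensitivity
                        + ((s < c) & (y == 0)).sum() / max((y == 0).sum(), 1),   # specificity
        )
        cutoffs.append(best)
    return cutoffs
```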

 

For more details on this research, and on MAD support capabilities in Pipeline Pilot 8.0 and how to use them, please see my posting in the Pipeline Pilot forum.

593 Views 0 References Permalink Categories: Data Mining & Knowledge Discovery Tags: qsar, statistics, pipeline-pilot, user-group-meeting, data-mining, data-modeling, classification-models, machine-learning, roc-analysis

See Wendy Warr's initial post reviewing the first day of Symposium here. Here are some more thoughts from her on Day 1.

 

I should also have mentioned Ashley George’s thought-provoking perspective on the cloud. (An earlier post on this blog outlined my interest in cloud computing, and you can find an article I wrote on the subject here). Ashley found me some more “-aaSs," including UCaaS (unified communications as a service) and BPaaS (business process as a service). We live in interesting times, as they say.

489 Views 2 References Permalink Categories: The IT Perspective Tags: user-group-meeting, cloud-computing, guest-authors, symposium
Everyone at Accelrys is excited to see that Pipeline Pilot 8.0 (PP8) has launched and will soon be in the hands of customers and partners to advance their goals in science. Not to be overshadowed by the science enhancements in PP8, there are also a ton of great features for administrators to take advantage of.

While there’s a huge list of improvements, I’ll start off with a few I think are pretty interesting.

64-bit Computing with Windows Server 2008 R2

Both Microsoft Windows Server 2008 (including Windows Server 2008 R2) and 64-bit computing have been growing in adoption in many Information Technology (IT) environments due to their power and flexibility. And let’s not forget all of the great advancements that our friends over at Intel have made with their server-class Xeon chips, which also make targeting 64-bit so attractive from a performance perspective. Pipeline Pilot 8.0 now officially supports Windows Server 2008 (Service Pack 2) and Windows Server 2008 R2 on 64-bit platforms. We continue to support 32-bit platforms as well on Windows Server 2003 (Service Pack 2), Windows XP, and Windows 7. We decided to focus our development resources on the latest 64-bit platforms available from Microsoft to ensure longevity of support and to maximize the capabilities of the operating system.

And just as a reminder, we also support 64-bit processing on Red Hat Enterprise Linux 5.

Job Queuing

The popularity of Pipeline Pilot in many scientific organizations has created some interesting work habits because of the consumption of computing resources on a given server. Pipeline Pilot 8 (PP8) allows you to define a threshold for the number of active protocols that can execute at any given time. An administrator can, for example, allow only 5 protocols to run at a time. Users can continue to submit jobs, but if 5 jobs are already running, the others are added to a queue (which the administrator can also manipulate in real time).
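The behavior is easy to picture with a small, generic analogy in Python (this is not how Pipeline Pilot itself is implemented): a fixed-size worker pool runs jobs up to the threshold, and everything else waits its turn in a queue.

```python
# Generic analogy for the threshold-plus-queue behavior described above;
# not Pipeline Pilot code. Five workers run at once, the rest wait.
from concurrent.futures import ThreadPoolExecutor
import time

def run_protocol(name):
    time.sleep(1)                      # stand-in for real protocol work
    return f"{name} finished"

with ThreadPoolExecutor(max_workers=5) as pool:   # the admin-defined threshold
    jobs = [pool.submit(run_protocol, f"protocol-{i}") for i in range(12)]
    for job in jobs:
        print(job.result())            # queued jobs start as workers free up
```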

Data Management

Data Management is a new capability within the Administration Portal of PP8 that enables an administrator to define named data sources that can be leveraged by protocols. This allows an end user to select a named data source and use it directly within a protocol without needing to understand the connection details for that database. It is also a great help when building a protocol in a development environment with test data and then moving it to a production server: the named data source in the protocol remains the same and resolves to the appropriate data depending on the server on which it’s deployed. Permissions (use, view, modify) for a given named data source can be defined at the per-user or group level.

There’s also support for creating Open Database Connectivity (ODBC) connections without having to be a local server administrator. Additionally, we now support Java Database Connectivity (JDBC), which enables access to a number of JDBC drivers that can connect directly to databases from many vendors.
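The core idea, resolving a logical name to whatever connection the current server defines, can be sketched generically like this. This is just an analogy in Python using pyodbc, not the Pipeline Pilot mechanism; the data-source names, DSNs, and table are made up.

```python
# Generic analogy for named data sources: protocols refer to a logical name,
# and each server maps that name to its own connection details.
# Not Pipeline Pilot code; the names, DSNs, and table below are made up.
import pyodbc

DATA_SOURCES = {
    "dev":  {"CompoundRegistry": "DSN=compounds_test;UID=dev;PWD=dev"},
    "prod": {"CompoundRegistry": "DSN=compounds_prod;UID=app;PWD=secret"},
}

def connect(named_source, environment="dev"):
    """Resolve a logical data-source name for the current environment."""
    return pyodbc.connect(DATA_SOURCES[environment][named_source])

# A "protocol" only ever mentions the logical name:
conn = connect("CompoundRegistry", environment="dev")
row = conn.cursor().execute("SELECT COUNT(*) FROM compounds").fetchone()
print(row[0])
```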

Again, there’s a lot more to cover in future blog posts (e.g. role-based security), but this should give you an idea of the investments we’ve made in Pipeline Pilot 8.0.

Finally, I should mention that during the Accelrys User Group Meeting 2010 in Boston this week, I gave a few talks about cloud computing. There was a lot of excitement from our customers and partners, and we’re eager to share more details soon via our blog and forums. More on that later. If you’re interested in more IT-related topics for PP8 or cloud computing, feel free to shoot me an email.

- Conrad Agramont, Senior Product Manager, cagramont@accelrys.com , Accelrys
892 Views 0 References Permalink Categories: Lab Operations & Workflows, The IT Perspective Tags: protocols, pipeline-pilot, data-management, cloud-computing
Symyx Symposium has ended, but continue to watch this space for reviews, summaries, and interviews. Folks are calling this Symposium the best in Symyx's history. The content was compelling, networking time was abundant, and the venue was simply stunning. As proof, we give you these two videos that we filmed the first day. [Editor's note: In July 2010 Symyx merged with Accelrys, Inc.]

Below, attendees gather for the first general session Wednesday.



And here, Bruce Pharr, vice president of marketing, and Isy Goldwasser, CEO, welcome attendees to the event.

448 Views 0 References Permalink Categories: News from Accelrys Tags: user-group-meeting, symposium, symyx

My guest commentator today is Dr. Wendy Warr, who has been working in the research informatics space since its inception. In 1992, Wendy founded a consultancy offering competitive intelligence studies and reporting to a range of clients in our industry. Find more expert opinion from Wendy at her Web site, The Warr Zone.

 

Three highlights for me from Day One of the Symposium (in reverse order).

 

3) That new TLA. The buzz word is no longer ELN; it’s now the ELE, the electronic lab environment.

 

2) A super talk by Stephan Taylor on the sentient ELN. The final question and answer was, “Suppose you sent your reaction details straight from your ELN to JACS.” What a great idea. Of course, it’s not just technology (or the lack of it) that will prevent this!

 

1) Russ Hillard’s presentation of the All New DiscoveryGate. This merits a message of its very own. Watch this space.

497 Views 3 References Permalink Categories: Electronic Lab Notebook, Scientific Databases Tags: eln, discoverygate, guest-authors, symposium
Today we announced the newest version of Pipeline Pilot from our User Group Meeting in Boston. Andrew LeBeau, Senior Product Manager of Pipeline Pilot, previewed the release in the plenary session with a rockin’ demonstration of Design Mode. This is only one of the cool enhancements that make it easier and more efficient to design, develop, test, and deploy protocols. It also delivers valuable capabilities whether you are an expert Pipeline Pilot user or just getting started. Don’t miss the chance to learn more about Pipeline Pilot 8.0 in our upcoming webinar series. Read the press release here.
465 Views 0 References Permalink Categories: Lab Operations & Workflows Tags: pipeline-pilot

 

Last month’s BioITWorld Expo featured the presentation below by John McCarthy, vice president of product management strategy. John spoke about the latest generation of ELNs, such as Symyx Notebook, and how they serve as a fulcrum supporting the convergence of instruments, software, and workflows. You can view his presentation at Symyx’s new SlideShare page; we’ll be uploading more presentations over the next week or so.

 

[Editor's note: In July 2010 Symyx merged with Accelrys, Inc. Symyx Notebook is now Symyx Notebook by Accelrys.]

 

357 Views 0 References Permalink Categories: Electronic Lab Notebook Tags: eln, presentations

ELN Security (exclusive)

Posted by dcurran May 3, 2010

The March/April issue of Scientific Computing includes an “Industry Insights” Q&A article on security issues for today’s ELNs. For space reasons, the publication was unable to include one of the answers that we provided. Here is the question that was omitted, along with Dominic's response. And once you've seen his reply, we'd love to know what YOU think about the electronic signature options that are available today.

 

SC query: ELN use in a regulated environment requires verifiable electronic signatures. Please describe the best available options and their relative benefits. Which do you recommend and why?

 

Response: We recommend that electronic signatures be generated from a single set of centralized authentication and signature services accessed by all scientists using the ELN. The best systems provide standard signing options that are well accepted in the industry, and they integrate with other signing systems that may be in use at the site. The signature service should offer configurable policies so the site can administer custom workflows and policies associated with a signing event. In regulated environments, it is important to perform signing and validation not just at the document level, but at the operation or field level as well. For example, notebook forms can enforce sequential data entry, pre-defined signing requirements, and deviation flagging during analytical method execution. This provides the rigidity and structure analytical labs require when executing methods for release candidates.
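As a generic illustration of field-level signing (not the Symyx Notebook implementation; the record layout and field names below are assumptions), a signing event might bind the signer, a timestamp, and the meaning of the signature to a hash of the exact field content, so any later edit to the field is detectable:

```python
# Generic illustration of a field-level signing event; not Symyx Notebook code.
# The record layout and verification check below are assumptions for illustration.
import hashlib, json, datetime

def sign_field(notebook_id, field_name, field_value, signer, meaning):
    """Bind signer, time, and meaning to a hash of the exact field content."""
    digest = hashlib.sha256(json.dumps(
        {"notebook": notebook_id, "field": field_name, "value": field_value},
        sort_keys=True).encode()).hexdigest()
    return {
        "notebook": notebook_id,
        "field": field_name,
        "content_sha256": digest,
        "signed_by": signer,
        "meaning": meaning,                      # e.g., "authored", "reviewed"
        "signed_at": datetime.datetime.utcnow().isoformat() + "Z",
    }

def verify_field(signature, field_value):
    """Re-hash the current field value and compare with the signed digest."""
    digest = hashlib.sha256(json.dumps(
        {"notebook": signature["notebook"], "field": signature["field"],
         "value": field_value}, sort_keys=True).encode()).hexdigest()
    return digest == signature["content_sha256"]

sig = sign_field("NB-0042", "hplc_purity", "99.2%", "a.patterson", "reviewed")
print(verify_field(sig, "99.2%"))   # True
print(verify_field(sig, "98.0%"))   # False: field changed after signing
```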

 




369 Views 0 References Permalink Categories: Electronic Lab Notebook Tags: publications, eln