
Accelrys Blog

October 2010
The recently released Accelrys Draw 4.0 addresses a key problem in working with biologics: how to represent them so that they can be stored in a database and later searched by scientists. Check out the video below to see a demo I gave of the new features in Accelrys Draw, which can recognize the structural information underlying UniProt files.

745 Views 0 References Permalink Categories: Bioinformatics, Cheminformatics Tags: biologics, accelrys-draw, chemical-representation, uniprot-sequence

If you aren’t able to join us in Spain for what’s shaping up to be the most heavily attended European User Group Meeting in Accelrys’ history, follow us at #EUGM10 on Twitter. It’s the next best thing to being there, and if you’re the first to correctly tweet the answer to a question about information posted to the feed each day of the conference, you’ll win a Flip HD camcorder.

 

 

The view outside the conference rooms

 

 

The agenda features three days of proven best practices, product demos, expert user presentations, and one-on-one interaction with Accelrys executives, industry keynote speakers, colleagues, Accelrys partners, and product experts. Accelrys executives will kick off the meeting by outlining the company’s new shared vision, integration plans, and product roadmap. Another highlight of the first day is a keynote address by Joseph Cesarone of Abbott titled “Experiences in Collaborative Development and Deployment of a Registration System for Biologics.” Abbott was a key partner in a special interest group (SIG) that assisted in the design and development of the Accelrys Biological Registration system and was the first partner to deploy the system, in September 2009.

 

Three parallel conference tracks will feature 37 customer presentations, and we hope to post interviews with some of the presenters during the meeting and in the weeks immediately following. We’ve designated experts to tweet from the sessions, broadcasting information both to attendees who want to know what’s happening in the other rooms and to anyone unable to attend the meeting. We’re also running a fun competition: we’ll give away a Flip HD camcorder each day. Look for a tweet from @Accelrys asking a question about information given in one of that day’s tweets. If you’re the first to tweet the correct answer, you’ll win the Flip camcorder! You don’t need to be present to win, but you can only win once.

 

We look forward to “seeing” you, either at the conference or online!

502 Views 0 References Permalink Categories: News from Accelrys Tags: user-group-meeting, conferences

Consumer packaged goods (cosmetics, cleaning products and the like) are something we can all relate to. Who doesn’t want cleaner clothes, softer skin, or shinier hair (at least those of you who have hair)? The predilection among consumers for a better look has made this a billion-dollar industry. As I wrote last year (Cosmetics Got Chemistry) following my trip to the Society of Cosmetic Chemists conference, the development of goods such as personal cleaning products has progressed from the barbaric (i.e., fatty acid salts) to the sophisticated (e.g., adding silicones to combine shampoo and conditioner).

 

The new trend is ‘cosmeceuticals,’ the combination of cosmetics with active pharmaceutical ingredients to deliver added benefits such as moisturizing or anti-aging effects. This was discussed in a recent article, Improve Your Product’s Image with Smarter Scientific Informatics, by my colleague Tim Moran.

 

He focuses on the use of image analysis and data integration to streamline the development of cosmeceuticals. Modelers generally consider things like molecular structure, analytical results (IR spectra, etc.), synthesis pathways, and of course product performance. But substantiating that performance is time-consuming and often subjective: does this cream really reduce wrinkles, and does that one genuinely remove age spots?

 

 

The lips on the left belong to an older individual, the ones on the right to a younger one. The ones on the left are clearly more wrinkled, but by how much?

 

 

Image processing adds a new dimension to the types of data and models that R&D teams can work with, making quantifiable analysis possible far more efficiently than before. Consider the two images in this blog. One set of lips clearly has more wrinkles, but how many more? How could you substantiate the difference if the results were less obvious? You could measure wrinkle lines one at a time with a ruler, slowly and tediously, or use image analysis to count the total number of wrinkles in seconds. This gives the R&D team immediate feedback, so they know when they’ve got a better product and by how much. It also provides the marketing (and legal) team with objective data that can be used to substantiate product claims.
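
To make that concrete, here is a rough, hypothetical sketch of one way to count wrinkle-like line features in a photo using the open-source scikit-image package. The filename, the choice of ridge filter, and the thresholds are illustrative assumptions on my part, not the image-informatics pipeline the article describes.

  # Hypothetical sketch: count dark, wrinkle-like line features in a photo
  # with scikit-image. Parameters are illustrative, not tuned or official.
  from skimage import io, color, filters, measure, morphology

  image = io.imread("lips.png")              # hypothetical RGB input file
  gray = color.rgb2gray(image)

  # Emphasize thin, dark, ridge-like structures (wrinkles) with a Sato filter.
  ridges = filters.sato(gray, sigmas=range(1, 4), black_ridges=True)

  # Threshold the ridge response and discard tiny specks.
  mask = ridges > filters.threshold_otsu(ridges)
  mask = morphology.remove_small_objects(mask, min_size=50)

  # Treat each remaining connected component as one wrinkle line.
  labels = measure.label(mask)
  print(f"approximate wrinkle count: {labels.max()}")

Even a crude count like this turns a subjective impression into a number that can be tracked from formulation to formulation.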

 

Measurable, quantitative, verifiable information. That’s at the heart of the scientific method, and now even unstructured data such as images can deliver it.

453 Views 0 References Permalink Categories: Materials Informatics, Trend Watch Tags: publications, image-informatics, consumer-packaged-goods

Mike Weaver, principal with The Weaver Group, a software application and consulting firm specializing in workflow integration in the QA, regulatory, and process analytical areas, concludes his series on digital signatures today. Last week, Mike reviewed the history of digital signature development. Today, he examines the role cryptography has played in the development of modern digital signatures.

 

 

Mike Weaver



Before fully proceeding into the digital world, organizations must consider cryptography. Cryptography, like identity, has long roots. The Greek word kryptos means hidden and refers to concealing information from unintended recipients. Historians such as Herodotus described various techniques for maintaining the confidentiality of sensitive information. A message would be enciphered using a secret key, and the key would be sent to the recipient ahead of time so that when the message arrived, the recipient could use the key to translate it back into plain text.

The problem with this system is that the secret key has to be communicated over a secure channel, and as the Romans and countless others have learned, secure channels, whether physical or virtual, are slow and difficult to maintain. Additionally, if the secret key is intercepted by a malicious recipient, all messages using that key are no longer confidential.
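
As a toy illustration of that shared-secret arrangement, here is a deliberately insecure XOR scheme in Python; it exists only to show that sender and recipient must hold the very same key, which is exactly the distribution problem described above.

  # Toy symmetric cipher (XOR with a shared key) -- insecure, for illustration only.
  from itertools import cycle

  def xor_cipher(data: bytes, key: bytes) -> bytes:
      # XOR-ing with the same key both enciphers and deciphers the message.
      return bytes(b ^ k for b, k in zip(data, cycle(key)))

  shared_key = b"sent-ahead-of-time"          # must reach the recipient securely
  ciphertext = xor_cipher(b"attack at dawn", shared_key)
  assert xor_cipher(ciphertext, shared_key) == b"attack at dawn"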

Modern cryptography uses asymmetric (two distinct key) systems. In 1976, Whitfield Diffie and Martin Hellman illustrated an asymmetric system in which the encryption and decryption keys are different: mathematically related, yet computationally infeasible to derive from one another. The two keys are the public key (encryption) and the private key (decryption). Researchers Rivest, Shamir and Adleman built upon this work and invented the RSA algorithm to perform asymmetric cryptography. Here, a public key is posted on a server, and anyone who wishes to communicate with the holder of the private key encrypts the message with the holder’s public key. Only the holder can decrypt the message, so the information does not need to be sent over a secure channel.
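
A minimal sketch of that public-key exchange, assuming the third-party Python cryptography package; the 2048-bit key size and OAEP padding are illustrative choices, not anything prescribed in the post.

  # Public-key (asymmetric) encryption sketch using the "cryptography" package.
  from cryptography.hazmat.primitives.asymmetric import rsa, padding
  from cryptography.hazmat.primitives import hashes

  # The recipient generates a key pair and publishes only the public key.
  private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
  public_key = private_key.public_key()

  oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                      algorithm=hashes.SHA256(), label=None)

  # Anyone can encrypt with the public key; no secret channel is required.
  ciphertext = public_key.encrypt(b"confidential result", oaep)

  # Only the holder of the private key can decrypt.
  assert private_key.decrypt(ciphertext, oaep) == b"confidential result"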

This kind of cryptography is the basis for the modern advanced electronic signature, or digital signature. Modern digital signatures work on a summary of the message to be signed in order to save on computational cost. The signer runs the original data through a cryptographic hash function to produce a summary (digest) and signs that summary with their private key. Anyone wishing to verify the signature computes the same hash of the received message and uses the signer’s public key to check the signature against that summary (a minimal code sketch follows the list below). If the two summary values match, then the following is true:


  1. The message or data was indeed signed by the signer (authenticated)

  2. The message or data was not modified between the signing event and verification; otherwise the summary values would not match
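
Here is a minimal hash-then-sign sketch, again assuming the Python cryptography package; RSA with PSS padding and SHA-256 are illustrative choices on my part. Signing uses the private key, verification uses the signer’s public key, and any change to the message makes the check fail, mirroring the two points above.

  # Hash-then-sign and verify, using the "cryptography" package (illustrative).
  from cryptography.hazmat.primitives.asymmetric import rsa, padding
  from cryptography.hazmat.primitives import hashes
  from cryptography.exceptions import InvalidSignature

  message = b"Batch 42 passed QC"
  signer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

  pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH)

  # 1. The signer hashes the message and signs the digest with the private key.
  signature = signer_key.sign(message, pss, hashes.SHA256())

  # 2. A verifier re-hashes the message and checks the signature with the
  #    signer's public key.
  try:
      signer_key.public_key().verify(signature, message, pss, hashes.SHA256())
      print("authentic and unmodified")
  except InvalidSignature:
      print("signature check failed")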

A new era of integration and innovation is focused on the front-end GUI that implements the signature event within the modern digital laboratory. For example, many labs have safety-gear requirements that make fingerprint or retina scanners unusable, or test volumes so high that RSA PIN tokens or traditional password entry are a poor option. To remedy this, some software vendors offer additional options that protect against fraud effectively and efficiently and exceed compliance standards in ways that even a wet signature on paper cannot. For instance, pattern, gesture, or smudge recognition allows fast and rugged identity verification on a small, mobile touch-screen device in the lab, even with gloved hands.

 

The industry is definitely helping organizations move away from paper with confidence!

484 Views 0 References Permalink Categories: Electronic Lab Notebook, The IT Perspective Tags: eln, guest-authors, validated-workflows, process-chemistry, regulated-workflows, data-security, digital-signatures, quality-assurance, cryptography
Chemistry, materials, polymers, formulations, low-throughput experiments, high-throughput experiments and virtual information now form the basis of an ever-growing data pyramid that leads to a new product. While much of this may be part of the overall corporate holistic information network usually housed within or presented by the PLM system, increasing product complexity and performance are generating data sets that go deeper into the chemistry of the system. In the semiconductor field, where Moore's law (the doubling of chip density, and with it performance, roughly every two years) has held sway for decades, “More Moore” or “Beyond Moore” is being used to describe greater levels of complexity. Like Manhattan skyscrapers, the vertical packing of chip layers is achieving the density required by new devices such as the 2010 iPod nano and iPad. These chips now include such complex layers and networks that their synthesis and production, quality control, manufacturing and processing involve a major tour de force of formulations and mixture management!

The company now seeking to manufacture a product—say, the chips in the new tablet computer I have—must manage multiple generations of designs, each involving significant formulation and material complexity, and manage that information and associated data (business, analytical and testing) through the whole product ideation, design development and sun-setting life cycle.

Good luck.

The accelerating rate of product change and the increasing need for very granular details related to product ingredients and performance require both wider connectivity and deeper, more integrated informatics across the R&D, PLM and corporate decision-making landscape. With its scientifically aware informatics and data pipelining platform, Accelrys is uniquely able to provide the critical layer in the design-through-manufacturing process. The platform’s ability to automate and integrate a diverse array of scientific business processes and data enables organizations to leverage chemical, analytical and virtual simulation data within the corporate PLM suite. This is a major step forward. Ten years ago nobody would have considered virtual design to be part of business process management. Today, companies such as Boeing, General Motors, Ford, Honda, Apple and Procter & Gamble wouldn’t think twice about incorporating their finite element designs into their PLM layer. Hence, the next iteration of innovation will add not only data on the shape and distribution of plastics in the package or airplane wing, but also information on the elastomeric chemical nature of the plastics or coatings and the formulation of the ingredients in the bottle or on the wing surface.

So, it looks like chemical information is here to stay as part of what is fed into the PLM data management layer, alongside virtual chemical or material performance evaluations and virtual product designs.  No more strange bedfellows.
408 Views 0 References Permalink Categories: Trend Watch, Lab Operations & Workflows, Executive Insights Tags: product-lifecycle-management
Numerous government initiatives have been launched in the last five years to address the growing need to accelerate new drug discovery and approval as well as improve the efficiency of analytical processes. George Miller’s recent Fierce BioTech IT article regarding Tox21 was spot on with his assessment of the benefits of in silico tools. We at Accelrys were proud to provide comment on this article.

Specifically, Tox21 is a federal collaboration of the Environmental Protection Agency (EPA), National Institute of Environmental Health Sciences (NIEHS), National Institutes of Health (NIH), and Food and Drug Administration (FDA) chartered to find ways to predict how chemicals will affect human biological response; to research, develop, and validate chemical assays; and to compile and report the data generated through ongoing research.

The use of in silico tools can greatly enhance the efficiency of analytical processes as well as accelerate the decision process required in high content analysis and prioritization of target substances and pathways.

Using Tox21 as an example, we see three closely related areas where in silico tools can address the goals of this collaboration.

         
  1. Through modeling and simulation, researchers can identify chemically induced biological activity and mechanisms as well as develop models that can effectively predict how chemicals will affect biological responses

  2. Through informatics and analytical applications, researchers can prioritize which chemicals need more extensive toxicological evaluation, conduct the informatics analyses needed for innovative testing and development, and rapidly screen large quantities of high-throughput assay data from any number of other applications

  3. Through complex and disparate data management, researchers can integrate the data from modeling, simulation, informatics, and analytical analysis, as well as reference databases that can handle the 10,000 new chemicals under study at the NIH Chemical Genomics Center

The data management and integration part is especially important because of the magnitude of data generated and the need to be able to assimilate the data for rapid decision support.

The good news is that all these tools are commercially available and ready for deployment now, and the various agencies would be well served to adopt commercial off-the-shelf solutions rather than develop their own. From that perspective, industry involvement will be critical to the success of Tox21 and other similar federal initiatives.
388 Views 0 References Permalink Categories: Lab Operations & Workflows Tags: toxicology, data-analysis, data-management, analytical-workflows, tox21

QbD FUD: YMMV IMHO

Posted by Dominic.John Oct 18, 2010
Don’t you love acronyms? I came across this great article that helps to alleviate the FUD (fear, uncertainty, and doubt) around the FDA’s QbD (quality by design) initiative. Reading this, I wondered whether the FDA should have guidelines for acronyms too.

Jokes aside, the article dwells nicely on common misconceptions in the industry and succinctly positions QbD, with the cliché quote from the FDA being "quality cannot be tested into products; it should be built-in or should be by design."

I have spent several months now looking at the QbD initiative and mapping how the electronic laboratory environment can better support scientists seeking to model and understand the design space around process and product, with the goal of better product quality that meets CQAs (critical quality attributes… yet another acronym to bend your mind around). This article hits home on why QbD is important beyond compliance:

Because development usually takes place over a long period of time and often involves many people, it can sometimes result in misunderstandings, errors, and redundant activity. QbD keeps an understanding of design space in the forefront of development efforts, providing the coherence and continuity that trial-and-error approaches lack. QbD can also help maintain "lifetime development"--the continued gathering of data after a product is already in production. Because only a limited amount of data accumulates during initial development, additional data, framed by the product's design space, can be helpful in deepening process understanding and continuously improving the process.

There is a huge urgency in development to get product quickly to market. After all, the ROI is huge for every day saved. A quick Google search for the top 20 blockbusters and some math will show you that it costs a company about $16.5 million for each day a blockbuster is held back from the market (roughly $6 billion in annual sales spread over 365 days). Bottom line: if you spend a little more time upfront understanding the product and process, you get faster development at lower cost with higher quality, something everyone is in favor of in any industry. This is truly a case of “less haste and more speed,” and we now have guidelines to help us focus.

QbD is really different. Scientists sometimes look at the FDA guidelines as a necessary evil. But QbD is a 180-degree change in approach to drug development and manufacturing, one that not only reduces the compliance burden but makes business sense irrespective of compliance needs. The author rightly points out that pharma professionals fear change and risk. But our industry is in need of change. “How does one become a butterfly?” she asked pensively. “You must want to fly so much that you are willing to give up being a caterpillar.” Similarly, when you want to get drugs to market faster, at lower cost, and with higher quality, you’d better be prepared to metamorphose.
690 Views 0 References Permalink Categories: Electronic Lab Notebook, Trend Watch Tags: roi, process-chemistry, quality-by-design, drug-manufacturing, life-science-manufacturing

I’m pleased to once again welcome Mike Weaver, principal with The Weaver Group, a software application and consulting firm specializing in workflow integration in the QA, regulatory, and process analytical areas. Mike offers two posts discussing the role of digital signatures in the electronic lab. Today’s post provides a history lesson on the ways cultures have secured and authenticated contracts and how this has influenced the development of digital signatures. Next week, Mike will examine the role cryptography has played in the development of modern digital signatures.

 

 

Mike Weaver

 

 

Fraud and identity theft have corporations and individuals alike concerned with electronic data security and authentication. Although this may appear to be a new trend, information integrity concerns have been around for a long, long time.

 

Texts from the fourth century depict merchants of Old Babylonia displaying commerce trust through a handshake. In fact, the Babylonians are largely credited with developing the handshake custom. As time passed, rituals like handshakes were replaced with symbols. These symbols can be found as early as 335 BCE on the Dead Sea Scroll Book of Isaiah, where rabbis signified that they had read the text by marking it with their personal symbol combination. But symbols can be easily imitated, and this led to the development of handwritten signatures, which today still bind many legal and regulatory contracts.

 

Handwritten signatures have become cumbersome in an increasingly electronic workplace, leading many organizations to adopt a hybrid model in which electronic records are digitally signed, stored electronically, printed, and then wet-signed. This perpetuates a business model concerned with filing, archiving, and retrieving paper documentation. Clearly, paper paradigms are not the best way for companies to stay competitive in a global, digital business environment. And not surprisingly, many organizations are interested in moving from this hybrid model to a fully electronic signature process that can push documents through the signature workflow electronically rather than manually, streamlining communication among members of increasingly globalized project teams.

 

The good news is that legal and regulatory bodies have issued guidance in this area and are increasingly accepting of fully electronic signature practices. A digital signature is really an advanced electronic signature. Electronic signatures are perhaps best defined by the compliance organizations as any combination of text, audio, and other information represented in digital form that is attached to a record with the intent to sign that record. A digital signature adds an extra layer of confidence that the signed record has not been altered and cannot be repudiated by the signer. European Directive 1999/93/EC, as well as the Food and Drug Administration (FDA) under 21 CFR Part 11, states that an electronic signature must provide irrefutable authentication of the owner of the signature and detection of any alteration to the content of the document linked to that signature. It should be noted that the FDA has loosened its stance on electronic signatures, while the German Digital Signature Law (SigG) is even more prescriptive, describing the actual methodology that should be adhered to when implementing electronic signatures. The global marketplace raises many interesting questions where compliance is concerned.

485 Views 0 References Permalink Categories: Electronic Lab Notebook, The IT Perspective Tags: eln, guest-authors, validated-workflows, process-chemistry, regulated-workflows, data-security, digital-signatures, quality-assurance
Spurred by economic realities and the desire of today’s organizations to accelerate business growth, innovation and research activities are fast spreading globally outside the traditional "first world." Consider the rise of Asian pharmaceutical contract research organizations (CROs), the development of regional innovation centers in the consumer packaged goods industries, or even the globally distributed development and manufacturing of commercial aircraft parts. Businesses seek to leverage these "extended enterprises" across geographically distributed sites, on a continuous basis, to achieve 24/7 innovation. This is partly due to the relentless pace of modern global markets, which demand faster, better and more profitable products the world over. The recent launch of the 4th-generation iPhone™ in Thailand, at a price cheaper than the previous 3rd-generation product and within a couple of months of the US launch, is an excellent example. Yet R&D processes are complex, expensive and people-centered. Globalizing these activities can bring significant cultural and management challenges.

What's the connection between PLM and MSI for organizations with increasingly global operations? Well, simply this: when someone is managing, say, the toothpaste you used this morning, or the antacid pill you took last night, the product as stocked on the shelf consists of a packaged or assembled good, within which is a formulated product (or products), which may contain a hierarchy of chemistries, and alongside which there is associated packaging, labeling and claim-support data. Before you say, “yes, but that's just like traditional PLM concepts,” consider how small and subtle changes in ingredient specification can cause significant changes in overall product performance and require more analytical information for labeling, claim support and environmental approval (REACH, etc.)… all of which is needed on a per-region basis, while allowing (as much as possible under these circumstances) for “standardized” SOPs for development and global supply sourcing. That’s a heck of a lot for PLM alone to handle, isn’t it?

Next week:  The ever-deepening levels of product complexity
347 Views 0 References Permalink Categories: Trend Watch, Lab Operations & Workflows, Executive Insights Tags: contract-research, product-lifecycle-management, global-research

It was nice to see someone trying to freshen up the oft-used race metaphor: You know, the idea that companies bringing new products to market are racing against their competitors? A recent article in C&EN described process chemists as the middle-leg runners in a relay race.

 

There’s no roar from the crowd for them as they set off from the starting line, and they don’t get the glory of crossing the finish. But it’s those middle legs where races are won and lost, and if the process chemists can’t keep pace, there’s no way a drug will make it to the finish line.

 

The article goes on to note that process chemists do more than simply run the race. They ensure the track is safe for the anchor runner, make certain that all team members are able to safely enjoy the race, minimize the team’s environmental impact, and keep costs as low as possible.

 

When I showed this article to one of my colleagues, she said it reminded her of a song by Cake from the 1990s:

 

 

“No trophy, no flowers, no flash bulbs, no wine.” You might call process chemists the unsung heroes of drug development.

 

The C&EN article provides some fine examples of what process chemists do. Their job is often to take something that’s not much more than a “proof-of-concept” synthesis from discovery and scale it up to meet a company’s product development goals. Process chemists must consider a range of factors including reaction yields, starting material cost, reaction conditions and equipment needs, and the “greenness” of the reaction components and methods. And they must consider these factors while working quickly and efficiently because every day a blockbuster drug is delayed from the market means fewer people helped and less revenue generated for a company.

 

So it doesn’t do for a process chemist to be caught, as Cake sings, “racing and pacing and plotting the course.” The faster process chemists can receive information on a synthesis from discovery, the faster they can start their leg. And they can proceed more efficiently if they have ways to learn what has worked before, rapidly assess greenness, source materials for synthesis, or model potential synthetic outcomes prior to running a synthesis. But gathering this information can create additional hurdles for process chemists, and the last thing a relay racer wants to do is run a steeplechase!

 

 

All this talk of sport made me wonder how the electronic lab might compare to game systems like the Wii. Tennis, boxing, bowling, and even running are going virtual and electronic, with their own tournaments and matches. Years from now, what will it mean to “run” a “race”? And might the modern ELN bring a similar definitional change to how organizations “race” to market?

397 Views 0 References Permalink Categories: Electronic Lab Notebook, Lab Operations & Workflows Tags: eln, process-chemistry


I recently attended this year’s ILMAC and MipTec events during Basel Life Sciences Week. It was a very busy four days with a large number of suppliers, industry customers, and academic researchers attending—over 17,000! Switzerland was again a wonderful host with great weather, wine, and food.

The focus of the meetings on both research and development phases of the life science market generated discussion around both the innovative and inventive parts of research as well as the more pragmatic applied technology necessary to deliver results. It also created diverse discussions around how biology, chemistry, chemicals, engineering, and technology issues are connected in today’s life science industries.

One item of particular interest on the pragmatic side was the announcement by Oracle of a product lifecycle management (PLM) system dedicated to life science. Visibility into the progress of a therapeutic against planned development cycles can help researchers on both the R and the D sides of the process, particularly when it comes to making resource decisions. We’re challenged every day to justify and explain the time (and money) needed to allow researchers to focus on various aspects of development. Yes, by connecting a PLM system with the lab ELN we create the opportunity to manage more projects more efficiently (yawn). But we also create the opportunity to directly feed laboratory data into Quality-by-Design programs. This direct connection enables researchers to have the most current information (biology, chemistry, safety, etc…) to inform experimental design decisions.

My colleague Michael Doyle has launched a series on PLM and the challenges associated with uniting PLM systems with modeling, simulation, and informatics technologies. To enable PLM systems to be more deeply deployed throughout development, software vendors in the early development, optimization, and validation space will need to adopt common standards for recipes, instruments, methods, and integrations. While S88 and S95 deliver primary standards to support tech transfer from early dev into manufacturing, we’re missing a standard to support method transfer. In the drive for QbD, the standards for analytical methods will have to be resolved through collaborative partnerships between vendors and industry. What standards do you see as important? And are you involved in any partnerships to set about creating them?
449 Views 0 References Permalink Categories: Trend Watch, Lab Operations & Workflows, Electronic Lab Notebook Tags: conferences, eln, standards, product-lifecycle-management, quality-by-design

What do energy materials and pharmaceuticals research have in common? They both stand to benefit from the major improvements in quantum tools now available in Materials Studio 5.5.

 

Supporting the energy and functional materials field is a new method in Materials Studio’s DMol3 that’s all about light: excited states, UV/Vis spectra, and non-linear optical properties. These properties are essential in the design of new energy-efficient lighting as well as alternative energy generation, such as photovoltaics. DMol3 optical properties are based on time-dependent density functional theory (TD-DFT). Some of you may know, or perhaps vaguely remember, that the Schroedinger equation is time dependent, so that is essentially the equation solved here, rather than the usual static approximation. As you can imagine, this can be costly, but I am happy to say that the implementation is actually quite fast (keeping computing costs down) and accurate (making experimentation faster and more efficient).
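
For reference, these are the standard textbook forms (in LaTeX notation, and not specific to the DMol3 implementation): the time-dependent equation that TD-DFT-style approaches effectively address, versus the usual static eigenvalue problem.

  i\hbar \frac{\partial}{\partial t}\Psi(\mathbf{r},t) = \hat{H}\,\Psi(\mathbf{r},t)
  \qquad\text{versus}\qquad
  \hat{H}\,\psi(\mathbf{r}) = E\,\psi(\mathbf{r})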

 

There are many more applications of DMol3 optical properties, for example materials for holographic discs, which have simply amazing storage capacity. Another application area I find intriguing is in the life sciences, for example helping us understand vision better, so I’ll be interested to see where DMol3 optical properties are used next.

 

Speaking of life sciences, another new quantum feature I am really excited about could have big implications for the development of drug molecule candidates into dosage forms. The crystallization and polymorphism of drug molecules need to be well understood, and modelling has been used for a while to support that work. Polymorph prediction remains tricky, however, for a number of reasons. Sampling all the possible structures is a tall order, particularly if the molecules are flexible, as many new drug molecules are. Also, it may not be sufficient to consider only the thermodynamics of the structures, as crystallization kinetics can play a major role.

 

However, these considerations are in a way academic if one cannot even optimize structures and rank them by energy with high accuracy. One of the key reasons why this has been so difficult is that both classical and quantum methods have significant shortcomings in this type of application. The force fields used in classical methods either neglect or at best approximate electronic effects such as polarization, which are often important in drug crystals. Density functional quantum methods, on the other hand, handle those effects well but are less effective at capturing longer-range dispersion interactions. Now, in Materials Studio 5.5, DMol3 and CASTEP have dispersion correction terms to overcome this issue. Pioneering work in the literature has demonstrated the power of combining density functional and dispersion methods, and I look forward to many great applications of this technology.
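
Schematically, and following the Grimme-type correction schemes discussed in the literature (in LaTeX notation; the exact functional form and parameters used in Materials Studio 5.5 may differ), the dispersion-corrected energy is

  E_{\mathrm{DFT\text{-}D}} = E_{\mathrm{DFT}} + E_{\mathrm{disp}}, \qquad
  E_{\mathrm{disp}} = -\,s_{6} \sum_{i<j} f_{\mathrm{damp}}(r_{ij})\, \frac{C_{6,ij}}{r_{ij}^{6}}

where the pairwise C6 coefficients set the strength of the attraction and the damping function switches the correction off at short range, where the density functional already performs well.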

 

 

Fermi surface of yttrium, a rare-earth material used in lasers, displays, etc.

 

 

There is much more in the release (see the above picture for example) and you can hear all about it at our upcoming webinars, or see it live at the European User Group Meeting later this month.

708 Views 0 References Permalink Categories: Modeling & Simulation Tags: materials, quantum-mechanics, materials-studio, alternative-energy, quantum-chemistry, spectroscopy, ab-initio-modeling, polymorphs, optics
Nearly 15 years ago, analysts began touting the many benefits of PLM software to materials and manufacturing companies. While the road to adoption has included twists and turns, PLM in its various guises (whether SAP, Siemens UGS Teamcenter, or Dassault Systèmes ENOVIA and SIMULIA) is finally becoming a staple of modern manufacturing. These systems facilitate planning, scheduling, and management decision-making, and the reason they have taken root is very simple: they help optimize business performance and improve efficiency and profitability.

The original drivers for PLM adoption—rapid acceleration of business cycles and globalization that requires unprecedented levels of connectivity—show no sign of abatement. What’s more, these imperatives are now driving increased demands for “complete” product information within the wider organization. This means bringing research and development (R&D) and early production processes like formulation design and management, safety testing, ingredient sourcing and quality control into the mix.

This is the first in a series of blogs exploring both the challenges and the solutions involved in more closely uniting PLM systems with the modeling, simulation and informatics (MSI) technologies that generate and manage critical data related to product chemistry, safety and stability, materials science and more. The goal is the integration of information management and collaborative processes spanning the complete product lifecycle, from an idea in a researcher’s head to a new house paint, skin crème or airplane part. Are PLM and MSI strange bedfellows? They shouldn’t be…

Next week: the rise of the global research organization
377 Views 0 References Permalink Categories: Trend Watch, Modeling & Simulation, Lab Operations & Workflows, Executive Insights Tags: product-lifecycle-management

What a pleasure to hear the news this morning that the 2010 Nobel Prize in physics has been awarded to Andre Geim and Konstantin Novoselov. My colleague Johan Carlsson referred to these researchers in his informative post yesterday on graphene. The Nobel press release highlights the extraordinarily simple technique that Geim and Novoselov used to extract graphene from ordinary graphite:

 

Using regular adhesive tape they managed to obtain a flake of carbon with a thickness of just one atom. This at a time when many believed it was impossible for such thin crystalline materials to be stable.

 

All of us in materials science at Accelrys congratulate Geim and Novoselov on their achievement. Would it be pompous to refer to our webinar Thursday on graphene as the “Nobel Webinar”? Probably. Nevertheless, we hope you’ll attend to see the role modelling and simulation have played in helping us understand this extraordinary material.

545 Views 0 References Permalink Categories: Materials Informatics Tags: graphene, materials, nanotechnology, nobel-prize
At Accelrys, we believe it’s important to hear from our customers and gain insight on how you use our solutions so that we can improve our products and services. So if you use Pipeline Pilot, either as a user of developed protocols or a developer of said protocols, we invite you to take a short survey on your experiences:

http://www.zoomerang.com/Survey/WEB22B8JY2KDY9

What’s in it for you? A chance to win a free iPad in exchange for about 15 minutes of your time. We’ll select the lucky winner from the survey respondents.

What’s in it for us? Insights into how you use Pipeline Pilot and how the software benefits you and your organization. All responses will be held confidential, though we do plan to aggregate the responses to help customers compare their usage with that of their peers.

The survey will be active until October 15, 2010. I hope you’ll check it out!
566 Views 0 References Permalink Categories: Lab Operations & Workflows Tags: pipeline-pilot, roi

 

 

The C&EN webinar in July on Millennium’s ELN implementation generated 145 questions. So I asked Gabriel Weatherhead, who delivered the webinar, if he’d mind answering some of the most-asked questions about Millennium’s experience here. If you attended the webinar (and even if you didn’t), Gabe is open to your follow up questions. We’ll both be monitoring the comments, so feel free to post one if you’d like more clarification on any of the answers. And if you missed the presentation, a recording of the webinar is available.

 


 

Could you provide more details on your evaluation process? How did you communicate your requirements to vendors?

 

We used a formal evaluation process. We communicated a detailed RFP and subsequently correlated responses against a detailed requirements list. Requirements were prioritized according to criticality and use case. Vendor selection then proceeded in two stages. First, vendor demonstrations were assessed by the lab staff. Second, on-site installations of candidates were performed followed by customizations and configurations by Millennium informatics staff. Subsequently, testing was performed by Millennium scientific staff. The final on-site evaluation was considered a critical part of the evaluation process.

 

Were the IP folks supportive from the beginning or did they have reservations? Our IP attorneys are quite skeptical.

 

To our surprise, our IP attorneys were early advocates of the project and worked collaboratively with us to identify and implement a system that would satisfy legal requirements as well as the needs of the scientists. In fact, we have found that the ELN has improved the overall quality and completeness of our records. It’s a win/win situation for the entire business.

 

Who configures and customizes your ELN: the vendor, consultants, or Millennium staff?

 

For the system deployed in chemistry, all application features were customized by the vendor.  Integration points were engineered and configured by Millennium staff. It was a close collaboration to extend the feature set of the product. For the version implemented in biology, 95% of the configuration and template engineering was done by the Millennium informatics group.

 

How are you monitoring use, compliance, signing, and witnessing and how did you arrive at this workflow?

 

Compliance and use analysis is a vital feature of Millennium’s eNotebook platform. We have several custom tools that were developed in-house for monitoring the eNotebook environment. This workflow was developed with the input from legal, QA, and discovery groups.

502 Views 0 References Permalink Categories: Electronic Lab Notebook Tags: biology, eln, symyx-notebook-by-accelrys, implementing-software

 

Dr Johan Carlsson, Accelrys Contract Research Scientist


 


The final presentation in our recent contract research webinar series features work on graphene, which has been described recently as the “ultimate” material for next-generation nanoelectronics development. I asked Johan Carlsson from the Accelrys contract research team to provide some background on this nanomaterial. Attend the webinar to see how simulation methods have helped and are helping unravel some of graphene’s secrets.

 

Graphene is one of the most interesting nano-materials at the moment [1,2]. With all the hype, it might seem as though graphene is a completely new material discovered just recently. And in a sense this is correct.

 

When I started to work on carbon materials some eight years ago, graphene was just an imaginary model system. Back then the excitement about the fullerenes was being replaced by what I call the “nanohype.” All the fashionable forms of carbon suddenly needed to have names including the buzzword “nano.” So we had nanohorns, nanofoams, nanoporous carbon, and, of course, nanotubes.

 

These fashion changes indicate the cyclic nature of carbon science. During the last 25 years, a new form of carbon has been discovered every five to six years. The fullerenes were discovered in 1985 [3], the nanotubes in 1991 [4], and, in 1998, graphynes were predicted [5]. That same year, thin graphite layers were grown on top of SiC wafers—what might today be referred to as the first few layers of graphene [6]. However, it took another cycle of carbon science before the graphene hype really took off, when researchers at the University of Manchester managed to extract individual graphene sheets from a graphite crystal in 2004 [7]. It’s actually about time for another carbon allotrope to emerge on the scene. Graphanes (graphene sheets fully saturated by hydrogens) have potential, but they are not strictly pure carbon structures [8]. We’ll have to see if some other new carbon material will emerge soon.

 

Yet while humans have only recently discovered the potential of graphene, graphene sheets have always been available in nature as the two-dimensional layers that are the building blocks of graphite. They’ve just not been accessible, as individual graphene sheets have been very difficult to isolate. The barrier to graphene synthesis was perhaps more mental than technical, as the method that finally succeeded was incredibly simple. The researchers in Manchester had the ingenious idea to use adhesive scotch tape to lift graphene sheets off a graphite crystal! [7] Of course, this method doesn’t scale industrially, and since this breakthrough a number of alternative methods have quickly been developed.

 

Graphene is now a mature material. Perhaps the maturation of graphene has represented the next step in the cyclic carbon evolution? Anyway, the progress in this field has been rich because graphene science straddles two different communities: the nanotube community and the low dimensional semiconductor community. Graphene has been proposed for a variety of applications in diverse fields, including field effect transistors with graphene channels [9], gas sensors [10], and solar cells [11]. It’s even attracted interest in life science as an active membrane that can separate different molecules out of a solution [12].

 

Interestingly, progress in graphene science is to a large extent driven by theoretical predictions and results from simulations. Graphene has been the model system of choice for theoretical investigations of sp2-bonded carbon materials, and as such it has been the starting point for studies of graphite, fullerenes and nanotubes. Many properties of graphene were therefore known from a theoretical point of view before it was possible to perform actual measurements. The electronic structure of graphene, for instance, has been known from theoretical calculations performed more than 60 years ago [13]; only recently has it become possible to actually measure the band structure of individual graphene sheets. Similarly, a substantial amount of our knowledge about the structure and properties of point defects and edges in graphene was first obtained by simulations and later confirmed by experiments. This shows that simulations and experimentation go hand in hand, and theoretical methods will continue to play a major role in the further development of graphene and other materials.
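
For the curious, the nearest-neighbour tight-binding dispersion that goes back to Wallace [13] reads, in one common convention (in LaTeX notation, with a the carbon-carbon distance and t the hopping energy of roughly 2.8 eV):

  E_{\pm}(\mathbf{k}) = \pm t\,\sqrt{3 + f(\mathbf{k})}, \qquad
  f(\mathbf{k}) = 2\cos\!\left(\sqrt{3}\,k_y a\right) + 4\cos\!\left(\tfrac{\sqrt{3}}{2}\,k_y a\right)\cos\!\left(\tfrac{3}{2}\,k_x a\right)

Near the K points of the Brillouin zone this reduces to the famous linear “Dirac cone” form, E ≈ ±ħ v_F |q|, with a Fermi velocity on the order of 10^6 m/s.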

Literature


[1] A. K. Geim and K.S. Novoselov, Nature Materials 6, 183 (2007).
[2] A. K. Geim, Science 324, 1530 (2009).
[3] H. W. Kroto, J. R. Heath, S. C. O'Brien, R. F. Curl & R. E. Smalley, Nature 318, 162 (1985).
[4] S. Iijima, Nature 354, 56 (1991).
[5] N. Narita, S. Nagai, S. Suzuki, and K. Nakao,  Phys. Rev. B 58, 11009 (1998).
[6] I. Forbeaux, J.-M. Themlin, and J.-M. Debever, Phys. Rev. B 58, 16396 (1998).
[7] K.S. Novoselov, A. K. Geim, S. V. Morozov, D. Jiang, Y. Zhang, S. V. Dubonos, I. V. Grigorieva, and A. A. Firsov, Science 306, 666 (2004).
[8] J. O. Sofo, A. S. Chaudhari, and G. D. Barber, Phys Rev. B 75, 153401 (2007).
[9] Yu-Ming Lin, Keith A. Jenkins, Alberto Valdes-Garcia, Joshua P. Small, Damon B. Farmer and Phaedon Avouris, Nano Lett. 9, 422 (2009).
[10] F. Schedin, A. K. Geim, S. V. Morozov, E. W. Hill, P. Blake, M. I. Katsnelson, and K. S. Novoselov,  Nature Materials 6, 652 (2007).
[11] X. Wang, L. Zhi, and K. Mullen, Nano Lett. 8, 323 (2008).
[12] S. Garaj, W. Hubbard, A. Reina, J. Kong, D. Branton, and J. A.  Golovchenko, Nature 467, 190 (2010).
[13] P. R. Wallace, Phys. Rev. 71, 622 (1947).

835 Views 1 References Permalink Categories: Materials Informatics, Modeling & Simulation, Trend Watch Tags: graphene, materials, nanotechnology, alternative-energy, guest-authors, semiconductor