
One of the great challenges facing society today is the need to diversify our energy supply and to adapt our current socio-economic infrastructure to support that goal.

 

One area where major inroads are being made is the innovation, design, development and manufacturing of solar cell systems. Here, all of the multi-scale levels of Accelrys technology can be utilized. At the microstructural level, simulation and modeling can be used to predict the structural, elastic, electronic, optical and thermodynamic properties of polycrystalline aggregates. Much has also been done on the chemical or light-driven catalysis of systems that can form part of a combined-cycle energy generation scheme. For example, to improve the photocatalytic activity of NaNbO3 for water splitting, the band gap and band edges of NaNbO3 can be tailored in silico to match the visible part of the solar spectrum and the hydrogen and oxygen redox potentials.
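
For a flavor of what such an in silico band-gap study looks like, here is a minimal sketch using the open-source ASE and GPAW packages as stand-ins for Accelrys' own simulation codes. It builds an idealized cubic NaNbO3 perovskite cell (the lattice parameter is an assumed value) and reports its Kohn-Sham band gap:

```python
# Minimal sketch: estimate the band gap of cubic NaNbO3 with ASE + GPAW
# (open-source stand-ins here, not the Accelrys toolchain itself).
from ase.spacegroup import crystal
from ase.dft.bandgap import bandgap
from gpaw import GPAW, PW

# Ideal cubic perovskite (Pm-3m); a ~ 3.95 Angstrom is an assumed value.
nanbo3 = crystal(['Na', 'Nb', 'O'],
                 basis=[(0, 0, 0), (0.5, 0.5, 0.5), (0.5, 0.5, 0)],
                 spacegroup=221,
                 cellpar=[3.95, 3.95, 3.95, 90, 90, 90])

calc = GPAW(mode=PW(400), xc='PBE', kpts=(4, 4, 4), txt='nanbo3.log')
nanbo3.calc = calc
nanbo3.get_potential_energy()  # run the self-consistent DFT calculation

gap, p1, p2 = bandgap(calc)  # note: PBE systematically underestimates gaps
print(f"PBE band gap: {gap:.2f} eV")
```

From a starting point like this, one can substitute dopants or strain the cell and recompute, which is the essence of tailoring the band edges toward the visible spectrum and the water-splitting redox potentials.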

 

Once these devices have been invented, the fabrication and design phase begins. Here, as with so many of the technological devices we use and take for granted, there is huge complexity in production, manufacturing and fabrication. For example, in the design of solar collectors it has been found that cells bearing tips on their surface, i.e., minuscule cone structures, can neutralize manufacturing defects, boosting the efficiency of the overall system by up to 80 percent. This is very significant since, as Tiffany Stecker pointed out in Scientific American, more than 50 percent of the charges generated by sunlight in conventional solar panels are lost to manufacturing defects.

 

The Accelrys Experiment Knowledge Base (EKB) solution, which drives laboratory efficiency, supports the systematization of laboratory processes and tracks complex materials with their associated information and data, is directly applicable to the development and scale-up of manufacturing processes for semiconductor and photovoltaic materials, systems and devices. Many of these next-generation systems require multi-layer materials with complex interactions and "recipes" for their performance. As each layer is added, there is complexity within the layer, in the way the addition is made, in the interaction between the layers, and in the cleaning and re-preparation for the next layer. As the number of such "stacks" increases, along with their dimensional complexity (see the cones above), the combinations of possible errors and variations that can directly affect performance multiply. In the development and scale-up stages of device production, much optimization is performed to eliminate batch-to-batch manufacturing variation and to allow these systems to be produced at low cost, at global sites and in high volume.
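
To make the layer-and-stack bookkeeping concrete, here is a deliberately simplified sketch in plain Python (not the actual EKB data model) of the kind of structure needed to record each layer, its process steps and its in-line measurements:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessStep:
    """One step applied to a layer: deposition, anneal, clean, etc."""
    name: str
    parameters: dict          # e.g. {"temperature_C": 450, "time_min": 30}
    measurements: dict = field(default_factory=dict)  # in-line QC data

@dataclass
class Layer:
    material: str             # e.g. "ITO" or "CdTe" (illustrative names)
    steps: List[ProcessStep] = field(default_factory=list)

@dataclass
class DeviceStack:
    """A full device build: an ordered list of layers with their history."""
    device_id: str
    layers: List[Layer] = field(default_factory=list)

    def history(self):
        """Flatten the build into an auditable sequence of (layer, step) events."""
        return [(l.material, s.name, s.parameters)
                for l in self.layers for s in l.steps]
```

Even this toy model makes the combinatorics visible: every added layer multiplies the steps, parameters and measurements that have to be captured and compared batch to batch.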

 

The EKB solution from Accelrys is a cornerstone of the total electronic laboratory management capability that advanced companies are implementing across discovery, scale-up and manufacturing in this and other areas. It enables them not only to drive high-throughput efficiencies, but also to be first to market, to identify wider process windows and to provide compelling data and information to influence customer decisions. In essence, tracking materials through myriad inter-related production steps, each with some characterization, as the device is fabricated and then produced, can be done at higher speed, with fewer errors and at greater efficiency.

 

It is interesting to note that in the semiconductor field, the idea of accelerating the discovery of new chemistries and system configurations, alongside reductions in defects, time, materials and tooling, i.e., reducing major costs, is considered a compelling reason for developing such systems and using them in day-to-day laboratory activity.

 

So, as the phrase goes, "The future's so bright, I'm going to have to wear shades."

Categories: News from Accelrys, Modeling & Simulation, Scientific Databases, Lab Operations & Workflows, Data Mining & Knowledge Discovery Tags: ekb, experiment-knowledge-base, manufacturing-scale-up

Catalysis is the chemical means by which complex reactions that are otherwise difficult and costly in energy terms are carried out. Catalysts allow the modification of basic materials or feedstocks so they can be made into complex materials or products. Then, of course, as in the case of a paint, coating, cosmetic or pharmaceutical, there is a whole series of formulation and modification steps as well. However, the starting gate for this process is the design and development of the catalyst for the manufacture of the material.

 

Catalysts can also allow energy-producing reactions to occur under reasonable and economic conditions, possibly holding the key to our energy future and the sustainable use of materials. A recent article on the visible-light-driven catalytic properties and first-principles study of fluorine-doped TiO2 nanotubes indicates how the chemistry of these complex systems, and their morphology or structure, can affect their performance as promising photocatalysts for hydrogen production through water splitting under visible-light irradiation, without other modifications.

 

Nanotubes are a novel and complex class of catalysts, and their production, modification and fabrication go through many steps. As these novel systems progress from test-bed ideas into development and production, the needs of quality assurance, and of defining and monitoring raw materials, processing steps and treatment steps, come to the fore. It is very easy for one slight variation or mis-step in a complex process to affect a whole lot or batch in a production run. Further, in many catalyst systems, just like a cake, materials are incorporated in layers, each having a specific function and passing the products of its reaction on to the subsequent or lower layers. Again, given the complexity of the interacting chemistry and species, the ability to track these materials and their samples from inception through processing, complex modification or treatment, and engineering steps is essential.
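
As a toy illustration of that traceability, the sketch below (plain Python with invented step names, not any Accelrys schema) records each sample's parent and the step that produced it, so any batch can be traced back to inception:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Sample:
    """A toy provenance record: each sample remembers the step that made it
    and the sample it came from, so any batch can be traced to inception."""
    sample_id: str
    step: str                     # e.g. "anodization", "F-doping"
    parent: Optional["Sample"] = None
    qc: dict = field(default_factory=dict)

    def lineage(self) -> List[str]:
        node, chain = self, []
        while node is not None:
            chain.append(f"{node.sample_id} <- {node.step}")
            node = node.parent
        return list(reversed(chain))

# Usage: a hypothetical nanotube batch traced through treatment steps
raw = Sample("TiO2-001", "raw feedstock")
tubes = Sample("TiO2-001-NT", "anodization", parent=raw)
doped = Sample("TiO2-001-NT-F", "F-doping", parent=tubes, qc={"F_at_pct": 2.1})
print(doped.lineage())
```

When a batch fails downstream, walking this lineage is exactly how one pinpoints which variation or mis-step was responsible.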

 

To drive better results with this delicate and pervasive process, a system is essential that enables the planning of experimental programs, i.e., ad hoc or designed experiments, for best material or process coverage and subsequent data mining, as well as the sequential tracking of the step-by-step stages of experiments, gathering the data produced from instrument files, spreadsheets and databases to eliminate the manual piecing together of data. Ideally, the system also provides the scientist and catalyst developer with analysis tools, reporting or data-insight tools, and data mining or information- and knowledge-generation tools, such as neural network or recursive-partitioning analysis of the data.
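
For a flavor of the recursive-partitioning analysis mentioned above, here is a minimal sketch with scikit-learn; the experiment table and its column names are purely illustrative:

```python
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

# Hypothetical experiment table assembled from instrument files and
# spreadsheets; file and column names are invented for illustration.
df = pd.read_csv("catalyst_runs.csv")
X = df[["calcination_T_C", "dopant_pct", "surface_area_m2g"]]
y = df["h2_yield_mmol_h"]

# Recursive partitioning: a depth-limited regression tree keeps the
# splits interpretable for the catalyst developer.
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))
```

The printed tree reads as a set of if-then rules (e.g., "if calcination temperature above X and dopant above Y, yield is high"), which is why recursive partitioning is popular for catalyst data: it suggests the next experiments to run.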

 

It is important to note that such catalyst information, if systematically captured, stored and then mined, can provide key indicators of future directions for development, as well as point to areas of the current formulation and materials space that have not been covered adequately for intellectual property protection. Such information would also provide a clear link to process, scale-up and historian-type data, so that scientists and chemical engineers can identify good laboratory systems earlier, ones that stand a high probability of working in the pilot plant in a stable, multi-site or multi-feed environment.
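
As a crude sketch of that coverage analysis, assuming the same hypothetical experiment table as above, one can bin the explored materials space and flag the untouched regions:

```python
import numpy as np
import pandas as pd

# Hypothetical table of past catalyst experiments; names are invented.
df = pd.read_csv("catalyst_history.csv")

# Bin the explored (dopant %, calcination T) space; unvisited bins are
# candidate gaps both in exploration and in IP protection.
counts, x_edges, y_edges = np.histogram2d(
    df["dopant_pct"], df["calcination_T_C"], bins=[8, 8])

for i, j in zip(*np.where(counts == 0)):
    print(f"unexplored: dopant {x_edges[i]:.1f}-{x_edges[i+1]:.1f} %, "
          f"T {y_edges[j]:.1f}-{y_edges[j+1]:.1f} C")
```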

Categories: Executive Insights, Lab Operations & Workflows, Data Mining & Knowledge Discovery, Modeling & Simulation, Electronic Lab Notebook Tags: catalysis, green-chemistry, formulations

The 7th Annual PEGS summit is fast approaching (Sheraton Boston Hotel, Boston, MA, 9-13 May) and the track topics look really exciting.  The Analytical and Antibodies tracks in particular caught my eye.  Topics in these sessions include protein stability and aggregation, immunogenicity and antibody-drug conjugates.

 

Back in January, Francisco Hernandez-Guzman wrote a blog on some of the fundamental challenges around antibody design.  That discussion focused on the issues with using traditional homology modeling processes to accurately represent the inherently low-conservation complementarity-determining regions (CDRs) of antibodies.  These are the key loops involved in antigen recognition, and representing them accurately is critical to understanding, and indeed optimizing, the antibody-antigen recognition event.  In that blog, Francisco proposed that this problem can be resolved in part by combining template-based and de novo methods to increase the quality and precision of these models.  Following on from this, at 2:35 on Wednesday 11th May, Francisco will present at PEGS an overview of our macromolecular modeling tools in Discovery Studio for the study and design of biologics.

 

Looking at the rest of the agenda for PEGS 2011, it is clear that there are many more challenges to the successful development of antibodies. 

 

Following on from the modeling considerations associated with working from full-length, framework or chimeric templates, there's the optimization of the antibody itself.  This can range from protein-protein docking simulations to rationalize and tailor antibody-antigen affinity, through to modulation of the antibody's pharmacokinetic profile.  As a compound progresses into development, additional challenges around stability need to be considered: for example, mutational, thermal and pH-based stability.  Fortunately, from an in silico perspective, a useful level of prediction accuracy is possible for mutational and thermal stability, and this has already been incorporated into the latest release of Discovery Studio (see, for example, the computational stability prediction webinar given by Dr Anne Goupil, Accelrys).

 

Another major stability challenge affecting the development of antibodies is their propensity to aggregate at high concentrations in solution.  Many antibodies are stored, and indeed used therapeutically, at high concentrations.  Aggregation is a highly undesirable degradation process that has to be mitigated during formulation and development, so identifying low-propensity clinical candidates at an early stage in the development of a new agent is of obvious therapeutic and commercial value.  Again, in silico methods are actively being developed to improve our understanding of the factors affecting aggregation.  Indeed, at the recent ACS meeting in Anaheim, Bernard Helk and co-workers presented an update on predicting protein aggregation in monoclonal antibodies (Helk et al., 241st ACS National Meeting, Anaheim).
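
For a flavor of such in silico approaches, the sketch below uses a sliding-window Kyte-Doolittle hydrophobicity scan with Biopython as a crude stand-in for a real aggregation predictor; exposed hydrophobic patches are one of the factors that correlate with aggregation propensity:

```python
from Bio.SeqUtils.ProtParam import ProteinAnalysis
from Bio.SeqUtils import ProtParamData

# Toy fragment of a heavy-chain framework sequence; a real analysis would
# use the full antibody and a purpose-built aggregation predictor.
seq = "EVQLVESGGGLVQPGGSLRLSCAASGFTFSSYAMS"

pa = ProteinAnalysis(seq)
# Kyte-Doolittle hydrophobicity averaged over a 9-residue sliding window
profile = pa.protein_scale(ProtParamData.kd, window=9)

# Flag windows above an arbitrary, illustrative hydrophobicity threshold
for i, score in enumerate(profile):
    if score > 1.5:
        print(f"residues {i+1}-{i+9}: mean KD hydrophobicity {score:.2f}")
```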

 

Immunogenicity is another critical consideration that has to be managed.  Because an unintended triggering of the human immune response often has severe consequences, regulatory agencies are actively developing risk-based guidelines for screening potential immunogenic responses.  As a result, attempts to develop in silico methods for scoring the immunogenicity risk of a protein are progressing.  Factors that affect immunogenic risk include phylogenetic distance, molecule size, and the density and relative clustering of T-cell epitopes.  Non-human proteins in particular can prove surprisingly immunogenic, so when developing therapies using non-human antibodies we face additional design considerations around humanization (for a good example, see the webinar presented by Leonardo Boras, ESBATech).

 

It should also be pointed out that the ability of a biotherapeutic drug to provoke an immune response in a patient can actually be a highly desirable feature.  For example, it is possible to exploit the human immune reaction for therapeutic use, such as in cancer therapies.  However, this is not the only way to target cancerous cells using antibodies.  Antibody-drug conjugates (ADCs) are an exciting area of therapeutic research that is seeing a resurgence of interest (see, for example, Ducry and Stump, and Dimond).  These are typically antibodies carrying a therapeutic payload, e.g., a cytotoxic small molecule.  The concept is well suited to targeted therapies: the antibody finds and selectively binds to the cancer cell, is then internalized, and its payload is released to do the therapeutic job.  Because of this ability to selectively target diseased cells, there is a good prospect of reducing common side-effects and thus achieving improved safety.  It is perhaps no surprise that the majority of development programs for monoclonal antibodies are focused on cancer.

 

I for one am really looking forward to PEGS 2011.  Hope to see you there (we're at booth #508)!

 

Adrian.

Categories: Bioinformatics, Modeling & Simulation Tags: discovery_studio, protein_modeling, antibody, antibodies, protein-protein_docking, immunogenicity, humanization, antibody-drug_conjugates, aggregation, biotherapeutics, macromolecule

In a few days, we'll be on the ground at the Accelrys User Group Meeting (UGM) in New Jersey. A new addition to this year's UGM is a complete track focused on IT. I'll be delivering two sessions, a condensed and combined version of what was delivered at the Accelrys Tech Summit (ATS) in London.

 

For those that are attending or thinking of attending, here’s a quick preview of what I plan to cover and why:

 

A Lap Around Pipeline Pilot 8.5 for IT Professionals

 

There’s lots of great stuff that we’ve been working on within the next release of Pipeline Pilot, but there’s often many capabilities that IT professionals are looking for that have been in the product for many releases. During this session, I’ll go over various tools that are within the product that every IT person deploying or operating Pipeline Pilot should know.

 

For those of you thinking, "I've been doing Pipeline Pilot for many moons; what can I learn from this session?", there are some new things within Pipeline Pilot 8.5 that will be discussed. Regardless of your experience with Pipeline Pilot, I'm sure you'll walk away having learned something new that you'll put into practice.

 

Pipeline Pilot Application Lifecycle Management

 

Scientists who develop tools and solutions within Pipeline Pilot are quite familiar with how easy it is to get started building something valuable. But there's a big step from building something useful for themselves or a small group of users to taking it to an enterprise-wide deployment.

 

So what does the creator of a Pipeline Pilot-based innovation need to add to their project to make it an enterprise success? What does an IT professional need to do to provide an enterprise infrastructure that delivers those scientific innovations? Answering those questions, and more, is the focus of this session. Do you have more questions you'd like answered? Ask us in the Accelrys IT-Dev Community!

 

Pipeline Pilot Architecture Deep Dive

 

While this isn’t my session, I would like to advertise it a bit. Jason Benedict is our Senior Architect for Pipeline Pilot and he’ll be delivering this session. He did this same session at the ATS in April, and it was mind blowing. I get the luxury of sitting a few doors down from him and I get nuggets of great information about how Pipeline Pilot works deep within the product and why we did things a certain way. Many times this lands on a whiteboard, but often doesn’t get out into the wild. Well now is your chance to get this information first hand! He covers topics such as how Pipeline Pilot takes your request to perform a job, and what the system does to honor that job. This spans from authentication and security to memory management and data processing.

 

If you aren’t registered yet, you should be! Lots of great speakers and content, so what are you waiting for? Oh, the link to register? No problem, we got you covered: http://www.regonline.co.uk/Register/Checkin.aspx?EventID=927242

 

Finally, stay connected to what's going on with the event via Twitter with our dedicated hashtag: #acclugm

 

- Conrad Agramont, Sr. Product Manager, Enterprise Technologies, Accelrys

Categories: The IT Perspective Tags: pipeline-pilot, user-group-meeting