
Accelrys Blog

27 Posts authored by: mdoyle

As a chemist, I always found organometallic chemistry one of my favorite areas of science because it combines the synthetic elegance of traditional organic chemistry with the rational ordering and geometric beauty of inorganic systems. In a way, it's the best of both worlds. Plus, there is a lot of unknown and uncharted chemical space in this branch of chemistry - from novel stereoselective catalysts to metal-organic medical and chemotherapeutic reagents.

 

Many of these types of complexes feature coordination bonds between a metal and organic ligands. The organic ligands often bind the metal through a heteroatom such as oxygen, nitrogen or sulphur, in which case such compounds are considered coordination compounds. Where the ligand forms a direct metal-carbon bond, the system is usually termed organometallic. These compounds are not new - indeed, many metal coordination compounds occur naturally. For example, hemoglobin and myoglobin contain an iron center coordinated to the nitrogen atoms of a porphyrin ring, magnesium sits at the center of chlorophyll, and in the bioinorganic area, vitamin B12 (as methylcobalamin) has a cobalt-methyl bond. The metal-carbon bond in organometallic compounds is generally intermediate in character between ionic and covalent.

 

One of the reasons this area of chemistry has been studied in such detail is that organometallic compounds are very important in industry: they are relatively stable in solution, yet the polar metal-carbon bond leaves them reactive enough to undergo useful transformations. Two obvious examples are organolithium and Grignard reagents.

 

Organometallics also find practical uses in stoichiometric and catalytic processes, especially processes involving carbon monoxide and alkene-derived polymers. Essentially all of the world's polyethylene and polypropylene is produced via organometallic catalysts, usually heterogeneously via Ziegler-Natta catalysis.

 

Some classes of semiconductor materials are produced from trimethylgallium, trimethylindium, trimethylaluminum and related nitrogen / phosphorus / arsenic / antimony compounds. These volatile compounds are decomposed along with ammonia, arsine, phosphine and related hydrides on a heated substrate via metal-organic vapor phase epitaxy (MOVPE), atomic layer deposition (ALD) or chemical vapor deposition (CVD) processes, and are used to produce materials for applications such as light-emitting diode (LED) fabrication.

 

So why did I choose my title about these materials? Well, recently scientists in the UK made a dramatic breakthrough in the fight against global warming, and it involves sea urchins. A chance find by Newcastle University physicist Dr. Lidija Siller as she studied the sea creature is being hailed as a potential answer to one of the most important environmental issues of modern times. She found that sea urchins use nickel particles to harness carbon dioxide from the sea to grow their exoskeletons. That is, they naturally complex CO2 out of their environment and, with natural enzymes, convert it into harmless calcium carbonate. This was not historically deemed practical or possible, so the breakthrough has the potential to revolutionize the way CO2 is captured and stored, while at the same time producing a useful material that is strong, light and water resistant.
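
As a rough back-of-the-envelope illustration of the mineralization chemistry involved (a minimal Python sketch, assuming complete conversion of CO2 to calcium carbonate, which a real process would not achieve):

    # Back-of-the-envelope: mass of calcium carbonate formed per tonne of CO2
    # mineralized, assuming complete conversion via CO2 + Ca(2+) -> CaCO3 (illustrative only).
    M_CO2 = 44.01     # g/mol, molar mass of carbon dioxide
    M_CACO3 = 100.09  # g/mol, molar mass of calcium carbonate

    def caco3_from_co2(tonnes_co2: float) -> float:
        """Tonnes of CaCO3 formed if every mole of CO2 ends up as carbonate."""
        return tonnes_co2 * (M_CACO3 / M_CO2)

    print(f"1 tonne of CO2 -> {caco3_from_co2(1.0):.2f} tonnes of CaCO3")  # ~2.27 tonnes

Every tonne of CO2 locked up this way would yield well over two tonnes of solid carbonate, which is why the useful-by-product angle matters.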

 

In other words, this process, or the ability to mimic it in industrial environments, not only reduces carbon output but also produces a useful by-product. This discovery could significantly reduce emissions of CO2, the key greenhouse gas responsible for climate change.

 

Where Accelrys technology plays a key part is in understanding the organometallic structure - its geometry, energy barriers, electronic properties, and fluxional or dynamic behavior. Accelrys technology also allows the screening of options, choices and chemical modifications that can be made to the framework - changes that could enhance, stabilize or industrialize the natural system behind this recent discovery.

 

In the future, the application of virtual and predictive chemistry will have an elegant symmetry: research and development of the catalysts that produce plastics and materials will be complemented by research and development in the remediation and end-of-life handling of those materials. Although plastics contribute to our carbon footprint, the catalysts that create them can and should also be used to reduce or sequester the by-products of those processes and materials. This will clearly increase the sustainability of the complete process.

1,240 Views 0 References Permalink Categories: Materials Informatics, Modeling & Simulation, Cheminformatics Tags: chemistry, materials, doyle, catalyst, global_warming

As I hope most people know, there is no point in storing data unless you can either search it or use it for model building and analytics. That is why at Accelrys we have worked very hard to develop a data-source-agnostic platform that can reach into multiple and diverse data sources, as well as provide extensive intrinsic and extrinsic analytic and reporting capabilities.

 

One of the pivotal decisions we made was both to develop our own algorithms and to integrate with the R statistics package in an extensible way. This package has seen incredible growth over the last four years, so it is very heartening to read Robert A. Muenchen's post about the rate and technical richness of the science and mathematics that has been incorporated recently into the R suite.

 

Simply put, as of March 2013, R's 2012 growth in capability exceeds SAS' all-time total. I know that the two packages are not directly comparable, so this comparison is perhaps a little aggressive. But the fact remains that the growth of R (see below) has been so dramatic that it now clearly provides a technical richness that other leading statistical packages are starting to envy. I believe that combining this technical diversity, innovation and richness with the data handling, parsing and reporting capabilities in the Platform is a significant leap in statistics- and analysis-led experimentation for science-driven companies.
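
As a minimal sketch of what such R integration looks like in practice - using the open-source rpy2 bridge as a stand-in (the platform's own R components are not shown here) and with invented data - a workflow component can hand a data set to R and get a fitted model back:

    # Minimal sketch of driving R from a host language, here via the open-source
    # rpy2 bridge (a stand-in for the platform's own R integration; data invented).
    import rpy2.robjects as ro

    ro.globalenv["x"] = ro.FloatVector([1.0, 2.0, 3.0, 4.0, 5.0])
    ro.globalenv["y"] = ro.FloatVector([2.1, 3.9, 6.2, 8.1, 9.8])

    fit = ro.r("lm(y ~ x)")          # fit a simple linear model in R
    print(ro.r["summary"](fit))      # coefficients and fit statistics, computed by R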

 

 

[Chart from r4stats.com: growth of R's analytic capability over time, compared with SAS]

For full details see:

http://r4stats.com/2013/03/19/r-2012-growth-exceeds-sas-all-time-total/

1,765 Views 0 References Permalink Categories: Data Mining & Knowledge Discovery, Trend Watch Tags: statistics, rstats, r_statistics, rstatistics

Last week the world changed. Well, at least the United States-centric part of the world changed, and that's quite a fiscally significant part. What the Leahy-Smith America Invents Act changed was the premise behind US patenting regulations and rules. Sure, this change has been in the works since the Act was signed back in September 2011, but now the first-to-file provisions are in force, so it's a real situation and not just an academic exercise. Simply put, the USA moved from a constitutionally enshrined first-to-invent basis - who invented a product first - to the simpler first-to-file approach that is much more closely aligned with the rest of the world: http://en.wikipedia.org/wiki/First_to_file_and_first_to_invent.

 

What this means is that all of the focus on dates and proving that one individual had an idea first - and the consequent need to record, date-stamp and time-stamp ownership and access rights - becomes less critical. What replaces it is, to me, the really interesting thing. If you think about first to file, i.e. the approach that whoever gets to the patent office first with their paperwork wins, then speed, agility and information performance become significant factors. This is a big change and is clearly having an effect on both the corporate psyche and organizations' behavior and plans. Historically, a patent granted in the U.S. would have gone to the first to invent the claimed subject matter. Where separate inventors were working on similar inventions at the same time, the "first-to-invent" system could sometimes result in interference proceedings at the USPTO to determine who was the first to come up with the claimed invention.

 

Under the new rules, with the USA as a "first-to-file" country, the novelty provision of the Patent Act changes to create an absolute bar to patentability if the claimed invention of a patent application was "patented, described in a printed publication, or in public use, on sale, or otherwise available to the public" anywhere in the world before the patent application's effective filing date. This is a major transformation in at least two respects:

 

  • First, with respect to activities outside the U.S., the law before the change recognized only foreign patents and printed publications as bars to patentability. Other activities, such as public use, sale or other "public availability" outside of the U.S., were not considered. After the Act, that no longer holds true.
  • Second, because the law before the change permitted applicants in some situations to prove that they came up with their inventions before the effective date of a cited patent or printed publication, there was recourse if another entity independently developed the same invention after you but beat you to the USPTO with a patent application. As of March 16, 2013, what matters is when you filed your patent application, not when you came up with your invention.

 

 

So this drives the need for enhanced and significantly upgraded competitive intelligence analysis, and also for faster document production (see http://www.innography.com/blog/?p=606). The former is driven by the change from a test based on patents and publications to the broader test of "public availability" of information. There is now a clear need to be aware of what other companies are working on, inventing and producing - and how, why and where - since public availability could invalidate a corporate patent that has a significant amount of research behind it. Secondly, there is the document production arms race. Why such a term? Well, just think about it: say a corporation had been working for a number of years on a breakthrough technology. They had spent significant money on the work, only to see a competitor scoop them for the patent - and the whole opportunity and area - because the competitor is more agile and has better informatics processes.

 

The main and usual response to this is to file a series of defensive or encircling patents around the original domain of intellectual property. This then becomes a document production exercise. The corporation will seek to retrieve all data, information and, for example, spectra pertinent and relevant to the filing, generate the legal document as quickly as possible, refine and collaborate on it, and then move the whole package through its legal process to the patent office.

 

 

To be clear, most companies outside the pharmaceutical space do not have a sufficient innovation and scientific information management process, or the digital backbone, to do this today. Hence the problem. These companies have invested much money, driven by business need, in their development, production and supply chain processes. However, they have not invested in the same processes and tools - i.e. informatics capabilities - for research, because research data types are highly convoluted and complex. This is why the concept of scientific information lifecycle management (SILM) has arisen and is growing. Solutions that manage the scientific lifecycle - a data-source-agnostic platform to connect disparate workflows and tools, an easy-to-use scientific electronic laboratory journal and notebook, and specialised process-centric laboratory execution and management systems - are needed even more in light of the recent Act.

 

 

So again, the world has changed, but in change lie the seeds of new and exciting growth and technical challenges. What are your thoughts on the opportunities and challenges ahead?

 

438 Views 0 References Permalink Categories: Trend Watch, Electronic Lab Notebook, Cheminformatics, Materials Informatics, Bioinformatics, Data Mining & Knowledge Discovery, The IT Perspective, News from Accelrys, Lab Operations & Workflows, Executive Insights Tags: notebook, innovation, eln, patent, ip, first_to_file

This is one of those topics which you have to be careful and judicious about. Clearly there is a lot of feeling, emotion and concern about the current spate of issues in the European meat supply chain, and these are both significant and concerning issues. However, my interest in this area started a number of years ago. In 2006, over 200 people in the United States became seriously ill from infection with E. coli O157:H7. Because of the complex and interconnected nature of the food supply and distribution chain, this led to cases across 26 states and approximately 75 million dollars of economic losses. Post-event analysis showed the contaminated produce to have been concentrated in 42,000 bags of Dole Baby Spinach processed during a single shift in one plant, but at the time the recall was much bigger because no one was sure how many products or processors were involved. Within seven days of the initial recall, five more companies had recalled produce as well.

 

As soon as the outbreak was confirmed, investigators began hunting for production records that could lead back to the point of contamination, something that had never been accomplished in 19 previous E. coli leafy-green investigations since 1995. This was the point where I was surprised. I understood that a leafy green can be diced, mixed or dumped into open shop trays and thereby cross-contaminate, but the fact that we had never previously been able to track food infections accurately to the source was quite surprising. Since that sad event, a number of produce manufacturing companies have improved their ability to track a bag of produce back to where it was grown (i.e. they can quickly track a contaminated bag back to within 30 feet of where the product was grown in the field). This is excellent, but it is very costly, and as we are now seeing in Europe, it does not address issues of mixed produce that require genomic testing to identify their differences or origins. So this is the premise of my blog.
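
To make the trace-back idea concrete, here is an illustrative sketch (barcodes, lot names and coordinates are hypothetical) of the kind of lot-level linkage that lets a recalled bag be traced to a processing lot and, from there, to the block of field where it was harvested:

    # Illustrative lot-level trace-back: retail bag -> processing lot -> field block.
    # All identifiers and coordinates below are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Lot:
        lot_id: str
        plant: str
        shift: str
        field_gps: tuple  # (latitude, longitude) of the harvested block

    lots = {"L-0815-B": Lot("L-0815-B", "Plant 3", "Shift B", (36.677, -121.655))}
    bags = {"BAG-0001937": "L-0815-B"}  # bag barcode -> processing lot

    def trace_back(bag_barcode: str) -> Lot:
        """Return the lot record (and hence field location) for a recalled bag."""
        return lots[bags[bag_barcode]]

    print(trace_back("BAG-0001937"))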

 

We're living in a world where we can track every component in an airplane from its manufacture through its service history, and every chip made in a high-volume foundry or fabrication plant (fab), and we are close to being able to track pharmaceutical materials from formulation to point of care. Why, then, can we not track food production - and the associated steps and tests - accurately enough to avert these sorts of problems?

 

Well, there appear to be two issues. First, there is the economic cost, along with the expert practitioner skills and fragile equipment needed to perform certain tests. Most agricultural workers are not trained to perform the tests that analytical and biochemical scientists do. In addition, the equipment needed is fragile, not robust, not well suited to episodic use, and quite expensive to buy, install and maintain. Metrology (the management of calibration standards and functions) is also very time consuming for a point or spot test. Lastly, a farmer does not really want to build the complex laboratory infrastructure required to maintain a series of these tests, then capture, store and manage the associated data.

 

So what can informatics do to help? Well, from the land where IKEA has recalled its famous meatballs, Accelrys has a division that has developed a hosted electronic laboratory notebook. This system is designed precisely for people who don't want that sort of infrastructure: it is low cost, low footprint, easy to use and easy to scale, and it serves as a system of record for information that needs to be stored for legal reasons.

 

Also, we have been fortunate to work with a company in our ecosystem of partners called Oxford Nanopore Technologies. What ONT does is really exciting science - they use nanopores to sequence genomic information, one base at a time. Then, using the Accelrys Enterprise Platform (AEP), they build up the data into a complete genome map. What's especially interesting about the ONT technology is that, much as with Moore's Law, they have successively miniaturized it, and they have also simplified the user interface. The miniaturization is amazing, allowing the technology to run on a device as small as a USB memory stick. But to me, the software is the key enabler, since it can now "dial home" to validate calibration from field use (a house, jungle, desert, the site of a disaster, etc.), and it can run a whole cascade of expert-level tests automatically based on the initial readings and measurements. Workers can define easy-to-use templates and procedures in a cloud-based electronic notebook (or journal), then link those to accurate material, formulation, location and genomic data - and perhaps even other laboratory-on-a-chip factors such as toxicological or attribute assays - which could help with this complex problem of food supply security.
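
As a toy illustration of the genomic-identification side (nothing to do with ONT's actual software, and with invented sequences), even a crude shared k-mer count can flag which reference a read from a mixed sample most resembles:

    # Toy origin/species identification: count k-mers shared between a sequence
    # read and short reference snippets. Real pipelines use proper aligners and
    # full genomes; the sequences here are invented for illustration.
    def kmers(seq: str, k: int = 5) -> set:
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    references = {
        "reference_A": "ATGGCATTAGCCATGCGTACGTTAGC",
        "reference_B": "ATGGCGTTAACCTTGCGAACGTAAGC",
    }
    read = "GCATTAGCCATGCGTACG"  # a short read from the mixed sample

    scores = {name: len(kmers(read) & kmers(ref)) for name, ref in references.items()}
    print(scores, "-> best match:", max(scores, key=scores.get))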

 

So in the same way that millions of cell phone calls are mostly billed to the right callers worldwide every hour, I believe that appropriate use of technology can help us track, capture, quality-assure and manage the information associated with our food supply and nutrition.

2,981 Views 0 References Permalink Categories: Bioinformatics, Cheminformatics, Trend Watch, Scientific Databases, Electronic Lab Notebook, Data Mining & Knowledge Discovery, Lab Operations & Workflows Tags: ngs, eln, food, informatics, contur, ont

"Necessity is the father of invention," so the saying goes. While I don't personally hold with broad sweeping generalizations, it has to be said that many of the materials innovations that drive and enable our modern world have been driven by demand, need or circumstance. Examples of this range from the glass that protects and allows our phones to have dazzling displays to the batteries that power our mobile computers, pacemakers and surgical implants that enable people to have a good quality of life. Also there are non-stick cooking pans, energy efficient compact fluorescent and led lights, artificial sweeteners, advanced anti-irritant or skin-whitening cosmetics and creams, and new medical treatments.

 

Sometimes I think we take for granted all the advanced science and materials that enable us, for example, to cross almost half the world in relative comfort and safety; to land in a cold location with appropriate clothing to keep us comfortable and safe; to get there in a reasonable amount of time and at a reasonable price; to have a car available that starts and keeps running; and to be able to call home to our families.

 

I was musing on this as I drank my cup of coffee and considered instant coffee. I am, and I admit it, a coffee snob. I like my coffee, I like it a certain way, and so for many years while travelling I have kept a constant vigil and hunt for coffee that is to my liking. With the spread of coffee chains globally this is an easier problem, yet I still really like taking my coffee with me. This can be done with certain brands, and I remembered how instant coffee as we know it was developed for Nestle by Dr. Max Morgenthaler and launched in 1938, but only became popularised by American GIs in the Second World War, where weight and space were at a premium. The same issues of space and weight apply to so many things in the modern world, and as I planned my trip to winter-bound Europe, I wondered about these innovations. Of course, the fact that I can now have decaffeinated coffee with caramel, in hot and cold varieties and in a myriad range of flavors and specialty choices, is all due to technical and formulation advances and the advent of new packages, processes and capabilities.

 

For my journey to Europe, I first needed a warm coat. Historically, when people explored cold places they used large bulky coats and heavy woolen gloves. Now, with the advent of high-performance fabrics such as Gore-Tex, which keep the body dry and sweat-free yet breathe without letting any snow or rain in, I can have a lightweight jacket that is wearable, in the color I want, and that lasts well when out hiking or mountain biking. So in went my hiking jacket - a smart choice, because the UK has recently had one of its wettest periods of weather ever. Next, it was which bag to choose to pack my gear in. Suitcases and rolling bags have changed out of all recognition in the last decade thanks to polymers, plastics and composites. They have become lighter and more resistant to abrasion and damage; they have become more colorful and easier to open thanks to advanced plastic zippers and expandable sections. In addition, the new complex materials allow the wheels to roll more smoothly and, to be honest, not break with the frequency that the older ones did. Again, the materials technology that resists such impacts with flexible deformation and includes a smoothly lubricated bearing is really quite remarkable.

 

The next stage in my thoughts was about which route to take for my journey. This, as anyone who travels in winter knows, is not a simple choice. It involves juggling which airports get snowed in or have bad weather, which (if any) can cope, what de-icing equipment and provisioning each has, and what the cost might be. The issue of de-icing - basically removing the ice and coating the airplane with a complex formulation or mixture to prevent ice crystals from forming - is itself very complex. It is necessary because the buildup of ice crystals can rob a plane of its lift and control, which is not a desired state. The coating, however, has a whole series of design constraints. For example, it must be deployed easily and at low temperatures, and it must not damage the plane (for modern composite aircraft such as the Dreamliner this is harder than it would appear). It must be non-toxic, not affect passengers with allergies, be environmentally benign and be low cost. These are not easily balanced requirements for a formulated product that has to remain stable for extended periods of time and, like many advanced chemical products, be produced and deployed in many different locations with high frequency.

 

Of course, I also wondered about the composite materials used in the airplane. Those made by advanced manufacturing and materials companies such as Boeing have many different and diverse requirements. They need to withstand the intense cold at altitude and function for extended periods of time - airplanes spend more time flying than they do on the ground. They need to survive lightning strikes, rain, hail, de-icing fluids and many different sources of external impact. In addition, their whole raison d'etre - lighter weight and better fuel-per-passenger performance - needs to be delivered. The fuel for airplanes (Jet A-1) needs to be of high purity and consistent quality, and must not be affected by the temperature swings between 35,000 feet and the ground. This involves very careful chemical process design and administration. Companies such as BP Aviation and other aircraft fuel providers spend a lot of time managing changing feed and input streams to yield consistent products, as well as tracking lot-to-lot quality, and they must meet stringent product safety and testing requirements.

 

Once I got to my destination, I expected to find a rental car that started and ran, and that would transport me to my hotel of choice, where I would be warm and safe in my well-lit and heated room. Those simple, common travel steps, I realised, are all triumphs of materials science, innovation and design. The fact that my car has a fuel that can flow and burn consistently at low temperatures is amazing. The petrochemical companies actually adjust the ratios of different chemistries in winter and summer fuels to aid burning and volatility at low temperatures. In some cases, such as diesel fuel, which can gel or almost freeze at low temperatures, they add specific pour-point depressants and viscosity-improving additives. The same of course goes for engine lubricating oil, which thanks to modern synthetic materials can span such a wide range of viscosities that people no longer need to switch from a winter to a summer oil and back again.

 

The battery of my rental car converts electrochemical potential into the current that drives the starter motor. It must function at high load when the whole car is probably at its coldest, so again it's a complex piece of chemical, materials and electrical engineering. In hybrid or electric vehicles, this is even more critical. Finally, I considered the tires, which hopefully would keep me on the road. These materials, which actually have a very small area in contact with the ground - whether asphalt, slush, snow or gravel - need to function in all seasons and across a whole range of temperatures, irrespective of load and speed. Tires, which are a significant contributor to a vehicle's overall energy use, have to be engineered for the right performance characteristics while maintaining safety and keeping cost, noise and wear within managed bounds.

 

At the hotel, I hoped to find a warm, well-lit room in the evenings with plenty of hot water. This requires the ability to insulate the building and manage its heat loss, and to distribute electricity in very cold or warm temperatures for cooling or lighting. The ability to distribute natural gas is a triumph of modern science and materials development - natural gas is hard to find and harder to produce. Another critical development for a growing planet, from a sustainability perspective, is reducing the energy burden of housing. Building with advanced materials, such as self-cleaning glass and lightweight insulation, removes the need for excessive air conditioning or heating. Other impressive uses of advanced materials technology are low-energy bulbs that provide excellent light for reading and can be easily recycled.

 

As I finished my planning for the trip at the beginning of the year, I wondered what would change in 2013 and how travellers in the future would see things when they too reflect on the things that go unnoticed but make all of their travelling so much easier.

3,460 Views 0 References Permalink Categories: Materials Informatics, Executive Insights, Modeling & Simulation, Electronic Lab Notebook, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, forcite, dmol3, catalysis, materials, nanotechnology, atomic-scale-modeling, materials-studio, computational-chemistry, green-chemistry, quantum-chemistry

So it's January, the month of snow, cold, broken resolutions and CES, the Consumer Electronics Show. This show, a wonderland and mecca for techies, takes place in Las Vegas and features most of the world's leading electronics and gadget manufacturers. A number of their major suppliers are also there and launch products at the show.

 

One of the most talked-about products comes from Corning and is called Gorilla Glass 3 (http://www.engadget.com/2013/01/03/corning-gorilla-glass-3/). This glass material is present in a whole series of devices such as phones (including iPhones), tablet computers, televisions and notebooks. In fact, it is currently available in 975 different models or devices and has sold more than 1 billion units to date. Gorilla Glass was actually developed back in the early 1960s but mothballed until Steve Jobs needed something scratch-resistant for the screen of the original iPhone. Jobs had to convince Corning that the glass could be manufactured in sufficient quantities for the iPhone's launch. It's now used in almost all high-end smartphones, regardless of brand. The key point of these glass surfaces is to be scratch, crack and damage resistant, while being of amazing optical quality and low weight. Corning set the performance standards for Gorilla Glass 3 at a three-fold reduction in scratches in use, a 40% reduction in the number of visible scratches and a 50% boost in strength retained after a scratch occurs, which means that partially scratched devices no longer spiderweb or crack in use after the initial scratch. In short, Corning's aim was to "shatter the status quo" of glass, enabling larger, more durable, less scratched or shattered devices.

 

This process of demanding more from materials is at the heart of the technical advances that drive many advanced companies to investigate the chemical and structural nature of the materials at the heart of their new products. For example, Corning, a long-time user of Materials Studio technology from Accelrys (http://accelrys.com/resource-center/case-studies/pdf/corning.pdf), has used the technology to drive product differentiation in many different areas, including nanotechnology, glass-like materials, and a number of other applications such as glass coatings, glass bonding and adhesion: http://accelrys.com/resource-center/case-studies/pdf/corning-case-study.pdf.

 

So it is very interesting to see in statements before the show (from David Velasquez, Gorilla Glass' director of marketing and commercial operations) that Corning's team, by studying the glass' atomic structure and bonding properties, was able to "invent another kind of glass" that made the substance less brittle and less prone to scratches. Such innovations, powered by scientific understanding and insight, enable market-disrupting change and give companies a competitive edge in an increasingly challenging and evolving market. Here is the introductory slide set that Corning used for the previous generation of Gorilla Glass (http://www.corninggorillaglass.com/), and we all look forward to their announcements and product launch at CES this week.

 

If you'd like to learn more about Accelrys Materials Studio, view our on-demand webinars covering topics from new formulation approaches to materials degradation and much more.

3,017 Views 0 References Permalink Categories: Modeling & Simulation, Materials Informatics, Data Mining & Knowledge Discovery Tags: materials_studio, materials_visualizer, analysis, visualization, dmol3, simulation, materials, nanotechnology, conferences, atomic-scale-modeling, materials-studio, computational-chemistry, quantum-chemistry

Today at the Materials Research Society meeting in Boston, Cyrus Wadia, the Assistant Director for Clean Energy and Materials Research and Development at the White House Office of Science and Technology Policy challenged the audience to “think as a network” of collaborators, not individual researchers, and to “learn from colleagues” how to better handle large data sets.

 

The recently publicized presidential Materials Genome Initiative aims to accelerate the discovery and deployment of new materials for the production and technology advantage of US industry. This initiative mirrors exactly the focus Accelrys has on integrated multi-scale modeling and laboratory-to-plant optimization. At the heart of these initiatives is improving how the innovation and design of materials is estimated and managed, from the nano scale right up to the process and macro production scale.

 

As many Accelrys users know, multi-scale modeling - in which the results of atomistic and micro-scale simulation are combined with larger-scale process models, high-throughput screening and testing, and laboratory information - yields valuable insights into materials design. The combination of computational and real analytical instruments, alongside the organization of laboratory data, allows models to be used at all length scales and can be considered a driver of the innovation cycle at many companies.
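
As a minimal sketch of the atomistic end of such a multi-scale workflow - using the open-source ASE toolkit as a generic stand-in rather than the Accelrys tools discussed here - a single-structure calculation produces the kind of low-level result (an energy) that is then passed up to meso- and process-scale models:

    # Minimal atomistic step in a multi-scale workflow, sketched with the
    # open-source ASE toolkit (a generic stand-in; not the Accelrys tools
    # discussed above). The cell energy is the sort of result handed upward.
    from ase.build import bulk
    from ase.calculators.emt import EMT  # simple effective-medium potential

    atoms = bulk("Cu", "fcc", a=3.6)     # build an fcc copper unit cell
    atoms.calc = EMT()                   # attach an inexpensive classical calculator
    print(f"Potential energy per cell: {atoms.get_potential_energy():.3f} eV")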

 

The MGI initiative clearly recognizes this function, along with the idea of developing a framework and ontologies through which information, models and data can be exchanged at different levels and different time scales. Making information more widely available to engineers and materials scientists for collaboration will help drive more accurate and systematic materials development and usage. Accelrys is very excited about this initiative and is confident its technology will help accelerate the innovation lifecycle of new materials - a clearly stated goal of the initiative.

 

How do you feel about this initiative?

763 Views 0 References Permalink Categories: Trend Watch, Data Mining & Knowledge Discovery, Modeling & Simulation, Materials Informatics Tags: materials, nanotechnology, pipeline-pilot, atomic-scale-modeling, high-throughput, materials-studio, cloud-computing

Turning Over Rocks to Find Oil

Posted by mdoyle Oct 13, 2011

As many of you know or have noticed, the price of petrol or gasoline is increasing. This is because we have reached or passed the peak of easily available and exploitable oil reserves. Of course, there are fields that become economically producible at different oil prices; however, the majority of gasoline and other petrochemical products now comes from increasingly costly and difficult reserves. Finding, locating, quantifying, producing and then operating oil reserves is a complex, multi-disciplinary process. There is a huge amount of science and technology devoted to the processing and analysis of seismic and geo-seismic data. However, I am focusing here on the drilling, evaluation, completion and production stages of extracting oil and gas from shale.

 

Shale is the most abundant of the rock types in which hydrocarbons are found, and it is commonly considered the source mineral from which the majority of the oil and gas in conventional reservoirs originated. Shale is a sedimentary (layered) rock built up from thin layers of clay or mud. There are many variations in shale geology, geochemistry and production mechanisms; these variations occur between wells and even within the same shale play. Shales are dissimilar to one another, and each shale deposit presents its own set of unique technical challenges and learning curves. That said, shale gas was recently characterized as a "once-in-a-lifetime opportunity" for U.S. manufacturing by Bayer Corp. CEO Gregory Babe.

 

Shale is unique in that it often contains both free and adsorbed gas. This results in initially high production rates, quick decline curves, and then long-term steady production. Formed millions of years ago by accumulations of silt and organic debris on lake beds and sea bottoms, the oil-bearing substances in oil shale are solid and cannot be pumped directly out of the ground. To be produced effectively, oil shale is first mined and then heated to a high temperature. An alternative, currently experimental process known as in-situ retorting involves heating the oil shale while it is still underground and then pumping the resulting liquid to the surface. When producing gas from a shale deposit, the challenge is to open up and enhance access to the gas stored within the shale's natural fracture system.

 

The complexity of the materials interactions that occur sub-surface, and the advanced challenges within the system, require that formulators and product designers at the service companies providing drilling fluids, muds and completion fluids investigate all the complex interactions and trade-offs in formulation, recipe and ingredient space.

 

766 Views 0 References Permalink Categories: Materials Informatics, Data Mining & Knowledge Discovery, Modeling & Simulation, Electronic Lab Notebook, Trend Watch Tags: pipeline_pilot, materials-studio

In a recent article, David Axe of Wired Science rightfully points out many of the pros and cons and bids farewell to the NASA shuttle program, fondly calling it "the most visually impressive high-tech boondoggle in American history".

 

 

The space shuttle, like any iconic and classic piece of engineering and science, has a finite life and operating span. That the shuttle flew so well, for so long and in such extreme conditions is a tribute to the legions of aerodynamicists, space and production engineers, and aeronautical materials and chemical engineers and scientists behind it. Just consider for one second the conditions the shuttles endured on re-entry: air flowing across the shuttle's surface at well over Mach 5 creates incredible drag forces that stress the surface, the tiles and the materials.

 

Further, this flow challenge for the shuttle's materials happens simultaneously with other challenges, such as plasma temperatures over 5000 degrees Celsius and the transition from vacuum to atmosphere. That is a pretty tough environment for anything to cope with, let alone fly afterwards. The fact that we as a species can conceive, design and build such a system - and the materials that constitute it - is a tour de force of science.

 

This is just part of the amazing heritage the shuttle leaves behind. There is also the incredible materials science that goes into the leading-edge coatings, the tiles, the thrusters, the propellant and, yes, even the solid rocket boosters. Just remember that each tile is individually made, tested, coated and affixed in place. Consider the heat cycle that the leading edge of the shuttle goes through, and how remarkable our chemical understanding of heat-mediated grain segregation and zone diffusion is. These parts display a visible rainbow of beautiful colors due to the effect extreme heat and pressure have on the alloy structure. Being able to predict and understand the performance of these parts within stringent spaceflight safety limits is again a triumph of materials science and engineering.

 

So the shuttle, rather than representing a failure or waste of engineering time, truly represents the triumph of our materials science and engineering innovation.

 

As we look forward to the next decade and the new multi-purpose crew vehicle (MPCV), derived from the Orion design of NASA's Constellation program, there are many new challenges for longer-duration missions.

 

First is thermal protection for a larger re-entry vehicle. With more astronauts the vehicle is enlarged and, consequently, on atmospheric re-entry more deceleration is required and more heat must be dissipated. Next, of course, is the radiation encountered on a long-haul or extended-duration mission to Mars, which requires advanced shielding to protect both the crew members and the sensitive command and control subsystems in the MPCV. Further, the attitude control and motor systems have to keep functioning after long idle periods in temperatures ranging from 100 degrees below zero to baking in the sun. And finally, the MPCV will fly different missions, in different orbits and of different durations, which requires a degree of flexibility without compromising the system's capabilities or its fundamental safety profile and performance.

 

For example, one might consider the use of advanced graphene-based or buckytube (carbon nanotube) based composites, such as are used in the new generation of commercial aircraft. The difficulty, or trade-off, is that the use of composites in exposed areas is complicated by the atomic oxygen present in certain orbits. This atomic oxygen is known, as Dr. Lackritz of Lockheed Martin said, to "chew up Bucky tubes and graphene" at an amazing rate, so the design challenges for these materials increase.

 

If you then think about the next generation of solid rocket boosters and attitude control systems, these too need higher performance. They need to shrink in size, extend their operating lifetime, survive longer - i.e., resist space radiation for longer - and have a lower environmental impact. A long list of somewhat contradictory or opposing requirements.

 

So the nature, demand and need for advanced materials will inevitably continue. Perhaps I would be so bold as to say it is going to increase as our demand for faster low-orbit communications, quicker travel and even space tourism takes off. This is the true nature of the challenge we face: increasing our expertise and knowledge of materials and advanced processing capabilities, while simultaneously increasing our knowledge capital and skill in designing and optimizing these materials - from their inception in the lab, to their use in a design system (conceptual and practical), to their integration in a production plant, to their use in space.

 

I feel that the closing of the space shuttle program is not a failure or a negative event at all. It merely marks the next chapter, the first step in the continuation of good old-fashioned scientific innovation. This innovation, I believe, will - as did developments like penicillin and the light bulb - move us as a society forward to greater achievements and accomplishments.

 

What do you think?

678 Views 0 References Permalink Categories: Trend Watch Tags: materials, materials-studio, materials-science, aerospace

One of the large challenges facing society today is the need for diversity of energy supply, and the need to adapt our current socio-economic infrastructure to support that goal.

 

One area where major inroads are being made is the innovation, design, development and manufacturing of solar cell systems. Here, all of the multi-scale levels of Accelrys technology can be utilized. At the microstructural level, simulation and modeling can be used to predict the structural, elastic, electronic, optical and thermodynamic properties of polycrystalline aggregates. Much has also been done on the chemical or light-driven catalysis of systems that can be part of combined-cycle energy generation. For example, in improving the photocatalytic activity of NaNbO3 for water splitting, the band gap and band edges of NaNbO3 can be tailored in silico to match the visible part of the solar spectrum and the hydrogen and oxygen redox potentials.
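
Schematically, that in-silico tailoring amounts to a screening loop. In the sketch below, predicted_band_gap is a purely hypothetical placeholder for whatever first-principles code supplies the real numbers; only the filter carries physical meaning (a gap above the 1.23 eV water-splitting minimum, yet below roughly 3.1 eV so visible light can be absorbed):

    # Schematic band-gap screening loop. predicted_band_gap() is a hypothetical
    # placeholder for a real first-principles calculation; only the filter below
    # carries physical meaning (1.23 eV water-splitting minimum, ~3.1 eV visible limit).
    def predicted_band_gap(dopant_fraction: float) -> float:
        """Hypothetical stand-in for a computed band gap in eV (invented trend)."""
        return 3.4 - 1.5 * dopant_fraction

    candidates = [
        (x, predicted_band_gap(x))
        for x in (0.0, 0.1, 0.2, 0.3, 0.4, 0.5)
        if 1.23 < predicted_band_gap(x) < 3.1
    ]
    print("Compositions worth a full calculation:", candidates)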

 

Once these devices have been invented, the fabrication and design phase begins. Here, as with so many of the technological devices we use and take for granted, there is huge complexity in production, manufacturing and fabrication. For example, in the design of solar collectors, it has been found that cells with tips on their surface - minuscule cone structures - can neutralize manufacturing defects, boosting the efficiency of the overall system by up to 80 percent. This is very significant since, as Tiffany Stecker pointed out in Scientific American, more than 50 percent of the charges generated by sunlight in conventional solar panels are lost due to manufacturing defects.

 

The Accelrys Experiment Knowledge Base (EKB) solution - which drives laboratory efficiency, supports the systematization of laboratory processes and tracks complex materials along with their associated information and data - is directly applicable to the development and scale-up of manufacturing processes for semiconductor and photovoltaic materials, systems and devices. Many of these next-generation systems rely on multi-layer materials with complex interactions and "recipes" for their performance. As each layer is added, there is complexity within the layer itself, in the way the addition is made, in the interactions between layers, and in the cleaning and re-preparation for the next layer. As the number of such "stacks" increases, along with their dimensional complexity (see the cones above), the combinations of possible errors and differences that can directly affect performance multiply. In the development and scale-up stages of device production, much optimization is performed to eliminate batch-to-batch manufacturing variation and to allow these systems to be produced at low cost, at global sites and in high volume.
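
A quick illustration of why those stacks are hard to manage (the figures are invented): the number of distinct process-condition combinations grows as the product of the options at each layer, and an experiment knowledge base is what keeps track of which combination each sample actually followed.

    # Why multi-layer stacks explode the space of things that can go wrong:
    # combinations multiply layer by layer. Figures are invented for illustration.
    layers = 8                 # layers in the device stack
    options_per_layer = 5      # e.g. recipe variants per layer

    print(f"{options_per_layer ** layers:,} possible recipe combinations")  # 390,625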

 

The EKB solution from Accelrys is a cornerstone element of the total electronic laboratory management capability that advanced companies are implementing for discovery, scale-up and manufacturing in this and other areas. It enables them not only to drive high-throughput efficiencies, but also to be first to market, to identify wider process windows and to provide compelling data and information to influence customer decisions. In essence, tracking materials through myriad inter-related production steps - each with some characterization - as the device is fabricated and then produced can be done at higher speed, with fewer errors and at greater efficiency.

 

It is interesting to note that in the semiconductor field, the idea of accelerating the discovery of new chemistries and system configurations, alongside reducing defects, time, materials and tooling - i.e., reducing major costs - is considered a compelling reason for developing such systems and using them in day-to-day laboratory activity.

 

So, as the phrase goes, “The future is so bright I’m going to have to wear shades.”

722 Views 0 References Permalink Categories: News from Accelrys, Modeling & Simulation, Scientific Databases, Lab Operations & Workflows, Data Mining & Knowledge Discovery Tags: ekb, experiment-knowledge-base, manufacturing-scale-up

Catalysis is the chemical route by which complex reactions that are difficult and costly in terms of energy are carried out. Catalysts allow the modification of basic materials or feedstocks so they can be made into complex materials or products. Then of course, as in the case of a paint, coating, cosmetic or pharmaceutical, there is a whole series of formulation and modification steps as well. However, the starting gate for this process is the design and development of the catalyst for the manufacture of the material.

 

Catalysts can also allow energy-producing reactions to occur under reasonable and economic conditions, possibly holding the key to our energy future and the sustainable use of materials. A recent article on the visible-light-driven catalytic properties and first-principles study of fluorine-doped TiO2 nanotubes indicates how the chemistry of these complex systems, and their morphology or structure, can affect their performance as promising photocatalysts for hydrogen production through water splitting under visible-light irradiation, without other modifications.

 

Nanotubes are a novel and complex species of catalyst, and their production, modification and fabrication go through many steps. As these novel systems progress from test-bed ideas into development and production, the needs of quality assurance - the definition and monitoring of raw materials, processing steps and treatment steps - come to the fore. It is very easy for one slight variation or mis-step in a complex process to affect a whole lot or batch of a production run. Further, in many catalyst systems, materials are incorporated in layers, just like a cake, each layer having a specific function and passing the products of its reaction on to the subsequent or lower layers. Again, because of the complexity of the interacting chemistry and species, the ability to track these materials and their samples from inception through processing and through complex modification, treatment and engineering steps is essential.

 

To drive better results with this precarious and prevalent process, a system is essential that enables the planning of experimental programs - ad hoc or designed experiments - for the best material or process coverage and subsequent data mining, and that tracks the step-by-step stages of experiments, gathering the data produced from instrument files, spreadsheets and databases to eliminate the manual piecing together of data. Ideally, the system also provides the scientist and catalyst developer with analysis tools, reporting and data-insight tools, and data mining or knowledge-generation tools such as neural-network or recursive-partitioning analysis of the data.
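
As a small, hedged sketch of that recursive-partitioning style of analysis - using scikit-learn on an invented catalyst data set (calcination temperature in C and metal loading in wt% against yield in %), not on any real program data:

    # Recursive-partitioning (decision tree) analysis of a tiny, invented
    # catalyst data set: [calcination temperature (C), metal loading (wt%)] -> yield (%).
    from sklearn.tree import DecisionTreeRegressor

    X = [[450, 1.0], [450, 2.5], [500, 1.0], [500, 2.5], [550, 1.0], [550, 2.5]]
    y = [62.0, 71.5, 68.0, 79.0, 64.5, 70.0]

    model = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y)
    print("Predicted yield for a new recipe:", model.predict([[520, 2.0]]))
    print("Feature importances:", dict(zip(["temperature", "loading"], model.feature_importances_)))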

 

It is important to note that such catalyst information, if systematically captured, stored and then mined, can provide key indicators of future directions for development, as well as point to areas of current formulation and materials space that have not been covered adequately for intellectual property protection. Such information would also provide a clear link to process, scale-up and process-historian data, so that scientists and chemical engineers can identify, earlier in the laboratory, good systems that stand a high probability of working in the pilot plant in a stable, multi-site or multi-feed environment.

565 Views 0 References Permalink Categories: Executive Insights, Lab Operations & Workflows, Data Mining & Knowledge Discovery, Modeling & Simulation, Electronic Lab Notebook Tags: catalysis, green-chemistry, formulations

Sustainability and green chemistry are key aspects of the future of cosmetic chemistry, science and products, as well as of a range of other consumer packaged goods. These design principles are very important and, I suggest, worth considering as we witness dramatic growth in interest and the corresponding consumer dollars driving significant demand for them.

 

The good news is that these principles do not necessarily require a change of product formulation or performance; rather, they are about how a product is made, and how, from discovery to development and manufacturing, the product moves through "green stage gates" for reaction mass, solvent and catalyst use. As Dr Liliana George, executive director of strategic developments in R&D at the Estee Lauder Companies, recently discussed, the ability of green chemistry to reduce the impact of a product without necessarily changing the product itself is key, and is a challenge faced by many cosmetic formulators: "No one is asking you to change the end product, but how you get there can be altered and improved."

 

It is in this light that catalysis, reaction planning and pathway or synthesis-step substitution become very important. The EPA's twelve green chemistry principles are a very good benchmark for developments like these:

 

  1. Prevent waste
  2. Design safer chemicals and products
  3. Design less hazardous chemical syntheses
  4. Use renewable feed-stocks
  5. Use catalysts, not stoichiometric reagents
  6. Avoid chemical derivatives
  7. Maximize atom economy, reduce wasted side reactions or mass (see the sketch after this list)
  8. Use safer solvents and reaction conditions
  9. Increase energy efficiency
  10. Design chemicals and products to degrade after use
  11. Analyze in real time to prevent adverse reactions, excessive reactions and pollution
  12. Minimize the risk or potential for accidents
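
To make principle 7 concrete, here is a minimal sketch (using the open-source RDKit; the example reaction is ordinary Fischer esterification, chosen purely for illustration) of the atom-economy calculation: the fraction of the reactants' mass that ends up in the desired product.

    # Atom economy = MW(desired product) / sum of MW(reactants), expressed as %.
    # Example: acetic acid + ethanol -> ethyl acetate + water. Requires RDKit.
    from rdkit import Chem
    from rdkit.Chem import Descriptors

    def mw(smiles: str) -> float:
        return Descriptors.MolWt(Chem.MolFromSmiles(smiles))

    reactants = ["CC(=O)O", "CCO"]   # acetic acid, ethanol
    product = "CC(=O)OCC"            # ethyl acetate (water is the by-product)

    atom_economy = 100.0 * mw(product) / sum(mw(s) for s in reactants)
    print(f"Atom economy: {atom_economy:.1f}%")  # ~83%; the remainder leaves as water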

 

The advent of virtual chemistry and formulation design allows the research and evaluation of alternatives within optimal boundaries such as cost, performance, processability and time to market. This has always been the aim of modeling and simulation technologies: to generate understanding and information. However, the need for novel and improved solutions is growing, as Liliana says and as Dr Steve Collier of Codexis showed recently at the Organic Process Research & Development meeting in Barcelona, Spain, in his presentation "The Development of Commercial Biocatalytic Processes to Simvastatin and Other Molecules using Highly Evolved Enzymes"; so too is the need for a deep understanding of chemical and bio-molecular entities. In many cases, the competing demands of discovery or idea-generation science are now being filtered or aided by information and knowledge extracted from development, processing and assembly, so that systems can be brought to market faster, more accurately and more reproducibly. The new generation of tools, in which models, data and information are treated on an equal footing, enables and supports these workflows while freeing up time for scientists and engineers to do higher-value activities such as product creation and delivery (http://ir.codexis.com/phoenix.zhtml?c=208899&p=irol-newsArticle&ID=1440102&highlight=).

869 Views 0 References Permalink Categories: Executive Insights, Modeling & Simulation, Trend Watch Tags: simulation, catalysis, green-chemistry, sustainability, formulations, cpg

As Anderson says in his post in Wired, the Petabyte Age is upon us. Just consider that in the last 30 years we have gone from 32K ferrite-core memories in room-filling machines, through VMS and paper tape, to my new laptop with a terabyte array. Similarly, we have gone from simple initial stores, to RDBMSs, and now to Google obfuscating and federating many data sources. So, what's next?

 

Anderson makes a solid argument that a holistic information map is now an unachievable goal and that what we need are locally descriptive analyses. Although he is right in much of what he says, this is a debatable point. The failure of chess-like programs in real-world scenarios is linked to the lack of contextual capability in computer science, and it follows that a wholesale abandonment of the context of information would lead to a form of data Alzheimer's. Where I totally agree with him is in his quote of George Box (of Box, Hunter and Hunter fame): "All models are wrong, but some are useful." This is very true and makes the argument for consensus models and consensus decision trees. Think of trial by jury as an analogue: multiple models, if housed in an accessible framework, can free decision making from local bias and local variance.
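
A minimal sketch of the consensus idea (with invented data and deliberately simple scikit-learn models): each model is wrong in its own way, and averaging their predictions is usually less hostage to any single model's local bias.

    # Consensus of several imperfect models: average their predictions.
    # Data and models are deliberately simple and invented for illustration.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.tree import DecisionTreeRegressor

    X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
    y = np.array([1.2, 1.9, 3.2, 3.8, 5.1, 6.1])

    models = [LinearRegression(), DecisionTreeRegressor(max_depth=2), KNeighborsRegressor(n_neighbors=2)]
    preds = [m.fit(X, y).predict([[3.5]])[0] for m in models]
    print("Individual:", preds, "Consensus:", float(np.mean(preds)))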

 

Also very interesting in Anderson's thesis is the idea that, although we can and will continue to build multiple taxonomies, the idea of a universal language - like Esperanto - is flawed. In my opinion, as a child of the 90's and the fall of monolithic states, people will adapt and morph like a flowing river, and ideas such as universal taxa cannot in any way provide the overall context. Therefore, it is in the provisioning of tools that make these taxonomies interchangeable, and in a platform that makes interactions trivial, that we will move forward towards our goal of universal information.

 

Consider what I saw yesterday: a lady at Newark airport pushing two kids in a stroller. Both children were using iPad kiddie programs. They were both drawing, learning and, in fact, sharing information. Now, these are children no older than, say, five years. They represent the future and how we will move. In their world, the idea of information having boundaries and limits is completely nonsensical. So we have to understand that the explosion of data is just beginning, and start to plan accordingly.

700 Views 0 References Permalink Categories: Data Mining & Knowledge Discovery, Trend Watch Tags: pipeline_pilot

In the business world, which as we know "never sleeps", there is a constant drive to optimize performance, cost, supply chain and, of course, profits. The pace of technology, not just locally but globally, continues on an all-pervasive "tear" to support these goals. As such, the complexity and interconnectedness of business organizations has increased.

 

Recently there has been a sea change in information strategies: a significant switch from reactive to pro-active, or predictive, information management. Many leading corporations are seeking to optimize their processes beyond simple 3-sigma levels and are even pushing beyond 6-sigma levels. To do this they need to develop actionable insights from their corporate data and information.
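
For readers less familiar with the sigma language, here is a small sketch (specification limits and measurements invented) of the process-capability arithmetic behind it; a Cpk of about 1.0 corresponds to a 3-sigma process and about 2.0 to 6 sigma.

    # Process-capability check from measured data; limits and samples are invented.
    import numpy as np

    measurements = np.array([10.02, 9.98, 10.05, 9.96, 10.01, 10.03, 9.99, 10.00])
    LSL, USL = 9.85, 10.15          # lower / upper specification limits

    mu, sigma = measurements.mean(), measurements.std(ddof=1)
    cpk = min(USL - mu, mu - LSL) / (3 * sigma)   # capability index
    print(f"mean={mu:.3f}, sigma={sigma:.4f}, Cpk={cpk:.2f}")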

 

More and more we see businesses moving in the direction of intelligent lead optimization and modeling, or predictive lead research. A key driver is that all of these processes need to interconnect the business decision-making processes, functions and layers with the research and materials-optimization layer in order to achieve much-sought-after efficiencies. For example, it is no longer enough to know how to manufacture a charge-coupled sensor for a camera: the process, production and testing results need to be linked to the customer-response system to give predictive or early warning of the modifications or tuning needed when the system is leading to high levels of product returns due to in-use failures.

 

This shift to quality by design, rather than just prescriptive testing, is the same shift that happened in the pharmaceutical area over five years ago. The FDA's process analytical technology (PAT) guidance laid the foundation for the Quality by Design (QbD) initiatives that seek to guide an industry mind-shift to quality by understanding and anticipating, rather than quality by testing after the fact.

 

Clearly, systems undergo changes along their life cycle, and raw materials can vary at an atomistic or molecular level. Efficiencies can therefore be gained if those changes can be followed right alongside the process, production, property and performance changes in a system. This is why a scientific information platform is so needed: eventually, these "concept-to-customer" systems must be integrated across "borders" - upward to the product lifecycle and ERP systems layer and downstream to the computational, process and lab information feeds or sources - to accomplish meaningful gains in cost and time to market.

697 Views 0 References Permalink Categories: Executive Insights, Trend Watch Tags: product-lifecycle-management, business-optimization
Chemistry, materials, polymers, formulations, low-throughput experiments, high-throughput experiments and virtual information now form the basis of an ever-growing data pyramid that leads to a new product. While much of this may be part of the overall corporate holistic information network, usually housed within or presented by the PLM system, increasing product complexity and performance are generating data sets that go deeper into the chemistry of the system. In the semiconductor field - where Moore's law (the doubling of chip density and performance roughly every two years) has held sway for decades - "More Moore" or "More than Moore" is being used to describe greater levels of complexity. Like Manhattan skyscrapers, the vertical packing of chip layers is achieving the density required by new devices such as the 2010 iPod Nano and iPad. These chips now include such complex layers and networks that their synthesis and production, quality control, manufacturing and processing involve a major tour de force of formulations and mixture management!

The company now seeking to manufacture a product—say, the chips in the new tablet computer I have—must manage multiple generations of designs, each involving significant formulation and material complexity, and manage that information and associated data (business, analytical and testing) through the whole product ideation, design development and sun-setting life cycle.

Good luck.

The accelerating rate of product change and the increasing need for very granular details related to product ingredients and performance require both wider connectivity and deeper, more integrated informatics across the R&D, PLM and corporate decision-making landscape. With its scientifically aware informatics and data pipelining platform, Accelrys is uniquely able to provide this critical layer in the design-through-manufacturing process. The platform's ability to automate and integrate a diverse array of scientific business processes and data enables organizations to leverage chemical, analytical and virtual simulation data within the corporate PLM suite. This is a major step forward. Ten years ago nobody would have considered virtual design part of business process management. Today, companies such as Boeing, General Motors, Ford, Honda, Apple and Procter & Gamble wouldn't think twice about incorporating their finite-element designs into their PLM layer. Hence, the next iteration of innovation will add data related not only to the shape and distribution of plastics in the package or airplane wing, but also to the elastomeric chemical nature of the plastics or coatings, and the formulation of the ingredients in the bottle or on the wing surface.

So, it looks like chemical information is here to stay as part of what is fed into the PLM data management layer, alongside virtual chemical or material performance evaluations and virtual product designs.  No more strange bedfellows.
410 Views 0 References Permalink Categories: Trend Watch, Lab Operations & Workflows, Executive Insights Tags: product-lifecycle-management