The UK Home Office released new statistics on animal testing in the UK last week (21 July 2009). As reported by the BBC, the figures show a sharp rise: in 2008, 3.7 million procedures using animals were carried out in the UK, an increase of 455,000, or 14%, over 2007. Digging a bit deeper into the statistics shows some interesting trends.
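The headline numbers are easy to sanity-check: the 2008 total and the absolute increase are from the Home Office release, and the 2007 baseline follows from them.

```python
# 2008 total and the reported increase are from the Home Office release;
# the 2007 baseline is simply derived by subtraction.
total_2008 = 3_700_000
increase = 455_000
total_2007 = total_2008 - increase
pct_increase = 100 * increase / total_2007
print(f"2007 baseline: {total_2007:,}; increase: {pct_increase:.1f}%")
# → 2007 baseline: 3,245,000; increase: 14.0%
```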
The increase comes almost entirely from a sharp rise in the number of procedures on fish (278,000), followed by mice (197,000), whereas numbers for other species have remained stable or declined.
According to the report, these increases seem to be strongly related to growth in fundamental biological research, likely driven by the increasing attention to, and promise of, personalised medicine and genetic targeting.
It’s also interesting to note that, apart from breeding-related activities, medical research and pharmaceutical safety testing dominate by far. Procedures relating to ecology, and to substances used in industry, agriculture and food, account for only about 100,000, just a few percent of the total. These procedures have also declined overall, by 20% in the case of substances used in industry, for example. And there have been no tests at all on cosmetic substances, in fact none since 1998.
It will be interesting to follow the trend for industrial substances in 2009 and 2010, as the majority of substance assessments for the submission of dossiers under the REACH legislation take place. Anecdotally, talking to people at the recent SETAC conference, labs are getting busy, and in some cases are already fully booked with REACH-related work. However, that workload also includes massive amounts of data gathering, literature searching and analytical testing, as well as in silico methods such as QSAR and read-across. Quote: “Some people will soon get in a panic about closing the data gaps.”
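To make the read-across idea concrete, here is a minimal sketch of the nearest-analogue logic behind it. Everything in it is invented for illustration: the substance names, the two descriptor values per substance, and the endpoint numbers are hypothetical, not real toxicity data.

```python
import math

# Toy data: each substance maps to ((descriptor_1, descriptor_2), endpoint).
# Descriptors could stand for e.g. logP and molecular weight; all values
# here are made up purely to illustrate the mechanism.
analogues = {
    "substance_A": ((1.2, 120.0), 3.1),
    "substance_B": ((1.5, 135.0), 3.4),
    "substance_C": ((4.8, 310.0), 7.9),
}

def read_across(target_descriptors, analogues, k=2):
    """Estimate the endpoint for a data-poor substance as the mean of its
    k nearest analogues in descriptor space (Euclidean distance)."""
    ranked = sorted(
        analogues.values(),
        key=lambda item: math.dist(target_descriptors, item[0]),
    )
    nearest = ranked[:k]
    return sum(endpoint for _, endpoint in nearest) / k

# A target close to A and B in descriptor space gets an estimate near theirs.
print(read_across((1.3, 125.0), analogues))  # → 3.25
```

Real read-across workflows of course involve far more than a distance metric, including expert judgement on structural similarity and mechanistic plausibility; this only sketches the data-gap-filling step that the quote refers to.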
The question is: how are organisations accessing, processing and handling this information, and making best use of the data that’s already out there in the literature? How about a web-based ‘workbench’ geared to support toxicologists, other scientists and managers in the field in gathering, processing, sharing and reporting that information? We’ve built a proof-of-concept for anyone to take a look at and try, which includes examples of different functions, from database and document searches and predictive toxicology analysis to facility monitoring. The trick is that, because it is built on Pipeline Pilot protocols, it’s highly configurable and extensible with almost any third-party tool, so it can be built or re-modelled to match a user’s routine practices.