News

US National Research Council calls for the replacement of animal toxicity tests with superior human-based tests.

Toxicity Testing in the Twenty-first Century: A Vision and a Strategy

By the National Research Council Committee on Toxicity Testing and Assessment of Environmental Agents
Report in brief: http://dels.nas.edu/dels/rpt_briefs/Toxicity_Testing_final.pdf

On 12 June a far-sighted report, commissioned by the US Environmental Protection Agency (EPA) and compiled by the US National Research Council (NRC) Committee on Toxicity Testing and Assessment of Environmental Agents, was published. Comparing the coming revolution in toxicity testing to the discovery of penicillin, the elucidation of the DNA double helix and the development of computers (p18), the Committee outlined its vision for the future of testing environmental chemicals for their ability to harm humans: one that relies on human-relevant in vitro tests, computer modelling and epidemiology (the study of disease in populations).

The Committee notes that a multitude of chemicals in the environment have not been thoroughly tested, owing to the cumbersome, expensive and uncertain nature of the animal tests on which current toxicity testing is founded (p37). Moreover, evaluating a wide range of chemical mixtures, which represent more realistic exposure scenarios, is impossible with current animal-based methods because of the huge variety of combinations and quantities of chemicals that could potentially be assessed (p37). However, the new “high-throughput” in vitro tests should make this possible (p70).

The report repeatedly acknowledges that animal tests are of dubious relevance, owing to the difficulty of projecting effects seen in laboratory animals, often at unrealistic doses, onto human populations (e.g. p81). The authors are clear that the emphasis must shift from unwieldy whole-animal studies to rapid, comparatively inexpensive and relevant tests using (preferably) human cells, exploiting our growing understanding of how damage occurs at the genetic and cellular level. They also point out that in vitro tests, including those using human tissues, are supported by an extensive scientific literature and by proven contributions to many fields, such as cancer research (p105). Indeed, the Committee notes that sophisticated tests examining how toxic damage comes about are already increasingly relied upon in toxicity testing, and that we need to move towards accounting for, and anticipating, individuals’ differing responses to chemicals (p82). The Committee advocates making the most of computer-modelling expertise and recognises the importance of monitoring populations to discover whether exposure to certain chemicals makes illness more likely.

The authors expect the “paradigm shift” to encounter resistance, as toxicological testing practices are “deeply ingrained” (p25). They state that, for the new approach to succeed, “policies designed to overcome tendencies to resist novel approaches and maintain the status quo will be important” (p103). On the validation of the newer technologies, the report acknowledges that it does not make sense to compare results from human cells with results from animal tests (p90); instead, it advocates comparisons using chemicals whose effects in humans are already known, probably with parallel testing systems (i.e. retaining the old tests alongside the new) in the early stages of the transformation. Nevertheless, the Committee envisions that environmental toxicity testing will be radically overhauled over the next 10 years, with the animal-testing component virtually, if not actually, eliminated within the next 20 years (p96).

Since everybody stands to benefit from the Committee’s improved vision for toxicity testing (the public, the environment, businesses and laboratory animals), the report recommends that all interest groups be involved in moving the toxicity-testing process forward. It also notes that much of the research required to bring these changes about is already underway for medical and biotechnological purposes (p96).

Indeed, there are clear parallels with the safety assessment of new medicines, where regulators insist on animal studies despite decades of evidence that animal tests do not predict drug safety in humans. No fewer than 92% of drugs fail in clinical trials after successfully completing the regulatory animal-test regime. Europeans for Medical Progress, backed by 83% of GPs and 250 MPs, is calling for an independent scientific evaluation of the use of animals as surrogate humans in drug safety testing and medical research. This new report closely mirrors our proposal for a battery of state-of-the-art tests based on human biology to assess drug safety, and is a tremendous endorsement of our case for reform of drug safety testing.

Excerpts of particular interest from the Report:

P1
Change often involves a pivotal event that builds on previous history and opens the door to a new era. Pivotal events in science include the discovery of penicillin, the elucidation of the DNA double helix, and the development of computers. All were marked by inauspicious beginnings followed by unheralded advances over a period of years but ultimately resulted in a pharmacopoeia of life-saving drugs, a map of the human genome, and a personal computer on almost every desk in today’s workplace.

Toxicity testing is approaching such a scientific pivot point. It is poised to take advantage of the revolutions in biology and biotechnology. Advances in toxicogenomics, bioinformatics, systems biology, epigenetics, and computational toxicology could transform toxicity testing from a system based on whole animal testing to one founded primarily on in vitro methods that evaluate changes in biologic processes using cells, cell lines, or cellular components, preferably of human origin.

The potential benefits are clear. Fresh thinking and the use of emerging methods for understanding how environmental agents affect human health will promote beneficial changes in testing of these agents and in the use of data for decision-making. The envisioned change is expected to generate more robust data on the potential risks to humans posed by exposure to environmental agents and to expand capabilities to test chemicals more efficiently.

P2
The committee envisions a new toxicity-testing system that evaluates biologically significant perturbations in key toxicity pathways by using new methods in computational biology and a comprehensive array of in vitro tests based on human biology.

P4
Over time, the need for traditional animal testing should be greatly reduced and possibly even eliminated.

P8
The vision for toxicity testing in the twenty-first century articulated here is a paradigm shift that will not only improve the current system but transform it into one capable of overcoming current limitations and meeting future challenges.

P17
Current approaches to toxicity testing rely primarily on observing adverse biologic responses in homogeneous groups of animals exposed to high doses of a test agent. However, the relevance of such animal studies for the assessment of risks to heterogeneous human populations exposed at much lower concentrations has been questioned. Moreover, the studies are expensive and time-consuming and can use large numbers of animals, so only a small proportion of chemicals have been evaluated with these methods… Current tests also provide little information on modes and mechanisms of action, which are critical for understanding interspecies differences in toxicity, and little or no information for assessing variability in human susceptibility.

P19
The current system, which relies primarily on a complex set of whole-animal-based toxicity-testing strategies for hazard identification and dose-response assessment, has difficulty in addressing the wide variety of challenges that toxicity testing must meet today.

The large volume of new and current chemicals in commerce is not being fully assessed. One reason for the testing gaps is that the current testing is so time-consuming and resource-intensive. Furthermore, only limited mechanistic information is routinely developed to understand how most chemicals are expected to produce adverse health effects in humans. Those deficiencies limit the ability to predict toxicity in human populations that are typically exposed to much lower doses than those used in whole-animal studies. They also limit the ability to develop predictions about similar chemicals that have not been similarly tested.

P20
In many cases, daily doses in animal toxicity tests are orders of magnitude greater than those expected in human exposures. Thus, the use of high-dose animal toxicity tests for predicting risks of specific apical human end points has remained challenging and controversial. Inferring effects at lower doses is difficult because of inherent uncertainty in the nature of dose-response relationships… The vision proposed in this report offers the potential to obtain direct information on toxic effects at exposures more relevant to those experienced by human populations.

Other concerns arise about the relationship between the biology of the test species and the heterogeneous human population.

Current toxicity-testing approaches have been criticized because of their failure to consider coexposures that commonly occur in human populations. Because animal toxicity tests are time-consuming and resource-intensive and result in the sacrifice of animals, it is difficult to use them for substantial testing of chemical mixtures.

P22
The use of a comprehensive array of in vitro tests to identify relevant biologic perturbations with cellular and molecular systems based on human biology could eventually eliminate the need for whole-animal testing and provide a stronger, mechanistically based approach for environmental decision-making. Computational models could also play a role in the early identification of environmental agents potentially harmful to humans, although further testing would probably be needed. This new approach would be less expensive and less time-consuming than the current approach and result in much higher throughput.

P28
In the committee’s vision, in vitro mechanistic tests provide rapid evaluations of large numbers of chemicals, greatly reduced live-animal use, and results potentially more relevant to human biology and human exposures.

P42
Animal models of asthma have been plagued by important species differences, which limit the utility of standard toxicity-testing approaches.

P56
Differences in biotransformation and other pharmacokinetic processes can introduce error and uncertainty into the extrapolation of toxicity from animals to humans.

P77
Additional benefits and research products anticipated for use in the near term include the following:
A battery of inexpensive medium- and high-throughput screening assays that could be incorporated into tiered-testing schemes to identify the most appropriate tests or to provide preliminary results for screening risk assessments. With experience, the assays would support the phase-out of apical end-point tests.
Early cell-based replacements for some in vivo tests, such as those for acute toxicity.

P88
Testing with cellular systems derived from human tissue and from non-mammalian systems is backed by an impressive scientific literature and has a long history that includes major contributions to cancer research and the Human Genome Project.

P91
The vision for toxicity testing in the 21st century articulated here represents a paradigm shift from the use of experimental animals and apical end points toward the use of more efficient in vitro tests and computational techniques.
