Toxicity Testing in the 21st Century

Melvin E. Andersen, The Hamner Institutes for Health Sciences
Daniel Krewski, Institute of Population Health
Ellen Mantus, National Academy of Sciences
Lauren Zeise, California Environmental Protection Agency¹

Published: March 31, 2008

Toxicity testing has traditionally relied on studies of adverse health outcomes observed in animals at high doses, with subsequent extrapolation to expected human responses at much lower doses. These approaches date back to the 1950s, when knowledge of the biology underlying toxic responses was primitive. A recent report from the National Research Council (NRC), Toxicity Testing in the 21st Century (1), has proposed fundamentally new directions for toxicity testing in light of advances in understanding biological responses to chemical stressors.

The core of the committee’s vision for the future involves the mapping of toxicity pathways in human tissues, and the identification of critical pathway perturbations responsible for toxic responses. Dose-response relationships for pathway perturbations can then be described quantitatively through biologically based modeling of toxicity-pathway circuitry and human pharmacokinetics. When the vision is fully implemented, regulation will be based on avoidance of biologically significant perturbations of key human toxicity pathways, rather than on the current practice of assessing human health risks based on high-dose responses in animals and the use of questionable assumptions to extrapolate such findings to predict low-dose risks in people.

The main elements of the committee’s vision are shown in Figure 1. Chemical characterization involves the collection of data on physical and chemical properties, use, environmental concentrations and stability, likely routes of human exposure, the potential for bioaccumulation, metabolites and breakdown products, molecular interactions with cellular components, and potential toxic properties.

Today, cell biologists working in many fields are enhancing knowledge of cellular-response networks and elucidating the manner in which environmental agents perturb pathways to cause changes in cell behaviors. The NRC report defined toxicity pathways as biologic pathways that, when sufficiently perturbed, can lead to adverse health outcomes. Despite this new terminology, toxicity pathways are actually normal cellular-response pathways that can be targeted by environmental agents. A parallel exists in the field of carcinogenesis, in which genes that code for proteins involved in cell growth are designated as oncogenes or tumor-suppressor genes.

The vision emphasizes the development of suites of predictive, high-throughput assays that use primary cells or cell lines, preferably of human origin, to evaluate relevant perturbations in key toxicity pathways (2,3). Those assays may measure relatively simple processes, such as binding of environmental agents with cellular proteins and changes in gene expression caused by that binding, or they may measure more integrated responses, such as cell division and cell differentiation. Although the majority of the anticipated toxicity tests are expected to use high-throughput methods, other tests could include medium-throughput assays for integrated responses, such as cytotoxicity, cell proliferation, and apoptosis. Functional genomics will play a prominent role in identifying critical alterations in gene expression involved in toxicity-pathway perturbation (4,5). Computational biology (6) will play a central role in supporting dose-response modeling of toxicity-pathway function. Coupling high-throughput testing with pathway-based models will provide the basis for a quantitative assessment of human health risk from exposure to environmental agents.
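As a concrete illustration of how such high-throughput assay data might feed quantitative analysis, the sketch below fits a Hill function to a hypothetical in vitro concentration-response curve. The readings, parameter values, and simple grid-search fit are invented for illustration; they are not drawn from the NRC report or any specific assay platform.

```python
# Hypothetical sketch: summarizing an in vitro concentration-response
# curve with a Hill function, a common descriptor for pathway-assay data.
# All numbers below are invented for illustration.

def hill(conc, emax, ec50, n):
    """Fractional pathway response at a given concentration (uM)."""
    return emax * conc**n / (ec50**n + conc**n)

# Synthetic assay readings: (concentration in uM, observed response)
data = [(0.1, 0.01), (0.3, 0.04), (1.0, 0.17), (3.0, 0.46),
        (10.0, 0.78), (30.0, 0.93), (100.0, 0.98)]

# Crude grid search for the EC50, fixing emax = 1 and n = 1 for simplicity:
def sse(ec50):
    """Sum of squared errors between model and observed responses."""
    return sum((hill(c, 1.0, ec50, 1.0) - r) ** 2 for c, r in data)

# Scan a logarithmic grid of candidate EC50 values and keep the best fit.
best_ec50 = min((0.1 * 1.05**k for k in range(200)), key=sse)
print(f"estimated EC50 ~ {best_ec50:.1f} uM")
```

In practice a nonlinear least-squares routine would estimate all three parameters at once; the grid search is used here only to keep the sketch self-contained.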

Over time, the need for traditional animal testing would be greatly reduced, and possibly eliminated. Until prediction of metabolism can be more reliably accomplished with computational toxicology and in vitro testing, targeted testing using whole animals will likely be needed to identify toxic metabolites that require evaluation by high-throughput testing. Other in-life testing may still be required to clarify substantial uncertainties in the interpretation of toxicity-pathway data, to understand effects of representative prototype compounds from novel classes of materials, such as nanoparticles, or to fill gaps in the toxicity-pathway testing strategy to ensure that critical toxicity pathways and end points are adequately covered.

Dose-response and extrapolation modeling should provide integrative tools for interpreting toxicity-testing data (7). Dose-response analysis will be greatly informed by the use of computational methods in systems biology to describe toxicity-pathway function (8). Pharmacokinetic (PK) modeling (9) can be used to determine environmental exposures corresponding to human tissue concentrations comparable to those associated with perturbations of toxicity pathways in vitro (10). As our understanding of the factors affecting interindividual variability in response increases, host susceptibility factors can also be incorporated into PK and dose-response models. Although tools for PK and dose-response modeling are already relatively well-developed, their widespread application is often limited by a lack of adequate data for model construction.
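To illustrate the kind of in vitro-to-in vivo extrapolation described above, the sketch below uses the steady-state one-compartment relation Css = F × dose rate / CL to back-calculate the chronic oral intake corresponding to an in vitro concentration of concern. The clearance, absorption fraction, and point of departure are all assumed values chosen for illustration; real PK models are far more detailed.

```python
# Hypothetical sketch of "reverse dosimetry": given an in vitro
# concentration that perturbs a toxicity pathway, estimate the external
# exposure producing a comparable steady-state blood level.
# All parameter values below are invented for illustration.

def steady_state_conc(dose_rate_mg_per_day, clearance_l_per_day, fraction_absorbed):
    """Steady-state blood concentration (mg/L) for chronic oral intake,
    using the one-compartment relation Css = F * dose rate / CL."""
    return fraction_absorbed * dose_rate_mg_per_day / clearance_l_per_day

def equivalent_dose(target_css_mg_per_l, clearance_l_per_day, fraction_absorbed):
    """Invert the relation: external dose rate yielding the target Css."""
    return target_css_mg_per_l * clearance_l_per_day / fraction_absorbed

# Suppose the in vitro assay flags pathway perturbation at 0.5 mg/L:
in_vitro_pod = 0.5   # mg/L, assumed in vitro point of departure
CL = 120.0           # L/day, assumed whole-body clearance
F = 0.8              # assumed oral absorption fraction

dose = equivalent_dose(in_vitro_pod, CL, F)
print(f"exposure of concern ~ {dose:.0f} mg/day")   # 0.5 * 120 / 0.8 = 75
```

Host susceptibility factors could enter such a calculation as distributions over CL and F rather than point values, which is one way interindividual variability can be incorporated into PK models.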

The vision emphasizes the generation and use of population-based and human exposure data for interpreting toxicity-test results and encourages the collection of such data through human biomonitoring, environmental health surveillance, and targeted epidemiologic studies, particularly those involving molecular and genetic components (11). In vitro toxicity tests conducted in human cells can help identify specific biomarkers of exposure, biologic change, or susceptibility that can be investigated directly in human populations.

Risk management decision-making may vary according to the risk context that creates the need for toxicity-testing information. Commonly encountered scenarios include evaluation of potential environmental agents, existing environmental agents, sites of environmental contamination, environmental contributors to a human disease, and the relative risk of different environmental agents. Although outside the scope of the committee’s report (1), risk decisions based on regulatory, advisory, economic, community-based, or technological interventions will involve consideration of a range of extra-scientific factors, including psychosocial and economic factors (12).

The vision for toxicity testing in the 21st century articulated by the NRC (1) represents a paradigm shift away from adverse effects observed in experimental animals at high doses toward identifying and avoiding biologically significant perturbations of key toxicity pathways. The vision advocates new approaches to toxicity testing within each of its main components which, collectively, will ultimately lead to a transformation in the way the potential health risks of environmental agents are assessed. The vision takes full advantage of scientific advances in toxicity testing that enhance our understanding of how environmental agents can affect human health within a comprehensive, integrated framework. In addition to the scientific approaches to toxicity testing noted previously, systems biology is anticipated to play a prominent role in integrating cellular, molecular, genomic, and dosimetry data from different sources (13). The use of these tools will facilitate a more mechanistic understanding of dose-response relationships across a range of doses relevant to humans, including those reflective of human exposure patterns, and the delineation of dose ranges of concern with respect to human health.

A number of challenges will need to be met to ensure successful implementation of the vision. There will be technical challenges in developing and refining the toxicity-testing tools and technologies that will provide the toxicity-pathway assays on which the vision rests. Reliance on toxicity-pathway perturbations as the basis for toxicologic risk assessment will require sufficient understanding of such pathways to permit a confident shift away from apical outcomes in animals. The identification and mapping of toxicity pathways of relevance to humans will require a concerted effort among multiple scientific disciplines, a challenge somewhat analogous to the commitment made to mapping the human genome.

Regulatory authorities will need to consider how current risk assessment practices can be adapted to make use of the types of toxicity-testing data underlying the committee’s vision. Lawmakers will need to encourage flexibility in the interpretation of regulatory statutes such as the Toxic Substances Control Act (TSCA), or possibly update them, to reflect reliance on biologically significant perturbations of key toxicity pathways, rather than on the adverse effects produced when such perturbations are maintained over extended periods in experimental animal tissues.

Once implemented, the committee’s vision will strengthen our ability to protect people from the potential risks posed by environmental agents, while permitting the continuing incorporation of new knowledge about toxicity pathways and their function into a modern toxicity-testing paradigm geared toward quantitative assessment of human health risks at relevant exposures.

Figure 1. The proposed vision for toxicity testing includes chemical characterization, toxicity testing, and dose-response and extrapolation modeling. At each step, population-based data and human exposure information are considered in the context of the data needed for decision-making. Source: NRC (1). Reprinted with permission; copyright 2007, National Academy Press.

¹This communication is based in large part on the final report of the National Research Council (NRC) Committee on Toxicity Testing and Assessment of Environmental Agents (1). The committee was composed of the following members: Daniel Krewski (Chair), Daniel Acosta, Jr., Melvin Andersen, Henry Anderson, John Bailar III, Kim Boekelheide, Robert Brent, Gail Charnley, Vivian Cheung, Sidney Green, Karl Kelsey, Nancy Kerkvliet, Abby Li, Lawrence McCray, Otto Meyer, D. Reid Patterson, William Pennie, Robert Scala, Gina Solomon, Martin Stephens, James Yager, Jr., and Lauren Zeise. The NRC project director was Ellen Mantus.
©2008 Melvin Andersen, Daniel Krewski, Ellen Mantus, and Lauren Zeise

  1. National Research Council (NRC). (2007). Toxicity Testing in the 21st Century: A Vision and a Strategy. National Academy Press, Washington, DC.
  2. Inglese, J., Auld, D.S., Jadhav, A., Johnson, R.L., et al. (2006). Quantitative high-throughput screening: a titration-based approach that efficiently identifies biological activities in large chemical libraries. Proc. Natl. Acad. Sci. USA 103, 11473.
  3. Austin, C.P., Brady, L.S., Insel, T.R. & Collins, F.S. (2004). NIH Molecular Libraries Initiative. Science 306, 1138.
  4. Hoheisel, J.D. (2006). Microarray technology: beyond transcript profiling and genotype analysis. Nat. Rev. Genet. 7, 200.
  5. National Research Council. (2007). Validation of Toxicogenomic Technologies: A Workshop Summary. National Academy Press, Washington, DC.
  6. Aldridge, B.B., Burke, J.M., Lauffenburger, D.A. & Sorger, P.K. (2006). Physicochemical modelling of cell signalling pathways. Nat. Cell Biol. 8, 1195.
  7. Andersen, M.E., Yang, R.S., French, C.T., Chubb, L.S. & Dennison, J.E. (2002). Molecular circuits, biological switches, and nonlinear dose-response relationships. Environ. Health Perspect. 110 (6), 971.
  8. Alon, U. (2006). An Introduction to Systems Biology: Design Principles of Biological Circuits. CRC Press, Boca Raton, FL.
  9. Reddy, M.B., Yang, R.S.H., Clewell III, H.J. & Andersen, M.E., Eds. (2005). Physiologically Based Pharmacokinetics: Science and Application. John Wiley & Sons, Hoboken, NJ.
  10. Blaauboer, B. (2003). The integration of data on physico-chemical properties, in vitro-derived toxicity data and physiologically based kinetic and dynamic modelling as a tool in hazard and risk assessment. A commentary. Toxicol. Lett. 138, 161.
  11. National Research Council. (2006). Human Biomonitoring for Environmental Chemicals. National Academy Press, Washington, DC.
  12. Krewski, D., Hogan, V., Turner, M., Zeman, P., et al. (2007). An integrated framework for risk management and population health. Hum. Ecol. Risk Assess. 13, 1288.
  13. Andersen, M.E., Thomas, R.S., Gaido, K.W. & Conolly, R.B. (2005). Dose-response modeling in reproductive toxicology in the systems biology era. Reprod. Toxicol. 19, 327.