A Paradigm Shift in Toxicity Testing is Inevitable


Kim Boekelheide, Brown University

Published: August 11, 2008

About the Author(s)
Kim Boekelheide is Professor of Pathology and Laboratory Medicine at the Brown University School of Medicine. He received his B.A. from Harvard University, and M.D. and Ph.D. from Duke University. His research examines fundamental molecular mechanisms by which environmental and occupational toxicants induce testicular injury. Current projects include the study of co-exposure synergy using model testicular toxicants and the effects of in utero endocrine disruptor exposure on steroidogenesis and a predisposition to cancer. He is Director of the Brown University Superfund Basic Research Program. His research has been continuously funded by the National Institute of Environmental Health Sciences since 1985 and he has received several awards including a Burroughs Wellcome Toxicology Scholar Award (1994-1999).

Dr. Kim Boekelheide
Brown University
Division of Biology and Medicine
Box G-A
Providence, RI 02912
E-mail: kim_boekelheide@brown.edu

In the future, toxicity testing will use emerging technologies from the ongoing revolution in our understanding of biological processes to identify the effects of chemicals on toxicity pathways using in vitro approaches. Interpreting chemically induced alterations in toxicity pathways will depend on sophisticated modeling that extrapolates from the dose-response measured in cell-based systems to human exposure. This essay asks three questions: Why should we change from today's descriptive, high-dose, animal-based toxicity testing to a mode-of-action-based testing paradigm? What will the new testing paradigm look like? And how long will it take to get there?

Why should we change?

Change is inevitable because the current system is not based on fundamentally sound science. Our existing commercial and regulatory enterprises are all geared to produce and to accept descriptive data from high-dose animal tests, and the process of interpreting this information has, to a large extent, effectively protected our health and safety for many decades. The extrapolations — across species, from high test doses to low exposures, and from descriptive endpoints in animals to their possible human correlates — are expertly performed using well-established criteria, but they are handicapped by the lack of underlying mechanistic information. To compensate, safety factors and the most conservative assumptions are employed to protect us from potential harm. It is clear that at some point in the future this will change; merely measuring phenomenology without a critical grasp of causation is not good enough. In addition, our current approach is too expensive and too slow, capable of only limited throughput because of the complexity of the study designs and the reliance on animal studies that take a long time and assess only apical (phenotypic) endpoints.

What will the new testing paradigm look like?

The National Research Council of the National Academies report entitled “Toxicity Testing in the 21st Century: A Vision and a Strategy” (1) articulates the challenges and provides a roadmap for the transition in toxicity testing that we face. The basic proposal is to re-orient testing to the molecular level, rather than observing phenotypic responses at the level of whole organisms. The panel that generated this report highlighted the concept of “toxicity pathways” within cells, as the way forward to understand and to interpret the toxicant-induced mode of action. Pathway responses are dose-dependent. At some low dose, a pathway may begin to be disrupted by a toxicant exposure, but the pathway will continue to function, due to a homeostatic response (an “adaptive” behavior). At a higher dose, the adaptive response is overwhelmed, and an adverse effect takes place. While some degree of toxicant-induced adversity can be repaired, ultimately a dose will be reached that causes an irreversible change in pathway function with severe consequences for the cell. The concept of toxicity pathways is very similar to that of oncogenes and tumor suppressor genes as the pro-carcinogenic and anti-carcinogenic pathway components in carcinogenesis. For toxicity, there are protective response pathways, for example the heat shock protein response pathways, and there are pathways that produce adverse consequences, such as pro-apoptotic pathways. The characterization, integration, and interpretation of the dose-dependent changes in protective and adverse toxicity pathways are at the core of the new toxicity testing paradigm.

The NRC report’s vision identified a sequence of steps for evaluating toxicants, including: 1) chemical characterization, 2) assessment of toxicity pathway responses and targeted testing, 3) dose-response and extrapolation modeling, 4) benchmarking to population and exposure data, and 5) decision-making within risk contexts. Each of these steps requires the generation of new tools and approaches, creating a coherent system that leads to an exposure guideline. The implementation of this new strategy will require the development of: 1) a comprehensive suite of in vitro tests, preferably based on human cells, cell lines, or components, 2) computational models of toxicity pathways to support application of in vitro test results in risk assessments, 3) infrastructure changes to support the basic and applied research needed to develop the tests and pathway models, 4) validation of tests and test strategies, and 5) evidence justifying that the toxicity pathway approach is adequately predictive of adverse health outcomes in humans to use in decision-making.
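To make step 3, dose-response and extrapolation modeling, concrete, here is a minimal sketch of one common approach: fitting-free benchmark-dose (BMD) estimation on an assumed Hill-type in vitro concentration-response curve. The Hill model, its parameters, and the 10% benchmark response level are illustrative assumptions for this sketch, not prescriptions from the NRC report.

```python
# Toy benchmark-dose (BMD) estimate from an in vitro concentration-response.
# Assumes a Hill-type response curve; all parameter values are illustrative.

def hill_response(dose, emax=1.0, ed50=10.0, n=2.0):
    """Fractional pathway perturbation at a given dose (Hill model)."""
    return emax * dose**n / (ed50**n + dose**n)

def benchmark_dose(bmr=0.10, lo=0.0, hi=1000.0, tol=1e-6):
    """Smallest dose whose modeled response reaches the benchmark
    response (BMR), found by bisection on the monotonic Hill curve."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if hill_response(mid) < bmr:
            lo = mid
        else:
            hi = mid
    return hi

bmd10 = benchmark_dose(0.10)
print(f"BMD10 (10% benchmark response): {bmd10:.3f}")  # ~3.333 for these parameters
```

In practice, an exposure guideline would then be derived by applying in vitro-to-in vivo extrapolation and uncertainty factors to such a point of departure, with the curve itself fitted to measured pathway data rather than assumed.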

How long will it take to transition to this new paradigm of toxicity testing?

For guidance on this question, it is worth returning to the cancer analogy. We have been looking at oncogenes and tumor suppressors since the war on cancer began 35 years ago, and have made very substantial progress. Indeed, the pace of discovery and understanding has been consistently accelerating throughout this time. We can robustly predict the behavior of a cancer based on its molecular characteristics. Molecular biomarkers are now commonly used in clinical practice as prognostic indicators, and as guides to specific therapeutic interventions. In many cases, these molecular characteristics are on par with, or for some tumor types (leukemia and lymphoma come to mind), more valuable than traditional histopathology endpoints in contributing to disease management. Pharmaceutical companies have been investing for decades in the discovery of therapeutics targeted to those molecular pathways that are altered in cancer, and these new drugs work — think about Gleevec (imatinib) for the treatment of chronic myelogenous leukemia.

Predicting how long the transition to the new toxicity testing paradigm will actually take is pure guesswork, but it should be doable in 20 years. Changing the paradigm for toxicity testing from one based on high-dose descriptive studies in animals to one based on a synthetic molecular understanding of toxicity pathways will be much easier than the war on cancer has been. We have the advantage of starting this paradigm shift with powerful molecular tools and an appropriate intellectual framework already in place. We also have the advantage of knowledgeable people in leadership positions who are aware of the issues and willing to act. The recent publication of a Policy Forum in Science entitled “Transforming Environmental Health Protection” (2) by F.S. Collins, G.M. Gray, and J.R. Bucher of the National Human Genome Research Institute, the U.S. Environmental Protection Agency, and the National Toxicology Program of the National Institute of Environmental Health Sciences, respectively, highlights the commitment of these agencies and their leaders to this path forward.

So, how do we get from here to there?

Sound science, basic research, applied research, training a new generation of interdisciplinary scientists, investment, and hard work — the same way that other BIG challenges of the past have been tackled and overcome. Think of building the Panama Canal, carrying out the Manhattan Project, sending humans to the moon, and sequencing the human genome. Modeling of in vitro dose-response behavior, distinguishing adaptive from adverse effects, and extrapolating to human exposure levels will allow exposure guidelines to be developed from toxicity pathway effects. Getting from here to there is a big challenge that will take buy-in from numerous constituencies, a significant and persistent commitment, and lots of resources.

Note: While the author was a member of the NRC Committee on Toxicity Testing and Assessment of Environmental Agents, the views expressed here are his opinion alone and should not be construed as reflecting those of the committee or its report.
©2008 Kim Boekelheide

  1. National Research Council. (2007). Toxicity Testing in the 21st Century: A Vision and a Strategy. Washington, DC: The National Academies Press.
  2. Collins, F.S., Gray, G.M. & Bucher, J.R. (2008). Transforming environmental health protection. Science, 319, 906-907.