Highlights of “FutureTox: Building the Road for 21st Century Toxicology and Risk Assessment Practices”


George P. Daston (Procter & Gamble) and Catherine Willett (Humane Society of the United States)

Published: November 13, 2012

The Society of Toxicology recently held a Current Concepts in Toxicology workshop on the future of toxicology, FutureTox: Building the Road for 21st Century Toxicology and Risk Assessment Practices. The intent of the workshop was to present current thinking on how new biological, analytical, and computational approaches have the potential to change the way predictive toxicology and risk assessment are done.

Scientists from regulatory agencies, industry, and academia presented overviews and examples of new approaches to hazard identification, exposure assessment, and risk assessment practices; the presentations in each session were followed by a panel discussion. The first session – introduced by Thomas Hartung (CAAT) and Andy Maier (TERA), with presentations by Ila Cote (US EPA), Alan Boobis (Imperial College London), and Warren Glaab (Merck) – stressed the need for change, driven by the necessity of analyzing tens of thousands of compounds, mixtures, and low-dose effects, as well as by the need to overcome some of the limitations of whole-animal testing. The presentations touched on the importance of relating the new approaches to human data, including the discovery and use of biomarkers, applying objective standards to weight-of-evidence evaluations, interagency collaboration, and problem formulation in fit-for-purpose applications (designing the approach based on the required level of certainty).

In the second session, introduced by Ray Tice (NIEHS/NTP), speakers presented examples of the applied use of new types of information. Richard Judson (US EPA/NCCT) showed how ToxCast data can be used for prioritization and chemical grouping, and Russell Thomas (Hamner Institutes) discussed the application of toxicogenomics data to chemical grouping and to in vitro to in vivo extrapolation (IVIVE) that can be used for preliminary risk assessment. Richard Becker (ACC) stressed the importance of integrated strategies to focus testing. Shashi Amur (US FDA-CDER) described the use of biomarkers in FDA’s Drug Development Tool Qualification Program. Finally, William Pennie described Pfizer’s experience in developing a compound safety evaluator based on the premise that toxic compounds generally act through a variety of common pathways, whereas safer compounds are generally more restricted in function.

The third session, introduced by Mike Dellarco of the US National Children’s Study, focused on new approaches to exposure assessment. John Wambaugh (US EPA/NCCT) pointed out that uncertainty in current exposure models can span 8 orders of magnitude; however, there can be 6 or more orders of magnitude of separation between toxicity and predicted exposure estimates, so the models do not need to improve greatly to be useful. Dr. Wambaugh suggested that exposure modeling could be improved by incorporating physical/chemical information along with NHANES (National Health and Nutrition Examination Survey) data, better biomonitoring, and use modeling. Harvey Clewell (Hamner Institutes) described the application of a pathway-based approach to quantitative IVIVE (QIVIVE) that requires in vitro dose-response and biokinetic information to determine a point of departure and depends on the availability of biomonitoring data. Amin Rostami-Hodjegan (University of Manchester) discussed exposure models that address specific organ exposures as well as inter-individual variability (e.g., in pregnant women). He pointed out that the use of such modeling for drugs is facilitated by the fact that the dose is known and that simulations using in vitro data for gut lumen absorption, gut metabolism, and first-pass metabolism are fairly well established. James Bus (Dow) questioned the appropriateness of the high-dose, bolus exposures used in animal testing for understanding effects at real-world exposure levels, especially for endocrine-related effects, and emphasized the importance of characterizing not only real-world exposures but also the actual in vivo and in vitro test concentrations. Sean Hayes (Summit Toxicology) introduced the concept that concentrations of substances vary much more within an individual over time than they do between individuals; therefore, as a first-pass risk assessment for many chemicals, IVIVE could be adequately performed using pooled human blood samples. Linda Birnbaum (Director, NIEHS) discussed various projects within NIEHS and collaborations between NIH and other agencies focused on changing approaches to toxicological assessment.

The fourth and final session, introduced by Laurie Haws (ToxStrategies, Inc.), focused on new approaches to improving risk assessment. Bette Meek (University of Ottawa) described the increasing demands of global regulations to provide adequate regulatory information for large numbers of chemicals and the importance of pragmatic approaches that involve initial problem formulation and fit-for-purpose solutions. Dr. Meek also emphasized the importance of adequately evaluated mode-of-action information (e.g., using the Bradford Hill criteria) in determining inter-species relevance as well as in increasing certainty in risk assessment. Craig Rowlands (Dow) discussed toxicogenomic variability between rats and humans with respect to dioxin exposure, Ivan Rusyn (University of North Carolina) discussed the use of cell lines to explore variation among human populations, and Paul Watkins (University of North Carolina and Hamner Institutes) discussed the importance of toxicity – particularly cardiac and liver toxicity – in determining the efficacy window for drugs and the need for improved prediction of human responses. Dr. Watkins described the DILI-sim project as one approach to improving the prediction of human liver injury.

George Daston (Procter & Gamble) provided final thoughts for the workshop, noting that we have entered the Third Phase of biology: the first was classification and inference, the second was experimentalism and reductionism, and the third is synthesis, systems biology, and prediction. Previously it was thought that the spectrum of mechanisms or modes of action was so vast as to be unknowable, and that the only recourse was empirical experimentation; however, it is becoming increasingly clear that there is a defined set of primary mechanisms that can be characterized, enabling a predictive toxicology based on a deep, fundamental understanding of biology.

In summary, a great deal of effort has been expended in developing high-throughput assay systems, such as ToxCast and Tox21, and in determining how well they predict toxicity. There appears to be a growing consensus that these assays may be useful for prioritizing large numbers of chemicals for more in-depth testing, particularly when coupled with a screening-level exposure assessment. Consensus was lacking as to whether these high-throughput assay suites could be used for more definitive hazard assessment; some presenters suggested that high-information-content assays (e.g., toxicogenomics) might be more reliable, and examples were given of how this data stream can also be used to derive dose-response information.

The workshop emphasized the importance of exposure assessment in 21st century risk assessment. Computational methods, along with novel experimental designs for in vitro studies, are making it possible to extrapolate in vitro data so that they better predict the internal and administered doses that are likely to cause adverse effects in vivo. These methods will be crucial in moving in vitro toxicity assays from screening tools for hazard toward more definitive support for quantitative risk assessment. One presenter cautioned that this might mean going from high-throughput to medium-throughput chemical assessment; others felt that this was not much of a concern, as medium throughput would fulfill the needs of most regulatory agencies.
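As an illustration of the kind of quantitative in vitro to in vivo extrapolation (reverse dosimetry) described above, the sketch below converts an in vitro bioactive concentration into an oral equivalent dose using a simple steady-state model. The function names, nominal physiological values, and example inputs are illustrative assumptions only, not values or methods from any workshop presentation.

```python
# Minimal sketch of steady-state reverse dosimetry (QIVIVE).
# All parameters below are hypothetical placeholders for illustration.

def hepatic_clearance(cl_int, f_ub, q_liver=1.5):
    """Well-stirred liver model, returning clearance in L/h/kg body weight.

    cl_int : intrinsic metabolic clearance scaled to the whole body (L/h/kg)
    f_ub   : fraction of the chemical unbound in plasma
    q_liver: nominal hepatic blood flow (L/h/kg)
    """
    return (q_liver * f_ub * cl_int) / (q_liver + f_ub * cl_int)

def oral_equivalent_dose(ac50_uM, mol_weight, cl_int, f_ub, gfr=0.11):
    """Oral dose rate (mg/kg/day) predicted to produce a steady-state
    plasma concentration equal to the in vitro bioactive concentration.

    ac50_uM    : in vitro bioactive concentration (uM)
    mol_weight : molecular weight (g/mol)
    gfr        : glomerular filtration rate (L/h/kg); renal clearance is
                 approximated as GFR x fraction unbound
    """
    cl_total = gfr * f_ub + hepatic_clearance(cl_int, f_ub)   # L/h/kg
    # Steady-state plasma concentration (mg/L) for a unit dose of 1 mg/kg/day
    css_unit_mg_per_L = (1.0 / 24.0) / cl_total
    css_unit_uM = css_unit_mg_per_L * 1000.0 / mol_weight     # mg/L -> uM
    return ac50_uM / css_unit_uM                               # mg/kg/day

# Made-up example: AC50 = 5 uM, MW = 250 g/mol,
# intrinsic clearance 1.0 L/h/kg, 10% unbound in plasma.
print(oral_equivalent_dose(5.0, 250.0, 1.0, 0.10))
```

In this simplified view, the resulting oral equivalent dose can be compared directly with screening-level exposure estimates to prioritize chemicals for further testing.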

Regarding risk assessment practices, the new technologies are already providing more information about mode of action and about factors such as inter-individual variability that are currently handled almost exclusively through default uncertainty factors.

A summary of the workshop is scheduled to be published sometime in 2013.