Emerging Technologies
Computational Toxicology
Published: December 6, 2007
Robert J. Kavlock, PhD
National Center for Computational Toxicology
Office of Research and Development
US EPA
B205-01
Research Triangle Park, NC 27711
E-mail: kavlock.robert@epa.gov
Key advances driving the field include:
- the construction and curation of large-scale data repositories necessary to anchor the interpretation of information from new technologies (Richard, 2006);
- the introduction of virtual and laboratory-based high-throughput assays that screen hundreds to thousands of chemicals per day, and the development of high-content assays with hundreds to thousands of biological endpoints per sample for the identification of toxicity pathways (Inglese et al., 2006; Dix et al., 2007; see the sketch after this list); and
- the latest advances in computational modeling that are providing the tools needed to integrate information across multiple levels of biological organization for characterization of chemical hazard and risk to individuals and populations (Di Ventura et al., 2006).
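To ground the second of these advances, the following is a minimal sketch of the kind of analysis that titration-based quantitative high-throughput screening (qHTS) of the sort described by Inglese et al. (2006) rests on: fitting a Hill concentration-response model to one chemical's readout. The data points, starting values, and names are invented for illustration; a real screen repeats such fits across thousands of chemicals.

```python
# Minimal sketch: fitting a four-parameter Hill concentration-response
# model to one chemical's titration series, in the spirit of qHTS analyses.
# All concentrations and responses below are invented placeholders.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ac50, slope):
    """Response as a function of concentration under the Hill model."""
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** slope)

# Hypothetical 7-point titration (molar concentration, % activity).
conc = np.array([1e-9, 1e-8, 1e-7, 1e-6, 1e-5, 1e-4, 1e-3])
resp = np.array([2.0, 4.0, 10.0, 35.0, 70.0, 88.0, 92.0])

params, _ = curve_fit(
    hill, conc, resp,
    p0=[0.0, 100.0, 1e-6, 1.0],  # guesses: baseline, maximum, AC50, slope
    maxfev=10000,
)
bottom, top, ac50, slope = params
print(f"Estimated AC50: {ac50:.2e} M; efficacy: {top - bottom:.1f}%")
```

Potency estimates of this kind (AC50 values) are the raw material that prioritization efforts such as ToxCast (Dix et al., 2007) build upon.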
Collectively, these advances reflect the wave of change that is sweeping through and reinvigorating toxicology, just in time to facilitate the vision of toxicology in the 21st century recently released by the National Research Council (NRC) of the National Academy of Sciences (National Research Council, 2007). The NRC report's overall objective is to foster a transformative paradigm shift in toxicology, based largely on the increased use of in vitro systems, that will (1) provide broad characterization of chemicals, chemical mixtures, outcomes, and life stages; (2) reduce the cost of and time for testing; (3) use fewer animals and cause minimal suffering to the animals used; and (4) develop a more robust scientific basis for assessing the health effects of environmental agents. The NRC report describes this effort as one that will require the involvement of multiple organizations in government, academia, industry, and the public.
Spurred on by far-reaching advances in biology, chemistry, and computer science, the tools needed to open the veritable black boxes that have limited the predictive power of toxicology are becoming readily available. A good deal of this technology was developed by the pharmaceutical industry for use in drug discovery (Houck & Kavlock, 2007). Environmental chemicals differ from drug candidates in a number of important ways. For example, drugs are developed with discrete targets in mind, conform to physico-chemical properties that assist in absorption, distribution, metabolism, and excretion, have well understood metabolic profiles, and have use patterns that are known and quantifiable. In contrast, environmental chemicals generally are not designed with biological activity in mind, cover extremely diverse chemical space, have poorly understood kinetic profiles, and are generally evaluated at exposure levels well in excess of likely real-world situations. The challenge of successfully employing these new experimental and computational technologies in toxicology will be considerable, given that they have yet to yield the significant increase in the pace of drug discovery that was expected. On the other hand, while the goal of drug discovery is to find the "needle in the haystack" using targeted screening tools, the goal of predictive toxicology is to use these tools more broadly to discern patterns of activity in chemical impacts on biological systems, and hence may be the more achievable.
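That difference in goals can be made concrete. Rather than ranking candidates against one target, predictive toxicology looks for shared patterns of activity across many assays; the sketch below clusters a handful of chemicals by their multi-assay bioactivity profiles. The chemicals, assay values, and cluster count are all hypothetical.

```python
# Minimal sketch: discerning patterns of activity by clustering chemicals
# on multi-assay bioactivity profiles (all names and values are invented).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

chemicals = ["chem_A", "chem_B", "chem_C", "chem_D"]
# Rows: chemicals; columns: scaled activity in four hypothetical assays
# (e.g., a nuclear receptor, a kinase, an ion channel, a transporter).
profiles = np.array([
    [0.9, 0.1, 0.0, 0.8],
    [0.8, 0.2, 0.1, 0.9],
    [0.0, 0.7, 0.9, 0.1],
    [0.1, 0.8, 0.8, 0.0],
])

# Correlation distance groups chemicals by the shape of their activity
# pattern rather than by raw potency alone.
tree = linkage(pdist(profiles, metric="correlation"), method="average")
for name, label in zip(chemicals, fcluster(tree, t=2, criterion="maxclust")):
    print(f"{name}: cluster {label}")
```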
It will take a concerted effort on the part of government, academia, and industry to achieve the transformation that is so eagerly awaited. Success will depend on building a robust chemo-informatics infrastructure to support the field, on conducting large-scale proof-of-concept studies that integrate diverse data sources and types into a more complete understanding of biological activity, on developing a cadre of scientists comfortable with both molecular tools and mathematical modeling languages, and on convincing risk managers in regulatory agencies that the uncertainties inherent in the new approaches are sufficiently smaller, or better characterized, than those of traditional approaches.
The rewards of such a success would be significant. More chemicals will be evaluated with more powerful, broad-based tools; animals will be used more efficiently and effectively, in bioassays designed to answer specific questions rather than to fill in checklists; and the effects of chemical mixtures will be better understood through system-level approaches that encompass the underlying biological pathways whose interactions determine the individual and combined responses to the components of mixtures.
Clearly, all of this will not happen soon, or without significant investment. The NRC (2007) estimates that a 10- to 20-year effort at about $100 million per year will be required for the paradigm shift it envisions. This is probably several-fold more than is being invested in the area currently and, in most cases, those funds have not been guided by an overarching strategic vision such as that put forth by the NRC. Nonetheless, there are pockets of progress, and the first successes will likely be seen in the ability to detect and quantify the interactions of chemicals with key identifiable biological targets (e.g., nuclear receptors, transporters, kinases, ion channels) and to map those activities to toxicity pathways and phenotypic outcomes using computational tools. Later successes will come in modeling responses that demand ever greater understanding of system-level function, ultimately leading to an understanding of susceptibility factors (be they of individual, life stage, gender, or species). All of these new methods, capabilities, and advances offer great promise for the predictive component of toxicology.
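A toy version of that first step, mapping measured target interactions onto toxicity pathways, is sketched below. The target-to-pathway table and the hit calls are invented placeholders; a working system would draw them from curated pathway annotations and from fitted screening data.

```python
# Minimal sketch: rolling per-target hit calls for one chemical up to
# pathway-level flags. The mapping below is an invented placeholder.
TARGET_TO_PATHWAYS = {
    "ER_alpha": ["estrogen signaling"],         # nuclear receptor
    "SRC_kinase": ["growth factor signaling"],  # kinase
    "hERG": ["cardiac ion transport"],          # ion channel
    "OATP1B1": ["hepatic transport"],           # transporter
}

def pathways_hit(target_hits):
    """Return the set of pathways with at least one active target."""
    flagged = set()
    for target, active in target_hits.items():
        if active:
            flagged.update(TARGET_TO_PATHWAYS.get(target, []))
    return flagged

# Hypothetical screening result for a single chemical.
hits = {"ER_alpha": True, "SRC_kinase": False, "hERG": True, "OATP1B1": False}
print(pathways_hit(hits))  # {'estrogen signaling', 'cardiac ion transport'} (order may vary)
```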
Disclaimer: The United States Environmental Protection Agency through its Office of Research and Development funded the research described here. It has been subjected to Agency review and approved for publication.
©2007 Robert Kavlock
References
Di Ventura, B., Lemerle, C., Michalodimitrakis, K. & Serrano, L. (2006). From in vivo to in silico biology and back. Nature 443, 527-533.
Dix, D.J., Houck, K.A., Martin, M.T., Richard, A.M., Setzer, R.W. & Kavlock, R.J. (2007). The ToxCast program for prioritizing toxicity testing of environmental chemicals. Toxicol. Sci. 95, 5-12.
Houck, K.A. & Kavlock, R.J. (2007). Understanding mechanisms of toxicity: insights from drug discovery. Toxicol. Appl. Pharmacol. (in press).
Inglese, J., Auld, D.S., Jadhav, A., Johnson, R.L., Simeonov, A., Yasgar, A., Zheng, W. & Austin, C.P. (2006). Quantitative high-throughput screening: a titration-based approach that efficiently identifies biological activities in large chemical libraries. Proc. Natl. Acad. Sci. USA 103, 11473-11478.
Kavlock, R., Ankley, G.T., Collette, T., Francis, E., Hammerstrom, K., Fowle, J., Tilson, H., Toth, G., Schmieder, P., Veith, G.D., Weber, E., Wolf, D.C. & Young, D. (2005). Computational toxicology: framework, partnerships, and program development. Reprod. Toxicol. 19, 265-280.
National Research Council. (2007). Toxicity Testing in the Twenty-first Century: A Vision and a Strategy. The National Academies Press, Washington, DC.
Richard, A. (2006). The future of toxicology – predictive toxicology: an expanded view of "chemical toxicity." Chem. Res. Toxicol. 19, 1257-1261.