Computational Toxicology


By Robert Kavlock, US Environmental Protection Agency

Published: December 6, 2007

About the Author(s)
Bob Kavlock received his PhD in Biology from the University of Miami in 1977 and has been with the US Environmental Protection Agency since that time. Since 2005, he has been the director of the newly formed National Center for Computational Toxicology within EPA's Office of Research and Development (ORD). Prior to that he spent 15 years as the Director of the Reproductive Toxicology Division in NHEERL/ORD, and he has held a variety of responsibilities for EPA's research programs on reproductive toxicology and endocrine disruption. Dr. Kavlock’s research interests are oriented toward the development of improved hazard and risk assessment approaches, particularly for non-cancer effects. He has published more than 160 scientific papers and 16 book chapters, and has edited three books. On the national level, he was a member of the Endocrine Disruptor Working Group of the Committee on the Environment and Natural Resources with the OSTP. On the international level, he has co-organized EDC workshops with the European Union and the Japanese Ministry of the Environment, was a co-editor of the Global Assessment of the State-of-the-Science of Endocrine Disruptors (WHO, 2001), and was a chapter coordinator for the IPCS Environmental Health Criteria Document on “Principles for Evaluating Health Risks in Children Associated with Exposure to Chemicals” (IPCS, 2006). He is active in the Society of Toxicology, where he is a past president of the Reproductive and Developmental Toxicology Specialty Section and the North Carolina Regional Chapter, and he was President of the Teratology Society (2000-2001). He was a member of the ALTX4 Study Section of NIH (1997-2001), holds adjunct appointments at Duke University and North Carolina State University, and has been on the editorial boards of the Journal of Toxicology and Environmental Health, Toxicological Sciences, and Birth Defects Part B: Developmental and Reproductive Toxicity.

Robert J. Kavlock, PhD
National Center for Computational Toxicology
Office of Research and Development
US EPA
B205-01
Research Triangle Park, NC 27711
E-mail: kavlock.robert@epa.gov

Computational toxicology is a growing research area that is melding advances in molecular biology and chemistry with modeling and computational science in order to increase the predictive power of toxicology data (Kavlock et al., 2005). This discipline offers the possibility of greater efficiency and effectiveness in determining the hazards of the many environmental stressors that must be regulated, and in deciding what types of information are most needed to decrease uncertainties in the protection of human health and the environment. Computational toxicology differs from traditional toxicology in many respects, but perhaps the most important is scale: the number of chemicals studied, the breadth of endpoints and pathways covered, the levels of biological organization examined, the range of exposure conditions considered, and the coverage by assays of life stages, genders, and species. It will take considerable technological progress in all these areas to make toxicology a broadly predictive science.

Key advances leading the field include:

  • construction and curation of large-scale data repositories necessary to anchor the interpretation of information from new technologies (Richard, 2006);
  • the introduction of virtual and laboratory-based high-throughput assays on hundreds to thousands of chemicals per day, and the development of high-content assays with hundreds to thousands of biological endpoints per sample, for the identification of toxicity pathways (Inglese et al., 2006; Dix et al., 2007); and
  • the latest advances in computational modeling that are providing the tools needed to integrate information across multiple levels of biological organization for characterization of chemical hazard and risk to individuals and populations (Di Ventura et al., 2006).
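The quantitative high-throughput screening approach cited above (Inglese et al., 2006) characterizes each chemical by fitting a concentration-response (Hill) curve to its titration data. The following sketch illustrates the idea only: the titration data are simulated, and the crude grid-search fit stands in for the nonlinear regression a real qHTS pipeline would use.

```python
import math

def hill(conc, top, ac50, n=1.0):
    """Hill equation: response at a given concentration (conc, ac50 in molar)."""
    return top * conc**n / (ac50**n + conc**n)

def fit_ac50(concs, responses, n=1.0):
    """Least-squares fit of top and AC50 by grid search.
    Illustrative only; real pipelines use proper nonlinear regression."""
    best = (float("inf"), None, None)
    for top in [t / 20 for t in range(1, 31)]:              # efficacy 0.05 .. 1.5
        for log_ac50 in [l / 10 for l in range(-90, -39)]:  # AC50 1e-9 .. 1e-4 M
            ac50 = 10.0**log_ac50
            sse = sum((r - hill(c, top, ac50, n))**2
                      for c, r in zip(concs, responses))
            if sse < best[0]:
                best = (sse, top, ac50)
    return best[1], best[2]

# Simulated 8-point titration series for a hypothetical chemical
concs = [10.0**e for e in range(-9, -1)]   # 1 nM .. 10 mM
responses = [hill(c, 0.9, 1e-6) for c in concs]

top, ac50 = fit_ac50(concs, responses)     # recovers top ~0.9, AC50 ~1e-6 M
```

In a screening context, the fitted AC50 (the concentration producing half-maximal response) and efficacy become the per-chemical summary statistics used to rank chemicals for further testing.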

Collectively, these advances reflect the wave of change that is encompassing and reinvigorating toxicology, just in time to facilitate the vision of toxicology in the 21st century that was recently released by the National Research Council (NRC) of the National Academy of Sciences (National Research Council, 2007). The NRC report’s overall objective is to foster a transformative paradigm shift in toxicology based largely on the increased use of in vitro systems that will (1) provide broad characterization of chemicals, chemical mixtures, outcomes, and life stages; (2) reduce the cost of and time for testing; (3) use fewer animals and cause minimal suffering to the animals used; and (4) develop a more robust scientific base for assessing health effects of environmental agents. The NRC report describes this effort as one that will require the involvement of multiple organizations in government, academia, industry, and the public.

Spurred on by far-reaching advances in biology, chemistry, and computer science, the tools needed to open the veritable black boxes that have limited toxicology's predictive power are becoming readily available. A good deal of this technology was developed by the pharmaceutical industry for use in drug discovery (Houck & Kavlock, 2007). Environmental chemicals differ from drug candidates in a number of important ways. For example, drugs are developed with discrete targets in mind, conform to physico-chemical properties that assist in absorption, distribution, metabolism, and excretion, have well-understood metabolic profiles, and have use patterns that are known and quantifiable. In contrast, environmental chemicals generally are not designed with biological activity in mind, cover extremely diverse chemical space, have poorly understood kinetic profiles, and are generally evaluated at exposure levels well in excess of likely real-world situations. The challenge of successfully employing these new experimental and computational technologies in toxicology will be considerable, given that they have yet to yield the significant increase in the pace of drug discovery that was expected. On the other hand, while the goal of drug discovery is to find the “needle in the haystack” using targeted screening tools, the goal of predictive toxicology is to use these tools more broadly to discern patterns of activity in chemical impacts on biological systems, and hence may be more achievable.

It will take a concerted effort on the part of government, academia, and industry to achieve the transformation that is so eagerly awaited. Success will depend on building a robust chemo-informatics infrastructure to support the field, on conducting large-scale proof-of-concept studies that integrate diverse data sources and types into a more complete understanding of biological activity, on developing a cadre of scientists comfortable with both molecular tools and mathematical modeling languages, and on convincing risk managers in regulatory agencies that the uncertainties inherent in the new approaches are smaller, or at least better characterized, than those in traditional approaches.

The rewards of such a success would be significant. More chemicals would be evaluated with more powerful, broad-based tools; animals would be used more efficiently and effectively, in bioassays designed to answer specific questions rather than to fill in checklists; and the effects of chemical mixtures would be better understood through system-level approaches that encompass the underlying biological pathways whose interactions determine responses to individual components and to the mixture as a whole.

Clearly, none of this will happen soon, or without significant investment. The NRC (2007) estimates that the paradigm shift it envisions will require a 10- to 20-year effort at about $100 million per year. This is probably several-fold more than is currently being invested in the area, and, in most cases, those funds have not been guided by an overarching strategic vision such as the one put forth by the NRC. Nonetheless, there are pockets of progress. The first successes will likely come in detecting and quantifying the interactions of chemicals with key identifiable biological targets (e.g., nuclear receptors, transporters, kinases, ion channels) and in mapping these activities to toxicity pathways and phenotypic outcomes using computational tools. Later successes will come in modeling responses that require an ever greater understanding of system-level function, ultimately leading to an understanding of susceptibility factors (whether of individual, life stage, gender, or species). All of these new methods, capabilities, and advances offer great promise for the predictive component of toxicology.
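The mapping from chemical-target interactions to toxicity pathways can be caricatured in a few lines of Python. The targets, pathways, and "hit" profiles below are purely invented placeholders for illustration, not data from any actual screening program:

```python
# Hypothetical target-to-pathway lookup (invented for illustration)
target_to_pathway = {
    "ER-alpha": "estrogen signaling",
    "AR": "androgen signaling",
    "PPAR-gamma": "lipid metabolism",
}

# Hypothetical per-chemical assay "hits" (targets the chemical interacted with)
assay_hits = {
    "chemical_A": ["ER-alpha", "PPAR-gamma"],
    "chemical_B": ["AR"],
    "chemical_C": [],
}

def pathways_perturbed(hits, mapping):
    """Return the set of toxicity pathways implicated by a chemical's hits."""
    return {mapping[t] for t in hits if t in mapping}

profiles = {chem: pathways_perturbed(hits, target_to_pathway)
            for chem, hits in assay_hits.items()}
```

In practice, of course, the mapping is many-to-many, probabilistic, and incomplete; building and validating it is precisely the system-level work the paragraph above describes.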

Disclaimer: The United States Environmental Protection Agency through its Office of Research and Development funded the research described here. It has been subjected to Agency review and approved for publication.
©2007 Robert Kavlock

References
Di Ventura, B., Lemerle, C., Michalodimitrakis, K., & Serrano, L. (2006). From in vivo to in silico biology and back. Nature, 443, 527-533.

Dix, D.J., Houck, K.A., Martin, M.T., Richard, A.M., Setzer, R.W. & Kavlock, R.J. (2007). The ToxCast program for prioritizing toxicity testing of environmental chemicals. Toxicol. Sci. (Forum), 95, 5-12.

Houck, K. & Kavlock, R.J. (2007). Understanding mechanisms of toxicity: insights from drug discovery. Toxicol. Appl. Pharmacol. (in press).

Inglese, J., Auld, D.S., Jadhav, A., Johnson, R.L., Simeonov, A., Yasgar, A., Zheng, W. & Austin, C.P. (2006). Quantitative high-throughput screening: a titration-based approach that efficiently identifies biological activities in large chemical libraries. Proc. Natl. Acad. Sci. USA, 103, 11473-11478.

Kavlock, R., Ankley, G.T., Collette, T., Francis, E., Hammerstrom, K., Fowle, J., Tilson, H., Toth, G., Schmieder, P., Veith, G.D., Weber, E., Wolf, D.C. & Young, D. (2005). Computational toxicology: framework, partnerships, and program development. Reprod. Toxicol., 19, 265-280.

National Research Council. (2007). Toxicity Testing in the Twenty-first Century: A Vision and a Strategy. The National Academies Press, Washington, DC.

Richard, A. (2006). The future of toxicology – predictive toxicology: an expanded view of “chemical toxicity.” Chem. Res. Toxicol., 19(10), 1257-1261.
