A New Perspective on the Past, Present and Future of Toxicity Testing


Sunita Shukla, NIH Chemical Genomics Center
Published: December 4, 2009
About the Author(s)
Dr. Shukla received her B.A. in 1999 from Saint Louis University with a major in English and a minor in Biology. Upon graduation, she completed a Master of Public Health at Saint Louis University in 2001 with a focus on Epidemiology and Environmental/Occupational Health. During her MPH, Sunita held a fellowship at Washington University School of Medicine examining worker compliance with isoniazid therapy. From 2002-2007, Sunita earned a Ph.D. in Human Genetics from the University of Chicago in the lab of Dr. Eileen Dolan, with a focus on the pharmacogenetics of anticancer agents. During this period, Sunita contributed to several peer-reviewed publications on the use of lymphoblastoid cell lines to identify genetic determinants of resistance or toxicity to various chemotherapeutic agents, including cisplatin and carboplatin. During the last year of her Ph.D., Sunita held an internship at Abbott Laboratories and worked on clinical trial protocols pertaining to concurrent therapies. In 2007, Sunita began a postdoctoral fellowship at the NIH Chemical Genomics Center, where she works in areas including the pregnane X receptor, signaling pathways, and high content imaging under the guidance of Drs. Menghang Xia and Douglas Auld. In the area of toxicology, Sunita works on in vitro toxicological assays that may be useful for generating predictive models. Such models could be used to predict the toxicity of chemical compounds in vivo, offering more cost-effective alternatives to animal testing.

Sunita J. Shukla, MPH, Ph.D.
NIH Chemical Genomics Center
9800 Medical Center Drive
Rockville, MD 20850
Email: shuklasu@mail.nih.gov

(Editors’ note: In December 2008, the founders of AltTox—the Procter & Gamble Company and the Humane Society of the United States—bestowed the North American Alternatives Award on the principal architects of the US government’s “Tox21” program (North American Alternative Awards Announced). The winners included Christopher Austin of the NIH Chemical Genomics Center (NCGC), Raymond Tice of the National Toxicology Program (NTP), and Robert Kavlock of the Environmental Protection Agency (EPA). They decided to use the $25,000 prize money to partially fund a post-doctoral fellow working on the Tox21 program at the NCGC (Tox21: Putting a Lens on the Vision of Toxicity Testing in the 21st Century). The AltTox Management Team recently invited this post-doctoral fellow—Sunita Shukla—to contribute an essay to The Way Forward series, providing her perspective as a young scientist contributing to the transformation of toxicology as it moves into the 21st century.)

Introduction

This is an exciting time to be a young post-doctoral fellow at the NIH Chemical Genomics Center (NCGC), especially as we carry out our share of the research associated with the Tox21 program. I have the opportunity to reflect on a unique point in time for toxicity testing. Whatever advancements have been made in recent years, the goal that resonates throughout the international toxicology community remains the reduction of in vivo testing and the development of alternative in vitro methods.

Traditionally, animal models have been used to extrapolate to potentially harmful events in humans. These models have been developed specifically to evaluate toxicological endpoints such as oral, dermal and ocular toxicity; immunotoxicity; genotoxicity; reproductive and developmental toxicity; and carcinogenicity. However, major initiatives, outlined below, have sought to combine existing methods with new technologies in order to use in vitro methods as predictive models of in vivo response.

As someone who has a background in public health and genetics, I agree with the need to move the field towards translational toxicology. In other words, the current paradigm should focus on using new tools and technologies to make population risk decisions regarding hazardous environmental and chemical agents. Broadly speaking, this paradigm shift will ultimately require an intersection between toxicology, genetics, informatics, and public health.

The intersection of various disciplines should enable researchers to focus on a battery of in vitro methods that detect changes in the underlying biology of toxicity pathways resulting from environmental perturbation (Andersen and Krewski, 2009). Furthermore, this paradigm shift may be a welcome movement away from traditional high-dose hazard studies in vivo (Andersen and Krewski, 2009). In order to appreciate the scientific and technological advancements that are shaping toxicity testing today, it is important to understand where they fit in the context of historical testing.

A Review of Traditional Testing Methods

As a newcomer to the field of toxicology, I have been reading about the challenges of reducing animal testing and the solutions that have been proposed. Toxicity testing has traditionally relied on animal models treated at high doses over a period of time, with the results extrapolated to human health outcomes at lower doses. This approach dates back to the 1950s, when the utilization of animal models and knowledge of the underlying biological responses were rudimentary (Krewski, et al., 2009). Furthermore, in vivo testing has been costly, time consuming and low throughput (Xia, et al., 2008). Complete toxicological profiling of one chemical in standard whole-animal toxicity assays has consisted of 8 toxicity tests: acute, sub-chronic, and chronic toxicity; reproductive toxicity; developmental toxicity; ocular and skin irritation; hypersensitivity; phototoxicity; and toxicokinetic studies (Goldberg and Frazier, 1989).

Despite the disadvantages associated with animal testing, the majority of our understanding of chemical toxicity has come from animal data (Zurlo, et al., 1993). However, extensive animal testing did not provide a mechanistic understanding of chemical toxicity, and knowledge concerning adverse risks to humans was still inadequate (Rowan, 1983). Hence, a need for more mechanistic data and a “theoretical framework for rational decision making” was noted in the early 1980s (Rowan, 1983).

More recently, numerous studies have highlighted intra- and inter-species differences in mammals, including humans. Williams and Weisburger (1993) pointed out that intra-species differences between mouse strains affect the severity and incidence of neoplasms and can affect the extrapolation of various cancers from mice to humans. Inherent resistance to spontaneous and malignant tumors in nonhuman primate models has also led to variation in the manifestation of disease across these species (Beniashvili, 1994). In addition to inter- and intra-species differences in disease models, other species differences that affect disease outcome and extrapolation include differences in basal metabolic rate, metabolic pathways, cancer type (sarcomas in mice versus carcinomas in humans), genetic aberrations associated with tumors, and telomere biology (Rangarajan and Weinberg, 2003).

In order for a new generation of toxicologists to understand how new approaches will advance the field, it is appropriate to review previous strategies and the origins of the current regulatory agencies (National Research Council (NRC), 2007). In 1938, the Food, Drug, and Cosmetic Act (FDCA) was passed, requiring that drug safety be established prior to marketing on the basis of animal toxicity studies (NRC, 2007). In the following years, clinical trials became more prevalent, and the Food and Drug Administration (FDA) developed its first protocols pertaining to toxicity testing during the 1950s (NRC, 2007). Since then, several critical pieces of legislation have established standards for food and pesticide levels.

Agencies such as the Environmental Protection Agency (EPA, 1970) and the National Toxicology Program (NTP, 1978) were created to evaluate pesticides and environmental pollutants, coordinate toxicology-based testing programs within the federal government, and provide scientific information on toxicity testing to the appropriate health agencies, professionals and the public (NRC, 2007). The role of the NTP is especially pertinent because of its ability to evaluate toxicity testing strategies based on new tests and approaches, such as its initiative to develop high throughput screening for assessing large numbers of chemical entities. Furthermore, the NTP and its activities can be seen as a successful example of a multi-disciplinary approach to the initiation and implementation of new methods in toxicity testing.

NIH Roadmap for Medical Research and Toxicity Testing in the 21st Century: A Paradigm Shift

A recent report from the National Academy of Sciences titled Toxicity Testing in the 21st Century (NRC, 2007) has become the vision and roadmap for toxicology by calling for the development and utilization of more predictive in vitro models of biological response. Furthermore, the advent of technological innovations in molecular biology has prompted the NTP to incorporate these advances into new testing strategies encompassing broader scientific knowledge from other sources. The NTP created its own roadmap, titled “A National Toxicology Program for the 21st Century” (National Toxicology Program, 2004), whose goal is to take advantage of emerging research opportunities and provide a framework for setting research priorities. The NTP Roadmap identifies three main areas for toxicity testing in the 21st century: refining traditional toxicology assays, developing rapid, mechanism-based predictive screens, and improving the overall utility of NTP products for public health decisions (National Toxicology Program, 2004). The NTP Roadmap places an increased emphasis on the use of alternative assays for identifying key pathways and molecular mechanisms linked to disease. These less expensive, higher throughput assays can be used to screen a larger number of compounds and prioritize them for testing programs. Furthermore, high throughput testing of compounds may allow for the rapid and more accurate prediction of in vivo biological responses.

In order to meet the goals outlined in the Roadmap and to address the needs outlined in the NRC report, the NTP partnered with the NIH Chemical Genomics Center (NCGC) in 2004 and the EPA in 2006. The major goals of the collaboration rest upon the identification of mechanisms involved in chemical-induced biological activity, the prioritization of chemicals for further toxicological evaluation, and the development of in vitro models that are more predictive of in vivo response (Austin, et al., 2008).

A central aspect of the Tox21 collaboration is NCGC’s ability to use its robotic automation technology to test hundreds of thousands of compounds in a titration-based format over a short period of time (Austin, et al., 2008). To fully appreciate the scope of NCGC’s role in Tox21, it is important to understand its relationship with the Molecular Libraries Initiative (MLI). The MLI is a component of the NIH Roadmap for Medical Research, established in 2004 to respond to challenges facing biomedical research and to expand the availability and use of chemical probes (Austin, et al., 2004; http://nihroadmap.nih.gov/molecularlibraries/). The MLI was born from the need for new approaches to determine the function and therapeutic potential of newly sequenced genes and the need to accelerate the pace at which basic research is translated into new therapeutics (Austin, et al., 2004).

Additional components of the MLI include: (i) the establishment of a nation-wide consortium of small molecule screening centers, including NCGC, with access to over 300,000 chemically diverse small molecules; (ii) a comprehensive database of chemical structures and their biological activities from HTS assays (www.pubchem.ncbi.nlm.nih.gov); (iii) technology development supporting new chemically diverse libraries, new high-throughput instrumentation, and assay formats that enable the design of pharmacologically relevant tools to explore cellular function (http://nihroadmap.nih.gov/molecularlibraries/); and (iv) the development of datasets and algorithms for improved prediction of the ADME/toxicity properties of small molecules in various assays and tissue types (Austin, et al., 2004).

Traditional biological assays have been low throughput, employing animal models and labor-intensive testing of samples. However, the growth of small molecule collections required the development of high throughput screening (HTS) technologies (Schnecke and Bostrom, 2006). NCGC has built upon HTS technology and developed its own quantitative high throughput screening (qHTS) paradigm, in which a titration-based screening approach is used to test upward of a million small molecules while reducing false-negative and false-positive rates. Furthermore, qHTS was developed to increase the rate and efficiency of chemical probe development, in addition to providing valuable data from various cell-based and biochemical assays that are made publicly available through PubChem. As outlined above, NCGC is an example of how technology-driven translational research has been successfully used to characterize toxicity endpoints in cell-based assays.
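To make the titration-based idea concrete, the sketch below fits a standard four-parameter Hill equation to a hypothetical seven-point titration to recover a potency value (AC50). It is a minimal illustration of the kind of concentration-response analysis qHTS enables, not NCGC’s actual analysis pipeline; the data and starting guesses are invented for the example.

```python
"""Minimal sketch of qHTS-style concentration-response fitting.

Assumes the common four-parameter Hill (logistic) model; the titration
data and parameter values below are hypothetical.
"""
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ac50, slope):
    """Four-parameter Hill concentration-response model."""
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** slope)

# Hypothetical 7-point titration (molar) and normalized responses (%).
conc = np.array([5.9e-9, 2.9e-8, 1.5e-7, 7.4e-7, 3.7e-6, 1.8e-5, 9.2e-5])
resp = np.array([2.0, 5.0, 12.0, 38.0, 71.0, 92.0, 98.0])

# Fit the curve; p0 gives rough starting guesses for the optimizer.
params, _ = curve_fit(hill, conc, resp, p0=[0.0, 100.0, 1e-6, 1.0])
bottom, top, ac50, slope = params
print(f"AC50 ~ {ac50:.2e} M, Hill slope ~ {slope:.2f}, "
      f"efficacy ~ {top - bottom:.0f}%")
```

Fitting one such curve per compound per assay is what turns a qHTS run into a matrix of potency and efficacy values suitable for downstream comparison.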

Various assays that are part of Tox21 are performed at NCGC. These assays focus primarily on general toxicity/cytotoxicity (Huang, et al., 2008; Xia, et al., 2008), pathway-specific toxicological mechanisms (Xia, et al., 2009a; Xia, et al., 2009b; Xia, et al., 2009c) and target-specific toxicological mechanisms. In one example study, cytotoxicity across various human and rodent cell lines was assessed for 1,353 compounds from the NTP collection (Xia, et al., 2008). Within the subgroup of active compounds, differential effects were identified within and across compound types, cell types and species. For example, certain human- and rodent-derived cell types were the most sensitive to compound-induced cytotoxicity, whereas human fibroblast and skin cells were the least sensitive. Overall, rodent cells demonstrated more sensitivity than human cells. A striking finding was the lack of concordance in the patterns of compound activity in cells derived from the same tissue but from different species (there were also instances where cells of similar tissue origin in the same species showed discordant compound activity profiles), highlighting intra-tissue differences in response. Variation in cellular differentiation may be one reason why specific cell types are more sensitive to compounds than others.

To examine the specific mechanisms of cytotoxicity in various tissue types, two different endpoints (cell viability and caspase activation) were assessed for the 1,353 compounds tested in human and rat cell types on a qHTS platform (Huang, et al., 2008). Hierarchical clustering based on compound EC50/IC50 patterns grouped the data by endpoint rather than by cell type, indicating that viability and caspase assays provide distinct sets of information and that most compounds induce toxicity through mechanisms in addition to caspase activation.
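As a rough illustration of this kind of analysis, the sketch below applies average-linkage hierarchical clustering to a small, invented matrix of log-potency values. The compounds, endpoints, distance metric and linkage choice are my own assumptions for the example, not the settings of the published study.

```python
"""Minimal sketch of clustering compounds by potency (EC50/IC50)
profiles across assay endpoints. All values are hypothetical."""
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Rows = compounds, columns = assay endpoints (e.g., viability or
# caspase activation in different cell types); values = -log10(EC50/M).
# Inactives are imputed here with a low potency floor (4.0).
potency = np.array([
    [6.2, 6.0, 4.0, 4.0],  # compound A: active in first two endpoints
    [6.1, 5.8, 4.0, 4.0],  # compound B: profile similar to A
    [4.0, 4.0, 7.1, 6.9],  # compound C: active in last two endpoints
    [4.0, 4.0, 7.0, 7.2],  # compound D: profile similar to C
])

# Average-linkage hierarchical clustering on Euclidean distances.
tree = linkage(pdist(potency, metric="euclidean"), method="average")
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)  # e.g., [1 1 2 2]: compounds group by activity pattern
```

Whether the resulting dendrogram groups columns by endpoint or by cell type is precisely the kind of question the study above addressed.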

Although attractive, target-based screens may lead to the identification of active compounds that do not retain their activity in a physiological environment (Xia, et al., 2009b). Thus, cell-based signaling assays offer an alternative assay format in which the readout is dependent on specific components acting on a single signaling pathway. Consequently, NCGC has applied qHTS to such assays, most recently demonstrated in the identification of compounds that potentiate the cyclic-AMP Response Element Binding (CREB) pathway (Xia, et al., 2009b). CREB is a protein involved in learning and long-term memory that binds cAMP Responsive Elements (CRE) to regulate the transcription of genes. Over 73,000 compounds were screened at 7-15 concentrations for CREB enhancer activity in a cell-based CRE reporter gene assay. Approximately 1,800 compounds were classified as potentiators of CREB activity, allowing structure-activity relationships (SAR) to be deciphered for the active compounds. Several chemical series were identified containing compounds with known and novel biological activities. Furthermore, multiple mechanisms of the CREB signaling pathway were interrogated using qHTS to better characterize compound mechanisms of action. This particular study highlights the use of qHTS coupled with various assay formats (cell-based and enzymatic), follow-up assays, SAR and multiple pathway targets in order to accelerate the development of chemical probes for drug discovery in this pathway.
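A common first step in deciphering SAR from a large set of actives is to group the compounds into chemical series that share a core. The sketch below does this with Bemis-Murcko scaffolds; the use of the open-source RDKit toolkit and the example SMILES strings are my own illustrative assumptions, not part of the published screen.

```python
"""Minimal sketch of grouping screening actives into chemical series
by Bemis-Murcko scaffold, as a starting point for SAR analysis.
Assumes the RDKit cheminformatics library; SMILES are hypothetical."""
from collections import defaultdict
from rdkit import Chem
from rdkit.Chem.Scaffolds import MurckoScaffold

actives = [
    "c1ccc2[nH]ccc2c1CCN",            # indole core with an amine tail
    "c1ccc2[nH]ccc2c1CCNC(C)=O",      # same indole core, acylated
    "c1ccc(cc1)S(=O)(=O)Nc1ccccc1",   # diaryl sulfonamide
]

# Map each active to its scaffold; compounds sharing a scaffold
# form one chemical series for SAR exploration.
series = defaultdict(list)
for smiles in actives:
    mol = Chem.MolFromSmiles(smiles)
    core = MurckoScaffold.GetScaffoldForMol(mol)
    series[Chem.MolToSmiles(core)].append(smiles)

for scaffold, members in series.items():
    print(f"{scaffold}: {len(members)} compound(s)")
```

Once compounds are binned by scaffold, comparing potencies within a series (e.g., the AC50 values from the curve fits above) reveals which substituent changes drive activity.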

Future Directions and Responsibilities

The MLI and Tox21 have become increasingly important initiatives by paving the way for new collaborations and technologies that accelerate scientific discovery. In addition, these initiatives have generated public interest in next-generation testing methods and in alternatives to existing approaches. In the midst of these initiatives, NCGC is playing a critical role in utilizing and validating in vitro approaches for modeling in vivo responses.

The examples given here are only a few of the studies that utilize qHTS and various in vitro models to study specific endpoints or pathways in humans and rodents. Furthermore, the combination of qHTS and in vitro methods has allowed hundreds of thousands of compounds to be screened in millions of wells efficiently and in a short amount of time. Most importantly, the production of concentration-response curves for every compound tested provides a data-rich resource for SAR analysis, computational modeling and chemical prioritization for more extensive toxicological evaluation. NCGC has recently developed a fragment-based approach to modeling toxicity, which is designed to achieve good prediction with structurally diverse sets of compounds (Huang, et al., 2009). Ultimately, this process enables the rapid development of chemical probes that may become novel therapeutics for a variety of diseases.
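To give a flavor of such fragment-based modeling, the sketch below scores a single hypothetical structural fragment for enrichment among toxic versus nontoxic compounds using a one-sided Fisher’s exact test. It is a minimal stand-in for, not a reproduction of, the weighted feature significance model of Huang, et al. (2009); all counts are invented.

```python
"""Minimal sketch of scoring a structural fragment by its statistical
enrichment among toxic compounds, in the spirit of fragment-based
toxicity modeling. The contingency counts are hypothetical."""
from scipy.stats import fisher_exact

# For one candidate fragment, a 2x2 table of
# [compounds with fragment, compounds without fragment]
# for the toxic and nontoxic classes.
with_frag_toxic, without_frag_toxic = 40, 60
with_frag_nontoxic, without_frag_nontoxic = 15, 285

table = [[with_frag_toxic, without_frag_toxic],
         [with_frag_nontoxic, without_frag_nontoxic]]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.1f}, one-sided p = {p_value:.2e}")

# A small p-value flags the fragment as enriched among toxic compounds;
# per-fragment scores like this can be combined into a compound-level
# toxicity prediction across a structurally diverse library.
```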

Through collaborative efforts and the expertise found at NCGC, future goals include adopting new platforms for qHTS (such as high content screening and nanotechnology), assessing the genetic variation involved in human and rodent toxicity, prioritizing toxicity assays that assess key human toxicity endpoints, expanding the set of compounds tested, and creating public databases harboring the resulting screening data.

The future of toxicology and testing in the 21st century will be a community-wide effort and will warrant continual improvements in technology and the development of alternatives to animal models that better predict public health outcomes. We are headed in the right direction, but we must not lose sight of how to bridge the enormous gap between data and regulatory practice. Thus, it is especially important for young toxicologists to continually interrogate models, methods and data from all platforms. Additionally, it is important to become involved in international efforts to refine practices and incorporate new strategies that may one day become the very foundation of toxicity testing.

©2009 Sunita Shukla

References
Andersen, M.E. & Krewski, D. (2009). Toxicity testing in the 21st century: Bringing the vision to life. Toxicol. Sci. 107, 324-330.

Austin, C.P., Kavlock, R. & Tice, R.R. (2008). Tox21: Putting a Lens on the Vision of Toxicity Testing in the 21st Century. AltTox.

Austin, C.P., Brady, L.S., Insel, T.R. & Collins, F.S. (2004). NIH Molecular Libraries Initiative. Science. 306, 1138-1139.

Beniashvili, D. (1994). Experimental Tumors in Monkeys, CRC Press.

Goldberg, A.M. & Frazier, J.M. (1989). Alternatives to animals in toxicity testing. Sci. Am. 261, 24-30.

Huang, R., Southall, N., Cho, M.H., Xia, M., Inglese, J. & Austin, C.P. (2008). Characterization of diversity in toxicity mechanism using in vitro cytotoxicity assays in quantitative high throughput screening. Chem. Res. Toxicol. 21, 659-667.

Huang, R., Southall, N., Xia, M., Cho, M.H., Jadhav, A., Nguyen, D.T., Inglese, J., Tice, R.R. & Austin, C.P. (2009). Weighted feature significance: A simple, interpretable model of compound toxicity based on the statistical enrichment of structural features. Toxicol. Sci. 112, 385-393.

Krewski, D., Andersen, M.E., Mantus, E. & Zeise, L. (2009). Toxicity testing in the 21st century: Implications for human health risk assessment. Risk Anal. 29, 474-479.

National Research Council (NRC). (2007). Toxicity Testing in the 21st Century: A Vision and a Strategy, National Academy Press, Washington, D.C.

National Toxicology Program. (2004). A National Toxicology Program for the 21st Century: A Roadmap for the Future. http://ntp.niehs.nih.gov/files/NTPrdmp.pdf

Rangarajan, A. & Weinberg, R.A. (2003). Opinion: Comparative biology of mouse versus human cells: Modelling human cancer in mice. Nat. Rev. Cancer. 3, 952-959.

Rowan, A.N. (1983). Alternatives: Interaction between science and animal welfare. In: Product Safety Evaluation (ed. AM Goldberg), pp 113-133. Mary Ann Liebert, Inc., New York.

Schnecke, V. & Bostrom, J. (2006). Computational chemistry-driven decision making in lead generation. Drug Discov. Today. 11, 43-50.

Williams, G.M. & Weisburger, J.H. (1993). Chemical carcinogenesis. In: Casarett and Doull's Toxicology: The Basic Science of Poisons, McGraw-Hill, New York.

Xia, M., Huang, R., Witt, K.L., Southall, N., Fostel, J., Cho, M.H., Jadhav, A., Smith, C.S., Inglese, J., Portier, C.J., Tice, R.R. & Austin, C.P. (2008). Compound cytotoxicity profiling using quantitative high-throughput screening. Environ. Health Perspect. 116, 284-291.

Xia, M., Guo, V., Huang, R., Inglese, J., Nirenberg, M. & Austin, C.P. (2009a). A Cell-based beta-Lactamase Reporter Gene Assay for the CREB Signaling Pathway. Curr. Chem. Genomics. 3, 7-12.

Xia, M., Huang, R., Guo, V., Southall, N., Cho, M.H., Inglese, J., Austin, C.P. & Nirenberg, M. (2009b). Identification of compounds that potentiate CREB signaling as possible enhancers of long-term memory. Proc. Natl. Acad. Sci. U.S.A. 106, 2412-2417.

Xia, M., Huang, R., Sun, Y., Semenza, G.L., Aldred, S.F., Witt, K.L., Inglese, J., Tice, R.R. & Austin, C.P. (2009c). Identification of chemical compounds that induce HIF-1alpha activity. Toxicol. Sci. 112, 153-163.

Zurlo, J., Rudacille, D. & Goldberg, A.M. (1993). Animals and Alternatives in Testing: History, Science, and Ethics, Mary Ann Liebert, Inc.