Novel Tests for Preclinical Toxicity from Toxicogenomics Gene Expression Studies

Jennifer Fostel, US National Institute of Environmental Health Sciences

Published: January 7, 2009

About the Author
Jennifer M. Fostel supports the development of the Chemical Effects in Biological Systems (CEBS) Knowledgebase at the National Institute of Environmental Health Sciences (NIEHS) in Research Triangle Park, North Carolina, USA, through a contract with SRA International. Before joining the NIEHS CEBS effort in 2004, she directed research in mechanistic toxicology and anti-infectives discovery at Abbott Laboratories (North Chicago, Illinois, USA) and Pharmacia Corporation (Kalamazoo, Michigan, USA). During her work in industry, she developed methods to analyze, annotate and integrate -omics data from transcriptomics, proteomics and metabonomics in various model organisms.

Jennifer Fostel
NIEHS
Division of Intramural Research
Building 101 – Rall Bldg, F184
111 T.W. Alexander Dr.
Research Triangle Park, NC 27709
Email: fostel@niehs.nih.gov

An important reason novel pharmaceuticals fail is concern about their safety, underscoring the need to identify toxicity earlier and more accurately during drug development. The field of toxicogenomics applies microarray technology to toxicology. Microarrays measure the expression of thousands to tens of thousands of transcripts simultaneously, creating a snapshot of the activity of biological networks. Toxicogenomics promises novel toxicity tests that could (1) detect injury with greater sensitivity than current tests and (2) detect toxicities that currently lack an approved clinical test and are often found only on necropsy of laboratory animals or with the onset of serious illness in humans. Such tests are expected to greatly enhance both human health and the drug development process. Significant progress has been made on several fronts.

Changes in gene expression first observed using a microarray to survey tens of thousands of transcripts can be converted into a test of a small number of transcripts using RT-PCR or other technologies. Such a test is a gene expression biomarker: a set of informative genes whose transcript levels change in concert with, or before, the onset of a biological change of interest such as a particular toxicity. These novel biomarker tests can be performed in target tissue or peripherally, and could predict or reflect an endpoint that is more difficult or more costly to measure directly. One example would be a set of gene expression biomarkers in blood that indicates organ damage, which would be preferable to an organ biopsy if proven equally accurate and sensitive. Another would be novel tests that reduce the length and number of pre-clinical animal studies, especially two-year carcinogenicity studies. While considerable work has been accomplished in pre-clinical safety, the extension to clinical tests remains a goal.
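
To make the winnowing step concrete, the sketch below shows one common way to reduce a genome-wide survey to a small candidate panel: rank transcripts by differential expression between treated and control samples, then keep the strongest responders for follow-up by RT-PCR. This is a minimal illustration, not the procedure used in any study described here; the expression values, sample sizes and thresholds are invented, and Python with NumPy/SciPy is assumed.

```python
# Minimal sketch: winnowing a genome-wide microarray survey down to a
# small candidate biomarker panel. All values, sizes and thresholds are
# illustrative, not taken from the studies discussed in this article.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_genes, n_treated, n_control = 10_000, 6, 6

# Rows are transcripts (probes), columns are samples (log2 expression).
treated = rng.normal(8.0, 1.0, size=(n_genes, n_treated))
control = rng.normal(8.0, 1.0, size=(n_genes, n_control))
# Spike in 25 transcripts that respond to the hypothetical toxicant.
treated[:25] += 4.0

# Per-transcript two-sample t-test: which transcripts track treatment?
t_stat, p_val = stats.ttest_ind(treated, control, axis=1)
log2_fc = treated.mean(axis=1) - control.mean(axis=1)

# Require both a small p-value and a large fold change; a real analysis
# would also correct for multiple testing (e.g. Benjamini-Hochberg).
candidates = np.where((p_val < 1e-3) & (np.abs(log2_fc) > 2.0))[0]

# The top-ranked candidates form the small panel that would then be
# re-measured by RT-PCR and validated in independent studies.
panel = candidates[np.argsort(p_val[candidates])][:10]
print(f"{candidates.size} candidates; panel probes: {panel.tolist()}")
```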

It is important to consider the relative newness of the field of toxicogenomics, and the hurdles it had to overcome before its approaches could be applied to developing novel tests for toxicity and injury. The field was defined in 1999, in a paper by Nuwaysir, Bittner, Trent, Barrett and Afshari entitled “Microarrays and toxicology: the advent of toxicogenomics”. Shortly thereafter, in 2000, the Health and Environmental Sciences Institute (HESI) of the International Life Sciences Institute (ILSI) formed a technical committee to study the Application of Genomics to Mechanisms-Based Risk Assessment. In 2004, the HESI Genomics Committee reported on hepatotoxicity, nephrotoxicity and genotoxicity studies showing (1) points of commonality among data derived from different microarray platforms and (2) biologically informative response pathways identified following treatment with model toxicants.

Biomarkers are most useful when they have been tested against a variety of conditions to maximize selectivity, sensitivity and specificity, and validated under a range of conditions in different laboratories. The HESI committee established that collaborations pooling studies performed in different laboratories, under different conditions and with different technologies can produce data that are shared and analyzed together to enhance understanding of the molecular events underlying toxicity. Since biomarkers of toxicity are of wide utility in medicine and pharmaceutical development, many industrial laboratories are joining consortia to share data and rapidly advance the validation of novel safety biomarkers.
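
As a brief aside on the validation arithmetic, the sketch below computes the sensitivity and specificity that such cross-laboratory testing is designed to estimate. The counts are invented for illustration.

```python
# Minimal sketch of the validation arithmetic: a candidate biomarker is
# scored against samples whose toxicity status is already known. The
# counts below are invented for illustration.
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of truly toxic samples the biomarker flags."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of non-toxic samples the biomarker correctly clears."""
    return tn / (tn + fp)

# Hypothetical cross-laboratory test set: 48 toxicant-treated samples
# and 52 vehicle controls.
tp, fn, tn, fp = 44, 4, 49, 3
print(f"sensitivity = {sensitivity(tp, fn):.2f}")  # 0.92
print(f"specificity = {specificity(tn, fp):.2f}")  # 0.94
```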

In parallel with the work of the HESI Committee, scientists from industrial laboratories described the characterization of libraries of toxicants, and reported fingerprints of gene expression changes capable of classifying chemicals with different toxicological mechanisms. Unfortunately, with few exceptions the data used to derive these biomarkers were not reported, and so the findings could not be reproduced or leveraged by other scientists. One important exception was the 2005 deposition by scientists from Johnson & Johnson of microarray data from tests of a library of 104 hepatotoxicants into the public toxicogenomics database CEBS (Chemical Effects in Biological Systems). These data were originally used to identify signatures of different toxicities that can classify novel agents and thus facilitate drug development. Now published, the data can be used by regulators and other scientists to evaluate additional signatures for selectivity and specificity.
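
The classification idea can be illustrated with a minimal nearest-fingerprint sketch: a novel agent's expression profile is assigned to the reference signature it correlates with best. The class names, panel size and values below are placeholders, and correlation matching merely stands in for the proprietary methods the laboratories actually employed.

```python
# Minimal sketch of fingerprint-based classification: a novel agent is
# assigned to the reference signature its expression profile correlates
# with best. Class names, panel size and values are illustrative, and
# correlation matching stands in for the methods actually used.
import numpy as np

rng = np.random.default_rng(1)
n_panel_genes = 50

# Reference fingerprints: mean expression change per toxicity class,
# as would be derived from a characterized library of toxicants.
fingerprints = {
    "peroxisome_proliferator": rng.normal(0.0, 1.0, n_panel_genes),
    "macrophage_activator": rng.normal(0.0, 1.0, n_panel_genes),
    "oxidative_stressor": rng.normal(0.0, 1.0, n_panel_genes),
}

def classify(profile: np.ndarray) -> str:
    """Return the toxicity class whose fingerprint best matches."""
    scores = {name: np.corrcoef(profile, sig)[0, 1]
              for name, sig in fingerprints.items()}
    return max(scores, key=scores.get)

# A novel agent resembling an oxidative stressor, plus assay noise.
novel = fingerprints["oxidative_stressor"] + rng.normal(0.0, 0.5, n_panel_genes)
print(classify(novel))  # expected: oxidative_stressor
```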

The validation of a novel biomarker requires regulatory approval, and in 2005 the US Food and Drug Administration (FDA) released the Guidance for Industry: Pharmacogenomic Data Submissions, describing the voluntary submission of genomics data with the aim of supporting the validation of novel biomarkers based on gene expression changes. The European Medicines Agency (EMEA) has a Pharmacogenetics Working Party to provide recommendations on all matters relating to pharmacogenetics, and has issued a joint statement with the FDA regarding the voluntary submission of genomics data. In 2007 a genomic biomarker set was submitted to the FDA by the public-private Predictive Safety Testing Consortium (PSTC). In parallel, a fully public effort sponsored by ILSI-HESI also submitted qualification data for novel renal biomarkers to the FDA and EMEA. In 2008 the PSTC also submitted a panel of renal biomarkers for validation.

In eight short years the field of toxicogenomics was defined, the utility of consortia and data sharing was established, and regulators put in place a mechanism for the submission of toxicogenomics data for review. The effort to identify novel gene expression biomarkers for regulatory approval has been advanced most effectively by consortia, supporting the argument that questions of safety are well addressed by pooling data and resources. Moving forward, novel biomarkers for cardiotoxicity are anticipated to be next, again supported by a HESI Technical Committee, with biomarkers for muscle toxicity, hepatotoxicity and cardiovascular toxicity under study by the PSTC.

These important steps are just the beginning. While safety in pre-clinical toxicology is likely to improve with the acceptance and validation of new and better markers of toxicity, the need for more sensitive clinical tests remains, as does the recommendation to reduce animal testing. For these endpoints, scientists must move the primary discovery work from animals to humans and cell culture, or discover underlying commonalities of biology that will permit extension of current work into these different systems. The goal of producing valid, accepted tests that are more sensitive and less expensive to perform is being met in pre-clinical safety, and will most likely be met in the clinic as well.
©2009 Jennifer Fostel
