’Omics – Emerging Science & Policy

Last updated: July 3, 2014

Two major areas of emerging research and policy relevant to -omics, bioinformatics, and computational toxicology are their applications in drug product development and safety, and in predictive toxicology for the hazard and risk assessment of chemicals, pesticides, and consumer products.

Regulatory Science at the US FDA

The US Food and Drug Administration’s (FDA) Critical Path Initiative and the related Critical Path Institute were established as collaborative efforts with industry, academia, and government partners to stimulate scientific and regulatory innovation in the development and evaluation of FDA-regulated medical products. One of the early plans from this initiative, the Critical Path Opportunities List, provided “examples of how new scientific discoveries—in fields such as genomics and proteomics, imaging, and bioinformatics—could be applied during medical product development to improve the accuracy of the tests used to predict the safety and efficacy of investigational medical products.” Another industry collaboration, the International Serious Adverse Event Consortium, comprises seven pharmaceutical companies working together, in consultation with the FDA, to develop toxicogenomic tests that predict human liver toxicity and other drug side effects. The consortium is identifying genetic markers, or variations in human DNA, that indicate drug-induced adverse effects such as liver toxicity. These markers can then be developed into genetic tests to determine a person’s specific risk before taking a drug, a new development known as personalized safety (echoing the field of personalized medicine). The findings will be placed in the public domain so that all companies can use the results to develop safer drugs.

One of the eight priorities identified in the FDA’s Strategic Plan for Regulatory Science (August 2011) is “Modernize toxicology to enhance product safety.”

Recognizing that new technologies and a better understanding of toxicity mechanisms and pathways could improve preclinical safety predictions, the FDA will focus this priority on:

  • Developing cellular and animal models and assays that better predict patient response, including an understanding of toxicity mechanisms at multiple levels of biological organization (genes, proteins, pathways, and cell/organ function);
  • Identifying and evaluating biomarkers for monitoring toxicities, side effects, and abnormalities that can be used in non-clinical and clinical evaluations; and
  • Developing and using computational tools and in silico modeling to integrate and draw conclusions from a wide range of preclinical safety data types.

The FDA’s National Center for Toxicological Research (NCTR) supports a number of programs that use omics, bioinformatics, and computational biology approaches for drug and/or chemical safety and risk assessment, particularly within its Division of Bioinformatics and Biostatistics and Division of Genetic and Molecular Toxicology. The NCTR also provides bioinformatics tools and databases useful in toxicology research, and is collaborating with pharmaceutical companies to devise liver toxicogenomic approaches for screening drug candidates in preclinical studies.

An important tool developed by the NCTR is ArrayTrack™, designed “for storing both microarray data and experiment parameters associated with a pharmacogenomics or toxicogenomics study. Many statistical and visualization tools are available with ArrayTrack™ which provides a rich collection of functional information about genes, proteins, and pathways for biological interpretation.” Harris et al. (2009) explain the use of ArrayTrack™ and omics data in FDA submissions. An article about omics data for herbal products also discusses the implications of omics data for regulatory submissions (Pelkonen et al., 2012).
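
ArrayTrack™ itself is a packaged application rather than a code library, but the core statistics such tools automate can be illustrated briefly. The following is a minimal sketch in Python using simulated data, not anything from ArrayTrack™: it computes per-gene log2 fold changes and t-test p-values between treated and control arrays, the kind of differential-expression summary a toxicogenomics submission might include.

```python
# Illustrative sketch only (simulated data): the per-gene statistics that a
# microarray analysis tool such as ArrayTrack(TM) automates. Not ArrayTrack code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_genes = 1000
control = rng.normal(8.0, 1.0, size=(n_genes, 4))  # log2 intensities, 4 control arrays
treated = rng.normal(8.0, 1.0, size=(n_genes, 4))  # 4 drug-treated arrays
treated[:50] += 1.5                                # simulate 50 induced genes

log2_fc = treated.mean(axis=1) - control.mean(axis=1)      # per-gene log2 fold change
t_stat, p_val = stats.ttest_ind(treated, control, axis=1)  # per-gene two-sample t-test

hits = np.where((np.abs(log2_fc) > 1.0) & (p_val < 0.01))[0]
print(f"{hits.size} genes pass |log2FC| > 1 and p < 0.01")
```

A real analysis would also adjust the p-values for multiple testing (e.g., by controlling the false discovery rate) before interpreting the gene list.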

The FDA is also working with the National Cancer Institute to identify best practices in the “development, evaluation, and translation of omics-based tests” for clinical applications (IOM, 2012).

Regulatory Science at US EPA

The US EPA has several programs in place for the development and assessment of omics, bioinformatics, and computational biology data and tools for regulatory applications. The EPA’s Interim Policy on Genomics (2002) stated that genomics data may be submitted to the EPA for consideration in the decision-making process, but that these data alone are not sufficient. The document Potential Implications of Genomics for Regulatory and Risk Assessment Applications at EPA (December 2004) was developed by the EPA’s Genomics Task Force to stimulate discussion of the implications of genomics technologies for EPA programs and policies. Since that time, a number of new programs and policies have evolved.

The EPA’s National Center for Computational Toxicology (NCCT) Computational Toxicology research program (CompTox) conducts research on new methods for managing the safety of chemicals more efficiently. CompTox research “integrates advances in molecular biology, chemistry, and computer science to identify important biological processes that may be disrupted by chemicals and tracing those biological disruptions to related dose and human exposure to chemicals.” The goal is to develop information relevant to prioritizing chemicals for more in-depth testing. EPA CompTox projects include (1) research projects in high-throughput chemical screening, exposure prediction models, and virtual simulation models (Virtual Liver and Virtual Embryo); (2) computational toxicology models and databases, including ACToR and ToxCast; and (3) grants and funding to other research centers through the Science to Achieve Results (STAR) program.
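
To make the prioritization idea concrete, here is a deliberately simplified sketch with invented chemical names and numbers, not EPA’s actual workflow: chemicals are ranked by the margin between the lowest concentration showing bioactivity in screening assays and the predicted human exposure, so that chemicals whose exposure approaches their bioactive range rise to the top of the testing queue.

```python
# Simplified prioritization sketch with invented values; not EPA's workflow.
# Chemicals whose predicted exposure approaches their lowest bioactive
# concentration receive the highest priority for further testing.
chemicals = {
    # name: (lowest bioactive concentration from in vitro screening,
    #        upper-bound predicted human exposure, in the same units)
    "chem_A": (0.5, 0.2),
    "chem_B": (50.0, 0.001),
    "chem_C": (2.0, 1.5),
}

def margin(bioactivity, exposure):
    """Smaller margin means exposure is nearer bioactivity: higher priority."""
    return bioactivity / exposure

for name in sorted(chemicals, key=lambda c: margin(*chemicals[c])):
    bio, expo = chemicals[name]
    print(f"{name}: margin = {margin(bio, expo):.1f}")
```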

The US EPA’s ToxCast™ program was established in 2007 with the goal of prioritizing chemicals for further evaluation of their potential toxicity using a large number of in vitro assays and model organisms. One component of ToxCast involves screening a chemical library using high-throughput biochemical and cell-based screening assays to measure “the chemical perturbation of critical cellular signaling pathways that may represent potential modes of chemical toxicity” (Houck et al., 2009; Kleinstreuer et al., 2014). The EPA indicates the ToxCast screening assays are being used to “help inform chemical prioritization” for chemicals in the Endocrine Disruptor Screening Program, testing under the Toxic Substances Control Act, and the Safe Drinking Water Act’s Contaminant Candidate List.
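
The statistical heart of such high-throughput screening is concentration-response modeling. The EPA maintains its own analysis pipeline for ToxCast data (the R package tcpl); the sketch below is an independent Python illustration with invented data: fit a Hill curve to an assay’s responses and report the AC50, the concentration producing half-maximal activity.

```python
# Minimal concentration-response sketch with invented data; EPA's actual
# ToxCast pipeline (the R package tcpl) is far more elaborate.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, ac50, slope):
    """Hill model: response rises from 0 toward `top`, half-maximal at `ac50`."""
    return top / (1.0 + (ac50 / conc) ** slope)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])   # uM
resp = np.array([2.0, 1.0, 5.0, 12.0, 35.0, 60.0, 72.0, 75.0])  # % of positive control

params, _ = curve_fit(hill, conc, resp, p0=[80.0, 1.0, 1.0])
top, ac50, slope = params
print(f"fitted AC50 ~ {ac50:.2f} uM (top {top:.0f}%, slope {slope:.2f})")
```

An assay “hit” would then be called when the fitted curve clears the assay’s noise threshold, and AC50 values across many assays can feed a prioritization scheme like the one sketched above.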

The report Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (External Review Draft; September 2013) explains the EPA’s Next Gen program and how new types of molecular and systems biology data and computational approaches will be used in risk assessments. “The goal of this effort is to advance risk assessment by facilitating faster, less expensive, and more robust assessments of public health risks by EPA’s Office of Research and Development. The specific aims of the program are to:

  • demonstrate proof of concept that recent advances in biology can better inform risk assessment;
  • understand what information is most useful for particular purposes (value of information);
  • articulate decision rules for use of new types of data and methods to inform risk assessment; and
  • identify important data gaps.”

Prototypes or case studies of various types of methods were used as examples in examining the specific aims. “The prototype results presented in this report demonstrate proof-of-concept for an integrated approach to risk assessment based on molecular, computational, and systems biology. In addition, they explore which types of information appear most valuable for specific purposes and articulate some decision considerations for use of data. Based on lessons learned from this effort, near-term and longer term implications for risk assessment are also discussed.”

Basic Research

Cellular phenotyping is an approach developed to analyze gene function and drug action at the cellular level. DNA microarrays have been successful in mapping changes in gene expression; cellular phenotyping extends this work to changes at the level of protein expression by using RNA interference (RNAi) to study loss-of-function effects at genome scale. For example, Cenix BioScience developed a proteome-level profiling technique based on “broad, quantitative surveys of protein levels in RNAi- and drug-treated cells using antibody-independent, mass spectrometry-based analyses.” The European MitoCheck project extended the utility of this gene-silencing approach by employing high-throughput time-lapse live-cell imaging to capture cellular phenotypic changes over time. MitoCheck collaborators used time-lapse imaging, high-throughput RNAi assays, and computational image processing to identify human genes/proteins involved in mitosis (cell division). These cell-based assay techniques are being used by other researchers to further our understanding of the role of specific proteins in cellular processes. The EU FP7 MitoSys project will build on the previous studies to develop an understanding of “mitosis from a systems biology perspective…to figure out how mitotic proteins function and interact with each other in a mitotic cell.”
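
The hit-calling logic behind such image-based, genome-scale screens can be sketched simply. The example below is a generic illustration with simulated data, not the MitoCheck pipeline: each knockdown well’s phenotype readout (here, the fraction of cells in mitosis, as would be derived from image analysis) is converted to a robust z-score against the plate distribution, and strong outliers are flagged as candidate mitosis genes.

```python
# Generic hit-calling sketch for an image-based RNAi screen (simulated data;
# not the MitoCheck pipeline). Readout: fraction of mitotic cells per well.
import numpy as np

rng = np.random.default_rng(1)
genes = [f"gene_{i}" for i in range(384)]         # one gene knockdown per well
mitotic_index = rng.normal(0.05, 0.01, size=384)  # baseline: ~5% of cells in mitosis
mitotic_index[[10, 42, 99]] += 0.15               # simulate mitotic-arrest phenotypes

# Robust z-score (median/MAD) resists the very outliers we want to detect.
med = np.median(mitotic_index)
mad = np.median(np.abs(mitotic_index - med))
z = 0.6745 * (mitotic_index - med) / mad

for gene, score in zip(genes, z):
    if abs(score) > 5:                            # conservative screen cutoff
        print(f"{gene}: robust z = {score:.1f} -> candidate mitosis regulator")
```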

Another cell-based RNAi approach, based on antibody detection, was used by Friedman and Perrimon (2006) to develop a high-throughput assay for defining signaling pathway regulation. A more recent report from this lab expands the effort to identify all components of the ERK pathway by conducting genome-wide RNAi screens and protein-protein interaction (PPI) mapping in parallel (Friedman et al., 2011). Their “combined systematic approach using complementary functional genomic and interactome technologies,” which they anticipated would “uncover direct regulators and more completely describe the landscape of a signaling pathway,” did identify new ERK regulators.
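
One simple way to combine the two data types is to ask which RNAi screen hits directly contact known pathway components in the PPI map; such hits are better candidates for direct regulators than hits with no physical link to the pathway. The sketch below illustrates that general strategy with hypothetical gene names and the networkx library; it is not the published Friedman et al. analysis.

```python
# Illustrative integration of RNAi screen hits with a protein-protein
# interaction (PPI) map; gene names are hypothetical, not the published data.
import networkx as nx

ppi = nx.Graph()
ppi.add_edges_from([
    ("RAS", "RAF"), ("RAF", "MEK"), ("MEK", "ERK"),  # core ERK cassette
    ("SCAF1", "RAF"), ("SCAF1", "MEK"),              # hypothetical scaffold protein
    ("GENE_X", "ERK"), ("GENE_Y", "ACTIN"),
])

core_pathway = {"RAS", "RAF", "MEK", "ERK"}
rnai_hits = {"SCAF1", "GENE_X", "GENE_Y"}  # knockdowns that altered ERK signaling

# Keep screen hits that directly contact a core component in the PPI map:
# physical proximity makes them stronger candidates for direct regulators.
direct = {hit for hit in rnai_hits
          if hit in ppi and core_pathway & set(ppi.neighbors(hit))}
print("candidate direct regulators:", sorted(direct))
```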

The utility of integrative approaches to molecular screening for pathway analysis has been described in many other studies, including Major et al., 2008; Berndt et al., 2009; Falschlehner et al., 2010; Seyhan & Ryan, 2010; Damotte et al., 2014; and Cinghu et al., 2014.

This page covers only a few of the many research and regulatory science initiatives in the field of molecular and computational biology that are emerging as useful approaches for application in hazard testing and/or risk assessment. If you have a specific topic or program you would like to see covered, please send it to info@alttox.org.

New Perspectives

For some new perspectives on -omics, bioinformatics, and computational biology, read the following invited commentaries:

Author(s)/Contributor(s):
Sherry L. Ward, PhD, MBA
AltTox Contributing Editor

AltTox Editorial Board reviewer(s):
George Daston, PhD
Procter & Gamble