OpenTox USA 2013: Conference Summary

Marilyn Matevia, Humane Society of the United States

Published: January 11, 2014

An increasingly effective approach to reducing the use of animals in toxicity testing, while improving the data foundations for risk assessment and prediction, is to expand and facilitate access to existing stores of publicly available chemical, bioinformatics, and animal test data.  The data can be used to prioritize screening and testing, reduce duplication of testing, and enrich predictive computational models.  One project working to advance these capabilities is OpenTox, originally funded in 2008 as an FP7 EU research project.  Though the original funding period has ended, OpenTox continues to develop and refine a universal framework for data-sharing and for building web-based predictive toxicology applications.

In late October 2013, OpenTox project coordinators – in collaboration with ToxBank (a web-based toxicity data warehouse) – held a 2-day conference, OpenTox USA, at the North Carolina Biotechnology Center in Research Triangle Park.  Conference organizers aimed to build on the exchange of information and ideas shared a month earlier at OpenTox Europe (Mainz, Germany, September 2013).  At the end of each day’s extensive slate of presentations, participants were encouraged to discuss the talks and brainstorm about specific challenges in data sharing, modeling, extrapolation, and risk assessment during informal 90-minute “knowledge cafes.”

In the meeting’s opening session, “The Data Foundation for Predictive Toxicology,” presentations offered introductions and overviews of several publicly available databases, and discussed progress and challenges toward unifying the terms, concepts, and standards that will enable more effective data and model-sharing.  Barry Hardy (OpenTox; Douglas Connect) described the philosophy and design principles shaping the OpenTox Framework and gave an overview of the applications and services it currently supports.  Scott Auerbach (US National Institute of Environmental Health Sciences [NIEHS]) provided a detailed orientation to the DrugMatrix Database, a comprehensive and freely available database of thousands of toxicology studies on rats or rat hepatocytes.  Matt Martin (US Environmental Protection Agency [EPA]) explained how the National Center for Computational Toxicology (NCCT) applies the BioAssay Ontology to annotate and classify ToxCast assays and endpoints, and runs the output through an 8-level Data Analysis Pipeline.  Both of these components of the workflow aim to make the use and interpretation of ToxCast data more consistent and transparent.  Asif Rashid (NIEHS) showed participants how to view, filter, and visually mine studies in the relational Chemical Effects in Biological Systems (CEBS) database.  Kristina Thayer (US National Toxicology Program [NTP]) outlined the process by which the NTP Office of Health Assessment and Translation (OHAT) conducts systematic literature reviews to evaluate evidence and determine whether or not additional research is needed to reach a conclusion about hazard identification.  Finally, Dimitar Hristozov (Molecular Networks GmbH) presented an overview of the COSMOS database, including a case study that demonstrated how the database can help to reveal mechanistic pathways to adverse outcomes.

The second session of the day focused on the more “holistic” perspectives of building and using predictive models.  Andre Kleensang (Center for Alternatives to Animal Testing [CAAT]) led off this session with the case for Evidence-Based Toxicology (EBT) – with its insistence on transparency, consistency, and objectivity – as a quality-assurance mechanism for data and models in predictive toxicology.  Glenn Myatt (Leadscope) described how ToxBank is using the OpenTox framework to construct a data warehouse supporting SEURAT-1 projects, and demonstrated the tools under development to facilitate data analysis.  Pau Sancho-Bru (Institut d’Investigacions Biomèdiques August Pi i Sunyer) presented preliminary findings and challenges from the HeMiBio project, which is working toward a microfluidic bioreactor capable of simulating human liver functions.  Stefan Kramer (Johannes Gutenberg University Mainz) presented the results of an analysis of ToxCast data that evaluated several approaches to predictive model-building.  Richard Beger (US Food and Drug Administration [FDA]) then described how his group has enhanced the predictive strength of 3D-SDAR (three-dimensional spectral data-activity relationship) models.

Wednesday’s sessions focused on the use of predictive toxicology models in extrapolation and risk assessment.  Harvey Clewell (The Hamner Institutes for Health Sciences) described the challenges of in vitro-to-in vivo extrapolation – especially the processes that determine the concentration of a chemical in vivo (absorption, distribution, metabolism, and excretion) – and showed how physiologically based pharmacokinetic (PBPK) modeling of in vitro results can account for those processes to enable more realistic extrapolation estimates.  John Wambaugh (EPA) presented an overview of ExpoCast, a project that developed a method for estimating the risk of environmental exposure to Tox21 chemicals and uses the information to prioritize additional testing on the basis of probable risk.  Ivan Rusyn (University of North Carolina) introduced HAWC (Health Assessment Workspace Collaborative), a web-based interface that allows users to create detailed and customized reports on the human health effects of chemicals from multiple information sources: literature reviews, data extracts and models, and (eventually) exposure data.  Tom Knudsen (EPA) discussed the development and use of Virtual Tissue Models to predict the impact of chemical perturbations at multiple levels of biological organization (molecular, cellular, multi-cellular, tissue, organ) and stages of development.
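To make the idea behind such in vitro-to-in vivo extrapolation concrete, the sketch below shows the kind of simple reverse-dosimetry calculation often used with high-throughput screening data: an in vitro active concentration (AC50) is converted into the oral dose expected to produce that concentration at steady state, with clearance handled by a well-stirred liver model plus glomerular filtration of the unbound chemical.  This is a minimal illustration only, not a model presented at the meeting; the function names are hypothetical, and the physiological parameters (glomerular filtration rate, liver blood flow, body weight) are rounded textbook values.

```python
# Minimal reverse-dosimetry sketch (illustrative only; not a presenter's model).
# Converts an in vitro AC50 into an approximate oral equivalent dose, assuming
# steady-state exposure and clearance via glomerular filtration of the unbound
# chemical plus hepatic metabolism (well-stirred liver model).

def css_per_unit_dose(f_ub, cl_int_l_per_h, gfr_l_per_h=6.7,
                      q_liver_l_per_h=90.0, body_weight_kg=70.0):
    """Steady-state plasma concentration (mg/L) from a continuous oral dose
    rate of 1 mg/kg/day, given the unbound fraction in plasma (f_ub) and an
    intrinsic hepatic clearance scaled to the whole liver (L/h)."""
    cl_hepatic = (q_liver_l_per_h * f_ub * cl_int_l_per_h) / (
        q_liver_l_per_h + f_ub * cl_int_l_per_h)        # well-stirred liver model
    cl_total = gfr_l_per_h * f_ub + cl_hepatic          # renal + hepatic, L/h
    dose_rate_mg_per_h = 1.0 * body_weight_kg / 24.0    # 1 mg/kg/day
    return dose_rate_mg_per_h / cl_total

def oral_equivalent_dose(ac50_um, mol_weight, f_ub, cl_int_l_per_h):
    """Oral dose (mg/kg/day) expected to yield a steady-state plasma
    concentration equal to the in vitro AC50 (micromolar)."""
    ac50_mg_per_l = ac50_um * mol_weight / 1000.0        # uM -> mg/L
    return ac50_mg_per_l / css_per_unit_dose(f_ub, cl_int_l_per_h)

# Hypothetical chemical: AC50 of 5 uM, MW 250 g/mol, 10% unbound in plasma,
# intrinsic clearance of 20 L/h measured in hepatocytes and scaled up.
if __name__ == "__main__":
    dose = oral_equivalent_dose(ac50_um=5.0, mol_weight=250.0,
                                f_ub=0.1, cl_int_l_per_h=20.0)
    print(f"Oral equivalent dose: {dose:.2f} mg/kg/day")
```

Because the steady-state concentration scales linearly with the dose rate, the same ratio applies to any in vitro potency value, which is why calculations of this style are attractive for quickly ranking large chemical inventories before more detailed PBPK modeling.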

In the final session of the conference, Katya Tsaioun (Safer Medicines Trust) reviewed the poor track record of animal toxicity studies in predicting human safety, and described her organization’s proposal for a “comparative validation” study to assess the predictive value of human cell-based in vitro studies compared to traditional animal-based toxicity tests.  Luis G. Valerio, Jr. (FDA), presenting remotely, described how the FDA’s Center for Drug Evaluation and Research is building and using in silico (quantitative) structure-activity relationship ([Q]SAR) models to screen new drugs for their potential to lengthen the heart’s QT interval, a phenomenon that is associated with ventricular arrhythmia and is an indicator of cardiotoxicity.  Pertti Hakkinen (US National Library of Medicine [NLM]) gave an orientation to NLM’s online resources with a special focus on TOXNET, which offers extensive bibliographic information, “canned” (prepared) bibliography searches, and access to numerous toxicology databases.  Barry Hardy and Asish Mohapatra (Health Canada) described the Remediation Technology Exposure Check List Tool devised by Health Canada’s Contaminated Sites group to evaluate and manage human exposure risk at sites where chemical contamination has taken place.  The tool’s flowcharts and decision matrix guide users through a set of predictive toxicology tools (such as SmartCyp, Bioclipse, and ToxPredict) for characterizing the potential risks and determining the most effective remediation strategies.  Finally, Tim Pastoor (Syngenta) presented the Health and Environmental Sciences Institute’s (HESI) RISK21 Roadmap, a tiered approach to risk assessment in which exposure estimates are factored in at the beginning of the process, so that the need for additional information is driven by both the likelihood of exposure and an estimate of the potential hazard.  He presented data on two case studies, one “rich” (with pre-existing data on a new pesticide) and one “poor” (a group of chemicals in drinking water), and showed that the exposure-driven roadmap approach was equally effective in prioritizing the chemicals needing additional work.

The conference ended with additional time devoted to open discussion and collaboration.  In co-chair Scott Auerbach’s view, “The meeting highlighted the need to develop methods for more seamless data exchange and to work on approaches to integrate the different data streams in a way that maximizes the complementarity of the data coming from the different experimental platforms.  Finding solutions to these challenges will help advance the science of toxicity testing and the development of alternative methods for hazard characterization.”  Barry Hardy noted that the organizing committee plans to publish the combined proceedings of OpenTox Europe and OpenTox USA.  Presentation slides will be made available on the OpenTox website.