Genotoxicity and Carcinogenicity: Which Alternative Strategy(ies)?
Dr. Gladys D. Ouedraogo
L’Oréal Advanced Research
1 Avenue Eugène Schueller
BP 22 – 93601 Aulnay-sous-Bois Cedex, France
Why is it so hard to phase out in vivo genotoxicity testing?
Genotoxicity is a crucial endpoint in safety assessment. In current guidelines (International Conference on Harmonisation [ICH], International Working Group on Genotoxicity Testing Procedures [IWGT], Scientific Committee on Consumer Products [SCCP]), positive in vitro results are further investigated using in vivo assays. Although a number of in vitro assays have been around for decades, in vivo genotoxicity testing has yet to be phased out. The main reason is the low specificity of the in vitro genotoxicity assays (many in vitro positives are irrelevant to humans). In the last couple of years, this issue has been discussed in several organizations (European Centre for the Validation of Alternative Methods [ECVAM], European Centre for Ecotoxicology and Toxicology of Chemicals [ECETOC], European Cosmetic, Toiletry and Perfumery Association [COLIPA], International Life Sciences Institute’s Health and Environmental Sciences Institute [ILSI/HESI], IWGT, ICH), and different projects aimed at bringing solutions are ongoing.
Combining in silico and in vitro tools: a step toward predictive toxicology
Another way to tackle this issue would be to combine in silico and in vitro tools. It is worth pointing out that while the performance of in vitro assays is routinely checked, that of in vivo assays is not readily available. Like in vitro assays, in vivo bioassays provide information on hazard, not risk. So, why aren’t in vitro results used for risk assessment? Exposure data, toxicokinetics, and proper metabolic capability are lacking in simple cell systems. These are some of the types of information needed, in addition to in vitro genotoxicity assays, in order to perform risk assessment.
Most of the models (including the in vivo ones) do not provide information about the sequence of events leading to the observed endpoints: they are descriptive. What has been identified as key to using alternative methods is getting insight into mechanisms (see the report of the National Research Council: “Toxicity Testing in the 21st Century: A Vision and a Strategy,” National Academies Press, 2007). According to this report, risk assessment can become time- and cost-effective if we move from descriptive to predictive (mechanistically-based) toxicology.
Huge amounts of data have been generated by several companies and agencies worldwide. Information systems have evolved and are now able to handle and process billions of data points at once. The next move (which has already started) is to share data in order to make the best use of what has been done in the past and build the toxicology of the future. Data can be shared within consortia under a well-defined legal framework. Building (quantitative) structure–activity relationship [(Q)SAR] models is a way to secure the knowledge generated over decades with different in vivo and in vitro assays. These models are useful for prioritizing chemicals for further testing. They can also guide the choice of the follow-up assay: a chemical triggering an alert for gene mutation but not for chromosomal damage will be tested, for example, in the Ames test rather than in both the Ames and the micronucleus assays.
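The alert-driven triage just described amounts to a simple decision rule. The sketch below illustrates the logic only; the alert names are hypothetical placeholders, not a validated (Q)SAR model, and a real system would derive alerts from curated structure–activity knowledge bases.

```python
# Toy illustration of alert-driven assay triage.
# The structural alert names below are hypothetical placeholders,
# not outputs of any validated (Q)SAR model.

# Hypothetical alerts mapped to the genotoxicity endpoint they flag.
MUTAGENICITY_ALERTS = {"aromatic_nitro", "epoxide"}   # gene mutation
CLASTOGENICITY_ALERTS = {"michael_acceptor"}          # chromosomal damage

def recommend_assays(alerts):
    """Choose follow-up in vitro assays based on which in silico alerts fired."""
    assays = []
    if alerts & MUTAGENICITY_ALERTS:
        assays.append("Ames test")           # bacterial gene-mutation assay
    if alerts & CLASTOGENICITY_ALERTS:
        assays.append("micronucleus assay")  # chromosomal-damage assay
    return assays or ["no priority testing"]

# A chemical triggering only a mutagenicity alert is routed to the Ames test alone.
print(recommend_assays({"aromatic_nitro"}))    # ['Ames test']
print(recommend_assays({"michael_acceptor"}))  # ['micronucleus assay']
```

The point of the sketch is that in silico alerts need not replace testing to be useful: even a coarse mapping from alert class to assay avoids running the full battery on every chemical.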
The Organisation for Economic Co-operation and Development (OECD) has provided guidelines for developing transparent and mechanistically-based models. So, there is a consensus among all stakeholders (modelers, toxicologists, regulators, and scientists in different fields) on the properties of such models. In addition, the OECD is sponsoring a (Q)SAR toolbox for the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) regulation. This toolbox will host not only (Q)SAR models but also databases for read-across. This illustrates how important data sharing is in the field of alternative methods.
Currently, in silico data cannot be used on their own; they are complemented by in vitro and/or in vivo data. An example of the so-called “integrated testing strategy” combining in silico and in vitro approaches is illustrated by the work of Gubbels-van Hal et al.
For in vitro testing, emerging technologies such as toxicogenomics, high content screening (HCS), and high throughput screening (HTS) can help investigate the molecular mechanisms leading to specific endpoints. Several studies aiming to use toxicogenomics to predict genotoxicity and carcinogenicity can be found in the literature (Ellinger-Ziegelbauer et al., Sarrif et al., Thybaud et al.). HCS-based in vitro micronucleus assays are available on automated image analysis platforms (Diaz et al., Kalusche et al.). Combining different in vitro approaches such as toxicogenomics, HCS, HTS, and nuclear receptor binding holds great promise in terms of unraveling molecular events. This is the goal of the US EPA ToxCast program (Dix et al.). Analyzing the large amount of data that will be generated in this type of project is a real challenge. Here again, a collaborative effort is needed in order to support such an initiative.
Conclusions and perspectives
Combining in silico with in vitro data may be a way to move towards animal-free genotoxicity and carcinogenicity testing. We are not there yet, but work is ongoing in the in silico as well as the in vitro areas. Initiatives such as REACH and the 7th Amendment to the Cosmetics Directive in the European Union (EU), the High Production Volume (HPV) program in the US, and the Domestic Substances List (DSL) in Canada are incentives for seeking alternatives to animal testing.
Rather than having a defined framework, a flexible approach with an insight into the mechanistic pathways will be needed in order to generate relevant data.
©2007 Gladys Ouédraogo
Dix, D.J., Houck, K.A., Martin, M.T., Richard, A.M., Setzer, R.W. & Kavlock R.J. (2007). The ToxCast program for prioritizing toxicity testing of environmental chemicals. Toxicol Sci. 95(1), 5-12.
Ellinger-Ziegelbauer, H., Gmuender, H., Bandenburg, A. & Ahr, H.J. (2007). Prediction of a carcinogenic potential of rat hepatocarcinogens using toxicogenomics analysis of short-term in vivo studies. Mutat. Res. Electronic publication ahead of print, July 5, 2007.
Gubbels-van Hal, W.M., Blaauboer, B.J., Barentsen, H.M., Hoitink, M.A., Meerts, I.A. & van der Hoeven, J.C. (2005). An alternative approach for the safety evaluation of new and existing chemicals, an exercise in integrated testing. Regul. Toxicol. Pharmacol. 42, 284–295.
Kalusche, G., Smith, L. & Thomas, N. (2005). Ultra high speed automated in vitro micronucleus analysis. Poster P05007, ADME/Tox Session, SBS Congress 2005. GE Healthcare, UK.
Sarrif, A., van Delft, J.H., van Schooten, F.J., Gant, T.W., Elliott, B.M., van Ravenzwaay, B., van Steeg, H. & Vrijhof, H. (2005). Toxicogenomics in genetic toxicology and hazard determination: Introduction and overview. Mutat. Res. 575(1-2), 1-3.
Sarrif, A., van Delft, J.H., Gant, T.W., Kleinjans, J.C. & van Vliet, E. (2005). Toxicogenomics in genetic toxicology and hazard determination – concluding remarks. Mutat. Res. 575(1-2), 116-117.
Thybaud, V., Le Fevre, A.C. & Boitier, E. (2007). Application of toxicogenomics to genetic toxicology risk assessment. Environ. Mol. Mutagen. 48, 369-379.