RISK ASSESSMENT + NEW TECHNOLOGIES: Opportunities to Assure Safety Without Animal Testing and Better Protect Public Health?


By Julia Fentem & Carl Westmoreland, Unilever
Published: December 6, 2007
About the Author(s)
Dr. Fentem is Head of Unilever’s Safety & Environmental Assurance Centre (SEAC), which is responsible for assuring consumer, occupational and environmental safety of Unilever products and processes globally. She received her PhD in biochemical toxicology from the University of Nottingham Medical School in the UK in 1991. Julia has a first-class degree in biochemistry from Leeds University and an MSc in toxicology from Birmingham University. Prior to joining Unilever in 1998, Julia worked for the European Commission (ECVAM) in Italy and, before that, as a toxicologist with FRAME in the UK.

Dr. Fentem has published extensively on topics such as metabolism-mediated toxicity, validation of new testing approaches, and chemicals risk assessment. She is a Board member of the UK NC3Rs, and a member of the British Toxicology Society Executive Committee. Julia has served on various OECD, European Commission and US government expert groups (e.g. on skin penetration guidelines, peer review of the rodent uterotrophic assay, validation of several in vitro tests).

Julia Fentem
Unilever
Safety & Environmental Assurance Centre (SEAC)
Colworth Science Park
Sharnbrook
Bedfordshire MK44 1LQ
UK
Email: julia.fentem@unilever.com

Dr. Westmoreland is the Director of Science and Technology within Unilever’s Safety & Environmental Assurance Centre (SEAC), which is responsible for assuring consumer, occupational and environmental safety of Unilever products and processes globally. Carl has a degree in biology from the University of York in the UK and received his PhD in in vitro toxicology from the University of Surrey in the UK in 1997. Prior to joining Unilever in 2003, Carl worked for GlaxoSmithKline as a toxicologist, both in in vitro toxicology and in the project management of preclinical safety studies, including regulatory packages for phase I and II clinical trials.

Dr. Westmoreland has published on topics such as genetic toxicology, in vitro toxicology and alternative approaches to assuring safety without animal testing. He represents Unilever in scientific committees of the European Cosmetic, Toiletry and Perfumery Association (Colipa), the British Toxicology Society and the European In Vitro Testing Industrial Platform. He also represents Colipa on the ECVAM scientific advisory committee and is a member of one of the working groups within the European Partnership on Alternative Approaches to Animal Testing (EPAA).

Carl Westmoreland
Unilever
Safety & Environmental Assurance Centre (SEAC)
Colworth Science Park
Sharnbrook
Bedfordshire MK44 1LQ
UK
Email: carl.westmoreland@unilever.com

Assuring safety without animal testing is a formidable challenge. In 2004, we published “a suggested future direction” (Fentem et al., 2004). A recent report from the US National Research Council, “Toxicity Testing in the Twenty-first Century: A Vision and a Strategy” (NRC, 2007), commissioned by the US Environmental Protection Agency (EPA), describes in detail a similar approach. The scientific evidence generated in recent years indicates that our best hope for future progress and success in assuring safety without animal testing is to make significant investment in applying the new tools and technologies now available to us (e.g. omics and informatics), in parallel with further defining and evolving the risk-based approaches we use to protect public health. To follow up on the vision and strategy articulated in the NRC report, the scientific thinking and tools now appear to be available to warrant establishing an international collaborative research initiative, to transform our traditional animal-based approaches to toxicity testing into ones which protect human health and the environment without requiring animal testing.

Background

During the last 10 years, we have been successful in achieving the regulatory adoption of several Three Rs methods. In vitro replacement alternatives for skin corrosion, phototoxicity and dermal absorption are now included in the Organisation for Economic Cooperation and Development (OECD) testing guidelines, as are refinement and reduction alternative tests and testing strategies for skin sensitization and acute toxicity, both for local (eye and skin corrosion and irritation) and systemic effects (Fentem, 2006). In April 2007, ECVAM’s Scientific Advisory Committee (ESAC) endorsed the scientific validity of an in vitro replacement test for acute skin irritation, and this will now be proposed for regulatory adoption and implementation (ECVAM, 2007).

There are some examples of the use of in vitro alternatives in the risk assessment process; for example, information derived from in vitro skin penetration tests can provide valuable input to risk assessments. However, many of the current replacement methods were designed and validated to produce data for hazard identification and the classification and labelling of chemicals, rather than to generate the information needed for risk assessment.

There has been a shift in thinking and approach to the development and application of alternative methods during the past 5 years, with more emphasis on test batteries and testing strategies as a way to combine data from different in vitro and in silico tests to make decisions about chemical safety based on risk assessments (Combes et al., 2003; EU, 2005). As we start to consider more complex issues, such as how we might assess the multitude of potential adverse effects following repeated systemic exposure to chemicals without animal testing, we are realising that the focus needs to change – from the direct replacement of a specific animal test to: (1) defining the actual information required to make a safety decision, and then (2) determining how this could be generated without animal testing (Fentem et al., 2004).

Can Data Generated by Applying New Technologies be Used for Risk Assessment?

Decisions about the consumer safety of products are typically made on the basis of a risk assessment, in which data on the potential hazards of the ingredients are interpreted in the context of the likely exposure to the product, i.e. the concentration of the ingredients in the product and how the product is used by consumers. Traditionally, much of the hazard data on chemicals have been generated by applying technologies developed for histological and clinical chemical analyses to animal models. Today, improved non-animal in vitro and in silico models are becoming increasingly available, and new technologies such as proteomics and bioinformatics approaches enable us to generate and interpret new types of non-animal data. However, developing ways to weight and integrate these data to enable risk-based safety decisions to be made represents a major challenge.
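One common way of formalizing this hazard-in-the-context-of-exposure logic for a dermally applied ingredient is a margin-of-safety calculation, comparing a no-observed-adverse-effect level (NOAEL) with an estimated systemic exposure dose. A minimal sketch in Python, with all numerical values invented for illustration:

```python
# Illustrative margin-of-safety (MoS) calculation for a dermally applied
# ingredient. All numbers are hypothetical; real assessments use
# substance-specific hazard and exposure data.

def systemic_exposure_dose(applied_mg_per_day: float,
                           body_weight_kg: float,
                           dermal_absorption_fraction: float) -> float:
    """Systemic exposure dose (SED) in mg/kg bw/day from dermal application."""
    return applied_mg_per_day * dermal_absorption_fraction / body_weight_kg

def margin_of_safety(noael_mg_per_kg: float, sed_mg_per_kg: float) -> float:
    """MoS = NOAEL / SED."""
    return noael_mg_per_kg / sed_mg_per_kg

sed = systemic_exposure_dose(applied_mg_per_day=15.0,
                             body_weight_kg=60.0,
                             dermal_absorption_fraction=0.05)
mos = margin_of_safety(noael_mg_per_kg=50.0, sed_mg_per_kg=sed)
print(f"SED = {sed:.4f} mg/kg bw/day, MoS = {mos:.0f}")
```

A margin of 100 or more is conventionally regarded as adequate, reflecting default factors of 10 each for inter-species and inter-individual variability.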

The new technologies that are being developed appear to offer significant opportunities for a step-change in the approaches used in future to assess consumer safety (Fentem et al., 2004). These new technologies include: “omics” technologies, informatics, advanced analytical methods, and bioengineering. Application of these tools and technologies for risk assessment should enhance the scientific basis of public health protection as well as enabling us to move away from animal testing (Fentem, 2006). This vision, and a strategy to deliver it, has been articulated in a recent report from the US National Research Council (NRC), commissioned by the US Environmental Protection Agency. In its Summary, the report states that: “Advances in toxicogenomics, bioinformatics, systems biology, epigenetics, and computational toxicology could transform toxicity testing from a system based on whole-animal testing to one founded primarily on in vitro methods that evaluate changes in biologic processes using cells, cell lines, or cellular components, preferably of human origin” (NRC, 2007).

Moving from Theoretical Concepts to Practical Reality: is a New Approach Feasible?

Much of the research that Unilever scientists have been undertaking during the past 4 years supports the need to embrace, and further invest in, the strategy proposed by the expert committee (the Committee on Toxicity Testing and Assessment of Environmental Agents, CTTAEA) that prepared the US NRC report (NRC, 2007). We have been working to assess the feasibility, in practice, of the theoretical “conceptual approach” we published in 2004 (Fentem et al., 2004). The key “building blocks” of the conceptual approach that enable the safety (risk management) decision are: (a) risk assessment, (b) data interpretation and processing (translation), (c) models, and (d) technologies. This conceptual approach illustrates how we envisage being able to take safety decisions that adequately protect human health without animal testing, by comparing experimental biological data with relevant clinical data, since these technologies and models are also being widely applied in clinical research.

The objectives of our research programme are to:
(a) develop new risk assessment approaches;
(b) develop and apply new models for predicting adverse effects;
(c) evaluate the usefulness, for risk assessment, of data generated by applying new technologies; and
(d) maximise the use of both new and existing data.

The feasibility of the approach has been assessed in collaboration with research partners outside and inside Unilever, and we are working on different projects with about 20 US- and UK-based academic and contract research groups and informatics organisations.

Generating the Scientific Evidence: Case Study – Skin Allergy

To ensure that our products do not induce skin (contact) allergy in consumers, we currently use information on the concentrations of the ingredients in the product and on how the product is used by consumers, together with data generated in the mouse local lymph node assay (LLNA), to assess whether the chemical ingredients have the potential to induce skin sensitization in humans. Our ultimate aim is to develop a scientifically robust new approach that enables us to perform risk assessments without generating new data in animal models.

In the area of skin allergy (sensitization), we are starting to translate parts of the CTTAEA vision (NRC, 2007) into practical application. We have invested in developing new capabilities in, for example: (a) mathematical and computational modelling of biology; (b) tools for mapping and analysing biological networks; (c) omics technologies; (d) biological and chemically-based models; and (e) data integration approaches.

Risk Assessment

A principal objective was to construct and evaluate a new risk assessment framework for skin allergy. The major changes are in the types of biological and chemical non-animal data inputs we are generating, and in the approaches being applied for modelling and integrating the data. We are also seeking to improve the data inputs on consumer exposure.

In determining whether a chemical has the potential to induce sensitization, in vitro data and in silico models are being generated based on our current mechanistic understanding of the key events considered to be important in skin sensitization. The data inputs, which we recognise are not yet a definitive set, include: chemistry parameters, skin bioavailability, keratinocyte responses, protein-chemical reactivity, dendritic cell maturation and T-cell proliferation. Some relatively simple scoring approaches to integrating these data are being evaluated (Jowsey et al., 2006), alongside the application of more complex approaches supported by various mathematical and informatics tools (see below).
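As an illustration of what a relatively simple scoring approach might look like, the toy scheme below combines ordinal scores from the data inputs listed above into a weighted sum; the weights and cut-offs are hypothetical and are not those of Jowsey et al. (2006):

```python
# Hypothetical weight-of-evidence scoring for skin sensitization potential.
# Each non-animal data input contributes an ordinal score (0 = no signal,
# 1 = weak, 2 = moderate, 3 = strong); weights and cut-offs are illustrative.

ASSAY_WEIGHTS = {
    "peptide_reactivity": 2.0,       # protein-chemical reactivity
    "keratinocyte_response": 1.0,
    "dendritic_cell_maturation": 1.5,
    "t_cell_proliferation": 1.5,
    "skin_bioavailability": 1.0,
}

def sensitization_score(assay_scores: dict) -> float:
    """Weighted sum of ordinal assay scores (0-3 each)."""
    return sum(ASSAY_WEIGHTS[a] * s for a, s in assay_scores.items())

def classify(score: float) -> str:
    """Map the combined score to a coarse potency category (invented cut-offs)."""
    if score < 4:
        return "low concern"
    if score < 10:
        return "moderate concern"
    return "high concern"

example = {"peptide_reactivity": 3, "keratinocyte_response": 1,
           "dendritic_cell_maturation": 2, "t_cell_proliferation": 2,
           "skin_bioavailability": 2}
score = sensitization_score(example)
print(score, classify(score))
```

Such a scheme is transparent and easy to apply with sparse data, but it treats the assays as independent and equally reliable; the statistical approaches mentioned below relax those simplifications.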

To improve our current estimates of skin exposure to chemicals, we are applying probabilistic modelling approaches, such as Monte Carlo techniques, to understand the population distribution of exposure values rather than just taking mean estimates. This enables the refinement of our risk assessments, and makes the sources of uncertainty explicit. It is also possible that for risk assessment approaches that do not rely on data generated in animal models, more information on the local concentrations of chemicals and their flux through different skin compartments will be needed, in addition to the information on exposure used currently (i.e. dose applied to the skin surface).
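The Monte Carlo idea can be sketched as follows: invented distributions for the amount of product applied, the ingredient fraction and the exposed skin area are sampled repeatedly, yielding a full distribution of per-event skin exposure from which percentiles, not just a mean, can be read:

```python
# Monte Carlo sketch of a consumer skin-exposure distribution. The parameter
# distributions are invented for illustration; a real assessment would fit
# them to habits-and-practices and product-composition data.
import random
import statistics

random.seed(42)  # reproducible illustration

def simulate_exposure(n: int = 100_000) -> list:
    """Per-event skin exposure (mg/cm^2) = product applied x ingredient
    fraction / exposed area, sampled from hypothetical distributions."""
    samples = []
    for _ in range(n):
        amount_mg = random.lognormvariate(0.0, 0.5) * 1000  # product applied, mg
        fraction = random.triangular(0.001, 0.01, 0.005)    # ingredient fraction
        area_cm2 = random.normalvariate(565.0, 50.0)        # exposed skin, cm^2
        samples.append(amount_mg * fraction / area_cm2)
    return samples

samples = sorted(simulate_exposure())
mean = statistics.fmean(samples)
p95 = samples[int(0.95 * len(samples))]
print(f"mean = {mean:.4f} mg/cm^2, 95th percentile = {p95:.4f} mg/cm^2")
```

Reporting the 95th percentile alongside the mean makes explicit how much more exposed the upper tail of the population is, which is precisely the refinement over single mean estimates described above.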

Risk-based safety decisions also require detailed knowledge of the context and scientific background. For skin allergy, epidemiological and clinical data are valuable in providing this “real-life” context. Better understanding of the prevalence of skin allergy from exposure to specific chemicals, and of the relationships between duration and frequency of exposure and the elicitation of clinical symptoms, will also improve our risk assessments. Similarly, it is important to increase our understanding of how the induction of skin sensitization in the current mouse model relates to the elicitation of allergic responses in sensitized humans.

Models

To understand how systems biology approaches might be applied to support safety risk assessments in the future, we have collaborated with Entelos, Inc. to construct a computer-based mathematical model of the induction of skin sensitization (MacKay et al., 2007). The vision was to build a transparent, robust, mechanistic and quantitative in silico model that captured our current understanding of the biological pathways, processes and mediators involved in skin sensitization in vivo. The model has been constructed such that the biology can be interrogated computationally in an iterative, hypothesis-driven manner. The benefits of the model are considerable, particularly as a way to integrate diverse data quantitatively and to interpret these within the wider biological context. We are now using the model to: (a) continue to generate new biological understanding; (b) guide our experimental research programme and focus our development of new predictive in vitro assays; and (c) inform our risk assessments. Further development of the initial version of the model is in progress, via incorporation of new data being generated in biological and chemistry-based in vitro assays.

Our work on in vitro modelling of the chemistry and biology of skin sensitization complements research being undertaken by others, for example as part of the COLIPA research programme (COLIPA, 2007) and via the EU Framework 6 Programme integrated project, Sens-it-iv (Sens-it-iv, 2007). In particular, we are developing tools to assess binding of chemicals to peptides underpinned by mechanistic classification of sensitizers based on their chemistry. We are trying to build upon the peptide binding assays currently available (Gerberick et al., 2007), to incorporate kinetic, mechanistic and specificity considerations. Our development of new cell-based assays for allergenic potential has focussed on: (a) optimization of dendritic cell-based assays through addition of inflammatory signals and the identification of new biomarkers (e.g. intracellular signalling pathways); and (b) development of a robust in vitro T-cell proliferation assay, based upon a human dendritic cell / T-cell co-culture model.

As the skin may metabolize chemicals, resulting in the formation of more active chemical species within the skin, there is also a need to develop new in vitro models of human skin metabolism. New technologies such as quadrupole time-of-flight mass spectrometry can now be used to identify metabolites in biological systems where previously this was technically very difficult. Data generated by applying these measurement techniques will be important for risk assessments of skin exposure to chemicals.

In the area of dermal kinetics, we are developing methods to study the detailed kinetics of permeation of chemicals through the heterogeneous layers of skin (Pendlington et al., 2007). The standard in vitro skin penetration methodology described in OECD test guideline 428 is being evolved to provide information on local concentrations of chemicals and their flux through different skin compartments. Time-course data on the epidermal / dermal disposition of chemicals (e.g. 14C-cinnamic aldehyde) in different vehicles in human split-thickness skin have been generated. These data enable the exposure information used in risk assessments for chemicals applied dermally to be refined, and provide valuable understanding of the effects of different vehicles and formulations on the dermal kinetics of chemicals.
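The flux information referred to here is, at steady state, conventionally described by Fick's first law, J = Kp × C, where the permeability coefficient Kp depends on the membrane/vehicle partition coefficient, the diffusion coefficient and the diffusion path length. A minimal sketch with hypothetical parameter values:

```python
# Steady-state dermal flux via Fick's first law, the standard description
# underlying in vitro skin penetration experiments of the OECD TG 428 type.
# All parameter values below are hypothetical.

def permeability_coefficient(km: float, d_cm2_per_h: float, h_cm: float) -> float:
    """Kp (cm/h) = partition coefficient x diffusion coefficient / path length."""
    return km * d_cm2_per_h / h_cm

def steady_state_flux(kp_cm_per_h: float, conc_mg_per_cm3: float) -> float:
    """J (mg/cm^2/h) = Kp x donor concentration (sink conditions assumed)."""
    return kp_cm_per_h * conc_mg_per_cm3

kp = permeability_coefficient(km=1.2, d_cm2_per_h=2.5e-5, h_cm=0.002)
flux = steady_state_flux(kp, conc_mg_per_cm3=10.0)
print(f"Kp = {kp:.3e} cm/h, J = {flux:.3e} mg/cm^2/h")
```

The time-course measurements described above go beyond this steady-state picture, resolving where in the epidermal / dermal compartments the chemical actually resides over time.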

Technologies

Our major research into the applicability of new technologies relates to gaining experience in generating, integrating and interpreting ‘omics’ data from various techniques, including transcriptomics, proteomics and metabolomics. The overall objective is to evaluate the applicability of ‘omics’ technologies, in combination, for generating data useful for consumer safety risk assessments. We selected skin inflammation as the clinical adverse response to investigate. Human volunteers are treated with sodium dodecyl sulphate (SDS) to induce a low-grade skin irritant response. Skin biopsies, interstitial fluid, blood and urine samples are then collected and analysed by using various analytical platforms: DNA microarrays (transcriptomics); liquid chromatography / tandem mass spectrometry (proteomics); and gas chromatography / mass spectrometry (metabolomics). Relevant functional data from cytokine and immuno-histochemical analyses are also being generated. The intention is that the output of this work will provide a better understanding of the molecular mechanisms of chemically-induced skin inflammation, as well as practical experience in generating, integrating and interpreting very large and diverse data-sets of ‘omics’ and other in vitro data.

The management and analysis of the vast amounts of data generated in these experiments represents a major challenge and can be extremely time-consuming. We have developed an informatics platform to support the analysis and interpretation of these experimental data in an integrated manner. Working with the European Bioinformatics Institute, in-house databases have been built and federated to Web-based databases for adding further information about the biomolecules identified in our experiments. Working with the University of California San Diego, the open-source software Cytoscape (Shannon et al., 2003) is being applied to integrate the data we are generating with biological network and pathway data.
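The style of network overlay analysis that tools such as Cytoscape support can be sketched in plain Python: map expression changes onto an interaction network and pull out the “active” subnetwork. The gene names, edges and fold-changes below are invented for illustration:

```python
# Minimal sketch of overlaying gene-expression changes on an interaction
# network, in the spirit of Cytoscape-style analysis. The genes, edges and
# fold-changes are invented for illustration only.
from collections import defaultdict

# Hypothetical signalling / interaction edges
edges = [("IL1B", "NFKB1"), ("TNF", "NFKB1"), ("NFKB1", "IL8"),
         ("NFKB1", "IL6"), ("EGFR", "MAPK1")]

# Hypothetical log2 fold-changes from a microarray experiment
fold_change = {"IL1B": 2.1, "TNF": 1.8, "NFKB1": 0.4, "IL8": 2.6,
               "IL6": 1.9, "EGFR": -0.2, "MAPK1": 0.1}

# Build an undirected adjacency map
network = defaultdict(set)
for a, b in edges:
    network[a].add(b)
    network[b].add(a)

def active_subnetwork(threshold: float = 1.0) -> set:
    """Differentially expressed nodes, plus nodes linking two or more of them."""
    hot = {g for g, fc in fold_change.items() if abs(fc) >= threshold}
    linkers = {g for g in network
               if g not in hot and len(network[g] & hot) >= 2}
    return hot | linkers

print(sorted(active_subnetwork()))
```

Note how a hub with only a modest expression change can still be recovered because it connects several strongly responding genes; identifying such nodes in a broader biological context is exactly what motivates the network-based interpretation of ‘omics’ data.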

Preliminary results show that differences in erythema responses observed visually after skin patch testing with SDS correlate with, for example, the microarray gene expression profiles. In addition, different patterns of gene expression changes are also observed in individuals who do not respond with a visible irritant response to SDS in the skin patch tests. It is hoped that such information will ultimately be useful when considering whether small changes seen in gene expression analyses represent ‘adaptive’ or ‘adverse’ responses. It is anticipated that as more data are generated we will gain further insights into the molecular mechanisms of skin inflammation and be better able to understand how to use these new technologies in the future.

Data Integration

A key requirement in the development of new approaches to risk-based safety assessment is to develop tools and approaches to integrate data of diverse types. We are working to develop and evaluate three different kinds of approaches: (a) “weight of evidence” methods, including simple scoring and more complex statistical (e.g. Bayesian) approaches; (b) in silico mechanistic modelling; and (c) biological network mapping and analysis of chemical-induced modulation of interactions between molecules in human systems.

Jowsey et al. (2006) have published an initial attempt to define a statistically based weight-of-evidence approach to integrating non-animal data as surrogates for several key processes known to be important mechanistically in the induction of skin sensitization. Whilst the list of assays or data requirements is not definitive, the paper outlines a conceptual approach for integrating data via a simple scoring system, which is now being evaluated. This is a pragmatic starting point given the limited availability of new types of data. As more data are generated, statistical (e.g. Bayesian) approaches will be used to model and interpret the data in a more mathematically robust way.
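A sketch of how a Bayesian weight-of-evidence calculation might work is shown below: each assay contributes a likelihood ratio that updates the odds that a chemical is a sensitizer. The prior, the likelihood ratios and the assumption of conditionally independent assays are all illustrative simplifications:

```python
# Bayesian combination of independent assay results: a sketch of the kind of
# statistically based weight-of-evidence approach described above. The prior
# probability and the likelihood ratios are invented for illustration.

def update_odds(prior_prob: float, likelihood_ratios) -> float:
    """Posterior probability of 'sensitizer' after applying each assay's
    likelihood ratio, LR = P(result | sensitizer) / P(result | non-sensitizer),
    assuming the assays are conditionally independent."""
    odds = prior_prob / (1.0 - prior_prob)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Positive peptide reactivity (LR = 6), positive dendritic-cell assay (LR = 4),
# negative keratinocyte assay (LR = 0.5) -- all invented numbers.
posterior = update_odds(prior_prob=0.2, likelihood_ratios=[6.0, 4.0, 0.5])
print(f"P(sensitizer | evidence) = {posterior:.2f}")
```

Unlike a fixed scoring scheme, this formulation makes the contribution of each assay, and the residual uncertainty, explicit; in practice the likelihood ratios would themselves be estimated from benchmark data.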

Systems biology approaches provide significant new opportunities for integrating and interpreting non-animal data in an overall biological context. The in silico model developed with Entelos gives us a biological framework and tools to integrate many different types of data (MacKay et al., 2007). It is the intention that our future research data, and those published in the scientific literature, will be incorporated into the model as we continue to develop it as an integral part of our new non-animal approach to skin allergy risk assessments.

To make sense of much of the ‘omics’ data we are generating, and to be able to interpret them in a broader biological context, we have collaborated with scientists at the University of California San Diego in developing and applying the Cytoscape software. As a result, gene networks and human “interactomes” can now be constructed, visualised and analysed by applying a range of bioinformatics tools and systems biology approaches (Warner et al., 2007).

A Proposed Way Forward: Can We Transform Current Safety Assessment Practices by Applying New Technologies and Scientific Insights?

Assuring consumer safety without animal testing is a formidable challenge; however, through the application of new technologies and the further development of risk-based approaches for safety assessment we remain confident that it is ultimately achievable. The scientific evidence generated in recent years indicates that we should be able to incorporate data from the new and emerging technologies in biology and medicine in our safety risk assessments.

Whilst the new approaches appear to offer much promise, there are some key questions to address as research progresses, such as: (a) which measurements (endpoints, biomarkers) are most relevant and reproducible?; (b) which non-animal models are the most suitable and representative of the in vivo situation?; and (c) how do we best visualize, analyze and interpret the huge amounts of different types of data generated with these new approaches? In the context of assuring safety without animal testing whilst continuing to adequately protect human health, it is important that the answers to these questions are based on understanding the safety decision required and are driven by the needs of the risk assessment. It is possible that the precise combination of non-animal models and technologies used to address risk assessments for skin allergy in the future will differ, depending on the exact nature of the decision required.

We have gained considerable experience in developing and applying non-animal methods during the past 20 years, and have come to understand the importance of establishing the scientific relevance and reliability of the models and data generated, and how (and how not) to validate alternative tests as a prerequisite for regulatory adoption. It is crucial that we learn from these past successes and failures, and evolve the procedures to meet the requirements of applying new science and technology and risk-based approaches to protecting public health without animal testing. For example, what may have been an appropriate means of independently assessing the scientific validity of an in vitro test to identify skin corrosives will not be fit for purpose when it comes to assessing new computational biology approaches. Likewise, how can the validity of a single in vitro assay be assured when the actual ‘value’ of the data from that assay in the risk assessment process will only be apparent once they are integrated with data from other assays and with exposure information? The core principles of scientific quality, robustness and transparency should be built upon in defining ways to assess the applicability of the new tools and approaches for safety decision-making in a regulatory context. The “evidence-based toxicology” initiative from ECVAM and CAAT (EBTOX.org, 2007) appears to be a step in the right direction towards facilitating this.

We suggest that the best hope for future progress and success in assuring safety without animal testing is to make significant investment in applying the new tools and technologies now available to us, in parallel with further defining and evolving the risk-based approaches we use to protect public health. The vision and strategy articulated in the NRC report provide a framework for establishing an international collaborative research initiative, enabling all stakeholders to buy-in to a shared goal and agenda to transform our traditional animal-based approaches to toxicity testing into ones which protect human health and the environment without requiring animal testing.

Acknowledgements

The hard work, ideas and enthusiasm of the numerous scientists in Unilever’s Safety & Environmental Assurance Centre (SEAC) working on this research are acknowledged. The research is part of Unilever’s ongoing effort to develop novel ways of delivering consumer safety.
©2007 Julia Fentem & Carl Westmoreland

References
COLIPA. (2007). European Cosmetic Toiletry and Perfumery Association. Website: (accessed September 10, 2007).

Combes, R., Barratt, M. & Balls, M. (2003). An overall strategy for the testing of chemicals for human hazard and risk assessment under the EU REACH system. Altern. Lab. Anim. 31, 7-19.

EBTOX.org. (2007). Evidence-Based Toxicology. Website: (accessed September 5, 2007).

ECVAM. (2007). News, Events and Meetings. ESAC has endorsed 5 alternative testing methods: skin irritation. Website: (accessed September 5, 2007).

EU. (2005). REACH and the Need for Intelligent Testing Strategies, 33pp., EUR 21544 EN, European Commission Joint Research Centre, Ispra.

Fentem, J.H. (2006). Working together to respond to the challenges of EU policy to replace animal testing. Altern. Lab. Anim. 34, 11-18.

Fentem, J., Chamberlain, M. & Sangster, B. (2004). The feasibility of replacing animal testing for assessing consumer safety: a suggested future direction. Altern. Lab. Anim. 32, 617-623.

Gerberick, G.F., Vassallo, J.D., Foertsch, L.M., Price, B.B., Chaney, J.G. & Lepoittevin, J-P. (2007). Quantification of chemical peptide reactivity for screening contact allergens: a classification tree model approach. Toxicol. Sci. 97, 417-427.

Jowsey, I.R., Basketter, D.A., Westmoreland, C. & Kimber, I. (2006). A future approach to measuring relative skin sensitising potency: a proposal. J. Appl. Toxicol. 26, 341-350.

MacKay, C., Bajaria, S., Shaver, G., Kudrycki, K., Ramanujan, S., Paterson, T., Friedrich, C., Maxwell, G., Jowsey, I., Lockley, D., Reynolds, F. & Fentem, J. (2007). In silico modelling of skin sensitisation. Toxicology 231, 103.

NRC. (2007). Toxicity Testing in the Twenty-first Century: A Vision and a Strategy, 146pp., Committee on Toxicity Testing and Assessment of Environmental Agents, National Research Council, Washington, DC.

Pendlington, R.U., Minter, H.J., Stupart, L., MacKay, C., Roper, C.S. & Sanders, D.J. (2007). Development of a modified in vitro skin absorption method to study the epidermal / dermal disposition of a contact allergen in human skin, presented at the Third International Occupational & Environmental Exposures of Skin to Chemicals (OEESC) Conference, 17-20 June 2007, Colorado, USA. Website: (accessed September 5, 2007).

Sens-it-iv. (2007). Novel Testing Strategies for In Vitro Assessment of Allergens, project summary. Website: (accessed September 10, 2007).

Shannon, P., Markiel, A., Ozier, O., Baliga, N.S., Wang, J.T., Ramage, D., Amin, N., Schwikowski, B. & Ideker, T. (2003). Cytoscape: a software environment for integrated models of biomolecular interaction networks. Genome Res. 13, 2498-2504.

Warner, G.J., Adeleye, Y.A., Ideker, T., Workman, C.T. & Scott, D.J. (2007). A systems approach for the analysis of toxicogenomic data. Toxicology 231, 109-110.