Metabolomics in Toxicology and Preclinical Research

BASF SE, Experimental Toxicology and Ecology, Ludwigshafen, Germany.
ALTEX 05/2013; 30(2):209-25.
Source: PubMed


Metabolomics, the comprehensive analysis of metabolites in a biological system, provides detailed information about the biochemical/physiological status of that system and about the changes caused by chemicals. Metabolomics analysis is used in many fields, ranging from the analysis of the physiological status of genetically modified organisms in safety science to the evaluation of human health conditions. In toxicology, metabolomics is the -omics discipline most closely related to classical knowledge of disturbed biochemical pathways. It allows rapid identification of the potential targets of a hazardous compound, can give information on target organs, and often helps to improve our understanding of the mode of action of a given compound. Such insights aid the discovery of biomarkers that either indicate pathophysiological conditions or help to monitor the efficacy of drug therapies. The first toxicological applications of metabolomics were for mechanistic research, but different ways to use the technology in a regulatory context are being explored. Ideally, further progress in that direction will position the metabolomics approach to address the challenges of toxicology in the 21st century. To address these issues, scientists from academia, industry, and regulatory bodies came together in a workshop to discuss the current status of applied metabolomics and its potential in the safety assessment of compounds. We report here on the conclusions of three working groups addressing questions regarding 1) metabolomics for in vitro studies, 2) the appropriate use of metabolomics in systems toxicology, and 3) the use of metabolomics in a regulatory context.
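As an illustration of what "comprehensive analysis of metabolites" can mean in practice, the sketch below applies principal component analysis to a synthetic metabolite concentration matrix to look for treatment-related pattern shifts. This is a minimal, hypothetical example: the data, group sizes, and preprocessing choices are assumptions and are not taken from the paper.

```python
# Minimal sketch (not from the paper): principal component analysis of a
# metabolite concentration matrix to visualise treatment-related shifts.
# All data below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)

# Rows = samples (5 controls, 5 treated), columns = metabolite concentrations.
controls = rng.normal(loc=1.0, scale=0.1, size=(5, 20))
treated = rng.normal(loc=1.0, scale=0.1, size=(5, 20))
treated[:, :3] += 0.5          # pretend three metabolites respond to the compound

X = np.vstack([controls, treated])

# Centre and scale each metabolite, then take the top two principal
# components via singular value decomposition.
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T          # sample coordinates on PC1/PC2

print("PC1 scores, controls:", np.round(scores[:5, 0], 2))
print("PC1 scores, treated: ", np.round(scores[5:, 0], 2))
# A separation of the two groups along PC1 would hint at a treatment-related
# metabolome pattern; the loadings in Vt[0] point to the metabolites involved.
```

Centring and scaling each metabolite before the decomposition is one common convention; the paper itself does not prescribe a particular workflow.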

Cited by:

  • "KE is altered and is known to be sufficient to trigger the final AO (Fig. 3). It is assumed that the AO occurs only after a biologically meaningful overall threshold has been passed (Boekelheide and Andersen 2010; Ramirez et al. 2013). However, the situation is more complex when the MIE (or KE) is necessary, but not sufficient, to generate an AO, or when feedback loops or compensatory processes exist between KEs."
    ABSTRACT: A major problem in developmental neurotoxicity (DNT) risk assessment is the lack of toxicological hazard information for most compounds. Therefore, new approaches are being considered to provide adequate experimental data that allow regulatory decisions. This process requires a matching of regulatory needs on the one hand and the opportunities provided by new test systems and methods on the other. Alignment of academically and industrially driven assay development with regulatory needs in the field of DNT is a core mission of the International STakeholder NETwork (ISTNET) in DNT testing. The first meeting of ISTNET was held in Zurich on 23-24 January 2014 in order to explore how the adverse outcome pathway (AOP) concept can be applied to practical DNT testing. AOPs were considered promising tools to promote test system development according to regulatory needs. Moreover, the AOP concept was identified as an important guiding principle for assembling predictive integrated testing strategies (ITSs) for DNT. The recommended road map towards AOP-based DNT testing is a stepwise approach, operating initially with incomplete AOPs for compound grouping and focusing on key events of neurodevelopment. Next steps to be considered in follow-up activities are the use of case studies to further apply the AOP concept in regulatory DNT testing, making use of AOP intersections (common key events) for economic development of screening assays, and addressing the transition from qualitative descriptions to quantitative network modelling. (A minimal code sketch of the threshold-based key-event logic quoted above follows this entry.)
    Archives of Toxicology 01/2015; 89(2). DOI:10.1007/s00204-015-1464-2 · 5.98 Impact Factor
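The sketch below is a minimal, hypothetical rendering of the idea quoted in the citation context above: an adverse outcome (AO) is predicted only if all necessary key events (KEs) are activated and an overall perturbation threshold is passed. Event names, magnitudes, and thresholds are invented for illustration and do not come from the cited papers.

```python
# Hypothetical threshold-based key-event logic: an AO is predicted only when
# every necessary key event is triggered AND an overall perturbation
# threshold is exceeded. All values are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class KeyEvent:
    name: str
    magnitude: float      # measured perturbation, e.g. fold change
    threshold: float      # level above which the KE counts as "activated"
    necessary: bool = False

    @property
    def activated(self) -> bool:
        return self.magnitude >= self.threshold


def predict_adverse_outcome(events: list[KeyEvent], overall_threshold: float) -> bool:
    # All necessary events must be activated (necessity is not sufficiency)...
    if not all(e.activated for e in events if e.necessary):
        return False
    # ...and the summed perturbation of activated events must pass the
    # biologically meaningful overall threshold.
    total = sum(e.magnitude for e in events if e.activated)
    return total >= overall_threshold


pathway = [
    KeyEvent("MIE: receptor binding", magnitude=1.8, threshold=1.5, necessary=True),
    KeyEvent("KE1: oxidative stress", magnitude=0.9, threshold=1.0),
    KeyEvent("KE2: impaired neurite growth", magnitude=2.2, threshold=2.0),
]

print(predict_adverse_outcome(pathway, overall_threshold=3.5))  # True: 1.8 + 2.2 = 4.0
```

Feedback loops and compensatory processes, as noted in the quotation, would require a richer dynamic or network model than this simple aggregation.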
  • ABSTRACT: Information on the design principles governing transcriptome changes upon transition from safe to hazardous drug concentrations, or from tolerated to cytotoxic drug levels, is important for the application of toxicogenomics data in developmental toxicology. Here, we tested the effect of eight concentrations of valproic acid (VPA; 25-1000 μM) in an assay that recapitulates the development of human embryonic stem cells to neuroectoderm. Cells were exposed to the drug during the entire differentiation process, and the number of differentially regulated genes increased continuously with concentration, from zero to about 3,000. We identified overrepresented transcription factor binding sites (TFBS) as well as superordinate cell biological processes, and we developed a 'gene ontology (GO) activation profiler' as well as a two-dimensional 'teratogenicity index'. Analysis of the transcriptome data set by the above biostatistical and systems biology approaches yielded the following insights: (i) 'tolerated' (≤25 μM), 'deregulated/teratogenic' (150-550 μM), and 'cytotoxic' (≥800 μM) concentrations could be differentiated. (ii) Biological signatures related to the mode of action of VPA, such as protein acetylation, developmental changes, and cell migration, emerged from the teratogenic concentration range. (iv) Cytotoxicity was not accompanied by signatures of newly emerging canonical cell death/stress indicators, but by catabolism and decreased expression of cell cycle-associated genes. (v) Most, but not all, of the GO groups and TFBS seen at the highest concentrations were already overrepresented at 350-450 μM. (vi) The 'teratogenicity index' reflected this behavior and thus differed strongly from cytotoxicity. Our findings suggest using the highest non-cytotoxic drug concentration for gene array toxicogenomics studies, as higher concentrations may yield misleading information on the mode of action, and lower drug levels result in smaller gene expression changes and thus reduced study power. (A hypothetical sketch of this concentration-banding logic follows the citation below.)
    Chemical Research in Toxicology 01/2014; 27(3). DOI:10.1021/tx400402j · 3.53 Impact Factor
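A hypothetical sketch of the concentration-banding logic described in the abstract above: concentrations are classed as tolerated, deregulated/teratogenic, or cytotoxic from a viability read-out and the number of differentially expressed genes (DEGs). The cut-offs and data points are invented for illustration and are not the study's values.

```python
# Invented example data: (concentration in µM, fraction of viable cells, number of DEGs)
observations = [
    (25, 0.98, 40),
    (150, 0.97, 600),
    (350, 0.95, 1800),
    (550, 0.93, 2500),
    (800, 0.60, 3000),
    (1000, 0.35, 3100),
]

VIABILITY_CUTOFF = 0.90   # below this, effects are attributed to cytotoxicity
DEG_CUTOFF = 100          # above this, the transcriptome is considered deregulated


def classify(viability: float, n_degs: int) -> str:
    if viability < VIABILITY_CUTOFF:
        return "cytotoxic"
    if n_degs > DEG_CUTOFF:
        return "deregulated/teratogenic"
    return "tolerated"


for conc, viability, n_degs in observations:
    print(f"{conc:>5} µM -> {classify(viability, n_degs)}")
# The highest non-cytotoxic concentration (here 550 µM) would be the candidate
# exposure for a toxicogenomics study, in line with the abstract's conclusion.
```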
  • ABSTRACT: Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. (A minimal dose-response sketch of one such quantitative building block follows the citation below.)
    Chemical Research in Toxicology 01/2014 (Systems Toxicology special issue). DOI:10.1021/tx400410s · 3.53 Impact Factor
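As a minimal sketch of what "mathematical models ... in a quantitative manner" can look like for a single link in the causal chain, the example below fits a Hill-type dose-response curve relating exposure to the magnitude of a downstream molecular change. The data points, parameter names, and the choice of a Hill model are assumptions for illustration, not the authors' method.

```python
# Hypothetical dose-response fit for one exposure -> key-event link.
import numpy as np
from scipy.optimize import curve_fit


def hill(dose, top, ec50, n):
    """Hill dose-response: response rises from 0 towards `top` around `ec50`."""
    return top * dose**n / (ec50**n + dose**n)


doses = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])          # exposure (arbitrary units)
responses = np.array([0.02, 0.05, 0.20, 0.55, 0.85, 0.95])  # normalised molecular change

params, _ = curve_fit(hill, doses, responses, p0=[1.0, 3.0, 1.0])
top, ec50, n = params
print(f"fitted top={top:.2f}, EC50={ec50:.2f}, Hill slope={n:.2f}")

# Fitted response functions like this can be chained along the causal sequence
# (exposure -> molecular key events -> apical end point) to obtain a
# quantitative, predictive description of the toxicological process.
```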