A Decade of Toxicogenomic Research and Its Contribution to Toxicological Science

Division of Bioinformatics and Biostatistics, National Center for Toxicological Research, US Food and Drug Administration, 3900 NCTR Road, Jefferson, AR 72079, USA.
Toxicological Sciences (Impact Factor: 3.85). 07/2012; 130(2). DOI: 10.1093/toxsci/kfs223
Source: PubMed


Toxicogenomics enjoyed considerable attention as a ground-breaking addition to conventional toxicology assays at its inception.
However, expectations for the pace at which toxicogenomics would deliver have been tempered in recent years. Alongside cost,
the lack of advanced knowledge discovery and data mining tools has significantly hampered progress in this new field of toxicological science.
Recently, two of the largest toxicogenomics databases were made freely available to the public. These comprehensive studies
are expected to stimulate knowledge discovery and development of novel data mining tools, which are essential to advance this
field. In this review, we provide a concise summary of each of these two databases with a brief discussion on the commonalities
and differences between them. We place our emphasis on some key questions in toxicogenomics and how these questions can be
appropriately addressed with the two databases. Finally, we provide a perspective on the future direction of toxicogenomics
and how new technologies such as RNA-Seq may impact this field.

    • "To this end, so-called omics methods are being deployed; omics approaches are sometimes viewed as "high-throughput", but it can be argued that even though vast amounts of data are generated, this is not necessarily done in a high-throughput manner, as the data analysis can be demanding. In this context, toxicogenomics is a generic term commonly referring to molecular approaches to screen for alterations in gene expression and products of protein function in living systems subjected to toxicological challenge (Chen et al., 2012). The term comprises transcriptomics, proteomics, and other more recent approaches such as metabolomics and epigenomics, which are, in essence, related to different steps along the complex chain of events of gene expression and its consequences."
    ABSTRACT: Engineered nanomaterials are being developed for a variety of technological applications. However, the increasing use of nanomaterials in society has led to concerns about their potential adverse effects on human health and the environment. During the first decade of nanotoxicological research, the realization has emerged that effective risk assessment of the multitudes of new nanomaterials would benefit from a comprehensive understanding of their toxicological mechanisms, which is difficult to achieve with traditional, low-throughput, single end-point oriented approaches. Therefore, systems biology approaches are being progressively applied within the nano(eco)toxicological sciences. This novel paradigm implies that the study of biological systems should be integrative resulting in quantitative and predictive models of nanomaterial behaviour in a biological system. To this end, global 'omics' approaches with which to assess changes in genes, proteins, metabolites, etc are deployed allowing for computational modelling of the biological effects of nanomaterials. Here, we highlight omics and systems biology studies in nanotoxicology, aiming towards the implementation of a systems nanotoxicology and mechanism-based risk assessment of nanomaterials.
    No preview · Article · Dec 2015 · Toxicology and Applied Pharmacology
    • "Toxicogenomics combines toxicology with omics technologies to investigate the mechanisms underlying a toxicological response (Waters and Fostel, 2004). Microarray-based gene expression profiling still remains the core technological platform in toxicogenomic research (Chen et al., 2012). It is a well-established technique and provides genome-wide information on transcriptomic changes (Shi et al., 2006) and is used to obtain better insight in the molecular mechanisms underlying drug-induced liver toxicity (Cheng et al., 2011; Cui and Paules, 2010; Nuwaysir et al., 1999). "
    ABSTRACT: In order to improve attrition rates of candidate drugs, there is a need for a better understanding of the mechanisms underlying drug-induced hepatotoxicity. We aim to further unravel the toxicological response of hepatocytes to a prototypical cholestatic compound by integrating transcriptomic and metabonomic profiling of HepG2 cells exposed to Cyclosporin A. Cyclosporin A exposure induced intracellular cholesterol accumulation and diminished intracellular bile acid levels. Performing pathway analyses of significant mRNAs and metabolites separately and integrated resulted in more relevant pathways for the latter. Integrated analyses showed pathways involved in cell cycle and cellular metabolism to be significantly changed. Moreover, pathways involved in protein processing of the endoplasmic reticulum, bile acid biosynthesis and cholesterol metabolism were significantly affected. Our findings indicate that an integrated approach combining metabonomics and transcriptomics data derived from representative in vitro models with bioinformatics can improve our understanding of the mechanisms of action underlying drug-induced hepatotoxicity. Furthermore, we showed that integrating multiple omics, and thereby analyzing genes, microRNAs and metabolites of the proposed model for drug-induced cholestasis, can give valuable information about mechanisms of drug-induced cholestasis in vitro and therefore could be used in toxicity screening of new drug candidates at an early stage of drug discovery.
    No preview · Article · Apr 2015 · Toxicology in Vitro
    • "Traditional approaches for the assessment of toxicological properties of compounds rely heavily on animal testing (Chen et al. 2012). Several issues related to animal experiments have led to the need for alternative experimental methods."
    ABSTRACT: A joint US-EU workshop on enhancing data sharing and exchange in toxicogenomics was held at the National Institute for Environmental Health Sciences. Currently, efficient reuse of data is hampered by problems related to public data availability, data quality, database interoperability (the ability to exchange information), standardization and sustainability. At the workshop, experts from universities and research institutes presented databases, studies, organizations and tools that attempt to deal with these problems. Furthermore, a case study showing that combining toxicogenomics data from multiple resources leads to more accurate predictions in risk assessment was presented. All participants agreed that there is a need for a web portal describing the diverse, heterogeneous data resources relevant for toxicogenomics research. Furthermore, there was agreement that linking more data resources would improve toxicogenomics data analysis. To outline a roadmap to enhance interoperability between data resources, the participants recommend collecting user stories from the toxicogenomics research community on barriers in data sharing and exchange currently hampering answering to certain research questions. These user stories may guide the prioritization of steps to be taken for enhancing integration of toxicogenomics databases.
    Full-text · Article · Oct 2014 · Archives of Toxicology