ABSTRACT: This paper outlines the work for which Roland Grafström and Pekka Kohonen were awarded the 2014 Lush Science Prize. The research activities of the Grafström laboratory have, for many years, covered cancer biology studies, as well as the development and application of toxicity-predictive in vitro models to determine chemical safety. Through the integration of in silico analyses of diverse types of genomics data (transcriptomic and proteomic), their efforts have proved to fit well into the recently developed Adverse Outcome Pathway paradigm. Genomics analysis within state-of-the-art cancer biology research and Toxicology in the 21st Century concepts share many technological tools. A key category within the Three Rs paradigm is the Replacement of animals in toxicity testing with alternative methods, such as bioinformatics-driven analyses of data obtained from human cell cultures exposed to diverse toxicants. This work was recently expanded within the pan-European SEURAT-1 project (Safety Evaluation Ultimately Replacing Animal Testing), to replace repeat-dose toxicity testing with data-rich analyses of sophisticated cell culture models. The aims and objectives of the SEURAT project have been to guide the application, analysis, interpretation and storage of 'omics' technology-derived data within the service-oriented sub-project, ToxBank. Particularly addressing the Lush Science Prize focus on the relevance of toxicity pathways, a continuously expanding 'data warehouse', coupled with the development of novel data storage and management methods for toxicology, serves to address data integration across multiple 'omics' technologies. The prize winners' guiding principles and concepts for modern knowledge management of toxicological data are summarised. The translation of basic discovery results ranges from chemical- and material-testing data to information relevant to human health and environmental safety.
ABSTRACT: Background:
The NanoSafety Cluster, a cluster of projects funded by the European Commission, identified the need for a computational infrastructure for toxicological data management of engineered nanomaterials (ENMs). Ontologies, open standards, and interoperable designs were envisioned to empower a harmonized approach to European research in nanotechnology. This setting provides a number of opportunities and challenges in the representation of nanomaterials data and the integration of ENM information originating from diverse systems. Within this cluster, eNanoMapper works towards supporting the collaborative safety assessment for ENMs by creating a modular and extensible infrastructure for data sharing, data analysis, and building computational toxicology models for ENMs.
The eNanoMapper database solution builds on the previous experience of the consortium partners in supporting diverse data through flexible data storage, open source components and web services. We have recently described the design of the eNanoMapper prototype database along with a summary of challenges in the representation of ENM data and an extensive review of existing nano-related data models, databases, and nanomaterials-related entries in chemical and toxicogenomic databases. This paper continues with a focus on the database functionality exposed through its application programming interface (API), and its use in visualisation and modelling. Considering the preferred community practice of using spreadsheet templates, we developed a configurable spreadsheet parser facilitating user-friendly data preparation and data upload. We further present a web application able to retrieve the experimental data via the API and analyze it with multiple data preprocessing and machine learning algorithms.
We demonstrate how the eNanoMapper database is used to import and publish online ENM and assay data from several data sources, how the "representational state transfer" (REST) API enables building user-friendly interfaces and graphical summaries of the data, and how these resources facilitate the modelling of reproducible quantitative structure-activity relationships for nanomaterials (NanoQSAR).
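The configurable spreadsheet parser mentioned above can be illustrated with a minimal sketch. The column mapping, field names and sample data below are hypothetical, not the actual eNanoMapper template configuration, and CSV stands in for the spreadsheet format:

```python
import csv
import io

# Hypothetical column-mapping configuration: maps template-specific
# spreadsheet headers onto a common internal schema.
CONFIG = {
    "Material name": "material",
    "Assay": "assay",
    "Value": "value",
    "Unit": "unit",
}

def parse_template(text, config):
    """Parse a spreadsheet template (here CSV for simplicity) into
    normalized records using a configurable header mapping."""
    reader = csv.DictReader(io.StringIO(text))
    records = []
    for row in reader:
        # Rename columns according to the configuration.
        record = {field: row[header] for header, field in config.items()}
        record["value"] = float(record["value"])  # coerce the numeric result
        records.append(record)
    return records

sheet = "Material name,Assay,Value,Unit\nTiO2 NM-105,MTS viability,87.5,%\n"
print(parse_template(sheet, CONFIG))
```

Because the mapping is data rather than code, supporting a new spreadsheet template only requires a new configuration, which is the design motivation the abstract implies.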
ABSTRACT: A long-term goal of numerous research projects is to identify biomarkers for in vitro systems predicting toxicity in vivo. Often, transcriptomics data are used to identify candidates for further evaluation. However, a systematic directory summarizing key features of chemically influenced genes in human hepatocytes is not yet available. To bridge this gap, we used the Open TG-GATES database with Affymetrix files of cultivated human hepatocytes incubated with chemicals, further sets of gene array data with hepatocytes from human donors generated in this study, and publicly available genome-wide datasets of human liver tissue from patients with non-alcoholic steatohepatitis (NASH), cirrhosis, and hepatocellular cancer (HCC). After a curation procedure, expression data for 143 chemicals were included in a comprehensive biostatistical analysis. The results are summarized in the publicly available toxicotranscriptomics directory ( http://wiki.toxbank.net/toxicogenomics-map/ ), which provides information for all genes on whether they are up- or downregulated by chemicals and, if so, by which compounds. The directory also informs about the following key features of chemically influenced genes: (1) Stereotypical stress response. When chemicals induce strong expression alterations, this usually includes a complex but highly reproducible pattern named the 'stereotypical response.' On the other hand, more specific expression responses exist that are induced only by individual compounds or small numbers of compounds. The directory differentiates whether a gene is part of the stereotypical stress response or represents a more specific reaction. (2) Liver disease-associated genes. Approximately 20% of the genes influenced by chemicals are also up- or downregulated in liver disease.
Liver disease genes deregulated in cirrhosis, HCC, and NASH that overlap with genes of the aforementioned stereotypical chemical stress response include CYP3A7, normally expressed in fetal liver; the phase II metabolizing enzyme SULT1C2; ALDH8A1, known to generate the ligand of RXR, one of the master regulators of gene expression in the liver; and several genes involved in normal liver functions: CPS1, PCK1, SLC2A2, CYP8B1, CYP4A11, ABCA8, and ADH4. (3) Unstable baseline genes. The process of isolating and cultivating hepatocytes was sufficient to induce some stress, leading to alterations in the expression of certain genes, the so-called unstable baseline genes. (4) Biological function. Although more than 2,000 genes are transcriptionally influenced by chemicals, they can be assigned to a relatively small group of biological functions, including energy and lipid metabolism, inflammation and immune response, protein modification, endogenous and xenobiotic metabolism, cytoskeletal organization, stress response, and DNA repair. In conclusion, the introduced toxicotranscriptomics directory offers a basis for a rational choice of candidate genes for biomarker evaluation studies and represents an easy-to-use source of background information on chemically influenced genes.
Archives of Toxicology 11/2014; 88(12):2261-87. DOI:10.1007/s00204-014-1400-x · 5.98 Impact Factor
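The directory's distinction between stereotypical and compound-specific responses can be sketched as a simple counting rule over gene-compound associations. The toy data, panel size and 50% threshold below are invented for illustration and are not the criteria used in the actual study:

```python
# Hypothetical toy data: which compounds significantly deregulate each gene.
deregulated_by = {
    "CYP3A7":  {"c1", "c2", "c3", "c4", "c5", "c6", "c7", "c8"},
    "SULT1C2": {"c1", "c2", "c3", "c4", "c5", "c6", "c7"},
    "GENE_X":  {"c9"},
}

N_COMPOUNDS = 10  # size of the (toy) compound panel

def classify(gene, threshold=0.5):
    """Label a gene as part of the stereotypical stress response if a
    large fraction of the panel alters it, otherwise as a specific response."""
    fraction = len(deregulated_by[gene]) / N_COMPOUNDS
    return "stereotypical" if fraction >= threshold else "specific"

for gene in deregulated_by:
    print(gene, classify(gene))
```

In the real directory the underlying statistics are more involved, but the lookup a user performs ("is this gene part of the stereotypical response, and which compounds alter it?") reduces to queries of this shape.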
ABSTRACT: SEURAT-1 is a European public-private research consortium that is working towards animal-free testing of chemical compounds and the highest level of consumer protection. A research strategy was formulated based on the guiding principle to adopt a toxicological mode-of-action framework to describe how any substance may adversely affect human health. The proof of the initiative will be in demonstrating the applicability of the concepts on which SEURAT-1 is built on three levels: (i) theoretical prototypes for adverse outcome pathways are formulated based on knowledge already available in the scientific literature on the toxicological modes-of-action leading to adverse outcomes (addressing mainly liver toxicity); (ii) adverse outcome pathway descriptions are used as a guide for the formulation of case studies to further elucidate the theoretical model and to develop integrated testing strategies for the prediction of certain toxicological effects (i.e., those related to the adverse outcome pathway descriptions); (iii) further case studies target the application of knowledge gained within SEURAT-1 in the context of safety assessment. The ultimate goal would be to perform ab initio predictions based on a complete understanding of toxicological mechanisms. In the near term, it is more realistic that data from innovative testing methods will support read-across arguments. Both scenarios are addressed with case studies for improved safety assessment. A conceptual framework for a rational integrated assessment strategy emerged from designing the case studies and is discussed in the context of international developments focusing on alternative approaches for evaluating chemicals using the new 21st-century tools for toxicity testing.
ABSTRACT: Toxicological research faces the challenge of integrating knowledge from diverse fields and novel technological developments across the biological and medical sciences. We discuss herein the fact that the multiple facets of cancer research, including discovery related to mechanisms, treatment and diagnosis, overlap with many emerging areas of interest in toxicology, including the need for improved methods and analysis tools. Common to both disciplines, in vitro and in silico methods serve as alternative investigation routes to animal studies. Knowledge of cancer development helps in understanding the relevance of chemical toxicity studies in cell models, and many bioinformatics-based cancer biomarker discovery tools are also applicable to computational toxicology. Robotics-aided cell-based high-throughput screening, microscale immunostaining techniques, and gene expression profiling analyses are common tools in cancer research and, when sequentially combined, form a tiered approach to structured safety evaluation of thousands of environmental agents, novel chemicals or engineered nanomaterials. Comprehensive tumour data collections in databases have been translated into clinically useful data, and this concept serves as a template for computer-driven evaluation of toxicity data into meaningful results. Future "cancer research-inspired knowledge management" of toxicological data will aid the translation of basic discovery results and chemicals- and materials-testing data into information relevant to human health and environmental safety.
ABSTRACT: Tests with vertebrates are an integral part of environmental hazard identification and risk assessment of chemicals, plant protection products, pharmaceuticals, biocides, feed additives and effluents. These tests raise ethical and economic concerns and are considered inappropriate for assessing all of the substances and effluents that require regulatory testing. Hence, there is a strong demand for replacement, reduction and refinement strategies and methods. However, until now alternative approaches have only rarely been used in regulatory settings. This review provides an overview of current regulations of chemicals and the requirements for animal tests in environmental hazard and risk assessment. It aims to highlight the potential areas for alternative approaches in environmental hazard identification and risk assessment. Perspectives and limitations of alternative approaches to animal tests using vertebrates in environmental toxicology, i.e., mainly fish and amphibians, are discussed. Free access to existing (proprietary) animal test data, availability of validated alternative methods and practical implementation of conceptual approaches such as Adverse Outcome Pathways and Integrated Testing Strategies were identified as major requirements for the successful development and implementation of alternative approaches. Although this article focuses on European regulations, its considerations and conclusions are of global relevance.
ABSTRACT: A virtual organisation approach was applied to collaborative drug discovery, integrating experimental and computational design approaches. Scientists Against Malaria was formed with the goal of designing novel antimalarial drug candidates. The collaboration of nine founding partners carried out computational and laboratory work that produced significant volumes of data and metadata, whose interpretation and analysis, as well as the related decision-making, were challenging. During the first phase, the partners developed this 'green-field' project from initiation through target selection and modelling, computational screening, and biological materials and assay preparation, culminating in the completion of initial experimental testing. A support infrastructure involving a semantic collaborative laboratory framework, interoperating with a cloud of web services through an ontology describing the virtual and experimental screening data, was designed and tested.
Drug Discovery Today 02/2013; 18(13-14). DOI:10.1016/j.drudis.2013.02.004 · 6.69 Impact Factor
ABSTRACT: The aim of the SEURAT-1 (Safety Evaluation Ultimately Replacing Animal Testing-1) research cluster, comprising seven EU FP7 Health projects co-financed by Cosmetics Europe, is to generate a proof-of-concept to show how the latest technologies, systems toxicology and toxicogenomics can be combined to deliver a test replacement for repeated dose systemic toxicity testing on animals. The SEURAT-1 strategy is to adopt a mode-of-action framework to describe repeated dose toxicity, combining in vitro and in silico methods to derive predictions of in vivo toxicity responses. ToxBank is the cross-cluster infrastructure project whose activities include the development of a data warehouse to provide a web-accessible shared repository of research data and protocols, a physical compounds repository, reference or "gold compounds" for use across the cluster (available via wiki.toxbank.net), and a reference resource for biomaterials. Core technologies used in the data warehouse include the ISA-Tab universal data exchange format, REpresentational State Transfer (REST) web services, the W3C Resource Description Framework (RDF) and the OpenTox standards. We describe the design of the data warehouse based on cluster requirements, the implementation based on open standards, and finally the underlying concepts and initial results of a data analysis utilizing public data related to the gold compounds.
ABSTRACT: The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims at providing a unified access to toxicity data, predictive models and validation procedures. Interoperability of resources is achieved using a common information model, based on the OpenTox ontologies, describing predictive algorithms, models and toxicity data. As toxicological data may come from different, heterogeneous sources, a deployed ontology, unifying the terminology and the resources, is critical for the rational and reliable organization of the data, and its automatic processing.
The following related ontologies have been developed for OpenTox: (a) Toxicological ontology - listing the toxicological endpoints; (b) Organs system and Effects ontology - addressing organs, targets/examinations and effects observed in in vivo studies; (c) ToxML ontology - representing a semi-automatic conversion of the ToxML schema; (d) OpenTox ontology - representing the OpenTox framework components: chemical compounds, datasets, types of algorithms, models and validation web services; (e) ToxLink - ToxCast assays ontology; and (f) OpenToxipedia, a community knowledge resource on toxicology terminology. OpenTox components are made available through standardized REST web services, where every compound, data set, and predictive method has a unique resolvable address (URI), used to retrieve its Resource Description Framework (RDF) representation, or to initiate the associated calculations and generate new RDF-based resources. The services support the integration of toxicity and chemical data from various sources, the generation and validation of computer models for toxic effects, and seamless integration of new algorithms and scientifically sound validation routines, and provide a flexible framework which allows building an arbitrary number of applications tailored to solving different problems by end users (e.g. toxicologists).
The OpenTox toxicological ontology projects may be accessed via the OpenTox ontology development page http://www.opentox.org/dev/ontology; the OpenTox ontology is available as OWL at http://opentox.org/api/1.1/opentox.owl; the ToxML-OWL conversion utility is an open source resource available at http://ambit.svn.sourceforge.net/viewvc/ambit/branches/toxml-utils/
Open Source Software in Life Science Research: Practical Solutions to Common Challenges in the Pharmaceutical Industry and Beyond, 1st edition, edited by L. Harland, M. Foster, 01/2012; chapter 2, pages 35-61; Biohealthcare Publishing Ltd., ISBN: 978-1907568978
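The OpenTox pattern described above, in which every compound, dataset and model has a resolvable URI with an RDF representation, can be sketched as follows. The base URL, the predicate choice and the minimal N-Triples serializer are simplified assumptions for illustration, not the actual OpenTox API:

```python
BASE = "http://example.org/opentox"  # hypothetical service root

def resource_uri(kind, identifier):
    """Build a resolvable URI for an OpenTox-style resource
    (compound, dataset, algorithm, model, ...)."""
    return f"{BASE}/{kind}/{identifier}"

def to_ntriples(subject, properties):
    """Serialize a resource description as minimal N-Triples (one of the
    simplest concrete RDF syntaxes): subject, predicate, literal object."""
    lines = []
    for predicate, obj in properties.items():
        lines.append(f'<{subject}> <{predicate}> "{obj}" .')
    return "\n".join(lines)

uri = resource_uri("compound", "42")
rdf = to_ntriples(uri, {"http://purl.org/dc/terms/title": "caffeine"})
print(rdf)
```

In the real framework a GET on such a URI returns the RDF description, and a POST to an algorithm URI starts a calculation that creates a new addressable resource; the sketch only shows the addressing and serialization idea.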
ABSTRACT: Foreign substances can have a dramatic and unpredictable adverse effect on human health. In the development of new therapeutic agents, it is essential that the potential adverse effects of all candidates be identified as early as possible. The field of predictive toxicology strives to profile the potential for adverse effects of novel chemical substances before they occur, both with traditional in vivo experimental approaches and increasingly through the development of in vitro and computational methods which can supplement and reduce the need for animal testing. To be maximally effective, the field needs access to the largest possible knowledge base of previous toxicology findings, and such results need to be made available in such a fashion as to be interoperable, comparable, and compatible with standard toolkits. This necessitates the development of open, public, computable, and standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. Such ontology development will support data management, model building, integrated analysis, validation and reporting, including regulatory reporting and alternative testing submission requirements as required by guidelines such as the REACH legislation, leading to new scientific advances in a mechanistically based predictive toxicology. Numerous existing ontology and standards initiatives can contribute to the creation of a toxicology ontology supporting the needs of predictive toxicology and risk assessment. Additionally, new ontologies are needed to satisfy practical use cases and scenarios where gaps currently exist. Developing and integrating these resources will require a well-coordinated and sustained effort across numerous stakeholders engaged in a public-private partnership.
In this communication, we set out a roadmap for the development of an integrated toxicology ontology, harnessing existing resources where applicable. We describe the stakeholders' requirements analysis from the academic and industry perspectives, timelines, and expected benefits of this initiative, with a view to engagement with the wider community.
ABSTRACT: Foreign substances can have a dramatic and unpredictable adverse effect on human health. In the development of new therapeutic agents, it is essential that the potential adverse effects of all candidates be identified as early as possible. The field of predictive toxicology strives to profile the potential for adverse effects of novel chemical substances before they occur, both with traditional in vivo experimental approaches and increasingly through the development of in vitro and computational methods which can supplement and reduce the need for animal testing. To be maximally effective, the field needs access to the largest possible knowledge base of previous toxicology findings, and such results need to be made available in such a fashion as to be interoperable, comparable, and compatible with standard toolkits. This necessitates the development of open, public, computable, and standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities.
ALTEX: Alternativen zu Tierexperimenten 01/2012; 29(2-2):129-137. · 5.47 Impact Factor
ABSTRACT: The field of predictive toxicology requires the development of open, public, computable, standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. In this article we review ontology developments based on a set of perspectives showing how ontologies are being used in predictive toxicology initiatives and applications. Perspectives on resources and initiatives reviewed include OpenTox, eTOX, Pistoia Alliance, ToxWiz, Virtual Liver, EU-ADR, BEL, ToxML, and Bioclipse. We also review existing ontology developments in neighboring fields that can contribute to establishing an ontological framework for predictive toxicology. A significant set of resources is already available to provide a foundation for an ontological framework for 21st-century, mechanistically based toxicology research. Ontologies such as ToxWiz provide a basis for application to toxicology investigations, whereas other ontologies under development in the biological, chemical, and biomedical communities could be incorporated in an extended future framework. OpenTox has provided a semantic web framework for the implementation of such ontologies into software applications and linked data resources. Bioclipse developers have shown the benefit of interoperability obtained through ontology by being able to link their workbench application with remote OpenTox web services. Although these developments are promising, an increased international coordination of efforts is greatly needed to develop a more unified, standardized, and open toxicology ontology framework.
ABSTRACT: Toxicity is a complex phenomenon involving potential adverse effects on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications.
This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open, simplified communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources.
A novel computational toxicity assessment platform was generated from the integration of two open science platforms related to toxicology: Bioclipse, which combines a rich scriptable and graphical workbench environment for the integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets through the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, with the Bioclipse workbench handling the technical layers.
BMC Research Notes 11/2011; 4:487. DOI:10.1186/1756-0500-4-487
ABSTRACT: OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals.
The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation.
Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.
Journal of Cheminformatics 08/2010; 2(1):7. DOI:10.1186/1758-2946-2-7 · 4.55 Impact Factor
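ToxPredict's core idea, mapping an input structure's descriptors to endpoint predictions from a trained model, can be illustrated with a toy nearest-neighbour sketch. The descriptors, labels and distance metric below are invented for illustration; the real service works on OpenTox datasets and validated (Q)SAR models rather than this simplistic classifier:

```python
# Toy training set: (descriptor vector, endpoint label) pairs.
# Both the descriptor values and the labels are invented.
train = [
    ([0.2, 1.1], "non-toxic"),
    ([0.9, 3.4], "toxic"),
    ([0.1, 0.8], "non-toxic"),
    ([1.2, 3.9], "toxic"),
]

def predict(descriptors):
    """1-nearest-neighbour prediction: return the endpoint label of the
    training compound closest (Euclidean distance) to the query."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(train, key=lambda pair: dist(pair[0], descriptors))[1]

print(predict([1.0, 3.5]))
```

The ToxCreate half of the pair corresponds to building `train` (and a real learning algorithm) from an uploaded dataset, then validating the resulting model before it is used for prediction.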
ABSTRACT: Scientists Against Malaria (SAM) is an international virtual organization designed to apply modern drug design and modeling techniques, in combination with industry-standard infrastructure and interdisciplinary science, to develop new treatments against malaria. Our strategy relies strongly on the use and development of a novel collaborative research workflow in which several computational and experimental groups meet thanks to new technological information systems.
The first pipeline project of SAM will be described, which is devoted to finding hit compounds against the genetically validated target Pfmap-2 (a P. falciparum MAP kinase) from a novel chemical library. The strategy involves several virtual screening techniques, including pharmacophore-based searching and receptor-based virtual screening based on several docking algorithms. The outputs of the docking experiments were integrated by a new consensus scoring function developed for this purpose. Preliminary results have identified, and experimentally validated recombinant
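The consensus scoring step can be illustrated with a generic rank-sum scheme. The abstract does not describe the actual SAM consensus function, so the algorithm names, scores and compound identifiers below are assumptions for illustration only:

```python
# Docking scores from three hypothetical algorithms (lower = better pose).
scores = {
    "dock_A": {"cmpd1": -9.2, "cmpd2": -7.5, "cmpd3": -8.1},
    "dock_B": {"cmpd1": -6.8, "cmpd2": -7.9, "cmpd3": -7.1},
    "dock_C": {"cmpd1": -8.8, "cmpd2": -6.9, "cmpd3": -8.5},
}

def consensus_rank(scores):
    """Rank-sum consensus: rank compounds within each algorithm, sum the
    ranks across algorithms, and order compounds by total rank (best first).
    Rank-based schemes sidestep the incomparable score scales of
    different docking programs."""
    totals = {}
    for per_algo in scores.values():
        ranked = sorted(per_algo, key=per_algo.get)  # best (lowest) first
        for rank, cmpd in enumerate(ranked, start=1):
            totals[cmpd] = totals.get(cmpd, 0) + rank
    return sorted(totals, key=totals.get)

print(consensus_rank(scores))
```

A compound that scores well under every algorithm accumulates a low rank total and surfaces as the consensus hit, even if no single program ranks it first everywhere.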