Barry Hardy

Istituto Superiore di Sanità, Roma, Latium, Italy

Publications (19) · 24.27 total impact

  • ABSTRACT: A long-term goal of numerous research projects is to identify biomarkers for in vitro systems predicting toxicity in vivo. Often, transcriptomics data are used to identify candidates for further evaluation. However, a systematic directory summarizing key features of chemically influenced genes in human hepatocytes is not yet available. To bridge this gap, we used the Open TG-GATES database with Affymetrix files of cultivated human hepatocytes incubated with chemicals, further sets of gene array data with hepatocytes from human donors generated in this study, and publicly available genome-wide datasets of human liver tissue from patients with non-alcoholic steatohepatitis (NASH), cirrhosis, and hepatocellular cancer (HCC). After a curation procedure, expression data for 143 chemicals were included in a comprehensive biostatistical analysis. The results are summarized in the publicly available toxicotranscriptomics directory (http://wiki.toxbank.net/toxicogenomics-map/), which indicates for every gene whether it is up- or downregulated by chemicals and, if so, by which compounds. The directory also reports the following key features of chemically influenced genes: (1) Stereotypical stress response. When chemicals induce strong expression alterations, these usually include a complex but highly reproducible pattern named the 'stereotypical response.' On the other hand, more specific expression responses exist that are induced only by individual compounds or small numbers of compounds. The directory differentiates whether a gene is part of the stereotypical stress response or represents a more specific reaction. (2) Liver disease-associated genes. Approximately 20% of the genes influenced by chemicals are also up- or downregulated in liver disease. Liver disease genes deregulated in cirrhosis, HCC, and NASH that overlap with genes of the aforementioned stereotypical chemical stress response include CYP3A7, normally expressed in fetal liver; the phase II metabolizing enzyme SULT1C2; ALDH8A1, known to generate the ligand of RXR, one of the master regulators of gene expression in the liver; and several genes involved in normal liver functions: CPS1, PCK1, SLC2A2, CYP8B1, CYP4A11, ABCA8, and ADH4. (3) Unstable baseline genes. The process of isolating and cultivating hepatocytes is sufficient to induce stress that alters the expression of some genes, the so-called unstable baseline genes. (4) Biological function. Although more than 2,000 genes are transcriptionally influenced by chemicals, they can be assigned to a relatively small group of biological functions, including energy and lipid metabolism, inflammation and immune response, protein modification, endogenous and xenobiotic metabolism, cytoskeletal organization, stress response, and DNA repair. In conclusion, the introduced toxicotranscriptomics directory offers a basis for a rational choice of candidate genes for biomarker evaluation studies and represents an easy-to-use source of background information on chemically influenced genes.
    Archives of Toxicology, 11/2014.
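
The directory's central annotation, recording for each gene whether it is up- or downregulated and by which compounds, can be illustrated with a short sketch. This is an illustrative reconstruction only, not the published pipeline: the compound names, fold-change values, and the threshold are assumed for demonstration (the gene names are taken from the abstract).

```python
# Illustrative sketch: summarizing a gene-by-compound log2 fold-change matrix
# into directory-style annotations. Values and the cutoff are assumptions.
import pandas as pd

# Hypothetical log2 fold changes: rows = genes, columns = placeholder compounds.
log2fc = pd.DataFrame(
    {"compound_1": [2.3, -1.8, 0.1], "compound_2": [1.9, 0.2, -2.4]},
    index=["CYP3A7", "PCK1", "ADH4"],
)
THRESHOLD = 1.0  # assumed cutoff for calling a gene deregulated

for gene, row in log2fc.iterrows():
    up = row.index[row >= THRESHOLD].tolist()
    down = row.index[row <= -THRESHOLD].tolist()
    print(f"{gene}: upregulated by {up or '-'}, downregulated by {down or '-'}")
```
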
  • ABSTRACT: SEURAT-1 is a European public-private research consortium that is working towards animal-free testing of chemical compounds and the highest level of consumer protection. A research strategy was formulated based on the guiding principle of adopting a toxicological mode-of-action framework to describe how any substance may adversely affect human health. The proof of the initiative will be in demonstrating the applicability of the concepts on which SEURAT-1 is built on three levels: (i) theoretical prototypes for adverse outcome pathways are formulated based on knowledge already available in the scientific literature on the toxicological modes of action leading to adverse outcomes (addressing mainly liver toxicity); (ii) adverse outcome pathway descriptions are used as a guide for the formulation of case studies to further elucidate the theoretical model and to develop integrated testing strategies for the prediction of certain toxicological effects (i.e., those related to the adverse outcome pathway descriptions); (iii) further case studies target the application of knowledge gained within SEURAT-1 in the context of safety assessment. The ultimate goal would be to perform ab initio predictions based on a complete understanding of toxicological mechanisms. In the near term, it is more realistic that data from innovative testing methods will support read-across arguments. Both scenarios are addressed with case studies for improved safety assessment. A conceptual framework for a rational integrated assessment strategy emerged from designing the case studies and is discussed in the context of international developments focusing on alternative approaches for evaluating chemicals using the new 21st century tools for toxicity testing.
    ALTEX, 11/2014.
  • ABSTRACT: Toxicological research faces the challenge of integrating knowledge from diverse fields and from novel technological developments across the biological and medical sciences. We discuss herein how the multiple facets of cancer research, including discovery related to mechanisms, treatment, and diagnosis, overlap with many emerging areas of interest in toxicology, including the need for improved methods and analysis tools. Common to both disciplines, in vitro and in silico methods serve as alternative investigation routes to animal studies. Knowledge of cancer development helps in understanding the relevance of chemical toxicity studies in cell models, and many bioinformatics-based cancer biomarker discovery tools are also applicable to computational toxicology. Robotics-aided cell-based high-throughput screening, microscale immunostaining techniques, and gene expression profiling analyses are common tools in cancer research and, when sequentially combined, form a tiered approach to the structured safety evaluation of thousands of environmental agents, novel chemicals, or engineered nanomaterials. Comprehensive tumour data collections in databases have been translated into clinically useful data, and this concept serves as a template for computer-driven evaluation of toxicity data into meaningful results. Future "cancer research-inspired knowledge management" of toxicological data will aid the translation of basic discovery results and chemicals- and materials-testing data into information relevant to human health and environmental safety.
    Basic & Clinical Pharmacology & Toxicology, 04/2014. Impact Factor: 2.18.
  • ABSTRACT: Tests with vertebrates are an integral part of environmental hazard identification and risk assessment of chemicals, plant protection products, pharmaceuticals, biocides, feed additives, and effluents. These tests raise ethical and economic concerns and are considered inappropriate for assessing all of the substances and effluents that require regulatory testing. Hence, there is a strong demand for replacement, reduction, and refinement strategies and methods. However, alternative approaches have so far only rarely been used in regulatory settings. This review provides an overview of current chemical regulations and the requirements for animal tests in environmental hazard and risk assessment. It aims to highlight the potential areas for alternative approaches in environmental hazard identification and risk assessment. Perspectives and limitations of alternatives to animal tests using vertebrates in environmental toxicology, i.e., mainly fish and amphibians, are discussed. Free access to existing (proprietary) animal test data, availability of validated alternative methods, and practical implementation of conceptual approaches such as Adverse Outcome Pathways and Integrated Testing Strategies were identified as major requirements for the successful development and implementation of alternative approaches. Although this article focuses on European regulations, its considerations and conclusions are of global relevance.
    Regulatory Toxicology and Pharmacology, 10/2013. Impact Factor: 2.13.
  • Barry Hardy, Roman Affentranger
    ABSTRACT: A virtual organisation approach was applied to collaborative drug discovery, integrating experimental and computational design approaches. Scientists Against Malaria was formed with the goal of designing novel antimalarial drug candidates. The collaboration of nine founding partners carried out computational and laboratory work that produced significant volumes of data and metadata, whose analysis and interpretation, as well as the related decision making, were challenging. During the first phase the partners developed this 'green-field' project from initiation through target selection and modelling, computational screening, and biological materials and assay preparation, culminating in the completion of initial experimental testing. A support infrastructure involving a semantic collaborative laboratory framework, interoperating with a cloud of web services through an ontology describing the virtual and experimental screening data, was designed and tested.
    Drug Discovery Today, 02/2013. Impact Factor: 6.63.
  • ABSTRACT: The aim of the SEURAT-1 (Safety Evaluation Ultimately Replacing Animal Testing-1) research cluster, comprising seven EU FP7 Health projects co-financed by Cosmetics Europe, is to generate a proof-of-concept showing how the latest technologies, systems toxicology, and toxicogenomics can be combined to deliver a test replacement for repeated dose systemic toxicity testing on animals. The SEURAT-1 strategy is to adopt a mode-of-action framework to describe repeated dose toxicity, combining in vitro and in silico methods to derive predictions of in vivo toxicity responses. ToxBank is the cross-cluster infrastructure project whose activities include the development of a data warehouse providing a web-accessible shared repository of research data and protocols, a physical compounds repository, reference or "gold" compounds for use across the cluster (available via wiki.toxbank.net), and a reference resource for biomaterials. Core technologies used in the data warehouse include the ISA-Tab universal data exchange format, REpresentational State Transfer (REST) web services, the W3C Resource Description Framework (RDF), and the OpenTox standards. We describe the design of the data warehouse based on cluster requirements, its implementation based on open standards, and finally the underlying concepts and initial results of a data analysis utilizing public data related to the gold compounds.
    Molecular Informatics, 01/2013; 32(1):47-63. Impact Factor: 2.34.
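
Since ISA-Tab, named above as the warehouse's data exchange format, is plain tab-delimited text, study metadata can be inspected with ordinary tooling. A minimal sketch, assuming a local study file; the filename and the factor column are illustrative and do not reflect the actual ToxBank schema.

```python
# Minimal sketch of reading an ISA-Tab study table (tab-separated text).
# "s_study.txt" and the column name are placeholders, not ToxBank specifics.
import pandas as pd

study = pd.read_csv("s_study.txt", sep="\t")
# ISA-Tab convention uses "Factor Value[...]" columns for experimental factors.
print(study.groupby("Factor Value[compound]").size())
```
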
  • Open Source Software in Life Science Research: Practical Solutions to Common Challenges in the Pharmaceutical Industry and Beyond, edited by L. Harland and M. Foster, chapter 2, pages 35-61. Biohealthcare Publishing Ltd., 01/2012. ISBN: 978-1907568978.
  • ABSTRACT: The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims at providing unified access to toxicity data, predictive models, and validation procedures. Interoperability of resources is achieved using a common information model, based on the OpenTox ontologies, describing predictive algorithms, models, and toxicity data. As toxicological data may come from different, heterogeneous sources, a deployed ontology, unifying the terminology and the resources, is critical for the rational and reliable organization of the data and its automatic processing. The following related ontologies have been developed for OpenTox: (a) a Toxicological ontology, listing the toxicological endpoints; (b) an Organs system and Effects ontology, addressing organs, targets/examinations, and effects observed in in vivo studies; (c) a ToxML ontology, representing a semi-automatic conversion of the ToxML schema; (d) the OpenTox ontology, a representation of OpenTox framework components: chemical compounds, datasets, types of algorithms, models, and validation web services; (e) a ToxLink-ToxCast assays ontology; and (f) OpenToxipedia, a community knowledge resource on toxicology terminology. OpenTox components are made available through standardized REST web services, where every compound, dataset, and predictive method has a unique resolvable address (URI), used to retrieve its Resource Description Framework (RDF) representation or to initiate the associated calculations and generate new RDF-based resources. The services support the integration of toxicity and chemical data from various sources, the generation and validation of computer models for toxic effects, the seamless integration of new algorithms, and scientifically sound validation routines, and provide a flexible framework which allows an arbitrary number of applications to be built, tailored to solving different problems for end users (e.g. toxicologists). The OpenTox toxicological ontology projects may be accessed via the OpenTox ontology development page http://www.opentox.org/dev/ontology; the OpenTox ontology is available as OWL at http://opentox.org/api/1.1/opentox.owl, and the ToxML-OWL conversion utility is an open source resource available at http://ambit.svn.sourceforge.net/viewvc/ambit/branches/toxml-utils/
    Journal of Biomedical Semantics, 01/2012; 3(Suppl 1):S7.
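
The design described above, in which every OpenTox resource has a resolvable URI returning an RDF representation, can be exercised with a standard RDF library. A minimal sketch using rdflib; the compound URI is a placeholder, and the original project services may no longer be online.

```python
# Sketch: fetch the RDF representation of an OpenTox-style resource and
# print its triples. The URI is a placeholder for any resolvable resource.
from rdflib import Graph

compound_uri = "http://example.org/compound/1"  # placeholder, not a live service
g = Graph()
g.parse(compound_uri, format="xml")  # OpenTox resources are served as RDF/XML

for s, p, o in g:
    print(s, p, o)
```
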
  • ABSTRACT: The field of predictive toxicology requires the development of open, public, computable, standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. In this article we review ontology developments based on a set of perspectives showing how ontologies are being used in predictive toxicology initiatives and applications. Perspectives on resources and initiatives reviewed include OpenTox, eTOX, Pistoia Alliance, ToxWiz, Virtual Liver, EU-ADR, BEL, ToxML, and Bioclipse. We also review existing ontology developments in neighboring fields that can contribute to establishing an ontological framework for predictive toxicology. A significant set of resources is already available to provide a foundation for an ontological framework for 21st century mechanism-based toxicology research. Ontologies such as ToxWiz provide a basis for application to toxicology investigations, whereas other ontologies under development in the biological, chemical, and biomedical communities could be incorporated in an extended future framework. OpenTox has provided a semantic web framework for the implementation of such ontologies into software applications and linked data resources. Bioclipse developers have shown the benefit of interoperability obtained through ontology by being able to link their workbench application with remote OpenTox web services. Although these developments are promising, increased international coordination of efforts is greatly needed to develop a more unified, standardized, and open toxicology ontology framework.
    ALTEX, 01/2012; 29(2):139-156.
  • ABSTRACT: Foreign substances can have a dramatic and unpredictable adverse effect on human health. In the development of new therapeutic agents, it is essential that the potential adverse effects of all candidates be identified as early as possible. The field of predictive toxicology strives to profile the potential for adverse effects of novel chemical substances before they occur, both with traditional in vivo experimental approaches and, increasingly, through the development of in vitro and computational methods which can supplement and reduce the need for animal testing. To be maximally effective, the field needs access to the largest possible knowledge base of previous toxicology findings, and such results need to be made available in a fashion that is interoperable, comparable, and compatible with standard toolkits. This necessitates the development of open, public, computable, and standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. Such ontology development will support data management, model building, integrated analysis, validation, and reporting, including regulatory reporting and alternative testing submission requirements as required by guidelines such as the REACH legislation, leading to new scientific advances in mechanistically based predictive toxicology. Numerous existing ontology and standards initiatives can contribute to the creation of a toxicology ontology supporting the needs of predictive toxicology and risk assessment, while new ontologies are needed to satisfy practical use cases and scenarios where gaps currently exist. Developing and integrating these resources will require a well-coordinated and sustained effort across numerous stakeholders engaged in a public-private partnership. In this communication, we set out a roadmap for the development of an integrated toxicology ontology, harnessing existing resources where applicable. We describe the stakeholders' requirements analysis from the academic and industry perspectives, timelines, and expected benefits of this initiative, with a view to engagement with the wider community.
    ALTEX, 01/2012; 29(2):129-137.
  • ABSTRACT: Toxicity is a complex phenomenon involving potential adverse effects on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating the toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. A novel computational toxicity assessment platform was generated from the integration of two open science platforms related to toxicology: Bioclipse, which combines a rich scriptable and graphical workbench environment for the integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large datasets through the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases and algorithm and model resources, with the Bioclipse workbench handling the technical layers.
    BMC Research Notes, 11/2011; 4:487.
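
The ontology-based interoperation described above amounts, in practice, to clients being able to query service metadata expressed in a shared vocabulary. A small sketch of that idea, with the metadata inlined for self-containment; a real client would fetch it from the remote services, and the algorithm URI is invented.

```python
# Sketch: discover which resources are typed as algorithms in OpenTox-style
# RDF metadata. The metadata is inlined and the service URI is fictitious.
from rdflib import Graph

metadata = """
@prefix ot: <http://www.opentox.org/api/1.1#> .
<http://example.org/algorithm/lazar-like> a ot:Algorithm .
"""
g = Graph()
g.parse(data=metadata, format="turtle")

for (service,) in g.query(
    "SELECT ?s WHERE { ?s a <http://www.opentox.org/api/1.1#Algorithm> }"
):
    print("algorithm service:", service)
```
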
  • ABSTRACT: OpenTox provides an interoperable, standards-based framework for the support of predictive toxicology data management, algorithms, modelling, validation, and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation, as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework, including the approach to data access, schema, and management, the use of controlled vocabularies and ontologies, the architecture, web service and communications protocols, and the selection and integration of algorithms for predictive modelling. OpenTox provides end-user-oriented tools to non-computational specialists, risk assessors, and toxicological experts, in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies, and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting, which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for the efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict, which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate, which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models, and validation from multiple sources in a dependable and time-effective way.
    Journal of Cheminformatics, 01/2010; 2(1):7. Impact Factor: 3.59.
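
The ToxPredict workflow above follows a simple REST pattern: submit a chemical structure or compound URI to a model service and receive a reference to the computed results. The sketch below shows that call shape only; the endpoint, parameter names, and headers are schematic rather than the exact OpenTox API.

```python
# Schematic ToxPredict-style interaction: POST a compound reference to a
# model service, receive a result URI. Endpoint and parameters are invented.
import requests

MODEL_URI = "http://example.org/model/42"  # placeholder model service

response = requests.post(
    MODEL_URI,
    data={"compound_uri": "http://example.org/compound/1"},  # placeholder
    headers={"Accept": "text/uri-list"},  # ask for the result as a URI list
)
response.raise_for_status()
print("prediction results at:", response.text.strip())
```
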
  • Barry Hardy
    ABSTRACT: This article provides a perspective on the growing significance of community and collaboration approaches in chemical and pharmaceutical discovery and development. The drivers of and challenges to these developments are described, and the knowledge management concepts that provide a foundation for scientific and cultural progress are explained. Principles and case study examples of community approaches to best practices and emerging approaches to knowledge-oriented collaboration are discussed. Collaboration is a complex activity involving many business factors, scientific issues, and challenges, demanding the combination of a variety of knowledge management, cultural, and organizational best practices. Flexible knowledge-oriented processes, support activities, and services are required for effective collaboration results.
    Future Medicinal Chemistry, 06/2009; 1(3):435-449. Impact Factor: 3.31.
  • ABSTRACT: Scientists Against Malaria (SAM) is an international virtual organization designed to apply modern drug design and modelling techniques, in combination with industry-standard infrastructure and interdisciplinary science, to develop new treatments against malaria. Our strategy relies strongly on the use and development of a novel collaborative research workflow in which several computational and experimental groups meet thanks to new information technology systems. The first pipeline project of SAM will be described, which is devoted to finding hit compounds against the genetically validated target Pfmap-2 (a P. falciparum MAP kinase) from a novel chemical library. The strategy involves several virtual screening techniques, including pharmacophore-based searching and receptor-based virtual screening based on several docking algorithms. The outputs of the docking experiments were integrated by a new consensus scoring function developed for this purpose. Preliminary results have identified, and experimentally validated recombinant
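
The consensus scoring step mentioned in the abstract is not specified, but a common generic form is rank averaging across docking programs. The sketch below illustrates that standard technique with invented scores; it is not the SAM project's actual function.

```python
# Generic rank-averaging consensus over two docking programs with opposite
# score conventions. All values are invented for illustration.
import pandas as pd

scores = pd.DataFrame(
    {"prog_a": [-9.1, -7.4, -8.2],   # e.g. binding energy: lower is better
     "prog_b": [55.0, 61.0, 48.0]},  # e.g. fitness score: higher is better
    index=["cmpd_1", "cmpd_2", "cmpd_3"],
)
ranks = pd.DataFrame({
    "prog_a": scores["prog_a"].rank(ascending=True),
    "prog_b": scores["prog_b"].rank(ascending=False),
})
consensus = ranks.mean(axis=1).sort_values()  # lowest average rank = best
print(consensus)
```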