Mechanistic validation

Johns Hopkins University, Bloomberg School of Public Health, CAAT, Baltimore, USA.
ALTEX: Alternativen zu Tierexperimenten (Impact Factor: 5.47). 05/2013; 30(2):119-30.
Source: PubMed


Validation of new approaches in regulatory toxicology is commonly defined as the independent assessment of the reproducibility and relevance (the scientific basis and predictive capacity) of a test for a particular purpose. In large ring trials, the emphasis to date has been mainly on reproducibility and predictive capacity (comparison to the traditional test), with less attention given to the scientific or mechanistic basis. Assessing predictive capacity is difficult for novel approaches that are themselves based on mechanism, such as pathways of toxicity or the complex networks within the organism (systems toxicology). This is highly relevant for implementing Toxicology for the 21st Century, whether by high-throughput testing in the ToxCast/Tox21 project or omics-based testing in the Human Toxome Project. This article explores the mostly neglected assessment of a test's scientific basis, which moves mechanism and causality to the foreground when validating/qualifying tests. Such mechanistic validation faces the problem of establishing causality in complex systems. However, pragmatic adaptations of the Bradford Hill criteria, as well as bioinformatic tools, are emerging. Because a toxic mechanism perturbs critical infrastructures of the organism, we argue that focusing on the target of toxicity and its vulnerability, in addition to the way it is perturbed, anchors both the identification of the mechanism and its verification.

Available from: Thomas Hartung, Sep 27, 2014
  • Source
    • "Previously, in Patlewicz et al. (2013), we discussed scientific confidence in the context of two validation frameworks, the IOM Framework (IOM, 2010) and the OECD Validation principles (2004), and adapted these for consideration in HTS/HCS assays. Similarly, Hartung et al. (2013) stated that "validating the mechanism of a (group of) toxicant(s) is the basis for mechanistic validation of tests that identify those toxicants," and proposed that the performance of Tox21 test methods could be addressed through "mechanistic validation," which focuses on biological pathways and is carried out in six steps: (1) articulation of the pathway (the biological/mechanistic circuitry, in the absence of xenobiotic challenge, leading to the hazard); (2) documenting the evidence, based on results in validated models, that reference chemicals that cause the hazard "perturb the biology in question;" (3) development of test(s) that reflect this biology; (4) verification that toxicants acting by this mechanism also do so in the test(s); (5) verification that antagonism or interference with the mechanism blocks or hinders positive test results. Although not a one-to-one match, this approach is, in essence, very similar to a well-informed adverse outcome pathway (AOP) that includes support for a causal link between the key event and adverse outcome (Ankley et al., 2010). "
    ABSTRACT: High throughput (HTS) and high content (HCS) screening methods show great promise in changing how hazard and risk assessments are undertaken, but scientific confidence in such methods and associated prediction models needs to be established prior to regulatory use. Using a case study of HTS-derived models for predicting in vivo androgen (A), estrogen (E), thyroid (T) and steroidogenesis (S) endpoints in endocrine screening assays, we compare classification (fitting) models to cross-validation (prediction) models. The more robust cross-validation models (based on a set of endocrine ToxCast™ assays and guideline in vivo endocrine screening studies) have balanced accuracies from 79% to 85% for A and E, but only 23% to 50% for S and T. Thus, for E and A, HTS results appear promising for initial use in setting priorities for endocrine screening. However, continued research is needed to expand the domain of applicability and to develop more robust HTS/HCS-based prediction models prior to their use in other regulatory applications. Based on the lessons learned, we propose a framework for documenting scientific confidence in HTS assays and the prediction models derived therefrom. The documentation, transparency and scientific rigor involved in addressing the elements in the proposed Scientific Confidence Framework could aid in discussions and decisions about the prediction accuracy needed for different applications.
    Regulatory Toxicology and Pharmacology 01/2014 · 2.03 Impact Factor
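The balanced-accuracy figures quoted in the abstract above can be reproduced from a confusion matrix. A minimal sketch follows; the counts used are invented for illustration and are not taken from the ToxCast study:

```python
# Balanced accuracy = mean of sensitivity and specificity.
# The confusion-matrix counts below are hypothetical, chosen only
# to illustrate the calculation, not drawn from the cited study.

def balanced_accuracy(tp: int, fn: int, tn: int, fp: int) -> float:
    """Average of sensitivity (true-positive rate) and
    specificity (true-negative rate)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return (sensitivity + specificity) / 2

# Example: 40 true positives, 10 false negatives,
#          35 true negatives, 15 false positives.
ba = balanced_accuracy(tp=40, fn=10, tn=35, fp=15)
print(f"balanced accuracy = {ba:.2f}")  # → 0.75
```

Unlike raw accuracy, this metric is not inflated when active and inactive chemicals are unevenly represented, which is why it is the natural summary statistic for screening assays with imbalanced reference sets.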
  • Source
    ABSTRACT: Despite widespread consensus on the need to transform toxicology and risk assessment in order to keep pace with technological and computational changes that have revolutionized the life sciences, much work remains to achieve the vision of toxicology based on a mechanistic foundation. To this end, a workshop was organized to explore one key aspect of this transformation: the development of Pathways of Toxicity as a key tool for hazard identification based on systems biology. Several issues were discussed in depth in the workshop. The first was the challenge of formally defining the concept of a Pathway of Toxicity (PoT) as distinct from, but complementary to, other toxicological pathway concepts such as mode of action (MoA). The workshop arrived at a preliminary definition of PoT as "A molecular definition of cellular processes shown to mediate adverse outcomes of toxicants". It is further recognized that normal physiological pathways exist that maintain homeostasis and that these, sufficiently perturbed, can become PoTs. Second, the workshop sought to define adequate public and commercial resources for PoT information, including data, visualization, analyses, tools, and use-cases, as well as the kinds of efforts that will be necessary to enable the creation of such a resource. Third, the workshop explored ways in which systems biology approaches could inform pathway annotation, and which resources are needed and available that can provide relevant PoT information to the diverse user communities.
  • Source
    ABSTRACT: Tests with vertebrates are an integral part of environmental hazard identification and risk assessment of chemicals, plant protection products, pharmaceuticals, biocides, feed additives and effluents. These tests raise ethical and economic concerns and are considered inappropriate for assessing all of the substances and effluents that require regulatory testing. Hence, there is a strong demand for replacement, reduction and refinement strategies and methods. However, until now alternative approaches have only rarely been used in regulatory settings. This review provides an overview of current regulations of chemicals and the requirements for animal tests in environmental hazard and risk assessment. It aims to highlight the potential areas for alternative approaches in environmental hazard identification and risk assessment. Perspectives and limitations of alternative approaches to animal tests using vertebrates in environmental toxicology, i.e., mainly fish and amphibians, are discussed. Free access to existing (proprietary) animal test data, availability of validated alternative methods and practical implementation of conceptual approaches such as Adverse Outcome Pathways and Integrated Testing Strategies were identified as major requirements for the successful development and implementation of alternative approaches. Although this article focuses on European regulations, its considerations and conclusions are of global relevance.
    Regulatory Toxicology and Pharmacology 10/2013; 67(3). DOI:10.1016/j.yrtph.2013.10.003 · 2.03 Impact Factor