ABSTRACT: Purpose: 1. To develop a framework for calculating exposure via the dermal route that meets the needs of 21st-century toxicity testing and refines current approaches; 2. To demonstrate the impact of exposure scenario and application conditions on the plasma concentration following dermal exposure.
Method: A workflow connecting a dynamic skin penetration model with a generic whole-body physiologically based pharmacokinetic (PBPK) model was developed. The impact of modifying exposure scenarios and application conditions on the simulated steady-state plasma concentration and the exposure conversion factor was investigated for 9 chemicals previously tested in dermal animal studies whose experimental designs did not consider kinetics.
Results: Simulating the animal study scenarios and exposure conditions showed that 7 studies were conducted with finite-dose exposures and 1 with both finite- and infinite-dose exposures; in these 8 studies, an increase in the animal dose resulted in an increase in the simulated steady-state plasma concentration (Cp,ss). The remaining study was conducted with infinite-dose exposures only, for which an increase in the animal dose left Cp,ss unchanged. Steady-state plasma concentrations were up to 30-fold higher under an infinite-dose scenario than under a finite-dose one, and up to 40-fold higher with occlusion than without. Depending on the chemical, the presence of water as a vehicle increased or decreased the steady-state plasma concentration, the largest difference being a factor of 16.
Conclusions: The workflow linking Kasting’s model of skin penetration with a whole-body PBPK model enables estimation of plasma concentrations for various applied doses, exposure scenarios and application conditions. It thereby provides a quantitative, mechanistic tool for refining the methodology of dermal exposure calculation for further use in risk assessment.
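The finite- vs. infinite-dose contrast described above can be illustrated with a minimal sketch, assuming a one-compartment PK model with first-order elimination rather than the full Kasting-plus-PBPK workflow; all parameter values (flux, absorption rate constant, clearance, volume of distribution) are illustrative placeholders, not values from the study.

```python
# A sketch only, not the authors' workflow: one-compartment PK with dermal
# input that is either constant (infinite dose) or first-order depleting
# (finite dose). All parameter values are illustrative placeholders.

def simulate(dose_ug, infinite, area_cm2=10.0, flux=0.5, k_abs=0.05,
             cl=2.0, v_d=40.0, t_end=200.0, dt=0.01):
    """Return plasma concentration (ug/L) at t_end by Euler integration.

    flux  : steady flux from an undepleted reservoir (ug/cm2/h)
    k_abs : first-order absorption of the applied amount (1/h)
    cl    : clearance (L/h); v_d : volume of distribution (L)
    """
    ke = cl / v_d                       # elimination rate constant (1/h)
    a_skin, cp, t = dose_ug, 0.0, 0.0
    while t < t_end:
        rate_in = flux * area_cm2 if infinite else k_abs * a_skin
        if not infinite:
            a_skin -= rate_in * dt      # the finite reservoir depletes
        cp += (rate_in / v_d - ke * cp) * dt
        t += dt
    return cp

# The infinite dose plateaus at Cp,ss = flux*area/CL = 2.5 ug/L regardless
# of the applied amount; the finite dose has long since peaked and washed out.
print(simulate(dose_ug=100.0, infinite=True))   # ~2.5
print(simulate(dose_ug=100.0, infinite=False))  # well below the plateau
```

Under the constant-flux input, Cp,ss = flux × area / CL, which is why increasing an already infinite dose leaves the plateau unchanged, while a larger finite dose sustains absorption longer and raises the simulated concentration.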
ABSTRACT: Systemic toxicity testing forms the cornerstone for the safety evaluation of substances. Pressures to move from traditional animal models to novel technologies arise from various concerns, including: the need to evaluate large numbers of previously untested chemicals and new products (such as nanoparticles or cell therapies), the limited predictivity of traditional tests for human health effects, duration and costs of current approaches, and animal welfare considerations. The latter holds especially true in the context of the scheduled 2013 marketing ban on cosmetic ingredients tested for systemic toxicity. Based on a major analysis of the status of alternative methods (Adler et al., 2011) and its independent review (Hartung et al., 2011), the present report proposes a roadmap for how to overcome the acknowledged scientific gaps for the full replacement of systemic toxicity testing using animals. Five whitepapers were commissioned addressing toxicokinetics, skin sensitization, repeated-dose toxicity, carcinogenicity, and reproductive toxicity testing. An expert workshop of 35 participants from Europe and the US discussed and refined these whitepapers, which were subsequently compiled to form the present report. By prioritizing the many options to move the field forward, the expert group hopes to advance regulatory science.
ALTEX: Alternativen zu Tierexperimenten 01/2012; 29(1):3-91. · 4.09 Impact Factor
ABSTRACT: A comprehensive transient model of chemical penetration through the stratum corneum, viable epidermis and dermis, formulated as an Excel™ spreadsheet and associated add-in, is presented. The model is a one-dimensional homogenization of underlying microscopic transport models for the stratum corneum and dermis; the viable epidermis is treated as unperfused dermis. The model's salient features are a detailed structural description of the skin layers, a combination of first-principles transport equations and empirical partition and diffusion coefficients, and the capability of simulating a variety of exposure scenarios. Model predictions are compared with representative in vitro skin permeation data from the literature, using total absorption (Q_abs), maximum flux (J_max) and the skin permeability coefficient (k_p) as summary parameters. The results of this evaluation demonstrate the current state of the art in the prediction of transient skin absorption and highlight areas in which further elaboration is needed to obtain satisfactory predictions.
Advanced Drug Delivery Reviews 01/2012; 65(2):221-236. · 11.96 Impact Factor
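As a toy illustration of the transient-penetration idea, the following solves Fick's second law by explicit finite differences in a single homogeneous membrane, far simpler than the layered, microscopically informed model in the abstract; the diffusivity D, membrane thickness h, and vehicle/membrane partition coefficient K are invented values.

```python
import numpy as np

# Toy sketch only: one homogeneous membrane under an infinite-dose donor,
# not the layered stratum corneum/epidermis/dermis model described above.
D, h, K, c_vehicle = 1e-6, 15e-4, 1.0, 1.0    # cm^2/h, cm, -, amount/cm^3
n = 50
dx = h / n
dt = 0.4 * dx**2 / D                           # explicit stability: dt <= dx^2/(2D)

c = np.zeros(n + 1)
c[0] = K * c_vehicle                           # donor-side boundary condition
for _ in range(20000):                         # ~7 h simulated, well past lag time
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2*c[1:-1] + c[:-2])
    c[-1] = 0.0                                # perfect-sink receptor

flux = D * (c[-2] - c[-1]) / dx                # steady-state flux J (amount/cm^2/h)
kp = flux / c_vehicle                          # permeability coefficient k_p (cm/h)
print(f"J_ss = {flux:.3e}, k_p = {kp:.3e} cm/h")  # analytic k_p = D*K/h ~ 6.7e-4
```

At steady state the concentration profile is linear and the computed k_p converges to D·K/h, the textbook membrane result against which such transient solvers are commonly checked.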
ABSTRACT: The need for a more mechanistic understanding of the ways in which chemicals modulate biological pathways is urgent if we are to identify and better assess safety issues relating to a wide range of substances developed by the pharmaceutical, chemical, agri-bio, and cosmetic industries. Omics technologies provide a valuable opportunity to refine existing methods and provide information for so-called integrated testing strategies via the creation of signatures of toxicity. By mapping these signatures to underlying pathways of toxicity, some of which have been identified by toxicologists over the last few decades, and bringing them together with pathway information determined from biochemistry and molecular biology, a "systems toxicology" approach will enable virtual experiments to be conducted that can improve the prediction of hazard and the assessment of compound toxicity.
ABSTRACT: The role that in vitro systems can play in toxicological risk assessment is determined by the appropriateness of the chosen methods, with respect to the way in which in vitro data can be extrapolated to the in vivo situation. This report presents the results of a workshop aimed at better defining the use of in vitro-derived biomarkers of toxicity (BoT) and determining the place these data can have in human risk assessment. As a result, a conceptual framework is presented for the incorporation of in vitro-derived toxicity data into the risk assessment process. The selection of BoT takes into account that they need to distinguish adverse and adaptive changes in cells. The framework defines the place of in vitro systems in the context of data on exposure, structural and physico-chemical properties, and toxicodynamic and biokinetic modeling. It outlines the determination of a proper point-of-departure (PoD) for in vitro-in vivo extrapolation, allowing implementation in risk assessment procedures. A BoT will need to take into account both the dynamics and the kinetics of the compound in the in vitro systems. For the implementation of the proposed framework it will be necessary to collect and collate data from existing literature and new in vitro test systems, as well as to categorize biomarkers of toxicity and their relation to pathways-of-toxicity. Moreover, data selection and integration need to be driven by their usefulness in a quantitative in vitro-in vivo extrapolation (QIVIVE).
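One quantitative step behind QIVIVE is reverse dosimetry: scaling an in vitro point of departure to an external dose. A hedged sketch under a steady-state one-compartment assumption (dose rate = Css × CL) follows; the molecular weight, clearance and bioavailability values are invented, not from the report.

```python
# Reverse-dosimetry sketch, assuming steady state: dose rate = Css * CL.
# All parameter values are illustrative placeholders.

def oral_equivalent_dose(pod_uM, mw_g_mol, cl_l_h_kg, f_bioavail=1.0):
    """Daily oral dose (mg/kg/day) predicted to sustain the in vitro PoD."""
    css_mg_l = pod_uM * mw_g_mol / 1000.0     # uM -> mg/L
    return css_mg_l * cl_l_h_kg * 24.0 / f_bioavail

print(oral_equivalent_dose(pod_uM=5.0, mw_g_mol=200.0, cl_l_h_kg=0.1))  # 2.4 mg/kg/day
```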
ABSTRACT: There is an urgent need to develop data integration and testing strategy frameworks allowing interpretation of results from animal alternative test batteries. To this end, we developed a Bayesian Network Integrated Testing Strategy (BN ITS) with the goal to estimate skin sensitization hazard as a test case of previously developed concepts (Jaworska et al., 2010). The BN ITS combines in silico, in chemico, and in vitro data related to skin penetration, peptide reactivity, and dendritic cell activation, and guides testing strategy by Value of Information (VoI). The approach offers novel insights into testing strategies: there is no one best testing strategy, but the optimal sequence of tests depends on information at hand, and is chemical-specific. Thus, a single generic set of tests as a replacement strategy is unlikely to be most effective. BN ITS offers the possibility of evaluating the impact of generating additional data on the target information uncertainty reduction before testing is commenced.
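The Value-of-Information idea can be illustrated with a deliberately tiny calculation (not the published BN ITS): choose the next assay as the one whose expected outcome most reduces the entropy of the hazard node. The sensitivities and specificities below are invented.

```python
import math

# Minimal VoI sketch for a binary hazard node and binary assays; the
# assay names and performance figures are invented for illustration.

def entropy(p):
    return 0.0 if p in (0.0, 1.0) else -(p*math.log2(p) + (1-p)*math.log2(1-p))

def expected_posterior_entropy(prior, sens, spec):
    """Average entropy of P(hazard) after observing the assay result."""
    p_pos = sens*prior + (1-spec)*(1-prior)
    post_pos = sens*prior / p_pos
    post_neg = (1-sens)*prior / (1 - p_pos)
    return p_pos*entropy(post_pos) + (1-p_pos)*entropy(post_neg)

prior = 0.5
tests = {"peptide reactivity": (0.8, 0.7), "dendritic cell activation": (0.7, 0.9)}
voi = {name: entropy(prior) - expected_posterior_entropy(prior, s, sp)
       for name, (s, sp) in tests.items()}
print(max(voi, key=voi.get), voi)   # run the most informative assay first
```

Because the expected entropy reduction depends on the current posterior, the best next test changes as evidence accumulates, which is exactly why no single generic test sequence is optimal for all chemicals.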
ABSTRACT: The topic of Integrated Testing Strategies (ITS) has attracted considerable attention, and not only because it is supposed to be a central element of REACH, the ambitious European chemical regulation effort. Although what ITS are supposed to do seems unambiguous, i.e. speeding up hazard and risk assessment while reducing testing costs, not much has been said, beyond basic conceptual proposals, about the methodologies that would allow execution of these concepts. Although a pressing concern, the topic of ITS has drawn mostly general reviews, broad concepts, and the expression of a clear need for more research; published research in the field remains scarce. Solutions for ITS design emerge slowly, most likely due to the methodological challenges of the task, and perhaps also to its complexity and the need for multidisciplinary collaboration. Along with the challenge, ITS offer a unique opportunity to contribute to the toxicology of the 21st century by providing frameworks and tools to actually implement 21st-century toxicology data in chemical management and decision-making processes. Further, ITS have the potential to contribute significantly to a modernization of the science of risk assessment. Therefore, to advance ITS research we propose a methodical approach to their design and discuss currently available approaches as well as challenges to overcome. To this end, we define a framework for ITS that will inform toxicological decisions in a systematic, transparent, and consistent way. We review conceptual requirements for ITS developed earlier and present a roadmap to an operational framework that should be probabilistic, hypothesis-driven, and adaptive. Furthermore, we define the properties an ITS should have in order to meet the identified requirements and differentiate ITS from evidence synthesis. Using an ITS for skin sensitization, we demonstrate how the proposed concepts can be implemented.
ALTEX: Alternativen zu Tierexperimenten 01/2010; 27(4):231-42. · 4.09 Impact Factor
ABSTRACT: A systematic expert-driven process is presented for evaluating analogs for read across in SAR (structure activity relationship) toxicological assessments. The approach involves categorizing potential analogs based upon their degree of structural, reactivity, metabolic and physicochemical similarity to the chemical with missing toxicological data (target chemical). It extends beyond structural similarity, and includes differentiation based upon chemical reactivity and addresses the potential that an analog and target could show toxicologically significant metabolic convergence or divergence. In addition, it identifies differences in physicochemical properties, which could affect bioavailability and consequently biological responses observed in vitro or in vivo. The approach provides a stepwise decision tree for categorizing the suitability of analogs, which qualitatively characterizes the strength of the evidence supporting the hypothesis of similarity and level of uncertainty associated with their use for read across. The result is a comprehensive framework to apply chemical, biochemical and toxicological principles in a systematic manner to identify and evaluate factors that can introduce uncertainty into SAR assessments, while maximizing the appropriate use of all available data.
Regulatory Toxicology and Pharmacology 09/2009; 56(1):67-81. · 2.13 Impact Factor
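The stepwise spirit of such a decision tree can be sketched in code, with the caveat that the published scheme is an expert-driven qualitative process, not a numeric classifier; the category labels and thresholds below are invented for illustration.

```python
# Illustrative only: a stepwise analog-suitability check in the spirit of
# the decision tree above. Labels and thresholds are invented.

from dataclasses import dataclass

@dataclass
class Analog:
    structural_similarity: float   # 0..1, e.g. a fingerprint similarity
    same_reactivity_domain: bool
    metabolism_diverges: bool
    logp_delta: float              # |logP(target) - logP(analog)|

def categorize(a: Analog) -> str:
    if a.structural_similarity < 0.6:
        return "unsuitable"
    if not a.same_reactivity_domain or a.metabolism_diverges:
        return "suitable with major uncertainty"
    if a.logp_delta > 2.0:
        return "suitable with interpretation"  # bioavailability may differ
    return "suitable"

print(categorize(Analog(0.85, True, False, 0.4)))  # -> suitable
```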
ABSTRACT: The use of Integrated Testing Strategies (ITS) in toxicological hazard identification and characterisation is becoming increasingly common as a method for enabling the integration of diverse types of toxicology data. At present, there are no existing procedures and guidelines for the construction and validation of ITS, so a joint EPAA WG5-ECVAM workshop was held with the following objectives: a) to investigate the role of ITS and the need for validation of ITS in the different industry sectors (pharmaceuticals, cosmetics, chemicals); b) to formulate a common definition of ITS applicable across different sectors; c) to explore how and when Three Rs methods are used within ITS; and d) to propose a validation rationale for ITS and for alternative methods that are foreseen to be used within ITS. The EPAA provided a platform for comparing experiences with ITS across different industry sectors. It became clear that every ITS has to be adapted to the product type, R&D stage, and regulatory context. However, common features of ITS were also identified, and this permitted the formulation of a general definition of ITS in a regulatory context. The definition served as a basis for discussing the needs, rationale and process of formal ITS validation. One of the main conclusions was that a formal validation should not be required, unless the strategy will serve as full replacement of an in vivo study used for regulatory purposes. Finally, several challenges and bottlenecks to the ITS validation were identified, and it was agreed that a roadmap on how to address these barriers would be established by the EPAA partners.
ABSTRACT: A chemical category is a regulatory concept that facilitates the filling of safety data gaps. Practically all chemical management programs, such as the OECD HPV Program, EU REACH, and the Canadian DSL Categorization, are planning to use or are already using categorization approaches to reduce resource demands, including animal testing. The aim of the study was to discuss the feasibility of applying computational structural-similarity methods to augment the formation of a category. The article also discusses how this understanding can be translated into a computer-readable format, an ultimate need for practical, broad-scope applications. We conclude that for the skin sensitization endpoint, used as a working example, mechanistic understanding expressed as chemical reactivity can be exploited by computational structural-similarity methods to augment the category formation process. We propose a novel method, atom environments ranking (AER), which assesses similarity to a reference training set representing a common mechanism of action, as a potential method for grouping chemicals into reactivity domains.
SAR and QSAR in Environmental Research 01/2007; 18(3-4):195-207. · 1.67 Impact Factor
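The AER implementation itself is not reproduced here; as a hedged stand-in, RDKit's circular (Morgan) fingerprints, which likewise encode per-atom environments, can rank a candidate against a reference set sharing a mechanism of action. The reference SMILES below are illustrative electrophiles, not the paper's training set.

```python
# A stand-in for the atom-environment idea, not the AER method itself:
# Morgan fingerprints also encode per-atom circular environments.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

reference_smiles = ["O=C1OC(=O)c2ccccc12", "O=C(C=C)OC"]  # illustrative electrophiles
candidate = Chem.MolFromSmiles("O=C(C=C)OCC")             # illustrative candidate

def fp(mol):
    return AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=2048)

scores = [DataStructs.TanimotoSimilarity(fp(candidate), fp(Chem.MolFromSmiles(s)))
          for s in reference_smiles]
print(max(scores))  # nearest-neighbour similarity to the reactivity domain
```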
ABSTRACT: A novel mechanistic modeling approach has been developed that assesses chemical biodegradability in a quantitative manner. It is an expert system that predicts the biotransformation pathway, working together with a probabilistic model that calculates the probabilities of the individual transformations. The expert system contains a library of hierarchically ordered individual transformations and a matching substructure engine. The hierarchy in the expert system was set according to the descending order of the individual transformation probabilities. The integrated principal catabolic steps are derived from the set of metabolic pathways predicted for each chemical in the training set and encompass more than one real biodegradation step, to improve the speed of predictions. In the current work, we modeled the O2 yield during the OECD 302 C (MITI I) test, using the MITI-I database of 532 chemicals as the training set. To make biodegradability predictions, the model needs only the structure of a chemical; the output is given as a percentage of the theoretical biological oxygen demand (BOD). The model also allows potentially persistent catabolic intermediates and their molar amounts to be identified. The data in the training set agreed well with the calculated BODs (r² = 0.90) over the entire range, i.e. a good fit was observed for readily, intermediately and poorly degradable chemicals. After introducing 60% ThOD as a cut-off value, the model correctly predicted 98% of ready-biodegradable structures and 96% of not-ready-biodegradable structures. Cross-validation, leaving out 25% of the data four times, gave Q² = 0.88 between observed and predicted values. The presented approach and results were used to develop CATABOL, a computer program for biodegradability prediction.
SAR and QSAR in Environmental Research 04/2002; 13(2):307-23. · 1.67 Impact Factor
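The two summary statistics quoted above can be reproduced on toy data: r² between observed and predicted %ThOD, and the agreement of the 60% ThOD ready-biodegradability cut-off. All numbers below are invented, not from the MITI-I training set.

```python
import numpy as np

# Invented toy data standing in for observed vs. predicted %ThOD values.
observed  = np.array([85.0, 72.0, 10.0, 55.0, 90.0, 5.0])
predicted = np.array([80.0, 70.0, 15.0, 48.0, 88.0, 9.0])

# Coefficient of determination between observed and predicted values.
ss_res = np.sum((observed - predicted)**2)
ss_tot = np.sum((observed - observed.mean())**2)
r2 = 1.0 - ss_res / ss_tot

# Binary ready-biodegradability call at the 60% ThOD cut-off.
ready_obs, ready_pred = observed >= 60.0, predicted >= 60.0
accuracy = np.mean(ready_obs == ready_pred)
print(f"r2 = {r2:.2f}, cut-off agreement = {accuracy:.0%}")
```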