FIG 1
Systems Toxicology draws on multiple disciplines and integrates them across all levels of biological organization to derive a detailed mechanistic understanding of toxicity. This understanding can then be used to predict adverse outcomes and contribute to risk assessment for all applications of chemicals. Used with permission from Sturla et al. (2014). Artwork by Samantha J. Elmhurst (www.livingart.org.uk).


Source publication
Article
Full-text available
FutureTox II, a Society of Toxicology Contemporary Concepts in Toxicology workshop, was held in January 2014. The meeting goals were to review and discuss the state of the science in toxicology in the context of implementing the NRC 21st century vision of predicting in vivo responses from in vitro and in silico data, and to define the goals for th...

Context in source publication

Context 1
... 2007, the National Research Council (NRC) published Toxicity Testing in the 21st Century: A Vision and a Strategy. Hazard-based values for regulatory toxicology are traditionally estimated from laboratory animal studies. However, because of the large number of chemicals in commerce with little or no toxicity information, the NRC report highlighted the need for high-throughput screening (HTS) technologies to identify chemically induced biological activity in human cells and cell lines and to develop predictive models of in vivo biological response for targeted testing. In the 7 years since publication of the NRC report, the implementation of the 21st century vision and the implications of this new strategy for basic science and regulatory decision-making have been extensively discussed and debated (Andersen and Krewski, 2010; Boekelheide and Campion, 2010; Bus and Becker, 2009; Chapin and Stedman, 2009; Kavlock et al., 2012; MacDonald and Robertson, 2009; Meek and Doull, 2009; NRC 2007a; Sturla et al., 2014; Walker and Bucher, 2009). The toxicology community has made significant progress developing assays and tools that will help achieve the predictive toxicology goals outlined by the NRC in 2007. This raises the important question of how in vitro data and in silico models can be used to understand and predict in vivo toxicity. This question was the central focus of a Society of Toxicology (SOT) Contemporary Concepts in Toxicology (CCT) Workshop held in January 2014 in Chapel Hill, North Carolina. FutureTox II, the second in the FutureTox series (Rowlands et al., 2014), was attended in person and via webcast by more than 400 scientists from governmental and regulatory agencies, research institutes, academia, and the chemical and pharmaceutical industries in Europe, Canada, and the United States. The agenda for FutureTox II can be found at https://www.toxicology.org/ai/meet/cct_futureToxII.asp#program (accessed October 22, 2014). Workshop participants reviewed and discussed the state of the science in toxicology and human risk and exposure assessment and attempted to define collective goals for the future. This article reports on the key issues covered in FutureTox II with regard to the state of the science and the challenges to implementing the new testing paradigm for understanding toxicological risks. Many efforts have been initiated toward developing new assays for toxicity testing and models to integrate large datasets into an emerging risk assessment framework. A major challenge to progress is the complex nature of biological systems. Reducing the complexity of test systems to simpler cell-based systems and small model organisms (such as Caenorhabditis elegans and zebrafish) enables the application of higher-throughput testing strategies, but in so doing may lose many of the systems-level characteristics that make a human toxicological response complex. Programs such as Tox21, ToxCast™, SEURAT-1 (and other EU-sponsored efforts), and NIH-NCATS-FDA-sponsored work on microphysiological systems have made significant contributions toward implementation of approaches that are scalable to the problem, but much remains to be accomplished. Mathematical modeling of complex integrated biological systems remains somewhat of a bottleneck.
Biological models of metabolism, pharmacokinetics, and risk estimation have been prominent in toxicology for a few decades; however, new graphical and analytical tools and methods are needed in order to “decode the toxicological blueprint of active substances that interact with living systems” (Sturla et al., 2014). Ultimately, the goal is a detailed mechanistic, quantitative understanding of toxicological processes—from a specific molecular initiating event to a specific biological outcome (Ankley et al., 2010), in the context of an integrated biological system. A number of factors drive society’s need for an accurate and efficient paradigm for predictive toxicology (Figure 1). Although some of these drivers are specific to a region of the world, others are relevant worldwide and can only be addressed through international cooperation and harmonization. In the European Union (EU), Directive 2010/63/EU legislates an end to “the use of animals in toxicology testing and biomedical research as soon as scientifically feasible to do so.” Also in the EU, the “REACH” regulation restricts animal testing to use only “as a last resort to satisfy registration information requirements,” whereas Regulation 1223/2009/EU legislates a comprehensive ban on marketing within the EU any cosmetic product (or ingredient thereof) that has been tested on animals since March 2013. This means that, in the EU, effective in vitro/in silico tools for predictive toxicology are an urgent priority. Predictive toxicology will also play an important role in satisfying other societal and legislative demands: these include the need to protect human health and the environment from (1) endocrine-disrupting chemicals and (2) the effects of chemical mixtures. The US FDA endorses the effort “to reduce animal testing [and] to work towards replacement of animal testing” as a basis for regulatory action. This policy may reflect concern that predictive models based on animal testing fail to account for the rate of reported adverse events and lethal adverse events among pharmaceutical drug users in the USA. Notably, the frequency of reported adverse events increased steadily from 1998 to 2005 (Moore et al., 2007). This suggests that current test methods may neglect factors that contribute to “false negative” predictions, such as polypharmacology; age-, gender-, or ethnicity-linked drug susceptibility; the genetic diversity of human populations; and failure to independently verify study outcomes. Alternatively, some animal models may simply be inadequate for extrapolation to humans. Looking to overcome these problems, one must ask ‘what types of models need to be developed, and how will human genetic diversity be incorporated into the models?’ FutureTox II explored a broad range of current toxicological research to address those and other questions. Priority concerns and emerging areas in the field of predictive toxicology are summarized in Box 1. Priority concerns include, for example, predicting, modeling, or experimentally evaluating the role of metabolism in toxicological outcomes; modeling chemical mixtures; understanding the controls of cell growth and differentiation; identifying and characterizing human subpopulations susceptible to specific adverse outcomes; and developing tools for integrating multiple types of data from diverse experimental systems into a unified risk assessment paradigm.
Emerging trends in technologies include, for example, increased use of induced pluripotent stem cell (iPSC)-derived human cells; development of defined heterotypic cell and three-dimensional (3D) organotypic culture models; engineered micro-scale physiological systems; mathematical modeling of cellular processes and morphogenesis; Adverse Outcome Pathways (AOPs) as a regulatory tool; and high-content imaging of in vivo systems and small model organisms. In addressing these key issues, one must ask: “how do currently available in vitro/in silico methodologies compare to in vivo methods—are they as good or superior, and will their predictive accuracy eventually be high enough to obviate the need for in vivo models?” The charge presented by conference organizers to FutureTox II participants included the following 3 goals: (1) to address progress and advances toward a paradigm where improvements to predictivity and concordance are based on in vitro/in silico approaches; (2) to provide a forum to integrate newer in vitro methodologies and computational (in silico) modeling approaches with advances in systems biology; and (3) to clarify the usefulness and validity of new and emerging technologies and approaches, so that expectations can be managed in both the regulatory and regulated scientific communities. Suggestions in considering the state of the science since the NRC 2007 report and insight into future technologies led to recommendations for future activities to achieve the goals projected in that report. The following synopsizes these main themes, an analysis of the progress, significant Key Workshop Discussions: Challenges and Potential ...
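The AOP concept referenced above, in which a molecular initiating event is linked through intermediate key events to an adverse outcome, can be illustrated with a minimal computational sketch. The example below is purely hypothetical: the event names, thresholds, and observed activation values are invented for illustration and do not come from any specific AOP or dataset; it simply shows the quantitative idea that every upstream event must be activated sufficiently for the adverse outcome to be predicted.

# Minimal sketch: an Adverse Outcome Pathway (AOP) as a directed chain of events,
# each with a hypothetical activation threshold. All names and numbers are illustrative.
AOP = [
    ("molecular_initiating_event", 0.30),  # e.g., receptor binding above 30% of max
    ("key_event_1",                0.50),  # e.g., altered gene expression
    ("key_event_2",                0.40),  # e.g., cellular stress response
    ("adverse_outcome",            0.25),  # e.g., tissue-level injury
]

def predict_adverse_outcome(measured_activation):
    """Return True only if every event along the pathway exceeds its threshold.

    measured_activation: dict mapping event name -> observed activation (0..1),
    e.g. from in vitro assays mapped onto the pathway.
    """
    for event, threshold in AOP:
        if measured_activation.get(event, 0.0) < threshold:
            return False  # pathway interrupted; adverse outcome not predicted
    return True

# Hypothetical in vitro readout for one chemical at one concentration:
observation = {"molecular_initiating_event": 0.8, "key_event_1": 0.6,
               "key_event_2": 0.55, "adverse_outcome": 0.3}
print(predict_adverse_outcome(observation))  # -> True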

Similar publications

Article
Full-text available
The adverse outcome pathway (AOP) is a conceptual construct that facilitates organisation and interpretation of mechanistic data representing multiple biological levels and deriving from a range of methodological approaches including in silico, in vitro and in vivo assays. AOPs are playing an increasingly important role in the chemical safety asses...
Article
Full-text available
The prevalence of hormone-related health issues caused by exposure to endocrine-disrupting chemicals (EDCs) is a significant, and increasing, societal challenge. Declining fertility rates, together with rising incidence rates of reproductive disorders and other endocrine-related diseases, underscore the urgency of taking more action. Addressing the...

Citations

... 14 that while MIEs/KEs are required at a qualitative level, they must be activated to a sufficient level and duration to cause an adverse outcome (Conolly et al., 2017). Computationally derived quantitative effect levels, or “molecular tipping points,” can be used as tools for adversity determinations using shorter-term data (Julien et al., 2009; Knudsen et al., 2015). Using biomarker TALs that were derived a number of ways, we found that across 163 chemicals examined at multiple time points, the NAM had predictive accuracies of 96%-97% (Lewis et al., 2020). ...
Article
Full-text available
Current methods for cancer risk assessment are resource-intensive and not feasible for most of the thousands of untested chemicals. In earlier studies, we developed a new approach methodology (NAM) to identify liver tumorigens using gene expression biomarkers and associated tumorigenic activation levels (TALs) after short-term exposures in rats. The biomarkers are used to predict the six most common rodent liver cancer molecular initiating events. In the present study, we wished to confirm that our approach could be used to identify liver tumorigens at only one time point/dose and to determine whether the approach could be applied to (targeted) RNA-Seq analyses. Male rats were exposed for 4 days by daily gavage to 15 chemicals at doses with known chronic outcomes, and liver transcript profiles were generated using Affymetrix arrays. Our approach had 75% or 85% predictive accuracy using TALs derived from the TG-GATES or DrugMatrix studies, respectively. In a dataset generated from the livers of male rats exposed to 16 chemicals at up to 10 doses for 5 days, we found that our NAM coupled with targeted RNA-Seq (TempO-Seq) could be used to identify tumorigenic chemicals with predictive accuracies of up to 91%. Overall, these results demonstrate that our NAM can be applied to both microarray and (targeted) RNA-Seq data generated from short-term rat exposures to identify chemicals, their doses, and modes of action that would induce liver tumors, one of the most common endpoints in rodent bioassays.
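As a rough illustration of the classification logic this abstract describes (not the authors' actual code, biomarkers, or data), a chemical can be flagged as a putative liver tumorigen when a biomarker score exceeds its tumorigenic activation level (TAL), and predictive accuracy is then the fraction of chemicals whose call matches the known chronic outcome. All biomarker names, TAL values, and scores below are invented.

# Hypothetical sketch of biomarker/TAL-based tumorigen calls and predictive accuracy.
TALS = {"biomarker_A": 1.5, "biomarker_B": 2.0, "biomarker_C": 1.2}  # invented thresholds

def is_predicted_tumorigen(scores):
    # Call a chemical positive if any biomarker reaches its activation level.
    return any(scores[b] >= tal for b, tal in TALS.items())

# chemical -> (biomarker scores from a short-term exposure, known chronic liver outcome)
chemicals = {
    "chem_1": ({"biomarker_A": 2.1, "biomarker_B": 0.4, "biomarker_C": 0.9}, True),
    "chem_2": ({"biomarker_A": 0.2, "biomarker_B": 0.3, "biomarker_C": 0.5}, False),
    "chem_3": ({"biomarker_A": 1.0, "biomarker_B": 2.4, "biomarker_C": 0.8}, True),
    "chem_4": ({"biomarker_A": 0.9, "biomarker_B": 1.1, "biomarker_C": 0.7}, False),
}

correct = sum(is_predicted_tumorigen(scores) == outcome
              for scores, outcome in chemicals.values())
print(f"predictive accuracy: {correct / len(chemicals):.0%}")  # -> 100% on this toy set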
... These in vitro tests can help identify single pathways that may be impacted before using whole animal studies. However, because of the complexity of the processes that occur even in a single cell and the numerous pathway interactions that can either enhance or reduce effects, whole animal studies are, at this time, necessary to identify potential effects in humans (Juberg et al., 2017; Knudsen et al., 2021; Knudsen et al., 2015; Rowlands et al., 2014). ...
Article
Full-text available
Animals and animal models have been invaluable for our current understanding of human and animal biology, including physiology, pharmacology, biochemistry, and disease pathology. However, there are increasing concerns with continued use of animals in basic biomedical, pharmacological, and regulatory research to provide safety assessment for drugs and chemicals. There are concerns that animals do not provide sufficient information on toxicity and/or efficacy to protect the target population, so scientists are utilizing the principles of the 3Rs (replacement, reduction, and refinement) and increasing development and application of new approach methods (NAMs). NAMs are any technology, methodology, approach, or assay used to understand effects and mechanisms of drugs or chemicals with specific focus on applying the 3Rs. Although progress has been made in several areas with NAMs, complete replacement of animal models with NAMs is not yet attainable. The road to NAMs requires additional development, increased use, and for regulatory decision-making, usually formal validation. Moreover, it is likely that replacement of animal models with NAMs will require multiple assays to ensure sufficient biological coverage. The purpose of this manuscript is to provide a balanced view of the current state of use of animal models and NAMs as approaches to development, safety, efficacy, and toxicity testing of drugs and chemicals. Animals do not provide all needed information nor do NAMs, but each can elucidate key pieces of the puzzle of human and animal biology and contribute to the goal of protecting human and animal health.
... Conceived in 2012, the EDSP has received limited support from the EPA, with only 52 chemicals screened through its first tier of evaluative assays ('to identify chemicals with the potential to interact with oestrogen, androgen or thyroid receptors, or chemicals that alter steroidogenesis') and zero chemicals tested for endocrine disruption in its second tier of assays ('to evaluate endocrine-mediated adverse outcomes') over 25 years of the programme 35. In 2023, the EPA proposed the replacement of several EDSP screening assays with high-throughput screening methods to rapidly screen a large number of diverse chemical samples to identify candidates and predict adverse health outcomes 36,37. These high-throughput screening methods (for example, ToxCast and Tox21 (ref. ...
Article
Full-text available
Endocrine-disrupting chemicals (EDCs) are substances generated by human industrial activities that are detrimental to human health through their effects on the endocrine system. The global societal and economic burden posed by EDCs is substantial. Poorly defined or unenforced policies can increase human exposure to EDCs, thereby contributing to human disease, disability and economic damage. Researchers have shown that policies and interventions implemented at both individual and government levels have the potential to reduce exposure to EDCs. This Review describes a set of evidence-based policy actions to manage, minimize or even eliminate the widespread use of these chemicals and better protect human health and society. A number of specific challenges exist: defining, identifying and prioritizing EDCs; considering the non-linear or non-monotonic properties of EDCs; accounting for EDC exposure effects that are latent and do not appear until later in life; and updating testing paradigms to reflect 'real-world' mixtures of chemicals and cumulative exposure. A sound strategy also requires partnering with health-care providers to integrate strategies to prevent EDC exposure in clinical care. Critical next steps include addressing EDCs within global policy frameworks by integrating EDC exposure prevention into emerging climate policy.
... Although in recent decades the use of animals began to increase again, mainly due to the development of genetically modified animals [2], close to 96% of the first clinical trials for drug development fail despite the drugs having been successfully tested on animals during preclinical tests [5], and animal toxicity testing fails to predict toxicity in almost 50% of drugs in the pipeline between Phase I trials and early post-market withdrawals [6]. Current efforts focus on predictive toxicology, an approach that seeks to predict in vivo toxicity responses from in vitro data and in silico models in drug screening and chemical risk assessment [7]. Some alternatives have been extended due to improving prediction in data extrapolation, and one of these is physiologically based pharmacokinetic (PBPK) modeling, which allows a transition from descriptive to predictive toxicology. ...
Article
Introduction: Physiologically based pharmacokinetic (PBPK) modeling is a computational approach that simulates the anatomical structure of the studied species and presents the organs and tissues as compartments interconnected by arterial and venous blood flows. Aim: The aim of this systematic review was to analyze the published articles focused on the development of PBPK models for interspecies extrapolation in the disposition of drugs and health risk assessment, presenting this modeling as an alternative to reduce the use of animals. Methods: A systematic search was performed in PubMed using the following search terms: “PBPK” and “Interspecies extrapolation”. The review was performed according to PRISMA guidelines. Results: In the analysis of the articles, it was found that rats and mice are the most commonly used animal models in PBPK models; however, most of the physiological and physicochemical information used in the reviewed studies was obtained from previous publications. Additionally, most of the PBPK models were developed to extrapolate pharmacokinetic parameters to humans, and the main application of the models was toxicity testing. Conclusion: PBPK modeling is an alternative that allows the integration of in vitro and in silico data as well as parameters reported in the literature to predict the pharmacokinetics of chemical substances, greatly reducing the number of animals required in traditional studies.
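To make the compartmental idea concrete, the sketch below sets up a deliberately minimal flow-limited model with just two tissue compartments plus venous blood. This is a generic illustration of the PBPK structure described in the abstract, not any model from the reviewed articles; all parameter values (flows, volumes, partition coefficients, clearance) are invented placeholders.

# Minimal flow-limited PBPK sketch: liver + "rest of body" + venous blood.
# All parameters are illustrative placeholders (units: L, L/h, mg).
import numpy as np
from scipy.integrate import solve_ivp

Q_liv, Q_rest = 90.0, 260.0              # tissue blood flows (L/h)
V_liv, V_rest, V_ven = 1.8, 38.0, 5.0    # compartment volumes (L)
P_liv, P_rest = 4.0, 1.5                 # tissue:blood partition coefficients
CL_int = 30.0                            # hypothetical hepatic intrinsic clearance (L/h)

def pbpk(t, y):
    a_liv, a_rest, a_ven = y                               # amounts (mg)
    c_liv, c_rest, c_ven = a_liv / V_liv, a_rest / V_rest, a_ven / V_ven
    c_art = c_ven                                          # lung/arterial mixing ignored for brevity
    da_liv  = Q_liv  * (c_art - c_liv / P_liv) - CL_int * c_liv / P_liv
    da_rest = Q_rest * (c_art - c_rest / P_rest)
    da_ven  = Q_liv * c_liv / P_liv + Q_rest * c_rest / P_rest - (Q_liv + Q_rest) * c_ven
    return [da_liv, da_rest, da_ven]

# 100 mg intravenous bolus placed in venous blood at t = 0, simulated for 24 h
sol = solve_ivp(pbpk, (0.0, 24.0), [0.0, 0.0, 100.0], t_eval=np.linspace(0, 24, 25))
print(sol.y[2][-1])  # remaining amount in venous blood after 24 h

Interspecies extrapolation in such models amounts to swapping in species-specific flows, volumes, and partition coefficients while keeping the same equations, which is why the approach lends itself to reducing animal use.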
... For example, as discussed above, Chiu et al.'s Bayesian statistical approach to estimate the range of variability in response to TCE in genetically diverse strains of mice [91] may be applied in models assessing human response variability. Furthermore, as scientists establish more representative models (e.g., tissue/organ bioengineered models) of human development for toxicological investigations as well as more sophisticated statistical approaches to model these data more accurately [122], we expect that NAMs will prove to be more useful at predicting the genetic component of human variability. ...
Article
Full-text available
A key element of risk assessment is accounting for the full range of variability in response to environmental exposures. Default dose-response methods typically assume a 10-fold difference in response to chemical exposures between average (healthy) and susceptible humans, despite evidence of wider variability. Experts and authoritative bodies support using advanced techniques to better account for human variability due to factors such as in utero or early life exposure and exposure to multiple environmental, social, and economic stressors. This review describes: 1) sources of human variability and susceptibility in dose-response assessment; 2) existing US frameworks for addressing response variability in risk assessment; 3) key scientific inadequacies necessitating updated methods; 4) improved approaches and opportunities for better use of science; and 5) specific and quantitative recommendations to address evidence and policy needs. Current default adjustment factors do not sufficiently capture human variability in dose-response and thus are inadequate to protect the entire population. Susceptible groups are not appropriately protected under current regulatory guidelines. Emerging tools and data sources that better account for human variability and susceptibility include probabilistic methods, genetically diverse in vivo and in vitro models, and the use of human data to capture underlying risk and/or assess combined effects from chemical and non-chemical stressors. We recommend using updated methods and data to improve consideration of human variability and susceptibility in risk assessment, including the use of increased default human variability factors and separate adjustment factors for capturing age/life stage of development and exposure to multiple chemical and non-chemical stressors. Updated methods would result in greater transparency and protection for susceptible groups, including children, infants, people who are pregnant or nursing, people with disabilities, and those burdened by additional environmental exposures and/or social factors such as poverty and racism.
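The contrast between a fixed 10-fold intraspecies factor and a probabilistic treatment can be illustrated with a small calculation. The sketch below assumes, purely for illustration, that inter-individual sensitivity follows a lognormal distribution (an assumption used in some probabilistic frameworks, not a claim about this review's methods) and asks what factor is needed to cover the 99th percentile of a population for a few hypothetical geometric standard deviations.

# Illustrative comparison of a fixed 10x human-variability factor with a
# probabilistic factor derived from an assumed lognormal sensitivity distribution.
from scipy.stats import norm

DEFAULT_FACTOR = 10.0  # traditional intraspecies uncertainty factor

def variability_factor(gsd, percentile=0.99):
    """Factor by which a dose protective of the median person must be divided to also
    protect the given population percentile, assuming lognormal sensitivity."""
    z = norm.ppf(percentile)
    return gsd ** z

for gsd in (2.0, 3.0, 4.0):  # hypothetical geometric standard deviations
    f = variability_factor(gsd)
    print(f"GSD={gsd}: factor to cover 99th percentile = {f:.1f} "
          f"({'within' if f <= DEFAULT_FACTOR else 'exceeds'} the default 10x)")

With a geometric standard deviation of about 3 or more, the computed factor exceeds 10, which is one way to see the review's point that the default adjustment may not protect the full population.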
... Among the aforementioned strategies and approaches, it is clear that to overcome the lack of data, improve existing techniques, and enhance drug design, we must consider feasibility and rely on the combination of the different and diversified methods (e.g., in silico and in vitro) available. While some pipelines may rely solely on in vitro [194][195][196] or in silico [197][198][199] approaches, combining computational predictions with biological evaluation could provide better outcomes. For example, studies of cruzain, a cysteine protease of Trypanosoma cruzi, have combined different theoretical, predictive, and biological assessment approaches with VS, yielding sets of various enzymatic inhibitors over the past decade [200][201][202][203], with the potential for computational simulation results to reach the clinic [204]. ...
Article
Introduction: Modern drug discovery generally draws on useful information from existing large databases or on newly generated data. The lack of biological and/or chemical data tends to slow the development of scientific research and innovation. Here, approaches published within the last five years that may help generate or obtain enough relevant data, or improve and accelerate existing methods, are reviewed. Areas covered: One-shot learning (OSL) approaches, structural modeling, molecular docking, scoring function space (SFS), molecular dynamics (MD), and quantum mechanics (QM) may be used to amplify the amount of data available to drug design and discovery campaigns; the methods, their perspectives, and discussion of their use in the near future are presented. Expert opinion: Recent works have successfully used these techniques to solve a range of issues in the face of data scarcity, including complex problems such as the challenging scenario of drug design aimed at intrinsically disordered proteins and the evaluation of potential adverse effects in a clinical scenario. These examples show that it is possible to improve and kickstart research from scarce available data to design and discover new potential drugs.
... Many other works focused on the use of computer models for toxicity prediction and on how to identify the knowledge gaps in this field [36][37][38][39]. Adopting innovative approaches for the prediction and extrapolation of time-dependent findings, such as time series analysis models, could also be considered as a next step for this work; however, expanding the database and analyzing more studies using the developed workflow would be needed prior to the application of such predictive approaches. ...
Preprint
In-vivo toxicity assessment is an important step prior to clinical development and is still the main source of data for the overall risk assessment of a new chemical entity (NCE). All in-vivo studies are performed according to regulatory requirements, and many efforts have been exerted to minimize these studies in accordance with the 3Rs principle (Replacement, Reduction and Refinement). Many aspects of in-vivo toxicology packages can be optimized to reduce animal use, including the number of studies performed as well as study durations, which is the main focus of this analysis. We performed a statistical comparison of adverse findings observed in 116 short-term versus 78 long-term studies in order to explore the possibility of using only short-term studies as a prediction tool for longer-term effects. Annotation of treatment-related findings was one of the challenges faced during this work. A specific focus was therefore put on the summary and conclusion sections of the reports, since they contain expert assessments of whether the findings were considered adverse or were attributed to other reasons. Our analysis showed generally good concordance between short-term and long-term toxicity findings for large molecules and for the majority of small molecules. Less concordance was seen for findings in certain target organ systems. While this work supports the minimization of in-vivo study durations, a larger-scale effort would be needed to provide more evidence. We therefore present the steps performed in this study as an open-source R workflow (CSL-Tox), and we provide the dataset used in the work to allow researchers to reproduce such analyses and to promote large-scale application of this study.
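The concordance comparison described here reduces to simple agreement counting. The sketch below is a generic illustration, not the CSL-Tox workflow or its dataset: it computes per-organ agreement between adverse calls from short-term and long-term studies of the same compounds, with all compound and organ names invented.

# Generic sketch of short-term vs long-term concordance per target organ system.
# (Illustrative data only; not the CSL-Tox dataset.)
from collections import defaultdict

# (compound, organ) -> adverse finding observed? for each study duration
short_term = {("cmpd_1", "liver"): True,  ("cmpd_1", "kidney"): False,
              ("cmpd_2", "liver"): False, ("cmpd_2", "kidney"): True}
long_term  = {("cmpd_1", "liver"): True,  ("cmpd_1", "kidney"): True,
              ("cmpd_2", "liver"): False, ("cmpd_2", "kidney"): True}

agree, total = defaultdict(int), defaultdict(int)
for key, short_call in short_term.items():
    organ = key[1]
    total[organ] += 1
    agree[organ] += (short_call == long_term[key])

for organ in total:
    print(f"{organ}: concordance {agree[organ] / total[organ]:.0%}")
# liver: 100%, kidney: 50% on this toy example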
... Hence, extension to predicting the toxicity of RS is now made regularly [14]. Predictive toxicology approaches are now supporting in-silico absorption, distribution, metabolism, and elimination evaluations [15]. The goal of the ADMET predictions is to estimate the in-vivo kinetic behavior and toxicity of the compounds in the early stages of drug development [16]. ...
Article
Pharmaceutical drug analysis (PDA), besides quantifying drugs and related substances (RS), can enrich drug discovery (DD) by suggesting new leads. PDA may be extended towards comparative in-silico predictions for drugs and RS. This may lead to the assessment of the drug-likeness of nontoxic RS as an incentive to study them further. This review provides an overview of the in-silico profiling of drugs and RS as an innovative extension of PDA's scope. It may help extend classical PDA with in-silico determinations to widen its horizon from regulatory toxicology evaluation towards drug discovery. The virtual screening of selected RS may indicate them as potential DD leads. The extension of impurity profiling, after toxicity evaluation, to ADME estimation, QSAR studies, molecular docking, and bioactivity prediction has widened the purposes of drug analysis. RS that are predicted to have low toxicity may be exploited in silico for their therapeutic potential prior to entering the DD path.
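A minimal example of the kind of in-silico profiling the review describes (computing simple drug-likeness descriptors for a related substance) is sketched below, assuming the open-source RDKit toolkit is available. The SMILES string is a stand-in structure and the bare rule-of-five check is only illustrative, not a validated profiling workflow.

# Sketch of basic in-silico drug-likeness profiling for a hypothetical related substance,
# using RDKit descriptors and a simple Lipinski rule-of-five check.
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

smiles = "CC(=O)Oc1ccccc1C(=O)O"  # aspirin, used here only as a stand-in structure
mol = Chem.MolFromSmiles(smiles)

profile = {
    "mol_weight":  Descriptors.MolWt(mol),
    "logP":        Descriptors.MolLogP(mol),
    "h_donors":    Lipinski.NumHDonors(mol),
    "h_acceptors": Lipinski.NumHAcceptors(mol),
}

violations = sum([
    profile["mol_weight"] > 500,
    profile["logP"] > 5,
    profile["h_donors"] > 5,
    profile["h_acceptors"] > 10,
])
print(profile)
print(f"Lipinski violations: {violations} (0-1 is usually considered drug-like)")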
... In vitro studies on human cell lines can contribute to filling the gap between animal studies and deriving limits for human exposure, and therefore have been extensively used as they provide multiple E2-responsive endpoint determinants, including cell proliferation and gene expression [13,14]. The present in vitro study was designed to add to the body of ...
Article
Methyl paraben (MP) is an endocrine-disrupting compound that possesses estrogenic properties and contributes to an aberrant burden of estrogen signaling in the human breast, subsequently increasing the risk of developing breast cancer. The exact exposure, as well as the safe concentrations, varies among daily products. The present study addresses the effects of exposure to escalating concentrations of MP on the proliferation of MCF-7 breast cancer cells, in addition to exploring its other mechanisms of action. The study involved exposure of cultured MCF-7 breast cancer cells to seven MP concentrations, ranging from 40 to 800 µM, for 5 days. Cell viability, apoptosis, and proliferation were respectively assessed using the crystal violet test, flow cytometric analysis, and quantitative real-time polymerase chain reaction for Ki-67 expression. Estradiol (E2) secretion and oxidative stress were also assessed and analyzed in correlation with MP's proliferation and cytotoxicity potentials. The results showed that the maximum proliferative concentration of MP was 800 µM. At concentrations of 40 µM and higher, MP induced increased expression of Ki-67, denoting enhanced proliferation of the cells in monolayer culture. A positive correlation between the detrimental oxidative stress effect of MP's tested concentrations, cell proliferation, and viability was demonstrated (p < 0.05). Our results indicated that MP at high doses induced sustained cell proliferation due to E2 secretion as well as its antioxidant activity. Accordingly, it was concluded that high and unanticipated exposure to MP might carry a carcinogenic hazard for estrogen receptor-positive breast cancer cells. Keywords: cell proliferation, cell viability, estrogen hormone, Ki67, MCF-7 cells, methyl paraben, oxidative stress index.
... Several high-throughput screening (HTS) programs now exist [e.g., Tox21 (Attene-Ramos et al. 2013), ToxCast (Dix et al. 2007)], providing activity estimates for thousands of chemicals across hundreds of in vitro assays. Researchers have used ToxCast's in vitro data to model in vivo hazard (Knudsen et al. 2015), with many achieving robust models with >70% success for diverse end points such as rat reproductive toxicity, prenatal developmental toxicity (Sipes et al. 2011), and hepatotoxicity (Liu et al. 2015). In contrast, others have achieved poorer predictive success, hypothesized to result in part from missing mechanistic pathways not covered by ToxCast assays in the early-phase data releases (Schwarzman et al. 2015). ...
Article
Full-text available
Background: Research suggests environmental contaminants can impact metabolic health; however, high costs prohibit in vivo screening of putative metabolic disruptors. High-throughput screening programs, such as ToxCast, hold promise to reduce testing gaps and prioritize higher-order (in vivo) testing. Objectives: We sought to a) examine the concordance of in vitro testing in 3T3-L1 cells to a targeted literature review for 38 semivolatile environmental chemicals, and b) assess the predictive utility of various expert models using ToxCast data against the set of 38 reference chemicals. Methods: Using a set of 38 chemicals with previously published results in 3T3-L1 cells, we performed a metabolism-targeted literature review to determine consensus activity determinations. To assess ToxCast predictive utility, we used two published ToxPi models: a) the 8-Slice model published by Janesick et al. (2016) and b) the 5-Slice model published by Auerbach et al. (2016). We examined the performance of the two models against the Janesick in vitro results and our own 38-chemical reference set. We further evaluated the predictive performance of various modifications to these models using cytotoxicity filtering approaches and validated our best-performing model with new chemical testing in 3T3-L1 cells. Results: The literature review revealed relevant publications for 30 out of the 38 chemicals (the remaining 8 chemicals were only examined in our previous 3T3-L1 testing). We observed a balanced accuracy (average of sensitivity and specificity) of 0.86 comparing our previous in vitro results to the literature-derived calls. ToxPi models provided balanced accuracies ranging from 0.55 to 0.88, depending on the model specifications and reference set. Validation chemical testing correctly predicted 29 of 30 chemicals as per 3T3-L1 testing, suggesting good adipogenic prediction performance for our best adapted model. Discussion: Using the most recent ToxCast data and an updated ToxPi model, we found ToxCast performed similarly to that of our own 3T3-L1 testing in predicting consensus calls. Furthermore, we provide the full ranked list of largely untested chemicals with ToxPi scores that predict adipogenic activity and that require further investigation. https://doi.org/10.1289/EHP6779.
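The balanced-accuracy metric used in this study, defined in the abstract as the average of sensitivity and specificity, is straightforward to compute. The sketch below uses invented reference and predicted calls purely to show the arithmetic; it is not the authors' data or their ToxPi model.

# Balanced accuracy = (sensitivity + specificity) / 2, shown on invented example calls.
reference = [True, True, True, False, False, False, True, False]   # consensus activity calls
predicted = [True, True, False, False, False, True, True, False]   # e.g., ToxPi-based calls

tp = sum(p and r         for p, r in zip(predicted, reference))
tn = sum(not p and not r for p, r in zip(predicted, reference))
fp = sum(p and not r     for p, r in zip(predicted, reference))
fn = sum(not p and r     for p, r in zip(predicted, reference))

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
balanced_accuracy = (sensitivity + specificity) / 2
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, "
      f"balanced accuracy={balanced_accuracy:.2f}")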