Article

Sloppy Models, Parameter Uncertainty, and the Role of Experimental Design

Department of Biological Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA.
Molecular BioSystems (Impact Factor: 3.21). 10/2010; 6(10):1890-900. DOI: 10.1039/b918098b
Source: PubMed

ABSTRACT

Computational models are increasingly used to understand and predict complex biological phenomena. These models contain many unknown parameters, at least some of which are difficult to measure directly and instead are estimated by fitting to time-course data. Previous work has suggested that, even with precise data sets, many parameters cannot be determined from trajectory measurements alone. We examined this question in the context of a pathway model of epidermal growth factor (EGF) and nerve growth factor (NGF) signaling. Computationally, we examined a palette of experimental perturbations that included different doses of EGF and NGF as well as single and multiple gene knockdowns and overexpressions. While no single experiment could accurately estimate all of the parameters, experimental design methodology identified a set of five complementary experiments that could. These results suggest grounds for optimism about calibrating even large models, that the success of parameter estimation is intimately linked to the experimental perturbations used, and that experimental design methodology is important both for parameter fitting of biological models and for the accuracy that can be expected from them.
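The experiment-selection idea in the abstract can be illustrated with a Fisher-information-based design loop. The sketch below is not the authors' algorithm; it is a minimal greedy selection under a D-optimality criterion, assuming a hypothetical dictionary `candidate_jacobians` that maps each candidate perturbation (a dose combination, knockdown, or overexpression) to its sensitivity matrix of observables with respect to parameters.

```python
import numpy as np

def fim(jacobian, sigma=1.0):
    """Fisher information matrix for one experiment, assuming independent
    Gaussian measurement noise with standard deviation sigma."""
    return jacobian.T @ jacobian / sigma**2

def greedy_design(candidate_jacobians, n_experiments=5, ridge=1e-9):
    """Greedily pick the experiments that maximize the log-determinant of
    the accumulated FIM (D-optimality). `candidate_jacobians` maps an
    experiment label to its (n_observations x n_parameters) sensitivity
    matrix; both names are illustrative placeholders."""
    n_params = next(iter(candidate_jacobians.values())).shape[1]
    total = ridge * np.eye(n_params)  # small ridge: the initial FIM is singular
    chosen = []
    for _ in range(n_experiments):
        remaining = [e for e in candidate_jacobians if e not in chosen]
        best = max(remaining,
                   key=lambda e: np.linalg.slogdet(total + fim(candidate_jacobians[e]))[1])
        chosen.append(best)
        total += fim(candidate_jacobians[best])
    return chosen, total
```

After the loop, the eigenvalue spectrum of the accumulated FIM indicates which parameter combinations remain poorly constrained by the chosen set; for the EGF/NGF model the sensitivity matrices would come from integrating the pathway model together with its parameter sensitivities.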

  • Source
    • "). We point out that this is the premise invoked also in the context of Sloppy Models [35] [36] [37], whose behavior depends only on a few stiff combinations of parameters (accounted here by W and y), with many sloppy parameter directions largely unimportant for model predictions (accounted here by η z ). We also note here the fundamental difference with PCA decompositions which attain the same form as Equation(9). "
    ABSTRACT: This paper is concerned with a lesser-studied problem in the context of model-based uncertainty quantification (UQ): that of optimization/design/control under uncertainty. The solution of such problems is hindered not only by the usual difficulties encountered in UQ tasks (e.g. the high computational cost of each forward simulation, the large number of random variables) but also by the need to solve a nonlinear optimization problem involving a large number of design variables and, potentially, constraints. We propose a framework that is suitable for a large class of such problems and is based on the idea of recasting them as probabilistic inference tasks. To that end, we propose a Variational Bayesian (VB) formulation and an iterative VB-Expectation-Maximization scheme that is also capable of identifying a low-dimensional set of directions in the design space along which the objective exhibits the largest sensitivity. We demonstrate the validity of the proposed approach in the context of two numerical examples involving $\mathcal{O}(10^3)$ random and design variables. In all cases considered, the cost of the computations in terms of calls to the forward model was of the order $\mathcal{O}(10^2)$. The accuracy of the approximations provided is assessed by appropriate information-theoretic metrics.
    Full-text · Article · Jul 2015
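The stiff/sloppy decomposition referred to in the excerpt above can be summarized with the standard sloppy-models picture; the notation below (C, m_k, y_k, σ_k, J, H, v_i, λ_i) is generic and not taken from the cited paper.

```latex
\begin{align*}
  C(\theta) &= \tfrac{1}{2}\sum_{k}\left(\frac{m_k(\theta)-y_k}{\sigma_k}\right)^{2}
  && \text{least-squares cost for parameters } \theta \\
  H &\approx J^{\top}J, \qquad
  J_{ki} = \frac{1}{\sigma_k}\,\frac{\partial m_k(\theta)}{\partial \theta_i}
  && \text{Gauss--Newton Hessian near the best fit} \\
  H\, v_i &= \lambda_i\, v_i
  && \text{stiff: large } \lambda_i,\ \text{sloppy: small } \lambda_i
\end{align*}
```

In sloppy models the eigenvalues λ_i typically span many orders of magnitude, roughly evenly on a log scale, so only a few stiff parameter combinations are well constrained by any one experiment.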
  • Source
    • "Experiments are used to test hypotheses, and the more complex a hypothesis, the more complicated and numerous the necessary experimental tests are likely to be. Models can potentially be used to carefully design experimental tests that would be optimal for supporting or disproving a hypothesis [37] [38] [39]. Second, models can also be used to reconcile surprising or conflicting data. "
    ABSTRACT: Models that capture the chemical kinetics of cellular regulatory networks can be specified in terms of rules for biomolecular interactions. A rule defines a generalized reaction, meaning a reaction that permits multiple reactants, each capable of participating in a characteristic transformation and each possessing certain, specified properties, which may be local, such as the state of a particular site or domain of a protein. In other words, a rule defines a transformation and the properties that reactants must possess to participate in the transformation. A rule also provides a rate law. A rule-based approach to modeling enables consideration of mechanistic details at the level of functional sites of biomolecules and provides a facile and visual means for constructing computational models, which can be analyzed to study how system-level behaviors emerge from component interactions.
    Full-text · Article · Jul 2015 · Physical Biology
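As a concrete illustration of the rule-based approach described in the abstract above, the sketch below encodes one rule as a small Python structure. The `Rule` class, the site names, and the rate constant are hypothetical illustrations in the spirit of BNGL-like rule languages, not the cited paper's implementation.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    """A rule: reactant patterns (the properties reactants must possess),
    a transformation, and a rate law. A pattern matches any species whose
    named sites are in the required states, so one rule stands for many
    concrete reactions."""
    name: str
    reactant_patterns: list  # site/state requirements per reactant
    transformation: str      # what changes when the rule fires
    rate_constant: float     # mass-action rate law parameter

# Hypothetical example: a ligand binds a receptor regardless of the
# receptor's phosphorylation state, so this single rule generates one
# concrete reaction for every phospho-form of the receptor.
bind = Rule(
    name="ligand_binding",
    reactant_patterns=[{"molecule": "L", "site": "r", "state": "unbound"},
                       {"molecule": "R", "site": "l", "state": "unbound"}],
    transformation="form bond L(r)-R(l)",
    rate_constant=1e6,  # per M per s, illustrative value only
)
```

Tools such as BioNetGen expand rules like this into the full reaction network, or simulate them directly without network generation.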
  • Source
    • "In the case of an integral approach, experimental design can give data coverage for many parameter directions and maximize predictive accuracy (Apgar et al. 2010), because large uncertainty parameter directions in an experiment can correspond to less uncertainty parameter direction in other experiments (Apgar et al. 2010). The effect of the multi-fitting complementary design is the constriction of parameters (Gutenkunst et al. 2007b) in sloppy multi-parameter models with few stiff parameters and many sloppy parameter directions (Daniels et al. 2008). "
    ABSTRACT: Modern researchers working on applied animal science systems face challenges in modelling huge quantities of data. Modelling approaches that used to serve biological systems well are struggling to adapt to the growing volume of publications and research. To develop new approaches capable of dealing with these fast-changing, complex conditions, it is worth reviewing modern modelling approaches that have been used successfully in other fields. This paper therefore reviews the potential of new integrated applied animal science approaches to discriminate parameters, interpret data, and understand biological processes. The analysis shows that the principal challenge is handling ill-conditioned complex models, but that an integrated approach can extract meaningful information from complementary data that present applied animal science approaches cannot. Furthermore, parameter sloppiness and data complementarity are shown to be key concepts for constraining system behavior and discriminating parameters. Model evaluation and implementation of the potential integrated approach are also reviewed, and the objective of an integral approach is discussed. Our conclusion is that these approaches have the potential to deepen the understanding of applied animal systems, and that sufficiently developed resources and methodologies already exist to handle the huge quantities of data associated with this science.
    Full-text · Article · Oct 2014 · Animal Production Science
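The complementarity argument quoted in the entry above has a simple information-theoretic reading (a standard fact, with generic notation not drawn from the cited papers): for independent experiments, Fisher information adds.

```latex
% E independent experiments with per-experiment Hessians H_e:
H_{\mathrm{total}} \;=\; \sum_{e=1}^{E} H_e ,
\qquad
v^{\top} H_{\mathrm{total}}\, v \;=\; \sum_{e=1}^{E} v^{\top} H_e\, v .
```

A parameter direction v is constrained as soon as any one experiment contributes curvature along it, which is why a set of complementary experiments, like the five identified by Apgar et al. (2010), can succeed where each experiment alone fails.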