ABSTRACT: Deleterious effects of prenatal tobacco smoking on fetal growth and newborn weight are well established. One of the proposed mechanisms underlying this relationship is alteration of epigenetic programming. We selected 506 newborns from a population-based prospective birth cohort in the Netherlands. Prenatal parental tobacco smoking was assessed using self-report questionnaires. Information on birth outcomes was obtained from medical records. The deoxyribonucleic acid (DNA) methylation of the growth genes IGF2DMR and H19 was measured in newborn umbilical cord white blood cells. Associations between parental tobacco smoking and DNA methylation were assessed using linear mixed models, adjusted for potential confounders.
Median (90 % range) DNA methylation levels of IGF2DMR and H19 were 54.0 % (44.6-62.0) and 30.0 % (25.5-34.0) in the non-smoking group, 52.2 % (44.5-61.1) and 30.8 % (27.1-34.1) in the first-trimester-only smoking group, and 51.6 % (43.9-61.3) and 30.2 % (23.7-34.8) in the continued smoking group, respectively. Continued prenatal maternal smoking was inversely associated with IGF2DMR methylation (β = -1.03, 95 % CI -1.76; -0.30) in a dose-dependent manner (P-trend = 0.030). This association seemed to be slightly more pronounced among newborn girls (β = -1.38, 95 % CI -2.63; -0.14) than boys (β = -0.72, 95 % CI -1.68; 0.24). H19 methylation was also inversely associated with continued smoking of <5 cigarettes/day (β = -0.96, 95 % CI -1.78; -0.14). Moreover, the association between maternal smoking and newborns being born small for gestational age seemed to be partially explained by IGF2DMR methylation (β = -0.095, 95 % CI -0.249; -0.018). Among non-smoking mothers, paternal tobacco smoking was not associated with IGF2DMR or H19 methylation.
Maternal smoking is inversely associated with IGF2DMR methylation in newborns, which may be one of the underlying mechanisms through which smoking affects fetal growth.
ABSTRACT: A fast and stable algorithm for estimating multidimensional adaptive P-spline models is presented. We call it Separation of Overlapping Penalties (SOP), as it extends the Separation of Anisotropic Penalties (SAP) algorithm. SAP was originally derived for estimating the smoothing parameters of a multidimensional tensor product P-spline model with anisotropic penalties.
30th International Workshop on Statistical Modelling; 07/2015
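The abstract above does not give the SOP iteration itself, but the penalized-spline machinery it builds on can be illustrated with a minimal sketch. The code below fits a Whittaker smoother, i.e. a P-spline with an identity basis and a difference penalty, for a single fixed smoothing parameter; the data, `lam` value, and function name are illustrative choices, not taken from the paper.

```python
import numpy as np

def whittaker_smooth(y, lam, d=2):
    """Penalized least squares: minimize ||y - z||^2 + lam * ||D z||^2,
    where D takes d-th order differences. This is a P-spline with an
    identity basis; a simplified stand-in for the multidimensional
    P-spline models that SOP/SAP address."""
    n = len(y)
    D = np.diff(np.eye(n), n=d, axis=0)          # d-th order difference matrix
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

# Toy example: noisy sine curve
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, 100)
z = whittaker_smooth(y, lam=10.0)
```

In the full SOP/SAP setting the single `lam` is replaced by several smoothing parameters, one per penalty, estimated from the data rather than fixed by hand.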
ABSTRACT: Understanding the genetic basis of plant development in potato requires a proper characterization of plant morphology over time. Parameters related to different aging stages can be used to describe the developmental processes. It is attractive to map these traits simultaneously in a QTL analysis, because the power to detect a QTL will often be improved and it will be easier to identify pleiotropic QTLs. We included complex, agronomic traits together with plant development parameters in a multi-trait QTL analysis. First, the results of our analysis led to coherent insight into the genetic architecture of complex traits in potato. Second, QTL for parameters related to plant development were identified. Third, pleiotropic regions for various types of traits were identified. Emergence, number of main stems, number of tubers and yield were explained by 9, 5, 4 and 6 QTL, respectively. These traits were measured once during the growing season. The genetic control of flowering, senescence and plant height, which were measured at regular time intervals, was explained by 9, 10 and 12 QTL, respectively. Genetic relationships between aboveground and belowground traits in potato were observed in 14 pleiotropic QTL. Some of our results suggest the presence of QTL-by-environment interactions. Therefore, additional studies comparing development under different photoperiods are required to investigate the plasticity of the crop.
ABSTRACT: One of the difficulties in modeling visual field (VF) data is the presence of sometimes large and correlated measurement errors in the point-wise sensitivity estimates. As these errors affect all locations of the same VF, we propose to model them as global visit effects (GVE). We evaluate this model and show the effect it has on progression estimation and prediction.
Visual field series (24-2 Full Threshold; 15 biannual VFs per patient) of 125 patients with primary glaucoma were included in the analysis. The contribution of the GVE was evaluated by comparing the fitting and predictive ability of a conventional model, which does not contain GVE, to such a model that incorporates the GVE. Moreover, the GVE's effect on the estimated slopes was evaluated by determining the absolute difference between the slopes of the models. Finally, the magnitude of the GVE was compared with that of other measurement errors.
The GVE model showed a significant improvement in both model fit and predictive ability over the conventional model, especially when the number of VFs in a series is limited. The average absolute difference in slopes between the models was 0.13 dB/y. Lastly, the magnitude of the GVE was more than three times larger than that of the other measurable factors combined.
By incorporating the GVE in the longitudinal modeling of VF data, better estimates may be obtained of the rate of progression as well as of predicted future sensitivities.
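The idea of a global visit effect can be sketched in a few lines: simulate VF series in which one shared perturbation per visit hits all locations, then compare per-location trend fits before and after removing an estimate of that shared effect. This is a deliberate caricature, not the article's mixed model; the dimensions (52 locations, 15 biannual visits) follow the abstract, while the estimator (detrended mean over locations) and all numeric values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_loc, n_vis = 52, 15                         # 52 VF locations, 15 biannual visits
t = np.arange(n_vis) * 0.5                    # follow-up time in years
gve = rng.normal(0.0, 1.0, n_vis)             # global visit effect, shared by all locations
y = 30.0 - 0.5 * t + gve + rng.normal(0.0, 1.0, (n_loc, n_vis))  # sensitivities (dB)

X = np.column_stack([np.ones(n_vis), t])

# Conventional model: independent linear trends per location, GVE ignored
B_conv = np.linalg.lstsq(X, y.T, rcond=None)[0]
rss_conv = ((y - (X @ B_conv).T) ** 2).sum()

# GVE-style model: estimate the shared visit effect as the detrended
# mean over locations, remove it, then refit the per-location trends
mean_series = y.mean(axis=0)
gve_hat = mean_series - X @ np.linalg.lstsq(X, mean_series, rcond=None)[0]
y_adj = y - gve_hat
B_gve = np.linalg.lstsq(X, y_adj.T, rcond=None)[0]
rss_gve = ((y_adj - (X @ B_gve).T) ** 2).sum()
```

Removing the shared visit effect shrinks the residual variance around each location's trend, which is the mechanism behind the improved fit and prediction reported above.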
[Show abstract][Hide abstract] ABSTRACT: In many settings of empirical interest, time variation in the distribution parameters is important for capturing the dynamic behaviour of time series processes. Although the fitting of heavy tail distributions has become easier due to computational advances, the joint and explicit modelling of time-varying conditional skewness and kurtosis is a challenging task. We propose a class of parameter-driven time series models referred to as the generalized structural time series (GEST) model. The GEST model extends Gaussian structural time series models by a) allowing the distribution of the dependent variable to come from any parametric distribution, including highly skewed and kurtotic distributions (and mixed distributions) and b) expanding the systematic part of parameter-driven time series models to allow the joint and explicit modelling of all the distribution parameters as structural terms and (smoothed) functions of independent variables. The paper makes an applied contribution in the development of a fast local estimation algorithm for the evaluation of a penalised likelihood function to update the distribution parameters over time without the need for evaluation of a high-dimensional integral based on simulation methods.
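The local, simulation-free updating of distribution parameters described above can be caricatured with a Gaussian example: each new observation updates a location and scale parameter through discount-weighted recursions. This toy is only in the spirit of GEST; the actual model family, link functions, and penalised-likelihood algorithm are in the paper, and the discount scheme here is an assumption for illustration.

```python
import numpy as np

def track_params(y, discount=0.95):
    """Recursive, discount-weighted updates of a Gaussian location and
    squared scale. Each observation updates the parameters directly,
    with no simulation-based integration (the GEST selling point);
    the Gaussian choice is illustrative only."""
    mu, s2, w = y[0], np.var(y) + 1e-9, 1.0
    mus = [mu]
    for t in range(1, len(y)):
        w = discount * w + 1.0                 # effective sample size
        lr = 1.0 / w
        err = y[t] - mu
        mu = mu + lr * err                     # location update
        s2 = (1.0 - lr) * s2 + lr * err ** 2   # squared-scale update
        mus.append(mu)
    return np.array(mus), np.sqrt(s2)

# A series whose mean shifts half-way through
rng = np.random.default_rng(4)
y = np.concatenate([rng.normal(0.0, 0.3, 200), rng.normal(5.0, 0.3, 200)])
mus, sigma = track_params(y)
```

A full GEST model would replace the Gaussian with any parametric distribution and let every distribution parameter carry its own structural terms and smooth covariate effects.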
ABSTRACT: The Bayesian approach has become increasingly popular because it allows quite complex models to be fitted via Markov chain Monte Carlo (MCMC) sampling. However, it is also recognized nowadays that MCMC sampling can become computationally prohibitive when a complex model needs to be fit to a large data set. To overcome this problem, we applied and extended a recently proposed two-stage approach to model a complex hierarchical data structure of glaucoma patients who participate in an ongoing Dutch study. Glaucoma is one of the leading causes of blindness in the world. In order to detect deterioration at an early stage, a model for predicting visual fields (VF) in time is needed. Hence, the true underlying VF progression can be determined, and treatment strategies can then be optimized to prevent further VF loss. Since we were unable to fit these data with the classical one-stage approach upon which the current popular Bayesian software is based, we made use of the two-stage Bayesian approach. The considered hierarchical longitudinal model involves estimating a large number of random effects and deals with censoring and high measurement variability. In addition, we extended the approach with tools for
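The two-stage principle itself is simple to sketch: fit each patient separately in stage one (cheap and parallelizable), then combine the patient-level estimates into a population summary in stage two. The sketch below uses per-patient least squares and inverse-variance weighting as a non-Bayesian, non-MCMC caricature of the approach; all numbers and the combination rule are illustrative assumptions, not the article's model.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pat, n_obs = 50, 12
t = np.arange(n_obs, dtype=float)
pop_slope = -0.4
slopes = pop_slope + rng.normal(0.0, 0.15, n_pat)       # patient-specific slopes
Y = 28.0 + slopes[:, None] * t + rng.normal(0.0, 1.0, (n_pat, n_obs))

# Stage 1: fit each patient separately
X = np.column_stack([np.ones(n_obs), t])
est, ses = [], []
for i in range(n_pat):
    beta, res, *_ = np.linalg.lstsq(X, Y[i], rcond=None)
    sigma2 = res[0] / (n_obs - 2)                       # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)
    est.append(beta[1])                                 # patient slope
    ses.append(np.sqrt(cov[1, 1]))                      # its standard error
est, ses = np.array(est), np.array(ses)

# Stage 2: combine patient-level estimates (inverse-variance weighting)
w = 1.0 / ses ** 2
pop_est = (w * est).sum() / w.sum()
```

In the Bayesian version, stage one produces per-patient posterior samples and stage two combines them into the hierarchical posterior, avoiding one enormous joint MCMC run over all patients at once.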
ABSTRACT: The beneficial health effects of fruits and vegetables have been attributed to their polyphenol content. These compounds undergo many bioconversions in the body. Modeling polyphenol exposure of humans upon intake is a prerequisite for understanding the modulating effect of the food matrix and the colonic microbiome. This modeling is not a trivial task and requires a careful integration of measuring techniques, modeling methods and experimental design. Moreover, polyphenol exposure has to be quantified and assessed at both the population and the individual level. We developed a strategy to quantify polyphenol exposure based on the concept of nutrikinetics in combination with population-based modeling. The key idea of the strategy is to derive nutrikinetic model parameters that summarize all information of the polyphenol exposure at both individual and population level. This is illustrated by a placebo-controlled crossover study in which an extract of wine/grapes and black tea solids was administered to twenty subjects. We show that urinary and plasma nutrikinetic time-response curves can be used for phenotyping the gut microbial bioconversion capacity of individuals. Each individual harbours an intrinsic microbiota composition that converts similar polyphenols from both test products in the same manner, stably over time. We demonstrate that this is a novel approach for associating the production of two gut-mediated γ-valerolactones to specific gut phylotypes. The large inter-individual variation in nutrikinetics and γ-valerolactones production indicated that gut microbial metabolism is an essential factor in polyphenol exposure and related potential health benefits.
ABSTRACT: Maternal one-carbon (1-C) metabolism provides methyl groups for fetal development and programming by DNA methylation as one of the underlying epigenetic mechanisms. We aimed to investigate maternal 1-C biomarkers, folic acid supplement use, and MTHFR C677T genotype as determinants of 1-C metabolism in early pregnancy in association with newborn DNA methylation levels of fetal growth and neurodevelopment candidate genes. The participants were 463 mother-child pairs of Dutch national origin from a large population-based birth cohort in Rotterdam, The Netherlands. In early pregnancy (median 13.0 weeks, 90% range 10.4-17.1), we assessed the maternal folate and homocysteine blood concentrations, folic acid supplement use, and the MTHFR C677T genotype in mothers and newborns. In newborns, DNA methylation was measured in umbilical cord blood white blood cells at 11 regions of the seven genes: NR3C1, DRD4, 5-HTT, IGF2DMR, H19, KCNQ1OT1, and MTHFR. The associations between the 1-C determinants and DNA methylation were examined using linear mixed models. An association was observed between maternal folate deficiency and lower newborn DNA methylation, which attenuated after adjustment for potential confounders. The maternal MTHFR TT genotype was significantly associated with lower DNA methylation. However, maternal homocysteine and folate concentrations, folic acid supplement use, and the MTHFR genotype in the newborn were not associated with newborn DNA methylation. The maternal MTHFR C677T genotype, as a determinant of folate status and 1-C metabolism, is associated with variations in the epigenome of a selection of genes in newborns. Research on the implications of these variations in methylation on gene expression and health is recommended.
ABSTRACT: STUDY QUESTION: Is in vitro fertilization treatment with or without intracytoplasmic sperm injection (IVF/ICSI) associated with changes in first- and second-trimester embryonic and fetal growth trajectories and birthweight in singleton pregnancies?
Human Reproduction 10/2014; 29(12). DOI:10.1093/humrep/deu271 · 4.57 Impact Factor
ABSTRACT: Deconvolution of noisy signals and images is an important task in various areas; examples include chemometrics, biology and imaging. When the solution is required to be sparse, desirable results are obtained using penalized estimation techniques. Sparseness is realized by shrinking coefficients to zero. We use penalized regression with a penalty based on the L0 norm, as presented, for one-dimensional data, in earlier work. Several extensions to this approach are presented. In the case of blind deconvolution, a smoother is applied to improve the estimated impulse response, which is applicable to any unimodal response function. Results are demonstrated on pulse identification in endocrine data, where the aim is to model the secretion pattern as a sparse series of spikes. Application to single-molecule fluorescence imaging is also demonstrated for functional superresolution in cell biology.
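Sparse deconvolution of the kind described above can be sketched by writing the convolution as a matrix and shrinking coefficients toward zero inside a gradient iteration. For robustness this sketch uses the closely related L1 (soft-threshold) variant rather than the article's L0 penalty, which shrinks harder; the impulse response, spike positions, and tuning constants are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
h = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)    # known, unimodal impulse response
truth = np.zeros(n)
truth[[40, 90, 150]] = [3.0, 2.0, 4.0]                # sparse spike train
y = np.convolve(truth, h, mode="same") + rng.normal(0.0, 0.05, n)

# Convolution as a matrix, so the model is y ≈ C x with x sparse
C = np.stack([np.convolve(np.eye(n)[i], h, mode="same") for i in range(n)], axis=1)

# Iterative shrinkage: gradient step on the fit, then soft-threshold.
# The article's L0 penalty would zero small coefficients outright instead.
step = 1.0 / np.linalg.norm(C, 2) ** 2
lam = 1.0
x = np.zeros(n)
for _ in range(500):
    x = x + step * C.T @ (y - C @ x)                          # improve the fit
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # shrink toward zero
```

The recovered `x` concentrates its mass near the true spike locations, which is exactly the "sparse series of spikes" representation sought for the endocrine pulse data.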
ABSTRACT: X-ray diffraction scans consist of series of counts; these numbers obey Poisson distributions with varying expected values. These scans are often smoothed, and the Kα2 component is removed. This article proposes a framework in which both issues are treated. Penalized likelihood estimation is used to smooth the data; the objective combines the Poisson log-likelihood with a roughness measure based on ideas from generalized linear models. To remove the Kα doublet, the model is extended using the composite link model. As a result, the data are decomposed into two smooth components: a Kα1 and a Kα2 part. For both smoothing and Kα2 removal, the weight of the applied penalty is optimized automatically. The proposed methods are applied to experimental data and compared with the Savitzky–Golay algorithm for smoothing and the Rachinger method for Kα2 stripping. The new method shows better results with less local distortion. Freely available software in MATLAB and R has been developed.
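The penalized Poisson likelihood smoother (without the composite link extension for Kα2 stripping) can be sketched via penalized iteratively reweighted least squares on the log intensity. The simulated intensity, penalty weight, and iteration count below are illustrative assumptions.

```python
import numpy as np

def poisson_smooth(y, lam, d=2, n_iter=30):
    """Estimate eta = log(mu) by maximizing the Poisson log-likelihood
    minus lam * ||D eta||^2, via penalized IRLS. A simplified sketch of
    the article's smoother; the composite link model for K-alpha2
    removal is not included."""
    n = len(y)
    D = np.diff(np.eye(n), n=d, axis=0)
    P = lam * D.T @ D
    eta = np.log(y + 1.0)                        # start from log counts
    for _ in range(n_iter):
        mu = np.exp(eta)
        W = np.diag(mu)                          # Poisson weights
        z = eta + (y - mu) / mu                  # IRLS working response
        eta = np.linalg.solve(W + P, W @ z)      # penalized weighted LS step
    return np.exp(eta)

# Simulated diffraction-like counts: smooth peak on a flat background
x = np.linspace(0.0, 1.0, 150)
mu_true = 10.0 + 50.0 * np.exp(-0.5 * ((x - 0.5) / 0.15) ** 2)
rng = np.random.default_rng(6)
y = rng.poisson(mu_true).astype(float)
fitted = poisson_smooth(y, lam=10.0)
```

Because the roughness penalty acts on the log scale, the smoother respects the count nature of the data (non-negative, variance tied to the mean) instead of treating the scan as Gaussian noise.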
ABSTRACT: A new computational algorithm for estimating the smoothing parameters of a multidimensional penalized spline generalized linear model with anisotropic penalty is presented. This new proposal is based on the mixed model representation of a multidimensional P-spline, in which the smoothing parameter for each covariate is expressed in terms of variance components. On the basis of penalized quasi-likelihood methods, closed-form expressions for the estimates of the variance components are obtained. This formulation leads to an efficient implementation that considerably reduces the computational burden. The proposed algorithm can be seen as a generalization of the algorithm by Schall (1991) for variance components estimation, extended to deal with non-standard structures of the covariance matrix of the random effects. The practical performance of the proposed algorithm is evaluated by means of simulations, and comparisons with alternative methods are made on the basis of the mean square error criterion and the computing time. Finally, we illustrate our proposal with the analysis of two real datasets: a two-dimensional example of historical records of monthly precipitation data in the USA and a three-dimensional one of mortality data from respiratory disease according to the age at death, the year of death and the month of death.
Statistics and Computing 04/2014; 25(5). DOI:10.1007/s11222-014-9464-2 · 1.62 Impact Factor
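The Schall-style idea of expressing the smoothing parameter as a ratio of variance components can be shown in one dimension: fit, read off the effective dimension, update the two variances, and iterate. This is a 1-D caricature of the SAP algorithm under simplifying assumptions (identity basis, single penalty); the update formulas follow the general Schall (1991) scheme rather than the paper's exact expressions.

```python
import numpy as np

def schall_smooth(y, d=2, n_iter=20):
    """Whittaker smoother whose smoothing parameter is estimated as
    lambda = sigma^2 / tau^2, with both variances updated from the
    current fit (Schall-style iteration)."""
    n = len(y)
    D = np.diff(np.eye(n), n=d, axis=0)
    DtD = D.T @ D
    lam = 1.0
    for _ in range(n_iter):
        H = np.linalg.inv(np.eye(n) + lam * DtD)   # hat matrix
        z = H @ y
        ed = np.trace(H)                           # effective model dimension
        sigma2 = ((y - z) ** 2).sum() / (n - ed)   # error variance
        tau2 = ((D @ z) ** 2).sum() / (ed - d)     # random-effect variance
        lam = sigma2 / tau2
    return z, lam

rng = np.random.default_rng(7)
x = np.linspace(0.0, 1.0, 100)
f = np.sin(2 * np.pi * x)
y = f + rng.normal(0.0, 0.3, 100)
z, lam = schall_smooth(y)
```

The appeal, as in the abstract, is that no grid search or numerical optimization over the smoothing parameter is needed: each pass gives closed-form variance updates, and the same recipe scales to several penalties at once.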
ABSTRACT: Purpose: To introduce a method to optimize structural retinal nerve fiber layer (RNFL) models based on glaucomatous visual field data, and to show how such an optimized model can be used to reduce noise in visual fields while largely preserving clinically important features.
Correlation coefficients between age-adjusted deviation values of pairs of visual field test locations were calculated from 103 visual fields of eyes with moderate glaucomatous damage. Distances between those test locations were defined for various parameters of a mathematical RNFL model. Then, the correspondence between the structural and functional data was defined by the spread, or variance, of the correlation coefficients for all distances. The model parameters that minimized this spread constituted the optimized model. To reduce noise in visual fields, the optimized model was used to smooth visual field data according to the RNFL's structure. The resulting fields were compared with visual fields that were smoothed based on the regular testing grid.
The optimal parameters for the RNFL model reduced the variance of the correlation coefficients by 78% and were well within the range of parameters previously determined from fundus photographs. Smoothing the visual fields based on the optimized RNFL model strongly reduced noise while keeping important features.
Mathematical RNFL models can be optimized based on visual field data, resulting in a strong structure-function relationship. Taking the RNFL's shape, as defined by such an optimized model, into account when smoothing visual fields results in better noise reduction while preserving important details.
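Structure-aware smoothing of the kind described above can be sketched as a weighted average in which a location's neighbours are defined by a distance matrix rather than by the testing grid. In the sketch below the distance is plain Euclidean distance on a toy grid; in the article it would instead come from the optimized RNFL model, so locations on the same fiber bundle count as close. The grid, kernel, and `scale` value are illustrative assumptions.

```python
import numpy as np

def structure_smooth(values, dist, scale=1.5):
    """Smooth one value per test location with Gaussian weights that
    decay with a structural distance: the smaller dist[i, j], the more
    location j contributes to the smoothed value at i."""
    W = np.exp(-(dist / scale) ** 2)
    return (W @ values) / W.sum(axis=1)

# Toy 8x8 grid standing in for a visual field testing pattern
gx, gy = np.meshgrid(np.arange(8), np.arange(8))
pts = np.column_stack([gx.ravel(), gy.ravel()]).astype(float)
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

rng = np.random.default_rng(5)
raw = rng.normal(0.0, 2.0, len(pts))      # pure measurement noise
smoothed = structure_smooth(raw, dist)
```

Swapping the Euclidean `dist` for an RNFL-model-based distance is what lets the smoother average along fiber bundles instead of across them, which is how noise is reduced without blurring structure-aligned defects.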