Robert L Strawderman

Cornell University, Ithaca, NY, United States

Publications (43) · 123.88 Total impact

  • ABSTRACT: BACKGROUND: Disease management programs for patients hospitalized with heart failure (HF), although effective, are often resource intensive, limiting their uptake. Peer support programs have led to improved outcomes among patients with other chronic conditions and may result in similar improvements for HF patients. METHODS AND RESULTS: In this randomized controlled trial, Reciprocal Peer Support (RPS) arm patients participated in an HF nurse practitioner (NP)-led goal-setting group session, received brief training in peer communication skills, and were paired with another participant in their cohort with whom they were encouraged to talk weekly using a telephone platform. Participants were also encouraged to attend three NP-facilitated peer support group sessions. Patients in the Nurse Care Management (NCM) arm attended an NP-led session to address their HF care questions and receive HF educational materials and information on how to access care management services. The median age of the patients was 69 years, 51% were female, and 26% were racial/ethnic minorities. Only 55% of RPS patients participated in peer calls or group sessions. In intention-to-treat analyses, the RPS and NCM groups did not differ in time to first all-cause rehospitalization or death or in mean numbers of rehospitalizations or deaths. There were no differences in improvements in 6-month measures of HF-specific quality of life or social support. CONCLUSIONS: Among patients recently hospitalized for HF, over half of RPS participants had no or minimal engagement with the reciprocal peer support program, and the program did not improve outcomes compared to usual HF nurse care management. Clinical Trial Registration URL: http://www.clinicaltrials.gov. Unique identifier: NCT00508508.
    Circulation Heart Failure 02/2013; · 6.68 Impact Factor
  • ABSTRACT: BACKGROUND: Favorable health outcomes are more likely to occur when the clinical team recognizes patients at risk and intervenes in concert. Prediction rules can identify high-risk subsets, but the availability of multiple rules for various conditions presents implementation and assimilation challenges. METHODS: A prediction rule for 30-day mortality at the beginning of the hospitalization was derived in a retrospective cohort of adult inpatients from a community hospital in the Midwestern United States from 2008 to 2009, using clinical laboratory values, past medical history, and diagnoses present on admission. It was validated using 2010 data from the same and from a different hospital. The calculated mortality risk was then used to predict unplanned transfers to intensive care units, resuscitation attempts for cardiopulmonary arrests, a condition not present on admission (complications), intensive care unit utilization, palliative care status, in-hospital death, rehospitalizations within 30 days, and 180-day mortality. RESULTS: The predictions of 30-day mortality for the derivation and validation datasets had areas under the receiver operating characteristic curve of 0.88. The 30-day mortality risk was in turn a strong predictor for in-hospital death, palliative care status, and 180-day mortality; a modest predictor for unplanned transfers and cardiopulmonary arrests; and a weaker predictor for the other events of interest. CONCLUSIONS: The probability of 30-day mortality provides health systems with an array of prognostic information that may provide a common reference point for organizing the clinical activities of the many health professionals involved in the care of the patient. Journal of Hospital Medicine 2012; © 2012 Society of Hospital Medicine.
    Journal of Hospital Medicine 12/2012; · 1.40 Impact Factor
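The discrimination reported for the 30-day mortality rule (area under the ROC curve of 0.88) can be estimated directly by the rank-based Mann-Whitney statistic. A minimal sketch follows; the risk scores and outcomes are toy values, not data from the study.

```python
# Rank-based (Mann-Whitney) estimate of the area under the ROC curve,
# the discrimination measure reported for the 30-day mortality rule.
# Data below are purely illustrative.

def auc(scores, labels):
    """Probability that a randomly chosen positive case outranks
    a randomly chosen negative case (ties count 1/2)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

risk = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]   # predicted 30-day mortality risk
died = [1,   1,   0,   1,   0,   0]     # observed outcome
print(auc(risk, died))  # 8/9 = 0.888... for this toy example
```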
  • Karen Lostritto, Robert L Strawderman, Annette M Molinaro
    ABSTRACT: Accurately assessing a patient's risk of a given event is essential in making informed treatment decisions. One approach is to stratify patients into two or more distinct risk groups with respect to a specific outcome using both clinical and demographic variables. Outcomes may be categorical or continuous in nature; important examples in cancer studies might include level of toxicity or time to recurrence. Recursive partitioning methods are ideal for building such risk groups. Two such methods are Classification and Regression Trees (CART) and a more recent competitor known as the partitioning Deletion/Substitution/Addition (partDSA) algorithm, both of which also utilize loss functions (e.g., squared error for a continuous outcome) as the basis for building, selecting, and assessing predictors but differ in the manner by which regression trees are constructed. Recently, we have shown that partDSA often outperforms CART in so-called "full data" settings (e.g., uncensored outcomes). However, when confronted with censored outcome data, the loss functions used by both procedures must be modified. There have been several attempts to adapt CART for right-censored data. This article describes two such extensions for partDSA that make use of observed data loss functions constructed using inverse probability of censoring weights. Such loss functions are consistent estimates of their uncensored counterparts provided that the corresponding censoring model is correctly specified. The relative performance of these new methods is evaluated via simulation studies and illustrated through an analysis of clinical trial data on brain cancer patients. The implementation of partDSA for uncensored and right-censored outcomes is publicly available in the R package, partDSA.
    Biometrics 04/2012; · 1.41 Impact Factor
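The observed-data loss functions described above weight each uncensored observation by the inverse of the Kaplan-Meier estimate of the censoring survivor function. A minimal sketch of those weights on toy data (no ties, for simplicity) follows; it is an illustration of the general IPCW idea, not the partDSA implementation.

```python
# Inverse probability of censoring weights (IPCW): each uncensored
# observation is weighted by 1/G(T-), where G is the Kaplan-Meier
# estimate of the *censoring* survivor function. Toy data, no ties.

def km_censoring_survivor(times, events):
    """Kaplan-Meier for the censoring distribution: here a 'censoring
    event' is delta == 0. Returns G evaluated just before each time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk = len(times)
    G = 1.0
    G_at = {}
    for i in order:
        G_at[i] = G                       # G(t-): value before the drop at t
        if events[i] == 0:                # a censored subject leaves the risk set
            G *= (n_at_risk - 1) / n_at_risk
        n_at_risk -= 1
    return G_at

times  = [2.0, 3.0, 4.0, 5.0]
deltas = [1,   0,   1,   1]               # 1 = event observed, 0 = censored
G = km_censoring_survivor(times, deltas)
weights = [deltas[i] / G[i] for i in range(len(times))]
print(weights)   # censored subject gets weight 0; later events upweighted
```

Note that the weights sum to the sample size: the mass of censored subjects is redistributed to uncensored subjects still observable at later times.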
  • Lynn M Johnson, Robert L Strawderman
    ABSTRACT: This paper proposes an estimation procedure for the semiparametric accelerated failure time frailty model that combines smoothing with an Expectation-Maximization-like algorithm for estimating equations. The resulting algorithm permits simultaneous estimation of the regression parameter, the baseline cumulative hazard, and the parameter indexing a general frailty distribution. We develop novel moment-based estimators for the frailty parameter, including a generalized method of moments estimator. Standard error estimates for all parameters are easily obtained using a randomly weighted bootstrap procedure. For the commonly used gamma frailty distribution, the proposed algorithm is very easy to implement using widely available numerical methods. Simulation results demonstrate that the algorithm performs very well in this setting. We re-analyze several previously analyzed data sets for illustrative purposes.
    Statistics in Medicine 03/2012; 31(21):2335-58. · 2.04 Impact Factor
  • Robert L. Strawderman, Martin T. Wells
    ABSTRACT: Using a Bayesian model with a class of hierarchically specified scale-mixture-of-normals priors as motivation, we consider a generalization of the grouped LASSO in which an additional penalty is placed on the penalty parameter of the L2 norm. We show that the resulting MAP estimator obtained by jointly minimizing the corresponding objective function in both the mean and penalty parameter is a thresholding estimator that generalizes (i) the grouped lasso estimator of Yuan and Lin (2006) and (ii) the univariate minimax concave penalization procedure of Zhang (2010) to the setting of a vector of parameters. An exact formula for the risk and a corresponding SURE formula are obtained for the proposed class of estimators. A new universal threshold is proposed under appropriate sparsity assumptions; in combination with the proposed class of estimators, we subsequently obtain a new and interesting motivation for the class of positive part estimators. In particular, we establish that the original positive part estimator corresponds to a suboptimal choice of this thresholding parameter. Numerical comparisons between the proposed class of estimators and the positive part estimator show that the former can achieve further, significant reductions in risk near the origin.
    01/2012;
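The baseline rule that this article generalizes, the grouped-lasso estimator of Yuan and Lin (2006), is a block soft-thresholding (positive-part) operator: the whole coefficient group is scaled toward zero and set exactly to zero when its norm falls below the threshold. A sketch of that classical rule (not the article's generalized estimator):

```python
import math

# Block soft-thresholding, the grouped-lasso rule of Yuan & Lin (2006):
# shrink the entire coefficient group z toward 0, and zero it out
# entirely when ||z|| <= lam. The "positive part" is the max(0, .).

def block_soft_threshold(z, lam):
    norm = math.sqrt(sum(v * v for v in z))
    scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
    return [scale * v for v in z]

print(block_soft_threshold([3.0, 4.0], 2.5))  # [1.5, 2.0]: norm 5 halved
print(block_soft_threshold([0.3, 0.4], 2.5))  # [0.0, 0.0]: group zeroed
```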
  • Herbert D Aronow, Robert L Strawderman, Mauro Moscucci, Mark E Cowen
    ABSTRACT: Patients who undergo coronary artery stent procedures are at risk for late atherothrombotic events, including stent thrombosis. The relationship between the duration during which evidence-based medical therapies are utilized after coronary artery stenting and the risk of late atherothrombotic events is not well characterized. In a retrospective cohort study linking a hospital-based percutaneous coronary intervention registry with a health maintenance organization claims dataset, we related the duration of medical therapy utilization during follow up to the hazard for death, myocardial infarction, unstable angina, transient ischemic attack or stroke following a coronary artery stent procedure. Multivariable Cox models were employed in which medical treatments were entered as time-varying covariates; data were stratified by stent type and time period. The median [interquartile range, IQR] duration of follow up was 832 [460, 1420] days. During this time, 86 ischemic events occurred in 84 of 386 patients at a median [IQR] of 260 [110, 658] days. The incidence of atherothrombotic events following coronary artery stenting was highest during the first post-procedure year and declined substantially thereafter. Multivariable predictors of incident ischemic events included multivessel coronary artery disease (HR 2.01 [95% CI 1.30-3.11], p=0.0018) and longer duration angiotensin converting enzyme (ACE) inhibitor/angiotensin receptor blocker (ARB), beta blocker or statin therapy (HR 0.52 [95% CI 0.28-0.99], p=0.045). The use of longer-term ACE inhibitor/ARB, beta blocker or statin therapy was associated with a significantly lower risk; these risk reductions were of greater magnitude than those associated with clopidogrel.
    International journal of cardiology 12/2011; 153(3):262-6. · 7.08 Impact Factor
  • ABSTRACT: The role of sound in Drosophila melanogaster courtship, along with its perception via the antennae, is well established, as is the ability of this fly to learn in classical conditioning protocols. Here, we demonstrate that a neutral acoustic stimulus paired with a sucrose reward can be used to condition the proboscis-extension reflex, part of normal feeding behavior. This appetitive conditioning produces results comparable to those obtained with chemical stimuli in aversive conditioning protocols. We applied a logistic model with general estimating equations to predict the dynamics of learning, which successfully predicts the outcome of training and provides a quantitative estimate of the rate of learning. Use of acoustic stimuli with appetitive conditioning provides both an alternative to models most commonly used in studies of learning and memory in Drosophila and a means of testing hearing in both sexes, independently of courtship responsiveness.
    Journal of Experimental Biology 09/2011; 214(Pt 17):2864-70. · 3.24 Impact Factor
  • Lei Liu, Robert L. Strawderman, Mark E. Cowen, Ya-Chen T. Shih
    ABSTRACT: In this paper, we propose a flexible "two-part" random effects model ([35] and [40]) for correlated medical cost data. Typically, medical cost data are right-skewed, involve a substantial proportion of zero values, and may exhibit heteroscedasticity. In many cases, such data are also obtained in hierarchical form, e.g., on patients served by the same physician. The proposed model specification therefore consists of two generalized linear mixed models (GLMM), linked together by correlated random effects. Respectively, and conditionally on the random effects and covariates, we model the odds of cost being positive (Part I) using a GLMM with a logistic link and the mean cost (Part II) given that costs were actually incurred using a generalized gamma regression model with random effects and a scale parameter that is allowed to depend on covariates (cf., Manning et al., 2005). The class of generalized gamma distributions is very flexible and includes the lognormal, gamma, inverse gamma and Weibull distributions as special cases. We demonstrate how to carry out estimation using the Gaussian quadrature techniques conveniently implemented in SAS Proc NLMIXED. The proposed model is used to analyze pharmacy cost data on 56,245 adult patients clustered within 239 physicians in a midwestern U.S. managed care organization.
    Journal of Health Economics 01/2010;
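The core two-part logic combines the two submodels multiplicatively: the overall expected cost is P(cost > 0) from Part I times the conditional mean cost from Part II. A minimal sketch with a logistic Part I and a log-link Part II; the coefficients are hypothetical, and the gamma-specific scale modeling and random effects of the article are omitted.

```python
import math

# Two-part model logic: overall expected cost combines Part I
# (logistic model for incurring any cost) with Part II (mean cost
# given cost > 0, on the log scale). Coefficients are hypothetical.

def expected_cost(x, beta_logit, beta_mean):
    eta1 = sum(b * v for b, v in zip(beta_logit, x))
    p_any = 1.0 / (1.0 + math.exp(-eta1))                        # Part I: P(cost > 0)
    mu_pos = math.exp(sum(b * v for b, v in zip(beta_mean, x)))  # Part II: E[cost | cost > 0]
    return p_any * mu_pos                                        # E[cost] = P(>0) * E[cost | >0]

x = [1.0, 0.5]   # intercept plus one covariate value
print(expected_cost(x, beta_logit=[0.2, 0.8], beta_mean=[5.0, 0.3]))
```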
  • ABSTRACT: Frailty models derived from the proportional hazards regression model are frequently used to analyze clustered right-censored survival data. We propose a semiparametric Bayesian methodology for this purpose, modeling both the unknown baseline hazard and density of the random effects using mixtures of B-splines. The posterior distributions for all regression coefficients and spline parameters are obtained using Markov Chain Monte Carlo (MCMC). The methodology permits the use of weighted mixtures of parametric and nonparametric components in modeling the hazard function and frailty distribution; in addition, the spline knots may also be selected adaptively using reversible-jump MCMC. Simulations indicate that the method produces smooth and accurate posterior hazard and frailty density estimates. The Bayesian approach not only produces point estimators that outperform existing approaches in certain circumstances, but also offers a wealth of information about the parameters of interest in the form of MCMC samples from the joint posterior probability distribution. We illustrate the adaptability of the method with data from a study of congestive heart failure.
    Electronic Journal of Statistics 01/2010; · 0.79 Impact Factor
  • Elizabeth D. Schifano, Robert L Strawderman, Martin T. Wells
    ABSTRACT: The use of penalization, or regularization, has become common in high-dimensional statistical analysis, where an increasingly frequent goal is to simultaneously select important variables and estimate their effects. It has been shown by several authors that these goals can be achieved by minimizing some parameter-dependent "goodness-of-fit" function (e.g., a negative loglikelihood) subject to a penalization that promotes sparsity. Penalty functions that are singular at the origin have received substantial attention, arguably beginning with the Lasso penalty [62]. The current literature tends to focus on specific combinations of differentiable goodness-of-fit functions and penalty functions singular at the origin. One result of this combined specificity has been a proliferation in the number of computational algorithms designed to solve fairly narrow classes of optimization problems involving objective functions that are not everywhere continuously differentiable. In this paper, we propose a general class of algorithms for optimizing an extensive variety of nonsmoothly penalized objective functions that satisfy certain regularity conditions. The proposed framework utilizes the majorization-minimization (MM) algorithm as its core optimization engine. In the case of penalized regression models, the resulting algorithms employ iterated soft-thresholding, implemented componentwise, allowing for fast and stable updating that avoids the need for inverting high-dimensional matrices. We establish convergence theory under weaker assumptions than previously considered in the statistical literature. We also demonstrate the exceptional effectiveness of new acceleration methods, originally proposed for the EM algorithm, in this class of problems. Simulation results and a microarray data example are provided to demonstrate the algorithm's capabilities and versatility.
    Electronic Journal of Statistics 01/2010; · 0.79 Impact Factor
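For penalized least-squares regression, the iterated componentwise soft-thresholding update mentioned above takes a simple closed form: a gradient step followed by a soft-threshold, with no matrix inversion. The sketch below implements that generic ISTA-style update for the lasso on a tiny dense problem; it illustrates the style of update, not the article's full MM framework or its acceleration methods.

```python
# Iterated componentwise soft-thresholding for the lasso: the kind of
# inversion-free update that MM/majorization arguments yield for
# penalized regression. Tiny dense problem, illustrative only.

def soft(v, t):
    """Scalar soft-threshold: shrink v toward 0 by t, zeroing |v| <= t."""
    return max(v - t, 0.0) - max(-v - t, 0.0)

def lasso_ista(X, y, lam, step, iters=2000):
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        resid = [sum(X[i][j] * beta[j] for j in range(p)) - y[i] for i in range(n)]
        grad = [sum(X[i][j] * resid[i] for i in range(n)) / n for j in range(p)]
        # gradient step, then componentwise soft-threshold
        beta = [soft(beta[j] - step * grad[j], step * lam) for j in range(p)]
    return beta

X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = [1.0, 0.0, 1.0]
beta = lasso_ista(X, y, lam=0.1, step=0.5)
print(beta)   # converges to approximately [0.85, 0.0] (checked via the KKT conditions)
```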
  • Lynn M Johnson, Robert L Strawderman
    ABSTRACT: This paper extends the induced smoothing procedure of Brown & Wang (2006) for the semiparametric accelerated failure time model to the case of clustered failure time data. The resulting procedure permits fast and accurate computation of regression parameter estimates and standard errors using simple and widely available numerical methods, such as the Newton-Raphson algorithm. The regression parameter estimates are shown to be strongly consistent and asymptotically normal; in addition, we prove that the asymptotic distribution of the smoothed estimator coincides with that obtained without the use of smoothing. This establishes a key claim of Brown & Wang (2006) for the case of independent failure time data and also extends such results to the case of clustered data. Simulation results show that these smoothed estimates perform as well as those obtained using the best available methods at a fraction of the computational cost.
    Biometrika 09/2009; 96(3):577-590. · 1.65 Impact Factor
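The essence of the induced smoothing device used above is to replace the discontinuous indicator I(x > 0) appearing in rank-based estimating functions with the smooth surrogate Φ(x/h), which is everywhere differentiable and approaches the indicator as the bandwidth shrinks. A minimal numerical sketch (in the actual procedure the bandwidth is of order n^{-1/2} and tied to the parameter's covariance; here it is fixed for illustration):

```python
import math

# Induced smoothing in one line: replace the jump function I(x > 0)
# with Phi(x / h), a differentiable surrogate that converges to the
# indicator as h -> 0. Bandwidth h is fixed here purely to illustrate.

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def smoothed_indicator(x, h):
    return Phi(x / h)

for h in (1.0, 0.1, 0.01):
    print(h, smoothed_indicator(0.5, h))   # approaches I(0.5 > 0) = 1 as h shrinks
```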
  • David Y Clement, Robert L Strawderman
    ABSTRACT: This paper deals with the analysis of recurrent event data subject to censored observation. Using a suitable adaptation of generalized estimating equations for longitudinal data, we propose a straightforward methodology for estimating the parameters indexing the conditional means and variances of the process interevent (i.e. gap) times. The proposed methodology permits the use of both time-fixed and time-varying covariates, as well as transformations of the gap times, creating a flexible and useful class of methods for analyzing gap-time data. Censoring is dealt with by imposing a parametric assumption on the censored gap times, and extensive simulation results demonstrate the relative robustness of parameter estimates even when this parametric assumption is incorrect. A suitable large-sample theory is developed. Finally, we use our methods to analyze data from a randomized trial of asthma prevention in young children.
    Biostatistics 04/2009; 10(3):451-67. · 2.43 Impact Factor
  • ABSTRACT: Our objective was to determine the frequency and predictive factors for cardiac-related emergency department (ED) encounters within 30 days after percutaneous coronary intervention (PCI). The data source was an electronic database of 2,731 patients who had PCI from 2002 to 2004. Almost all underwent stent placement. Risk factors for returning to the ED were identified from clinical, anatomic, and demographic candidate variables using multivariate logistic regression. Approximately 9% of the cohort (255 of 2,731 patients) returned to the ED for cardiac reasons within 30 days, peaking around 3 days. ED visits were more likely in those whose index PCI was emergent or urgent (odds ratio [OR] 2.0, 95% confidence interval [CI] 1.3 to 3.0), in women (OR 1.9, 95% CI 1.5 to 2.5), and in those who had previous encounters with the ED or hospital (OR 1.7, 95% CI 1.5 to 2.0). Patients receiving stents were somewhat less likely to return (OR 0.7, 95% CI 0.5 to 1.0). In conclusion, the clinical courses of the 255 returning patients were generally benign, but 12% had a subsequent myocardial infarction or repeat PCI within 30 days of the ED encounter.
    The American Journal of Cardiology 02/2007; 99(2):197-201. · 3.21 Impact Factor
  • Min Zhang, Robert L. Strawderman, Mark E. Cowen, Martin T. Wells
    Journal of the American Statistical Association 02/2006; 101(475):934-945. · 1.83 Impact Factor
  • ABSTRACT: This article describes a simple computational method for obtaining the maximum likelihood estimates (MLE) in nonlinear mixed-effects models when the random effects are assumed to have a nonnormal distribution. Many computer programs for fitting nonlinear mixed-effects models, such as PROC NLMIXED in SAS, require that the random effects have a normal distribution. However, there is often interest in either fitting models with nonnormal random effects or assessing the sensitivity of inferences to departures from the normality assumption for the random effects. When the random effects are assumed to have a nonnormal distribution, we show how the probability integral transform can be used, in conjunction with standard statistical software for fitting nonlinear mixed-effects models (e.g., PROC NLMIXED in SAS), to obtain the MLEs. Specifically, the probability integral transform is used to transform a normal random effect to a nonnormal random effect. The method is illustrated using a gamma frailty model for clustered survival data and a beta-binomial model for clustered binary data. Finally, the results of a simulation study, examining the impact of misspecification of the distribution of the random effects, are presented.
    Journal of Computational and Graphical Statistics 01/2006; 15(1).
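The probability integral transform at the heart of this method is: if b ~ N(0,1), then Φ(b) ~ Uniform(0,1), so u = F⁻¹(Φ(b)) follows any chosen target distribution F. The sketch below uses an exponential target (a gamma with shape 1) so the inverse CDF has a closed form in the standard library; the article's gamma-frailty example would use the general gamma inverse CDF instead.

```python
import math

# Probability integral transform: b ~ N(0,1) implies Phi(b) ~ Uniform(0,1),
# so u = F_inv(Phi(b)) follows the target distribution F. Illustrated with
# an exponential target (gamma with shape 1), whose inverse CDF is closed-form.

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def exp_inv_cdf(p, rate=1.0):
    """Inverse CDF of the Exponential(rate) distribution."""
    return -math.log(1.0 - p) / rate

def normal_to_exponential(b, rate=1.0):
    return exp_inv_cdf(Phi(b), rate)

# The normal median (b = 0) maps to the exponential median ln(2)/rate.
print(normal_to_exponential(0.0))   # 0.693... = ln 2
```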
  • 03/2004;
  • Mark E Cowen, Robert L Strawderman
    ABSTRACT: Despite the availability of more sophisticated techniques, few alternatives to ordinary least squares (OLS) regression have been utilized to profile physician prescribing in managed care. It is not known to what extent the modest R² values derived from OLS models reflect incomplete risk adjustment or widely varying physician prescribing patterns. To quantify the role of interphysician variability relative to overall variability in managed care pharmacy expenses, and to examine the extent to which different statistical approaches generate meaningful differences in profile results. Comparison of three basic statistical modeling approaches: OLS, fixed effects regression, and random effects (i.e., hierarchical) regression models. Two managed care populations that differed more than 2-fold in per member pharmacy expenditures in 1999, one from the Midwestern United States, the other from three Western States. The intraclass correlation coefficient (ICC, the proportion of variability in expenses attributable to differences among physicians) and the range of projected expenses attributed to each physician's prescribing style. The ICCs were small for aggregated pharmacy expenditures, 0.04 or less in both populations. As determined by OLS, the most costly physician contributed 94,399 U.S. dollars in excess expenses to the organization whereas the most parsimonious saved 89,940 U.S. dollars. When derived from random effects models, the range in performance was 63% of that derived from OLS. In the populations studied, systematic prescribing differences among physicians were small relative to the overall variability in pharmacy expenses, suggesting other factors were more likely driving these costs. Random effects models generated smaller estimates of the individual physicians' contribution to costs, sometimes considerably, relative to those derived from OLS and fixed effects approaches.
    Medical Care 09/2002; 40(8):650-61. · 3.23 Impact Factor
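The ICC reported above is the between-physician share of total variance, σ²_b/(σ²_b + σ²_w), which can be estimated from a one-way random-effects ANOVA. A minimal sketch on balanced toy data (made-up numbers, not the study's data):

```python
# Intraclass correlation coefficient (ICC) from a one-way random-effects
# ANOVA on balanced toy data: the share of total variability attributable
# to differences among physicians (reported as 0.04 or less in the study).

def icc_oneway(groups):
    k = len(groups)
    m = len(groups[0])                       # balanced design assumed
    grand = sum(sum(g) for g in groups) / (k * m)
    means = [sum(g) / m for g in groups]
    msb = m * sum((mu - grand) ** 2 for mu in means) / (k - 1)
    msw = sum((y - mu) ** 2 for g, mu in zip(groups, means) for y in g) / (k * (m - 1))
    var_b = max(0.0, (msb - msw) / m)        # between-group variance component
    return var_b / (var_b + msw)

# Three "physicians", four "patients" each; numbers are made up.
groups = [[10.0, 12.0, 11.0, 13.0],
          [11.0, 13.0, 12.0, 14.0],
          [20.0, 21.0, 22.0, 23.0]]
print(round(icc_oneway(groups), 3))   # 0.947: most variation is between groups
```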
  • Arvind K Jain, Robert L Strawderman
    ABSTRACT: The modeling of lifetime (i.e. cumulative) medical cost data in the presence of censored follow-up is complicated by induced informative censoring, rendering standard survival analysis tools invalid. With few exceptions, recently proposed nonparametric estimators for such data do not extend easily to handle covariate information. We propose to model the hazard function for lifetime cost endpoints using an adaptation of the HARE methodology (Kooperberg, Stone, and Truong, Journal of the American Statistical Association, 1995, 90, 78-94). Linear splines and their tensor products are used to adaptively build a model that incorporates covariates and covariate-by-cost interactions without restrictive parametric assumptions. The informative censoring problem is handled using inverse probability of censoring weighted estimating equations. The proposed method is illustrated using simulation and also with data on the cost of dialysis for patients with end-stage renal disease.
    Biostatistics 04/2002; 3(1):101-18. · 2.43 Impact Factor
  • ABSTRACT: Patients with idiopathic interstitial pneumonias (IIPs) can be subdivided into groups based on the histological appearance of lung tissue obtained by surgical biopsy. The quantitative impact of histological diagnosis, baseline factors and response to therapy on survival has not been evaluated. Surgical lung biopsy specimens from 168 patients with suspected IIP were reviewed according to the latest diagnostic criteria. The impact of baseline clinical, physiological, radiographic and histological features on survival was evaluated using Cox regression analysis. The predictive value of honeycombing on high-resolution computed tomography (HRCT) as a surrogate marker for usual interstitial pneumonia (UIP) was examined. The response to therapy and survival of 39 patients treated prospectively with high-dose prednisone was evaluated. The presence of UIP was the most important factor influencing mortality. The risk ratio of mortality when UIP was present was 28.46 (95% confidence interval (CI) 5.5-148.0; p=0.0001) after controlling for patient age, duration of symptoms, radiographic appearance, pulmonary physiology, smoking history and sex. Honeycombing on HRCT indicated the presence of UIP with a sensitivity of 90% and specificity of 86%. Patients with nonspecific interstitial pneumonia were more likely to respond or remain stable (9 of 10) compared to patients with UIP (14 of 29) after treatment with prednisone. Patients remaining stable had the best prognosis. The risk ratio of mortality for stable patients compared to nonresponders was 0.32 (95% CI 0.11-0.93; p=0.04) in all patients and 0.33 (95% CI 0.12-0.96; p=0.04) in patients with UIP. The histological diagnosis of usual interstitial pneumonia is the most important factor determining survival in patients with suspected idiopathic interstitial pneumonia. The presence of honeycombing on high-resolution computed tomography is a good surrogate for usual interstitial pneumonia and could be utilized in patients unable to undergo surgical lung biopsy. Patients with nonspecific interstitial pneumonia are more likely to respond or remain stable following a course of prednisone. Patients remaining stable following prednisone therapy have the best prognosis.
    European Respiratory Journal 03/2002; 19(2):275-83. · 6.36 Impact Factor

Publication Stats

1k Citations
123.88 Total Impact Points

Institutions

  • 2004–2012
    • Cornell University
      • Department of Statistical Science
      Ithaca, NY, United States
  • 1994–2002
    • University of Michigan
      • Department of Biostatistics
      • Department of Internal Medicine
      Ann Arbor, MI, United States