Chirag R Parikh

Yale University, New Haven, Connecticut, United States


Publications (271) · 1,725.81 Total Impact Points

  •
    ABSTRACT: Background and aims: Cirrhosis affects 5.5 million patients, with estimated costs of US$4 billion. Previous studies of dialysis-requiring acute kidney injury (AKI-D) in decompensated cirrhosis (DC) are from a single center/year. We aimed to describe national trends in the incidence and impact of AKI-D in DC hospitalizations. Methods: We extracted our cohort from the Nationwide Inpatient Sample (NIS) from 2006-2012. We identified hospitalizations with DC and AKI-D by validated ICD-9 codes. We analyzed temporal changes in DC hospitalizations complicated by AKI-D and used multivariable logistic regression models to estimate the impact of AKI-D on hospital mortality. Results: We identified a total of 3,655,700 adult DC hospitalizations from 2006 to 2012, of which 78,015 (2.1%) had AKI-D. The proportion with AKI-D increased from 1.5% in 2006 to 2.23% in 2012; it was stable between 2009 and 2012 despite an increase in absolute numbers from 6773 to 13,930. Overall hospital mortality was significantly higher in hospitalizations with AKI-D than in those without (40.87% vs. 6.96%; p < 0.001). In multivariable analysis, the adjusted odds ratio for mortality with AKI-D was 2.17 (95% CI 2.06-2.28; p < 0.01), which was stable from 2006 to 2012. Changes in demographics and increases in acute/chronic comorbidities and procedures explained the temporal changes in AKI-D. Conclusions: The proportion of DC hospitalizations with AKI-D increased from 2006 to 2009, and although it was stable from 2009 to 2012, absolute cases continued to increase. These results elucidate the burden of AKI-D on DC hospitalizations and the associated excess mortality, and highlight the importance of prevention, early diagnosis and testing of novel interventions in this vulnerable population.
    Article · Jan 2016 · Hepatology International
  •
    ABSTRACT: Aims: Hyponatraemia is strongly associated with adverse outcomes in heart failure. However, accumulating evidence suggests that chloride may play an important role in renal salt sensing and regulation of neurohormonal and sodium-conserving pathways. Our objective was to determine the prognostic importance of hypochloraemia in patients with heart failure. Methods and results: Patients in the BEST trial with baseline serum chloride values were evaluated (n = 2699). Hypochloraemia was defined as a serum chloride ≤96 mmol/L and hyponatraemia as serum sodium ≤135 mmol/L. Hypochloraemia was present in 13.0% and hyponatraemia in 13.7% of the population. Chloride and sodium were only modestly correlated (r = 0.53), resulting in only 48.7% of hypochloraemic patients having concurrent hyponatraemia. Both hyponatraemia and hypochloraemia identified a population with greater disease severity; however, renal function tended to be worse and loop diuretic doses higher with hypochloraemia. In univariate analysis, lower serum sodium or serum chloride as continuous parameters were each strongly associated with mortality (P < 0.001). However, when both parameters were included in the same model, serum chloride remained strongly associated with mortality [hazard ratio (HR) 1.3 per standard deviation decrease, 95% confidence interval (CI) 1.18-1.42, P < 0.001], whereas sodium was not (HR 0.97 per standard deviation decrease, 95% CI 0.89-1.06, P = 0.52). Conclusion: Serum chloride is strongly and independently associated with worsened survival in patients with chronic heart failure and accounted for the majority of the risk otherwise attributable to hyponatraemia. Given the critical role of chloride in a number of regulatory pathways central to heart failure pathophysiology, additional research is warranted in this area.
    Article · Jan 2016 · European Journal of Heart Failure
  •
    ABSTRACT: Reduction in systolic blood pressure (SBP reduction) during the treatment of acute decompensated heart failure is strongly and independently associated with worsening renal function. Our objective was to determine whether SBP reduction or titration of oral neurohormonal antagonists during acute decompensated heart failure treatment negatively influences diuresis and decongestion. Methods and Results: SBP reduction was evaluated from admission to discharge in consecutive acute decompensated heart failure admissions (n=656). Diuresis and decongestion were examined across a range of parameters, such as diuretic efficiency, fluid output, hemoconcentration, and diuretic dose. The average reduction in SBP was 14.4±19.4 mm Hg, and 77.6% of the population had a discharge SBP lower than on admission. SBP reduction was strongly associated with worsening renal function (odds ratio, 1.9; 95% confidence interval, 1.2-2.9; P=0.004), a finding that persisted after adjusting for parameters of diuresis and decongestion (odds ratio, 2.0; 95% confidence interval, 1.3-3.2; P=0.002). However, SBP reduction did not negatively affect diuresis or decongestion (P≥0.25 for all parameters). Uptitration of neurohormonal antagonists occurred in >50% of admissions and was associated with a modest additional reduction in blood pressure (≤5.6 mm Hg). Notably, worsening renal function was not increased, and diuretic efficiency was significantly improved with the uptitration of neurohormonal antagonists. Conclusions: Despite a higher rate of worsening renal function, blood pressure reduction was not associated with worsening of diuresis or decongestion. Furthermore, titration of oral neurohormonal antagonists was actually associated with improved diuresis in this cohort. These results provide reassurance that the guideline-recommended titration of chronic oral medication during acute decompensated heart failure hospitalization may not be antagonistic to the short-term goal of decongestion.
    Article · Jan 2016 · Circulation Heart Failure
  •
    ABSTRACT: Background: Removal of excess sodium and fluid is a primary therapeutic objective in acute decompensated heart failure and commonly monitored with fluid balance and weight loss. However, these parameters are frequently inaccurate or not collected and require a delay of several hours after diuretic administration before they are available. Accessible tools for rapid and accurate prediction of diuretic response are needed. Methods and results: Based on well-established renal physiological principles, an equation was derived to predict net sodium output using a spot urine sample obtained 1 or 2 hours after loop diuretic administration. This equation was then prospectively validated in 50 acute decompensated heart failure patients using meticulously obtained timed 6-hour urine collections to quantify loop diuretic-induced cumulative sodium output. Poor natriuretic response was defined as a cumulative sodium output of <50 mmol, a threshold that would result in a positive sodium balance with twice-daily diuretic dosing. Following a median dose of 3 mg (2-4 mg) of intravenous bumetanide, 40% of the population had a poor natriuretic response. The correlation between measured and predicted sodium output was excellent (r=0.91; P<0.0001). Poor natriuretic response could be accurately predicted with the sodium prediction equation (area under the curve =0.95, 95% confidence interval 0.89-1.0; P<0.0001). Clinically recorded net fluid output had a weaker correlation (r=0.66; P<0.001) and lesser ability to predict poor natriuretic response (area under the curve =0.76, 95% confidence interval 0.63-0.89; P=0.002). Conclusions: In patients being treated for acute decompensated heart failure, poor natriuretic response can be predicted soon after diuretic administration with excellent accuracy using a spot urine sample.
    Article · Jan 2016 · Circulation Heart Failure
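The prediction equation itself is not reproduced in the abstract above. As a hedged illustration of the physiologic principle it describes (a spot urine sodium-to-creatinine ratio scaled by the creatinine expected to be excreted over the collection window), a minimal sketch might look like the following; the function names, inputs, and the way creatinine excretion is supplied are assumptions for illustration, not the paper's equation:

```python
POOR_RESPONSE_MMOL = 50  # cumulative Na output below this is a poor response (per the abstract)

def predicted_sodium_output_mmol(urine_na_mmol_per_l, urine_cr_mg_per_dl,
                                 est_cr_excretion_mg):
    """Predict cumulative sodium output for the collection window.

    Principle: during a diuretic-induced diuresis, sodium and creatinine
    are excreted together, so the spot urine Na/Cr ratio multiplied by the
    creatinine expected to be excreted in the window approximates total
    sodium output. est_cr_excretion_mg is taken here as a given input;
    the paper derives its estimate from patient characteristics.
    """
    mg_cr_per_l = urine_cr_mg_per_dl * 10.0        # convert mg/dL -> mg/L
    na_per_mg_cr = urine_na_mmol_per_l / mg_cr_per_l  # mmol Na per mg Cr
    return na_per_mg_cr * est_cr_excretion_mg

def poor_natriuretic_response(predicted_na_mmol):
    """Flag a predicted poor responder (cumulative output < 50 mmol)."""
    return predicted_na_mmol < POOR_RESPONSE_MMOL
```

For example, a spot urine sodium of 60 mmol/L and urine creatinine of 50 mg/dL with an assumed 375 mg of creatinine excreted over 6 hours predicts 45 mmol of sodium output, below the 50 mmol poor-response threshold.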
  •
    ABSTRACT: Background: Studies suggest that in patients with heart failure (HF), high serum erythropoietin is associated with risk of recurrent HF and mortality. Trials of erythropoietin-stimulating agents in persons with kidney disease have also suggested an increased incidence of adverse clinical events. No large studies of which we are aware have evaluated the association of endogenous erythropoietin levels with clinical outcomes in community-living older adults. Methods and results: Erythropoietin concentration was measured in 2488 participants aged 70-79 years in the Health, Aging and Body Composition Study. Associations of erythropoietin with incident HF, coronary heart disease, stroke, mortality, and ≥30% decline in estimated glomerular filtration rate were examined using Cox proportional hazards and logistic regression over 10.7 years of follow-up. Mean (SD) age was 75 (3) years and median (quartile 1, quartile 3) erythropoietin was 12.3 (9.0, 17.2) mIU/mL. There were 503 incident HF events, and each doubling of serum erythropoietin was associated with a 25% increased risk of incident HF (hazard ratio 1.25; 95% confidence interval, 1.13-1.48) after adjusting for demographics, prevalent cardiovascular disease, cardiovascular disease risk factors, kidney function, and serum hemoglobin. There was no interaction of serum erythropoietin with chronic kidney disease or anemia (P>0.50). There were 330 incident coronary heart disease events, 161 strokes, 1112 deaths, and 698 outcomes of ≥30% decline in estimated glomerular filtration rate. Serum erythropoietin was not significantly associated with these outcomes. Conclusions: Higher levels of endogenous erythropoietin are associated with incident HF in older adults. Further studies are needed to elucidate the mechanisms through which endogenous erythropoietin levels associate with specific outcomes.
    Article · Jan 2016 · Circulation Heart Failure
  •
    ABSTRACT: Observational studies have shown that acute change in kidney function (specifically, AKI) is a strong risk factor for poor outcomes. Thus, the outcome of acute change in serum creatinine level, regardless of underlying biology or etiology, is frequently used in clinical trials as both efficacy and safety end points. We performed a meta-analysis of clinical trials to quantify the relationship between positive or negative short-term effects of interventions on change in serum creatinine level and more meaningful clinical outcomes. After a thorough literature search, we included 14 randomized trials of interventions that altered risk for an acute increase in serum creatinine level and had reported between-group differences in CKD and/or mortality rate ≥3 months after randomization. Seven trials assessed interventions that, compared with placebo, increased risk of acute elevation in serum creatinine level (pooled relative risk, 1.52; 95% confidence interval, 1.22 to 1.89), and seven trials assessed interventions that, compared with placebo, reduced risk of acute elevation in serum creatinine level (pooled relative risk, 0.57; 95% confidence interval, 0.44 to 0.74). However, pooled risks for CKD and mortality associated with interventions did not differ from those with placebo in either group. In conclusion, several interventions that affect risk of acute, mild to moderate, often temporary elevation in serum creatinine level in placebo-controlled randomized trials showed no appreciable effect on CKD or mortality months later, raising questions about the value of using small to moderate changes in serum creatinine level as end points in clinical trials.
    Article · Dec 2015 · Journal of the American Society of Nephrology
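The pooled relative risks in the meta-analysis above are the kind of quantity produced by standard fixed-effect inverse-variance pooling on the log-RR scale. A minimal sketch follows; the function name and the recovery of standard errors from 95% confidence bounds are standard conventions, but the review's exact pooling method is not specified in the abstract:

```python
import math

def pooled_relative_risk(rrs, cis):
    """Fixed-effect inverse-variance pooling of relative risks.

    rrs: list of per-study relative risks.
    cis: list of (lower, upper) 95% confidence bounds; the standard error
    of each log-RR is recovered as (ln(upper) - ln(lower)) / (2 * 1.96).
    Returns the pooled RR and its 95% CI.
    """
    weights, log_rrs = [], []
    for rr, (lo, hi) in zip(rrs, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        weights.append(1.0 / se**2)          # inverse-variance weight
        log_rrs.append(math.log(rr))
    pooled_log = sum(w * l for w, l in zip(weights, log_rrs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled_log),
            (math.exp(pooled_log - 1.96 * se_pooled),
             math.exp(pooled_log + 1.96 * se_pooled)))
```

Pooling a single study returns that study's RR and CI unchanged, which is a useful sanity check on the weighting.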
  •
    ABSTRACT: Background and objectives: Data reported to the Organ Procurement and Transplantation Network (OPTN) are used in kidney transplant research, policy development, and assessment of center quality, but the accuracy of early post-transplant outcome measures is unknown. Design, setting, participants, & measurements: The Deceased Donor Study (DDS) is a prospective cohort study at five transplant centers. Research coordinators manually abstracted data from electronic records for 557 adults who underwent deceased donor kidney transplantation between April of 2010 and November of 2013. We compared the post-transplant outcomes of delayed graft function (DGF; defined as dialysis in the first post-transplant week), acute rejection, and post-transplant serum creatinine reported to the OPTN with data collected for the DDS. Results: Median kidney donor risk index was 1.22 (interquartile range [IQR], 0.97-1.53). Median recipient age was 55 (IQR, 46-63) years old, 63% were men, and 47% were black; 93% had received dialysis before transplant. Using DDS data as the gold standard, we found that pretransplant dialysis was not reported to the OPTN in only 11 (2%) instances. DGF in OPTN data had a sensitivity of 89% (95% confidence interval [95% CI], 84% to 93%) and specificity of 98% (95% CI, 96% to 99%). Surprisingly, the OPTN data accurately identified acute allograft rejection in only 20 of 47 instances (n=488; sensitivity of 43%; 95% CI, 17% to 73%). Across participating centers, sensitivity of acute rejection varied widely from 23% to 100%, whereas specificity was uniformly high (92%-100%). Six-month serum creatinine values in DDS and OPTN data had high concordance (n=490; Lin concordance correlation =0.90; 95% CI, 0.88 to 0.92). Conclusions: OPTN outcomes for recipients of deceased donor kidney transplants have high validity for DGF and 6-month allograft function but lack sensitivity in detecting rejection. 
Future studies using OPTN data may consider focusing on allograft function at 6 months as a useful outcome.
    Article · Dec 2015 · Clinical Journal of the American Society of Nephrology
  •
    ABSTRACT: This study sought to determine if amino-terminal pro-B-type natriuretic peptide (NT-proBNP) has different diagnostic and prognostic utility in patients with renal dysfunction.
    Article · Dec 2015 · JACC: Heart Failure
  •
    ABSTRACT: Reviewing the literature in many fields on proposed risk models reveals problems with the way many risk models are developed. Furthermore, papers reporting new risk models do not always provide sufficient information to allow readers to assess the merits of the model. In this review, we discuss sources of bias that can arise in risk model development. We focus on two biases that can be introduced during data analysis. These two sources of bias are sometimes conflated in the literature and we recommend the terms resubstitution bias and model-selection bias to delineate them. We also propose the RiGoR reporting standard to improve transparency and clarity of published papers proposing new risk models.
    Full-text · Article · Dec 2015
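Resubstitution bias, as discussed in the review above, can be demonstrated in a few lines: select a model (here, a simple threshold classifier) to maximize accuracy on a pure-noise predictor, then score it on the same data, and the apparent accuracy exceeds the honest held-out estimate. All names and data below are illustrative:

```python
import random

random.seed(0)

def fit_threshold(xs, ys):
    """Choose the cut-point on x that maximizes accuracy on (xs, ys).

    Candidates include a value below min(xs) so the trivial
    'predict all positive' rule is always available.
    """
    best_t, best_acc = None, -1.0
    for t in [min(xs) - 1.0] + sorted(xs):
        acc = sum((x > t) == y for x, y in zip(xs, ys)) / len(ys)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def accuracy(t, xs, ys):
    return sum((x > t) == y for x, y in zip(xs, ys)) / len(ys)

# A pure-noise "biomarker": no rule can truly beat 50% accuracy.
n = 200
x_train = [random.random() for _ in range(n)]
y_train = [random.random() < 0.5 for _ in range(n)]
x_test = [random.random() for _ in range(n)]
y_test = [random.random() < 0.5 for _ in range(n)]

t = fit_threshold(x_train, y_train)
resub_acc = accuracy(t, x_train, y_train)   # resubstitution estimate (optimistic)
honest_acc = accuracy(t, x_test, y_test)    # held-out estimate (near 0.5)
```

The resubstitution estimate is always at least 50% here by construction, while the held-out estimate hovers around chance; the gap is the bias the review warns about.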
  •
    ABSTRACT: Hypothermic machine perfusion (HMP) is increasingly used in deceased-donor kidney transplantation, but controversy exists regarding the value of perfusion biomarkers and pump parameters for assessing organ quality. We prospectively determined associations between perfusate biomarkers [neutrophil gelatinase-associated lipocalin (NGAL), kidney injury molecule-1 (KIM-1), interleukin-18 (IL-18) and liver-type fatty acid-binding protein (L-FABP)] and pump parameters (resistance and flow) with outcomes of delayed graft function (DGF) and 6-month estimated glomerular filtration rate (eGFR). DGF occurred in 230/671 (34%) recipients. Only 1-hour flow was inversely associated with DGF. Higher NGAL or L-FABP concentrations and increased resistance were inversely associated with 6-month eGFR, while higher flow was associated with higher adjusted 6-month eGFR. Discarded kidneys had consistently higher median resistance and lower median flow than transplanted kidneys, but median perfusate biomarker concentrations were either lower or not significantly different in discarded compared with transplanted kidneys. Notably, most recipients of transplanted kidneys with isolated “undesirable” biomarker levels or HMP parameters experienced acceptable 6-month allograft function, suggesting these characteristics should not be used in isolation for discard decisions. Additional studies must confirm the utility of combining HMP measurements with other characteristics to assess kidney quality.
    Article · Dec 2015 · American Journal of Transplantation

  • Article · Nov 2015 · JAMA Internal Medicine
  •
    ABSTRACT: Background: Renal dysfunction (RD) is a potent risk factor for death in patients with cardiovascular disease. This relationship may be causal since experimentally induced RD produces findings such as myocardial necrosis and apoptosis in animals. Cardiac transplantation provides an opportunity to investigate this hypothesis in humans. Methods and results: Cardiac transplantations from the UNOS registry were studied (n=23,056). RD was defined as an estimated glomerular filtration rate < 60 ml/min/1.73m(2). RD was present in 17.9% of donors and 39.4% of recipients. Unlike multiple donor characteristics, such as older age, hypertension or diabetes, donor RD was not associated with recipient death or retransplantation (age-adjusted HR=1.00, 95% CI 0.94-1.07, p=0.92). Moreover, in recipients with RD the highest risk for death or retransplantation occurred immediately post-transplant (0-30 day HR=1.8, 95% CI 1.54-2.02, p<0.001) with subsequent attenuation of the risk over time (30-365 day HR=0.92, 95% CI 0.77-1.09, p=0.33). Conclusions: The risk for adverse recipient outcomes associated with RD does not appear to be transferrable from donor to recipient via the cardiac allograft and the risk associated with recipient RD is greatest immediately following transplant. These observations suggest that the risk for adverse outcomes associated with RD is likely primarily driven by non-myocardial factors.
    Article · Nov 2015 · Journal of cardiac failure
  • Stuart D. Katz · Chirag R. Parikh

    Article · Oct 2015 · Journal of the American College of Cardiology
  •
    ABSTRACT: Background: In Central America, an epidemic of chronic kidney disease of unknown cause disproportionately affects young male agricultural workers. Study design: Longitudinal cohort study. Setting & participants: 284 sugarcane workers in 7 jobs were recruited from one company in northwestern Nicaragua. Blood and urine samples were collected before and near the end of the 6-month harvest season. Predictors: Job category (cane cutter, seeder, seed cutter, agrichemical applicator, irrigator, driver, and factory worker); self-reported water and electrolyte solution intake. Outcomes & measurements: Changes in levels of urinary kidney injury biomarkers normalized to urine creatinine level, including neutrophil gelatinase-associated lipocalin (NGAL), interleukin 18 (IL-18), N-acetyl-β-d-glucosaminidase (NAG), and albumin; serum creatinine-based estimated glomerular filtration rate (eGFR). Results: Mean eGFR was 113 mL/min/1.73 m(2) and <5% of workers had albuminuria. Field workers had increases in NGAL and IL-18 levels that were 1.49 (95% CI, 1.06 to 2.09) and 1.61 (95% CI, 1.12 to 2.31) times as high, respectively, as in non-field workers. Cane cutters and irrigators had the greatest increases in NGAL levels during the harvest, whereas cane cutters and seeders had the greatest increases in IL-18 levels. Electrolyte solution consumption was associated with lower mean NGAL and NAG levels among cane cutters and lower mean IL-18 and NAG levels among seed cutters; however, there was no overall effect of hydration among all workers. On average, workers with the largest increases in NGAL and NAG levels during the harvest had declines in eGFRs of 4.6 (95% CI, 1.0 to 8.2) and 3.1 (95% CI, -0.6 to 6.7) mL/min/1.73 m(2), respectively. Limitations: Surrogate exposure measure, loss to follow-up. 
Conclusions: Results are consistent with the hypothesis that occupational heat stress and volume depletion may be associated with the development of kidney disease, and future studies should directly measure these occupational factors. The presence of urine tubular injury markers supports a tubulointerstitial disease that could occur with repeated tubular injury.
    Full-text · Article · Oct 2015 · American Journal of Kidney Diseases
  •
    ABSTRACT: Individual biomarkers of renal injury are only modestly predictive of acute kidney injury (AKI). Using multiple biomarkers has the potential to improve predictive capacity. In this systematic review, statistical methods of articles developing biomarker combinations to predict AKI were assessed. We identified and described three potential sources of bias (resubstitution bias, model selection bias, and bias due to center differences) that may compromise the development of biomarker combinations. Fifteen studies reported developing kidney injury biomarker combinations for the prediction of AKI after cardiac surgery (8 articles), in the intensive care unit (4 articles), or other settings (3 articles). All studies were susceptible to at least one source of bias and did not account for or acknowledge the bias. Inadequate reporting often hindered our assessment of the articles. We then evaluated, when possible (7 articles), the performance of published biomarker combinations in the TRIBE-AKI cardiac surgery cohort. Predictive performance was markedly attenuated in six out of seven cases. Thus, deficiencies in analysis and reporting are avoidable, and care should be taken to provide accurate estimates of risk prediction model performance. Hence, rigorous design, analysis, and reporting of biomarker combination studies are essential to realizing the promise of biomarkers in clinical practice. Kidney International advance online publication, 23 September 2015; doi:10.1038/ki.2015.283.
    Article · Sep 2015 · Kidney International
  •
    ABSTRACT: Background: The interaction between baseline kidney function and the performance of biomarkers of acute kidney injury (AKI) on the development of AKI is unclear. Study design: Post hoc analysis of prospective cohort study. Setting & participants: The 1,219 TRIBE-AKI Consortium adult cardiac surgery cohort participants. Predictor: Unadjusted postoperative urinary biomarkers of AKI measured within 6 hours of surgery. Outcome: AKI was defined as AKI Network stage 1 (any AKI) or higher, as well as a doubling of serum creatinine level from the preoperative value or the need for post-operative dialysis (severe AKI). Measurements: Stratified analyses by preoperative estimated glomerular filtration rate (eGFR) ≤ 60 versus > 60mL/min/1.73m(2). Results: 180 (42%) patients with preoperative eGFRs≤60mL/min/1.73m(2) developed clinical AKI compared with 246 (31%) of those with eGFRs>60mL/min/1.73m(2) (P<0.001). For log2-transformed biomarker concentrations, there was a significant interaction between any AKI and baseline eGFR for interleukin 18 (P=0.007) and borderline significance for liver-type fatty acid binding protein (P=0.06). For all biomarkers, the adjusted relative risk (RR) point estimates for the risk for any AKI were higher in those with elevated baseline eGFRs compared with those with eGFRs≤60mL/min/1.73m(2). However, the difference in magnitude of these risks was low (adjusted RRs were 1.04 [95% CI, 0.99-1.09] and 1.11 [95% CI, 1.07-1.15] for those with preoperative eGFRs≤60mL/min/1.73m(2) and those with higher eGFRs, respectively). Although no biomarker displayed an interaction for baseline eGFR and severe AKI, log2-transformed interleukin 18 and kidney injury molecule 1 had significant adjusted RRs for severe AKI in those with and without baseline eGFRs≤60mL/min/1.73m(2). Limitations: Limited numbers of patients with severe AKI and post-operative dialysis. 
Conclusions: The association between early postoperative urinary AKI biomarkers and AKI is modified by preoperative eGFR. The degree of this modification, and its impact on the biomarker-AKI association, is small across biomarkers. Our findings suggest that distinct biomarker cutoffs for those with and without a preoperative eGFR ≤60 mL/min/1.73 m(2) are not necessary.
    Article · Sep 2015 · American Journal of Kidney Diseases
  •
    ABSTRACT: Assessment of deceased-donor organ quality is integral to transplant allocation practices, but tools to more precisely measure donor kidney injury and better predict outcomes are needed. In this study, we assessed associations between injury biomarkers in deceased-donor urine and the following outcomes: donor AKI (stage 2 or greater), recipient delayed graft function (defined as dialysis in first week post-transplant), and recipient 6-month eGFR. We measured urinary concentrations of microalbumin, neutrophil gelatinase-associated lipocalin (NGAL), kidney injury molecule-1 (KIM-1), IL-18, and liver-type fatty acid binding protein (L-FABP) from 1304 deceased donors at organ procurement, among whom 112 (9%) had AKI. Each biomarker strongly associated with AKI in adjusted analyses. Among 2441 kidney transplant recipients, 31% experienced delayed graft function, and mean±SD 6-month eGFR was 55.7±23.5 ml/min per 1.73 m(2). In analyses adjusted for donor and recipient characteristics, higher donor urinary NGAL concentrations associated with recipient delayed graft function (highest versus lowest NGAL tertile relative risk, 1.21; 95% confidence interval, 1.02 to 1.43). Linear regression analyses of 6-month recipient renal function demonstrated that higher urinary NGAL and L-FABP concentrations associated with slightly lower 6-month eGFR only among recipients without delayed graft function. In summary, donor urine injury biomarkers strongly associate with donor AKI but provide limited value in predicting delayed graft function or early allograft function after transplant.
    Article · Sep 2015 · Journal of the American Society of Nephrology
  •
    ABSTRACT: Use of small changes in serum creatinine to diagnose AKI allows for earlier detection but may increase diagnostic false-positive rates because of inherent laboratory and biologic variabilities of creatinine. We examined serum creatinine measurement characteristics in a prospective observational clinical reference cohort of 2267 adult patients with AKI by Kidney Disease Improving Global Outcomes creatinine criteria and used these data to create a simulation cohort to model AKI false-positive rates. We simulated up to seven successive blood draws on an equal population of hypothetical patients with unchanging true serum creatinine values. Error terms generated from laboratory and biologic variabilities were added to each simulated patient's true serum creatinine value to obtain the simulated measured serum creatinine for each blood draw. We determined the proportion of patients who would be erroneously diagnosed with AKI by Kidney Disease Improving Global Outcomes creatinine criteria. Within the clinical cohort, 75.0% of patients received four serum creatinine draws within at least one 48-hour period during hospitalization. After four simulated creatinine measurements that accounted for laboratory variability calculated from assay characteristics and a biologic variability of 4.4% determined from the clinical cohort and publicly available data, the overall false-positive rate for AKI diagnosis was 8.0% (interquartile range, 7.9%-8.1%), whereas patients with true serum creatinine ≥1.5 mg/dl (representing 21% of the clinical cohort) had a false-positive AKI diagnosis rate of 30.5% (interquartile range, 30.1%-30.9%) versus 2.0% (interquartile range, 1.9%-2.1%) in patients with true serum creatinine values <1.5 mg/dl (P<0.001). Use of small serum creatinine changes to diagnose AKI is limited by high false-positive rates caused by inherent variability of serum creatinine at higher baseline values, potentially misclassifying patients with CKD in AKI studies.
    Full-text · Article · Sep 2015 · Clinical Journal of the American Society of Nephrology
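The simulation described above can be sketched as a small Monte Carlo: hold each patient's true creatinine fixed, add laboratory and biologic noise to every draw, and flag a false positive whenever a later draw exceeds any earlier draw by 0.3 mg/dL (the KDIGO absolute criterion). The 2% analytic CV below is an assumption for illustration; the 4.4% biologic CV mirrors the figure the study derived, and the exact model the authors used may differ:

```python
import random

random.seed(42)

KDIGO_DELTA = 0.3  # mg/dL absolute creatinine rise defining AKI within 48 h

def false_positive_rate(true_cr, n_draws=4, cv_lab=0.02, cv_bio=0.044,
                        n_patients=20_000):
    """Fraction of stable patients (constant true creatinine) falsely
    flagged as AKI by the 0.3 mg/dL criterion through noise alone.

    cv_lab is an assumed analytic CV; cv_bio mirrors the 4.4% biologic
    variability reported in the study. Noise scales with the true level,
    which is why higher baselines produce more false positives.
    """
    sd = true_cr * (cv_lab**2 + cv_bio**2) ** 0.5
    flagged = 0
    for _ in range(n_patients):
        lowest = random.gauss(true_cr, sd)      # first measured draw
        for _ in range(n_draws - 1):
            d = random.gauss(true_cr, sd)
            if d - lowest >= KDIGO_DELTA:       # rise vs. any earlier draw
                flagged += 1
                break
            lowest = min(lowest, d)
    return flagged / n_patients

low_baseline = false_positive_rate(0.8)   # true creatinine 0.8 mg/dL
high_baseline = false_positive_rate(3.0)  # true creatinine 3.0 mg/dL
```

Running this reproduces the qualitative finding: at a 0.8 mg/dL baseline the absolute noise is far below 0.3 mg/dL and false positives are rare, while at 3.0 mg/dL they become common.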
  •
    ABSTRACT: An epidemic of chronic kidney disease (CKD) of non-traditional aetiology has been recently recognized by health authorities as a public health priority in Central America. Previous studies have identified strenuous manual work, agricultural activities and residence at low altitude as potential risk factors; however, the aetiology remains unknown. Because individuals are frequently diagnosed with CKD in early adulthood, we measured biomarkers of kidney injury among adolescents in different regions of Nicaragua to assess whether kidney damage might be initiated during childhood. Participants include 200 adolescents aged 12-18 years with no prior work history from four different schools in Nicaragua. The location of the school served as a proxy for environmental exposures and geographic locations were selected to represent a range of factors that have been associated with CKD in adults (e.g. altitude, primary industry and CKD mortality rates). Questionnaires, urine dipsticks and kidney injury biomarkers [interleukin-18, N-acetyl-d-glucosaminidase (NAG), neutrophil gelatinase-associated lipocalin (NGAL) and albumin-creatinine ratio] were assessed. Biomarker concentrations were compared by school using linear regression models. Protein (3.5%) and glucose (1%) in urine measured by dipstick were rare and did not differ by school. Urine biomarkers of tubular kidney damage, particularly NGAL and NAG, showed higher concentrations in those schools and regions within Nicaragua that were defined a priori as having increased CKD risk. Painful urination was a frequent self-reported symptom. Although interpretation of these urine biomarkers is limited because of the lack of population reference values, results suggest the possibility of early kidney damage prior to occupational exposures in these adolescents.
    Article · Aug 2015 · Nephrology Dialysis Transplantation
  • Justin M Belcher · Steven G Coca · Chirag R Parikh
    ABSTRACT: Background and aims: Hepatorenal syndrome (HRS) is a severe complication of cirrhosis and is associated with significant mortality. Vasoconstrictor medications improve renal function in patients with HRS. However, it is unclear to what extent changes in serum creatinine during treatment may act as a surrogate for changes in mortality. We have performed a meta-analysis of randomized trials of vasoconstrictors assessing the association between changes in serum creatinine, taken as a continuous variable, and mortality, both while on treatment and during the follow-up period for survivors. Methods: The electronic databases of PubMed, Web of Science and Embase were searched for randomized trials evaluating the efficacy of vasoconstrictor therapy for treatment of HRS type 1 or 2. The relative risk (RR) for mortality was calculated against delta creatinine. The proportion of treatment effect explained (PTE) was calculated for delta creatinine. Results: Seven trials enrolling 345 patients were included. The correlation between delta creatinine and ln(RR) was moderately good (R² = 0.61). The intercept and parameter estimate indicated that a fall in creatinine of 1 mg/dL while on treatment resulted in a 27% reduction in RR for mortality compared to the control arm. In patients surviving the treatment period, a fall in creatinine of 1 mg/dL while on treatment resulted in a 16% reduction in RR for post-treatment mortality during follow-up. The PTE of delta creatinine was 0.91 for overall mortality and 0.26 for post-treatment mortality. Conclusions: Changes in serum creatinine in response to vasoconstrictor therapy appear to be a valid surrogate for mortality, even in the period following the completion of treatment.
    Full-text · Article · Aug 2015 · PLoS ONE
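The relationship the abstract describes, regressing ln(RR) for mortality on delta creatinine across trials, can be sketched with ordinary least squares. The function name and sign convention below are assumptions for illustration; this is not the authors' exact model:

```python
import math

def fit_log_rr_vs_delta_cr(delta_cr, rr):
    """Least-squares fit of ln(RR) = a + b * delta_cr across trials.

    Sign convention (assumed): delta_cr = end creatinine - start
    creatinine, so a 1 mg/dL on-treatment fall is delta_cr = -1 and
    exp(-b) is the corresponding multiplicative change in mortality RR.
    """
    y = [math.log(r) for r in rr]
    n = len(y)
    mean_x = sum(delta_cr) / n
    mean_y = sum(y) / n
    b = (sum((x - mean_x) * (v - mean_y) for x, v in zip(delta_cr, y))
         / sum((x - mean_x) ** 2 for x in delta_cr))
    a = mean_y - b * mean_x
    return a, b
```

With a fitted slope of about 0.31, a 1 mg/dL fall (delta_cr = -1) would give an RR multiplier of exp(-0.31) ≈ 0.73, i.e., roughly the 27% reduction reported.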

Publication Stats

9k Citations
1,725.81 Total Impact Points


  • 2005-2016
    • Yale University
      • School of Medicine
      • Section of Nephrology
      • Department of Internal Medicine
      New Haven, Connecticut, United States
  • 2014
    • San Francisco VA Medical Center
      San Francisco, California, United States
  • 2005-2014
    • Yale-New Haven Hospital
      • Department of Pathology
      New Haven, Connecticut, United States
  • 2012
    • National Institutes of Health
      • National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK)
      Bethesda, Maryland, United States
    • Indiana University-Purdue University Indianapolis
      • Department of Medicine
      Indianapolis, Indiana, United States
  • 2011
    • West Haven University
      West Haven, Connecticut, United States
    • United States Department of Veterans Affairs
      Bedford, Massachusetts, United States
  • 2009-2011
    • The University of Western Ontario
      • Department of Medicine
      • Division of Nephrology
      London, Ontario, Canada
  • 2008
    • Virginia Commonwealth University
      • Division of Nephrology
      Richmond, Virginia, United States
  • 2007
    • Minneapolis Veterans Affairs Hospital
      Minneapolis, Minnesota, United States
  • 2004
    • University of Colorado Hospital
      • Department of Medicine
      Denver, Colorado, United States
  • 2002-2004
    • University of Colorado
      • Department of Medicine
      Denver, Colorado, United States
  • 1999
    • King Edward Memorial Hospital
      Mumbai, Maharashtra, India