ABSTRACT: Background: Although dysnatremia has been reported to be correlated with mortality risk, this issue remains unresolved in patients undergoing continuous renal replacement therapy (CRRT). Furthermore, it has not been determined whether change in or correction of sodium is related to mortality risk in this subset. Methods: A total of 569 patients were prospectively enrolled at the start of CRRT between May 2010 and September 2013. The patients were divided into 5 groups: normonatremia (135-145 mmol/L), mild hyponatremia (131.1-134.9 mmol/L), moderate to severe hyponatremia (115.4-131.0 mmol/L), mild hypernatremia (145.1-148.4 mmol/L), and moderate to severe hypernatremia (148.5-166.0 mmol/L). The non-linear relationship between sodium and mortality was initially explored. Subsequently, the odds ratios (ORs) for 30-day mortality were calculated after adjustment for multiple covariates. Results: The relationship between baseline sodium and mortality was U-shaped. The mild hyponatremia, moderate to severe hyponatremia, and moderate to severe hypernatremia groups had greater ORs for mortality (1.65, 1.91, and 2.32, respectively) than the normonatremia group (all P values < 0.05). However, later sodium levels (24 and 72 h after CRRT) did not predict 30-day mortality. Furthermore, the changes in sodium over 24 or 72 h, including the appropriate correction of dysnatremia, did not show any relationship with mortality, irrespective of baseline sodium level. Conclusions: Sodium level at the start of CRRT was a strong predictor of mortality. However, changes in sodium level and the degree of sodium correction were not associated with mortality risk in patients undergoing CRRT.
ABSTRACT: Acute kidney injury (AKI) is a major health concern because it is associated with increased morbidity and mortality. Anemia is related to AKI in several clinical settings. However, the relationship between anemia and AKI, and the effect of anemia on long-term mortality, remain unresolved in critically ill patients. A total of 2,145 patients admitted to the intensive care unit were retrospectively analyzed. We calculated a threshold value of hemoglobin associated with an increased risk of AKI and used this value to define anemia. The odds ratios (ORs) and hazard ratios for AKI and all-cause mortality were calculated after adjusting for multiple covariates. The OR for AKI increased as the hemoglobin level decreased, and the threshold hemoglobin level linked to increasing AKI risk was 10.5 g/dL. We categorized patients into anemia (< 10.5 g/dL) and non-anemia (≥ 10.5 g/dL) groups. The risk of AKI was higher in the anemia group than in the non-anemia group, and this trend remained significant irrespective of the AKI development time (early vs. late) or duration (< 3 days vs. ≥ 3 days). Both anemia and AKI increased the 10-year mortality risk, and this risk prediction was significantly separated by the presence of anemia and AKI. Furthermore, the risk prediction remained consistent irrespective of the AKI severity (i.e., recovery, stage, or duration of AKI). Based on these findings, we urge clinicians to monitor anemia and AKI in critically ill patients.
Article · Nov 2015 · The Tohoku Journal of Experimental Medicine
ABSTRACT: Background:
Although adiponectin levels have been reported to be correlated with albuminuria, this issue remains unresolved in non-diabetic hypertensive subjects, particularly when urinary adiponectin is considered.
Urinary adiponectin levels were examined using an enzyme-linked immunosorbent assay in 229 participants who used olmesartan as an antihypertensive agent. Their albuminuria levels were measured for 16 weeks after randomization and initiation of conventional or intensive diet education. Linear or logistic regression models were applied, as appropriate, to explore the relationship with albuminuria itself or its response after the intervention.
Urinary adiponectin levels were positively related to baseline albuminuria level (r = 0.529). After adjusting for several covariates, the adiponectin level was associated with the albuminuria level (β = 0.446). Among the 159 subjects with baseline macroalbuminuria, the risk of consistent macroalbuminuria (> 300 mg/day) at 16 weeks was higher in the 3rd tertile of adiponectin than in the 1st tertile (odds ratio = 6.9), despite diet education. In contrast, among all subjects, the frequency of normoalbuminuria achievement (< 30 mg/day) at 16 weeks was higher in the 1st tertile than in the 3rd tertile (odds ratio = 13.0).
Urinary adiponectin may be a useful biomarker for albuminuria or its response after treatment in non-diabetic hypertensive patients.
ABSTRACT: Background:
Weights assigned to comorbidities to predict mortality may vary based on the type of index disease and advances in the management of comorbidities. We aimed to develop a modified Charlson comorbidity index (CCI) in incident hemodialysis patients (mCCI-IHD), thereby improving risk stratification for mortality.
Data on 24,738 Koreans who received their first hemodialysis treatment between 2005 and 2008 were obtained from the Korean Health Insurance dataset. The mCCI-IHD score was calculated by summing the weights assigned to individual comorbidities according to their relative prognostic significance, as determined by a multivariate Cox proportional hazards model. The modified index was validated in an independent nationwide prospective cohort (n=1,100).
The Cox proportional hazards model revealed that all comorbidities in the CCI except ulcers significantly predicted mortality. Thus, the mCCI-IHD included 14 comorbidities with re-assigned severity weights. In the validation cohort, both the CCI and the mCCI-IHD were correlated with mortality. However, the mCCI-IHD showed modest but significant increases in c-statistics compared with the CCI at 6 months and 1 year. Analyses using continuous net reclassification improvement revealed that the mCCI-IHD improved net mortality risk reclassification by 24.6% (95% CI, 2.5-46.7; P=0.03), 26.2% (95% CI, 1.0-51.4; P=0.04), and 42.8% (95% CI, 4.9-80.8; P=0.03) with respect to the CCI at 6 months and 1 and 2 years, respectively.
The mCCI-IHD facilitates better risk stratification for mortality in incident hemodialysis patients compared with the CCI, suggesting that it may be a preferred index for use in clinical practice and the statistical analysis of epidemiological studies.
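The scoring scheme described above, summing severity weights assigned to individual comorbidities, can be sketched as follows. The weight values here are hypothetical placeholders for illustration; the published mCCI-IHD weights were derived from the Cox hazard ratios in the Korean cohort and are not reproduced here.

```python
# Hypothetical weights for illustration only; the actual mCCI-IHD
# re-assigned weights based on each comorbidity's prognostic significance.
WEIGHTS = {
    "diabetes": 1,
    "congestive_heart_failure": 2,
    "metastatic_cancer": 6,
}

def comorbidity_score(comorbidities, weights=WEIGHTS):
    # Sum the weights of the comorbidities present;
    # conditions not in the index contribute 0.
    return sum(weights.get(c, 0) for c in comorbidities)

print(comorbidity_score(["diabetes", "congestive_heart_failure"]))  # 3
```

Higher scores then stratify patients into higher predicted-mortality groups, which is what the c-statistic and reclassification analyses above evaluate.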
ABSTRACT: Peritoneal dialysis (PD) is one potential treatment option for patients starting dialysis after graft loss (DAGL). However, the infectious outcomes and their associations with steroid use remain undetermined in these patients.
A total of 41 DAGL patients undergoing PD were recruited. The patients were divided into low- and high-dose steroid groups according to the median level. Additionally, they were categorized into tapering and nontapering groups, with the tapering regimen defined as the withdrawal of steroids within 1 year after starting PD. The primary outcomes, peritonitis and exit-site infection (ESI), were compared between the DAGL patients and 712 transplant-naive (TN) patients.
The overall risk of peritonitis was similar between the TN and DAGL groups. However, when the DAGL group was stratified by the steroid variable, the risk was higher in the high-dose or nontapering steroid groups than in the TN or counterpart steroid groups. The DAGL group had a higher risk of ESI than the TN group, irrespective of steroid dose. When the analysis was stratified by tapering regimen, the difference in ESI risk was observed only between the nontapering group and the TN group; the tapering group had a risk of ESI similar to that of the TN group.
This study is the first to address the risks of peritonitis and ESI together, and it highlights the elevated risk that should be considered when high-dose steroids or a nontapering regimen is used in DAGL patients.
ABSTRACT: Echocardiographic parameters can predict cardiovascular events in several clinical settings. However, which echocardiographic parameter is most predictive of each cardiovascular or non-cardiovascular event in patients starting hemodialysis remains unresolved. Echocardiography was performed in 189 patients at the time of starting hemodialysis. We established the primary outcomes as follows: cardiovascular events (ischemic heart disease, cerebrovascular disease, peripheral artery disease, and acute heart failure), fatal non-cardiovascular events, all-cause mortality, and all combined events. The most predictive echocardiographic parameter was determined using a Cox proportional hazards model with backward selection after adjustment for multiple covariates. Among several echocardiographic parameters, the E/e' ratio and the left ventricular end-diastolic volume (LVEDV) were the strongest predictors of cardiovascular and non-cardiovascular events, respectively. After adjustment for clinical and biochemical covariates, the predictability of E/e' remained consistent, but that of LVEDV did not. When clinical events were further analyzed, the significant echocardiographic parameters were as follows: s' for ischemic heart disease and peripheral artery disease, LVEDV and E/e' for acute heart failure, and E/e' for all-cause mortality and all combined events. However, no echocardiographic parameter independently predicted cerebrovascular disease or non-cardiovascular events. In conclusion, E/e', s', and LVEDV have independent predictive value for several cardiovascular and mortality events.
Article · Jan 2015 · Journal of Korean Medical Science
ABSTRACT: An inverse relationship between 25-hydroxyvitamin D [25(OH)D] status and insulin resistance (IR) has been reported, but many interventional studies have failed to reduce IR with 25(OH)D supplementation. In addition, there has been a paucity of literature on the interaction between 25(OH)D status and IR according to the degree of obesity in Asian subjects. We therefore evaluated the association between 25(OH)D status and IR according to the degree of obesity. Data from the Korea National Health and Nutrition Examination Survey in 2008-2010 were analyzed. The study subjects comprised 10,629 participants aged ≥ 20 years with fasting glucose < 100 mg/dL. IR was estimated by the homeostasis model assessment (HOMA). We found an inverse linear association between 25(OH)D and log_e(HOMA-IR) in multiple linear regression analysis; namely, a 10 ng/mL increase in 25(OH)D was associated with a 0.018 decrease in log_e(HOMA-IR) (p < 0.0001). In the subgroup analysis, we identified a distinct trend in which the inverse linear association between 25(OH)D and log_e(HOMA-IR) became more prominent with the progression of body mass index, waist circumference, or fat mass quartile (Q): -0.009, -0.004, -0.029, and -0.037 in Q1-Q4 of body mass index; -0.004, -0.014, -0.02, and -0.038 in Q1-Q4 of waist circumference; and -0.002, -0.001, -0.017, and -0.025 in Q1-Q4 of fat mass. Thus, the IR-lowering effect of 25(OH)D became more evident with the progression of obesity in an adult Korean population without increased fasting glucose levels. We suggest that proper supplementation of vitamin D might be beneficial in obese Korean adults.
Article · Oct 2014 · The Tohoku Journal of Experimental Medicine
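The HOMA estimate of insulin resistance used in the study above follows the standard formula: fasting glucose (mg/dL) × fasting insulin (µU/mL) / 405. A minimal sketch with illustrative values (not data from the study):

```python
import math

def homa_ir(glucose_mg_dl: float, insulin_uU_ml: float) -> float:
    # Standard HOMA-IR formula for glucose measured in mg/dL.
    return glucose_mg_dl * insulin_uU_ml / 405.0

# Illustrative values only: 90 mg/dL glucose, 9 uU/mL insulin.
ir = homa_ir(90.0, 9.0)
print(round(ir, 2), round(math.log(ir), 3))  # 2.0 0.693
```

The regression in the abstract models the natural-log transform, log_e(HOMA-IR), which is why effect sizes are reported as decreases in the log scale.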
ABSTRACT: Background
Vitamin D deficiencies and increases in urinary albumin excretion (UAE) are both important and potentially related health problems; however, the nature of their relationship has not been established in normoalbuminuric subjects.
We obtained data from 14,594 normoalbuminuric Korean adults who underwent voluntary health screenings. We used a generalized additive model to examine the threshold level for the relationship between serum 25-hydroxyvitamin D [25(OH)D] and urinary albumin-creatinine ratio (UACR) levels. We conducted multivariate logistic regression for high-normal UAE (UACR, 10–29 mg/g) according to various categories of vitamin D status.
The generalized additive model confirmed a non-linear relationship between serum 25(OH)D and UACR levels, and the threshold concentration of 25(OH)D was 8.0 ng/mL after multivariate adjustment. Comparing subjects who fell into the lowest category of serum 25(OH)D levels with subjects who were in the reference range (the highest category), we observed that the multivariate-adjusted odds ratio (OR) for high-normal UAE was significantly increased, regardless of the criteria used to categorize vitamin D levels: OR of the 1st quartile over the 4th quartile, 1.20 (95% CI, 1.04-1.39); OR of the 1.0-4.9th percentile over the 50-100th percentile, 1.56 (95% CI, 1.25-1.93); and OR of the vitamin D deficiency group over the vitamin D sufficiency group, 1.28 (95% CI, 1.08-1.52).
We demonstrated that there was an inverse relationship between serum 25(OH)D less than 8.0 ng/mL and UACR in normoalbuminuric subjects, suggesting that severe vitamin D deficiency could cause an increase in UAE in subjects with normoalbuminuria.
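The threshold detection above relied on a generalized additive model. As a much simpler illustration of the underlying idea (finding a breakpoint in an exposure-outcome relation), one can grid-search the split point that minimizes within-segment squared error; this is a toy sketch, not the GAM procedure used in the study, and the data are invented.

```python
def best_threshold(x, y, candidates):
    """Grid-search a breakpoint for a two-segment mean model:
    pick the candidate threshold minimizing total squared error
    of fitting a separate mean below and above it."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best, best_err = None, float("inf")
    for t in candidates:
        below = [yi for xi, yi in zip(x, y) if xi < t]
        above = [yi for xi, yi in zip(x, y) if xi >= t]
        err = sse(below) + sse(above)
        if err < best_err:
            best, best_err = t, err
    return best

# Toy data: outcome drops sharply once exposure exceeds ~5.
x = [1, 2, 3, 10, 11, 12]
y = [5, 5, 5, 1, 1, 1]
print(best_threshold(x, y, candidates=[2, 5, 11]))  # 5
```

A real GAM instead fits smooth splines and locates the inflection from the fitted curve, but the notion of a data-driven breakpoint is the same.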
ABSTRACT: Background
Proteinuria and hematuria are both important health issues; however, the nature of the association between these findings and acute kidney injury (AKI) or mortality remains unresolved in critically ill patients.
Proteinuria and hematuria were measured by a dipstick test and scored on a scale ranging from negative to 3+ in 1,883 patients admitted to the intensive care unit. AKI was defined according to the Kidney Disease: Improving Global Outcomes (KDIGO) guidelines. The odds ratios (ORs) for AKI and 3-year mortality were calculated after adjustment for multiple covariates according to the degree of proteinuria or hematuria. To evaluate the synergistic effect on mortality among proteinuria, hematuria, and AKI, the relative excess risk due to interaction (RERI) was used.
Proteinuria and hematuria increased the ORs for AKI: the ORs of proteinuria were 1.66 (+/−), 1.86 (1+), 2.18 (2+), and 4.74 (3+) compared with non-proteinuria; the ORs of hematuria were 1.31 (+/−), 1.58 (1+), 2.63 (2+), and 2.52 (3+) compared with non-hematuria. The correlations between the mortality risk and proteinuria or hematuria were all significant and graded (Ptrend < 0.001). There was a relative excess risk of mortality when both AKI and proteinuria or hematuria were considered together: the synergy indexes were 1.30 and 1.23 for proteinuria and hematuria, respectively.
Proteinuria and hematuria are associated with the risks of AKI and mortality in critically ill patients. Additionally, these findings had a synergistic effect with AKI on mortality.
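The additive-interaction measures used above have standard closed forms: RERI = OR11 − OR10 − OR01 + 1, and the synergy index S = (OR11 − 1) / ((OR10 − 1) + (OR01 − 1)), where OR11 is the odds ratio with both exposures present and OR10/OR01 with only one. A sketch with illustrative odds ratios, not the study's estimates:

```python
def reri(or11: float, or10: float, or01: float) -> float:
    # Relative excess risk due to interaction:
    # RERI > 0 suggests synergy on the additive scale.
    return or11 - or10 - or01 + 1.0

def synergy_index(or11: float, or10: float, or01: float) -> float:
    # Synergy index: S > 1 suggests synergy on the additive scale.
    return (or11 - 1.0) / ((or10 - 1.0) + (or01 - 1.0))

# Illustrative odds ratios only.
print(reri(4.0, 2.0, 1.5), synergy_index(4.0, 2.0, 1.5))  # 1.5 2.0
```

The synergy indexes of 1.30 and 1.23 reported above are interpreted on exactly this scale: values above 1 indicate that the joint effect of AKI with proteinuria (or hematuria) exceeds the sum of their separate effects.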
ABSTRACT: Low or high counts of white blood cells (WBCs) and WBC subtypes can be a predictor of morbidity and mortality in several clinical settings. However, the correlations of WBC and its subtypes with acute kidney injury (AKI) and mortality remain unresolved in critically ill patients. The counts of WBC and subtypes, such as neutrophil, lymphocyte, monocyte, and eosinophil, were measured in 2,079 patients admitted to the intensive care unit (ICU) from June 2004 through June 2010. The non-linear relationship between WBC counts and AKI risk was initially explored by a restricted cubic spline analysis. The odds ratios (ORs) for AKI and 1-year mortality were calculated after adjustment for multiple covariates. The relationship between WBC counts and AKI risk was U-shaped. Accordingly, we divided patients into quintiles according to the counts of WBC or subtypes. The 1st and 5th quintiles of WBC counts had greater ORs for AKI (1.42 and 2.05, respectively) and mortality (1.40 and 1.36, respectively) compared with the 3rd quintile. After stratification by WBC subtype, the 5th quintile of neutrophil counts and the 1st quintiles of lymphocyte and monocyte counts tended to have higher ORs for AKI (1.69, 1.40, and 1.77, respectively). For mortality, the 1st quintiles of neutrophil, lymphocyte, and eosinophil counts were associated with higher mortality compared with the 3rd quintile (the ORs were 1.48, 1.57, and 1.42, respectively). Both leukopenia and leukocytosis are associated with AKI and mortality risk in critically ill patients. This result may be attributable to the change in the subtype counts.
Article · Mar 2014 · The Tohoku Journal of Experimental Medicine
ABSTRACT: The relationship between body fat mass and vitamin D appears to vary by ethnicity, but our understanding of this predisposition in Asians is limited due to the scarcity of prior investigations. Data on 1,697 Korean adults were obtained from the second and third years (2008-2009) of the fourth Korean National Health and Nutritional Examination Survey. Body fat mass was measured using dual-energy X-ray absorptiometry. Both linear regression analysis for serum 25-hydroxyvitamin D [25(OH)D] and logistic analysis for vitamin D deficiency [25(OH)D <20 ng/mL] were performed to determine significant predictors among BMI, waist circumference (WC), and body fat percentage (BF), after adjustment for multiple covariates. To explore a possible non-linear relationship between them, the fractional polynomials method was used. All analyses were conducted following stratification by sex. In linear regression analysis, BMI and WC were not associated with 25(OH)D. However, BF was inversely related to 25(OH)D, irrespective of the fat location (both appendicular and truncal fat) in both sexes. In logistic regression analysis, the highest quartile group of BF had a greater OR for vitamin D deficiency than the lower quartile groups, irrespective of the fat location and sex. However, the quartiles of BMI and WC were not associated with vitamin D deficiency. The linear relationships between BF and 25(OH)D (or vitamin D deficiency) were confirmed even when the fractional polynomials method was used. Body fat mass is inversely associated with serum 25(OH)D in Korean adults. Monitoring of vitamin D deficiency in Korean adults with high fat mass is needed.
Article · Feb 2014 · Asia Pacific Journal of Clinical Nutrition
ABSTRACT: Background
Periodontitis and chronic kidney disease (CKD) are important health issues; however, the association between periodontitis and CKD markers, especially in Korean adults, remains elusive.
Data on 15,729 Korean adults were obtained from the Korean National Health and Nutritional Examination Surveys IV and V. The CKD markers included a decreased estimated glomerular filtration rate (eGFR; <60 mL/min/1.73 m2), proteinuria, and hematuria. Odds ratios (ORs) and 95% confidence intervals were calculated using stepwise multivariate logistic regression analyses for CKD markers based on the presence of periodontitis.
Patients with periodontitis had greater unadjusted ORs for CKD markers compared to those without periodontitis, as follows: decreased eGFR, 4.07 (3.11–5.33); proteinuria, 2.12 (1.48–3.05); and hematuria, 1.25 (1.13–1.39) (all P<0.001). Periodontitis was a significant predictor of decreased eGFR independent of all covariates [1.39 (1.03–1.89), P=0.034]. However, the effect of periodontitis on decreased eGFR appeared to be affected by hypertension and diabetes mellitus. Periodontitis was not an independent predictor of proteinuria; the significance disappeared after adjusting for hypertension and diabetes mellitus. Periodontitis was significantly correlated with hematuria, with similar ORs regardless of the adjustment for covariates [1.29 (1.15–1.46), P<0.001].
This study confirms the correlation between periodontitis and CKD markers, including decreased eGFR, proteinuria, and hematuria in Korean adults.
ABSTRACT: Anemia and vitamin D deficiency are both important health issues; however, the nature of the association between vitamin D and either hemoglobin or anemia remains unresolved in the general population.
Data on 11,206 adults were obtained from the fifth Korean National Health and Nutritional Examination Survey. A generalized additive model was used to examine the threshold level for the relationship between serum 25-hydroxyvitamin D [25(OH)D] and hemoglobin levels. A multivariate logistic regression for anemia was conducted according to 25(OH)D quintiles. All analyses were stratified according to sex and menstrual status.
The generalized additive model confirmed a threshold 25(OH)D level of 26.4 ng/mL (males, 27.4 ng/mL; premenopausal females, 11.8 ng/mL; postmenopausal females, 13.4 ng/mL). The threshold level affected the pattern of association between 25(OH)D and anemia risk: the odds ratio of the 1st quintile, but not the 2nd, 3rd, and 4th quintiles, differed significantly from that of the 5th quintile in both premenopausal and postmenopausal females; however, there was no obvious trend in males.
This population-based study demonstrated a non-linear relationship with a threshold effect between serum 25(OH)D and hemoglobin levels in females. Further interventional studies are warranted to determine whether the appropriate level of hemoglobin can be achieved by the correction of vitamin D deficiency.
ABSTRACT: The addition of relevant parameters to acute kidney injury (AKI) criteria might allow better prediction of patient mortality than AKI criteria alone. Here, we evaluated whether inclusion of AKI duration could address this issue.
AKI was defined according to the Kidney Disease: Improving Global Outcomes (KDIGO) guidelines in 2,143 critically ill patients, within 15 days of patient admission. AKI cases were categorized according to tertiles of AKI duration: 1st tertile, 1-2 days; 2nd tertile, 3-5 days; and 3rd tertile, ≥6 days. The hazard ratios (HRs) for overall survival in the three groups were calculated after adjustment for multiple covariates, with ICU patients without AKI as the reference group. The predictive ability for mortality was assessed by calculating the area under the curve (AUC) of the receiver operating characteristic (ROC) curve.
AKI increased the HRs for overall mortality, and the mortality rate increased with AKI duration: the adjusted HRs were 1.99 (1st tertile), 2.67 (2nd tertile), and 2.85 (3rd tertile) compared with the non-AKI group (all P values < 0.001). The AUC of the ROC curve for overall mortality based on the AKI duration groups (0.716) was higher than the AUC of AKI staging using the KDIGO guidelines (0.696) (P = 0.001). When the KDIGO stage and AKI duration were considered together, the AUC (0.717) was also significantly higher than that of the KDIGO stage alone (P < 0.001).
AKI duration is an additional parameter for the prediction of mortality in critically ill patients. The inclusion of AKI duration could be considered as a refinement of the AKI criteria.
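The AUC comparisons above rest on the rank-based (Mann-Whitney) definition of the ROC AUC: the probability that a randomly chosen patient who died receives a higher risk score than a randomly chosen survivor, counting ties as half. A self-contained sketch with invented scores:

```python
def roc_auc(scores, labels):
    """ROC AUC via the Mann-Whitney U statistic: the probability
    that a random positive case outranks a random negative case
    (ties counted as 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Illustrative risk scores (labels: 1 = died, 0 = survived).
print(roc_auc([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0]))  # 1.0
```

An AUC of 0.716 for the duration-based grouping versus 0.696 for KDIGO staging thus means the duration grouping ranks decedents above survivors slightly more often.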
ABSTRACT: Introduction:
Toxic heavy metals have adverse effects on human health. However, the risk of hematuria caused by heavy metal exposure has not been evaluated.
Data from 4,701 Korean adults were obtained from the Korean National Health and Nutritional Examination Survey (2008-2010). Blood levels of the toxic heavy metals cadmium, lead, and mercury were measured. Hematuria was defined as a result of ≥1+ on a urine dipstick test. The odds ratios (ORs) for hematuria were calculated according to the blood heavy metal levels after adjusting for multiple variables.
Individuals with blood cadmium levels in the 3rd and 4th quartiles had a greater OR for hematuria than those in the 1st quartile group: 3rd quartile, 1.35 (1.019-1.777; P=0.037); 4th quartile, 1.52 (1.140-2.017; P=0.004). When blood cadmium was considered as a log-transformed continuous variable, the correlation between blood cadmium and hematuria was significant: OR, 1.97 (1.224-3.160; Ptrend=0.005). In contrast, no significant correlations between hematuria and blood lead or mercury were found in the multivariate analyses.
The present study shows that high cadmium exposure is associated with a risk of hematuria.
Article · May 2013 · Environmental Research
ABSTRACT: Background
The effects of air pollution on the respiratory and cardiovascular systems, and the resulting impacts on public health, have been widely studied. However, little is known about the effect of air pollution on the occurrence of hemorrhagic fever with renal syndrome (HFRS), a rodent-borne infectious disease. In this study, we evaluated the correlation between air pollution and HFRS incidence from 2001 to 2010, and estimated the significance of the correlation under the effect of climate variables.
We obtained data regarding HFRS, particulate matter smaller than 10 μm (PM10) as an index of air pollution, and climate variables including temperature, humidity, and precipitation from the national database of South Korea. Poisson regression models were established to predict the number of HFRS cases using air pollution and climate variables with different time lags. We then compared the ability of the climate model and the combined climate and air pollution model to predict the occurrence of HFRS.
The correlations between PM10 and HFRS were significant in univariate analyses, although the direction of the correlations changed according to the time lags. In multivariate analyses adjusted for climate variables, the effects of PM10 differed across time lags. However, PM10 without a time lag was selected in the final model for predicting HFRS cases. The model that combined climate and PM10 data predicted HFRS cases better than the model that used only climate data, for both the study period and the year 2011.
This is the first report to document an association between HFRS and PM10 level.
ABSTRACT: The purpose of our study was to evaluate the dietary intake of kidney transplant recipients (KTRs) and assess oral intake-related nutrition problems. Fifty patients who had undergone kidney transplantation were included: 24 males and 26 females. The mean age was 46.8 ± 11.2 years, height was 161.3 ± 8.3 cm, and body weight was 60.5 ± 8.7 kg. We conducted nutrition education based on the diet guideline for KTRs (energy 32 kcal/kg of ideal body weight [IBW], protein 1.3 g/kg of IBW) and the neutropenic diet guideline before discharge. Dietary intake of the patients at 1 month after transplantation was investigated using 3-day food records. Body weight and laboratory values for nutritional status and graft function were also collected. Body weight decreased significantly from admission to discharge. Body weight increased from discharge to 1 month and 3 months after transplantation, but not significantly. Biochemical measurements generally improved, but the number of patients with hypophosphatemia increased. The daily dietary intake of energy and protein was adequate (33.1 kcal/kg and 1.5 g/kg, respectively). However, the dietary intakes of calcium, folate, and vitamin C did not meet the Korean Recommended Nutrient Intake of vitamins and minerals (86.8%, 62.4%, and 88.0%, respectively). Patients with low intakes of calcium, folate, and vitamin C had low intakes of milk and dairy products, vegetables, and fruits, and these foods were related to restricted food items in the neutropenic diet. More attention should be paid to improving the quality of the diet, and reconsideration of the present neutropenic diet guideline is necessary. These results can be used to establish an evidence-based medical nutrition therapy guideline for KTRs.
ABSTRACT: Kidney transplantation and the accompanying medical conditions may result in changes in body composition. Such changes have been evaluated in Caucasian recipients, but not in Asian recipients. Herein, we conducted a study on Asian recipients because Asians have a different body composition from Caucasians. A total of 50 Asian recipients were enrolled as a prospective cohort. Using bioelectrical impedance analysis, body composition (muscle and fat mass) was assessed at 2 weeks (baseline) and at 1, 3, 6, 9, and 12 months following kidney transplantation. To identify predictors of these changes, the data were analyzed by multivariate analysis using forward selection. All of the patients had good graft function during the study period. Patients gained approximately 3 kg within 1 year of kidney transplantation. The proportion of muscle mass significantly decreased (Ptrend = 0.001) and the proportion of fat mass significantly increased over time (Ptrend = 0.002). The multivariate results revealed that male sex, deceased donor type, and low protein intake were associated with an increase in fat mass and a decrease in muscle mass. The results of this study may help to investigate differences in body composition changes between races, as well as the factors related to these changes.
Article · Oct 2012 · Journal of Korean Medical Science