Youfu Li

University of Worcester, Worcester, England, United Kingdom

Publications (21) · 86.51 Total impact

  • ABSTRACT: BACKGROUND: Necrotizing soft-tissue infections (NSTI) are rare, potentially fatal, operative emergencies. We studied a national cohort of patients to determine recent trends in incidence, treatment, and outcomes for NSTI. METHODS: We queried the Nationwide Inpatient Sample (1998-2010) for patients with a primary diagnosis of NSTI. Temporal trends in patient characteristics, treatment (debridement, amputation, hyperbaric oxygen therapy [HBOT]), and outcomes were determined with Cochran-Armitage trend tests and linear regression. To account for trends in case mix (age, sex, race, insurance, Elixhauser index) and receipt of HBOT, multivariable analyses were conducted to determine the independent effect of year of treatment on mortality, any major complication, and hospital length of stay (LOS) for NSTI. RESULTS: We identified 56,527 weighted NSTI admissions, with an incidence ranging from approximately 3,800 to 5,800 cases annually. The number of cases peaked in 2004 and then decreased through 2010 (P < .0001). The percentage of female patients decreased slightly over time (38.6-34.1%, P < .0001). Patients were increasingly in the 18- to 34-year-old (8.8-14.6%, P < .0001) and 50- to 64-year-old age groups (33.2-43.5%, P < .0001), Hispanic (6.8-10.5%, P < .0001), obese (8.9-24.6%, P < .0001), and admitted with >3 comorbidities (14.5-39.7%, P < .0001). The percentage of patients requiring only one operative debridement increased somewhat (43.2-46.2%, P < .0001), whereas the use of HBOT was rare and decreasing (1.6-0.8%, P < .0001). The percentage of patients requiring operative wound closure decreased somewhat (23.5-20.8%, P < .0001). Although major complication rates increased (30.9-48.2%, P < .0001), hospital LOS remained stable (18-19 days) and mortality decreased (9.0-4.9%, P < .0001) on univariate analyses. On multivariable analyses, each 1-year increase in treatment year was associated with 5% greater odds of complication (odds ratio 1.05), a 0.4-day decrease in hospital LOS (coefficient -0.41), and 11% lower odds of mortality (odds ratio 0.89). CONCLUSION: There were potentially important national trends in patient characteristics and treatment patterns for NSTI between 1998 and 2010. Importantly, although patient acuity worsened and complication rates increased, LOS remained relatively stable and mortality decreased. Improvements in early diagnosis, wound care, and critical care delivery may explain these trends.
    Surgery 02/2013 · 3.37 Impact Factor
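The temporal trends above rest on the Cochran-Armitage test for a linear trend in proportions across ordered years. As a rough illustration only, here is a minimal, self-contained sketch of that test; the yearly counts are synthetic stand-ins (a flat 4,500 admissions per year and the abstract's 38.6% to 34.1% decline in female patients), not the study's data.

```python
# Cochran-Armitage test for a linear trend in proportions across ordered
# groups (here, admission years). Implemented from the standard formula;
# all counts below are synthetic illustrations, not the study's data.
import numpy as np
from scipy.stats import norm

def cochran_armitage(events, totals, scores=None):
    """Two-sided Cochran-Armitage trend test for a 2 x k table."""
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    s = np.arange(len(totals)) if scores is None else np.asarray(scores, float)
    p_bar = events.sum() / totals.sum()              # pooled proportion
    t = np.sum(s * (events - totals * p_bar))        # trend statistic
    var = p_bar * (1 - p_bar) * (
        np.sum(totals * s ** 2) - np.sum(totals * s) ** 2 / totals.sum()
    )
    z = t / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))

# Illustrative question: did the share of female NSTI patients decline?
years = np.arange(1998, 2011)
totals = np.full(len(years), 4500)             # ~4,500 admissions/year (assumed)
share = np.linspace(0.386, 0.341, len(years))  # 38.6% -> 34.1%, per the abstract
events = np.round(share * totals)
z, p = cochran_armitage(events, totals, scores=years - years[0])
print(f"z = {z:.2f}, two-sided p = {p:.2g}")
```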
  • ABSTRACT: OBJECTIVE: Scoring systems for predicting mortality after repair of ruptured abdominal aortic aneurysms (RAAAs) have not been developed or tested in a United States population and may not be accurate in the endovascular era. Using prospectively collected data from the Vascular Study Group of New England (VSGNE), we developed a practical risk score for in-hospital mortality after open repair of RAAAs and compared its performance to that of the Glasgow aneurysm score, Hardman index, Vancouver score, and Edinburgh ruptured aneurysm score. METHODS: Univariate analysis followed by multivariable analysis of patient, prehospital, anatomic, and procedural characteristics identified significant predictors of in-hospital mortality. Integer points were derived from the odds ratio (OR) for mortality based on each independent predictor in order to generate a VSGNE RAAA risk score, which was internally validated using bootstrapping methodology. Discrimination and calibration of all models were assessed by calculating the area under the receiver-operating characteristic curve (C-statistic) and applying the Hosmer-Lemeshow test. RESULTS: From 2003 to 2009, 242 patients underwent open repair of RAAAs at 10 centers. In-hospital mortality was 38% (n = 91). Independent predictors of mortality included age >76 years (OR, 5.3; 95% confidence interval [CI], 2.8-10.1), preoperative cardiac arrest (OR, 4.3; 95% CI, 1.6-12), loss of consciousness (OR, 2.6; 95% CI, 1.2-6), and suprarenal aortic clamp (OR, 2.4; 95% CI, 1.3-4.6). Patient stratification according to the VSGNE RAAA risk score (range, 0-6) accurately predicted mortality and identified those at low and high risk for death (8%, 25%, 37%, 60%, 80%, and 87% for scores of 0, 1, 2, 3, 4, and ≥5, respectively). Discrimination (C = .79) and calibration (χ² = 1.96; P = .85) were excellent in the derivation and bootstrap samples and superior to those of existing scoring systems. The Glasgow aneurysm score, Hardman index, Vancouver score, and Edinburgh ruptured aneurysm score correlated with mortality in the VSGNE cohort but failed to accurately identify patients with a risk of mortality >65%. CONCLUSIONS: Existing scoring systems predict mortality after RAAA repair in this cohort but do not identify patients at highest risk. This parsimonious VSGNE RAAA risk score, based on four variables readily assessed at the time of presentation, allows accurate prediction of in-hospital mortality after open repair of RAAAs, including identification of those patients at highest risk for postoperative mortality.
    Journal of vascular surgery: official publication, the Society for Vascular Surgery [and] International Society for Cardiovascular Surgery, North American Chapter 11/2012 · 3.52 Impact Factor
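The abstract reports the score's four predictors, their odds ratios, and observed mortality by total score (range 0-6), but not the integer weights themselves. The sketch below shows how such a bedside score is applied; the point values are an assumption (roughly proportional to ln(OR) and chosen so the maximum is the stated 6), not the published weights.

```python
# Sketch of applying an integer risk score built from multivariable ORs.
# WARNING: the point weights below are an assumption (roughly proportional
# to ln(OR), chosen so the maximum score is the stated 6); they are NOT
# the published VSGNE weights. Mortality-by-score values are quoted from
# the abstract.
POINTS = {
    "age_gt_76": 2,              # OR 5.3 (assumed weight)
    "preop_cardiac_arrest": 2,   # OR 4.3 (assumed weight)
    "loss_of_consciousness": 1,  # OR 2.6 (assumed weight)
    "suprarenal_clamp": 1,       # OR 2.4 (assumed weight)
}
MORTALITY_BY_SCORE = {0: 0.08, 1: 0.25, 2: 0.37, 3: 0.60, 4: 0.80, 5: 0.87}

def raaa_score(patient: dict) -> int:
    """Sum points for each risk factor recorded as present."""
    return sum(p for factor, p in POINTS.items() if patient.get(factor))

def observed_mortality(score: int) -> float:
    """Scores >= 5 share the top stratum (87% in-hospital mortality)."""
    return MORTALITY_BY_SCORE[min(score, 5)]

pt = {"age_gt_76": True, "suprarenal_clamp": True}
s = raaa_score(pt)
print(f"score {s}: observed in-hospital mortality {observed_mortality(s):.0%}")
```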
  • ABSTRACT: BACKGROUND: Organ shortage is the greatest challenge facing the field of organ transplantation today. Use of more organs of marginal quality has been advocated to address the shortage. METHOD: We examined the pattern of donation and organ use in the United States as shown in the Organ Procurement and Transplantation Network/United Network for Organ Sharing database of individuals who were consented for and progressed to organ donation between January 2001 and December 2010. RESULTS: There were 66,421 living donors and 73,359 deceased donors, including 67,583 (92.1%) identified as donation after brain death and 5,776 (7.9%) as donation after circulatory death (DCD). Comparing two periods, era 1 (01/2001-12/2005) and era 2 (01/2006-12/2010), the number of deceased donors increased by 20.3% from 33,300 to 40,059, while there was a trend toward decreasing living donation. The DCD subgroup increased from 4.9% to 11.7% between the two eras. A significant increase in cardiovascular/cerebrovascular disease as a cause of death was also noted, from 38.1% in era 1 to 56.1% in era 2 (p < 0.001), as was a corresponding decrease in the number of deaths due to head trauma (48.8% vs. 34.9%). The overall discard rate also increased, from 13,411 (11.5%) in era 1 to 19,516 (13.7%) in era 2. This increase in discards was especially prominent in the DCD group [440 (20.9%) in era 1 vs. 2,089 (24.9%) in era 2]. CONCLUSIONS: We detected a significant change in the pattern of organ donation and use in the United States over the last decade. The transplant community should take every precaution to prevent the decline of organ quality and to improve the use of marginal organs.
    World Journal of Surgery 08/2012 · 2.23 Impact Factor
  • ABSTRACT: Living donor liver transplantation (LDLT) is an accepted treatment for patients with end-stage liver disease. To minimize risk to the donor, left lobe (LL) LDLT may be an ideal option in adult LDLT. This study assessed the outcomes of LL-LDLT compared with right lobe (RL) LDLT in adults (1998-2010) as reported to the United Network for Organ Sharing (UNOS) Organ Procurement and Transplantation Network (OPTN). A total of 2844 recipients of LDLT were identified. Of these, 2690 (94.6%) underwent RL-LDLT and 154 (5.4%) underwent LL-LDLT. A recent increase in the number of LL-LDLTs was noted: average numbers of LL-LDLTs per year were 5.2 during 1998-2003 and 19.4 during 2004-2010. Compared with RL-LDLT recipients, LL-LDLT recipients were younger (mean age: 47.0 years vs. 50.5 years), had a lower body mass index (BMI) (mean BMI: 24.5 kg/m² vs. 26.8 kg/m²), and were more likely to be female (64.6% vs. 41.9%). Donors in LL-LDLT had a higher BMI (mean BMI: 29.4 kg/m² vs. 26.5 kg/m²) and were less likely to be female (30.9% vs. 48.1%). Recipients of LL-LDLT had a longer mean length of stay (24.9 days vs. 18.2 days) and higher retransplantation rates (20.3% vs. 10.9%). Allograft survival in LL-LDLT was significantly lower than in RL-LDLT, and there was a trend towards inferior patient survival. In Cox regression analysis, LL-LDLT was found to be associated with an increased risk for allograft failure [hazard ratio (HR) 2.39] and inferior patient survival (HR 1.86). The number of LL-LDLTs has increased in recent years.
    HPB 07/2012; 14(7):455-60. · 1.94 Impact Factor
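Several studies in this list report Cox proportional-hazards ratios like the HR of 2.39 above. As an illustration of how such an estimate is produced, here is a hedged sketch using the lifelines package on synthetic data: the cohort size, the LL-LDLT share, and the target hazard ratio are taken from the abstract, but the survival times, covariates, and censoring scheme are invented.

```python
# Hedged sketch of a Cox proportional-hazards fit with lifelines. Cohort
# size (2844), LL share (5.4%), and the target HR (~2.4 for allograft
# failure) come from the abstract; everything else is simulated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2844
left_lobe = rng.random(n) < 0.054            # 5.4% LL-LDLT
age = rng.normal(50, 10, n)                  # placeholder covariate

# Simulate graft survival with hazard 2.4x higher for LL grafts.
time = rng.exponential(10.0, n) / np.where(left_lobe, 2.4, 1.0)
event = (time < 12.0).astype(int)            # administrative censoring at 12 y
time = np.minimum(time, 12.0)

df = pd.DataFrame({"time": time, "event": event,
                   "left_lobe": left_lobe.astype(int), "age": age})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])   # exp(coef) is the HR
```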
  • ABSTRACT: To date, history of a contralateral amputation as a potential predictor of outcomes after lower extremity bypass (LEB) for critical limb ischemia (CLI) has not been studied. We sought to determine if a prior contralateral lower extremity amputation predicts worse outcomes in patients undergoing LEB in the remaining intact limb. A retrospective analysis of all patients undergoing infrainguinal LEB for CLI between 2003 and 2010 within hospitals comprising the Vascular Study Group of New England was performed. Patients were stratified according to whether or not they had previously undergone a contralateral major or minor amputation before LEB. Primary end points included major amputation and graft occlusion at 1 year postoperatively. Secondary end points included in-hospital major adverse events, discharge status, and mortality at 1 year. Of 2636 LEB procedures, 228 (8.6%) were performed in the setting of a prior contralateral amputation. Patients with a prior amputation, compared to those without, were younger (66.5 vs 68.7 years; P = .034) and more likely to have congestive heart failure (CHF; 25% vs 16%; P = .002), hypertension (94% vs 85%; P = .015), renal insufficiency (26% vs 14%; P = .0002), and hemodialysis-dependent renal failure (14% vs 6%; P = .0002). They were also more likely to be nursing home residents (8.0% vs 3.6%; P = .036), less likely to ambulate without assistance (41% vs 80%; P < .0002), and more likely to have had a prior ipsilateral bypass (20% vs 12%; P = .0005). These patients experienced increased in-hospital major adverse events, including myocardial infarction (MI; 8.9% vs 4.2%; P = .002), CHF (6.1% vs 3.4%; P = .044), deterioration in renal function (9.0% vs 4.7%; P = .006), and respiratory complications (4.2% vs 2.3%; P = .034). They were less likely to be discharged home (52% vs 72%; P < .0001) and less likely to be ambulatory on discharge (25% vs 55%; P < .0001). Although patients with a prior contralateral amputation experienced increased rates of graft occlusion (38% vs 17%; P < .0001) and major amputation (16% vs 7%; P < .0001) at 1 year, there was not a significant difference in mortality (16% vs 10%; P = .160). On multivariable analysis, prior contralateral amputation was an independent predictor of both major amputation (odds ratio, 1.73; confidence interval, 1.06-2.83; P = .027) and graft occlusion (odds ratio, 1.93; confidence interval, 1.39-2.68; P < .0001) at 1 year. Patients with prior contralateral amputations who present with CLI in the intact limb represent a high-risk population, even among patients with advanced peripheral arterial disease. When considering LEB in this setting, both physicians and patients should expect increased rates of perioperative adverse events, increased rates of 1-year graft occlusion, and decreased rates of limb salvage when compared with patients who have not undergone a contralateral amputation.
    Journal of vascular surgery: official publication, the Society for Vascular Surgery [and] International Society for Cardiovascular Surgery, North American Chapter 04/2012; 56(2):353-60. · 3.52 Impact Factor
  • ABSTRACT: Organ shortage has resulted in greater emphasis on partial liver transplantation (PLT) as an alternative to whole-organ liver transplantation. This study was conducted to assess outcomes in PLT and to compare outcomes of deceased donor split-liver transplantation (DD-SLT) and live donor liver transplantation (LDLT) in adults transplanted in the USA using data reported to the United Network for Organ Sharing in the era of Model for End-stage Liver Disease (MELD) scores. Between 2002 and 2009, 2272 PLTs were performed in the USA; these represented 5.3% of all liver transplants carried out in the country and included 557 (24.5%) DD-SLT and 1715 (75.5%) LDLT procedures. The most significant differences between the DD-SLT and LDLT groups related to mean MELD scores, which were lower in LDLT recipients (14.5 vs. 20.9; P < 0.001), mean recipient age, which was lower in the LDLT group (50.7 years vs. 52.8 years; P < 0.001), and mean donor age, which was lower in the DD-SLT group (23.0 years vs. 37.3 years; P < 0.001). Allograft survival was comparable between the two groups (P = 0.438), but patient survival after LDLT was better (P = 0.04). In Cox regression analysis, LDLT was associated with better allograft (hazard ratio [HR] = 0.7, 95% confidence interval [CI] 0.630-0.791; P < 0.0001) and patient (HR = 0.6, 95% CI 0.558-0.644; P < 0.0001) survival than DD-SLT. Partial liver transplantation represents a potentially underutilized resource in the USA. Despite the differences in donor and recipient characteristics, LDLT is associated with better allograft and patient survival than DD-SLT. A different allocation system for DD-SLT allografts that takes into consideration cold ischaemia time and recipient MELD score should be considered.
    HPB 11/2011; 13(11):797-801. · 1.94 Impact Factor
  • ABSTRACT: All open and laparoscopic colectomies submitted to the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) were evaluated for trends and improvements in operative outcomes. A total of 48,247 adults (≥18 years old) underwent colectomy in ACS NSQIP; cases were grouped by surgical approach (laparoscopic versus open), urgency (emergent versus elective), and operative year (2005 to 2008). Primary outcomes were morbidity, mortality, and perioperative and postoperative complications. The proportion of laparoscopic colectomies performed increased annually (26.3% to 34.0%), while open colectomies decreased (73.7% to 66.0%; P < 0.0001). Most emergent colectomies were open procedures (93.5%), representing 24.3% of all open cases. The overall risk-adjusted morbidity and mortality for all colectomy procedures did not show a statistically significant change over time; however, morbidity and mortality increased among open colectomies (r = 0.03) and decreased among laparoscopic colectomies (r = -0.04; P < 0.0001). Several postoperative complications decreased significantly, including superficial surgical site infections (9.17% to 8.20%, P < 0.004), pneumonia (4.60% to 3.97%, P < 0.0001), and sepsis (4.72%, 2005; 6.81%, 2006; 5.62%, 2007; 5.09%, 2008; P < 0.0002). Perioperative improvements included operative time (169.2 to 160.0 min), PRBC transfusions (0.27 to 0.25 units), and length of stay (10.5 to 6.61 d; P < 0.0001). It appears that laparoscopic colectomies are growing in popularity over open colectomies, but the need for emergent open procedures remains unchanged. Across all colectomies, however, key postoperative and perioperative complications have improved over time. Participation in ACS NSQIP demonstrates quality improvement and may encourage greater enrollment.
    Journal of Surgical Research 07/2011; 171(1):e9-13. · 2.02 Impact Factor
  • ABSTRACT: Growth in the utilization of high-risk allografts reflects a critical national organ shortage and increasing waiting list mortality. Using risk-adjusted models, the aim of the present study was to determine whether a volume-outcome relationship existed among liver transplants at high risk for allograft failure. From 2002 to 2008, the Scientific Registry of Transplant Recipients (SRTR) database for all adult deceased donor liver transplants (n = 31,587) was queried. Transplant centres (n = 102) were categorized by volume into tertiles: low (LVC: 31 cases/year), medium (MVC: 64 cases/year), and high (HVC: 102 cases/year). Donor risk comparison groups were stratified by quartiles of the Donor Risk Index (DRI) spectrum: low risk (DRI ≤ 1.63), moderate risk (1.64 ≤ DRI ≤ 1.90), high risk (1.91 ≤ DRI ≤ 2.26), and very high risk (DRI ≥ 2.27). HVC more frequently used higher-risk livers (median DRI: LVC: 1.82, MVC: 1.90, HVC: 1.97; P < 0.0001) and achieved better risk-adjusted allograft survival outcomes compared with LVC (HR: 0.90, 95% CI: 0.85-0.95). For the high and very high risk groups, transplantation at an HVC did contribute to improved graft survival [high risk: hazard ratio (HR): 0.85, 95% confidence interval (CI): 0.76-0.96; very high risk: HR: 0.88, 95% CI: 0.78-0.99]. While DRI remains an important aspect of allograft survival prediction models, liver transplantation at an HVC appears to result in improved allograft survival with high and very high risk DRI organs compared with LVC.
    HPB 07/2011; 13(7):447-53. · 1.94 Impact Factor
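The stratification step here, centers into volume tertiles and donors into DRI quartiles at the quoted cutpoints, is mechanical enough to show directly. Below is a minimal pandas sketch; the volumes and DRI values are synthetic, and only the quartile boundaries (1.63, 1.90, 2.26) come from the abstract.

```python
# Stratification sketch: centers into volume tertiles, donors into DRI
# risk groups. Only the quartile cutpoints (1.63, 1.90, 2.26) come from
# the abstract; volumes and DRI values here are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "center_volume": rng.integers(5, 150, 1000),  # transplants/year (synthetic)
    "dri": rng.normal(1.95, 0.40, 1000).clip(0.8),
})

# Equal-thirds split of centers by annual volume.
df["volume_tertile"] = pd.qcut(df["center_volume"], 3,
                               labels=["LVC", "MVC", "HVC"])

# DRI groups at the abstract's boundaries.
df["dri_group"] = pd.cut(df["dri"],
                         bins=[-np.inf, 1.63, 1.90, 2.26, np.inf],
                         labels=["low", "moderate", "high", "very high"])

# Do higher-volume centers accept higher-risk grafts? (Row-normalized.)
print(pd.crosstab(df["volume_tertile"], df["dri_group"], normalize="index"))
```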
  • ABSTRACT: Numerous reports have documented reduced graft and patient survival after use of hepatitis C (HCV) seropositive allografts in liver transplantation (OLT). We aimed to examine whether the use of an HCV+ liver allograft affects patient and graft survival compared to HCV- donor allografts in a case-controlled analysis of the United Network for Organ Sharing (UNOS) database. We examined 63,149 liver transplants (61,905 HCV- donors; 1,244 HCV+ donors) from the UNOS Standard Transplant Analysis and Research (STAR) file from 1987 to 2007. Donor and recipient demographics and outcomes were collected for cases in which donor HCV serology was complete. A case-controlled cohort (n = 540 in each group) comparing HCV- and HCV+ donor allografts was created from 11 donor and recipient variables using propensity scores with a matching algorithm. Graft and patient survival were estimated using Kaplan-Meier survival curves. Significant differences were evident in the unadjusted cohort between recipients who received HCV+ and HCV- allografts, including HCV+ recipients, donor and recipient age, and Model for End-stage Liver Disease (MELD) exception cases. Use of HCV+ allografts resulted in significantly lower graft survival (8.1 vs. 10.6 years; P = 0.001) and patient survival (10.2 vs. 12.3 years; P = 0.01) after OLT. In the matched cohort, HCV seropositivity had no detrimental effect on graft (P = 0.57) or patient (P = 0.78) survival after OLT. This is the first population-based analysis to show that, after adjusting for donor and recipient characteristics, there was no difference in graft or patient survival with the use of HCV+ donor liver allografts compared to HCV- donor liver allografts.
    World Journal of Surgery 03/2011; 35(7):1590-5. · 2.23 Impact Factor
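The matched cohort above is the product of propensity-score matching: model the probability of the "exposure" (an HCV+ donor) from baseline covariates, then pair each exposed case with the nearest unexposed control on that score. A hedged sketch with scikit-learn follows; the three covariates and all data are synthetic placeholders for the study's 11 donor and recipient variables, and greedy 1:1 nearest-neighbor matching (with replacement) stands in for the unspecified matching algorithm.

```python
# Hedged sketch of 1:1 propensity-score matching with scikit-learn. The
# covariates are invented placeholders for the study's 11 variables, and
# greedy nearest-neighbor matching (with replacement) on the logit of the
# score stands in for the unspecified matching algorithm.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 5000
X = pd.DataFrame({"donor_age": rng.normal(45, 15, n),
                  "recipient_age": rng.normal(52, 10, n),
                  "meld": rng.normal(20, 8, n)})
treated = rng.random(n) < 0.02                 # HCV+ donor allografts (rare)

# 1) Estimate P(HCV+ donor | covariates); match on the logit of the score.
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
logit = np.log(ps / (1 - ps))

# 2) Pair each treated case with its nearest control on the logit scale.
controls = np.flatnonzero(~treated)
nn = NearestNeighbors(n_neighbors=1).fit(logit[controls].reshape(-1, 1))
_, pos = nn.kneighbors(logit[treated].reshape(-1, 1))
matched_controls = controls[pos.ravel()]

# Covariate means should be far closer after matching than before.
print("HCV+ means:\n", X[treated].mean())
print("matched HCV- means:\n", X.iloc[matched_controls].mean())
```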
  • ABSTRACT: Close to 30,000 people die of cirrhosis in the USA each year. Previous studies have shown a survival advantage with high-volume (HV) hospitals for complex surgical procedures. We examined whether a volume benefit exists for hospitals dealing with specialized disorders like complications of cirrhosis. Using the Nationwide Inpatient Sample, we identified all cases of cirrhosis-related complications (n = 217,948) from 1998 to 2006. Hospitals were divided into tertiles based on annual admissions for cirrhosis. The primary outcome was in-hospital mortality, and secondary endpoints included length of stay (LOS) and hospital charges. The number of admissions for cirrhosis increased over time (p < 0.0001). HV centers were more likely to be large (86.8%) and teaching (81.5%) hospitals compared to lower volume centers. The average LOS and hospital charges were greater at the HV centers, but hospitalization at an HV center conferred an adjusted mortality benefit (HR 0.88; 95% CI 0.83-0.92) compared to care at lower volume hospitals. Despite increased LOS and hospital cost, a mortality benefit exists at HV centers. Future studies are necessary to determine which processes of care at HV centers account for this survival benefit.
    Journal of Gastrointestinal Surgery 02/2011; 15(2):330-5. · 2.36 Impact Factor
  • ABSTRACT: Recent United Network for Organ Sharing (UNOS) data suggest that live kidney donation is stagnant. Current practices and trends in laparoscopic donor nephrectomy (LDN) among the transplant community remain largely unknown. From the Nationwide Inpatient Sample (NIS) from 1998 to 2006, patients undergoing LDN (n = 9,437) were identified. Live kidney donation in the United States did not show an increase in the NIS. Of the live donor cases recorded, 58 (0.61%) were associated with a major short-term complication. The proportion of LDNs performed by transplant surgeons decreased over the study period, from 76.5% in 1998 to 30.4% in 2006. In the United States, LDNs are performed safely with a low short-term complication rate. Despite the use of laparoscopy and the increased need for donor organs, the rate of LDN in kidney transplantation has not increased proportionally.
    World Journal of Surgery 12/2010; 34(12):2985-90. · 2.23 Impact Factor
  • ABSTRACT: Alcohol consumption is a well-documented determinant of adverse perioperative outcome. We sought to determine the effect of active alcohol consumption on outcomes following elective surgery. We queried discharge records from the American College of Surgeons' National Surgical Quality Improvement Program (NSQIP, 2005-2007) for all elective adult admissions. The 7,631 (2.5%) patients with documented alcohol use (active alcohol use of at least two drinks per day within 2 weeks of surgery; ETOH use) underwent elective surgery; 301,994 (97.5%) patients denied ETOH use. Multivariate analysis was performed with adjustments for demographic and comorbid factors. Primary outcome measures included length of stay (LOS), postoperative complications, and death. The prevalence of ETOH use among elective surgery patients decreased over the course of the study (p < 0.0001). ETOH use was an independent predictor of pneumonia (OR 1.98, 95% CI 1.84-2.13), sepsis (OR 1.19, 95% CI 1.03-1.37), superficial surgical site infection (SSI; OR 1.15, 95% CI 1.02-1.31), wound disruption (OR 1.41, 95% CI 1.11-1.80), and prolonged LOS (OR 1.17, 95% CI 1.08-1.26). Except for SSI, these complications were independent risk factors for postoperative mortality. ETOH use was associated with earlier time to wound disruption (9 vs. 11 days; p = 0.04), longer median hospital stays (5 vs. 3 days; p < 0.0001), and longer LOS after operation (4 vs. 3 days; p < 0.0001). Active alcohol consumption is a significant determinant of adverse outcomes in elective surgery; patients with ETOH use who are scheduled to undergo elective surgery should be appropriately educated and counseled.
    Journal of Gastrointestinal Surgery 11/2010; 14(11):1732-41. · 2.36 Impact Factor
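Adjusted odds ratios like those above come from a multivariable logistic model in which the complication is regressed on the exposure plus confounders. A self-contained statsmodels sketch follows; the covariates are placeholders rather than the NSQIP variable set, and the data are simulated with a true ETOH odds ratio of about 2 so the fit can be checked against the abstract's pneumonia estimate (OR 1.98).

```python
# Sketch of the multivariable logistic model behind adjusted ORs such as
# "ETOH use -> pneumonia, OR 1.98". Covariates are placeholders, not the
# NSQIP variable set; data are simulated with a true ETOH OR of 2.0.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 20000
df = pd.DataFrame({"etoh": (rng.random(n) < 0.025).astype(int),
                   "age": rng.normal(55, 15, n),
                   "smoker": (rng.random(n) < 0.20).astype(int)})

# Simulate pneumonia with log-odds linear in the covariates.
lp = -4.0 + np.log(2.0) * df["etoh"] + 0.02 * (df["age"] - 55) + 0.4 * df["smoker"]
df["pneumonia"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-lp))).astype(int)

X = sm.add_constant(df[["etoh", "age", "smoker"]])
fit = sm.Logit(df["pneumonia"], X).fit(disp=0)
or_ci = np.exp(fit.conf_int())               # CI on the odds-ratio scale
print("adjusted OR (ETOH):", round(float(np.exp(fit.params["etoh"])), 2),
      "95% CI:", or_ci.loc["etoh"].round(2).tolist())
```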
  • ABSTRACT: The incidence of hepatocellular carcinoma (HCC) is increasing in the United States, and the care of these patients remains highly specialized and complex. Multiple treatment options are available for HCC, but their use and effectiveness remain unknown. Using Surveillance, Epidemiology, and End Results (SEER)-Medicare linked data, 8730 patients who were diagnosed with HCC between 1991 and 2005 were identified. Therapy included surgical resection (8.7%), liver transplantation (1.4%), ablation (3.6%), and transarterial chemoembolization (16%). Patients who received no or palliative-only treatment were grouped together (NoTx; 70.3%). Patient, disease, and tumor factors were examined as determinants of therapy. The incidence of HCC is increasing in the Medicare population. The median age at diagnosis was 75.1 years, and 73.6% of patients were coded as white, 17.2% as Asian, 8.3% as black, and 0.9% as other race. The rate of therapy increased over time, but only 29.7% of patients overall underwent therapy. Among patients with early stage HCC, only 43.1% underwent therapy. In the NoTx group, where data were complete, 49.4% did not have cirrhosis, 36.0% had tumors that measured <5 cm, and 39.8% were diagnosed with stage I or II disease. The use of therapy for all HCC patients increased over time, correlating with a commensurate increase in median survival. In multivariate regression analysis, patients who received any modality of treatment achieved a significant survival benefit compared with the NoTx group (odds ratio, 0.41; 95% confidence interval, 0.39-0.43). In the Medicare population, HCC patients who received therapy experienced a substantial survival advantage over their nonoperative peers (NoTx). Despite evidence that many patients had favorable biological characteristics, <30% of patients diagnosed with HCC received any treatment.
    Cancer 10/2010; 117(5):1019-26. · 5.20 Impact Factor
  • Journal of Vascular Surgery 06/2010; 51(6). · 2.88 Impact Factor
  • Gastroenterology 01/2010; 138(5). · 12.82 Impact Factor
  • ABSTRACT: There is controversy over the optimal management strategy for patients with acute pancreatitis (AP). Studies have shown a hospital volume benefit for in-hospital mortality after surgery, and we examined whether a similar mortality benefit exists for patients admitted with AP. Using the Nationwide Inpatient Sample, discharge records for all adult admissions with a primary diagnosis of AP (n = 416,489) from 1998 to 2006 were examined. Hospitals were categorized based on the number of patients with AP; the highest third were defined as high volume (HV, ≥118 cases/year) and the lower two-thirds as low volume (LV, <118 cases/year). A matched cohort based on propensity scores (n = 43,108 in each group) eliminated all demographic differences to create a case-controlled analysis. Adjusted mortality was the primary outcome measure. In-hospital mortality for patients with AP was 1.6%. Hospital admissions for AP increased over the study period (P < .0001). HV hospitals tended to be large (82%), urban (99%), academic centers (59%) that cared for patients with greater comorbidities (P < .001). Adjusted length of stay was lower at HV compared with LV hospitals (odds ratio, 0.86; 95% confidence interval, 0.82-0.90). After adjusting for patient and hospital factors, the mortality rate was significantly lower for patients treated at HV hospitals (hazard ratio, 0.74; 95% confidence interval, 0.67-0.83). The rates of admission for AP in the United States are increasing. At hospitals that admit the most patients with AP, patients had a shorter length of stay, lower hospital charges, and lower mortality rates than controls in this matched analysis.
    Gastroenterology 10/2009; 137(6):1995-2001. · 12.82 Impact Factor
  • ABSTRACT: Laparoscopic (LAP) surgery has experienced significant growth since the early 1990s and is now considered the standard of care for many procedures, such as cholecystectomy. Increased expertise, training, and technological advancements have allowed the development of more complex LAP procedures, including the removal of solid organs. Unlike LAP cholecystectomy, it is unclear whether complex LAP procedures are being performed with the same growth today. Using the Nationwide Inpatient Sample (NIS) from 1998 to 2006, patients who underwent elective LAP or open colectomy (n = 220,839), gastrectomy (n = 17,289), splenectomy (n = 9,174), nephrectomy (n = 64,171), or adrenalectomy (n = 5,556) were identified. The Elixhauser index was used to adjust for patient comorbidities. To account for patient selection and referral bias, a matched analysis was performed using propensity scores. The main endpoints were adjusted in-hospital mortality and prolonged length of stay (LOS). Complex LAP procedures account for a small percentage of total elective procedures (colectomy, 3.8%; splenectomy, 8.8%; gastrectomy, 2.4%; nephrectomy, 7.0%; and adrenalectomy, 14.2%). These procedures have been performed primarily at urban (94%) and teaching (64%) centers. Although all LAP procedures trended upward, the growth was greatest in LAP colectomy and nephrectomy (P < .001). In a case-controlled analysis, there was a mortality benefit only for LAP colectomy (hazard ratio [HR] = 0.53; 95% confidence interval [CI] = 0.34-0.82) when compared with the respective open procedure. All LAP procedures except gastrectomy had a lower rate of prolonged LOS compared with their open counterparts. Despite the significant benefits of complex LAP procedures as measured by LOS and in-hospital mortality, the growth of these operations has been slow, unlike the rapid acceptance of LAP cholecystectomy. Future studies to identify the possible causes of this slow growth should consider current training paradigms, technical capabilities, economic disincentives, and surgical specialization.
    Surgery 09/2009; 146(2):367-74. · 3.37 Impact Factor
  • ABSTRACT: The PREVENT III (PIII) critical limb ischemia (CLI) risk score is a simple, published tool derived from the PIII randomized clinical trial that can be used for estimating amputation-free survival (AFS) in CLI patients considered for infrainguinal bypass (IB). The current study sought to validate this risk stratification model using prospectively collected data from the Vascular Study Group of Northern New England (VSGNNE). We calculated the PIII CLI risk score for 1166 patients undergoing IB with autogenous vein by 59 surgeons at 11 hospitals between January 1, 2003, and December 31, 2007. Points (pts) were assigned to each patient for the presence of dialysis (4 pts), tissue loss (3 pts), age ≥75 (2 pts), and coronary artery disease (CAD) (1 pt). Baseline hematocrit was not included due to a large proportion of missing values. Total scores were used to stratify each patient into low-risk (≤3 pts), med-risk (4-7 pts), and high-risk (≥8 pts) categories. The Kaplan-Meier method was used to calculate AFS for the three risk groups. The log-rank test was used for intergroup comparisons. To assess validation, comparison to the PIII derivation and validation sets was performed. Stratification of the VSGNNE patients by risk category yielded three significantly different estimates for 1-year AFS (86.4%, 74.0%, and 56.1% for the low-, med-, and high-risk groups). Intergroup comparison demonstrated precise discrimination (P < .0001). For a given risk category (low, med, or high), the 1-year AFS estimates in the VSGNNE dataset were consistent with those observed in the previously published PIII derivation set (85.9%, 73.0%, and 44.6%, respectively), PIII validation set (87.7%, 63.7%, and 45.0%, respectively), and retrospective multicenter validation set (86.3%, 70.1%, and 47.8%, respectively). The PIII CLI risk score has now been both internally and externally validated by testing it against the outcomes of 3286 CLI patients who underwent autogenous vein bypass at 94 institutions by a diverse array of physicians (three independent cohorts of patients). This tool provides a simple and reliable method to risk stratify CLI patients being considered for IB. At initial consultation, calculation of the PIII CLI risk score can reliably stratify patients according to their risk of death or major amputation at 1 year.
    Journal of vascular surgery: official publication, the Society for Vascular Surgery [and] International Society for Cardiovascular Surgery, North American Chapter 08/2009; 50(4):769-75; discussion 775. · 3.52 Impact Factor
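Unlike most abstracts here, this one gives the full scoring rule (dialysis 4 pts, tissue loss 3, age ≥75 2, CAD 1; strata at ≤3, 4-7, ≥8), so the score itself can be written down directly. The Kaplan-Meier comparison below, however, runs on synthetic data: the prevalences and survival times are invented, with per-stratum event rates tuned only so that 1-year AFS lands near the reported 86.4%/74.0%/56.1%.

```python
# The PIII CLI score exactly as the abstract states it (dialysis 4 pts,
# tissue loss 3, age >= 75 2, CAD 1; strata <=3 / 4-7 / >=8), applied to
# a synthetic cohort. Prevalences and survival times are invented; the
# per-stratum hazards are tuned only so 1-year AFS lands near the
# reported 86.4% / 74.0% / 56.1%.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

def piii_risk_group(dialysis, tissue_loss, age_ge_75, cad):
    score = 4 * dialysis + 3 * tissue_loss + 2 * age_ge_75 + 1 * cad
    return "low" if score <= 3 else ("med" if score <= 7 else "high")

rng = np.random.default_rng(4)
n = 1166                                         # VSGNNE cohort size
df = pd.DataFrame({"dialysis": rng.random(n) < 0.12,
                   "tissue_loss": rng.random(n) < 0.60,
                   "age_ge_75": rng.random(n) < 0.30,
                   "cad": rng.random(n) < 0.40})
df["group"] = [piii_risk_group(*r) for r in df.itertuples(index=False)]

# Exponential event times, administratively censored at 12 months.
hazard = df["group"].map({"low": 0.15, "med": 0.30, "high": 0.55})
raw = rng.exponential(12.0 / hazard)
df["months"] = np.minimum(raw, 12.0)
df["event"] = raw < 12.0

for g, sub in df.groupby("group"):
    kmf = KaplanMeierFitter().fit(sub["months"], sub["event"], label=g)
    print(g, "1-year AFS ~", round(float(kmf.predict(12.0)), 3))
print("log-rank p =", multivariate_logrank_test(
    df["months"], df["group"], df["event"]).p_value)
```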
  • ABSTRACT: Although laparoscopic colectomy is reported to have favorable outcomes compared with open colectomy, it has yet to gain widespread acceptance in the United States. This study sought to investigate whether hospital volume is a factor determining the use of laparoscopy for colectomy. Using the Nationwide Inpatient Sample (NIS, 1998-2006), patients undergoing elective colon resection with and without laparoscopy were identified. Unique hospital identifiers were used to divide hospital volume into equal thirds, with the highest third defined as high volume and the lower two-thirds defined as low volume. The primary end point was the use of laparoscopy after adjustment for patient and hospital covariates. A total of 209,769 colon resections were performed in the study period. Overall, only 8,407 (4%) of these resections were performed with laparoscopy. High-volume centers, which tended to be large, urban teaching hospitals, treated more patients in the highest income bracket and more patients with private insurance than low-volume hospitals (p < 0.0001). High-volume hospitals used laparoscopy more often than low-volume hospitals (5.2% vs. 3.4%). After adjustment for covariates using multivariate analysis and propensity scores, patients with private insurance and those in the highest income bracket were more likely to receive laparoscopy (p < 0.0009). High-volume hospitals were more likely to perform laparoscopically assisted colectomy than low-volume hospitals (odds ratio [OR], 1.42; 95% confidence interval [CI], 1.23-1.56). Socioeconomic differences appear to exist between high- and low-volume hospitals in the use of laparoscopy. High hospital volume is associated with an increased likelihood that colectomy will be performed with laparoscopy.
    Surgical Endoscopy 08/2009; 24(3):662-9. · 3.43 Impact Factor