Article

Validation of a Current Definition of Early Allograft Dysfunction in Liver Transplant Recipients and Analysis of Risk Factors

Authors: Olthoff et al.

Abstract

Translational studies in liver transplantation often require an endpoint of graft function or dysfunction beyond graft loss. Prior definitions of early allograft dysfunction (EAD) vary, and none have been validated in a large multicenter population in the Model for End-Stage Liver Disease (MELD) era. We examined an updated definition of EAD to validate previously used criteria, and correlated this definition with graft and patient outcome. We performed a cohort study of 300 deceased donor liver transplants at 3 U.S. programs. EAD was defined as the presence of one or more of the following previously defined postoperative laboratory analyses reflective of liver injury and function: bilirubin ≥10 mg/dL on day 7, international normalized ratio ≥1.6 on day 7, and alanine or aspartate aminotransferases >2000 IU/L within the first 7 days. To assess predictive validity, the EAD definition was tested for association with graft and patient survival. Risk factors for EAD were assessed using multivariable logistic regression. Overall incidence of EAD was 23.2%. Most grafts met the definition with increased bilirubin at day 7 or high levels of aminotransferases. Of recipients meeting the EAD definition, 18.8% died, as opposed to 1.8% of recipients without EAD (relative risk = 10.7 [95% confidence interval: 3.6, 31.9]; P < 0.0001). More recipients with EAD lost their grafts (26.1%) than recipients with no EAD (3.5%) (relative risk = 7.4 [95% confidence interval: 3.4, 16.3]; P < 0.0001). Donor age and MELD score were significant EAD risk factors in a multivariable model. In summary, a simple definition of EAD using objective posttransplant criteria identified a 23% incidence and was highly associated with graft loss and patient mortality, validating previously published criteria. This definition can be used as an endpoint in translational studies aiming to identify mechanistic pathways leading to a subgroup of liver grafts with clinical expression of suboptimal function.
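The composite definition above lends itself to direct implementation. Below is a minimal, hypothetical Python sketch of the three laboratory criteria and of the relative-risk arithmetic reported in the abstract; all function and variable names are illustrative, not taken from the paper.

```python
# Hypothetical sketch of the Olthoff EAD criteria and the relative-risk
# calculation; names are invented, thresholds mirror the abstract above.
from math import exp, log, sqrt

def meets_ead_definition(bilirubin_day7: float, inr_day7: float,
                         peak_ast_alt_week1: float) -> bool:
    """True if any criterion is met: bilirubin >= 10 mg/dL on day 7,
    INR >= 1.6 on day 7, or peak ALT/AST > 2000 IU/L within 7 days."""
    return (bilirubin_day7 >= 10.0
            or inr_day7 >= 1.6
            or peak_ast_alt_week1 > 2000.0)

def relative_risk(a: int, b: int, c: int, d: int, z: float = 1.96):
    """RR and 95% CI from a 2x2 table: a/b = deaths/survivors with EAD,
    c/d = deaths/survivors without EAD (log-normal approximation)."""
    rr = (a / (a + b)) / (c / (c + d))
    se = sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)
```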


... We extracted baseline characteristics and perioperative data reported to be associated with clinical outcomes, including early allograft dysfunction (EAD) and acute kidney injury (AKI), after liver transplantation (Table 1) [13-18]. Postoperative clinical outcomes were collected, including the incidence of AKI [13,19,20], early allograft dysfunction [18], length of intensive care unit (ICU) stay, length of hospital stay, in-hospital all-cause mortality or graft failure, 1-year all-cause mortality or graft failure, and postoperative hemodialysis. ...
... Early allograft dysfunction was defined as the presence of one or more of the following: total bilirubin ≥ 10 mg/dL, prothrombin time-international normalized ratio ≥ 1.6 on the seventh postoperative day, or aspartate or alanine transaminase > 2000 IU/L within the first 7 postoperative days [18]. Surgical complication rates, including graft dysfunction, vascular complications, biliary complications, and wound infection, were compared using the Clavien-Dindo classification [21]. ...
Article
Full-text available
Although pulmonary artery catheter (PAC) has been used during liver transplantation surgery, the usefulness of PAC has rarely been investigated. We evaluated whether the use of PAC is associated with better clinical outcomes compared to arterial waveform-based monitoring after liver transplantation. A total of 1565 cases undergoing liver transplantation were reviewed. We determined whether patients received PAC or not and divided our cohort into the PAC group, with hemodynamic monitoring using PAC, and the non-PAC group, with arterial waveform-based monitoring using FloTrac-Vigileo. Propensity score matching was performed. Acute kidney injury (AKI), early allograft dysfunction (EAD) and 1-year all-cause mortality or graft failure were compared in the matched cohorts. Logistic regression analysis was performed in the inverse probability of treatment-weighted (IPTW) cohort for postoperative EAD and AKI, respectively. Five-year overall survival was compared between the two groups. In the matched cohort, there was no significant difference in the incidence of AKI, EAD, length of hospital or ICU stay, and 1-year all-cause mortality between the groups. In the IPTW cohort, the use of PAC was not a significant predictor for AKI or EAD (AKI: odds ratio (95% confidence interval) of 1.20 (0.47–1.56), p = 0.229; EAD: 0.99 (0.38–1.14), p = 0.323). There was no significant difference in survival between the groups after propensity score matching (log-rank test p = 0.578). In conclusion, posttransplant clinical outcomes were not significantly different between the groups with and without PAC. Anesthetic management without the use of PAC may be possible in low-risk patients during liver transplantation. The risk should be carefully assessed by considering MELD scores, ischemic time, surgical history, previous treatment of underlying liver disease, and degree of portal and pulmonary hypertension. Registration: https://clinicaltrials.gov/ct2/show/NCT05457114 (registration date: July 15, 2022).
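The abstract names inverse probability of treatment weighting without detailing it. As a rough sketch (assumed, not the authors' code), stabilized IPTW weights can be derived from a propensity model for PAC use; `df`, the treatment column, and the covariate names are placeholders.

```python
# Illustrative stabilized-IPTW sketch; DataFrame and column names are assumed.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def iptw_weights(df: pd.DataFrame, treatment: str, covariates: list) -> pd.Series:
    """Stabilized inverse-probability-of-treatment weights."""
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment])
    ps = ps_model.predict_proba(df[covariates])[:, 1]  # P(treated | X)
    p_marg = df[treatment].mean()                      # marginal P(treated)
    w = np.where(df[treatment] == 1, p_marg / ps, (1 - p_marg) / (1 - ps))
    return pd.Series(w, index=df.index)
```

Outcome models (e.g., logistic regression for EAD or AKI) are then fit with these weights, which is one common way to obtain weighted odds ratios like those reported above.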
... Early allograft dysfunction (EAD) is a functional hepatic insufficiency within one week of orthotopic liver transplantation (OLT). Known risk factors for EAD are shown in Table 1 [1]. ...
... The widely accepted Olthoff et al. definition of EAD is the primary outcome, specifically the presence of one or more of the following serum laboratory values: total bilirubin level ≥ 10 mg/dL on day 7, international normalized ratio (INR) level ≥ 1.6 on day 7, and peak aspartate transaminase (AST) or alanine transaminase (ALT) level > 2,000 IU/L within the first 7 postoperative days [1]. Secondary outcomes include: (1) the incidence and severity of postoperative acute kidney injury (AKI) as determined by the Kidney Disease Improving Global Outcomes (KDIGO) criteria, (2) number of postoperative days on mechanical ventilation, (3) evidence of postoperative pneumonia on chest x-ray or computed tomography scan within the first 30 days postoperatively, (4) surgical wound dehiscence or infection within the first 30 days postoperatively, (5) anastomotic or non-anastomotic biliary strictures, biliary leak, bile duct stones or sludge, or ischemic cholangiopathy within the first 30 days postoperatively, (6) postoperative hepatic artery or portal vein thrombosis within the first 30 days postoperatively, and (7) 30-day all-cause mortality. ...
Article
Full-text available
Early allograft dysfunction (EAD) is a functional hepatic insufficiency within a week of orthotopic liver transplantation (OLT) and is associated with morbidity and mortality. The etiology of EAD is multifactorial and largely driven by ischemia reperfusion injury (IRI), a phenomenon characterized by oxygen scarcity followed by paradoxical oxidative stress and inflammation. With the expanded use of marginal allografts more susceptible to IRI, the incidence of EAD may be increasing. This necessitates an in-depth understanding of the innate molecular mechanisms underlying EAD and interventions to mitigate its impact. Our central hypothesis is that peri-reperfusion hyperoxemia and immune dysregulation exacerbate IRI and increase the risk of EAD. We will perform a pilot prospective single-center observational cohort study of 40 patients. The aims are to determine (1) the association between peri-reperfusion hyperoxemia and EAD and (2) whether peri-reperfusion perturbed cytokine, protein, and hypoxia inducible factor-1 alpha (HIF-1α) levels correlate with EAD after OLT. Inclusion criteria include age ≥ 18 years, liver failure, and donation after brain or circulatory death. Exclusion criteria include living donor donation, repeat OLT within a week of transplantation, multiple organ transplantation, and pregnancy. Partial pressure of arterial oxygen (PaO2) as the study measure allows for the examination of oxygen exposure within the confines of existing variability in anesthesiologist-administered fraction of inspired oxygen (FiO2) and the inclusion of patients with intrapulmonary shunting. The Olthoff et al. definition of EAD is the primary outcome. Secondary outcomes include postoperative acute kidney injury, pulmonary and biliary complications, surgical wound dehiscence and infection, and mortality. The goal of this study protocol is to identify EAD contributors that could be targeted to attenuate its impact and improve OLT outcomes. If validated, peri-reperfusion hyperoxemia and immune perturbations could be targeted via FiO2 titration to a goal PaO2 and/or administration of an immunomodulatory agent by the anesthesiologist intraoperatively.
... Nevertheless, both in adult and pediatric liver transplantation, the risk factors associated with postoperative complications, such as initial poor graft function (IPGF), are not yet clearly identified. The degree and severity of ischemia-reperfusion injury (IRI) significantly impact the early recovery of graft function, which determines the patient's immediate prognosis (Briceno and Ciria 2010; Lee et al. 2016; Olthoff et al. 2010). ...
... Another factor associated with IPGF in our study is donor age, a result already reported in several studies. In a retrospective study of 300 deceased donor liver transplants, Olthoff et al. found an adjusted OR of 3.12 for the development of EAD in donors aged > 45 years (Olthoff et al. 2010). These results are consistent with earlier findings identifying donor age > 49 as an independent risk factor for EAD and primary non-function (Ploeg et al. 1993). ...
Article
Full-text available
Introduction Initial allograft function determines the patient's immediate prognosis in pediatric liver transplantation. Ischemia-reperfusion injuries play a role in initial poor graft function (IPGF). In animal studies, preconditioning with inhaled anesthetic agents has demonstrated a protective effect on the liver. In humans, the few available studies are conflicting. This study assesses the association between the hypnotic agent used to maintain anesthesia during hepatectomy in living donors and the occurrence of IPGF after pediatric transplantation. Methods We conducted a single-center retrospective analysis of children who received a living donor liver transplant (LDLT) between 2010 and 2019. We analyzed the incidence of IPGF according to the hypnotic agent used to maintain general anesthesia during donor hepatectomy. Results We included 183 pairs of patients (living donors-recipients). The anesthetics used in the donor were propofol (n = 85), sevoflurane (n = 69), or propofol with sevoflurane started 30 min before clamping (n = 29). Forty-two children (23%) developed IPGF. After multivariate logistic regression analysis, factors significantly associated with the occurrence of IPGF were the anesthesia maintenance agent used in the donor (p = 0.004), age of the donor (p = 0.03), duration of transplant surgery (p = 0.009), preoperative recipient neutrophil-to-lymphocyte ratio (p = 0.02), and albumin (p = 0.05). Conclusion Significantly fewer children who received a graft from a donor in whom only sevoflurane was used to maintain anesthesia developed IPGF. Although additional research is needed, this preconditioning strategy may provide an option to prevent IPGF after living liver donation.
... Primary non-function (PNF) was defined as peak AST ≥3,000 IU/L plus at least one of the following criteria: INR ≥2.5, serum lactate ≥4 mmol/L and total bilirubin ≥10 mg/dL (values measured on postoperative day 3, biliary obstruction being excluded) [20]. Early allograft dysfunction (EAD) was defined according to the Olthoff criteria [21]. ...
... Twenty-five patients (22.5%) underwent high urgency (HU) reLT. The median MELD score at reLT was 20 (14–26). The median BAR score in our cohort was 12 points (9–16) and ranged from 4 to 26 points. ...
Article
Full-text available
Liver retransplantation (reLT) yields poorer outcomes than primary liver transplantation, necessitating careful patient selection to avoid futile reLT. We conducted a retrospective analysis to assess reLT outcomes and identify associated risk factors. All adult patients who underwent a first reLT at the Medical University of Innsbruck from 2000 to 2021 (N = 111) were included. Graft and patient survival were assessed via Kaplan-Meier plots and log-rank tests. Uni- and multivariate analyses were performed to identify independent predictors of graft loss. Five-year graft and patient survival rates were 64.9% and 67.6%, respectively. The balance of risk (BAR) score was found to correlate with and be predictive of graft loss and patient death. The BAR score also predicted sepsis (AUC 0.676) and major complications (AUC 0.720). Multivariate Cox regression analysis identified sepsis [HR 5.179 (95% CI 2.575-10.417), p < 0.001] as the most significant independent risk factor for graft loss. At a cutoff of 18 points, the 5-year graft survival rate fell below 50%. The BAR score, a simple and easy-to-use score available at the time of organ acceptance, predicts and stratifies clinically relevant outcomes following reLT and may aid in clinical decision-making.
... Protocol-defined primary study endpoint was the incidence of EAD, defined by the Olthoff criteria [21] as post-LT presence of total bilirubin ≥ 10 mg/dL at 7 days, international normalized ratio ≥ 1.6 at 7 days, and/or ALT/AST > 2000 IU/L within 7 days and > 24 hours post-reperfusion. Binary assessment of EAD was supplemented in post hoc analysis with the Liver Graft Assessment Following Transplant score based on the AUC of 7-day post-LT variables (L-GrAFT7), including AST, international normalized ratio, total bilirubin, and platelet count. ...
... This was an FDA pivotal trial with a primary endpoint based on noninferiority of EAD between HMP-O2 and SCS by strict definition [21]. Results demonstrate noninferiority of HMP-O2 compared with SCS, with a significant decrease in noninfection-related AEs and multiple AEs in HMP-O2 recipients. Noninfection-related AEs include the occurrence of graft failure due to PNF and ischemic cholangiopathy, both of which only occurred in SCS recipients. ...
Article
Background & Aims In liver transplantation, cold preservation induces ischemia, resulting in significant reperfusion injury. Hypothermic Oxygenated Machine Perfusion (HMP-O2) has shown benefit compared to static cold storage (SCS) by limiting ischemia-reperfusion injury. This study reports outcomes using a novel portable HMP-O2 device in the first US randomized control trial. Approach & Results The PILOT™ trial (NCT03484455) was a multicenter, randomized, open-label, non-inferiority trial, with participants randomized to HMP-O2 or SCS. HMP-O2 livers were preserved using the Lifeport® Liver Transporter and Vasosol® perfusion solution. Primary outcome was early allograft dysfunction (EAD). Non-inferiority margin was 7.5%. From 4/3/19 to 7/12/22, 179 patients were randomized to HMP-O2 (n=90) or SCS (n=89). The per-protocol cohort included 63 HMP-O2 and 73 SCS. EAD occurred in 11.1% of HMP-O2 (n=7) and 16.4% of SCS (n=12). The risk difference between HMP-O2 and SCS was -5.33% (one-sided 95% upper confidence limit of 5.81%), establishing noninferiority. Risk of graft failure as predicted by L-GrAFT7 was lower with HMP-O2 (median [IQR] 3.4% [2.4-6.5] vs. 4.5% [2.9-9.4], p = 0.024). Primary nonfunction occurred in 2.2%, all SCS (n=3, p = 0.10). Biliary strictures occurred in 16.4% of SCS (n=12) and 6.3% (n=4) of HMP-O2 (p = 0.18). Non-anastomotic biliary strictures occurred only in SCS (n=4). Conclusions HMP-O2 demonstrates safety and noninferior efficacy for liver graft preservation in comparison to SCS. Graft failure risk by L-GrAFT7 was lower in HMP-O2, suggesting improved early clinical function. Recipients of HMP-O2 livers also demonstrated a lower incidence of PNF and biliary strictures, although this difference did not reach significance.
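The noninferiority claim rests on the upper confidence limit of the EAD risk difference staying below the 7.5% margin. The sketch below reproduces that arithmetic with a simple Wald interval using the per-protocol counts above; the trial's exact interval method is not stated here, so the computed limit differs somewhat from the reported 5.81%.

```python
# Risk-difference noninferiority check; a plain Wald interval, not
# necessarily the trial's exact statistical method.
from math import sqrt

def noninferior(events_trt, n_trt, events_ctl, n_ctl,
                margin=0.075, z=1.645):
    p_t, p_c = events_trt / n_trt, events_ctl / n_ctl
    rd = p_t - p_c
    se = sqrt(p_t * (1 - p_t) / n_trt + p_c * (1 - p_c) / n_ctl)
    upper = rd + z * se  # one-sided 95% upper confidence limit
    return rd, upper, upper < margin

# EAD: 7/63 with HMP-O2 vs. 12/73 with SCS
rd, upper, ok = noninferior(7, 63, 12, 73)
print(f"RD = {rd:.2%}, upper limit = {upper:.2%}, noninferior: {ok}")
```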
... These patients developed high transaminases, a low prothrombin time, high bilirubin, and high lactate within 24 h after liver transplantation [22]. Early allograft dysfunction (EAD) was defined by the presence of at least one of the following criteria, as proposed by Olthoff et al. [23]: bilirubin ≥ 170 µmol/L (10 mg/dL) at day 7, international normalized ratio (INR) of 1.6 or greater at day 7, and transaminases ≥ 2000 IU/L within the first 7 days. Acute rejection was classified using the Banff grades [24]. ...
Article
Full-text available
Simple Summary This study examines the viability of liver grafts from donors aged 85 years and older in liver transplantation (LT) compared to those from younger donors under 40 years old. The research, conducted on data from 2005 to 2023, evaluates post-LT outcomes using propensity score matching. Despite lower 5-year survival rates in the elderly group before matching, the proposed nomogram provides a more acceptable 10-year post-LT survival using grafts from older donors. Notably, the study emphasizes the importance of proper matching, particularly for recipients with hepatocellular carcinoma (HCC), in achieving satisfactory long-term results amid organ scarcity. Abstract Background: Despite the ongoing trend of increasing donor ages in the liver transplantation (LT) setting, a notable gap persists in the availability of comprehensive guidelines for the utilization of organs from elderly donors. This study aimed to evaluate the viability of liver grafts from donors aged ≥85 years and report the post-LT outcomes compared with those from “ideal” donors under 40 years old. Methods: Conducted retrospectively at a single center from 2005 to 2023, this study compared outcomes of LTs from donors aged ≥85 y/o and ≤40 y/o, with propensity score matching on recipient gender, age, BMI, MELD score, redo-LT, LT indication, and cause of donor death. Results: A total of 76 patients received grafts from donors ≥85 y/o and were compared to 349 liver grafts from donors ≤40 y/o. Prior to PSM, the 5-year overall survival was 63% for the elderly group and 77% for the young group (p = 0.002). After PSM, the 5-year overall survival was 63% and 73% (p = 0.1). A nomogram, developed at the time of graft acceptance and including HCC features, predicted 10-year survival after LT using a graft from a donor aged ≥85. Conclusions: In the context of organ scarcity, elderly donors emerge as a partial solution. Nonetheless, without proper selection, LT using very elderly donors yields inferior long-term outcomes compared to transplantation from very young donors ≤40 y/o. The resulting nomogram based on pre-transplant criteria allows for the optimization of elderly donor/recipient matching to achieve satisfactory long-term results, in addition to traditional matching methods.
... EAD was defined and evaluated according to the criteria described by Olthoff et al. [15]. Briefly, one of the following criteria had to be present to fulfill Olthoff's criteria for EAD: bilirubin ≥ 10 mg/dL on POD7, international normalized ratio (INR) ≥ 1.6 on POD7, alanine or aspartate aminotransferases (ALAT/ASAT) > 2000 IU/L within the first seven PODs. ...
Article
Full-text available
Background: Platelets were shown to be relevant for liver regeneration. In particular, platelet-stored serotonin (5-HT) proved to be a pro-regenerative factor in this process. The present study aimed to investigate the perioperative course of 5-HT and evaluate associations with patient and graft outcomes after orthotopic liver transplantation (OLT). Methods: 5-HT was quantified in plasma and serum of 44 OLT recipients perioperatively, and in their respective donors. Olthoff’s criteria for early allograft dysfunction (EAD) were used to evaluate postoperative outcomes. Results: Patients with higher donor intra-platelet 5-HT per platelet (IP 5-HT PP) values had significantly lower postoperative transaminases (ASAT POD1: p = 0.006, ASAT POD5: p = 0.006, ASAT POD10: p = 0.02, ALAT POD1: p = 0.034, ALAT POD5: p = 0.017, ALAT POD10: p = 0.04). No significant differences were seen between postoperative 5-HT values and the occurrence of EAD. There was a tendency for donor IP 5-HT PP to be lower in donor-recipient pairs that developed EAD (p = 0.07). Conclusions: Donor IP 5-HT PP might be linked to the postoperative development of EAD after OLT, as higher donor levels are correlated with a more favorable postoperative course of transaminases. Further studies with larger cohorts are needed to validate these findings.
... The primary outcomes included in this study are biliary complications, early graft dysfunction and acute graft rejection, as shown in Table 3. Biliary complications included both bile leak and stricture. Early allograft dysfunction was defined as meeting one of the following criteria: serum bilirubin >10 mg/dL on day 7, INR >1.6 on day 7, and AST or ALT activity >2000 U/L in the first 7 days after liver transplantation [13]. Acute graft rejection was defined on histology. ...
Article
Liver transplantation (LT) is a highly effective treatment for well-selected recipients with end stage liver disease, fulminant hepatic failure and hepatocellular carcinoma (HCC). The number of patients awaiting LT continues to grow [1]. Unfortunately, the tremendous burden of end-stage liver disease and HCC has not been met with a proportional growth in organ donation and transplantation. Efforts to continually refine organ allocation practices to ensure optimal use of this scarce resource are of paramount importance. Previous studies have identified gender discordant hepatic transplants—specifically female (F) donors to male (M) recipients (FM transplantation)—as resulting in inferior patient and graft survival rates as compared with gender concordance (FF or MM) [2].
... In the EVR group, patients with HCC recurrence showed later EVR introduction (median (IQR) = 52 (26.4) versus 30 (12) days; P<0.001), shorter duration of treatment (median (IQR) = 47.6 (57.0) versus 69.9 (24.8) months; P<0.001), and lower drug exposure (median (IQR) = 3.65 (0.55) versus 5.9 (1.4) ng/mL; P<0.001) (Table 6). [Table fragment: timing of EVR introduction, median (IQR), 30 (16) days; duration of EVR treatment, median (IQR), 46.6 (36.1) months; EVR whole-blood exposure, median (IQR), 5.8 (1.7) ng/mL. NOTE: EVR, everolimus; HCC, hepatocellular carcinoma; IPTW, inverse probability of treatment weighting; IQR, interquartile range; MACE, major cardiovascular events.] Figure 1 illustrates the OS according to Milan criteria at transplant and type of immunosuppressant (EVR versus TAC). ...
Preprint
Full-text available
To obtain long-term data on the use of everolimus in patients who underwent liver transplantation for hepatocellular carcinoma, we conducted a retrospective, single-center analysis of adult recipients transplanted between 2013 and 2021. Patients on everolimus-incorporating immunosuppression were matched with those on tacrolimus using an inverse probability of treatment weighting methodology. Two propensity-matched groups of patients were thus compared: 233 (45.6%) receiving everolimus versus 278 (54.4%) on tacrolimus. At a median (interquartile range) follow-up of 4.4 (3.8) years after transplantation, everolimus patients showed a reduced risk of recurrence versus tacrolimus (7.7% versus 16.9%; RR=0.45; P=0.002). At multivariable analysis, microvascular infiltration (HR=1.22; P<0.04) and a higher tumor grading (HR=1.27; P<0.04) were associated with a higher recurrence rate, while being within Milan criteria at transplant (HR=0.56; P<0.001), a successful pre-transplant downstaging (HR=0.63; P=0.01) and use of everolimus (HR=0.46; P<0.001) had a positive impact on the risk of post-transplant recurrence. EVR patients with earlier drug introduction (≤30 days; P<0.001), longer treatment duration (P<0.001), and higher drug exposure (≥5.9 ng/mL; P<0.001) showed lower recurrence rates versus TAC. Based on our experience, everolimus provides a reduction of the relative risk of hepatocellular carcinoma recurrence, especially for advanced-stage patients and those with earlier drug administration, higher drug exposure, and longer time on treatment. These data advocate for early everolimus introduction after liver transplantation to reduce the attrition rate consequent to chronic immunosuppression.
... Primary nonfunction (PNF) was defined as peak aspartate aminotransferase ≥3000 IU/L and at least 1 of the following criteria: international normalized ratio ≥2.5, serum lactate ≥4 mmol/L, and total bilirubin ≥10 mg/dL at postoperative day 3. Early allograft dysfunction (EAD) was defined according to the Olthoff criteria. 12 Biliary complications were classified as bile duct leaks, anastomotic stenosis (AS), non-AS and cholangitis. Multifocal pathologies affecting the macroscopic donor bile ducts (non-AS, biliary cast syndrome and bile duct necrosis with intrahepatic leakage and bilioma formation) in the absence of thrombosis or severe stenosis of the HA that could not be explained by recurrent disease (ie, primary sclerosing cholangitis) were classified as posttransplant cholangiopathy. ...
Article
Background: Normothermic liver machine perfusion (NLMP) is advancing the field of liver transplantation (LT). Beyond improved preservation and organ assessment, NLMP helps to increase organ utilization. We herein address the feasibility and merit of NLMP in split liver transplantation (SLT) to postpone the transplantation of the second split graft to the following day. Methods: We analyzed the perfusion characteristics and outcomes of all consecutive adult recipients who underwent SLT following NLMP from February 1, 2018, to June 30, 2023. The primary endpoint was 90-d graft and patient survival. Secondary endpoints were posttransplant complications and 90-d morbidity. Results: Three right and three extended right SLTs following NLMP have been performed. NLMP was uneventful in all cases. Perfusion characteristics differed according to graft volume. Mean perfusion time was 17:00 h (±05:13) and bile production ranged between 8 and 21 mL/h. All split grafts fulfilled predefined center viability criteria during NLMP and were transplanted on the following day. The 90-d graft and patient survival rate was 100%. Three patients (50%) required an early relaparotomy, and 2 patients (33.3%) developed biliary complications. The 90-d morbidity as recorded by the comprehensive complication index was 62.7 (±24.7). Conclusion: NLMP of split liver grafts is technically feasible and safe. Through prolongation of preservation time, NLMP allows transplantation of the second split liver graft to be safely postponed to the next day.
... The warm ischemia time (WIT) was the duration of ischemia during graft implantation. The definition of EAD was based on the research of Olthoff et al. [16]. Retrospectively, MELD scores were recalculated using the laboratory data available at the time of transplantation. ...
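Several of the citing studies recalculate MELD retrospectively from laboratory data, as here. For illustration, below is a sketch of the classic UNOS MELD formula with its conventional bounds; it is a generic textbook formula, not code from any of the cited studies.

```python
# Classic (pre-MELD-Na) UNOS MELD score recalculated from labs.
# Conventional bounds: values below 1 are floored at 1 and creatinine
# is capped at 4 mg/dL. Shown for illustration only.
from math import log

def meld(bilirubin_mg_dl: float, inr: float, creatinine_mg_dl: float) -> int:
    bili = max(bilirubin_mg_dl, 1.0)
    inr_b = max(inr, 1.0)
    crea = min(max(creatinine_mg_dl, 1.0), 4.0)
    score = 3.78 * log(bili) + 11.2 * log(inr_b) + 9.57 * log(crea) + 6.43
    return round(score)

print(meld(2.5, 1.8, 1.1))  # -> 17
```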
Article
Full-text available
Simple Summary Everolimus is an immunosuppressive drug used to prevent rejection after liver transplantation. It is an attractive alternative to tacrolimus for patients with hepatocellular carcinoma who are undergoing liver transplantation due to its antiproliferative effects. In our study, we investigated whether liver transplant patients who received everolimus after transplantation had a reduced risk of hepatocellular carcinoma recurrence compared to those on tacrolimus. In a group of 511 patients, recipients treated with everolimus exhibited a reduced risk of tumor recurrence after transplantation. This was particularly true for patients with more advanced tumors and who received the drug earlier and for longer periods. We recommend including everolimus in the post-transplant immunosuppressive regimen to optimize outcomes of liver transplantation for hepatocellular carcinoma. Abstract To obtain long-term data on the use of everolimus in patients who underwent liver transplantation for hepatocellular carcinoma, we conducted a retrospective, single-center analysis of adult recipients transplanted between 2013 and 2021. Patients on everolimus-incorporating immunosuppression were matched with those on tacrolimus using an inverse probability of treatment weighting methodology. Two propensity-matched groups of patients were thus compared: 233 (45.6%) receiving everolimus versus 278 (54.4%) on tacrolimus. At a median (interquartile range) follow-up of 4.4 (3.8) years after transplantation, everolimus patients showed a reduced risk of recurrence versus tacrolimus (7.7% versus 16.9%; RR = 0.45; p = 0.002). At multivariable analysis, microvascular infiltration (HR = 1.22; p < 0.04) and a higher tumor grading (HR = 1.27; p < 0.04) were associated with higher recurrence rate while being within Milan criteria at transplant (HR = 0.56; p < 0.001), a successful pre-transplant downstaging (HR = 0.63; p = 0.01) and use of everolimus (HR = 0.46; p < 0.001) had a positive impact on the risk of post-transplant recurrence. EVR patients with earlier drug introduction (≤30 days; p < 0.001), longer treatment duration (p < 0.001), and higher drug exposure (≥5.9 ng/mL; p < 0.001) showed lower recurrence rates versus TAC. Based on our experience, everolimus provides a reduction in the relative risk of hepatocellular carcinoma recurrence, especially for advanced-stage patients and those with earlier drug administration, higher drug exposure, and longer time on treatment. These data advocate for early everolimus introduction after liver transplantation to reduce the attrition rate consequent to chronic immunosuppression.
... Therefore, patients who died or were re-transplanted in the early postoperative period (i.e., before POD 30) were excluded, as the causes were presumably related to primary graft dysfunction or postoperative complications (usually vascular causes). This may explain why Child-Pugh grade was not a risk factor for poor outcomes in our study, as these risk factors usually lead to early death or re-transplantation [24]. ...
Article
Full-text available
Background The pharmacokinetics of tacrolimus (TAC) show high intra-patient variability (IPV), which is associated with poor long-term outcomes following adult liver transplantation (LT). However, this relationship remains to be confirmed in pediatric liver transplant (PLT) recipients. The present study aimed to investigate the association between TAC IPV and graft or patient outcomes after pediatric liver transplantation. Methods This retrospective study included 848 PLT recipients (including infants) between January 2016 and June 2021. The IPV of TAC concentrations was estimated by calculating the coefficient of variation (CV) of trough concentrations in whole blood within 1 month after transplantation. Patients were categorized into two groups, low IPV (CV < 45%) and high IPV (CV ≥ 45%), based on the third quartile of the CV distribution. Results A total of 848 patients were included in our study. The low CV group included 614 patients, with a mean TAC trough concentration of 8.59 ± 1.65 ng/ml and a median CV of 32.37%. In contrast, the high CV group included 214 patients; the mean TAC trough concentration and median CV were 8.81 ± 2.00 ng/ml and 54.88%, respectively. The median hospital stay was significantly longer in the high CV group (22 days vs. 20 days, P = 0.01). Univariate analysis showed significant differences in 1-year recipient survival (P = 0.041) and 1-year graft survival (P = 0.005) between the high- and low-CV groups. Moreover, high CV (HR 2.316, 95% CI 1.026–5.231, P = 0.043) and persistent EBV viremia (HR 13.165, 95% CI 3.090–56.081, P < 0.001) were identified as independent risk factors for 1-year mortality after PLT. Conclusions PLT recipients with a high CV of TAC trough concentrations in the first month had poorer 1-year outcomes. This CV calculation provides a valuable strategy to monitor TAC exposure.
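The IPV metric here is simply the coefficient of variation of tacrolimus trough levels over the first posttransplant month. A minimal sketch with made-up trough values, assuming the ≥45% cutoff described above:

```python
# CV of tacrolimus trough concentrations; trough values are invented.
import statistics

def tacrolimus_cv(troughs):
    """CV (%) = SD / mean * 100 of trough concentrations."""
    return statistics.stdev(troughs) / statistics.mean(troughs) * 100

troughs = [7.2, 9.8, 6.1, 11.4, 8.3]   # ng/mL over the first month
cv = tacrolimus_cv(troughs)
print(f"CV = {cv:.1f}% -> {'high' if cv >= 45 else 'low'} IPV")
```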
... Posttransplant, the recipient met the criteria for early allograft dysfunction (EAD), had a Model for Early Allograft Function (MEAF) score of 6.15, and had stage 2 acute kidney injury (AKI; Table 1) [3,4]. Extubation occurred 8.6 h after case completion. Intensive care unit (ICU) and hospital length of stay (LOS) were 1 and 5 d, respectively. ...
... Early allograft dysfunction (EAD) served as the primary endpoint; the Model for Early Allograft Function (MEAF) [23], Liver Graft Assessment Following Transplantation (L-GrAFT) [24,25], graft and patient survival, length of stay and biliary complications served as secondary endpoints. EAD was defined as the presence of one or more of i) bilirubin ≥10 mg·dL−1 on day seven after transplantation, ii) international normalized ratio (INR) ≥ 1.6 on day seven, and iii) alanine (ALT) or aspartate aminotransferase (AST) > 2000 IU·L−1 within the first 7 days after liver transplantation [26]. ...
Article
Full-text available
Donor organ biomarkers with sufficient predictive value in liver transplantation (LT) are lacking. We herein evaluate liver viability and mitochondrial bioenergetics for their predictive capacity towards the outcome in LT. We enrolled 43 consecutive patients undergoing LT. Liver biopsy samples taken upon arrival after static cold storage were assessed by histology, real-time confocal imaging analysis (RTCA), and high-resolution respirometry (HRR) for mitochondrial respiration of tissue homogenates. Early allograft dysfunction (EAD) served as the primary endpoint. HRR data were analysed with a focus on the efficacy of ATP production, or P−L control efficiency, calculated as 1 − L/P from the capacity of oxidative phosphorylation (P) and non-phosphorylating respiration (L). Twenty-two recipients experienced EAD. Pre-transplant histology was not predictive of EAD. The mean RTCA score was significantly lower in the EAD cohort (−0.75 ± 2.27) compared to the immediate function (IF) cohort (0.70 ± 2.08; p = 0.01), indicating decreased cell viability. P−L control efficiency was predictive of EAD (0.76 ± 0.06 in IF versus 0.70 ± 0.08 in EAD livers; p = 0.02) and correlated with the RTCA score. Both RTCA and P−L control efficiency in biopsy samples taken during cold storage have predictive capacity towards the outcome in LT. Therefore, RTCA and HRR should be considered for risk stratification, viability assessment, and bioenergetic testing in liver transplantation.
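For concreteness, the P−L control efficiency reduces to a one-line computation; the respiration values below are illustrative only, chosen so the result matches the reported IF-cohort mean.

```python
# P-L control efficiency = 1 - L/P, with P = oxidative phosphorylation
# capacity and L = non-phosphorylating (LEAK) respiration. Example values
# are invented for illustration.
def pl_control_efficiency(P: float, L: float) -> float:
    return 1.0 - L / P

# e.g., respiration in pmol O2 / (s * mg tissue)
print(pl_control_efficiency(P=100.0, L=24.0))  # 0.76, the IF-cohort mean
```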
... Two patients developed initial poor graft function, a less severe form of PNF. According to Olthoff et al. [13], using the Pittsburgh definition, initial poor graft function can be diagnosed by meeting at least one of the following criteria: bilirubin concentration ≥ 10 mg/dL on day 7, INR ≥ 1.6 on day 7, and AST or ALT activity > 2000 U/L in the first 7 days after transplantation. Two patients developed hepatic artery thrombosis and one patient had right portal vein thrombosis. ...
Article
Full-text available
Liver transplantation is the treatment of choice for end-stage liver disease and, despite accumulated experience over the years, improved surgical techniques, better immunosuppression and adequate intensive care management, it still represents the greatest challenge for anesthesiologists. The aim of the study was the characterization of the hemodynamic profile of patients with liver cirrhosis undergoing liver transplantation with the help of the PiCCO system during the three surgical stages, the impact of bleeding on hemodynamic status, and the correlation between the amount of bleeding, lactate levels, severity scores, survival rate and complications. Another focus of this study was the amount of transfused blood products and their impact on postoperative complications. Our study included 70 patients who underwent liver transplantation in our center and were hemodynamically monitored with the PiCCO system. Data were processed using the Python 3.9 programming language. Results: The mean MELD severity score was 18 points. During surgery, significant variations in the hemodynamic parameters occurred. All patients had a decrease in cardiac output in the anhepatic phase, with 50% presenting a decrease of more than 40%. In total, 78% of patients showed a decrease in the global ejection fraction, with a median value of 30%. Overall, 75% of patients had a total blood loss of less than 6000 mL, and 31 patients developed immediate postoperative complications; the probability of complications reached 50% when blood loss exceeded 6500 mL. Seven patients (10%) did not survive after 30 days. A neohepatic serum lactate level of 5 mmol/L corresponds to a 50% probability of complications. Conclusions: Surgical technique causes an important decrease in cardiac output. Intraoperative bleeding has a major impact on outcome and the first month represents a critical period after liver transplantation. Statistical tests describe the probability of 30/90-day survival and the occurrence of complications according to variables such as intraoperative bleeding and MELD severity score. Intraoperative transfusion correlates with the occurrence of postoperative complications.
... Cold ischemia time (CIT) was defined as the time from cross-clamping until the removal of the organ from the ice for implantation, and warm ischemia time (WIT) was defined as the time of ischemia during graft implantation. EAD was defined according to Olthoff et al. [12]. MELD scores at transplant were recalculated retrospectively based on available laboratory data. ...
Article
Full-text available
Background: In Italy, data on long-term survivors after liver transplantation are lacking. Materials and Methods: We conducted a hybrid-design study on a cohort of 359 adult recipients who received transplants between 1996 and 2002 to identify predictors of survival and the prevalence of co-morbidities among long-term survivors. Results: The actuarial (95% CI) patient survival was 96% (94.6–98.3%), 69% (64.2–73.6%), 55% (49.8–59.9%), 42.8% (37.6–47.8%), and 34% (29.2–38.9%) at 1, 5, 10, 15, and 20 years, respectively. The leading causes of death were hepatitis C virus recurrence (24.6%), extrahepatic malignancies (16.9%), infection (14.4%), and hepatocellular carcinoma recurrence (14.4%). The factors associated with the survival probability were younger donor and recipient ages (p = 0.001 and 0.004, respectively), female recipient sex (p < 0.001), absence of HCV (p < 0.01), absence of HCC (p = 0.001), and absence of diabetes mellitus at one year (p < 0.01). At the latest follow-up, the leading comorbidities were hypertension (53.6%), obesity (18.7%), diabetes mellitus (17.1%), hyperlipidemia (14.7%), chronic kidney dysfunction (14.7%), and extrahepatic malignancies (13.8%), with 73.9% of patients having more than one complication. Conclusions: Aging with a liver graft is associated with an increased risk of complications and requires ongoing care to reduce the long-term attrition rate resulting from chronic immunosuppression.
... Cold ischemia time (CIT) was defined as the time from cross-clamping until removal of the organ from the ice for implantation, and warm ischemia time (WIT) was defined as the time of ischemia during graft implantation. EAD was defined according to Olthoff et al. [12]. MELD scores at transplant were recalculated retrospectively based on available laboratory data. ...
Preprint
Full-text available
We conducted a hybrid-design study on a historical cohort of 359 adult recipients of a liver graft who received transplants between 1996 and 2002. The study aimed to identify predictors of survival and investigate the prevalence of co-morbidities among long-term survivors. The actuarial (95% CI) patient survival of the overall study cohort was 96% (94.6%-98.3%), 69% (64.2%-73.6%), 55% (49.8%-59.9%), 42.8% (37.6%-47.8%), and 34% (29.2%-38.9%) at 1, 5, 10, 15, and 20 years, respectively. The actuarial (95% CI) graft survival was 93.8% (91.6%-97.2%), 67.6% (62.2%-71.5%), 54.3% (49.6%-59.3%), 42.1% (37.3%-47.4%), and 33.8% (29.0%-38.4%) at 1, 5, 10, 15, and 20 years, respectively. The leading cause of death was hepatitis C virus recurrence (24.6%), followed by extrahepatic malignancies (16.9%), infection (14.4%), and hepatocellular carcinoma recurrence (14.4%). The Cox proportional hazards analysis revealed that several independent factors were associated with the survival probability. These factors include younger donor and recipient ages (p=0.001 and 0.004, respectively), female recipient sex (p<0.001), absence of HCV (p<0.01), absence of HCC (p=0.001), and absence of diabetes mellitus at one year (p<0.01). At the latest follow-up, the leading comorbidities in the long-term survivors were hypertension (53.6%), obesity (18.7%), diabetes mellitus (17.1%), hyperlipidemia (14.7%), chronic kidney dysfunction (14.7%), and extrahepatic malignancies (13.8%), with 73.9% of patients having more than one complication. The Kaplan-Meier probability (95% CI) of complication-free survival was 65.4% (60.2%-70.3%), 38.4% (33.4%-43.7%), 17.8% (14.1%-22.2%), 7.2% (4.8%-10.5%), and 1.94% (0.85%-4.15%) at 1, 5, 10, 15, and 20 years, respectively. Aging with a liver graft is associated with an increased risk of complications and requires ongoing care to reduce the long-term attrition rate resulting from chronic immunosuppression.
... The 4-hour viability assessment period was long enough only to recover the metabolic function to clear lactate, but insufficient to restore bile secretion, and 3 out of 4 of those livers developed early allograft dysfunction [27]. The implementation of objective evaluation of livers from donors currently deemed too high risk can unlock a pool of organs that until now have not been considered of sufficient quality for transplantation, including steatotic organs [8]. The unique feature of the VITTAL ... [Figure 3: Biliary features and risks for development of nonanastomotic biliary strictures.]
Article
Normothermic machine perfusion (NMP) enables pretransplant assessment of high-risk donor livers. The VITTAL trial demonstrated that 71% of the currently discarded organs could be transplanted with 100% 90-day patient and graft survivals. Here, we report secondary end points and 5-year outcomes of this prospective, open-label, phase 2 adaptive single-arm study. The patient and graft survivals at 60 months were 82% and 72%, respectively. Four patients lost their graft due to nonanastomotic biliary strictures, one caused by hepatic artery thrombosis in a liver donated following brain death, and 3 in elderly livers donated after circulatory death (DCD), which all clinically manifested within 6 months after transplantation. There were no late graft losses for other reasons. All the 4 patients who died during the study follow-up had functioning grafts. Nonanastomotic biliary strictures developed in donated after circulatory death livers that failed to produce bile with pH >7.65 and bicarbonate levels >25 mmol/L. Histological assessment in these livers revealed high bile duct injury scores characterized by arterial medial necrosis. The quality of life at 6 months significantly improved in all but 4 patients suffering from nonanastomotic biliary strictures. This first report of long-term outcomes of high-risk livers assessed by normothermic machine perfusion demonstrated excellent 5-year survival without adverse effects in all organs functioning beyond 1 year (ClinicalTrials.gov number NCT02740608).
... The secondary endpoints for the trial are summarised in box 2 [22-26]. Vasopressors are reported as raw doses or norepinephrine equivalent dosage [13]. Categorical variables are summarised by frequency and percentage and continuous variables as median and IQR by arm. ...
Article
Full-text available
Introduction Catecholamine vasopressors such as norepinephrine are the standard drugs used to maintain mean arterial pressure during liver transplantation. At high doses, catecholamines may impair organ perfusion. Angiotensin II is a peptide vasoconstrictor that may improve renal perfusion pressure and glomerular filtration rate, a haemodynamic profile that could reduce acute kidney injury. Angiotensin II is approved for vasodilatory shock but has not been rigorously evaluated for treatment of hypotension during liver transplantation. The objective is to assess the efficacy of angiotensin II as a second-line vasopressor infusion during liver transplantation. This trial will establish the efficacy of angiotensin II in decreasing the dose of norepinephrine to maintain adequate blood pressure. Completion of this study will allow design of a follow-up, multicentre trial powered to detect a reduction of organ injury in liver transplantation. Methods and analysis This is a double-blind, randomised clinical trial. Eligible subjects are adults with a Model for End-Stage Liver Disease Sodium Score ≥25 undergoing deceased donor liver transplantation. Subjects are randomised 1:1 to receive angiotensin II or saline placebo as the second-line vasopressor infusion. The study drug infusion is initiated on reaching a norepinephrine dose of 0.05 µg·kg⁻¹·min⁻¹ and titrated per protocol. The primary outcome is the dose of norepinephrine required to maintain a mean arterial pressure ≥65 mm Hg. Secondary outcomes include vasopressin or epinephrine requirement and duration of hypotension. Safety outcomes include incidence of thromboembolism within 48 hours of the end of surgery and severe hypertension. An intention-to-treat analysis will be performed for all randomised subjects receiving the study drug. The total dose of norepinephrine will be compared between the two arms by a one-tailed Mann-Whitney U test. Ethics and dissemination The trial protocol was approved by the local Institutional Review Board (#20–30948). Results will be posted on ClinicalTrials.gov and published in a peer-reviewed journal. Trial registration number ClinicalTrials.gov NCT04901169
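The stated primary analysis, a one-tailed Mann-Whitney U test on total norepinephrine dose, is straightforward to express; the dose values below are invented placeholders, not trial data.

```python
# One-tailed Mann-Whitney U comparison of total norepinephrine dose by arm;
# all numbers are placeholders.
from scipy.stats import mannwhitneyu

norepi_angiotensin = [4.1, 2.8, 5.0, 3.3, 2.1]  # mg, hypothetical totals
norepi_placebo     = [6.2, 5.5, 7.9, 4.8, 6.7]

# H1: doses are lower in the angiotensin II arm than in the placebo arm
stat, p = mannwhitneyu(norepi_angiotensin, norepi_placebo, alternative="less")
print(f"U = {stat}, one-tailed p = {p:.4f}")
```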
... The following clinical endpoints were described: occurrence of acute kidney injury (AKI) requiring renal replacement therapy, rate of postreperfusion syndrome (PRS), rate of EAD (as defined by Olthoff et al [15]), peak levels of AST and ALT within the first 7 postoperative d, peak lactate level in the recipient at the transplant operation, grade ≥3 complications rate according to Dindo et al [16], length of hospital and ICU stay, the Comprehensive Complication Index [17] at discharge from the index admission, acute rejection rate, biliary complication rate, and patient and graft survival. ...
Article
Full-text available
Background Although hypothermic oxygenated perfusion (HOPE) improves posttransplant outcomes, setting up machine perfusion programs may be subject to specific obstacles under different conditions. This study aims to describe the establishment of HOPE in a real-life setting in Brazil. Methods Extended criteria donation after brain death organs preserved by HOPE were accepted for higher-risk candidates needing expedited transplantation, perceived as those who would benefit most from the technique because of its limited availability. Extended criteria donors were defined by the Eurotransplant criteria. High-risk transplant candidates were characterized by suboptimal surgical conditions related to the recipient or the procedure. Results Six HOPE-preserved grafts were transplanted from February 2022 to August 2022. The mean donor risk index was 1.7 (SD 0.5). One organ was severely steatotic, and 3 had an anticipated cold ischemia time above 12 h. Recipients’ mean Model for End-Stage Liver Disease score was 28.67 (SD 6.79), with 1 case of retransplant, 1 of refractory ascites, and 1 of acute-on-chronic liver failure. The mean cold ischemia time was 5 h 42 min (SD 82 min), HOPE 6 h 3 min (SD 150 min), and total preservation time 11 h 46 min (SD 184 min). No case had early allograft dysfunction. The mean length of hospital stay was 10 d, with 100% graft and patient survival and no ischemic cholangiopathies at a median follow-up of 15 mo (min 12, max 18). Costs and country-specific legal regulations for device utilization were the major hurdles to implementing the program. Conclusion We presented a pathway to introduce and rationalize the use of HOPE in a scenario of challenging donor-recipient matching, with good results. These findings may aid in implementing machine perfusion programs, especially in settings with limited resources or complex transplant logistics.
Chapter
Liver transplantation has evolved to become safe standard-of-care therapy for patients with end-stage liver disease and certain hepatic malignancies. Careful recipient selection and appropriate utilization of brain-dead, donation-after-cardiac-death, or living donors allow for excellent recipient outcomes. Meticulous attention to surgical detail and proactive management of potential technical complications are important for success. Most post-transplant complications can be efficiently managed at experienced transplant centers. In quality measurement, regulatory monitoring, and allocation policy, there is decreased emphasis on one-year patient and graft survival as stand-alone metrics and increased attention to composite outcomes that include waitlist mortality, allocation system equity, healthcare cost, and patient satisfaction. Novel strategies for organ preservation, the expanding field of transplant oncology, and progress in pursuit of immunologic tolerance indicate rising value of liver transplantation as a curative modality.
Article
The need for retransplantation after living donor liver transplantation can occur early, mainly because of technical difficulties such as hepatic artery thrombosis or as a result of early allograft dysfunction as a symptom of small-for-size syndrome. Patients with autoimmune diseases may develop progressive graft failure from recurrent disease. The ethics of retransplantation can be complicated by the cause of the initial liver disease, which may be self-inflicted or the outcome of malignancy. This is especially true in countries without the availability of deceased donors for salvage, and a second living donor would be needed. Nevertheless, patients who experience early or late graft failure should be considered for retransplant if they are deemed acceptable candidates. When a living donor is required for retransplant, the equipoise between donor risk and autonomy and recipient outcome should be considered.
Article
Background Individual events during donation after circulatory death (DCD) procurement, such as hypotensive or hypoxic warm ischemia, or circulatory arrest are all a part of donor warm ischemia time (dWIT), and may have differing effects on the outcome of the liver graft. This study aimed to identify risk factors for postreperfusion syndrome (PRS), a state of severe hemodynamic derangement following graft reperfusion, and its impact on DCD liver transplantation (LT) outcomes. Methods This was a retrospective analysis using 106 DCD LT. Detailed information for events during procurement (withdrawal of life support; systolic blood pressure < 80 mmHg; oxygen saturation < 80%; circulatory arrest; aortic cold perfusion) and their association with the development of PRS were examined using logistic regression. Results The overall incidence of PRS was 26.4%, occurring in 28 patients. Independent risk factors for PRS were asystolic dWIT (odds ratio (OR) 3.65, 95% confidence interval (CI) 1.38–9.66) and MELD score (OR 1.06, 95% CI 1.01–1.10). Total bilirubin was significantly higher in the PRS group at postoperative day (POD) 1 (p = .02; 5.2 mg/dL vs. 3.4 mg/dL), POD 3 (p = .049; 4.5 mg/dL vs. 2.8 mg/dL), and POD 7 (p = .04; 3.1 mg/dL vs. 1.9 mg/dL). Renal replacement therapy after LT was more likely to be required in the PRS group (p = .01; 48.2% vs. 23.1%). Conclusion Asystolic dWIT is a risk factor for the development of PRS in DCD LT. Our results suggest that asystolic dWIT should be considered when selecting DCD liver donors.
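A logistic model of this kind, with odds ratios recovered by exponentiating the fitted coefficients, could look like the sketch below; the DataFrame and column names are assumptions, not the study's actual variables.

```python
# Illustrative logistic regression for PRS risk factors; `df` is assumed
# to hold a binary `prs` outcome, a binary `asystolic_dwit` flag, and `meld`.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def prs_odds_ratios(df: pd.DataFrame) -> pd.DataFrame:
    X = sm.add_constant(df[["asystolic_dwit", "meld"]])
    fit = sm.Logit(df["prs"], X).fit(disp=0)
    out = pd.DataFrame({"OR": np.exp(fit.params)})
    out[["2.5%", "97.5%"]] = np.exp(fit.conf_int()).values  # CI for the ORs
    return out.drop(index="const")
```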
Article
Background While liver transplant effectiveness in treating life-limiting liver disease is uncontested, challenges remain in organ preservation. Methods Following PRISMA guidelines, a systematic review was performed to determine the impact of Hypothermic Oxygenated Perfusion (HOPE) on liver transplant outcomes compared to static cold storage (SCS). Results A total of five studies were included, totaling 586 patients, of whom 267 had HOPE-preserved grafts and 319 had SCS-preserved grafts. Analysis showed a significant decrease in early graft dysfunction (RR = 0.52; 95% CI = [0.33-0.81]; p = 0.01) and biliary complications (RR = 0.75; 95% CI = [0.60-0.94]; p = 0.02) in the HOPE group when compared to SCS. Similarly, non-anastomotic biliary strictures were significantly reduced in the HOPE group (RR = 0.41; 95% CI = [0.20-0.86]; p = 0.03). Of note, no statistical significance was found for one-year graft loss or recipient death (RR = 0.40; 95% CI = [0.09-1.83]; p = 0.12 and RR = 0.62; 95% CI = [0.29-1.32]; p = 0.14, respectively). Likewise, no statistical difference was evident in acute rejection (RR = 0.54; 95% CI = [0.04-7.14]; p = 0.20) or postreperfusion syndrome rate (RR = 0.92; 95% CI = [0.35-2.41]; p = 0.73). After statistical analysis, no significant differences in major complications, primary nonfunction, re-transplantation, hepatic artery thrombosis, need for renal replacement therapy, intensive care unit, or hospital length of stay were evident. Conclusions Liver preservation techniques are gaining popularity by enabling rescue and transplantation of marginal livers. In this study, HOPE showed statistically significant reductions in rates of biliary complications, biliary stricture, and early graft dysfunction. Further studies are needed to evaluate financial burden and long-term outcomes to completely elucidate the impact of this organ preservation technique.
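Pooled relative risks like these are typically obtained by inverse-variance weighting of log risk ratios. A minimal fixed-effect sketch follows; the review's actual model (fixed versus random effects) is not stated in this abstract.

```python
# Fixed-effect inverse-variance pooling of log risk ratios; study inputs
# are (RR, CI low, CI high) triples.
from math import exp, log, sqrt

def pooled_rr(studies):
    weights, weighted_logs = [], []
    for rr, lo, hi in studies:
        se = (log(hi) - log(lo)) / (2 * 1.96)  # back out SE from the 95% CI
        w = 1 / se ** 2
        weights.append(w)
        weighted_logs.append(w * log(rr))
    m = sum(weighted_logs) / sum(weights)
    se_m = sqrt(1 / sum(weights))
    return exp(m), exp(m - 1.96 * se_m), exp(m + 1.96 * se_m)
```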
Chapter
Full-text available
Complications following lifesaving liver transplantation can be devastating and must be managed properly to optimize the patient and allograft survival. There are non-immune, non-infectious complications which present a severe risk to survival of both the patient and the allograft. These include primary graft non-function (PNF) and hepatic artery thrombosis (HAT). Other complications manifest less urgently but continue to represent potentially lethal consequences to both the patient and the hepatic allograft. These include vena cava outflow disruptions, portal venous outflow derangements, and portal vein thrombosis (PVT). Successful management of these complications is optimized with a multidisciplinary approach to the care of liver transplant recipients. We describe their definition, epidemiology, pathophysiology, related factors, presentation, operative and non-operative management, outcomes, and future directions of these potentially catastrophic complications.
Article
Objective The adverse effects of ischemia-reperfusion injury (IRI) remain a principal barrier to a successful outcome after lifesaving orthotopic liver transplantation (OLT). Gene expression during different phases of IRI is dynamic and modified by individual exposures, making it attractive for identifying potential therapeutic targets for improving the number of suitable organs for transplantation and patient outcomes. However, data remain limited on the functional landscape of gene expression during liver graft IRI, spanning procurement to reperfusion and recovery. Therefore, we sought to characterize transcriptomic profiles of IRI during multiple phases in human OLT. Methods We conducted clinical data analyses, histologic evaluation, and RNA sequencing of 17 consecutive human primary OLT. We performed liver allograft biopsies at 4 time points: baseline (B, before donor cross-clamp), at the end of cold ischemia (CI), during early reperfusion (ER, after revascularization), and during late reperfusion (LR). Data were generated and recipients were then grouped by post-OLT outcome category: immediate allograft function (IAF; n = 11) versus early allograft dysfunction (EAD; n = 6). Results We observed that CI (vs B) modified a transcriptomic landscape enriched for metabolic and immune processes. Expression levels of hallmark inflammatory response genes were higher transitioning from CI to ER and decreased from ER to LR. The IAF group predominantly showed higher bile and fatty acid metabolism activity during LR compared with the EAD group, while the EAD group maintained more immunomodulatory activities. Throughout all time points, EAD specimens exhibited decreased metabolic activity in both bile and fatty acid pathways. Conclusions We report transcriptomic profiles of human liver allograft IRI from pre-preservation in the donor to posttransplantation in the recipient. Immunomodulatory and metabolic landscapes across the ER and LR phases differed between IAF and EAD allografts. Our study also highlights marker genes for these biological processes that we plan to explore as novel therapeutic targets or surrogate markers for severe allograft injury in clinical OLT.
Article
Background: Ischemia-reperfusion injury (IRI) causes significant morbidity in liver transplantation, among other medical conditions. IRI following liver transplantation contributes to poor outcomes and early graft loss. Histone/protein deacetylases (HDACs) regulate diverse cellular processes, play a role in mediating tissue responses to IRI, and may represent a novel therapeutic target in preventing IRI in liver transplantation. Methods: Using a previously described standardized model of murine liver warm IRI, aspartate aminotransferase (AST) and alanine aminotransferase (ALT) levels were assessed at 24 and 48 h after reperfusion to determine the effect of different HDAC inhibitors. Results: Broad HDAC inhibition with trichostatin-A (TSA) was protective against hepatocellular damage (P < 0.01 for AST and P < 0.05 for ALT). Although HDAC class I inhibition with MS-275 provided statistically insignificant benefit, tubastatin-A (TubA), an HDAC6 inhibitor with additional activity against HDAC10, provided significant protection against liver IRI (P < 0.01 for AST and P < 0.001 for ALT). Surprisingly, genetic deletion of HDAC6 or -10 did not replicate the protective effects of HDAC6 inhibition with TubA, whereas treatment with an HDAC6 BUZ-domain inhibitor, LakZnFD, eliminated the protective effect of TubA treatment in liver ischemia (P < 0.01 for AST and P < 0.01 for ALT). Conclusions: Our findings suggest TubA, a class IIb HDAC inhibitor, can mitigate hepatic IRI in a manner distinct from previously described class I HDAC inhibition, one that requires HDAC6 BUZ-domain activity. Our data corroborate previous findings that HDAC targets for therapeutic intervention in IRI may be tissue-specific, and identify HDAC6 inhibition as a possible target in the treatment of liver IRI.
Article
BACKGROUND Prolonged donor hepatectomy time may be implicated in early and late complications of liver transplantation. AIM To evaluate the impact of donor hepatectomy time on outcomes of liver transplant recipients, mainly early allograft dysfunction. METHODS This multicenter retrospective study included brain-dead donors and adult liver graft recipients. Donor-recipient matching was obtained through a crossover list. Clinical and laboratory data were recorded for both donors and recipients. Donor hepatectomy, cold ischemia, and warm ischemia times were recorded. Primary outcome was early allograft dysfunction. Secondary outcomes included need for retransplantation, length of intensive care unit and hospital stay, and patient and graft survival at 12 months. RESULTS From January 2019 to December 2021, a total of 243 patients underwent a liver transplant from a brain-dead donor. Of these, 57 (25%) developed early allograft dysfunction. The median donor hepatectomy time was 29 (23–40) min. Patients with early allograft dysfunction had a median hepatectomy time of 25 (22–38) min, whereas those without it had a median time of 30 (24–40) min (P = 0.126). CONCLUSION Donor hepatectomy time was not associated with early allograft dysfunction, graft survival, or patient survival following liver transplantation.
Article
Background In Italy, 20 min of continuous, flat-line electrocardiogram are required for death declaration. Despite the prolonged warm ischemia time, Italian centers have reported good outcomes in controlled donation after circulatory death (cDCD) liver transplantation by combining normothermic regional and end-ischemic machine perfusion (MP). The aim of this study was to evaluate the safety and feasibility of using septuagenarian and octogenarian cDCD donors with this approach. Methods All cDCD donors older than 70 y were evaluated during normothermic regional perfusion and then randomly assigned to dual hypothermic or normothermic MP. Results From April 2021 to December 2022, 17 cDCD donors older than 70 y were considered. In 6 cases (35%), the graft was not considered suitable for liver transplantation, whereas 11 (65%) were evaluated and eventually transplanted. The median donor age was 82 y, with 8 donors (73%) older than 80. Median functional warm ischemia and no-flow times were 36 and 28 min, respectively. Grafts were randomly assigned to ex situ dual hypothermic oxygenated MP in 6 cases (55%) and normothermic MP in 5 (45%). None was discarded during MP. There were no cases of primary nonfunction, 1 case of postreperfusion syndrome (9%), and 2 cases (18%) of early allograft dysfunction. At a median follow-up of 8 mo, no vascular complications or ischemic cholangiopathy were reported. No major differences were found in postoperative hospitalization or complications based on the type of MP. Conclusions The implementation of sequential normothermic regional and end-ischemic MP allows the safe use of very old donation after circulatory death donors.
Article
Orthotopic liver transplantation (OLT) is the most effective treatment for patients with end-stage liver disease (ESLD). Hepatic insufficiency within a week of OLT, termed early allograft dysfunction (EAD), occurs in 20% to 25% of deceased donor OLT recipients and is associated with morbidity and mortality. Primary nonfunction (PNF), the most severe form of EAD, leads to death or retransplantation within 7 days. The etiology of EAD is multifactorial, including donor, recipient, and surgery-related factors, and largely driven by ischemia-reperfusion injury (IRI). IRI is an immunologic phenomenon characterized by dysregulation of cellular oxygen homeostasis and innate immune defenses in the allograft after temporary cessation (ischemia) and later restoration (reperfusion) of oxygen-rich blood flow. The rising global demand for OLT may lead to the use of marginal allografts, which are more susceptible to IRI, and thus to an increased incidence of EAD. It is thus imperative that the anesthesiologist be knowledgeable about EAD, namely its pathophysiology and the intraoperative strategies that mitigate its impact. Intraoperative strategies can be classified by 3 phases, specifically donor allograft procurement, storage, and recipient reperfusion. During procurement, the anesthesiologist can use pharmacologic preconditioning with volatile anesthetics, consider preharvest hyperoxemia, and limit the use of norepinephrine where possible. The anesthesiologist can advocate for normothermic regional perfusion (NRP) and machine perfusion during allograft storage at their institution. During recipient reperfusion, the anesthesiologist can optimize oxygen exposure, consider adjunct anesthetics with antioxidant-like properties, and administer supplemental magnesium. Unfortunately, there is either mixed, little, or no data to support the routine use of many free radical scavengers. Given the sparse, limited, or at times conflicting evidence supporting some of these strategies, there are ample opportunities for more research to find intraoperative anesthetic strategies that mitigate the impact of EAD and improve postoperative outcomes in OLT recipients.
Article
Full-text available
Introduction The most relevant limiting factor for performing an end-to-end anastomosis is portal vein thrombosis (PVT), which leads to challenging vascular reconstructions. This study aimed to analyze a single center's experience using the left gastric vein (LGV) for portal flow reconstruction in liver transplantation (LT). Methods This retrospective observational study reviewed laboratory and imaging tests, a description of the surgical technique, and outpatient follow-up of patients with portal system thrombosis undergoing LT with portal flow reconstruction using the LGV. The study was conducted at a single transplant reference center in the northeast region of Brazil from January 2016 to December 2021. Results Between January 2016 and December 2021, 848 transplants were performed at our center. Eighty-two patients (9.7%) presented with PVT, most of whom were treated with thrombectomy. Nine patients (1.1%) had extensive thrombosis of the portal system (Yerdel III or IV), which required end-to-side anastomosis between the portal vein and the LGV without a graft; there were no intraoperative complications. All patients had successful portal flow on Doppler ultrasound control evaluations. Discussion The goal was to reestablish physiological flow to the graft, and using the LGV is one surgical strategy for achieving this. In our experience, the LGV fulfilled the requirements for an excellent vascular anastomosis and even made interposition venous grafts unnecessary. This is the largest single-center case series of portal flow reconstruction by direct anastomosis to the LGV without the need for a vascular graft.
Article
Full-text available
Ex vivo normothermic machine perfusion (NMP) preserves donor organs and permits real-time assessment of allograft health, but the most effective indicators of graft viability are uncertain. Mitochondrial DNA (mtDNA), released consequent to traumatic cell injury and death, including the ischemia-reperfusion injury inherent in transplantation, may meet the need for a biomarker in this context. We describe a real-time PCR-based approach to assessing cell-free mtDNA during NMP as a universal biomarker of allograft quality. Measured in the perfusate fluid of 29 livers, the quantity of mtDNA correlated with metrics of donor liver health including International Normalized Ratio (INR), lactate, and warm ischemia time, and inversely correlated with inferior vena cava (IVC) flow during perfusion. Our findings endorse mtDNA as a simple and rapidly measured feature that can inform donor liver health, opening the possibility of better assessing livers acquired from extended criteria donors to improve organ supply.
Preprint
Full-text available
Background and Aims: The present study retrospectively analyzed the association between socioeconomic deprivation and graft and patient survival in a cohort of 2,568 adult recipients of a liver transplant between 1996 and 2022. Materials and methods: The primary exposure was a nationally validated socioeconomic deprivation index (DI) at the census block level, ranked from 1-5, with higher ranks indicating greater socioeconomic deprivation. Results: At a median (IQR) follow-up of 144.8 (204) months, the overall patient and graft survival rates were 92% and 90.4% at 1 year, 78.4% and 72.2% at 5 years, and 58% and 56.7% at 10 years. Recipients with a DI rank above the median (i.e., more deprived) had 1-, 5-, and 10-year patient and graft survival rates of 91% and 89.5%, 70.4% and 68.4%, and 48% and 46.7%, respectively, versus 92% and 90.4%, 78.4% and 72.2%, and 58% and 56.7% for less deprived patients (log-rank p<0.001). More deprived patients had a higher risk of death (p<0.0001), hypertension (p<0.0001), obesity (p=0.02), diabetes mellitus (p=0.02), graft rejection (p<0.0001), chronic kidney dysfunction (p<0.0001), major cardiovascular events (p<0.0001), and de novo malignancies (p<0.0001) than less deprived recipients. The factors associated with survival probability were younger donor and recipient ages (p=0.03 and 0.02, respectively), female recipient sex (p=0.04), absence of HCV (p<0.01), absence of HCC (p=0.02), absence of DM at transplantation (p=0.03) and at 1 year (p=0.01), lower DI (p=0.02), lower MELD (p=0.02), shorter CIT (p=0.03), and TAC (p=0.01) and EVR (p=0.02) in the immunosuppressive regimen. Conclusions: Patients from more deprived areas have a higher risk of death after liver transplantation. Pre- and posttransplant socioeconomic risk profiling is warranted to better tailor care to patients' needs and expectations.
Article
Background Liver transplantation is traditionally performed around the clock to minimize organ ischemic time. However, the prospect of prolonging preservation times holds the potential to streamline logistics and transform liver transplantation into a semi-elective procedure, reducing the need for nighttime surgeries. Dual hypothermic oxygenated machine perfusion (DHOPE) of donor livers for 1–2 h mitigates ischemia-reperfusion injury and improves transplant outcomes. Preclinical studies have shown that DHOPE can safely extend the preservation of donor livers for up to 24 h. Methods We conducted an IDEAL stage 2 prospective clinical trial comparing prolonged (≥4 h) DHOPE to conventional (1–2 h) DHOPE for brain-dead donor livers, enabling transplantation the following morning. Liver allocation to each group was based on donor hepatectomy end times. The primary safety endpoint was a composite of all serious adverse events (SAE) within 30 days after transplantation. The primary feasibility endpoint was defined as the number of patients assigned to and successfully receiving a prolonged DHOPE-perfused liver graft. Trial registration: WHO International Clinical Trial Registry Platform, number NL8740. Findings Between November 1, 2020 and July 16, 2022, 24 patients were enrolled. The median preservation time was 14.5 h (interquartile range [IQR], 13.9–15.5) for the prolonged group (n = 12) and 7.9 h (IQR, 7.6–8.6) for the control group (n = 12; p = 0.01). In each group, three patients (25%; 95% CI 3.9–46%, p = 1) experienced an SAE. Markers of ischemia-reperfusion injury and oxidative stress in both perfusate and recipients were consistently low and showed no notable discrepancies between the two groups. All patients assigned to the prolonged group or the control group successfully received a liver graft perfused with prolonged DHOPE or control DHOPE, respectively. Interpretation This first-in-human clinical trial demonstrates the safety and feasibility of DHOPE in prolonging the preservation time of donor livers to enable daytime transplantation. The ability to extend the preservation window to up to 20 h using hypothermic oxygenated machine preservation at 10 °C has the potential to reshape the landscape of liver transplantation. Funding University Medical Center Groningen, the Netherlands.
Article
Full-text available
ABBREVIATIONS: LDLT, living donor liver transplantation; GRWR, graft-to-recipient body weight ratio; POD, postoperative day; TB, total bilirubin; AST, aspartate aminotransferase; ALT, alanine aminotransferase; GGT, gamma-glutamyl transferase; FBSL, fasting blood sugar level; TG, serum triglycerides; HDL, high-density lipoprotein; LDL, low-density lipoprotein.
Article
Objective The primary objectives were to compare intraoperative hemodynamic parameters, blood loss, renal function, and duration of surgery with and without TPCS in live donor liver transplantation (LDLT) recipients. Secondary objectives were postoperative early graft dysfunction (EGD), morbidity, mortality, and total ICU and hospital stay. Background Blood loss during recipient hepatectomy for liver transplantation (LT) remains a major concern. The value of routine temporary portocaval shunt (TPCS) use during LT is not yet established. Methods A single-centre, open-label, randomized controlled trial. The sample size was calculated based on intraoperative blood loss. After exclusion, a total of 60 patients, 30 in each arm (TPCS versus no TPCS), were recruited in the trial. Results The baseline recipient and donor characteristics were comparable between the groups. Median intraoperative blood loss (P = 0.004) and blood product transfusions (P < 0.05) were significantly lower in the TPCS group. The TPCS group had significantly better intraoperative hemodynamics during the anhepatic phase than the no-TPCS group (P < 0.0001), requiring significantly less vasopressor support. This led to significantly better renal function, as evidenced by higher intraoperative urine output in the TPCS group (P = 0.002). Owing to its technical simplicity, the TPCS group had significantly fewer IVC injuries (3.3 vs. 26.7%, P = 0.026) and substantially shorter hepatectomy time and total duration of surgery (529.4 ± 35.54 vs. 606.83 ± 48.13 min, P < 0.0001). Time taken for normalisation of lactate in the immediate postoperative period was significantly shorter in the TPCS group (median, 6 h vs. 13 h; P = 0.04). Although postoperative endotoxemia, major morbidity, 90-day mortality, and total ICU and hospital stay were comparable between the groups, tolerance to enteral feed was achieved earlier in the TPCS group. Conclusion In LDLT, TPCS is a simple and effective technique that provides superior intraoperative hemodynamics and reduces blood loss and duration of surgery.
Article
Dynamic organ preservation is a relatively old technique that has regained significant interest in the last decade. Machine perfusion (MP) techniques are applied in various fields of solid organ transplantation today. The first clinical series of ex situ MP in liver transplantation was presented in 2010. Since then, the number of research and clinical applications has substantially increased. Despite the notable beneficial effect on organ quality and recipient outcome, MP is still not routinely used in liver transplantation. Given the enormous need to better preserve organs and the consequent demand to continuously innovate and further develop perfusion equipment, this technology can also be used to test and deliver future therapeutic strategies to livers before implantation. This article summarizes the various challenges observed during the current shift from static to dynamic liver preservation in the clinical setting. The different organ perfusion strategies are discussed first, together with ongoing clinical trials and future study design. The current status of research and the impact of costs and regulations are highlighted second. Factors contributing to costs and the other resources required for successful worldwide implementation and reimbursement are presented third. The impact of research on cost-utility and effectiveness, to guide tailored decision-making regarding the optimal perfusion strategy, is discussed fourth. Finally, the article offers potential solutions to the challenging field of innovation in healthcare, considering the various social and economic factors and the role of clinical, regulatory, and financial stakeholders worldwide.
Article
This study aimed to assess the effectiveness of, and compare, the Model for Early Allograft Function (MEAF) and the postoperative Model for End-Stage Liver Disease (pMELD) in the early posttransplant setting in children. Methods. We performed a retrospective study of 43 pediatric liver transplant recipients, aged 0-18 years, over a 17-year period. MEAF and pMELD were calculated on the third and fifth postoperative day, respectively, and Cox regression analysis was performed to assess the correlation between each score and mortality in the early postoperative period (EPOP). Results. Both scores proved to be statistically significant and applicable in EPOP. MEAF had a P value of 0.0003 and a hazard ratio of 10.99, while pMELD had a P value of 0.003 and a hazard ratio of 1.24. Conclusions. Both MEAF and pMELD can be used to diagnose early allograft dysfunction and predict the outcome of the transplantation, with MEAF performing somewhat better.
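As a sketch of the analysis described above, the snippet below fits a Cox proportional hazards model relating a day-3 score to early posttransplant mortality using the lifelines package; the column names and toy values are illustrative assumptions, not the study's data.

# Assumes the `lifelines` package is installed; data are illustrative only.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "meaf":  [3.1, 7.8, 5.2, 9.0, 4.4, 6.5, 8.1, 2.9],  # score on postoperative day 3
    "time":  [90, 12, 90, 5, 90, 47, 90, 90],            # days of follow-up in EPOP
    "event": [0, 1, 0, 1, 0, 1, 0, 0],                   # 1 = death, 0 = censored
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()  # reports the coefficient, hazard ratio (exp(coef)), and P value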
Article
Full-text available
BACKGROUND There is no consensus on the use of extended criteria donor (ECD) grafts in liver transplantation (LT) for acute-on-chronic liver failure (ACLF) patients. AIM To summarize the experience of using ECD livers in ACLF-LT. METHODS A retrospective cohort study was conducted, enrolling patients who underwent LT at the First Affiliated Hospital of Sun Yat-Sen University from January 2015 to November 2021. The patients were divided into ECD and non-ECD groups for analysis. RESULTS A total of 145 recipients were enrolled in this study, of which ECD and non-ECD recipients accounted for 53.8% and 46.2%, respectively. Donation after cardiac death (DCD) recipients accounted for the minority compared with donation after brain death (DBD) recipients (16.6% vs 83.4%). Neither overall survival nor graft survival differed significantly between ECD and non-ECD or between DCD and DBD recipients. ECD grafts were associated with a significantly higher incidence of early allograft dysfunction (EAD) than non-ECD grafts (67.9% vs 41.8%, P = 0.002). Postoperative outcomes between DCD and DBD recipients were comparable (P > 0.05). ECD graft (P = 0.009), anhepatic phase (P = 0.034), and recipient gamma-glutamyltransferase (P = 0.016) were independent risk factors for EAD. Recipient preoperative number of extrahepatic organ failures > 2 (P = 0.015) and intraoperative blood loss (P < 0.001) were independent predictors of poor post-LT survival. CONCLUSION Although associated with a higher risk of EAD, ECD grafts can be safely used in ACLF-LT. The main factors affecting post-LT survival in ACLF patients are their own severe preoperative disease and intraoperative blood loss.
Article
Full-text available
A representation and interpretation of the area under a receiver operating characteristic (ROC) curve obtained by the "rating" method, or by mathematical predictions based on patient characteristics, is presented. It is shown that in such a setting the area represents the probability that a randomly chosen diseased subject is (correctly) rated or ranked with greater suspicion than a randomly chosen non-diseased subject. Moreover, this probability of a correct ranking is the same quantity that is estimated by the already well-studied nonparametric Wilcoxon statistic. These two relationships are exploited to (a) provide rapid closed-form expressions for the approximate magnitude of the sampling variability, i.e., standard error that one uses to accompany the area under a smoothed ROC curve, (b) guide in determining the size of the sample required to provide a sufficiently reliable estimate of this area, and (c) determine how large sample sizes should be to ensure that one can statistically detect differences in the accuracy of diagnostic techniques.
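The equivalence described in this abstract is easy to verify numerically: the area under the empirical ROC curve equals the Mann-Whitney (Wilcoxon) estimate of the probability of correct ranking, and the article's closed-form standard error depends only on the area and the two sample sizes. A minimal sketch with simulated ratings, assuming scipy and scikit-learn are available:

import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
diseased = rng.normal(1.0, 1.0, 80)    # ratings for diseased subjects
healthy = rng.normal(0.0, 1.0, 120)    # ratings for non-diseased subjects

# AUC as the probability that a random diseased subject outranks a healthy one
u, _ = mannwhitneyu(diseased, healthy, alternative="two-sided")
auc_wilcoxon = u / (len(diseased) * len(healthy))

# The same quantity from the empirical ROC curve
labels = np.r_[np.ones_like(diseased), np.zeros_like(healthy)]
auc_roc = roc_auc_score(labels, np.r_[diseased, healthy])
assert np.isclose(auc_wilcoxon, auc_roc)

# Closed-form standard error of the area (Hanley-McNeil style)
a, n1, n0 = auc_roc, len(diseased), len(healthy)
q1, q2 = a / (2 - a), 2 * a**2 / (1 + a)
se = np.sqrt((a*(1-a) + (n1-1)*(q1 - a**2) + (n0-1)*(q2 - a**2)) / (n1*n0))
print(auc_roc, se)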
Article
Full-text available
In the absence of prior knowledge about population relations, investigators frequently employ a strategy that uses the data to help them decide whether to adjust for a variable. The authors compared the performance of several such strategies for fitting multiplicative Poisson regression models to cohort data: 1) the "change-in-estimate" strategy, in which a variable is controlled if the adjusted and unadjusted estimates differ by some important amount; 2) the "significance-test-of-the-covariate" strategy, in which a variable is controlled if its coefficient is significantly different from zero at some predetermined significance level; 3) the "significance-test-of-the-difference" strategy, which tests the difference between the adjusted and unadjusted exposure coefficients; 4) the "equivalence-test-of-the-difference" strategy, which significance-tests the equivalence of the adjusted and unadjusted exposure coefficients; and 5) a hybrid strategy that takes a weighted average of adjusted and unadjusted estimates. Data were generated from 8,100 population structures at each of several sample sizes. The performance of the different strategies was evaluated by computing bias, mean squared error, and coverage rates of confidence intervals. At least one variation of each strategy that was examined performed acceptably. The change-in-estimate and equivalence-test-of-the-difference strategies performed best when the cut-point for deciding whether crude and adjusted estimates differed by an important amount was set to a low value (10%). The significance test strategies performed best when the alpha level was set to much higher than conventional levels (0.20).
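A minimal sketch of the change-in-estimate strategy with the 10% cut-point that performed best above, applied to a Poisson regression fit with statsmodels; the simulated confounder structure is a hypothetical illustration:

# Control for the covariate only if dropping it moves the exposure
# coefficient by more than 10%. Assumes statsmodels is installed.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
exposure = rng.binomial(1, 0.4, n)
covariate = rng.binomial(1, 0.3 + 0.2 * exposure, n)  # a potential confounder
rate = np.exp(-2.0 + 0.5 * exposure + 0.7 * covariate)
counts = rng.poisson(rate)

X_adj = sm.add_constant(np.column_stack([exposure, covariate]))
X_crude = sm.add_constant(exposure)

beta_adj = sm.GLM(counts, X_adj, family=sm.families.Poisson()).fit().params[1]
beta_crude = sm.GLM(counts, X_crude, family=sm.families.Poisson()).fit().params[1]

change = abs(beta_adj - beta_crude) / abs(beta_adj)
print(f"adjust for covariate: {change > 0.10}")  # the 10% change-in-estimate rule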
Article
Full-text available
Poor graft function early after liver transplantation is an important cause of morbidity and mortality. We defined early allograft dysfunction (EAD) using readily available indices of function and identified donor, graft, and pretransplant recipient factors associated with this outcome. This study examined 710 adult recipients of a first, single-organ liver transplantation for non-fulminant liver disease at three United States centers. EAD was defined by the presence of at least one of the following between 2 and 7 days after liver transplantation: serum bilirubin >10 mg/dl, prothrombin time (PT) ≥17 sec, and hepatic encephalopathy. EAD incidence was 23%. Median intensive care unit (ICU) and hospital stays were longer for recipients with EAD than those without (4 days vs. 3 days, P = 0.0001; 24 vs. 15 days, P = 0.0001, respectively). Three-year recipient and graft survival were worse in those with EAD than in those without (68% vs. 83%, P = 0.0001; 61% vs. 79%, P = 0.0001). A logistic regression model combining donor, graft, and recipient factors predicted EAD better than models examining these factors in isolation. Pretransplant recipient elevations in PT and bilirubin, awaiting a graft in hospital or ICU, donor age ≥50 years, donor hospital stay >3 days, preprocurement acidosis, and cold ischemia time ≥15 hr were independently associated with EAD. Recipients who develop EAD have longer ICU and hospital stays and greater mortality than those without. Donor, graft, and recipient risk factors all contribute to the development of EAD. Results of these analyses identify factors that, if modified, may alter the risk of EAD.
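Because this definition is a simple disjunction of three criteria, it can be encoded directly; the field names in the sketch below are hypothetical, chosen for illustration:

from dataclasses import dataclass

@dataclass
class PostTxLabs:
    """Worst values observed between postoperative days 2 and 7
    (field names are hypothetical, for illustration only)."""
    max_bilirubin_mg_dl: float
    max_pt_sec: float
    hepatic_encephalopathy: bool

def has_ead_1998(labs: PostTxLabs) -> bool:
    """Early allograft dysfunction per the definition in this study:
    any of bilirubin > 10 mg/dL, PT >= 17 sec, or hepatic encephalopathy
    between days 2 and 7 after transplantation."""
    return (labs.max_bilirubin_mg_dl > 10
            or labs.max_pt_sec >= 17
            or labs.hepatic_encephalopathy)

print(has_ead_1998(PostTxLabs(12.4, 14.0, False)))  # True: bilirubin > 10 mg/dL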
Article
Full-text available
Transplant physicians and candidates have become increasingly aware that donor characteristics significantly impact liver transplantation outcomes. Although the qualitative effects of individual donor variables are understood, the quantitative risk associated with combinations of characteristics is unclear. Using national data from 1998 to 2002, we developed a quantitative donor risk index. Cox regression models identified seven donor characteristics that independently predicted significantly increased risk of graft failure. Donor age over 40 years (and particularly over 60 years), donation after cardiac death (DCD), and split/partial grafts were strongly associated with graft failure, while African-American race, shorter donor height, cerebrovascular accident, and 'other' causes of brain death were more modestly but still significantly associated with graft failure. Grafts with an increased donor risk index have been preferentially transplanted into older candidates (>50 years of age) with moderate disease severity (non-status 1 with lower model for end-stage liver disease (MELD) scores) and without hepatitis C. Quantitative assessment of the risk of donor liver graft failure using a donor risk index is useful to inform the process of organ acceptance.
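The published index is derived from Cox regression coefficients estimated on national registry data. The sketch below shows only the general multiplicative form of such an index; the weights are deliberately labeled placeholders and are not the published donor risk index coefficients:

import math

# Placeholder weights; NOT the published DRI coefficients.
ILLUSTRATIVE_WEIGHTS = {
    "age_over_60": 0.40,
    "dcd": 0.40,
    "split_graft": 0.40,
    "cva_cause_of_death": 0.15,
    "per_hour_cold_ischemia_over_8": 0.01,
}

def donor_risk_index(donor: dict) -> float:
    """exp(sum of Cox coefficients for the donor's characteristics):
    1.0 is the reference donor; higher values mean higher graft-failure risk."""
    x = 0.0
    x += ILLUSTRATIVE_WEIGHTS["age_over_60"] * (donor["age"] > 60)
    x += ILLUSTRATIVE_WEIGHTS["dcd"] * donor["dcd"]
    x += ILLUSTRATIVE_WEIGHTS["split_graft"] * donor["split_graft"]
    x += ILLUSTRATIVE_WEIGHTS["cva_cause_of_death"] * donor["cva"]
    x += (ILLUSTRATIVE_WEIGHTS["per_hour_cold_ischemia_over_8"]
          * max(0.0, donor["cold_ischemia_hr"] - 8))
    return math.exp(x)

print(donor_risk_index({"age": 65, "dcd": False, "split_graft": False,
                        "cva": True, "cold_ischemia_hr": 10}))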
Article
Full-text available
Extended criteria donor (ECD) liver allografts are often allocated to less severely ill liver transplant (LT) candidates who are at a relatively lower risk of pretransplant mortality, but it is not clear that the use of ECD allografts will decrease center waitlist mortality (WLM). Individual patient data from the UNOS OPTN database (2002-2005) were aggregated to obtain center-specific data. Deceased donor allografts with any of the following characteristics were defined as ECDs: from a donor with any of the criteria described by the New York State Department of Health Workgroup, or 12+ h of cold ischemia. Multivariate regression was used to examine the relationship between WLM and ECD, non-ECD, and LDLT use after adjusting for candidate severity of illness. A total of 3555 ECD transplants, 11,660 standard criteria donor (SCD) transplants, and 717 LDLTs were performed at 100 centers during this period. The model demonstrated that SCD and ECD LTs were inversely correlated with a center's WLM (beta = -0.242 and -0.221, respectively; p ≤ 0.003 for each). LDLTs did not significantly reduce WLM (beta = -0.048, p = 0.55). In summary, increasing ECD liver allograft use significantly decreased WLM at US centers. Policies encouraging the increased use of ECDs would further reduce WLM.
Article
Full-text available
The survival benefit of liver transplantation depends on candidate disease severity, as measured by MELD score. However, donor liver quality may also affect survival benefit. Using US data from the SRTR on 28 165 adult liver transplant candidates wait-listed between 2001 and 2005, we estimated survival benefit according to cross-classifications of candidate MELD score and deceased donor risk index (DRI) using sequential stratification. Covariate-adjusted hazard ratios (HR) were calculated for each liver transplant recipient at a given MELD with an organ of a given DRI, comparing posttransplant mortality to continued wait-listing with possible later transplantation using a lower-DRI organ. High-DRI organs were more often transplanted into lower-MELD recipients and vice versa. Compared to waiting for a lower-DRI organ, the lowest-MELD category recipients (MELD 6-8) who received high-DRI organs experienced significantly higher mortality (HR = 3.70; p < 0.0005). All recipients with MELD ≥20 had a significant survival benefit from transplantation, regardless of DRI. Transplantation of high-DRI organs is effective for high but not low-MELD candidates. Pairing of high-DRI livers with lower-MELD candidates fails to maximize survival benefit and may deny lifesaving organs to high-MELD candidates who are at high risk of death without transplantation.
Article
Full-text available
Liver transplantation in 2006 generally resembled previous years, with fewer candidates waiting for deceased donor liver transplants (DDLT), continuing a trend initiated with the implementation of the model for end-stage liver disease (MELD). Candidate age distribution continued to skew toward older ages with fewer children listed in 2006 than in any prior year. Total transplants increased due to more DDLT with slightly fewer living donor liver transplants (LDLT). Waiting list deaths and time to transplant continued to improve. In 2006, there also were fewer DDLT for patients with MELD <15, fewer pediatric Status 1A/B transplants and more transplants from donation after cardiac death (DCD) donors. Adjusted patient and graft survival rates were similar for LDLT and DDLT. This article also contains in-depth analyses of transplantation for hepatocellular carcinoma (HCC). Recipients with HCC had lower adjusted 3-year posttransplant survival than recipients without HCC. HCC recipients who received pretransplant ablative treatments had superior adjusted 3-year posttransplant survival compared to HCC recipients who did not. Intestinal transplantation continued to slowly increase with the largest number of candidates on the waiting list since 1997. Survival rates have increased over time. Small children waiting for intestine grafts continue to have the highest waiting list mortality.
Article
Full-text available
Growth in the number of active patients on the kidney transplant waiting list has slowed. Projections based on the most recent 5-year data suggest the total waiting list will grow at a rate of 4138 registrations per year, whereas the active waiting list will increase at less than one-sixth that rate, or 663 registrations per year. The last 5 years have seen a small trend toward improved unadjusted allograft survival for living and deceased donor kidneys. Since 2004 the overall number of pancreas transplants has declined. Among pancreas recipients, those with simultaneous kidney-pancreas transplants experienced the highest pancreas graft survival rates. In response to the ongoing shortage of deceased donor organs, the US Health Resources and Services Administration launched the Organ Donation Breakthrough Collaborative in September 2003 and the Organ Transplantation Breakthrough Collaborative (OTBC) in October 2005. The 58 DSA Challenge is prominent among the goals adopted by the OTBC. Its premise: were each of the 58 existing donation service areas to increase the number of kidney transplants performed within their boundaries by 10 per month, an additional 7000 transplants over current annual levels would result. Such an increase could potentially eliminate the national kidney transplantation waiting list by 2030.
Article
Full-text available
Deceased organ donation has increased rapidly since 2002, coinciding with implementation of the Organ Donation Breakthrough Collaborative. The increase in donors has resulted in a corresponding increase in the numbers of kidney, liver, lung and intestinal transplants. While transplants for most organs have increased, discard and nonrecovery rates have not improved or have increased, resulting in a decrease in organs recovered per donor (ORPD) and organs transplanted per donor (OTPD). Thus, the expansion of the consent and recovery of incremental donors has frequently outpaced utilization. Meaningful increases in multicultural donation have been achieved, but donations continue to be lower than actual rates of transplantation and waiting list registrations for these groups. To counteract the decline in living donation, mechanisms such as paired donation and enhanced incentives to organ donation are being developed. Current efforts of the collaborative have focused on differentiating ORPD and OTPD targets by donor type (standard and expanded criteria donors and donors after cardiac death), utilization of the OPTN regional structure and enlisting centers to increase transplants to match increasing organ availability.
Article
Several statistics have recently been proposed for the purpose of assessing the goodness of fit of an estimated logistic regression model. These statistics are reviewed and compared to other, less formal, procedures in the context of applications in epidemiologic research. One statistic is recommended for use and its computation is illustrated using data from a recent study of mortality of intensive care unit patients.
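The statistic recommended in this line of work is commonly a decile-of-risk goodness-of-fit test, which compares observed and expected event counts within groups ordered by predicted probability. A minimal sketch under that assumption, with simulated data:

import numpy as np
from scipy.stats import chi2

def decile_of_risk_gof(y, p, groups=10):
    """Goodness-of-fit statistic for a fitted logistic model: observed vs.
    expected events within groups ordered by predicted risk. Returns
    (statistic, P value) with groups - 2 degrees of freedom."""
    order = np.argsort(p)
    y, p = np.asarray(y)[order], np.asarray(p)[order]
    stat = 0.0
    for chunk_y, chunk_p in zip(np.array_split(y, groups),
                                np.array_split(p, groups)):
        n, obs, exp = len(chunk_y), chunk_y.sum(), chunk_p.sum()
        pbar = exp / n
        stat += (obs - exp) ** 2 / (n * pbar * (1 - pbar))
    return stat, chi2.sf(stat, groups - 2)

rng = np.random.default_rng(2)
p = rng.uniform(0.05, 0.95, 500)   # toy fitted probabilities
y = rng.binomial(1, p)             # outcomes generated from those probabilities
print(decile_of_risk_gof(y, p))    # should not reject: the "model" is correct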
Article
We have observed an increased rate of delayed nonfunction (DNF) of liver grafts procured from older donors. The aim of this study was to correlate donor age with the patterns of graft failure after transplantation. Pattern of liver injury, synthetic function, and graft survival in recipients receiving liver grafts from donors older than age 50 (group I, n = 95) were compared with a matched cohort of recipients transplanted with grafts from donors aged 20-30 (group II, n = 50). Primary nonfunction (PNF) of the graft was defined as non-recoverable hepatocellular function necessitating emergency retransplantation within 72 hr. DNF was defined as marginal graft function necessitating retransplantation within one month. Recipient characteristics, including age and preoperative UNOS status, were similar between groups. Ischemic/reperfusion injury, reflected by SGOT and SGPT, was more severe in grafts from older donors. PNF occurred at a similar frequency in both groups (7%). Normal liver function was regained in 76% of recipients in group I and in 92% in group II. However, a cholestatic pattern was observed in recipients of grafts from group I donors. A rapid rise in bilirubin, despite normalization of prothrombin time and liver transaminases, was the hallmark of DNF. DNF resulted in a higher retransplantation rate in group I (24% vs. 8% in group II). Donor age did not affect patient survival. Liberalizing criteria for donor selection and accepting older donors is a calculated risk. Over 75% of the recipients will regain normal liver function. However, a higher number of these grafts will exhibit slow recovery after transplantation, and a significant rate of DNF. Recognition of this pattern and early retransplantation should decrease mortality.
Article
Initial poor function and primary nonfunction are important problems in clinical transplantation. The incidence of primary nonfunction is about 6% and that of initial poor function is about 15%. Grafts with initial poor function have a higher graft failure rate in the first 3 mo after transplantation. Severe steatosis and cold preservation in University of Wisconsin solution for over 30 hr will alone cause primary nonfunction. However, primary nonfunction is probably most often caused by the presence of multiple relative risk factors. The major donor-relative risk factors are moderate steatosis, cold preservation over 12 hr and donor age over 50 yr, whereas retransplantation, high (United Network of Organ Sharing class 4) medical status and kidney failure are recipient relative risk factors. The most important perioperative risk factor is warm ischemia time. Rates of primary nonfunction and initial poor function might be reduced by avoidance of combinations of risk factors. Several tests have been developed to predict primary nonfunction and initial poor function, but none is yet clinically efficient.
Article
To identify factors predictive of early postoperative graft function, we analyzed 54 variables--including easily available clinical and laboratory data prospectively obtained from organ donors, transplant recipients and surgical procedures in 168 consecutive liver transplantations. Early postoperative graft function was classified into three groups according to a scoring system ranging from 3 to 9 based on peak serum ALT values, mean bile output and lowest prothrombin activity measured during the 72 hr after transplant: group 1 (score 3 to 4, good graft function; n = 73), group 2 (score 5 to 6, moderate dysfunction; n = 50) and group 3 (score, 7 to 9, severe dysfunction; n = 45). In univariate analyses, 8 of the 54 variables analyzed were statistically significant (p < 0.05) predictors of severe graft dysfunction: high serum sodium concentration and brain death caused by cranial trauma in organ donors, advanced age and low prothrombin activity in transplant recipients, prolonged total ischemia time and large transfusions of red blood cells, fresh frozen plasma and platelets during surgery. After introduction of these eight variables in a multivariate analysis, only four were found to independently predict early postoperative graft function: donor serum sodium concentration, total ischemia time, platelet transfusion during surgery and recipient prothrombin activity. In 52 liver transplantations, in which the predictive value of liver tissue adenine nucleotide concentration and several biochemical sensitive markers of donor nutritional status was also analyzed, only the ATP level in liver tissue obtained at the time of organ reperfusion was identified as an independent predictor of initial graft function.(ABSTRACT TRUNCATED AT 250 WORDS)
Article
In a retrospective analysis on 323 orthotopic liver transplant procedures performed between July 1984 and October 1991 the incidence of two forms of primary dysfunction (PDF) of the liver: primary nonfunction (PNF), and initial poor function (IPF) were studied. The incidence of PDF was 22% (73/323) with 6% PNF (20/323) and 16% IPF (53/323), while 78% (250/323) had immediate function (IF). Occurrence of both IPF and PNF resulted in a higher graft failure rate (P < 0.001), retransplantation rate (P < 0.001), and patient mortality (P < 0.003) within the first three months after OLTx. Univariate analyses of donor and recipient factors and their influence on PDF demonstrated that longer donor hospitalization (> 3 days), older donor age (> 49 years), extended preservation times (> 18 hr), and fatty changes in the donor liver biopsy, as well as reduced-size livers, younger recipient age, and renal insufficiency prior to OLTx, significantly affected the incidence of IPF and PNF. Multivariate analysis of potential risk factors showed that reduced-size liver (P = 0.0001), fatty changes on donor liver biopsy (P = 0.001), older donor age (P = 0.009), retransplantation (P = 0.01), renal insufficiency (P = 0.02), and prolonged cold ischemia times (P = 0.02) were independently associated with a higher incidence of IPF and PNF. No statistical correlation was found between PDF and etiology of ESLD, nutritional status of the recipient, UNOS status, and Child-Pugh classification in this study. We conclude that PNF and IPF are both separate clinical entities that have a significant effect on outcome after OLTx. Routine donor liver biopsies are recommended to decrease the rate of IPF and PNF. The combination of risk factors shown to be significant for PDF should be avoided--and, if that is not possible, the only variable that can be controlled, the preservation time, should be kept as short as possible.
Article
Increased graft ischemic time and donor age are risk factors for early death after heart transplantation, but the effect of these variables on survival after lung transplantation has not been determined in a large, multinational study. All recipients of cadaveric lung transplantations performed between October 1, 1987 and June 30, 1997 that were reported to the United Network for Organ Sharing/International Society for Heart and Lung Transplantation (UNOS/ISHLT) Registry were analyzed. Patient survival rates were estimated using Kaplan-Meier methods. Multivariate logistic regression was used to determine the impact of donor and recipient characteristics on patient survival after transplantation. To examine whether the impact of donor age varied with ischemic time, interactions between the 2 terms were examined in a separate multivariate logistic regression model. Kaplan-Meier survival did not differ according to the total lung graft ischemia time, but recipient survival was significantly adversely affected by young (≤10 years) or old (≥51 years) donor age (p = 0.01). On multivariate analysis, neither donor age nor lung graft ischemic time per se was an independent predictor of early survival after transplantation, unless quadratic terms of these variables were included in the model. The interaction between donor age and graft ischemia time, however, predicted 1-year mortality after lung transplantation (p = 0.005), especially if donor age was greater than 55 years and ischemic time was greater than 6 to 7 hours. Graft ischemia time alone is not a risk factor for early death after lung transplantation. Very young or old donor age was associated with decreased early survival, whereas the interaction between donor age and ischemic time was a significant predictor of 1-year mortality after transplantation. Cautious expansion of donor acceptance criteria (especially as regards ischemic time) is advisable, given the critical shortage of donor lung grafts.
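The modeling point above, that the main effects alone are not predictive but their product term is, can be made concrete with a small logistic regression sketch; the data below are simulated for illustration and assume statsmodels:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 5000
donor_age = rng.uniform(15, 65, n)      # years
ischemia_hr = rng.uniform(2, 9, n)      # graft ischemic time

# Hypothetical risk: only the joint age x ischemia effect raises mortality
logit = -2.5 + 0.002 * (donor_age - 40) * (ischemia_hr - 5)
death_1yr = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Model with both main effects and their product (interaction) term
X = np.column_stack([donor_age, ischemia_hr, donor_age * ischemia_hr])
fit = sm.Logit(death_1yr, sm.add_constant(X)).fit(disp=0)
print(fit.summary())  # the third coefficient is the fitted interaction term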
Article
The mechanisms underlying the initial graft dysfunction in liver transplantation are not completely understood, although much of the liver graft injury derives from the ischemia/reperfusion-induced oxidative stress. Thus, the purpose of our study was to determine the involvement of oxidative stress in the initial graft dysfunction in human liver transplantation. Liver biopsies were taken at different times of the transplantation procedure, at the organ donor laparotomy (T1), before graft reperfusion (T2), and 5-60 min after graft reperfusion (T3), determining the levels of GSH and GSSG, as well as peroxides and malondialdehyde, in liver homogenates. Patients were divided into two groups depending on whether the peak serum alanine aminotransferases within the first 3 postoperative days were lower (group A, mild to moderate injury: 32 patients) or higher (group B, severe injury: 5 patients) than 2500 U/l. The levels of GSH at time intervals T1-T3 were similar for groups A and B, with a trend to lower GSSG levels in group B in the T2 and T3 samples. This outcome was accompanied by unchanged levels of malondialdehyde and hydrogen peroxide in the same samples in both groups of patients. No patient developed primary graft nonfunction. One-year cumulative survival was 81% and 60% in groups A and B, respectively (p>0.05). These findings indicate a lack of significant generation of reactive oxygen species and consequent oxidative stress as a major factor involved in the pathogenesis of the initial graft dysfunction in human liver transplantation.
Article
Primary graft failure (PGF) is a devastating acute lung injury syndrome following lung transplantation. We sought to identify donor, recipient, and operative risk factors for its development in a cohort study of 255 consecutive lung transplant procedures performed at a tertiary-care academic medical center between October 1991 and July 2000. We defined PGF as follows: (1) diffuse alveolar opacities exclusively involving the allograft(s) and developing within 72 h of transplant, (2) a ratio of PaO(2) to fraction of inspired oxygen < 200 beyond 48 h postoperatively, and (3) no other secondary cause of graft dysfunction identified. Risk factors were assessed individually and adjusted for confounding using multivariable logistic regression models. The overall incidence was 11.8% (95% confidence interval [CI], 7.9 to 15.9). Following multivariable analysis, the risk factors independently associated with development of PGF were as follows: a recipient diagnosis of primary pulmonary hypertension (PPH; adjusted odds ratio [OR], 4.52; 95% CI, 1.29 to 15.9; p = 0.018), donor female gender (adjusted OR, 4.11; 95% CI, 1.17 to 14.4; p = 0.027), donor African-American race (adjusted OR, 5.56; 95% CI, 1.57 to 19.8; p = 0.008), and donor age < 21 years (adjusted OR, 4.06; 95% CI, 1.34 to 12.3; p = 0.013) or > 45 years (adjusted OR, 6.79; 95% CI, 1.61 to 28.5; p = 0.009). Recipient diagnosis of PPH, donor African-American race, donor female gender, and donor age are independently and strongly associated with the development of PGF.
Article
Liver allocation policy in the U.S. was recently changed to a continuous disease severity scale with minimal weight given to time waiting in an effort to better prioritize deceased donor liver transplant candidates. We compared rates of waiting list registrations, removals, transplants, and deaths during the year prior to implementation of the new liver allocation policy (2/27/01-2/26/02, Era 1) with the first year's experience (2/27/02-2/26/03, Era 2) under this new policy. Rates were adjusted for 1,000 patient years on the waiting list and compared using z-tests. A 1-sided test was used to compare death rates; 2-sided tests were used to compare transplant rates. Overall and subgroup analyses were performed for demographic, geographic, and medical strata. In Era 2, we observed a 12% reduction in new liver transplant waiting list registrations, with the largest reductions seen in new registrants with low MELD/PELD scores. In Era 2, there was a 3.5% reduction in waiting list death rate (P =.076) and a 10.2% increase in cadaveric transplants (P <.001). The reduction in waiting list mortality and increase in transplantation rates were evenly distributed across all demographic and medical strata, with some variation across geographic variables. Early patient and graft survival after deceased donor liver transplantation remains unchanged. In conclusion, by eliminating the categorical waiting list prioritization system that emphasized time waiting, the new system has been associated with reduced registrations and improved transplantation rates without increased mortality rates for individual groups of waiting candidates or changes in early transplant survival rates.
Article
Markers that purport to distinguish subjects with a condition from those without a condition must be evaluated rigorously for their classification accuracy. A single approach for statistical evaluation and comparison of markers is not yet established. We suggest a standardization that uses the marker distribution in unaffected subjects as a reference. For an affected subject with marker value Y, the standardized placement value is the proportion of unaffected subjects with marker values that exceed Y. We applied the standardization to 2 illustrative datasets. As a marker for pancreatic cancer, the CA-19-9 marker had smaller placement values than the CA-125 marker, indicating that CA-19-9 was the better marker. For detecting hearing impairment, the placement values for the test output (the marker) were smaller when the input sound stimulus was of lower intensity, which indicates that the test better distinguishes hearing-impaired from unimpaired ears when a lower intensity sound stimulus is used. Explicit connections are drawn between the distribution of standardized marker values and the receiver operating characteristic curve, one established statistical technique for evaluating classifiers. The standardization is an intuitive procedure for evaluating markers. It facilitates direct and meaningful comparisons between markers. It also provides a new view of receiver operating characteristic analysis that may render it more accessible to those as yet unfamiliar with it. The general approach provides a statistical tool to address important questions that are typically not addressed in current marker research, such as quantifying and controlling for covariate effects.
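The standardization described here is a one-line computation, and its connection to the ROC curve can be checked numerically: the mean placement value among affected subjects equals one minus the area under the ROC curve (up to ties). A toy sketch:

import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
unaffected = rng.normal(0.0, 1.0, 500)  # reference distribution
affected = rng.normal(1.5, 1.0, 200)

def placement_values(affected, unaffected):
    """For each affected marker value Y, the proportion of unaffected
    subjects whose marker values exceed Y."""
    unaffected = np.sort(unaffected)
    # count of unaffected values strictly greater than each Y
    greater = len(unaffected) - np.searchsorted(unaffected, affected, side="right")
    return greater / len(unaffected)

pv = placement_values(affected, unaffected)
auc = roc_auc_score(np.r_[np.ones(200), np.zeros(500)],
                    np.r_[affected, unaffected])
print(pv.mean(), 1 - auc)  # agree for continuous data: mean placement value = 1 - AUC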
Article
The objective of this study was to evaluate the effect of systematic utilization of extended donor criteria liver allografts (EDC), including living donor allografts (LDLT), on patient access to liver transplantation (LTX). Utilization of liver allografts that do not meet traditional donor criteria (EDC) offers immediate expansion of the donor pool. EDC are typically allocated by transplant center rather than by regional wait-list priority (RA). This single-institution series compares outcomes of EDC and RA allocation to determine the impact of EDC utilization on donor use and patient access to LTX. The authors conducted a retrospective analysis of 99 EDC recipients (49 deceased donor, 50 LDLT) and 116 RA recipients from April 2001 through April 2004. Deceased-donor EDC included: age >65 years, donation after cardiac death, positive viral serology (hepatitis C, hepatitis B core antibody, human T-cell lymphotropic virus), split-liver, hypernatremia, prior carcinoma, steatosis, and behavioral high-risk donors. Outcome variables included patient and graft survival, hospitalization, initial graft function, and complications categorized as: biliary, vascular, wound, and other. EDC recipients were more frequently diagnosed with hepatitis C virus or hepatocellular carcinoma and had a lower model for end-stage liver disease (MELD) score at LTX (P < 0.01). Wait-time, technical complications, and hospitalization were comparable. Log-rank analysis of Kaplan-Meier survival estimates demonstrated no difference in patient or graft survival; however, deaths among deceased-donor EDC recipients were frequently the result of patient comorbidities, whereas LDLT and RA deaths resulted from graft failure (P < 0.01). EDC increased patient access to LTX by 77% and reduced pre-LTX mortality by over 50% compared with regional data (P < 0.01). Systematic EDC utilization maximizes donor use, increases access to LTX, and significantly reduces wait-list mortality by providing satisfactory outcomes to select recipients.
Article
Exactly what constitutes a marginal donor remains ill-defined. The authors set out to create a scoring system that objectively classifies a donor as marginal or nonmarginal and to define the maximum acceptable preservation period for the marginal liver that minimizes early graft dysfunction. The authors performed an analysis of prospectively collected data on 397 cadaveric liver transplants. Both univariate and multivariate analyses were performed on donor, recipient, and perioperative factors in relation to early allograft dysfunction. A score was developed that classified donors into marginal and nonmarginal populations, and the influence of cold ischemia was determined for each group. Multivariate analysis determined that donor age and steatosis (moderate to severe) were independent predictors of deranged function. This enabled the authors to produce a scoring system to differentiate marginal donors with respect to risk of early allograft dysfunction: score = (20.06 × steatosis) + (0.44 × donor age), with a cutoff of 23.1. In the marginal group, the cutoff value for cold ischemia time was 12.6 hr. The authors developed a scoring system that classifies an organ as marginal or nonmarginal depending on donor age and degree of steatosis. Marginal livers have a strong risk of developing early allograft dysfunction with increasing cold ischemia times and should be transplanted within 12 hr. Cold ischemia time was not found to be an important factor in the development of early allograft dysfunction in nonmarginal donors.
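Since the score is fully specified in the abstract, it can be encoded directly. The only assumption in the sketch below is that steatosis enters as a binary indicator for moderate-to-severe steatosis, which the abstract implies but does not state explicitly:

def marginal_donor_score(donor_age_yr: float, moderate_severe_steatosis: bool) -> float:
    """Score = (20.06 x steatosis) + (0.44 x donor age); scores above 23.1
    classify the donor as marginal."""
    return 20.06 * moderate_severe_steatosis + 0.44 * donor_age_yr

def acceptable_cold_ischemia_hr(score: float) -> float:
    """Marginal grafts should be transplanted within ~12 h of cold ischemia;
    the study found no comparable limit for nonmarginal grafts."""
    return 12.6 if score > 23.1 else float("inf")

s = marginal_donor_score(donor_age_yr=55, moderate_severe_steatosis=True)
print(s, acceptable_cold_ischemia_hr(s))  # 44.26 -> marginal, 12.6 h limit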
Article
Extended-donor criteria liver allografts do not meet traditional criteria for transplantation. Although these organs offer immediate expansion of the donor pool, transplantation of extended-donor criteria liver allografts increases potential short- and long-term risk to the recipient. This risk may manifest as impaired allograft function or donor-transmitted disease. Guidelines defining this category of donor, level of acceptable risk, principles of consent, and post-transplantation surveillance have not been defined. This article reviews the utilization, ethical considerations, and outcomes of extended-donor criteria liver allografts.
Article
Early cholestasis is not uncommon after liver transplantation and usually signifies graft dysfunction. The aim of this study was to determine if serum synthetic and cholestatic parameters measured at various time points after transplantation can predict early patient outcome and graft function. The charts of 92 patients who underwent 95 liver transplantations at Rabin Medical Center between 1991 and 2000 were reviewed. Findings on liver function tests and levels of serum bilirubin, alkaline phosphatase (ALP), and gamma-glutamyl transpeptidase (GGT) on days 2, 10, 30, and 90 after transplantation were used to predict early (6-month) patient outcome (mortality and sepsis) and an initially poorly functioning graft. Pearson correlation, χ² test, and Student's t-test were performed for univariate analysis, and logistic regression for multivariate analysis. Univariate analysis: serum bilirubin ≥10 mg/dL and international normalized ratio (INR) >1.6 on days 10, 30, and 90, and high serum ALP and low albumin levels on days 30 and 90, were risk factors for 6-month mortality; serum bilirubin ≥10 mg/dL on days 10, 30, and 90, high serum ALP, high GGT, and low serum albumin on days 30 and 90, and INR ≥1.6 on day 10 were risk factors for sepsis; high serum alanine aminotransferase, INR >1.6, and bilirubin ≥10 mg/dL on days 2 and 10 were risk factors for poor graft function. The 6-month mortality rate was significantly higher in patients with serum bilirubin ≥10 mg/dL on day 10 than in patients with values <10 mg/dL (29.4% vs. 4.0%, p = 0.004). Patients who had sepsis had higher mean serum ALP levels on day 30 than patients who did not (364.5 ± 229.9 U/L vs. 70.8 ± 125.6 U/L, p = 0.005). Multivariate analysis: significant predictors of 6-month mortality were serum bilirubin ≥10 mg/dL [odds ratio (OR) 9.05, 95% confidence interval (CI) 1.6-49.6] and INR >1.6 (OR 9.11, CI 1.5-54.8) on day 10; significant predictors of sepsis were a high serum ALP level on day 30 (OR 1.005, CI 1.001-1.01) and a high GGT level on day 90 (OR 1.005, CI 1.001-1.01). None of the variables was able to predict initial poor graft function. Several serum cholestasis markers may serve as predictors of early outcome after liver transplantation. The strongest correlation was found between serum bilirubin ≥10 mg/dL on day 10 and early death, sepsis, and poor graft function. Early intervention in patients found to be at high risk may ameliorate the high morbidity and mortality associated with early cholestasis.
Tekin K, Imber CJ, Atli M, Gunson BK, Bramhall SR, Mayer D, et al. A simple scoring system to evaluate the effects of cold ischemia on marginal liver donors. Transplantation 2004;77:411-416.
Clinical Trials in Organ Transplantation 3. Effects of Donor and Recipient Genetic Expression on Heart, Lung, Liver, or Kidney Transplant Survival. National Institute of Allergy and Infectious Diseases (NIAID). http://clinicaltrials.gov/ct2/show/NCT00531921. Accessed April 2010.