ABSTRACT: Heme oxygenase-1 and its products biliverdin/bilirubin have been demonstrated to protect against ischemia/reperfusion injury (IRI). We investigated whether increased pre-operative bilirubin values in transplant recipients decrease IRI. Pre-operative bilirubin levels of live-donor liver recipients were correlated with post-operative liver transaminases as a marker of IRI. Additionally, two recipient groups with pre-transplant bilirubin levels >24 μmol/L (n=348) and ≤24 μmol/L (n=118) were compared. Post-transplant liver function, complications, length of hospital stay, and patient and graft survival were assessed. Pre-operative bilirubin levels were negatively correlated with the post-operative rise in transaminases, suggesting a protective effect against IRI. The maximal rise of ALT after transplantation in high- vs low-bilirubin patients was 288 [-210-2457] U/L vs 375 [-11-2102] U/L, P=0.006. Bilirubin remained a significant determining factor in a multivariate linear regression analysis. The MELD score and its individual components, as markers of the severity of chronic liver disease, were significantly higher in the high- vs low-bilirubin group (P<0.001). Despite this, the overall complication rate (21.0% vs 21.2%, P=0.88), hospital stay (13 [4-260] vs 14 [6-313] days, P=0.93), and 1-year graft survival (90.8% vs 89.0%, P=0.62) were similar in both groups. High bilirubin levels in liver recipients before live donor transplantation are associated with decreased post-operative IRI. This article is protected by copyright. All rights reserved.
Transplant International 07/2015; DOI:10.1111/tri.12634 · 3.16 Impact Factor
ABSTRACT: To compare the outcome of adult live donor liver transplantation (LDLT) with grafts from older versus younger donors.
Using older donor grafts for adult LDLT may help expand the donor pool. However, the risks of LDLT with older donors remain controversial, and many centers are reluctant to use live donors aged 45 years or older for adult LDLT.
Outcomes of patients receiving a LDLT graft from donors aged 50 years or older (n = 91) were compared with those receiving a live donor graft from donors younger than 50 years (n = 378).
Incidences of biliary (LDLT <50: 24% vs LDLT ≥50: 23%; P = 0.89) and major complications (LDLT <50: 24% vs LDLT ≥50: 24%; P = 1) were similar between both groups of recipients. No difference was observed in 30-day recipient mortality (LDLT <50: 3% vs LDLT ≥50: 0%; P = 0.13). The 1- (90% vs 90%), 5- (82% vs 73%), and 10-year (71% vs 58%) graft survival was statistically similar between both groups (P = 0.075). Likewise, patient survival at 1 (92% vs 96%), 5 (83% vs 79%), and 10 years (76% vs 69%) was also similar (P = 0.686). Overall, the rate of major donor complications (Dindo-Clavien ≥3b) within 30 days was low (2.3%) and did not differ between older and younger donors (P = 1). Median donor hospital stay was identical in both groups [LDLT <50: 6 (4-17) vs LDLT ≥50: 6 (4-14) days; P = 0.65]. No donor death occurred, and all donors recovered fully and returned to baseline activity.
Right lobe LDLT with donors aged 50 years or older results in acceptable recipient outcome without increased donor morbidity or mortality. Potential live donors should not be declined on the basis of age alone.
15th Annual State of the Art Winter Symposium of the; 06/2015
ABSTRACT: Simkania negevensis infection has been hypothesized to play a role in lung transplant rejection. The incidence of S. negevensis infection and its association with acute cellular rejection (ACR) were determined in a prospective cohort study of 78 lung transplant recipients (LTRs) in Toronto, Canada and Pittsburgh, USA from July 2007 to January 2010. S. negevensis was detected by quantitative polymerase chain reaction (PCR) on bronchoalveolar lavage fluid. The relationship between S. negevensis and ACR was examined using Cox proportional hazards models and generalized linear and latent mixed models. Cumulative incidence estimates for time-to-ACR in S. negevensis PCR-positive vs. PCR-negative LTRs were 52.7% vs. 31.1% at 6 months and 68.9% vs. 44.6% at 1 year, respectively. Although not statistically significant, there was a trend towards a higher risk of ACR among S. negevensis PCR-positive vs. PCR-negative LTRs in all statistical models. This article is protected by copyright. All rights reserved.
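Cumulative incidence estimates like those quoted above are commonly derived as 1 - S(t), where S(t) is the product-limit (Kaplan-Meier) survival estimate. A minimal pure-Python sketch of the estimator, using an invented toy cohort rather than the study's data:

```python
def cumulative_incidence(times, events, t):
    """Naive O(n^2) product-limit estimate of cumulative incidence at horizon t.

    times:  follow-up duration for each subject (e.g. days to ACR or censoring)
    events: 1 if the event (ACR) was observed, 0 if the subject was censored
    """
    surv = 1.0
    # Walk through distinct observed event times up to the horizon t.
    for ti in sorted({ti for ti, ev in zip(times, events) if ev and ti <= t}):
        at_risk = sum(1 for tj in times if tj >= ti)                    # still under observation
        d = sum(1 for tj, ev in zip(times, events) if ev and tj == ti)  # events at ti
        surv *= 1.0 - d / at_risk
    return 1.0 - surv

# Hypothetical six-subject cohort: four rejections, two censored observations.
times = [30, 60, 90, 120, 150, 180]
events = [1, 1, 0, 1, 1, 0]
print(round(cumulative_incidence(times, events, 180), 3))
```

On real data one would also stratify by PCR status and compare curves, e.g. with a log-rank test or the Cox models the study used.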
ABSTRACT: Donor-specific antibodies (DSA) have been associated with increased rejection and lower kidney transplant survival. The impact of DSA on pancreas transplant outcomes in patients with negative T-cell cytotoxicity cross-matches remains unclear. We performed a retrospective analysis of DSA in 171 (69 PAK and 102 SPK) consecutive pancreas transplants from a single center since January 2009, when routine DSA examination using Luminex single-antigen assays began. Pre-transplant DSA (pre-DSA) was detected in 36 (21%) recipients and was more prevalent in PAK than SPK recipients (32% vs. 16%, p=0.02). All patients with pre-DSA had a negative CDC or flow cross-match and were treated routinely with IVIG during surgery. De novo DSA developed in 44 (33%) recipients during follow-up (33% in PAK vs. 32% in SPK, p=0.89). Kidney and/or pancreas rejection occurred in 42 recipients (25%), and most of these episodes (64%; n=22) occurred in the presence of DSA (p=0.012). The presence of DSA also increased the risk of having >1 rejection episode (p=0.004). DSA became undetectable in 21 (26%) patients: in 5 of these, DSA subsequently reappeared (the same DSA in 3 and new DSA in 2), and reappearance was associated with an acute rejection episode in 4 of the 5 recipients; of the 16 patients in whom DSA remained undetectable, 5 had a subsequent rejection episode with no evidence of new DSA. Pancreas graft survival among those with and without pre-transplant DSA was 92% and 88% at 1 year and 80% and 88% at 3 years (p=0.26). Similarly, pancreas graft survival among those with and without de novo DSA was 86% and 91% at 1 year and 81% and 89% at 3 years (p=0.14). Twenty-four graft failures occurred during follow-up (9 in the DSA- and 15 in the DSA+ group), of which only 6 were due to rejection (3 in the DSA- and 3 in the DSA+ group; p=0.64). Pre-transplant DSA also did not increase the risk of graft loss from rejection (13% of pre-DSA- and 28% of pre-DSA+; p=0.57).
Conclusion: Pre-transplant DSA with a negative cross-match and post-transplant de novo DSA are risk factors for rejection, but have little impact on early pancreas transplant survival.
ABSTRACT: Long-term biliary complications after living donor liver transplantation (LDLT) are not well described in the literature. This study was undertaken to determine the long-term impact of biliary complications after adult right-lobe LDLT.
This retrospective review analyzed an 11-year experience of 344 consecutive right-lobe LDLT with at least 2 years of follow-up.
Biliary leaks occurred in 50 patients (14.5%), and strictures occurred in 67 patients (19.5%). Cumulative biliary complication rates at 1, 2, 5, and 10 years were 29%, 32%, 36%, and 37%, respectively. Most early biliary leaks were treated with surgical drainage (N = 29, 62%). Most biliary strictures were treated first with endoscopic retrograde cholangiography (42%). There was no association between biliary strictures and the number of ducts (hazard ratio [HR] 1.017 [0.65-1.592], p = 0.94), but freedom from biliary stricture was associated with a more recent era (2006-2010) (HR 0.457 [0.247-0.845], p = 0.01). Long-term graft survival did not differ between those who did or did not have biliary complications (66% versus 67% at 10 years).
Biliary strictures are common after LDLT but may decline with a center's experience. With careful follow-up, they can be successfully treated, with excellent long-term graft survival rates. This article is protected by copyright. All rights reserved.
ABSTRACT: Background: Pancreas-kidney transplantation with enteric drainage has become a standard treatment in diabetic patients with renal failure. Leaks of the graft duodenum (DLs) remain a significant complication after transplantation. We studied the incidence and predisposing factors of DLs in both simultaneous pancreas-kidney (SPK) and pancreas-after-kidney (PAK) transplantation.
Methods: Between January 2002 and April 2013, 284 pancreas transplantations were performed, including 191 SPK (67.3%) and 93 PAK (32.7%). Patient data were analyzed for occurrence of DLs, risk factors, leak etiology, and graft survival.
Results: Of 18 DLs (incidence 6.3%), 12 (67%) occurred within the first 100 days after transplantation. Six grafts (33%) were rescued by duodenal segment resection. Risk factors for a DL were PAK transplantation sequence (odds ratio 3.526, p=0.008) and preoperative immunosuppression (odds ratio 3.328, p=0.012). In the SPK subgroup, postoperative peak amylase (as a marker of preservation/reperfusion injury) and recipient pre-transplantation cardiovascular interventions (as a marker of atherosclerosis severity) were associated with an increased incidence of DLs. CMV mismatch constellations showed an increased incidence in the SPK subgroup, although this did not reach statistical significance.
Conclusion: Long-term immunosuppression in PAK transplantation is a major risk factor for DLs. Early surgical revision offers the chance of graft rescue. This article is protected by copyright. All rights reserved.
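The odds ratios quoted for the duodenal-leak risk factors come from 2×2 exposure-by-outcome tables. A minimal sketch of that calculation (the counts below are hypothetical and do not reproduce the study's tables):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table.

    a: exposed with outcome (e.g. PAK with leak)
    b: exposed without outcome
    c: unexposed with outcome (e.g. SPK with leak)
    d: unexposed without outcome
    OR = (a/b) / (c/d) = (a*d) / (b*c)
    """
    return (a / b) / (c / d)

# Hypothetical counts for illustration: 10/93 leaks among exposed,
# 8/191 among unexposed.
print(round(odds_ratio(10, 83, 8, 183), 2))
```

An OR above 1 indicates the exposure is associated with higher odds of a leak; a confidence interval (e.g. via the log-OR standard error) would be needed before drawing conclusions on real data.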
Transplant International 02/2015; 28(6). DOI:10.1111/tri.12535 · 3.16 Impact Factor
ABSTRACT: In nonalcoholic fatty liver disease (NAFLD), hepatic gene expression and fatty acid (FA) composition have been reported independently, but comprehensive gene expression profiling in relation to FA composition is lacking. The aim was to assess this relationship. In a cross-sectional study, hepatic gene expression (Illumina microarray) was first compared among 20 patients with simple steatosis (SS), 19 with nonalcoholic steatohepatitis (NASH), and 24 healthy controls (HC). FA composition in hepatic total lipids was compared between SS and NASH, and associations between gene expression and FA were examined. Gene expression differed mainly between HC and patients (SS and NASH), including genes related to unsaturated FA metabolism. Twenty-two genes were differentially expressed between NASH and SS; most of them correlated with disease severity and related more to cancer progression than to lipid metabolism. Long-chain polyunsaturated FA (PUFA) (eicosapentaenoic acid + docosahexaenoic acid, arachidonic acid) in hepatic total lipids were lower in NASH than in SS. This may be related to overexpression of FADS1, FADS2, and PNPLA3. The degree and direction of correlations between PUFA and gene expression differed between SS and NASH, which may suggest that the low PUFA content in NASH modulates gene expression in a different way than in SS or, alternatively, that gene expression influences PUFA content differently depending on disease severity (SS versus NASH). Conclusion: Well-defined subjects with healthy liver, SS, or NASH showed distinct hepatic gene expression profiles, including genes involved in unsaturated FA metabolism. In NASH, hepatic PUFA were lower and associations with gene expression were different than in SS. This article is protected by copyright. All rights reserved.
ABSTRACT: Pegylated interferon-α and ribavirin (PEG-IFN/RBV) are widely used to treat chronic hepatitis C virus (HCV) infection but cause notorious adverse reactions because IFN-α receptors are broadly expressed on all nucleated cells. Accordingly, a type III IFN with a restricted receptor distribution may be a safer alternative for HCV therapy. In addition, single nucleotide polymorphisms (SNPs) near the human IFN-λ3 gene, IL-28B, correlate strongly with the ability to achieve a sustained virological response (SVR) to therapy with pegylated IFN-α plus ribavirin in patients with chronic hepatitis C. We also discuss the most recent finding that IFN-λ4 predicts treatment outcomes of HCV infection. In view of the apparent limitations of current HCV therapy, especially its high failure rate and universal side effects, predicting treatment outcomes before the initiation of treatment and developing new alternative drugs are two important goals in HCV research.
Gastroenterology Research and Practice 01/2015; 2015:1-9. DOI:10.1155/2015/796461 · 1.75 Impact Factor
ABSTRACT: To identify prognostic factors after hepatocellular carcinoma (HCC) recurrence after liver transplantation (LT).
We retrospectively reviewed the combined experience at Toronto General Hospital and Hospital Vall d'Hebron managing HCC recurrence after LT (n = 121) between 2000 and 2012. We analyzed prognostic factors by uni- and multi-variate analysis. Median follow-up from LT was 29.5 (range 2-129.4) months. Median follow-up from HCC recurrence was 12.2 (range 0.1-112.5) months.
At recurrence, 31.4 % of patients received curative-intent treatments (surgery or ablation), 42.1 % received palliative treatment, and 26.4 % received best supportive care. The 1-, 3-, and 5-year survival rates after HCC recurrence were 75, 60, and 31 % with curative-intent treatment, vs. 60, 19, and 12 % with palliative treatment, vs. 52, 4, and 5 % with best supportive care (p < 0.001). By multivariate analysis, not being amenable to a curative-intent treatment [hazard ratio (HR) 4.7, 95 % confidence interval (CI) 2.7-8.3, p < 0.001], α-fetoprotein of ≥100 ng/mL at the time of HCC recurrence (HR 2.1, 95 % CI 1.3-2.3, p = 0.002), and early recurrence (<12 months) after LT (HR 1.6, 95 % CI 1.1-2.5, p = 0.03) were found to be poor prognostic factors. A prognostic score was devised on the basis of these three independent variables. Patients were divided into three groups: good prognosis, 0 points (n = 22); moderate prognosis, 1 or 2 points (n = 84); and poor prognosis, 3 points (n = 15). The 1-, 3-, and 5-year actuarial survival for each group was 91, 50, and 50 %, vs. 52, 7, and 2 %, vs. 13, 0, and 0 %, respectively (p < 0.001).
Patients with HCC recurrence after transplant who are amenable to curative-intent treatments can experience significant long-term survival (~50 % at 5 years), so aggressive management should be offered. Poor prognostic factors after recurrence are not being amenable to a curative-intent treatment, α-fetoprotein of ≥100 ng/mL, and early (<1 year) recurrence after LT.
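The three-variable score above can be sketched as a simple lookup, assuming one point per factor (an assumption consistent with the 0, 1-2, and 3-point groupings; the thresholds are as stated in the abstract):

```python
def recurrence_prognostic_score(curative_treatment_possible, afp_ng_ml, months_to_recurrence):
    """One point per poor-prognosis factor (assumed equal weighting)."""
    score = 0
    if not curative_treatment_possible:   # not amenable to curative-intent treatment
        score += 1
    if afp_ng_ml >= 100:                  # AFP >= 100 ng/mL at recurrence
        score += 1
    if months_to_recurrence < 12:         # recurrence within 1 year of LT
        score += 1
    return score

def prognosis_group(score):
    # Groupings as defined in the abstract: 0 = good, 1-2 = moderate, 3 = poor.
    return "good" if score == 0 else ("moderate" if score <= 2 else "poor")

# Example: curative treatment possible, AFP 250 ng/mL, recurrence at 8 months.
print(prognosis_group(recurrence_prognostic_score(True, 250, 8)))  # -> "moderate"
```

Such a score is only a stratification aid; on new patients it would need external validation before guiding management.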
ABSTRACT: Liver transplantation from donation-after-circulatory-death (DCD) donors has been associated with a high rate of ischemic-type biliary strictures (ITBS) and inferior graft survival. To investigate the impact of intraoperative tissue plasminogen activator (tPA) on outcomes following DCD liver transplantation (DCD-LT), we conducted a retrospective analysis of DCD liver transplants at the Toronto General Hospital (TGH) and Ochsner Medical Center (OMC). Between 2009 and 2013, 85 DCD liver transplants were performed with intraoperative tPA injection (N=30 TGH, 55 OMC) and compared to 33 DCD liver transplants without tPA. Donor and recipient characteristics were similar between both groups. There was no significant difference in intra-operative packed red blood cell transfusion requirement (3.2 ± 3.4 vs 3.1 ± 2.3, P=0.74). Overall, biliary strictures occurred less commonly in the tPA-treated group (16.5% vs 33.3%, P=0.07), with a much lower rate of diffuse intra-hepatic strictures (3.5% vs 21.2%, P=0.005). At 1 and 3 years, the tPA vs non-tPA group had superior patient survival (97.6% vs 87.0% and 92.7% vs 79.7%; P=0.016) and graft survival (96.4% vs 69.7% and 90.2% vs 63.6%; P<0.001). In conclusion, tPA injection into the hepatic artery during DCD liver transplantation reduces ITBS and improves graft and patient survival without increasing the risk of bleeding. This article is protected by copyright. All rights reserved.
ABSTRACT: Introduction: The oncological implications of laparoscopic resection in primary hepatic malignancy are not well defined. Peri-operative and long-term oncological outcomes of laparoscopic liver resection (LLR) for hepatocellular carcinoma (HCC), in comparison with open liver resection (OLR), are described from a single North American institution.
Methods: From 2006 to 2013, all 43 patients who underwent LLR for HCC were evaluated. Each patient was matched to two OLR patients for age at operation, maximal tumour size, and tumour number.
Results: When compared with OLR, LLR had a lower severity of complications (0% versus 27%, P = 0.050) and a lower 30-day readmission rate (2.3% versus 18.6%, P = 0.010). The length of stay (LOS) was shorter in LLR patients (5 versus 7 days, P < 0.001), and the estimated blood loss was also lower in LLR (300 versus 700 ml, P = 0.004). Admission to the intensive care unit (ICU), emergency room (ER) visits, and overall complication rates were similar. Overall, recurrence-free and intra-hepatic recurrence-free survival were comparable between LLR and OLR.
Discussion: LLR confers the widely accepted benefits of laparoscopic surgery, namely lower severity of complications, 30-day readmission rate, LOS, and blood loss. Further studies are required to examine intra- and extra-hepatic recurrence after LLR. LLR for HCC should be considered for appropriately selected patients in centres with the requisite volume and expertise.
ABSTRACT: Outcomes of living versus deceased donor liver transplantation in patients with chronic liver disease and hepatorenal syndrome (HRS) were compared using a matched-pair study design. Thirty patients with HRS receiving a live donor liver transplantation (LDLT) and 90 HRS patients receiving a full-graft deceased donor liver transplantation (DDLT) were compared. LDLT versus DDLT in patients with HRS was associated with lower peak aspartate aminotransferase levels (339 ± 214 vs. 935 ± 1253 U/L; p = 0.0001), and similar 7-day bilirubin (8.42 ± 7.89 vs. 6.95 ± 7.13 mg/dL; p = 0.35) and international normalized ratio levels (1.93 ± 0.62 vs. 1.78 ± 0.78; p = 0.314). LDLT vs. DDLT was associated with a shorter intensive care unit stay (2 [1–39] vs. 4 [0–93] days; p = 0.004) and hospital stay (17 [4–313] vs. 26 [0–126] days; p = 0.016) and a similar incidence of overall postoperative complications (20% vs. 27%; p = 0.62). No difference was detected between LDLT and DDLT patients regarding graft survival at 1 (80% vs. 82%), 3 (69% vs. 76%), and 5 years (65% vs. 76%) (p = 0.63), or patient survival at 1 (83% vs. 82%), 3 (72% vs. 77%), and 5 years (72% vs. 77%) (p = 0.93). The incidence of chronic kidney disease post-LT (10% vs. 6%; p = 0.4) was similar between both groups. LDLT results in long-term outcomes comparable to DDLT in patients with HRS.
American Journal of Transplantation 10/2014; 14(12). DOI:10.1111/ajt.12975 · 6.19 Impact Factor