ABSTRACT: The presence of donor-specific antibodies (Abs) is detrimental to posttransplant allograft function. Some sensitized recipients have successfully undergone transplantation after a pretransplant conditioning regimen using plasmapheresis and/or intravenous immunoglobulin therapy, but the underlying mechanisms that confer such allograft protection are undefined.
We developed a single human leukocyte antigen (HLA)-mismatched heterotopic murine heart transplant model (HLA-A2 into HLA-A2-sensitized C57BL/6) to determine whether pretreatment of donors with a low concentration of HLA class I Ab (W6/32) or control Ab (C1.18.4) would confer protection. Expression levels of the survival genes Bcl-2 and heme oxygenase-1 were analyzed by gene array analysis and quantitative real-time polymerase chain reaction. Expression levels of a cytokine panel were analyzed by Luminex. The role of Bcl-2 in the induction of allograft protection was analyzed by silencing Bcl-2 expression in the donor hearts using a small hairpin RNA (shRNA) specific for Bcl-2.
Control Ab-pretreated hearts were rejected in less than 5 days, demonstrating hemorrhage and Ab and C4 deposition. In contrast, W6/32-pretreated hearts were rejected at 15 days (P<0.05), and rejection was prolonged to 25 days with antilymphocyte serum treatment. W6/32-pretreated hearts on day 5 exhibited increased expression of Bcl-2 (5.5-fold), Bcl-xl (5.5-fold), and heme oxygenase-1 (4.4-fold); decreased expression of ICAM-1 and VCAM-1 (3.2-fold); reduced levels of the cytokines interleukin (IL)-1β (4.4-fold), tumor necrosis factor α (3.7-fold), IL-6 (7.5-fold), and IL-12 (2.3-fold); and reduced levels of the chemokines monocyte chemotactic protein 1 (4.5-fold), MIG (4.4-fold), MIP-1α (3.4-fold), and IL-8 (3.1-fold). Silencing of Bcl-2 in accommodated hearts before transplant resulted in loss of protection, with rejection at 9±3 vs. 15±2 days (P<0.05).
Pretreatment of hearts with low levels of anti-HLA Abs increases the expression of antiapoptotic genes that inhibit caspases, leading to decreased inflammatory cytokines and chemokines and promoting allograft survival.
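Fold-change values like those reported above (e.g., a 5.5-fold Bcl-2 increase) are conventionally derived from quantitative real-time PCR cycle-threshold (Ct) data using the 2^-ΔΔCt relative-quantification method. A minimal sketch follows; the Ct values in the example are hypothetical, chosen only to illustrate the arithmetic, and the abstract does not specify which reference gene was used.

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ΔΔCt method.

    Each sample's target-gene Ct is first normalized against a
    reference (housekeeping) gene, then the treated sample is
    compared with the untreated control.
    """
    delta_treated = ct_target_treated - ct_ref_treated    # ΔCt, treated
    delta_control = ct_target_control - ct_ref_control    # ΔCt, control
    delta_delta = delta_treated - delta_control           # ΔΔCt
    return 2 ** (-delta_delta)

# Hypothetical Ct values: after normalization, the target amplifies
# 2.5 cycles earlier in treated tissue, i.e., ~5.7-fold induction.
print(round(fold_change(22.0, 18.0, 26.5, 20.0), 1))  # → 5.7
```

Because amplification efficiency is assumed to be exactly 2 per cycle, ΔΔCt differences translate directly into powers of two; a ΔΔCt of about -2.5 corresponds to the 5- to 6-fold inductions reported in the study.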
ABSTRACT: Selection of donors for kidney transplantation depends on accurate prediction of risk factors for immunologic rejection. Historically, the cytotoxicity crossmatch (CXM), which examines lysis of donor cells by preformed anti-human leukocyte antigen (HLA) antibodies (Abs), has been considered the best predictor of immunologic rejection. However, there is much interest in defining anti-HLA Ab specificity in recipient sera by immunoassay to predict crossmatch results and aid in donor selection. Current immunoassays for anti-HLA Abs are highly sensitive, though the correlation between Abs detected by immunoassay and their functional relevance in CXM and subsequent transplantation is not well defined. In this study, we retrospectively examined the predictive value of detection of donor-specific anti-HLA Abs (DSA) by Luminex Single Antigen assay in 149 consecutive living donor kidney transplant recipients. We demonstrate that detection of DSA by immunoassay accurately predicted negative crossmatch and graft survival. However, this approach had limited sensitivity for predicting a positive crossmatch, attributable either to limited typing of donor HLA-DQ and -DP alleles or to non-HLA Abs. False-positive prediction of CXM correlated with detection of "weak" Abs with low mean fluorescence intensity (MFI < 2000). Furthermore, we found that the ratio of the MFI of the DSA bead to the MFI of the positive control bead was a better method for identifying weak DSA that did not produce CXM-positive reactions. Interestingly, patients with weak DSA and a negative CXM had equivalent graft survival over an 18-month follow-up period, suggesting that weak DSA may not preclude transplantation.
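The two weak-DSA criteria described above can be sketched as a small screening function. The MFI < 2000 absolute threshold comes from the study; the ratio cutoff is a hypothetical placeholder, since the abstract does not report the exact ratio threshold used.

```python
def classify_dsa(dsa_mfi, positive_control_mfi, ratio_cutoff=0.2):
    """Screen a Luminex single-antigen DSA bead result.

    Compares two criteria for flagging "weak" DSA:
      - absolute MFI below 2000 (threshold reported in the study)
      - DSA-bead MFI expressed as a fraction of the positive-control
        bead MFI (ratio_cutoff is a hypothetical placeholder value)
    """
    ratio = dsa_mfi / positive_control_mfi
    return {
        "ratio": ratio,
        "weak_by_mfi": dsa_mfi < 2000,        # absolute criterion
        "weak_by_ratio": ratio < ratio_cutoff,  # control-normalized criterion
    }

# A low-MFI bead read against a bright positive control:
result = classify_dsa(dsa_mfi=1500, positive_control_mfi=12000)
print(result["weak_by_mfi"], result["weak_by_ratio"])  # → True True
```

Normalizing against the positive-control bead compensates for run-to-run variation in overall assay brightness, which is the rationale the study gives for the ratio outperforming a fixed MFI threshold.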
Human immunology 03/2010; 71(3):268-73. · 2.55 Impact Factor
ABSTRACT: Little information has been published about the suitability of candidates for living organ donation who have a past or current psychiatric diagnosis. A retrospective review of 445 living donor kidney transplants performed at Barnes-Jewish Hospital's transplant center from 1995 to 2005 disclosed 42 donor candidates with such a history, prompting detailed psychological evaluation. Although 41 candidates (10% of the donor pool) met criteria for 1 or more psychiatric diagnoses, none were considered psychologically unfit for donation. Of these, 22 candidates underwent kidney donation without medical or surgical complications and without development of subsequent active psychological problems. Several donors maintained long-term contact for up to 12 years, reporting good health and a high degree of satisfaction with the decision to donate. This experience suggests that for donor candidates with a psychiatric diagnosis, formal psychiatric evaluation to assess current mental health stability is warranted. Stable individuals, on or off therapy, can be considered fit to donate, with expected short- and long-term outcomes similar to those of the general population.
ABSTRACT: Presensitization to donor human leukocyte antigens (HLA), demonstrated by a positive crossmatch, is detrimental to allograft function and best avoided through donor exclusion. The clinical significance of alloantibody detectable by sensitive solid-phase assays is not completely defined and is the focus of this study. Pretransplant sera from 64 consecutive living-donor renal transplant recipients were screened by enzyme-linked immunosorbent assay (ELISA) and Luminex assays. Results were analyzed for correlation with clinical outcome. Luminex proved more sensitive than ELISA for alloantibody detection, with three identifiable patterns. Twenty-eight patients were antibody negative, 24 had non-donor-specific antibody (non-DSA), and 12 had donor-specific antibody (DSA). The highest number of rejections (n = 4) and graft losses (n = 6) occurred in the antibody-negative group. The non-DSA group had two graft losses, as did the DSA group. The two graft losses in the DSA group were caused by recurrent focal segmental glomerulosclerosis (FSGS) at 35 months and death with a functioning graft at 32 months. Overall, there were no cases of antibody-mediated rejection, and allograft function to 4 years was comparable among all three groups. Under our standard immunosuppression protocol and crossmatch criteria for histocompatibility, alloantibody detectable by Luminex was not detrimental to successful living-donor transplantation.
Human immunology 06/2009; 70(8):584-8. · 2.55 Impact Factor
ABSTRACT: In recent years, transplantation of islets and pancreas has become a viable option for patients debilitated by type I diabetes. The success of islet transplantation has been attributed to the ability to isolate high-quality islets for transplantation and the capacity to maintain the recipient's immunosuppressive levels within a specific target range following transplantation. The purpose of this study was to determine the role of pretransplant sensitization to human leukocyte antigen (HLA) in islet transplantation.
We retrospectively analyzed seven patients who were transplanted with islets under the auspices of the Juvenile Diabetes Research Foundation and the Islet Cell Resource Center/National Institutes of Health. Humoral sensitization toward donor antigens both prior to and following islet transplantation was detected by FLOW panel reactive antibody (PRA) assays, and donor-specific cellular sensitization was detected by enzyme-linked immunospot assays for the cytokines interferon-gamma and interleukin-2.
Our analysis demonstrates that humoral and cellular sensitization to histocompatibility antigens prior to and after islet transplantation are associated with the failure of transplanted islets.
Patient selection based on sensitization to donor HLA may be one of the factors crucial for the success of islet transplantation. Further, in some patients, rejection of islets can be associated with sensitization to mismatched donor histocompatibility antigens.
ABSTRACT: Although paired donation, list donation, and non-directed donation allow more recipients to receive living donor transplants, policy makers do not know how willing incompatible potential donors are to participate. We surveyed 174 potential donors ruled out for ABO incompatibility or a positive crossmatch about their willingness to participate. They were more willing to participate in paired donation than in list donation, in which the recipient receives the next deceased donor kidney (63.8% vs. 37.9%, p < 0.001), or non-directed donation (63.8% vs. 12.1%, p < 0.001). Their willingness for list donation was greater when their intended recipients moved to the top of the waiting list versus the top 20% (37.9% vs. 19.0%, p < 0.001). Multivariate logistic regression modeling revealed that potential donors' empathy, education level, relationship with their intended recipient, and the length of time their intended recipient was on dialysis also affected willingness. For paired donation, close family members of the intended recipient (odds ratio (OR) = 3.01, confidence interval (CI) = 1.29, 7.02), those with high levels of empathy (OR = 2.68, CI = 1.16, 6.21), and those with less than a college education (OR = 2.67, CI = 1.08, 6.61) were more willing to participate than other donors. Extrapolating these levels of willingness nationally, a 1-11% increase in living donation rates yearly (84-711 more transplants) may be possible if donor-exchange programs were available nationwide.
American Journal of Transplantation 08/2006; 6(7):1631-8. · 6.19 Impact Factor
ABSTRACT: Allografts transplanted across ABO incompatibility or human leucocyte antigen (HLA) sensitization undergo antibody (Ab)-mediated hyperacute rejection. Depleting anti-graft Ab from the recipient by plasmapheresis prior to transplantation can prevent this Ab-mediated rejection. Under these conditions, allografts have been shown to function even when Ab levels rebound in the recipients. We developed an in vitro model using human aortic endothelial cells (ECs) to elucidate the ability of the W6/32 HLA class I monoclonal Ab to provide signals upon binding to MHC class I molecules. Using this model, we show that ECs undergo caspase 3-dependent apoptotic cell death upon exposure to saturating concentrations of W6/32 and complement. In contrast, exposure of ECs to sub-saturating concentrations of W6/32 conferred resistance to Ab/complement-mediated lysis, a state termed accommodation. Accommodated ECs exhibited a significant increase in the expression of the anti-apoptotic genes Bcl-xL, Bcl-2, and heme oxygenase-1, and induction of phosphatidylinositol 3-kinase (PI3K) and cyclic adenosine monophosphate (cAMP)-dependent protein kinase A activities that facilitate phosphorylation of Bad at positions Ser(136) and Ser(112). In conclusion, exposure to sub-saturating concentrations of HLA class I Ab induces downstream signals that confer resistance to endothelial cells against Ab/complement-mediated cell death. Together, the observations made in this study provide a basis for delineating the molecular mechanisms of accommodation and for developing strategies to induce accommodation in grafts prior to transplantation in highly sensitized patients.
ABSTRACT: After liver transplantation, patients with stage III hepatocellular carcinoma (HCC) experience survival similar to that of patients with less advanced disease and of matched control subjects.
Retrospective review of prospectively collected database.
Fifty-one adults with HCC and 153 matched adults without HCC who underwent orthotopic liver transplantation.
One-, 3-, and 5-year survivals for all groups. After matching for year of transplantation, age, sex, and underlying liver disease, long-term survival was compared between groups. Rates of recurrence were also measured in the HCC groups.
From August 1, 1985, to February 28, 2002, we performed 635 adult liver transplantations, including 51 (8%) in patients with HCC. One hundred fifty-one patients without HCC who underwent transplantation were selected as controls. Patient demographic features were similar between case-control groups. The overall 5-year survival trend was worse for patients with HCC vs their matched controls (48% vs 65%; P = .07); however, this survival disadvantage was eliminated when patients with stages I through III HCC were combined and compared with their matched controls (59% vs 63%; P = .96). Survival of patients with stage III disease was comparable to that of matched controls (65% vs 59%; P = .44).
For patients with stages I through III disease, long-term survival is comparable to that of matched controls, and only patients with stage IV disease experience poorer survival. Consideration should be given to granting exception points to patients with stage III disease.
Archives of Surgery 06/2005; 140(5):459-64; discussion 464. · 4.10 Impact Factor
ABSTRACT: Early experience with deceased donor (DD) organ recovery outside of the hospital setting was found to be safe, efficient, and cost effective. A 2-year experience under current practice protocols, implemented to further process improvements, is now reviewed. From December 1, 2001 to December 31, 2003, 123 criteria-eligible DDs were transferred from local and regional hospitals to the Mid-America Transplant Services (MTS) facility for organ and tissue recovery. In this retrospective analysis, outcome comparisons were made with 79 conventional hospital-based recoveries. Compared to hospital recoveries, MTS facility recoveries were associated with significantly reduced critical care unit time (819 vs. 502 min), time to cross-clamp following brain death (966 vs. 731 min), and operating room delay (54 vs. 9 min), and a trend toward reduced organ cold ischemia times that reached significance for heart and lungs when compared to regional hospital recoveries (147 vs. 221 and 192 vs. 327 min). MTS facility recovery afforded substantial cost savings over local and regional hospital recoveries (US $6,690 and US $5,452 per donor, respectively). The current practice of DD recovery at the MTS facility was applicable for most recoveries, improved process efficiency, and afforded substantial cost savings without donor compromise.
American Journal of Transplantation 06/2005; 5(5):1105-10. · 6.19 Impact Factor
ABSTRACT: Targeting 2-hr postdose cyclosporine (C2) levels to 1,000 to 1,700 ng/mL during the first 6 months after renal transplantation is recommended for triple immunosuppressive regimens. This trial determined whether lower C2 levels could be targeted safely in de novo kidney transplant recipients under a quadruple regimen compared with a similar cohort monitored with trough (C0) levels.
This single-center, sequential, cohort-designed trial included patients who received Thymoglobulin, corticosteroids, an antimetabolite, and cyclosporine monitored by C2 (n=50) or C0 (n=50). Cyclosporine was tapered to maintain C2 between 1,000 and 1,200 ng/mL during months 0 to 3 and between 600 and 1,000 ng/mL thereafter, and C0 between 250 and 350 ng/mL during months 0 to 3 and between 100 and 250 ng/mL thereafter.
Baseline patient and donor characteristics were similar. There were no differences in graft survival (100% C2 vs. 100% C0), acute rejection (4% C2 vs. 6% C0), allograft function, or adverse events at 6 months. C2 levels were lower than the suggested guidelines throughout the study (33% lower at 1 month and 48% lower at 6 months). Lower cyclosporine doses were achieved in the C2 arm compared with the C0 arm by 1 month and were sustained throughout the trial, which translated into an average cyclosporine cost savings of US $773 in the C2 arm over the 6-month period (P<0.001).
With a quadruple immunosuppressive regimen and lower C2 targets than recommended for triple therapy, safe and effective cyclosporine minimization was achieved. Lower cyclosporine doses were achieved in C2-monitored patients compared with C0-monitored patients, translating into lower immunosuppressive costs.
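The monitoring protocol above can be summarized as a simple lookup of the target range by monitoring strategy and months post-transplant. This is only a sketch of the trial's stated ranges, not clinical dosing logic; the function name and boundary handling at exactly 3 months are assumptions.

```python
def cyclosporine_target(monitoring, months_post_tx):
    """Target cyclosporine range (ng/mL) from the trial protocol.

    monitoring: "C2" (2-hr postdose level) or "C0" (trough level).
    Ranges taper after the first 3 months post-transplant; treating
    month 3 itself as part of the early period is an assumption.
    """
    early = months_post_tx <= 3
    if monitoring == "C2":
        return (1000, 1200) if early else (600, 1000)
    if monitoring == "C0":
        return (250, 350) if early else (100, 250)
    raise ValueError("monitoring must be 'C2' or 'C0'")

print(cyclosporine_target("C2", 2))  # → (1000, 1200)
print(cyclosporine_target("C0", 5))  # → (100, 250)
```

Encoding the ranges this way makes the trial's central comparison visible: the C2 targets here are well below the 1,000-1,700 ng/mL guideline cited for triple therapy, which is the minimization the study set out to test.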
ABSTRACT: Chronic lung allograft rejection, manifested as bronchiolitis obliterans syndrome and its histopathologic correlate obliterative bronchiolitis (OB), is a major source of morbidity and mortality after lung transplantation. Murine heterotopic tracheal transplants into fully allogeneic mismatched recipients develop obliterative airway disease (OAD), which is a suitable model of OB. Using this murine heterotopic tracheal allograft model, we evaluated the effect of pirfenidone, a novel antifibrotic agent, on the development of OAD.
Mice transplanted with complete MHC-mismatched tracheal allografts received pirfenidone (0.5%) in pulverized food according to different schedules: daily for the first 14 days after transplantation or daily for the duration of the study beginning on posttransplantation days 0, 5, or 10.
Mice on a continuous daily regimen of pirfenidone failed to develop evidence of chronic allograft rejection at the termination of the study (60 days). Mice receiving pirfenidone limited to the early posttransplantation period had delayed onset of OAD to 60 days. Forty percent (2/5) of mice receiving a continuous regimen of pirfenidone beginning on day 5 after transplantation had no evidence of OAD at 28 days. However, when the drug was started on day 10, all mice developed OAD by 28 days.
Our results demonstrate a delay of onset or abrogation of OAD when pirfenidone is administered in the early posttransplantation period. These findings suggest that pirfenidone is a candidate drug to be evaluated for prevention of the fibrotic changes seen in OB in human recipients of lung transplants.