Javier Briceño

Universidad Católica de Córdoba, Córdoba, Córdoba, Argentina

Publications (64) · 159.04 Total Impact Points

  • ABSTRACT: There is an increasing discrepancy between the number of potential liver graft recipients and the number of organs available. Organ allocation should follow the concept of survival benefit, avoiding innate human subjectivity. The aim of this study is to use artificial neural networks (ANN) for donor-recipient (D-R) matching in liver transplantation (LT) and to compare their accuracy with validated scores of graft survival (MELD, D-MELD, DRI, P-SOFT, SOFT and BAR).
    Journal of Hepatology 11/2014; 61(5):1020-8.
  • J Briceño, R Ciria
    ABSTRACT: Liver donation is the cornerstone of the expansion of liver transplantation. Although considerable efforts have been made to develop alternatives for increasing the donor pool, only extended-criteria donors have become a feasible option. The success of the Spanish Model for organ transplantation is well known: approximately 5.4% of all liver transplants (LT) worldwide are performed in Spain, at a rate of 22.9 LT per million population (pmp). Approximately 70 papers on extended-criteria donors have been reported by Spanish LT teams. Pioneering works on donor steatosis, non-heart-beating donors, donor age and hepatitis C virus, ischemia/reperfusion injury, normothermic extracorporeal membrane oxygenation, and donor steatosis combined with hepatitis C virus are among the main contributions to the field. Data from the Spanish National Registry show that an accumulation of donor and recipient factors leads to a continuum of risk in liver transplantation; donors are rarely "bad" enough per se to justify declining a liver offer. In Spain, clear efforts should be made to establish more stable and homogeneous criteria for donor acceptance. In this sense, defining a specific Spanish donor risk index would be helpful.
    Transplantation Proceedings 01/2014; 46(9):3079-81. · 0.95 Impact Factor
  • ANZ Journal of Surgery 10/2013; · 1.50 Impact Factor
  • ABSTRACT: The current methods available for screening and detecting hepatocellular carcinoma (HCC) have insufficient sensitivity and specificity, and only a low percentage of diagnoses of small tumours is based on these assays. Because HCC is usually asymptomatic at potentially curative stages, identification of biomarkers for the early detection of HCC is essential to improve patient survival. The aim of this study was to identify candidate markers for HCC development in the plasma of hepatitis C virus (HCV)-infected cirrhotic patients. We compared protein expression profiles of plasma samples from HCV-infected cirrhotic patients with and without HCC, using two-dimensional fluorescence difference gel electrophoresis (2-D DIGE) coupled with MALDI-TOF/TOF mass spectrometry. The 2-D DIGE results were analysed statistically using Decyder™ software, and verified by western blot and enzyme-linked immunosorbent assay (ELISA). In the plasma of HCV-infected HCC patients, we observed decreased expression of complement component 9, ficolin-3 (FCN3), serum amyloid P component (SAP), fibrinogen-gamma and immunoglobulin gamma-1 chain, and increased expression of vitronectin (VTN) and galectin-3 binding protein (G3BP) by DIGE analysis. ELISA confirmed the DIGE results for VTN and G3BP, but not for SAP or FCN3, in a larger patient population. The proteins VTN and G3BP are candidate biomarkers for HCC development in HCV-infected cirrhotic patients.
    Liver International 07/2013; · 3.87 Impact Factor
  • ABSTRACT: Recurrence of hepatocellular carcinoma (HCC) is a major complication after liver transplantation (LT). The initial immunosuppression protocol may influence HCC recurrence, but the optimal regimen is still unknown. 219 consecutive HCC patients within the Milan criteria who received a LT at two European centres between 2000 and 2010 were included. Median follow-up was 51 months (IQR 26-93). Demographic characteristics, HCC features, and the immunosuppression protocol within the first month after LT were evaluated against HCC recurrence using Cox regression. In the explanted liver, 110 patients (50%) had multinodular HCC, and the largest nodule diameter was 3±2.1 cm. Macrovascular invasion was incidentally detected in 11 patients (5%), and microvascular invasion was present in 41 patients (18.7%). HCC recurrence rates were 13.3% at 3 years and 17.6% at 5 years. HCC recurrence was not influenced by the use or non-use of steroids and antimetabolites (p=0.69 and p=0.70, respectively), and was similar with tacrolimus or cyclosporine (p=0.25). Higher exposure to calcineurin inhibitors within the first month after LT (mean tacrolimus trough concentrations >10 ng/mL or cyclosporine trough concentrations >300 ng/mL), but not thereafter, was associated with an increased risk of HCC recurrence (27.7% vs 14.7% at 5 years; p=0.007). The independent predictors of HCC recurrence by multivariate analysis were: high exposure to calcineurin inhibitors as defined above (RR=2.82; p=0.005), diameter of the largest nodule (RR=1.31; p<0.001), microvascular invasion (RR=2.98; p=0.003) and macrovascular invasion (RR=4.57; p=0.003). Immunosuppression protocols with early CNI minimization should be preferred in LT patients with HCC in order to reduce tumour recurrence.
    Journal of Hepatology 07/2013; · 9.86 Impact Factor
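The high-exposure definition in the abstract above reduces to a simple trough-level check. This is an illustrative sketch of the reported cutoffs (mean first-month trough >10 ng/mL for tacrolimus, >300 ng/mL for cyclosporine); the function name is hypothetical and this is not clinical guidance.

```python
def high_cni_exposure(drug: str, mean_trough_ng_ml: float) -> bool:
    """Flag high calcineurin-inhibitor exposure in the first month
    after LT, per the trough cutoffs reported in the abstract above.
    Hypothetical helper for illustration only."""
    thresholds = {"tacrolimus": 10.0, "cyclosporine": 300.0}
    return mean_trough_ng_ml > thresholds[drug]

print(high_cni_exposure("tacrolimus", 12.5))    # True: above 10 ng/mL
print(high_cni_exposure("cyclosporine", 250.0)) # False: below 300 ng/mL
```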
  • ABSTRACT: Hepatitis C virus (HCV) is a major health problem that leads to chronic hepatitis, cirrhosis and hepatocellular carcinoma, and is the most frequent indication for liver transplantation in several countries. Unfortunately, HCV re-infects the liver graft almost invariably following reperfusion, with an accelerated natural history of recurrence: 10%-30% of patients progress to cirrhosis within 5 years of transplantation. Some groups have even advocated against re-transplanting these patients, as lower patient and graft survival rates have been reported. However, the management of HCV recurrence is being optimized, and several strategies to reduce post-transplant recurrence could improve outcomes, decrease the rate of re-transplantation and optimize the use of available grafts. Three moments may be the focus of potential actions to decrease the impact of viral recurrence: the pre-transplant period, the transplant itself and post-transplant management. In the pre-transplant setting, it is not well established whether reducing the pre-transplant viral load affects the risk of HCV progression after transplant. Antiviral treatment can render the patient HCV RNA-negative post-transplant, but the long-term benefit has not yet been established well enough to justify the cost and clinical risk. At the time of transplant, factors such as donor age, cold ischemia time, graft steatosis and ischemia/reperfusion injury may lead to a higher and more aggressive viral recurrence. After the transplant, the choice of immunosuppression and the timing of treatment (prophylactic, pre-emptive or once recurrence is confirmed), together with new antiviral drugs, are of interest. This review aims to give clinicians a global overview of post-transplant HCV recurrence and of strategies to reduce its impact on our patients.
    World Journal of Hepatology 05/2013; 5(5):237-250.
  • ABSTRACT: OBJECTIVE: The optimal allocation of organs in liver transplantation is a problem that can be addressed using machine-learning techniques. Classical methods of allocation assign an organ to the first patient on the waiting list without taking into account the characteristics of the donor and/or recipient. In this study, characteristics of the donor, recipient and transplant were used to determine graft survival. We used a dataset of liver transplants collected by eleven Spanish hospitals that records patient survival three months after the operation. METHODS AND MATERIAL: To address the problem of organ allocation, the memetic Pareto evolutionary non-dominated sorting genetic algorithm 2 (MPENSGA2), a multi-objective evolutionary algorithm, was used to train radial basis function neural networks, with accuracy and minimum sensitivity as the measures used to evaluate model performance. The neural network models obtained from the Pareto fronts were used to develop a rule-based system to help medical experts allocate organs. RESULTS: The models obtained with the MPENSGA2 algorithm generally yielded competitive results for all performance metrics considered in this work, namely the correct classification rate (C), minimum sensitivity (MS), area under the receiver operating characteristic curve (AUC), root mean squared error (RMSE) and Cohen's kappa (Kappa). In general, the multi-objective evolutionary algorithm performed better than the mono-objective algorithm, especially with regard to the MS extreme of the Pareto front, which yielded the best values of MS (48.98) and AUC (0.5659). The rule-based system efficiently complements the current allocation system (model for end-stage liver disease, MELD) based on the principles of efficiency and equity. This complementary effect occurred in 55% of the cases used in the simulation.
The proposed rule-based system minimises the prediction probability error produced by two sets of models (one of them formed by models guided by one of the objectives (entropy) and the other composed of models guided by the other objective (MS)), such that it maximises the probability of success in liver transplants, with success based on graft survival three months post-transplant. CONCLUSION: The proposed rule-based system is objective, because it does not involve medical experts (the expert's decision may be biased by several factors, such as his/her state of mind or familiarity with the patient). This system is a useful tool that aids medical experts in the allocation of organs; however, the final allocation decision must be made by an expert.
    Artificial Intelligence in Medicine 03/2013; · 1.65 Impact Factor
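The two objectives driving the models above, correct classification rate (C) and minimum sensitivity (MS), can be computed directly from predicted labels. A minimal sketch assuming binary survival labels; the function names are illustrative, not taken from the paper:

```python
import numpy as np

def correct_classification_rate(y_true, y_pred):
    """C: overall fraction of correctly classified cases."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(y_true == y_pred))

def minimum_sensitivity(y_true, y_pred):
    """MS: the worst per-class recall, so that optimising C alone
    cannot silently sacrifice a minority class."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    recalls = []
    for cls in np.unique(y_true):
        mask = y_true == cls
        recalls.append(float(np.mean(y_pred[mask] == cls)))
    return min(recalls)

# Toy labels: 1 = graft survival at three months, 0 = graft loss
y_true = [1, 1, 1, 1, 0, 0]
y_pred = [1, 1, 1, 1, 1, 0]
print(correct_classification_rate(y_true, y_pred))  # ≈ 0.833 (5 of 6)
print(minimum_sensitivity(y_true, y_pred))          # 0.5 (class 0 recall)
```

A classifier can score a high C while its MS collapses, which is why the multi-objective search treats them as separate extremes of the Pareto front.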
  • Javier Briceño
    Journal of Hepatology 10/2012; · 9.86 Impact Factor
  • ABSTRACT: This paper reports on a decision support system for assigning a liver from a donor to a recipient on a waiting list that maximises the probability of belonging to the graft-survival class one year after transplant and/or minimises the probability of belonging to the non-survival class, in a two-objective framework. This is done with two neural network classification models obtained from the Pareto front built by a multi-objective evolutionary algorithm called MPENSGA2. This type of neural network is a new model of generalised radial basis functions for obtaining optimal values of C (Correctly Classified Rate) and MS (Minimum Sensitivity) in the classifier, and is compared to other competitive classifiers. The decision support system was designed, as simply as possible, around those models which lead to a correct decision about recipient choice based on efficient and impartial criteria.
    European Journal of Operational Research 10/2012; 222(2):317–327. · 2.04 Impact Factor
  • ABSTRACT: The shortage of organs for transplantation has led to renewed interest in donation after circulatory determination of death (DCDD). We conducted a retrospective analysis (2001-2009) and a subsequent prospective validation (2010) of liver Maastricht category 3 DCDD and donation-after-brain-death (DBD) offers to our program. Accepted and declined offers were compared. Accepted DCDD offers were divided into donors who went on to cardiac arrest and those who did not. Donors who arrested were divided into those producing grafts that were transplanted and those whose grafts remained unused. Descriptive comparisons and regression analyses were performed to build predictive models of donor cardiac arrest and graft utilization. Variables from the multivariate analysis were prospectively validated. Of 1579 DCDD offers, 621 were accepted, and of these, 400 experienced cardiac arrest after withdrawal of support. Of these, 173 livers were transplanted. In the DCDD group, donor age <40 years, use of inotropes and absence of gag/cough reflexes were predictors of cardiac arrest. Donor age >50 years, BMI >30, warm ischemia time >25 minutes, ITU stay >7 days and ALT ≥4× the normal rate were risk factors for not using the graft. These variables had excellent sensitivity and specificity for the prediction of cardiac arrest (AUROC = 0.835) and graft use (AUROC = 0.748) in the 2010 prospective validation. These models can feasibly predict cardiac arrest in potential DCDDs and graft usability, helping to avoid unnecessary recoveries and healthcare expenditure.
    American Journal of Transplantation 09/2012; · 6.19 Impact Factor
  • ABSTRACT: OBJECTIVE: To identify peri-transplant predictors of early graft survival and post-transplant parameters that predict early graft outcome after paediatric liver transplantation (LT). BACKGROUND: Children respond poorly to liver dysfunction after LT, yet no data have been reported on early predictors of poor graft survival that would be of potential value in rescuing children at risk after LT. METHODS: A retrospective cohort study of 422 paediatric LT performed from 2000-2010 in a single centre was conducted. Multiple peri-transplant variables were analyzed. Univariate and multivariate analyses with ROC curves were performed to identify predictors of early (30-, 60-, and 90-day) graft loss (EGL). The number needed to treat (NNT) was calculated when risk factors were identified. Comparisons with the Olthoff criteria for early graft dysfunction in adults were performed. RESULTS: Overall 30-, 60-, and 90-day graft survival was 93.6%, 92.6% and 90.7%, respectively. Recipient age (0-2 and 6-16 years), acute liver failure (ALF) and post-transplant day-7 serum bilirubin >200 µmol/L were risk factors for graft loss in the three-strata Cox models. The product of peak AST, day-2 INR and day-7 bilirubin (30-, 60-, and 90-day AUROCs of 0.77, 0.75 and 0.71) and day-7 bilirubin >200 µmol/L (30-, 60-, and 90-day AUROCs of 0.75, 0.66 and 0.63) predicted EGL with excellent accuracy in the paediatric population (sensitivity = 72.7%; specificity = 96.6%; positive predictive value = 95.5%; negative predictive value = 78%). The NNT with early re-transplantation when day-7 bilirubin is >200 µmol/L would be 2.17 (non-adjusted) and 2.76 (adjusted for graft survival). CONCLUSIONS: Two scores (peak AST × day-2 INR × day-7 bilirubin, and post-transplant day-7 bilirubin >200 µmol/L) have been identified as clinically valuable tools with high accuracy to predict EGL.
A more aggressive attitude to considering early re-transplantation in this group may further improve survival after LT.
    Liver Transplantation 08/2012; · 3.94 Impact Factor
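The two scores from the abstract above can be written out directly. This sketch only reproduces what the abstract states: the composite product and the 200 µmol/L day-7 bilirubin cutoff; the abstract does not give a discrimination cutoff for the product score, so only its raw value is returned, and the function name and units are illustrative.

```python
def egl_risk_flags(peak_ast_iu_l: float, day2_inr: float,
                   day7_bilirubin_umol_l: float):
    """Return the composite score (peak AST × day-2 INR × day-7
    bilirubin) and the day-7 bilirubin >200 µmol/L flag described in
    the abstract above. Illustrative only, not a clinical tool."""
    product_score = peak_ast_iu_l * day2_inr * day7_bilirubin_umol_l
    high_day7_bilirubin = day7_bilirubin_umol_l > 200  # µmol/L cutoff
    return product_score, high_day7_bilirubin

score, flag = egl_risk_flags(peak_ast_iu_l=1500,
                             day2_inr=2.1,
                             day7_bilirubin_umol_l=250)
print(score, flag)  # 787500.0 True
```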
  • ABSTRACT: BACKGROUND: Orthotopic liver transplantation (OLT) is currently the treatment of choice for advanced liver cirrhosis and acute liver failure. Ischemia/reperfusion damage may jeopardize graft function during the postoperative period. Cardiotrophin-1 (CT-1) has demonstrated cytoprotective properties in different experimental models of liver injury, but its potential use in the prevention of the ischemia/reperfusion injury that occurs during OLT has not been demonstrated. The present study is the first report to show that the administration of CT-1 to donors benefits the outcome of OLT. MATERIALS AND METHODS: We tested the cytoprotective effect of CT-1 administered to the donor prior to OLT in an experimental pig model. Hemodynamic changes, hepatic histology, cell death parameters, activation of cell signaling pathways, oxidative and nitrosative stress, and animal survival were analyzed. RESULTS: Our data showed that CT-1 administration to donors increased animal survival, improved cardiac and respiratory functions, and reduced hepatocellular injury as well as oxidative and nitrosative stress. These beneficial effects, related to the activation of AKT, ERK, and STAT3, reduced caspase-3 activity and diminished IL-1β and TNF-α expression together with IL-6 upregulation in liver tissue. CONCLUSIONS: The administration of CT-1 to donors reduced ischemia/reperfusion injury and improved survival in an experimental pig model of OLT.
    Journal of Surgical Research 08/2012; · 2.02 Impact Factor
  • ABSTRACT: The continuing shortage of donors has led to the increasing use of marginal grafts. Surgical techniques such as split, domino, and living donation have not been able to decrease waiting-list mortality. Donation after cardiac death (DCD) was the only source of grafts prior to the establishment of brain death criteria in 1968; thereafter, donation after brain death emerged as the leading source of grafts. The context in which irreversible cessation of circulatory and respiratory functions occurs was the cornerstone for defining the four categories of DCD at the First International Workshop on DCD, held in Maastricht in 1995. The controlled (CDCD) and uncontrolled (UDCD) categories now account for 10%-20% of the donor pool in several countries. Despite initially high rates of primary nonfunction (PNF) and ischemic-type biliary lesions (ITBL), refinements in protocols and surgical techniques have led to excellent 1- and 3-year graft survival rates of 80% and 70%, respectively, with PNF and ITBL rates below 3%. The institution of UDCD and CDCD depends on legal considerations of presumed consent and withdrawal of support, respectively. The potential of DCD programs is huge; they may be the only real, effective way to increase the graft pool in both adult and pediatric populations. Recent advances in perfusion machines will surely optimize this donor pool and allow new therapies for graft resuscitation.
    Transplantation Proceedings 07/2012; 44(6):1470-4. · 0.95 Impact Factor
  • ABSTRACT: Donor-recipient matching constitutes a complex scenario that is difficult to model. The risk of subjectivity and the likelihood of falling into error must not be underestimated. Computational tools for the decision-making process in liver transplantation can be useful, despite the inherent complexity involved. Therefore, a multi-objective evolutionary algorithm and various techniques for selecting individuals from the Pareto front are used in this paper to obtain artificial neural network models to aid decision making. Moreover, a combination of two pre-processing methods, a resampling method and an outlier deletion method, has been applied to the dataset to offset the existing class imbalance. The best model obtained with these procedures (AUC = 0.66) gives medical experts a probability of graft survival at three months after the operation. This probability can help medical experts reach the best possible decision without forgetting the principles of fairness, efficiency and equity.
    Soft Computing 02/2012; · 1.30 Impact Factor
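The two pre-processing steps mentioned above (resampling and outlier deletion) can be sketched generically. This is not the paper's exact procedure, merely one common instantiation, assuming z-score outlier removal and simple random oversampling of the minority class:

```python
import numpy as np

rng = np.random.default_rng(0)

def remove_outliers(X, y, z_thresh=3.0):
    """Drop rows with any feature beyond z_thresh standard deviations."""
    z = np.abs((X - X.mean(axis=0)) / X.std(axis=0))
    keep = (z < z_thresh).all(axis=1)
    return X[keep], y[keep]

def random_oversample(X, y):
    """Duplicate minority-class rows at random until classes balance."""
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    parts_X, parts_y = [], []
    for cls, n in zip(classes, counts):
        idx = np.where(y == cls)[0]
        extra = rng.choice(idx, size=n_max - n, replace=True)
        sel = np.concatenate([idx, extra])
        parts_X.append(X[sel])
        parts_y.append(y[sel])
    return np.vstack(parts_X), np.concatenate(parts_y)

# Toy imbalanced data with one gross outlier (row with 9.0)
X = np.array([[0.1], [0.2], [0.3], [0.2], [9.0], [0.25], [0.15]])
y = np.array([0, 0, 0, 0, 0, 1, 1])
X2, y2 = remove_outliers(X, y, z_thresh=2.0)   # outlier row dropped
Xb, yb = random_oversample(X2, y2)
print(Xb.shape, np.bincount(yb))  # (8, 1) [4 4]
```

Cleaning before resampling matters here: duplicating rows first would also duplicate any outliers.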
  • ABSTRACT: This paper proposes a novel algorithm for ordinal classification based on combining ensemble techniques and discriminant analysis. The proposal is applied to a real application in liver transplantation, where the objective is to predict survival rates of the graft. Ordinal classification is used for this problem because the classes are defined by the following temporal order: 1) failure of the graft within the first 15 days after transplantation; 2) failure between 15 days and 3 months; 3) failure between 3 months and one year; and 4) no failure (taking into account that patient follow-up extends to one year after transplantation). When compared to other state-of-the-art classifiers such as AdaBoost, EBC(SVM) or KDLOR, the proposed algorithm is shown to be competitive. The models obtained could allow medical experts to predict survival rates without knowing exactly the number of days the transplanted organ survived.
    Neural Networks (IJCNN), The 2012 International Joint Conference on; 01/2012
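The four ordered failure classes above lend themselves to an ordinal decomposition. As one common approach (not necessarily the one used in the paper), the Frank–Hall scheme turns K ordered classes into K−1 binary "is the class greater than k" targets:

```python
import numpy as np

def ordinal_targets(y, n_classes=4):
    """Frank–Hall decomposition: class label k in {1..K} becomes the
    K-1 binary targets [y > 1, y > 2, ..., y > K-1], each of which a
    plain binary classifier can learn."""
    y = np.asarray(y)
    return np.stack([(y > k).astype(int) for k in range(1, n_classes)],
                    axis=1)

# 1 = failure <15 days, 2 = 15 days-3 months,
# 3 = 3 months-1 year, 4 = no failure within follow-up
print(ordinal_targets([1, 2, 4]))
# [[0 0 0]
#  [1 0 0]
#  [1 1 1]]
```

Unlike one-hot encoding, this preserves the temporal ordering: misclassifying class 1 as class 2 flips one target bit, while misclassifying it as class 4 flips three.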
  • ABSTRACT: Liver resection is a feasible treatment for multiple liver diseases, but there is little evidence about the impact of age on liver regeneration. The aim was to assess the effect of age on liver regeneration in an experimental in vivo animal model of 70% partial hepatectomy. Forty young (Y) and forty old (O) male Wistar rats (n = 80) were distributed into four groups: controls (C), sham-operated (SO), hepatectomy 6 h (H6), and hepatectomy 48 h (H48). Different morphometric and biochemical factors, oxidative and nitrosative stress, lipid peroxidation, cytokine kinetics, and histopathologic tissue parameters were determined. Early postoperative mortality was higher in aged rats (P = 0.049). Morphometric determinations, the liver regeneration index, and total liver weight were favorable to young rats. Serum transaminase levels were higher in aged rats. Parameters of necrosis (measured by histopathologic injury [HI: 0-I-II-III]), regeneration (measured by bromodeoxyuridine (BrdU) incorporation) and apoptosis (determined by TdT-mediated dUTP nick end labeling, TUNEL) were well synchronized in young rats. Parameters of oxidative stress, such as reduced (GSH) and oxidized (GSSG) glutathione and lipid peroxidation (measured by hepatic malondialdehyde, MDA), were lower in young animals throughout the studied period. Nitrosative stress, measured by nitric oxide (NO) end-products, was higher in late stages in resected old rats. Pro-inflammatory cytokines (TNF-α) reached higher and earlier peaks in aged rats, while pro-regenerative cytokines (IL-6) were significantly higher in early stages in young rats and in late stages in aged rats. TGF-β levels were higher in young rats. In conclusion, liver regeneration is delayed and reduced in aged animals subjected to liver resection.
    Journal of Surgical Research 12/2011; 175(1):e1-9. · 2.02 Impact Factor
  • ABSTRACT: Rifampicin has been used for the treatment of patients with jaundice and pruritus. This study evaluated the effect of rifampicin on the expression of different detoxification systems and bile acid transporters in in vivo and in vitro experimental models of cholestasis. Rifampicin was administered to glycochenodeoxycholic acid (GCDCA)-treated human hepatocytes and to bile duct-obstructed rats. Different parameters related to cell death and the expression of phase I and II drug-metabolizing enzymes (DME) and bile acid transporters were determined. The hepatocellular injury induced by cholestasis was associated with a reduction in cytochrome P450 3A4 (CYP3A4), CYP7A1, and UDP-glucuronosyltransferase 2B4 (UGT2B4) expression, as well as an increase in the expression of the import system (Na(+)-taurocholate co-transporting polypeptide, NTCP). The beneficial properties of rifampicin were associated with increased expression of DME and of the bile acid export systems (multidrug resistance-associated protein 4, MRP4, and the bile salt export pump, BSEP), as well as reduced NTCP expression. The beneficial effect of rifampicin in cholestasis is thus associated with increased expression of DME involved in toxin, bile acid and cholesterol metabolism, together with a reduction in the bile acid import system in hepatocytes.
    Journal of Hepato-Biliary-Pancreatic Sciences 04/2011; 18(5):740-50.
  • ABSTRACT: Hepatorenal syndrome (HRS) is a complication of cirrhosis with a poor prognosis without transplantation. The aim of this study was to analyze the influence of extended-criteria donors (ECD) on the postoperative outcome of recipients with HRS. The last 498 patients were divided according to pre-transplant type 1 or 2 HRS. Sixty-six (13.25%) recipients fulfilled HRS criteria. Three-month graft survival was 84% with at-listing recipient serum creatinine of 0-0.8 mg/dL; 80% with s-creatinine 0.9-1.5 mg/dL; 79% with s-creatinine 1.6-2.5 mg/dL; and 58% with s-creatinine >2.6 mg/dL (log-rank = 18.039; p = 0.001). Recipients with HRS presented higher pre-transplant creatinine levels, lower sodium levels, more episodes of hemodialysis and ascites, and higher Model for End-Stage Liver Disease (MELD) scores. Three-month graft survival in recipients with HRS relative to ECD variables showed differences in univariate analysis according to graft steatosis (85% with absent steatosis, 0-10%; 78% with mild steatosis, 10-30%; 76% with moderate steatosis, 30-60%; and 49% with severe steatosis, >60%; log-rank = 5.146; p = 0.023). A Cox proportional hazards model revealed that graft macrosteatosis (p < 0.001; HR = 1.303 [1.24-1.33] per 30% increment) and donor age >65 yr (p = 0.089; HR = 1.622 [1.17-1.94]) were independent predictors of graft loss in recipients with HRS. In conclusion, the use of ECD in recipients with cirrhosis and HRS is a good option. However, grafts with moderate-to-severe steatosis and those from aged donors must be carefully allocated to candidates with HRS.
    Clinical Transplantation 02/2011; 25(3):E257-63. · 1.63 Impact Factor
  • ABSTRACT: In liver transplantation, matching donor and recipient is a problem that can be solved using machine-learning techniques. In this paper we consider a liver transplant dataset obtained from eleven Spanish hospitals, including patient survival or graft rejection one year after transplantation. To tackle this problem, we use a multi-objective evolutionary algorithm to train generalized radial basis function neural networks. The obtained models provide medical experts with a mathematical value to predict survival rates, allowing them to reach a sound decision according to the principles of justice, efficiency and equity.
    13th Annual Genetic and Evolutionary Computation Conference, GECCO 2011, Companion Material Proceedings, Dublin, Ireland, July 12-16, 2011; 01/2011
  • ABSTRACT: Donor-recipient matching constitutes a complex scenario that is not easily modelled. The risk of subjectivity and the likelihood of falling into error must not be underestimated. Computational tools for the decision-making process in liver transplantation can be useful, despite its inherent complexity. Therefore, a multi-objective evolutionary algorithm and various techniques for selecting individuals are used in this paper to obtain artificial neural network models to assist in making decisions. Thus, experts will have a mathematical value that enables them to reach a sound decision without neglecting the principles of justice, efficiency and equity.
    Advances in Computational Intelligence - 11th International Work-Conference on Artificial Neural Networks, IWANN 2011, Torremolinos-Málaga, Spain, June 8-10, 2011, Proceedings, Part II; 01/2011

Publication Stats

668 Citations
159.04 Total Impact Points


  • 2013
    • Universidad Católica de Córdoba
      • Departamento de Tecnología de los Alimentos
      Córdoba, Córdoba, Argentina
  • 2010–2013
    • Instituto Maimónides de Investigación Biomédica de Córdoba
      Córdoba, Andalusia, Spain
  • 2008–2013
    • Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas
      Barcelona, Catalonia, Spain
  • 1995–2013
    • Hospital Universitario Reina Sofía
      Córdoba, Andalusia, Spain