Background Although prior reports have evaluated the clinical and cost impacts of cardiovascular magnetic resonance (CMR) for low-to-intermediate-risk patients with suspected significant coronary artery disease (CAD), the cost-effectiveness of CMR compared to relevant comparators remains poorly understood. We aimed to summarize the cost-effectiveness literature on CMR for CAD and create a cost-effectiveness calculator, usable worldwide, to approximate the cost-per-quality-adjusted-life-year (QALY) of CMR and relevant comparators with context-specific patient-level and system-level inputs. Methods We searched the Tufts Cost-Effectiveness Analysis Registry and PubMed for cost-per-QALY or cost-per-life-year-saved studies of CMR to detect significant CAD. We also developed a linear regression meta-model (CMR Cost-Effectiveness Calculator), based on a larger CMR cost-effectiveness simulation model, that can approximate the lifetime discounted cost, QALYs, and cost-effectiveness of CMR compared to relevant comparators [such as single-photon emission computed tomography (SPECT), coronary computed tomography angiography (CCTA), or invasive coronary angiography]. Results CMR was cost-effective for the evaluation of significant CAD (either health improving and cost saving, or with a cost-per-QALY or cost-per-life-year result below the cost-effectiveness threshold) versus its relevant comparator in 10 of 15 studies, with 3 studies reporting uncertain cost-effectiveness and 2 studies showing CCTA was optimal. Our cost-effectiveness calculator showed that CCTA was not cost-effective in the US compared to CMR when the most recent publications on imaging performance were included in the model. Conclusions Based on current worldwide evidence in the literature, CMR usually represents a cost-effective option compared to relevant comparators to assess for significant CAD.
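The cost-effectiveness judgments summarized above all reduce to the same arithmetic: an incremental cost-effectiveness ratio (ICER) compared against a willingness-to-pay threshold, with a strategy deemed "dominant" when it is both cheaper and more effective. A minimal sketch of that decision rule in Python, using purely hypothetical cost and QALY figures (not values from the reviewed studies or the calculator):

```python
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

def is_cost_effective(cost_new, qaly_new, cost_ref, qaly_ref, threshold=100_000):
    """A new strategy is adopted if it dominates (cheaper and more effective)
    or if its ICER falls below the willingness-to-pay threshold ($/QALY)."""
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_cost <= 0 and d_qaly >= 0:   # dominant: cost saving and health improving
        return True
    if d_cost > 0 and d_qaly > 0:
        return icer(cost_new, qaly_new, cost_ref, qaly_ref) <= threshold
    return False                      # dominated or not worth the extra cost

# Hypothetical lifetime discounted values for two imaging strategies
# icer(12_000, 10.25, 10_000, 10.20) ≈ 40,000 $/QALY
```

With these illustrative inputs the ICER is roughly $40,000 per QALY, below a commonly cited US threshold of $100,000 per QALY, so the pricier strategy would count as cost-effective under this rule.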
People living with human immunodeficiency virus (PLWH) have a significantly increased risk of cardiovascular disease, in part due to inflammation and immune dysregulation. Clonal hematopoiesis of indeterminate potential (CHIP), the age-related acquisition of leukemogenic driver mutations in hematopoietic stem cells and their subsequent clonal expansion, increases risk for both hematologic malignancy and coronary artery disease (CAD). Since increased inflammation is hypothesized to be both a cause and a consequence of CHIP, we hypothesized that PLWH have a greater prevalence of CHIP. We searched for CHIP in blood DNA-derived exome sequences from multi-ethnic cases in the Swiss HIV Cohort Study (SHCS, n = 600) and controls from the Atherosclerosis Risk in Communities study (ARIC, n = 8111). We observed that HIV is associated with a twofold increase in CHIP prevalence, both in the whole study population and in a subset of 230 cases and 1002 matched controls selected by propensity matching to control for demographic imbalances (SHCS 7%, ARIC 3%, p = 0.005). We also observed that ASXL1 is the most commonly mutated CHIP-associated gene in PLWH. Our results suggest that CHIP may contribute to the excess cardiovascular risk observed in PLWH.
Background The use of stress perfusion cardiovascular magnetic resonance (CMR) imaging remains limited in patients with implantable devices. The primary goal of this study was to assess the safety, image quality, and diagnostic value of stress perfusion CMR in patients with MR-conditional transvenous permanent pacemakers (PPM) or implantable cardioverter-defibrillators (ICD). Methods Consecutive patients with a transvenous PPM or ICD referred for adenosine stress CMR were enrolled in this single-center longitudinal study. The CMR protocol was performed on a 1.5 T system according to current guidelines, with all devices programmed to MR mode. The quality of cine, late gadolinium enhancement (LGE), and stress perfusion sequences was assessed. An ischemia burden of ≥ 1.5 segments was considered significant. We assessed safety, image quality, and the occurrence of interference between the magnetic field and the implantable device. In cases of ischemia, we also assessed the correlation with the presence of significant coronary lesions on coronary angiography. Results Among 3743 perfusion CMR examinations, 66 patients (1.7%) had implantable devices. Image quality proved diagnostic in 98% of cases. No device damage or malfunction was reported either immediately or at 1 year. Fifty patients were continuously paced during CMR. Heart rate and systolic blood pressure remained unchanged during adenosine stress, while diastolic blood pressure decreased (p = 0.007). Six patients (9%) had an ischemia-positive stress CMR, and significant coronary stenoses were confirmed by coronary angiography in all cases. Conclusion Stress perfusion CMR is safe, allows reliable ischemia detection, and provides good diagnostic value.
Background Increased total tau (t-tau) in cerebrospinal fluid (CSF) is a key characteristic of Alzheimer’s disease (AD) and is considered to result from neurodegeneration. T-tau levels, however, can be increased in very early disease stages, when neurodegeneration is limited, and can be normal in advanced disease stages. This suggests that t-tau levels may also be driven by other mechanisms. Because tau pathophysiology is emerging as a treatment target for AD, we aimed to clarify the molecular processes associated with CSF t-tau levels. Methods We performed a proteomic, genomic, and imaging study in 1380 individuals with AD in the preclinical, prodromal, and mild dementia stages, and 380 controls, from the Alzheimer’s Disease Neuroimaging Initiative and the EMIF-AD Multimodality Biomarker Discovery study. Results We found that, relative to controls, AD individuals with increased t-tau had increased CSF concentrations of over 400 proteins enriched for neuronal plasticity processes. In contrast, AD individuals with normal t-tau had decreased levels of these plasticity proteins and showed increased concentrations of proteins indicative of blood–brain barrier and blood-CSF barrier dysfunction, relative to controls. The distinct proteomic profiles were already present in the preclinical AD stage and persisted in the prodromal and dementia stages, implying that they reflect disease traits rather than disease states. Dysregulated plasticity proteins were associated with SUZ12 and REST signaling, suggesting aberrant gene repression. GWAS analyses contrasting AD individuals with and without increased t-tau highlighted several genes involved in the regulation of gene expression. Targeted analysis of SNP rs9877502 in GMNC, previously associated with t-tau levels, showed that it correlated with the CSF concentrations of 591 plasticity-associated proteins in individuals with AD. The number of APOE-e4 alleles, however, was not associated with the concentration of plasticity-related proteins.
Conclusions CSF t-tau levels in AD are associated with altered levels of proteins involved in neuronal plasticity and with blood–brain and blood-CSF barrier dysfunction. Future trials may need to stratify by CSF t-tau status, as AD individuals with increased versus normal t-tau are likely to respond differently to treatment, given their opposite CSF proteomic profiles.
Lumbar spine bone mineral density (BMD) and trabecular bone score (TBS) are both calculated over the L1-L4 vertebrae. This study investigated the ability of BMD and TBS, calculated from all possible adjacent combinations of L1-L4 vertebrae, to predict osteoporotic fractures. The present findings indicate that L1-L3 is an optimal combination for calculating LS-BMD or TBS. Introduction Lumbar spine (LS) BMD and TBS are both assessed on LS DXA scans in the same region of interest, L1-L4. We aimed to investigate the fracture-prediction ability of all possible adjacent LS vertebrae combinations used to calculate BMD and TBS, and to evaluate whether any of these combinations performs better at osteoporotic fracture prediction than the traditional L1-L4 combination. Methods This study was embedded in the OsteoLaus cohort of women in Switzerland. LS DXA scans were performed using a Discovery A System (Hologic). Incident vertebral fractures (VFs) and major osteoporotic fractures (MOFs) were assessed from VF assessments using Genant’s method or from questionnaires (non-VF MOF). We ran logistic models using TBS and BMD to predict MOF, VF, and non-VF MOF, combining different adjustment factors (age, fracture level, or BMD). Results One thousand six hundred thirty-two women (mean ± SD age 64.4 ± 7.5 years, BMI 25.9 ± 4.5 kg/m²) were followed for 4.4 years, and 133 experienced a MOF. A one-SD decrease in BMD was associated with the following odds ratios (ORs) of MOF: L1-L3, OR 1.32 (95% CI 1.15–1.53); L2-L4, OR 1.25 (95% CI 1.09–1.42); and L1-L4, OR 1.30 (95% CI 1.14–1.48). A one-SD decrease in L1-L3 TBS was more strongly associated with the odds of having a MOF (OR 1.64, 95% CI 1.34–2.00) than a one-SD decrease in L2-L4 TBS (OR 1.48, 95% CI 1.21–1.81) or in L1-L4 TBS (OR 1.60, 95% CI 1.32–1.95). Conclusion The current findings indicate that L1-L3 is an optimal combination for the calculation of TBS or LS-BMD.
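The per-SD odds ratios above are exponentiated logistic-regression coefficients. A small illustration of that conversion, with a hypothetical coefficient and standard error chosen only to show the arithmetic (not the study's fitted values):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (here, per 1-SD change in the
    predictor) into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),          # point estimate
            math.exp(beta - z * se), # lower bound
            math.exp(beta + z * se)) # upper bound

# Hypothetical: beta = 0.28 per SD decrease, SE = 0.07
or_, lo, hi = odds_ratio_ci(0.28, 0.07)
# yields an OR of about 1.32 (95% CI ~1.15-1.52), i.e. the same order of
# magnitude as the BMD associations reported above
```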
Background Clarithromycin may act as an immune-regulating treatment in sepsis and acute respiratory distress syndrome. However, clinical evidence remains inconclusive. We aimed to evaluate whether clarithromycin improves 28-day mortality among patients with sepsis and respiratory and multiple organ dysfunction syndrome. Methods We conducted a multicenter, randomized clinical trial in patients with sepsis. Participants with a ratio of partial pressure of oxygen to fraction of inspired oxygen less than 200 and more than 3 SOFA points from systems other than respiratory function were enrolled between December 2017 and September 2019. Patients were randomized to receive 1 g of clarithromycin or placebo intravenously once daily for 4 consecutive days. The primary endpoint was 28-day all-cause mortality. Secondary outcomes were 90-day mortality; sepsis response (defined as at least a 25% decrease in SOFA score by day 7); sepsis recurrence; and differences in peripheral blood cell populations and leukocyte transcriptomics. Results Fifty-five patients were allocated to each arm. By day 28, 27 (49.1%) patients in the clarithromycin group and 25 (45.5%) in the placebo group had died (risk difference 3.6% [95% confidence interval (CI) − 15.7 to 22.7]; P = 0.703; adjusted OR 1.03 [95% CI 0.35–3.06]; P = 0.959). There were no statistical differences in 90-day mortality or sepsis response. Clarithromycin was associated with a lower incidence of sepsis recurrence (OR 0.21 [95% CI 0.06–0.68]; P = 0.012); a significant increase in monocyte HLA-DR expression; expansion of non-classical monocytes; and upregulation of genes involved in cholesterol homeostasis. Serious and non-serious adverse events were equally distributed. Conclusions Clarithromycin did not reduce mortality among patients with sepsis with respiratory and multiple organ dysfunction. Clarithromycin was associated with lower sepsis recurrence, possibly through a mechanism of immune restoration.
Clinical trial registration clinicaltrials.gov identifier NCT03345992 registered 17 November 2017; EudraCT 2017-001056-55.
The 5A score predicts in-hospital mortality of patients suffering from accidental hypothermia, including those not in cardiac arrest. The HOPE score was specifically developed to predict survival in the subgroup of hypothermic patients in cardiac arrest considered for extracorporeal life support rewarming. The C-statistic in the external validation study of the HOPE score was 0.825 (95% CI 0.753–0.897), confirming its excellent discrimination. In addition, its good calibration allows a reliable interpretation of the corresponding survival probability after rewarming. The HOPE score should therefore be used for predicting outcome and selecting hypothermic patients in cardiac arrest for rewarming.
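The C-statistic reported for the HOPE validation is a rank-based measure of discrimination: the probability that, in a random survivor/non-survivor pair, the survivor received the higher predicted survival probability. A self-contained sketch of that computation with made-up predictions (not HOPE data):

```python
def c_statistic(predictions, outcomes):
    """Concordance statistic: over all (survivor, non-survivor) pairs, the
    fraction in which the survivor got the higher predicted survival
    probability. Ties count as 0.5. outcomes: 1 = survived, 0 = died."""
    pos = [p for p, y in zip(predictions, outcomes) if y == 1]
    neg = [p for p, y in zip(predictions, outcomes) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one patient in each outcome class")
    concordant = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos for n in neg
    )
    return concordant / (len(pos) * len(neg))

# Hypothetical predicted survival probabilities and observed survival
preds = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
obs   = [1,   1,   0,   1,   0,   0]
# 8 of the 9 survivor/non-survivor pairs are concordant -> C ≈ 0.89
```

A C-statistic of 0.5 corresponds to chance-level discrimination, 1.0 to perfect separation; values above roughly 0.8, such as the 0.825 reported for HOPE, are conventionally described as excellent.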
Background Bioimpedance vector analysis (BIVA) has been suggested as a valuable tool for assessing volume status in critically ill patients. However, its effectiveness in guiding fluid removal during continuous renal replacement therapy (CRRT) has not been evaluated. Methods In this randomized controlled trial, 65 critically ill patients receiving CRRT were allocated in a 1:1 ratio to have ultrafiltration (UF) prescribed and adjusted using BIVA fluid assessment in the intervention group (32 patients) or using conventional clinical parameters (33 patients). The primary outcome was the lean body mass (LBM) water content at CRRT discontinuation; secondary outcomes included the mortality rate, urinary output, duration of ventilation support, and length of ICU stay. Results The intervention group had a lower LBM water content (80.7 ± 9.4 vs. 85.9 ± 10.4%; p < 0.05) and a higher mean UF rate and urinary output (1.5 ± 0.8 vs. 1.2 ± 0.5 ml/kg/h and 0.9 ± 0.9 vs. 0.5 ± 0.6 ml/kg/h, both p < 0.05). The mortality rate, length of ICU stay, and duration of ventilation support were similar. Conclusion BIVA-guided UF prescription may be associated with a lower rate of fluid overload. Larger studies are required to evaluate its impact on patient outcomes.
The current study expands on and integrates previous theoretical models concerning the pathways that link child maltreatment to substance use disorder. The proposed model, based on the self-medication hypothesis, suggests that experiences of neglect and abuse during childhood can lead to substance use and abuse both directly and indirectly, via dissociation resulting from failed attempts to integrate experiences of maltreatment in childhood. The model was tested on ten substances (painkillers, stimulants, sedatives, marijuana, cocaine, ecstasy, hallucinogens, heroin, inhalants, and methamphetamine) via structural equation modeling (SEM) in a sample comprising 1040 community-dwelling adults (67% women) aged between 18 and 78 (M = 29.55, SD = 11.37). Fit indexes of the SEM were good, thus supporting the hypothesized model. Specific forms of child maltreatment were related to increased use of specific substances; however, experience of childhood neglect (both physical and emotional) was found to have a central role in predicting use of most substances. Although no single pathway can fully explain the origins of substance abuse, the current study provides evidence of a critical developmental pathway to it, with implications for theory and clinical practice.
Background Cardiometabolic dysfunction is common in young people with psychosis. Recently, the Psychosis Metabolic Risk Calculator (PsyMetRiC) was developed and externally validated in the UK, predicting up to 6-year risk of metabolic syndrome (MetS) from routinely collected data. The full model includes age, sex, ethnicity, body mass index, smoking status, prescription of metabolically active antipsychotic medication, and high-density lipoprotein and triglyceride concentrations; the partial model excludes the biochemical predictors. Methods To move toward a future internationally useful tool, we externally validated PsyMetRiC in two independent European samples. We used data from the PsyMetab (Lausanne, Switzerland) and PAFIP (Cantabria, Spain) cohorts, including participants aged 16–35y without MetS at baseline who had 1–6y of follow-up. Predictive performance was assessed primarily via discrimination (C-statistic), calibration (calibration plots), and decision curve analysis. Site-specific recalibration was considered. Findings We included 1024 participants (PsyMetab: n=558, male=62%, outcome prevalence=19%, mean follow-up=2.48y; PAFIP: n=466, male=65%, outcome prevalence=14%, mean follow-up=2.59y). Discrimination was better for the full model than for the partial model (PsyMetab: full model C=0.73, 95% C.I. 0.68–0.79, partial model C=0.68, 95% C.I. 0.62–0.74; PAFIP: full model C=0.72, 95% C.I. 0.66–0.78, partial model C=0.66, 95% C.I. 0.60–0.71). As expected, calibration plots revealed varying degrees of miscalibration, which recovered following site-specific recalibration. PsyMetRiC showed net benefit in both new cohorts, more so after recalibration. Interpretation The study provides evidence of PsyMetRiC's generalizability in Western Europe, although further local and international validation studies are required.
In future, PsyMetRiC could help clinicians internationally to identify young people with psychosis who are at higher cardiometabolic risk, so interventions can be directed effectively to reduce long-term morbidity and mortality. Funding NIHR Cambridge Biomedical Research Centre (BRC-1215-20014); The Wellcome Trust (201486/Z/16/Z); Swiss National Research Foundation (320030-120686, 324730- 144064, and 320030-173211); The Carlos III Health Institute (CM20/00015, FIS00/3095, PI020499, PI050427, and PI060507); IDIVAL (INT/A21/10 and INT/A20/04); The Andalusian Regional Government (A1-0055-2020 and A1-0005-2021); SENY Fundacion Research (2005-0308007); Fundacion Marques de Valdecilla (A/02/07, API07/011); Ministry of Economy and Competitiveness and the European Fund for Regional Development (SAF2016-76046-R and SAF2013-46292-R). For the Spanish and French translation of the abstract see Supplementary Materials section.
Background Patients with type 2 diabetes and obesity have chronic activation of the innate immune system, possibly contributing to the higher risk of hyperinflammatory response to SARS-CoV-2 and severe COVID-19 observed in this population. We tested whether interleukin-1β (IL-1β) blockade using canakinumab improves clinical outcome. Methods CanCovDia was a multicenter, randomised, double-blind, placebo-controlled trial assessing the efficacy of canakinumab plus standard-of-care compared with placebo plus standard-of-care in patients with type 2 diabetes and a BMI > 25 kg/m² hospitalised with SARS-CoV-2 infection in seven tertiary hospitals in Switzerland. Patients were randomly assigned 1:1 to a single intravenous dose of canakinumab (body-weight-adapted dose of 450–750 mg) or placebo. Canakinumab and placebo were compared using an unmatched win-ratio approach based on length of survival, ventilation, ICU stay, and hospitalization at day 29. This study is registered with ClinicalTrials.gov, NCT04510493. Findings Between October 17, 2020, and May 12, 2021, 116 patients were randomly assigned, 58 to each group. One participant in each group dropped out of the primary analysis. At the time of randomization, 85 patients (74·6%) were treated with dexamethasone. The win ratio of canakinumab vs placebo was 1·08 (95% CI 0·69–1·69; p = 0·72). Over four weeks, in the canakinumab vs placebo group, 4 (7·0%) vs 7 (12·3%) participants died, 11 (20·0%) vs 16 (28·1%) were admitted to the ICU, and 12 (23·5%) vs 11 (21·6%) were hospitalised for more than 3 weeks, respectively. Median ventilation time at four weeks in the canakinumab vs placebo group was 10 days [IQR 6·0–16·5] and 16 days [IQR 14·0–23·0], respectively. There was no statistically significant difference in HbA1c after four weeks despite a lower number of anti-diabetes drugs administered in patients treated with canakinumab. Finally, high-sensitivity CRP and IL-6 were lowered by canakinumab.
Serious adverse events were reported in 13 patients (11·4%) in each group. Interpretation In patients with type 2 diabetes who were hospitalised with COVID-19, treatment with canakinumab in addition to standard-of-care did not result in a statistically significant improvement of the primary composite outcome. Patients treated with canakinumab required significantly fewer anti-diabetes drugs to achieve similar glycaemic control. Canakinumab was associated with a prolonged reduction of systemic inflammation. Funding Swiss National Science Foundation grant #198415 and University of Basel. Novartis supplied the study medication.
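The unmatched win-ratio analysis used in CanCovDia compares every canakinumab patient with every placebo patient on a hierarchy of outcomes, moving to the next tier only when a pair is tied on the current one. A simplified sketch of the idea with hypothetical patient tuples (the trial's exact tiers and tie-breaking rules may differ):

```python
def compare(a, b):
    """Hierarchical comparison of two patients. Each patient is a tuple
    (survived, ventilator_free_days, icu_free_days, hospital_free_days),
    ordered from most to least important; higher is better on every tier.
    Returns 1 if a wins, -1 if b wins, 0 if tied on all tiers."""
    for va, vb in zip(a, b):
        if va > vb:
            return 1
        if va < vb:
            return -1
    return 0

def win_ratio(treated, control):
    """Unmatched win ratio: all treated-vs-control pairwise comparisons;
    wins divided by losses (ties contribute to neither)."""
    wins = sum(1 for t in treated for c in control if compare(t, c) == 1)
    losses = sum(1 for t in treated for c in control if compare(t, c) == -1)
    if losses == 0:
        raise ZeroDivisionError("no losses: win ratio undefined")
    return wins / losses

# Hypothetical data: (survived, vent-free, ICU-free, hospital-free days)
treated = [(1, 28, 25, 20), (1, 20, 18, 10), (0, 5, 3, 0)]
control = [(1, 28, 22, 15), (0, 10, 8, 0)]
# win_ratio(treated, control) -> 1.0 (3 wins, 3 losses across the 6 pairs)
```

A win ratio of 1.0 means wins and losses balance; the trial's observed 1·08 with a confidence interval spanning 1 is consistent with its null primary result.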
Objective To investigate associations of objective and subjective indicators of sleep impairment and sleep disorders with low muscle strength (LMS) in different age groups and genders, using data from a population-based cohort study. Methods Polysomnographic and subjective sleep data from participants (aged 40–80 years) of the HypnoLaus study (Lausanne, Switzerland) were cross-sectionally analyzed. Indicators of sleep impairment and disorders were based on pre-defined cutoffs. LMS was defined according to the diagnostic criteria for sarcopenia (grip strength <27 kg for men and <16 kg for women). Multivariate logistic regression models were adjusted for confounders. Results 1902 participants (mean [SD] age, 57.4 [10.5] years; 968 [50.9%] female) were enrolled. Objectively short (<6.2 h) and long (>8.5 h) sleep durations were associated with LMS (OR = 1.74, 95% CI = 1.07–2.82; OR = 6.66, 95% CI = 3.45–12.87, respectively). Nighttime wakefulness >90 min and severe obstructive sleep apnea (OSA) (AHI > 30) were also associated with LMS (OR = 1.60, 95% CI = 1.01–2.56; OR = 2.36, 95% CI = 1.29–4.31, respectively). In adults aged over 60 years, these associations persisted, and reduced sleep efficiency was additionally associated with LMS (aOR = 1.81, 95% CI 1.05–3.13). Objective long sleep duration was associated with LMS in both genders, and severe OSA predicted LMS among women (aOR = 2.64, 95% CI 1.11–6.24). Conclusions Markers of early sarcopenia are affected by long sleep duration from middle age onwards in both genders. Older adults are more susceptible to the effects of other indicators of inappropriate sleep duration and quality. The findings support a potential role of sarcopenia in age-related OSA. The intricate relationships between sleep and muscle health are potential targets of public health interventions and of clinical research on preventive and therapeutic strategies against the increasing morbidity and mortality observed with ageing.
Background Lung cancer is the second most common cancer and the leading cause of cancer mortality worldwide. Recent advances in molecular testing and targeted therapy have improved survival among patients with metastatic non-small-cell lung cancer (NSCLC). We sought to quantify and describe molecular testing among metastatic non-squamous NSCLC cases in selected Southeast Asian countries and to describe the first-line therapy chosen. Patients and methods A retrospective study was conducted based on incident lung cancer cases diagnosed between 2017 and 2019 in Lampang (Thailand), Penang (Malaysia), Singapore, and Yogyakarta (Indonesia). Cases (n = 3413) were defined using the International Classification of Diseases for Oncology, third edition. In Singapore, a clinical series obtained from the National Cancer Centre was used to identify patients, while corresponding population-based cancer registries were used elsewhere. Tumor and clinical information were abstracted by chart review according to a predefined study protocol. Molecular testing of epidermal growth factor receptor (EGFR), anaplastic lymphoma kinase (ALK) gene rearrangement, ROS1 gene rearrangement, and BRAF V600 mutation was recorded. Results Among 2962 cases with a specified pathological diagnosis (86.8%), most patients had non-squamous NSCLC (75.8%). For cases with staging information (92.1%), the majority presented with metastatic disease (71.3%). Overall, molecular testing rates in the 1528 patients with stage IV non-squamous NSCLC were 67.0% for EGFR, 42.3% for ALK, 39.1% for ROS1, 7.8% for BRAF, and 36.1% for PD-L1. Among these patients, first-line systemic treatment included chemotherapy (25.9%), targeted therapy (35.6%), and immunotherapy (5.9%), with 31% of patients having no record of antitumor treatment. Molecular testing and the proportion of patients receiving treatment were highly heterogeneous between the regions.
Conclusions This first analysis of data from a clinically annotated registry for lung cancer from four settings in Southeast Asia has demonstrated the feasibility of integrating clinical data within population-based cancer registries. Our study results identify areas where further development could improve patient access to optimal treatment.
Background Lurbinectedin, a selective inhibitor of oncogenic transcription, has shown preclinical antitumor activity against homologous recombination repair-deficient models and preliminary clinical activity in BRCA1/2-mutated breast cancer. Patients and methods This phase II multitumor basket trial (NCT02454972) evaluated lurbinectedin 3.2 mg/m² as a 1-h intravenous infusion every 3 weeks in a cohort of 21 patients with pretreated germline BRCA1/2-mutated breast cancer. Patients with any hormone receptor and human epidermal growth factor receptor 2 status were enrolled. The primary efficacy endpoint was overall response rate (ORR) according to RECIST v1.1. Secondary endpoints included duration of response (DoR), progression-free survival (PFS), overall survival (OS), and safety. Results Confirmed partial response (PR) was observed in six patients [ORR = 28.6%; 95% confidence interval (CI) 11.3% to 52.2%], who had received a median of two prior chemotherapy lines for advanced disease. Lurbinectedin was active against both mutations: four PRs in 11 patients (36.4%) with BRCA2 and two PRs in 10 patients (20.0%) with BRCA1. Median DoR was 8.6 months, median PFS was 4.1 months, and median OS was 16.1 months. Stable disease (SD) was observed in 10 patients (47.6%), including 3 with an unconfirmed response in a subsequent tumor assessment [unconfirmed ORR = 42.9% (95% CI 21.8% to 66.0%)]. The clinical benefit rate (PR + SD ≥ 4 months) was 76.2% (95% CI 52.8% to 91.8%). No objective response was observed among patients who had received prior poly(ADP-ribose) polymerase inhibitors. The most common treatment-related adverse events (AEs) were nausea (61.9%), fatigue (38.1%), and vomiting (23.8%); these were mostly grade 1/2. The most common grade 3/4 toxicity was neutropenia (42.9%; grade 4, 23.8%), with no febrile neutropenia. Conclusions This phase II study met its primary endpoint and showed activity of lurbinectedin in germline BRCA1/2-mutated breast cancer.
Lurbinectedin showed a predictable and manageable safety profile. Considering the exploratory aim of this trial as well as previous results in other phase II studies, further development of lurbinectedin in this indication is warranted.
Aims Hospitalization for heart failure treatment (HHF) is a pivotal event in the course of HF. Today, the large majority of HHF patients are aged ≥ 65 years, and discharge HF drugs are most often not prescribed at the dose levels acknowledged to provide prognostic benefit. This study therefore aimed to investigate the treatment effect size of discharge HF drugs in elderly HHF patients. Methods Drugs were analyzed according to pharmacological class. The individual discharge HF drug dose is reported as a percentage of the guideline-recommended target dose. The primary endpoint was 1-year all-cause mortality (ACM) after discharge; the secondary endpoint combined 1-year ACM and first cardiovascular hospitalization within 1 year after discharge. Comparison between the 65–80-year-old and > 80-year-old study participants tested the relative treatment effect size as a function of age group. Results The 875 consecutive HHF patients had a median age of 82 years [76–87 years]; 48.6% were female. Beta-blocker and diuretic treatment did not change the incidence of the endpoints. Inhibition of the renin-angiotensin system (RASi), compared to no treatment, decreased the incidence of the endpoints at both the 1–25% and the > 25% target dose levels. Mineralocorticoid receptor antagonists (MRA), compared to no treatment, decreased the secondary endpoint at the 1–25% target dose level but not at the > 25% target dose level. The relative treatment effect size of RASi or MRA corresponded between the age strata for both endpoints. Conclusion Low-dose RASi and MRA had beneficial effects in these elderly HHF patients.