Lady Davis Institute for Medical Research
Recent publications
Hailey-Hailey disease is a rare, chronic, autosomal dominant skin disorder characterized by recurrent painful erosions and macerated plaques, primarily affecting intertriginous areas. It is caused by mutations in the ATP2C1 gene, leading to impaired calcium homeostasis and keratinocyte adhesion. Many patients experience poor disease control despite conventional therapies. We report a case of a female in her 60s with refractory Hailey-Hailey disease affecting the perianal, inguinal, and cervical folds, with painful, eroded plaques resistant to conventional treatments. Despite multiple failed therapies, including methotrexate, dapsone, acitretin, and naltrexone, she showed rapid improvement within 2 weeks of starting abrocitinib (100 mg daily), a JAK1 inhibitor, with sustained control at 2-month follow-up. JAK inhibitors, initially approved for inflammatory diseases such as atopic dermatitis, are emerging as promising therapies for genodermatoses. By suppressing IL-4/IL-13-driven inflammation, JAK1 inhibition may restore epithelial integrity and reduce chronic skin inflammation. This case adds to growing evidence that JAK inhibitors, particularly abrocitinib, may serve as an effective targeted therapy for refractory Hailey-Hailey disease. Further clinical trials are needed to confirm its long-term efficacy and safety.
Objective. This study aimed to determine whether the kinetics of sublethal damage recovery after x-ray irradiation, quantified as the repair half time (T_repair^SLD) derived from split-dose clonogenic survival, correlates with intrinsic radiosensitivity across four human cancer cell lines: HeLa (cervical), PC3 (prostate), and HCT116 and HT29 (colorectal). In addition, the study compared this survival-based indicator with molecular repair kinetics assessed through γH2AX and 53BP1 foci clearance. Approach. By using a phenomenological approach, we assessed sublethal damage recovery kinetics, aiming to determine whether this recovery rate could serve as a biomarker for cancer-specific intrinsic radiosensitivity. Cells were subjected to split-dose 4 Gy irradiation delivered in two fractions of 2 Gy across a 0 to 10 h inter-fraction interval range using a Multi-Rad x-ray irradiator with a peak tube voltage of 225 kV. The clonogenic assay was performed following split-dose irradiation of the experimental groups to assess cell survival. Colonies were fixed, stained, and counted (⩾50 cells/colony viable threshold) to calculate survival fractions (SFs) from the four independent experimental runs completed for each cell line. Unirradiated control cells were used to calculate plating efficiency. The measured SF as a function of inter-fraction time was fitted with the Lea–Catcheside modified linear-quadratic model with a half-life of sublethal damage repair, T_repair^SLD, as a free parameter. To compare this approach to molecular DNA repair kinetics, immunofluorescence-based ionizing radiation-induced foci (IRIF) clearance experiments were performed following single 2 Gy irradiation using the same x-ray source. γH2AX and 53BP1 foci were quantified from 0.5 to 24 h post-irradiation, and foci clearance half-lives (T_repair^γH2AX and T_repair^53BP1) were determined by single-phase exponential decay fitting. Main results. For all measured cell lines, an increase in SF was observed with increasing inter-fraction time. The estimated T_repair^SLD varied across cell lines, from 1.07±0.35 h in HT29, to 1.98±0.94 h in HeLa, 2.00±0.30 h in PC3, and 3.58±1.45 h in HCT116, indicating different capacities for sublethal damage repair. A negative correlation was measured between T_repair^SLD and clonogenic survival at 2 Gy (SF_2Gy) by performing orthogonal distance regression, with a slope of −350±50 min (p = 0.02). T_repair^γH2AX and T_repair^53BP1 ranged from 3 to 11 h, with HT29 showing the fastest foci resolution. However, these molecular repair kinetics times did not significantly correlate with SF_2Gy (p > 0.05) or follow the same trend as T_repair^SLD across cell lines. For example, PC3 cells exhibited the slowest foci clearance, whereas HCT116 displayed the slowest T_repair^SLD, suggesting that IRIF-based measurements do not reliably reflect functional sublethal damage repair. Significance. Clonogenic survival assays capture the integrated biological outcome of radiation exposure, reflecting not only DNA damage recognition and repair but also downstream processes such as checkpoint activation, chromatin context, and long-term proliferative capacity. In contrast, molecular readouts like γH2AX and 53BP1 foci clearance, though rapid and widely used, may not fully account for defects in damage response pathways. The observed discrepancy between foci clearance kinetics and survival-based repair rates in this study highlights the limitations of foci quantification as a surrogate for radiosensitivity.
These findings underscore the value of survival-based sublethal damage recovery measurements as functionally rich indicators of intrinsic radiosensitivity, which may inform future biomarker development or predictive modeling frameworks.
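To make the survival-based fitting described above concrete, the sketch below (Python with NumPy/SciPy, assumed available) fits a Lea–Catcheside-modified linear-quadratic model with mono-exponential sublethal damage repair to hypothetical split-dose survival data; the survival fractions and starting parameter values are illustrative placeholders, not data from the study.

```python
# Sketch: estimate T_repair^SLD from split-dose (2 Gy + 2 Gy) survival data.
# All numbers below are illustrative placeholders, not values from the study.
import numpy as np
from scipy.optimize import curve_fit

D_PER_FRACTION = 2.0              # Gy per fraction
TOTAL_DOSE = 2 * D_PER_FRACTION   # 4 Gy total

def split_dose_sf(dt_hours, alpha, beta, t_repair):
    """LQ survival with a Lea-Catcheside protraction factor for two equal
    acute fractions separated by dt_hours, assuming mono-exponential
    sublethal damage repair with half-life t_repair."""
    lam = np.log(2.0) / t_repair
    g = 0.5 * (1.0 + np.exp(-lam * dt_hours))   # G -> 1 at dt = 0, -> 0.5 for dt >> t_repair
    return np.exp(-(alpha * TOTAL_DOSE + beta * g * TOTAL_DOSE**2))

# Hypothetical measured survival fractions vs. inter-fraction interval (h)
dt = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
sf = np.array([0.18, 0.21, 0.24, 0.27, 0.28, 0.29, 0.29])

popt, pcov = curve_fit(split_dose_sf, dt, sf, p0=[0.2, 0.05, 2.0])
alpha_hat, beta_hat, t_repair_hat = popt
print(f"Estimated T_repair^SLD ≈ {t_repair_hat:.2f} h "
      f"(± {np.sqrt(pcov[2, 2]):.2f} h, 1 SD)")
```

The foci clearance half-lives (T_repair^γH2AX, T_repair^53BP1) could be obtained analogously by fitting a single-phase exponential decay to mean foci counts over time.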
Introduction: Approximately 20% of patients with cancer will develop cancer-associated venous thromboembolism (CAT), which is associated with significant morbidity and mortality. Despite its clinical importance, CAT awareness among cancer patients and caregivers remains low. We sought to assess patients' knowledge of CAT through a national survey. Methods: A survey assessing knowledge of different aspects of CAT was developed by a steering committee comprising 4 clinicians with expertise in CAT and a patient partner with lived experience. The survey was disseminated to patients with cancer through the Environics network, the Thrombosis Canada member network, and the Thrombosis Canada social media platforms, and was advertised on Instagram and Facebook and in the Canadian Cancer Survivor Network newsletter. Results: Of the 312 patients with cancer or survivors who responded to the survey, 178 (57.4%) were female and 118 (37.8%) were over 65 years old. Overall, 119 patients (38.1%, 95% CI: 37.7-49.8%) reported having no knowledge of CAT. Only 84 (26.9%, 95% CI: 22.1-32.2%) and 94 (30.1%, 95% CI: 25.1-35.6%) patients reported receiving education about their underlying risk of CAT or about the signs and symptoms of VTE, respectively. A total of 66 (21%, 95% CI: 16.8-26.1%) patients reported being informed by a healthcare professional about considering thromboprophylaxis. Patients were interested in learning more about the risk of CAT, its associated risk factors, and the benefits and potential side effects of thromboprophylaxis. Conclusions: Many patients with cancer lack awareness or knowledge of CAT. Our results highlight the need for ongoing education and greater awareness of the burden of CAT.
Background Physical activity has time‐varying associations with injury risk among children. While previous activity may predispose to injury through tissue damage, fatigue and insufficient recovery, it may protect against injury by strengthening tissues and improving fitness and skills. It is unclear what the relevant time window and relative importance of past activity are with regard to current injury risk in children. Objectives The objectives of this study were to assess how previous activity patterns are associated with injury risk among children. Methods Our data source was the Childhood Health, Activity, and Motor Performance School Study Denmark (CHAMPS‐DK), a prospective cohort study of Danish school children conducted between 2008 and 2014. We applied flexible weighted cumulative exposure methods within a Cox proportional hazards model to estimate the time‐varying association between the number of weekly activity sessions and time‐to‐first injury in each school year. We estimated several models with varying time windows and compared goodness‐of‐fit. Results Out of 1667 study participants, 986 (59.1%) were injured at least once, with a total of 1752 first injuries across school years. The best‐fitting model included 20 weeks of past physical activity. Higher levels of activity performed 10–20 weeks ago were associated with decreased injury risk, while higher levels of activity performed 2–9 weeks ago were associated with higher injury risks. Compared to those who remained minimally active for the entire past 20‐week period, children who were highly active in the past 10 weeks after being minimally active 11–20 weeks ago had an injury hazard ratio of 1.63 (95% confidence interval 1.18, 2.23). Conclusions Flexible weighted cumulative exposure methods suggest a complex temporal relationship between past physical activity history and injury in children.
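As a rough illustration of the weighted cumulative exposure idea used in this study, the sketch below (Python, hypothetical data) builds a WCE covariate from weekly session counts over a 20-week window; the step-function weights are invented stand-ins for the flexible spline weights estimated within the Cox model, and only mimic the reported direction of effects.

```python
# Sketch: build a weighted cumulative exposure (WCE) covariate from weekly
# activity sessions over a 20-week window. The weight function below is an
# arbitrary illustration; in the study the weights were estimated flexibly
# inside a Cox proportional hazards model.
import numpy as np

WINDOW_WEEKS = 20

def example_weights(lags):
    """Hypothetical weights: recent activity (lags 2-9 weeks) adds risk,
    older activity (lags 10-20 weeks) is protective, mirroring the reported pattern."""
    w = np.zeros_like(lags, dtype=float)
    w[(lags >= 2) & (lags <= 9)] = 0.10
    w[(lags >= 10) & (lags <= 20)] = -0.05
    return w

def wce(weekly_sessions, week_t):
    """Weighted sum of past weekly session counts at follow-up week `week_t`."""
    lags = np.arange(1, WINDOW_WEEKS + 1)
    past_weeks = week_t - lags
    valid = past_weeks >= 0
    exposure = np.array([weekly_sessions[k] for k in past_weeks[valid]])
    return float(np.sum(example_weights(lags[valid]) * exposure))

# Hypothetical child: 3 sessions/week for 20 weeks, scored at week 20
sessions = [3] * 21
print(wce(sessions, week_t=20))  # would enter the Cox model as a time-varying covariate
```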
Artificial intelligence applications in biomedicine face major challenges from data privacy requirements. To address this issue for clinically annotated tissue proteomic data, we developed a Federated Deep Learning (FDL) approach (ProCanFDL), training local models on simulated sites containing data from a pan-cancer cohort (n=1,260) and 29 cohorts held behind private firewalls (n=6,265), representing 19,930 replicate data-independent acquisition mass spectrometry (DIA-MS) runs. Local parameter updates were aggregated to build the global model, achieving a 43% performance gain on the hold-out test set (n=625) in 14 cancer subtyping tasks compared to local models, and matching centralized model performance. The approach’s generalizability was demonstrated by retraining the global model with data from two external DIA-MS cohorts (n=55) and eight acquired by tandem mass tag (TMT) proteomics (n=832). ProCanFDL presents a solution for internationally collaborative machine learning initiatives using proteomic data, e.g., for discovering predictive biomarkers or treatment targets, while maintaining data privacy.
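The exact ProCanFDL aggregation scheme is not detailed here, so the sketch below shows only a generic federated averaging step consistent with "local parameter updates were aggregated to build the global model": site parameters are averaged in proportion to local sample counts, with NumPy arrays standing in for model weights.

```python
# Sketch: federated averaging of local model parameters, weighted by the
# number of samples at each (simulated or firewalled) site. Generic FedAvg,
# not the actual ProCanFDL implementation.
import numpy as np

def federated_average(local_params, sample_counts):
    """local_params: list of dicts {layer_name: np.ndarray} from each site.
    sample_counts: list of ints, samples used to train each local model."""
    total = float(sum(sample_counts))
    global_params = {}
    for name in local_params[0]:
        global_params[name] = sum(
            (n / total) * params[name]
            for params, n in zip(local_params, sample_counts)
        )
    return global_params

# Toy example with two sites and a single weight matrix
site_a = {"dense.w": np.ones((3, 2))}
site_b = {"dense.w": np.zeros((3, 2))}
print(federated_average([site_a, site_b], sample_counts=[300, 100])["dense.w"])
# -> 0.75 everywhere: site A contributes three times the weight of site B
```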
Mutation, deletion, or silencing of genes encoding cellular metabolism factors occurs frequently in human malignancies. Neomorphic mutations in isocitrate dehydrogenases 1 and 2 (IDH1/2) promoting the production of R-2-hydroxyglutarate (R-2HG) instead of α-ketoglutarate (αKG) are recurrent in human brain cancers and constitute an early event in low-grade gliomagenesis. Due to its structural similarity with αKG, R-2HG acts as an inhibitor of αKG-dependent enzymes. These include the JUMONJI family of lysine demethylases, among which KDM4A is particularly sensitive to R-2HG-mediated inhibition. However, the precise molecular mechanism through which inhibition of αKG-dependent enzymes by R-2HG promotes gliomagenesis remains poorly understood. Here, we show that treatment with R-2HG induces cellular senescence in a p53-dependent manner. Furthermore, expression of mutated IDH1R132H or exposure to R-2HG, which leads to KDM4A inhibition, causes telomeric dysfunction. We demonstrate that KDM4A localizes to telomeric repeats and regulates abundance of H3K9(me3) at telomeres. We show that R-2HG caused reduced replication fork progression, and that depletion of SMARCAL1, a helicase involved in replication fork reversal, rescues telomeric defects caused by R-2HG or KDM4A depletion. These results establish a model whereby IDH1/2 mutations cause R-2HG-mediated inhibition of KDM4A, leading to telomeric DNA replication defects, telomere dysfunction, and associated genomic instability.
Objective Systemic sclerosis-related interstitial lung disease (SSc-ILD) is a major cause of morbidity. We aimed to identify patients following similar trajectories of forced vital capacity (FVC) decline, examine their association with mortality, and identify risk factors for FVC decline. Methods This is a multicentre retrospective study of 444 SSc patients with ILD and ≤7-year disease duration. Patients were grouped based on similar FVC decline trajectories using semi-parametric modeling with latent class analysis. Survival was compared between the worst FVC trajectory group and the others. Logistic regression models with backwards selection were applied to identify predictors of FVC trajectory using baseline disease features. Results Four FVC trajectory groups were identified. The most progressive trajectory declined by 2.18% per year, and the other 3 trajectory groups were stable or progressed slowly. The most progressive group had a higher mortality rate than those with a stable/slow FVC trajectory (hazard ratio 2.95, 95% CI 1.74, 4.98). Baseline FVC (p < 0.001) and CRP elevation (p = 0.039) were associated with the progressive trajectory. Baseline FVC ≤ 72% predicted the progressive trajectory with a sensitivity of 0.88 and specificity of 0.91. A lower baseline FVC was in turn associated with older age, Caucasian race, longer disease duration, ATA presence, and elevated CRP on exploratory analyses. Conclusion Distinct FVC trajectories are associated with different survival outcomes, and the most important predictor of a progressive FVC trajectory was existing ILD severity. More work is needed to assess the utility of imaging or paraclinical findings that can improve prediction of distinct FVC trajectories.
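For readers unfamiliar with how the operating characteristics of the baseline FVC ≤ 72% rule are obtained, the sketch below computes sensitivity and specificity for that cut-off on synthetic data; the simulated FVC values and group labels are placeholders, not the study cohort.

```python
# Sketch: sensitivity and specificity of a baseline FVC <= 72% rule for
# flagging the progressive trajectory group. Data below are synthetic;
# the study reported sensitivity 0.88 and specificity 0.91 for this cutoff.
import numpy as np

rng = np.random.default_rng(0)
progressive = rng.integers(0, 2, size=200).astype(bool)       # hypothetical group labels
baseline_fvc = np.where(progressive,
                        rng.normal(65, 8, size=200),           # lower FVC if progressive
                        rng.normal(85, 8, size=200))

predicted = baseline_fvc <= 72.0
sensitivity = np.mean(predicted[progressive])
specificity = np.mean(~predicted[~progressive])
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```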
Non-linear Mendelian randomisation (NLMR) is a relatively recently developed approach to estimate the causal effect of an exposure on an outcome where this effect is expected to be non-linear. Two commonly used techniques, both based on stratifying the exposure and performing Mendelian randomisation (MR) within each stratum, are the residual and doubly-ranked methods. The residual method is known to be biased in the presence of genetic effect heterogeneity, that is, where the effect of the genotype on the exposure varies between individuals. The doubly-ranked method is considered to be less sensitive to genetic effect heterogeneity. In this paper, we simulate genetic effect heterogeneity and confounding of the exposure and outcome and identify that both methods are susceptible to likely unpredictable bias in this setting. Using UK Biobank, we identify empirical evidence of genetic effect heterogeneity and show via simulated outcomes that this leads to biased MR estimates within strata, whilst conventional MR across the full sample remains unbiased. We suggest that these biases are highly likely to be present in other empirical NLMR analyses using these methods and urge caution in current usage. Simulated outcome analyses may represent a useful test to identify whether genetic effect heterogeneity is likely to bias NLMR estimates in future analyses.
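The sketch below is a minimal, illustrative simulation of the residual stratification method under genetic effect heterogeneity with a truly null causal effect; the data-generating model is an assumption of this example, not the paper's simulation setup, but it reproduces the qualitative point that stratum-specific Wald ratios can be biased while full-sample MR is not.

```python
# Sketch: residual-method stratified MR under genetic effect heterogeneity,
# with a truly null causal effect of exposure X on outcome Y. Stratum-specific
# Wald ratios that deviate from zero illustrate the bias discussed above.
# Purely illustrative simulation, not the paper's exact data-generating model.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
g = rng.binomial(2, 0.3, n)                 # genotype (allele count)
u = rng.normal(size=n)                      # confounder of X and Y
beta_i = 0.5 + 0.4 * u                      # genetic effect varies across individuals
x = beta_i * g + u + rng.normal(size=n)     # exposure
y = u + rng.normal(size=n)                  # outcome: NO causal effect of x

def ols_slope(a, b):
    return np.cov(a, b)[0, 1] / np.var(a)

# Residual method: strip the average genetic effect, then stratify on the residual
residual = x - ols_slope(g, x) * g
edges = np.quantile(residual, [0.0, 0.25, 0.5, 0.75, 1.0])
for lo, hi in zip(edges[:-1], edges[1:]):
    idx = (residual >= lo) & (residual <= hi)
    wald = ols_slope(g[idx], y[idx]) / ols_slope(g[idx], x[idx])
    print(f"stratum [{lo:6.2f}, {hi:6.2f}]: Wald ratio = {wald:+.3f}")

# Conventional MR on the full sample stays near zero (unbiased here)
print("full sample:", ols_slope(g, y) / ols_slope(g, x))
```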
Background Clear specification and reporting of implementation strategies and their targeted healthcare professional behaviors are essential for replication, adaptation, and cumulative learning in implementation science. However, critical gaps remain in the consistent use of reporting frameworks. This study aimed to: (1) assess the completeness of implementation strategy reporting using the Template for Intervention Description and Replication (TIDieR) checklist; (2) examine trends in implementation strategy reporting over time; and (3) assess the completeness of the reporting of healthcare professional behaviors targeted for change using the Action, Actor, Context, Target, Time (AACTT) framework. Methods We conducted a secondary analysis of 204 trials included in a systematic review of implementation strategies aimed at changing healthcare professional behavior. Implementation strategies were assessed using the 12-item TIDieR checklist; target behaviors were characterized using the five AACTT domains. Two independent reviewers extracted and coded the data. Descriptive statistics were used to summarize reporting patterns. Data were synthesized narratively and presented in tables, with trends illustrated via a scatterplot. Results Assessment of implementation strategy reporting using TIDieR showed that procedural details (98%), materials used (95%), and modes of delivery (88%) were frequently reported. Critical elements such as strategy tailoring (28%), fidelity assessment (19% planned; 17% actual), and modifications (10%) were often missing. A modest improvement in reporting was observed after the publication of TIDieR, with median scores increasing from 15.0 (IQR: 13.0–16.0) pre-2014 to 16.0 (IQR: 15.0–18.0) post-2014. Assessment of target healthcare professional behavior reporting using AACTT indicated that actions (e.g., “assess illness”) and actors (e.g., nurses) were generally well reported at a high level. However, key contextual and temporal details were largely absent. While physical context was documented in all studies, the emotional and social contexts of behaviors were rarely reported. Crucial information on the duration, frequency, and period of behaviors was rarely reported. Conclusions Implementation strategies and target behaviors are not consistently or sufficiently reported in trials. Increased adoption of structured reporting tools such as TIDieR and AACTT is essential to enhance transparency. Incorporating these frameworks during protocol development could strengthen intervention evaluation and reporting, advancing implementation science and fostering cumulative knowledge. Trial registration PROSPERO CRD42019130446.
The Deauville score (DS) on FDG-PET/CT is the standard for evaluating treatment response in Hodgkin lymphoma (HL), but it is qualitative and subjective. The standardized uptake value ratio (SUVR = SUVmax Lymphoma/SUVmax Liver) offers a more quantitative and objective assessment. The objective was to compare DS and SUVR at end-of-treatment (EOT) FDG-PET/CT for predicting progression-free survival (PFS) in HL. Patients with classical HL had EOT-PET scans re-scored using DS and SUVR. Receiver operating characteristic (ROC) curve analysis determined the optimal SUVR cut-off. Sensitivity, specificity, and positive and negative predictive values for DS (positive ≥ 4) and SUVR (positive ≥ 1.13) were computed. Kaplan-Meier curves and Cox regression were used to analyze predictive ability for PFS. Among 154 patients (median age 31), the optimal SUVR cut-off was 1.13. DS and SUVR had identical diagnostic parameters. Both predicted PFS (hazard ratio 22.85 [95% CI 9.13–57.15]). The stage-specific SUVR cut-off was 1.40 for patients with limited favourable disease and 2.20 for patients with advanced disease. The stage-specific SUVR cut-off among patients with advanced disease resulted in improved specificity (92.6% vs. 98.8%) and positive predictive value (68.4% vs. 92.3%) compared to the DS. SUVR (≥ 1.13) and DS are equally predictive of PFS at EOT, with SUVR offering a more objective assessment.
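As an illustration of how an ROC-derived SUVR cut-off can be obtained, the sketch below applies Youden's J statistic to synthetic SUVR values and outcomes; both the data and the use of Youden's J specifically are assumptions of this example rather than details reported in the abstract.

```python
# Sketch: choose an SUVR cut-off from ROC analysis using Youden's J statistic.
# SUVR = SUVmax(lymphoma) / SUVmax(liver). Values below are synthetic; the
# study reported an optimal cut-off of 1.13 in the overall cohort.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(2)
progressed = rng.integers(0, 2, size=150).astype(bool)
suvr = np.where(progressed,
                rng.normal(1.6, 0.4, size=150),   # higher uptake ratio if progression
                rng.normal(0.9, 0.2, size=150))

fpr, tpr, thresholds = roc_curve(progressed, suvr)
youden_j = tpr - fpr
cutoff = thresholds[np.argmax(youden_j)]
print(f"optimal SUVR cut-off ≈ {cutoff:.2f}")
```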
Background Intraoperative and postoperative hypotension are associated with myocardial injury/infarction, stroke, acute kidney injury, and death. Because of its prolonged duration, postoperative hypotension contributes more to the risk of organ injury compared with intraoperative hypotension. A prediction model for clinically important postoperative hypotension after noncardiac surgery is needed to guide clinicians. Methods We performed a secondary analysis of the Vascular Events in Noncardiac Surgery Patients Cohort Evaluation (VISION) study. Patients aged ≥45 yr who had inpatient noncardiac surgery across 28 centres in 14 countries were included. In 14 of the centres selected at random (derivation cohort), we evaluated 49 variables using logistic regression to develop a model to predict postoperative clinically important hypotension, defined as a systolic blood pressure ≤90 mm Hg, that resulted in clinical intervention. The postoperative period was defined from the Post-Anesthesia Care Unit to hospital discharge. We then evaluated its calibration and discrimination in the other 14 centres (validation cohort). Results Among 40 004 patients in VISION, 20 442 (51.1%) were included in the derivation cohort, and 19 562 (48.9%) patients were included in the validation cohort. The incidence of clinically important postoperative hypotension in the entire cohort was 12.4% (4959 patients). A 41-variable model predicted the risk of clinically important postoperative hypotension (bias-corrected C-statistic: 0.73, C-statistic in validation cohort: 0.72). A simplified prediction model also predicted clinically important hypotension (bias-corrected C-statistic: 0.68) based on four information items. Conclusions Postoperative clinically important hypotension may be estimated before surgery using our primary model and a simple four-element model. Clinical trial registration NCT00512109.
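The sketch below illustrates the general derive-and-validate workflow described above, fitting a logistic regression model in one half of a synthetic dataset and reporting the C-statistic (equivalent to ROC AUC for a binary outcome) in the other half; the predictors and effect sizes are placeholders, not the VISION variables.

```python
# Sketch: derive a logistic regression risk model in one cohort and report
# the C-statistic (ROC AUC) in a held-out validation cohort. Synthetic
# predictors stand in for the study's candidate variables.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n, p = 4000, 10
X = rng.normal(size=(n, p))
logit = X @ rng.normal(0.3, 0.2, size=p) - 2.0
y = rng.random(n) < 1 / (1 + np.exp(-logit))          # event rate roughly in the 10-15% range

derive, validate = slice(0, n // 2), slice(n // 2, n)
model = LogisticRegression(max_iter=1000).fit(X[derive], y[derive])
risk = model.predict_proba(X[validate])[:, 1]
print(f"validation C-statistic: {roc_auc_score(y[validate], risk):.2f}")
```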
Examination of biomarkers is useful for measuring overall health. Endpoints are critical for assessing the threshold at which the scientific aim of a study no longer prevails over the wellbeing of the experimental laboratory animal. However, parameters able to assess health and recovery after an acute post-surgical intervention are needed. To fill this gap, we combined a suite of qualitative and quantitative measurements and created a surgical recovery matrix (SRM) able to monitor and score rodent health and wellbeing after surgery. We established baseline values in healthy male and female retired breeder C57BL/6N mice as a control, no surgery (NoSx) cohort. To test whether SRM scoring was useful in the immediate surgical period, we monitored changes in 18 parameters after sham (SH) and myocardial infarction (MI)-inducing surgeries over 3 days of recovery. The surgical manipulations and cardiac damage involved in MI-inducing surgery render it a major surgery. In contrast, SH surgery represents a minor surgery: the surgical manipulations are identical to those of the MI surgery but without any manipulation of, or damage to, the heart. Six hours after surgery, males and females showed deficits in nestlet integration and eye grooming, as well as a hunched posture and rough coat appearance, whereas greater body weight losses and impaired wound healing were recorded later in the observation period. Sex-specific differences were observed such that males showed a propensity toward reduced mobility and lower surface body temperature, whereas females had a reduced Body Condition Score. These parameters discriminated the very low scores detected in controls from the intermediate scores after SH surgery and the highest scores in mice severely debilitated by MI surgery. Further, we identified sex-specific and time-dependent changes such that the highest scores were detected in male mice after an MI versus SH surgery. We conclude that a combination of quantitative and qualitative parameters successfully evaluated mouse recovery and health after minor and major surgery.
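Below is a minimal sketch of how per-parameter welfare scores can be combined into a single composite value, in the spirit of the SRM; the parameter names, scoring bins, and unweighted sum are hypothetical choices for illustration, not the 18-parameter scheme used in the study.

```python
# Sketch: combine qualitative and quantitative welfare parameters into a single
# composite score, in the spirit of the surgical recovery matrix (SRM).
# Parameter names and scoring here are hypothetical, not the study's scheme.
def srm_score(observations):
    """observations: dict mapping parameter name -> score
    (0 = normal; higher = greater deviation from healthy baseline)."""
    return sum(observations.values())

post_mi_day1 = {
    "nestlet_integration": 2,    # qualitative, scored by observer
    "coat_appearance": 1,
    "posture": 2,
    "body_weight_loss_pct": 3,   # quantitative value binned into a score
    "surface_temperature": 1,
}
print("composite SRM-style score:", srm_score(post_mi_day1))
```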
Background Theories, models, and frameworks (TMFs) are central to implementation practice and research. Selecting one or more TMF(s) for a project remains challenging due to numerous options and limited guidance. This study aimed to (1) identify and categorize the reported purposes and attributes of TMFs, as well as the practical considerations of TMF users, and (2) synthesize these findings into a meta-framework that supports implementation practitioners and researchers in selecting TMFs. Methods A scoping review was conducted using Joanna Briggs Institute guidelines. Medline, Embase, and CINAHL were searched to identify articles on the selection of TMFs. Articles were selected and data extracted using Covidence. Inductive thematic analysis was used to refine and categorize purposes, attributes and practical considerations. The meta-framework was developed by mapping these categories onto a sequential process, pilot-testing through case studies, and iteratively refining it based on team feedback. Results Of 9,276 records, 43 articles (2005–2024) were included. Most articles reported TMF purposes (41 articles), followed by attributes (30) and practical considerations (13). Seven distinct purposes were identified: (1) enhancing conceptual clarity, (2) anticipating change and guiding inquiry, (3) guiding the implementation process, (4) guiding identification of determinants, (5) guiding design and adaptation of strategies, (6) guiding evaluation and causal explanation, and (7) guiding interpretation and dissemination. Additionally, 24 TMF attributes were grouped into five domains: clarity and structure, scientific strength and evidence, applicability and usability, equity and sociocultural responsiveness, and system and partner integration. Ten practical considerations were grouped into three domains: team expertise and readiness, resource availability, and project fit. These findings informed the development of the Systematic Evaluation and Selection of Implementation Science Theories, Models and Frameworks (SELECT-IT) meta-framework, comprising four steps: (1) determine the purpose(s) of using TMF(s); (2) identify potential TMFs; (3) evaluate short-listed TMFs against attributes; and (4) assess practical considerations of using TMF(s) within the project context. A worked example and two user-friendly worksheets illustrate its utility. Conclusions This study advances understanding of the selection of implementation science TMFs by distinguishing inherent TMF attributes from practical considerations. The SELECT-IT meta-framework offers a structured, context-sensitive approach for selecting appropriate TMFs. Future research should evaluate its validity and utility across diverse contexts.
Mixed-methods process mapping is a visualisation tool that identifies the steps, resources and personnel required to deliver a clinical practice, and has previously been used in an ad hoc manner to develop effective implementation strategies and solutions. To realise the potential of mixed-methods process mapping as an implementation tool, we aimed to develop and formalise its methodological steps and provide guidance on contemporary best practice for using this approach to optimise implementation practice and research. Synthesising theory, evidence and expertise, we have identified 10 best practice recommendations and provide the first systematic framework for integrating mixed-methods process mapping into three core phases of health systems implementation, specifically: (1) engaging interest holders (and maintaining engagement), (2) identifying when, where, why, and to whom change is needed (and potential consequences), and (3) identifying barriers and enablers, and co-designing implementation strategies. For each phase, we provide: (a) a rationale for using mixed-methods process mapping, (b) best practice guidance for combining mixed-methods process mapping with implementation practice and research, and (c) case studies exemplifying best practice. This article provides practical guidance on mixed-methods process mapping to improve the consistency and quality of its use among implementation researchers and practitioners. We present a rationale, guidance, and practical tools for conducting mixed-methods process mapping to enhance the quality of implementation research and practice, which can be used and adapted internationally. In doing so, this work builds capacity and provides an opportunity for researchers and healthcare professionals to better understand and embed evidence-based innovations into health systems, improving service and client outcomes. Further research is needed to establish potential uses of mixed-methods process mapping to support other core components of implementation practice (e.g., adaptation), and to formally test the impact of this approach independently versus as part of a combination of implementation strategies.
Halting breast cancer metastatic relapse following primary tumor removal remains challenging due to a lack of specific vulnerabilities to target during the clinical dormancy phase. To identify such vulnerabilities, we conducted genome-wide CRISPR screens on two breast cancer cell lines with distinct dormancy properties: 4T1 (short-term dormancy) and 4T07 (prolonged dormancy). The dormancy-prone 4T07 cells displayed a unique dependency on class III PI3K (PIK3C3). Unexpectedly, 4T07 cells exhibited higher mechanistic target of rapamycin complex 1 (mTORC1) activity than 4T1 cells due to lysosome-dependent signaling occurring at the cell periphery. Pharmacologic inhibition of PIK3C3 suppressed this phenotype in the 4T1-4T07 models as well as in human breast cancer cell lines and a breast cancer patient-derived xenograft. Furthermore, inhibiting PIK3C3 selectively reduced metastasis burden in the 4T07 model and eliminated dormant cells in a HER2-dependent murine breast cancer dormancy model. These findings suggest that PIK3C3-peripheral lysosomal signaling to mTORC1 may represent a targetable axis for preventing dormant cancer cell–initiated metastasis in patients with breast cancer. Significance Dormancy-prone breast cancer cells depend on the class III PI3K to mediate peripheral lysosomal positioning and mTORC1 hyperactivity, which can be targeted to blunt breast cancer metastasis.
Objective The purpose of this study was to systematically review the differences in disease-specific quality of life (QoL) benefits experienced by bone-anchored hearing implant (BAHI) users between those diagnosed with unilateral sensorineural hearing loss (U-SNHL) and those with conductive/mixed hearing loss (CHL). Data Sources Eligible studies were searched for in Medline (Ovid), Embase (Ovid), CINAHL (Ebsco), Cochrane (Wiley), Global Health (Ovid), Web of Science (Clarivate Analytics), Africa Wide Information (Ebsco) and Global Index Medicus (WHO) from inception to October 23, 2022. Updated searches were performed on November 9, 2023, and July 11, 2024. Review Methods There were no restrictions on language. PRISMA standards were followed, and screening was conducted by two independent reviewers in Rayyan, with a third reviewer resolving conflicts. Risk of bias was assessed using RoBANS. Articles were included if patients were implanted with a BAHI and administered a validated, disease-specific QoL measure. Results A total of 1312 articles were identified after duplicate removal, with 56 articles meeting the inclusion criteria. Eight different disease-specific QoL measures were administered. In all, the APHAB's "Global" (p = 0.0002), EC (p < 0.00001), and BN (p = 0.02) scores, as well as the GBI's "Global" (p = 0.0001), "General" (p = 0.002), and "Physical" (p = 0.02) scores, were significantly different between U-SNHL and CHL populations. Conclusion These results demonstrated disease-specific QoL differences between BAHI users with U-SNHL and CHL. Specifically, patients with CHL reported greater benefits in domains pertaining to communication ease, the clarity of sound, and their overall health and psychosocial status.
Despite the potential benefits of remote cognitive assessment for dementia, it is not appropriate for all clinical encounters. Our aim was to develop guidance on determining a patient's suitability for comprehensive remote cognitive diagnostic assessment for dementia. A multidisciplinary expert workgroup was convened under the auspices of the Canadian Consortium on Neurodegeneration in Aging. We applied the Delphi method to determine ‘red flags’ for remote cognitive assessment of dementia. This resulted in 14 red flags that met the predetermined consensus criteria. We then developed a novel clinical decision-making infographic that integrated these findings to support multidisciplinary clinicians in determining a patient's readiness to undergo comprehensive remote cognitive assessment.
82 members
Mark A Wainberg
  • McGill University AIDS Centre
P. Reynier
  • Centre for Clinical Epidemiology (CCE)
Information
Address
Montréal, Canada