Montefiore Medical Center
  • The Bronx, United States
Recent publications
  • Barlas Goker
  • Andrew Brook
  • Ranxin Zhang
  • [...]
  • David S Geller
Background and Objectives Endoprosthetic reconstruction is the preferred approach for limb salvage surgery for many patients following malignant bone tumor resection. Implant failure is a common complication; however, there are no reliable means with which to offer patient‐specific survival estimates. Implant survival predictions can set patient expectations and may guide treatment planning. This study aims to test and compare machine‐learning models for the prediction of early tumor endoprosthetic implant survival. Methods A single‐center retrospective series of 138 cases (mean age 41; 70 males, 68 females) was split into an 80:20 training and testing set. XGBoost, random forest, decision tree, and logistic regression models were trained and assessed for performance. After an initial review, age, sex, body mass index, diagnosis, location, resection length, and number of surgeries were selected as features. The output variables were 12‐month, 24‐month, and 36‐month implant survival. Results Random forest had the best performance at 12, 24, and 36 months, with an area under the curve (AUC) of 0.96, 0.89, and 0.88; accuracy of 0.92, 0.83, and 0.75; and Brier scores of 0.09, 0.11, and 0.20, respectively. Overall, the models performed better at 12 months than at the later time points. The most important feature at 12 months was resection length (0.17), whereas age was most important at 24 months (0.15) and 36 months (0.17). Online tools were created based on the random forest models. Conclusions Machine learning models can be leveraged for the accurate prediction of early tumor endoprosthetic implant survival. These represent the first ML models used to predict endoprosthetic implant survival beyond 1 year and the first to include upper extremity implants. This offers better patient‐specific prognostication, which can help manage patient expectations and may guide recommendations. Level of Evidence Level III.
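Since the abstract names the full modeling pipeline (80:20 split, random forest, AUC/accuracy/Brier metrics, feature importances), a minimal scikit-learn sketch may clarify the workflow. The file name, column names, and hyperparameters below are hypothetical; only the feature list and metrics come from the abstract.

```python
# Minimal sketch of the described workflow: 80:20 split, random forest,
# AUC / accuracy / Brier score, and feature importances.
# File and column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, brier_score_loss, roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("endoprosthesis_cohort.csv")  # hypothetical cohort file
features = ["age", "sex", "bmi", "diagnosis", "location",
            "resection_length", "num_surgeries"]
X = pd.get_dummies(df[features])       # one-hot encode categorical features
y = df["implant_survival_12mo"]        # 1 = implant surviving at 12 months

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)  # 80:20 split

model = RandomForestClassifier(n_estimators=500, random_state=42)
model.fit(X_train, y_train)

prob = model.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, prob))
print("Accuracy:", accuracy_score(y_test, prob > 0.5))
print("Brier score:", brier_score_loss(y_test, prob))

# Impurity-based importances, analogous to the reported feature rankings
for name, imp in sorted(zip(X.columns, model.feature_importances_),
                        key=lambda t: -t[1])[:5]:
    print(f"{name}: {imp:.2f}")
```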
Objective The current study investigated bidirectional associations between defense mechanisms and therapeutic techniques in two different psychotherapies for panic disorder (PD). Identifying how technique use facilitates, and is facilitated by, change in defenses might guide adaptation and improvement of therapies. Method Patient defense mechanisms and use of therapeutic techniques were measured at early, mid, and late treatment for 101 patients receiving panic‐focused psychodynamic psychotherapy (PFPP) or cognitive behavioral therapy (CBT). Time‐lagged associations between use of techniques and psychological defenses were examined using random‐intercept cross‐lagged panel models. Results In PFPP, less‐organized defense use at mid‐treatment predicted higher therapist focus on the patient's moment‐to‐moment experience and affect at late‐treatment. In CBT, greater therapist focus on the patient's thoughts and cognitions at early‐treatment predicted use of more adaptive defenses at mid‐treatment. Conclusion Results underscore differential treatment effects in the relationship between techniques and change in defensive functioning over time. Key Messages In cognitive behavioral therapy, therapists could focus on patients' thoughts and cognitions to foster adaptive defensive functioning such as intellectualization and rationalization. In panic‐focused psychodynamic psychotherapy, therapists might increase affect‐focused interventions when patients persist in using less adaptive defenses. Broadly, therapists should also be aware that their use of therapeutic interventions may be influenced by their patients' defensive functioning.
Background and Aims Calcific coronary lesions pose significant challenges to percutaneous coronary intervention (PCI), limiting stent delivery and expansion. Intravascular lithotripsy (IVL) and rotational atherectomy (RA) are widely used plaque modification techniques; however, comparative data on their effectiveness remain limited. We aimed to compare clinical and procedural outcomes between IVL and RA in the management of calcific coronary lesions. Methods PubMed, Embase, Scopus, and the Cochrane Library were searched through January 2025 for randomized controlled trials (RCTs) and observational studies comparing IVL with RA in calcific coronary lesions undergoing PCI. The primary outcome was major adverse cardiovascular events (MACE). Secondary outcomes included all‐cause mortality, myocardial infarction (MI), stroke, repeat revascularization, procedural outcomes, and minimum stent area (MSA). Random‐effects models were used for outcome analysis, and meta‐regression assessed the impact of baseline characteristics. Results A total of 14 studies (2 RCTs, 12 observational; 2056 IVL patients, 3099 RA patients) were included. IVL and RA showed a comparable risk of MACE (OR 0.81; 95% CI 0.57–1.16; p = 0.26) and similar risks of all‐cause mortality, MI, stroke, and repeat revascularization. IVL was associated with a lower risk of coronary perforation (OR 0.43; 95% CI 0.32–0.57; p < 0.001) and slow or no‐reflow (OR 0.34; 95% CI 0.14–0.79; p = 0.02). Additionally, IVL resulted in shorter procedure duration (SMD −0.30; 95% CI −0.61 to 0.00; p = 0.05) and fluoroscopy time (SMD −0.41; 95% CI −0.62 to −0.20; p = 0.004). Post‐procedural MSA was similar between IVL and RA. Conclusion IVL and RA demonstrated comparable efficacy in terms of MACE and clinical outcomes in patients with calcific coronary lesions undergoing PCI. However, IVL was associated with a lower risk of coronary perforation and slow or no‐reflow, as well as reduced procedure duration and fluoroscopy time, suggesting a potential procedural advantage over RA.
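As a companion to the pooled estimates above, here is a minimal sketch of DerSimonian-Laird random-effects pooling of odds ratios, the model class the abstract reports. The per-study event counts are illustrative placeholders, not data from the included studies.

```python
# DerSimonian-Laird random-effects pooling of odds ratios (sketch).
# Event counts are illustrative placeholders only.
import numpy as np
from scipy import stats

# Per-study counts: (events_IVL, n_IVL, events_RA, n_RA)
studies = [(12, 150, 20, 160), (8, 120, 11, 130), (15, 200, 18, 210)]

log_or, var = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c
    log_or.append(np.log((a * d) / (b * c)))
    var.append(1 / a + 1 / b + 1 / c + 1 / d)  # variance of the log-OR
log_or, var = np.array(log_or), np.array(var)

# Cochran's Q under the fixed-effect model, then the DL tau^2 estimate
w = 1 / var
mu_fe = np.sum(w * log_or) / np.sum(w)
q = np.sum(w * (log_or - mu_fe) ** 2)
k = len(studies)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

# Random-effects weights, pooled estimate, 95% CI, and p-value
w_re = 1 / (var + tau2)
mu = np.sum(w_re * log_or) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
p = 2 * stats.norm.sf(abs(mu / se))
lo, hi = np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)
print(f"pooled OR {np.exp(mu):.2f} (95% CI {lo:.2f}-{hi:.2f}), p = {p:.3f}")
```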
Background Effective treatment for chronic tendinopathy remains elusive. Percutaneous ultrasound-guided tenotomy using TENEX® (PUTT) has demonstrated promising clinical outcomes, but its mechanism of action is not clearly defined. This study aims to describe potential mechanisms by examining the histological effects of PUTT on the bovine soleus tendon across varying treatment durations in an ex-vivo animal model. Methods Twelve bovine soleus tendons were allocated to four cohorts undergoing PUTT for 1, 3, 5, and 7 min. Each specimen was treated on one side, with the opposite side serving as a control. Macroscopic and microscopic analyses were conducted to assess tendon sheath, fascicle, and perineural disruption. Fascicle penetration was measured using ImageJ software. Statistical analyses were performed using analysis of variance (ANOVA), with significance set at p < 0.05. Results Macroscopic and microscopic examination revealed progressive separation of the paratenon, peritendinous nerves, and fascicles, correlating with increased treatment duration. Fascicle penetration depths were 0.1 mm, 2.57 mm, 2.61 mm, and 3.93 mm at 1, 3, 5, and 7 min, respectively. ANOVA confirmed significant differences among groups (F(3, 8) = 620.898, p < 0.001), with a large effect size (η² = 0.996). Tukey's Honest Significant Difference (HSD) test revealed significant differences between most groups (p < 0.001), except between the 3-min and 5-min treatments, which showed no significant difference (p = 0.969). Conclusion PUTT induces progressively more pronounced structural changes in the paratenon and fascicle layer with longer treatment duration.
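A minimal sketch of the reported statistics (one-way ANOVA followed by Tukey's HSD across the four duration cohorts) follows. The three penetration depths per cohort are illustrative values centered on the reported group means, not the study's raw measurements.

```python
# One-way ANOVA plus Tukey HSD across the four treatment-duration cohorts.
# Depths (mm) are illustrative values near the reported group means.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

g1 = np.array([0.09, 0.10, 0.11])   # 1-min cohort
g3 = np.array([2.50, 2.57, 2.64])   # 3-min cohort
g5 = np.array([2.55, 2.61, 2.67])   # 5-min cohort
g7 = np.array([3.85, 3.93, 4.01])   # 7-min cohort

f_stat, p = f_oneway(g1, g3, g5, g7)
print(f"ANOVA: F(3, 8) = {f_stat:.1f}, p = {p:.2g}")

# Pairwise comparisons; expect all pairs significant except 3 min vs. 5 min
depths = np.concatenate([g1, g3, g5, g7])
groups = ["1min"] * 3 + ["3min"] * 3 + ["5min"] * 3 + ["7min"] * 3
print(pairwise_tukeyhsd(depths, groups, alpha=0.05))
```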
INTRODUCTION Counseling on the likelihood of successful vaginal birth after cesarean (VBAC) is essential, because a failed trial of labor after cesarean (TOLAC) increases perinatal morbidity. TOLAC with induction of labor (IOL) has higher failure rates than spontaneous TOLAC. We assessed VBAC success for TOLAC with IOL and examined the utility of the VBAC calculator in predicting successful TOLAC. METHODS An IRB-approved, retrospective cohort study of patients (2020–2023) with a history of prior cesarean birth undergoing IOL was conducted. Logistic regression comparing calculator-predicted VBAC success rates to actual VBAC success was performed. RESULTS Of 270 patients undergoing TOLAC with IOL, 138 (51.1%) had successful VBAC versus 172 (63.6%) predicted by the VBAC calculator (P < .0001). Successful VBAC was associated with a history of prior vaginal delivery (VD) (P = .0008), prior VBAC (P < .0001), and no history of arrest disorder (P = .0007). VBAC occurred in 72% of patients with prior VD and 46% of those without. VBAC occurred in 82% of patients with prior VBAC and 43% of those without. Of those with successful VBAC, 22% had a prior arrest disorder; of those with failed IOL, 41% did. No association was found between successful VBAC and age, body mass index, chronic hypertension, or diabetes. The VBAC calculator was documented as part of TOLAC counseling in only 31 patients (11.39%). CONCLUSIONS/IMPLICATIONS The true VBAC success rate was 51.1%, more than 20 percentage points lower than the national rate (74.3%), which does not separate IOL from spontaneous labor. TOLAC patients should be counseled about the lower success rate of VBAC after IOL. The VBAC calculator should be used to provide more accurate counseling in patients undergoing TOLAC with IOL.
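The abstract's central comparison (calculator-predicted vs. observed VBAC success) can be implemented as a calibration-style logistic regression. This sketch is one plausible reading of that analysis, using simulated placeholder data rather than the study cohort.

```python
# Calibration-style logistic regression of observed VBAC outcomes on the
# logit of calculator-predicted success probabilities (simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
pred_prob = rng.uniform(0.3, 0.9, 270)  # calculator-predicted success, n = 270
# simulate systematic over-prediction (~12 points), as the study observed
actual = rng.binomial(1, np.clip(pred_prob - 0.12, 0.01, 0.99))

# An intercept < 0 with slope near 1 indicates the calculator over-predicts
logit_pred = np.log(pred_prob / (1 - pred_prob))
fit = sm.Logit(actual, sm.add_constant(logit_pred)).fit(disp=0)
print(fit.params)  # [intercept, slope]
print(f"predicted mean {pred_prob.mean():.3f} vs. observed rate {actual.mean():.3f}")
```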
Concerns exist about the safety of non-anesthesiologist-administered positive pressure ventilation with sedation/analgesia during cardiac electrophysiology (EP) procedures in high-risk patients, such as those with known obstructive sleep apnea (OSA) or risk factors for it. These concerns are magnified when procedures are performed outside of intensive care areas or outside established hospital policies and procedures. Background Noninvasive positive pressure ventilation (NIPPV, including continuous or bilevel positive airway pressure, CPAP/BiPAP) with sedation/analgesia is typically limited to hospital units staffed by pulmonary-intensive care or anesthesiology personnel, with monitoring by respiratory therapists or specifically trained nursing staff. NIPPV with sedation has raised concerns when delivered by laboratory staff in procedure rooms, especially in high-risk patients, and the literature on this topic is sparse. NIPPV as described is routine at some institutions and prohibited at others. We aimed (1) to test the safety and efficacy of NIPPV with sedation prescribed by cardiologists and administered by trained nurses in a prospective cohort of high-risk patients and (2) to provide data that, if favorable, could lead to revisions of institutional policies. Methods We enrolled 50 consecutive consenting patients with known OSA or at high risk for it. Three were then excluded (did not qualify, or procedure canceled). Procedures in the remaining 47 patients included 21 ICD implants (12 with defibrillation testing), 8 pacemaker implants, 11 ablations, and 7 cardioversions; some patients had combined procedures, e.g., "ablate & pace." Standard NIPPV settings were used. Staff were trained in general NIPPV device monitoring and management. Data collected included vital signs, O2 saturations, hypercapnia, demographics, toleration of NIPPV, and complications. Results There were no NIPPV-related complications and no long-term adverse sequelae in the 47 patients who completed the protocol. No patient required intubation or urgent rescue by an anesthesiologist. Most patients (45 of 47) tolerated NIPPV, including those without prior experience with it. Conclusions NIPPV with sedation can be safely delivered to high-risk OSA patients by trained non-anesthesiology/pulmonary/intensive care personnel in an EP lab setting. Policy and procedure manuals may benefit from revision.
Objective The Community Health Worker Institute (CHWI) addresses health-related social needs (HRSNs) by integrating Community Health Workers (CHWs) into patient care. This study explores the barriers and facilitators to HRSN referrals and the integration of CHWs within clinical teams. Methods Qualitative interviews were conducted with CHWs, CHWI program staff, and clinicians from ambulatory care clinics. Semi-structured interviews, guided by the Consolidated Framework for Implementation Research, were audio-recorded, transcribed, and analyzed using rapid qualitative methods. Results Preliminary findings indicate that while clinicians support the CHWI and referral program, time constraints during patient visits likely hinder effective screenings and referrals. CHWs are seen as valuable advocates but continue to face challenges due to confusion among patients and clinicians about their clinical role. Hierarchical power dynamics between providers and CHWs likely contribute to this confusion. Clinics with strong leadership, clear role delineation, and clinical site preparation appear to have better CHW integration. Conclusion CHWs play a crucial role in addressing HRSNs, but their integration into clinical teams requires overcoming logistical and educational challenges. These findings offer insights for improving the HRSN referral process and integrating CHWs into healthcare systems.
Kidney transplant is the gold standard for the treatment of end‐stage renal disease (ESRD). However, there is a significant discrepancy between donor availability and the number of potential recipients on the waiting list. Living donor kidney transplantation has been considered an alternative to increase the donor pool. Left donor nephrectomy is typically preferred due to the length of the renal vein. However, in some cases, right donor nephrectomy must be considered, which presents challenges due to the shorter renal vein and, in some cases, multiple renal arteries. For these cases, transplant surgeons must have alternative strategies to reconstruct the vasculature and ensure that graft implantation and anastomosis are as safe as possible. We present a case of a living donor right laparoscopic nephrectomy with two renal arteries, including vein elongation with an end‐to‐end anastomosis with a deceased donor renal vein and an end‐to‐side arterial anastomosis using a deceased donor iliac artery conduit.
Inotuzumab ozogamicin (InO) is an antibody-calicheamicin conjugate with high efficacy in lymphoid malignancies. It targets the B-cell surface protein CD22, which is expressed in most B-ALL cases, albeit with variable intensity. However, the factors governing CD22 expression, and thus leukemia sensitivity to InO, remain incompletely understood. Using multi-omic characterization of 196 human B-ALL samples, coupled with ex vivo InO sensitivity profiling, we show that early leukemia differentiation arrest at the pre-pro-B stage is associated with resistance to InO. Screening 1,639 transcription factor genes prioritized Early B-cell Factor 1 (EBF1) as a key regulator of CD22 expression (false discovery rate = 7.1×10⁻⁴). Comparing ATAC-seq profiles of the most InO-sensitive and -resistant cases (LC50 < 10th vs. > 90th percentile, n = 18), we found the EBF1 binding motif strikingly enriched in regions with differential open chromatin status (P = 8×10⁻¹⁷⁴). CRISPR interference targeting EBF1 binding sites at the CD22 locus led to an ~50-fold reduction in cell surface CD22 expression and, consequently, an ~22-fold increase in InO resistance in ALL cell lines. Interestingly, within BCR::ABL1 ALL, we observed intra-subtype heterogeneity linked to EBF1 transcriptional downregulation (P = 1.1×10⁻¹⁵) and/or somatic alteration (P = 0.004), which led to reduced CD22 expression (P = 8.3×10⁻¹¹) and ex vivo and in vivo resistance to InO. Collectively, these findings point to the direct impact of EBF1 on CD22 expression during B-cell development, which in turn contributes to inter-patient variability in InO response, even within the same subtype of B-ALL.
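The motif-enrichment result above is the kind of statistic typically obtained by contrasting motif occurrence in differential vs. background open-chromatin regions. A minimal Fisher's exact test sketch with placeholder counts (not the study's data) illustrates the computation.

```python
# Fisher's exact test for motif enrichment in differential open-chromatin
# regions vs. background regions. All counts are hypothetical placeholders.
from scipy.stats import fisher_exact

motif_in_diff, diff_total = 4200, 9000   # differential regions: with motif, total
motif_in_bg, bg_total = 9500, 50000      # background regions: with motif, total

table = [[motif_in_diff, diff_total - motif_in_diff],
         [motif_in_bg, bg_total - motif_in_bg]]
odds, p = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds:.2f}, p = {p:.2e}")
```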
Data on Clostridioides difficile infections (CDI) among outpatient parenteral antibiotic therapy (OPAT) patients are limited. Herein, we describe the characteristics of OPAT patients with CDI at a large academic medical center. Despite prolonged antibiotic exposure, the incidence rate of CDI was low.
Effective communication with adolescents in a school-based health care setting is critical for ensuring their engagement, understanding, and compliance with medical advice, which can ultimately lead to better health outcomes. This chapter explores the unique challenges and strategies involved in communicating with this age group, specifically in a school-based setting, highlighting the importance of establishing confidentiality, building trust, demonstrating respect, and providing age-appropriate information. Through a review of current literature and best practices, the chapter identifies key tools such as psychosocial screenings and motivational interviewing. By adopting these practices, healthcare providers can foster a more effective and collaborative relationship with their adolescent patients in the school health setting, thereby improving the overall quality of care.
The chapter highlights the evolution of sexual health education from traditional approaches to more inclusive and comprehensive frameworks, emphasizing the need for evidence-based, age-appropriate information. This chapter also delves into the benefits and challenges of comprehensive sex education. Lastly, we discuss the role of school-based providers in bridging the gap and providing sexual health education.
Lower extremity amputation secondary to diabetic foot ulcers (DFU) is associated with a 50% mortality rate within 5 years. The aim of this case series is to understand the risk factors and management of DFU leading to above-knee or below-knee amputation at an urban medical center. We conducted a retrospective review of the medical history, foot examination findings, noninvasive vascular studies, angiographic imaging, and radiology results from hospital stays during which patients underwent amputation. A total of 35 patients with DFU who underwent amputation between 2016 and 2021 were evaluated. Of these, 16 ambulatory patients had complete medical data and were included in the analysis. Risk factors for amputation, clinical presentation, diagnostic findings (e.g. vascular studies or imaging), and amputation approaches were analyzed. Our study found significant variability in the medical history, presentation, and management of patients with DFU who underwent lower extremity amputations, including differences in vascular abnormalities and the timing of care. Poor glucose control (median HbA1c of 10.3%) and delayed presentation likely contributed to tissue loss and amputation. Understanding the individual medical presentations and management of patients undergoing leg amputation secondary to DFU may inform the development of more effective strategies to prevent this complication in patients with diabetes. Learning points There is significant variability in the presentation and progression of diabetic foot ulcers (DFUs). Diagnostic evaluation of DFU varies between patients; a more standardized evaluation to inform best practices could be useful. Socioeconomic status (SES) plays a role in the increased risk of amputations among DFU patients, including delay in care and access to limb salvage programs. Multidisciplinary care, including early detection of DFU, patient education, and routine screenings, is essential for improving outcomes and reducing the risk of amputations in high-risk DFU patients.
Background Procedural complexity during percutaneous coronary intervention (PCI) with drug-eluting stents (DES) has been associated with adverse events, especially in the case of long and multiple stent implantation. Objective This study aims to validate contemporary complex PCI criteria for drug-coated balloon (DCB)-based PCI. Methods Consecutive patients undergoing DCB angioplasty at 2 Italian centers from 2018 to 2023 were retrospectively enrolled. Complex DCB-PCI was defined as the presence of at least 1 of the following 6 features: 3 vessels treated; ≥3 lesions treated; ≥3 devices (DES or DCB) used; bifurcation treated with 2 devices; total device length (DES + DCB) >60 mm; CTO as target lesion. The primary endpoint was the 2-year incidence of target lesion failure (TLF), a composite of target lesion revascularization (TLR), target vessel myocardial infarction, and cardiac death, at time-to-first-event analysis. Results A total of 1279 patients were included, of whom 642 (50.2%) met complex PCI criteria. The most frequently met criterion was "total device length >60 mm" (71.6% in the complex PCI group). The proportion of in-stent restenosis (ISR) was 30.8% in the complex DCB-PCI group and 43.8% in the non-complex PCI group (p < 0.001). After adjusting for relevant clinical covariates and for the presence of ISR, patients undergoing complex PCI had a higher incidence of TLF at 2 years compared with those undergoing non-complex PCI (16.7% vs. 11.4%; adjusted hazard ratio 1.73, 95% confidence interval 1.16–2.59, p = 0.007). However, this difference was significant only in the ISR subgroup, while outcomes of complex and non-complex PCI for de novo lesions were similar. Conclusions In a real-world cohort of patients undergoing DCB angioplasty, complex PCI criteria were frequently met and associated with a higher risk of TLF. However, their prognostic impact was limited in patients with de novo coronary lesions treated with DCB.
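Since the complexity definition is an explicit "at least 1 of 6 criteria" rule, it translates directly into code. A minimal sketch with hypothetical field names:

```python
# The study's six complex-PCI criteria as a boolean rule (>= 1 met -> complex).
# Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class DcbPciProcedure:
    vessels_treated: int
    lesions_treated: int
    devices_used: int              # DES + DCB count
    bifurcation_two_devices: bool
    total_device_length_mm: float  # combined DES + DCB length
    cto_target: bool

def is_complex(p: DcbPciProcedure) -> bool:
    """True if at least one of the six complexity criteria is met."""
    return any([
        p.vessels_treated >= 3,
        p.lesions_treated >= 3,
        p.devices_used >= 3,
        p.bifurcation_two_devices,
        p.total_device_length_mm > 60,
        p.cto_target,
    ])

# Example: a single long lesion meeting only the device-length criterion
print(is_complex(DcbPciProcedure(1, 1, 2, False, 72.0, False)))  # True
```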
1,570 members
Azeem Latib
  • Department of Cardiology
Neeraj Lalwani
  • Department of Radiology
Chandan Guha
  • Department of Oncology
Abul Kalam Azad
  • Department of Pathology
Information
Address
The Bronx, United States
Head of institution
Steven M. Safyer, MD