The study analyses whether the COVID-19 pandemic affected the incidence of forearm, arm, and hip fractures during a 1-year observation period. Additionally, changes in the overall treatment costs of those fractures were estimated. During the COVID-19 pandemic, the incidence of forearm, arm, and hip fractures remained statistically unchanged, and no significant changes were observed in the expenditure incurred for the treatment of the fracture cases. Purpose: The purpose of the study was to evaluate whether the consequences of the COVID-19 pandemic (including lockdown and the fear of infection) influenced the incidence of osteoporotic forearm, arm, and hip fractures and to estimate the changes in the costs of their management during a one-year observation period. Methods: The incidence of forearm, arm, and hip fractures was recorded for the population aged ≥ 50 residing in the district of Tarnowskie Góry and the Town of Piekary Śląskie, Poland, during 1 year of the COVID-19 pandemic (from March 16th 2020 to March 15th 2021). The results were compared with the numbers of corresponding limb fractures recorded before the pandemic during five consecutive yearly periods, each starting on 16th March and ending on 15th March of the subsequent year, the entire period covering the years 2015-2020. The rates of the analysed fractures were calculated per 100,000 inhabitants together with their economic impact. Results: The mean numbers and the incidence rates of upper extremity fractures were slightly lower during the COVID-19 pandemic than in the previous 5 years, whereas hip fracture figures remained almost stable. The observed changes were not statistically significant. The annual observation revealed a slight decrease in expenditure volumes compared to the analysed period before the pandemic (-0.33%).
Conclusion: The decreased incidence rate of forearm, arm, and hip fractures observed during the first months of the COVID-19 pandemic was not statistically significant over the 1-year observation. After several weeks to months of the initial shock caused by government restrictions and the fear of infection, the number of patients remained unchanged over the one-year observation period.
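The per-100,000 rate calculation mentioned in the abstract above is the standard crude incidence rate. A minimal sketch in Python (the case and population numbers below are illustrative placeholders, not figures from the study):

```python
def incidence_per_100k(cases: int, population: int) -> float:
    """Crude incidence rate: fracture cases per 100,000 inhabitants."""
    return cases / population * 100_000

# Illustrative values only, not data from the study:
rate = incidence_per_100k(250, 120_000)
print(round(rate, 1))  # 208.3
```

Comparing such rates across yearly periods (rather than raw counts) is what allows the pre-pandemic and pandemic years to be contrasted despite any change in the underlying population.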
Background Medical nutrition therapy may be associated with clinical outcomes in critically ill patients with a prolonged intensive care unit (ICU) stay. We aimed to assess nutrition practices in European ICUs and their importance for clinical outcomes. Methods Prospective multinational cohort study in patients staying in the ICU ≥ 5 days with outcomes recorded until day 90. Macronutrient intake from enteral and parenteral nutrition and non-nutritional sources during the first 15 days after ICU admission was compared with targets recommended by ESPEN guidelines. We modeled associations between three categories of daily calorie and protein intake (low: < 10 kcal/kg, < 0.8 g/kg; moderate: 10–20 kcal/kg, 0.8–1.2 g/kg; high: > 20 kcal/kg, > 1.2 g/kg) and the time-varying hazard rates of 90-day mortality or successful weaning from invasive mechanical ventilation (IMV). Results A total of 1172 patients with a median [Q1;Q3] APACHE II score of 18.5 [13.0;26.0] were included, and 24% died within 90 days. Median length of ICU stay was 10.0 [7.0;16.0] days, and 74% of patients could be weaned from IMV. Patients reached on average 83% [59;107] and 65% [41;91] of the ESPEN recommended calorie and protein targets, respectively. Whereas specific reasons for ICU admission (especially respiratory diseases requiring IMV) were associated with higher intakes (estimate 2.43 [95% CI: 1.60;3.25] for calorie intake, 0.14 [0.09;0.20] for protein intake), a lack of nutrition on the preceding day was associated with lower calorie and protein intakes (− 2.74 [− 3.28; − 2.21] and − 0.12 [− 0.15; − 0.09], respectively). Compared to a lower intake, a moderate daily intake was associated with a higher probability of successful weaning (for calories: maximum HR 4.59 [95% CI: 1.5;14.09] on day 12; for protein: maximum HR 2.60 [1.09;6.23] on day 12), and with a lower hazard of death (for calories only: minimum HR 0.15 [0.05;0.39] on day 19).
There was no evidence that a high calorie or protein intake was associated with further outcome improvements. Conclusions Calorie intake was mainly provided according to the targets recommended by the applicable ESPEN guideline, but protein intake was lower. In patients staying in the ICU ≥ 5 days, early moderate daily calorie and protein intakes were associated with improved clinical outcomes. Trial registration NCT04143503, registered on October 25, 2019.
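The three intake categories defined in the Methods above are a simple per-kilogram binning rule. A minimal sketch, using only the thresholds stated in the abstract (the boundary handling of exactly 10/20 kcal/kg and 0.8/1.2 g/kg into "moderate" is an assumption, since the abstract writes the ranges inclusively):

```python
def calorie_category(kcal_per_kg: float) -> str:
    # Thresholds as stated: low < 10, moderate 10-20, high > 20 kcal/kg/day
    if kcal_per_kg < 10:
        return "low"
    if kcal_per_kg <= 20:
        return "moderate"
    return "high"

def protein_category(g_per_kg: float) -> str:
    # Thresholds as stated: low < 0.8, moderate 0.8-1.2, high > 1.2 g/kg/day
    if g_per_kg < 0.8:
        return "low"
    if g_per_kg <= 1.2:
        return "moderate"
    return "high"

# Illustrative daily intakes, not patient data:
print(calorie_category(15.0), protein_category(1.0))  # moderate moderate
```

Calories and protein are categorized independently, which is why the study can report, for example, a mortality association for the calorie category only.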
The appearance of severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2) and its spread all over the world caused the coronavirus disease 2019 (COVID-19) pandemic, which has recently resulted in almost 400 million confirmed cases and 6 million deaths, not to mention unknown long-term or persistent side effects in convalescent individuals. In this short review, we discuss approaches to treating COVID-19 that are based on current knowledge of the mechanisms of viral cell receptor recognition, virus–host membrane fusion, and inhibition of viral RNA replication and viral assembly. Despite enormous progress in antiviral therapy and prevention, new effective therapies are still in great demand.
Background It is estimated that about 10% of pancreatic cancer cases have a genetic background. People with a familial predisposition to pancreatic cancer can be divided into two groups. The first is termed hereditary pancreatic cancer, which occurs in individuals with a known hereditary cancer syndrome caused by germline single-gene mutations (e.g., BRCA1/2, CDKN2A). The second is termed familial pancreatic cancer, which is associated with several genetic factors responsible for the more common development of pancreatic cancer in certain families, although no precise single-gene mutation has been identified. Aim This review summarizes the current state of knowledge regarding the risk of pancreatic cancer development in hereditary and familial pancreatic cancer patients. Furthermore, it gathers the latest recommendations from the three major organizations dealing with the prevention of pancreatic cancer in high-risk groups and explores recent guidelines of scientific societies on screening for pancreatic cancer in individuals at risk for hereditary or familial pancreatic cancer. Conclusions In order to improve patients' outcomes, the authors of current guidelines recommend early and intensive screening in patients with pancreatic cancer resulting from a genetic background. The screening should be performed in centers of excellence. The scope, extent and cost-effectiveness of such interventions require further studies.
The 5A score predicts in-hospital mortality of patients suffering from accidental hypothermia, including those not in cardiac arrest. The HOPE score was specifically developed to predict survival for the subgroup of hypothermic patients in cardiac arrest considered for extracorporeal life support rewarming. The C-statistic in the external validation study of the HOPE score was 0.825 (95% CI: 0.753–0.897), confirming its excellent discrimination. In addition, its good calibration allows for a reliable interpretation of the corresponding survival probability after rewarming. The HOPE score should be used for predicting outcomes and selecting hypothermic patients in cardiac arrest for rewarming.
Objective Many drugs applied to the skin for a systemic effect fail to act therapeutically, due to the barrier posed by the complex structure of the skin. To counteract this, absorption promoters are often added to the drug formulation. The use of albumin as an effective drug carrier is increasingly being investigated. Albumin, a natural, non-toxic polymer, can target drugs to specific cells and extend their biological half-life. This study was designed to trace the permeation of albumin after topical administration to the skin as a potential carrier of therapeutic substances. Materials and methods Four dermal formulations based on different polymers were prepared: methylcellulose, sodium alginate, hypromellose, and chitosan with methylcellulose, obtaining final albumin concentrations of 2%, 1.5% and 1%. The permeation of albumin through the skin was examined under simulated in vivo conditions. Results Most albumin permeated from the methylcellulose-based hydrogel. Depending on the concentration of albumin, permeation profiles were plotted, and the permeation rate constant and AUC(0–24 h) were calculated. Conclusion Methylcellulose was the optimal polymer for albumin release, whereas hypromellose was the least favorable. The concentration of albumin influences the amount and rate of permeation of this protein. The optimal concentration was 10 mg/g (1%), from which the most albumin penetrated and the fastest. Human skin appeared to be more permeable to albumin than pig skin. However, the similar permeation profile through both membranes allows pig skin to be used to track and evaluate the permeation of therapeutic substances with systemic effects.
Background The effect of age on the incidence of late sequelae occurring after anticancer treatment in childhood is still not fully elucidated. In this multicenter study of long-term survivors diagnosed before the age of three, we investigated the prevalence of late effects many years after treatment. Methods The study group (n = 561) was selected from the Polish National Childhood Cancer Survivors Registry (n = 1761), created in 2007. A survivor was defined as an individual who had survived at least 5 years after completion of anticancer treatment. All children were diagnosed between 1991 and 2016; mean age at diagnosis was 1.82 years (range 0.03–2.99), and median follow-up time was 9.85 years (range 5.0–23.6). They were treated in accordance with international protocols approved by the Polish Pediatric Leukemia and Lymphoma Group and the Polish Solid Tumor Group. Chemotherapy alone was used in 192 patients (34.2%); chemotherapy and radiotherapy in 56 (10%); chemotherapy and surgery in 176 (31.4%); chemotherapy, radiotherapy, and surgery in 79 (14.1%); and surgery alone in 58 (10.3%). Results Of all patients enrolled in the study, only 94 (16.8%) had normal function of all organs. Seventy-six (13.5%) children developed dysfunction in one organ, another 83 (14.8%) had symptoms or complaints suggestive of dysfunction in two organs or systems, 88 (15.7%) had abnormalities in three organs, and 220 (39.2%) had four or more dysfunctions. In the entire study group, dysfunctions most frequently (> 20% of cases) involved the following organs/systems: circulatory – 21.8%, urinary – 30.8%, gastrointestinal – 20.8%, immune – 23.5%, vision – 20.7%, hearing – 21.8%, and oral and masticatory dysfunction – 26.9%.
We did not find any significant differences in organ dysfunction between children diagnosed under the age of 1 and those diagnosed at the age of 1–3, except for a lower incidence of thyroid abnormalities (p = 0.007) and a higher prevalence of liver dysfunction in the youngest patients. In the subset with a longer follow-up period (> 10 years), more frequent thyroid abnormalities (p = 0.019), male (p = 0.002) and female (p = 0.026) gonadal dysfunction, as well as musculoskeletal problems (p < 0.001), were observed. Among subjects who received radiotherapy, compared to those who did not, short stature (p = 0.001) and dysfunction of the following systems/organs were more frequently reported: circulatory (p = 0.049), urinary (p = 0.012), thyroid gland (p < 0.0001), nervous (p = 0.007), immunological (p = 0.002), liver (p = 0.03), dental or chewing difficulties (p = 0.001), hearing (p = 0.001) and musculoskeletal (p = 0.026). When multimodal therapy was applied (chemotherapy, radiotherapy, and surgery), a higher incidence of short stature (p = 0.007), urinary system disorders (p < 0.0001), thyroid dysfunction (p < 0.0001), hearing loss (p < 0.0001), and skin problems (p = 0.031) was observed. Conclusion This study confirms that radiotherapy and the specific toxicity of some cytostatics are the most important factors affecting organ function. Apart from a higher incidence of liver dysfunction in the youngest patients, there were no significant differences in organ and system toxicities between children diagnosed under the age of 1 and those diagnosed at the age of 1–3. We have shown that this group requires systematic, careful and long-term follow-up.
Context: Loncastuximab tesirine (loncastuximab tesirine-lpyl; Lonca) is a novel antibody-drug conjugate comprising an anti-CD19 monoclonal antibody conjugated to a pyrrolobenzodiazepine dimer toxin, indicated for the treatment of relapsed/refractory diffuse large B-cell lymphoma (R/R DLBCL) after ≥2 systemic treatments. Objective: To characterize the safety and preliminary efficacy of Lonca + rituximab (Lonca-R). Methods: This is a phase 3, randomized, open-label, two-part, two-arm, multicenter study of Lonca-R in patients with R/R DLBCL (NCT04384484). Twenty patients were enrolled in part 1, a nonrandomized safety run-in. In part 2, approximately 330 patients will be randomized 1:1 to receive Lonca-R or rituximab-gemcitabine-oxaliplatin (R-GemOx). Key inclusion criteria include age ≥18 years, diagnosis of DLBCL (including DLBCL transformed from indolent lymphoma) or high-grade B-cell lymphoma with MYC and BCL2 and/or BCL6 rearrangements, ≥1 line of prior systemic therapy, not a candidate for stem cell transplantation, and measurable disease per the 2014 Lugano criteria. All patients in the safety run-in received Lonca 0.15 mg/kg + rituximab 375 mg/m² every 3 weeks (Q3W) for 2 cycles and then Lonca 0.075 mg/kg + rituximab 375 mg/m² Q3W for up to 6 additional cycles. Results: The 20 patients in the safety run-in had a median age of 74.5 years (range 35–93) and had received a median of 1 previous therapy (range 1–6). As of February 28, 2022 (data cutoff), 19 (95%) patients had at least 1 treatment-emergent adverse event (TEAE), and 10 (50%) patients had grade ≥3 TEAEs. The most common all-grade TEAEs, regardless of relationship to study treatment, were rash (5 [25%]), fatigue (4 [20%]), and increased gamma-glutamyltransferase (4 [20%]). The most common grade ≥3 TEAEs were increased gamma-glutamyltransferase (3 [15%]), increased alanine aminotransferase (2 [10%]), and neutropenia (2 [10%]). The overall response rate by central review was 15/20 (75%).
A total of 8/20 (40%) and 7/20 (35%) patients attained complete response and partial response, respectively. Conclusions: Lonca-R demonstrated no new safety signals and showed encouraging antitumor activity in patients with R/R DLBCL. The randomized part of LOTIS-5 commenced in January 2022; recruitment is ongoing. Funding: ADC Therapeutics SA; medical writing: CiTRUS Health Group.
Context: The Bruton tyrosine kinase (BTK) inhibitor zanubrutinib was designed for high BTK specificity and minimal toxicity. SEQUOIA (NCT03336333) is a global, open-label, randomized phase 3 study in treatment-naïve patients with CLL/SLL without del(17p) who were unsuitable for fludarabine/cyclophosphamide/rituximab. Design: Patients were randomized to receive zanubrutinib (160 mg twice daily) or bendamustine (days 1-2: 90 mg/m²) plus rituximab (cycle 1: 375 mg/m²; cycles 2-6: 500 mg/m²) (BR); stratification factors were age (<65 years vs ≥65 years), Binet stage, IGHV mutation status, and geographic region. Main Outcome Measures: The primary endpoint was independent review committee (IRC)-assessed progression-free survival (PFS). Secondary endpoints included investigator-assessed (INV) PFS, overall response rate (ORR), overall survival (OS), and safety. Results: From October 31, 2017, to July 22, 2019, 479 patients were enrolled (zanubrutinib=241; BR=238). Baseline characteristics (zanubrutinib vs BR): median age, 70.0 years versus 70.0 years; unmutated IGHV, 53.4% versus 52.4%; del(11q), 17.8% versus 19.3%. With a median follow-up of 26.2 months, PFS was significantly prolonged with zanubrutinib by IRC (HR 0.42; 2-sided P<.0001) and INV (HR 0.42; 2-sided P=.0001). Zanubrutinib treatment benefit occurred across age, Binet stage, bulky disease, del(11q) status, and unmutated IGHV (HR 0.24; 2-sided P<.0001), but not mutated IGHV (HR 0.67; 2-sided P=.1858). For zanubrutinib versus BR, 24-month PFS-IRC=85.5% versus 69.5%; ORR-IRC=94.6% versus 85.3%; complete response rate=6.6% versus 15.1%; ORR-INV=97.5% versus 88.7%; and 24-month OS=94.3% versus 94.6%. Select adverse event (AE) rates (zanubrutinib vs BR): atrial fibrillation (3.3% vs 2.6%), bleeding (45.0% vs 11.0%), hypertension (14.2% vs 10.6%), infection (62.1% vs 55.9%), and neutropenia (15.8% vs 56.8%).
Treatment discontinuation due to AEs (zanubrutinib vs BR)=20 patients (8.3%) versus 31 patients (13.7%); AEs leading to death=11 patients (4.6%) versus 11 patients (4.8%). No sudden deaths occurred. Conclusions: Zanubrutinib significantly improved PFS-IRC versus BR and was well tolerated, supporting the potential utility of frontline zanubrutinib in treatment-naïve CLL/SLL.
Background: Momelotinib (MMB), an oral JAK1/2 and ACVR1/ALK2 inhibitor, was evaluated versus danazol (DAN) in the pivotal phase 3 MOMENTUM study of myelofibrosis (MF) patients previously treated with a JAK inhibitor (JAKi). This subgroup analysis evaluated MOMENTUM patients with baseline platelet counts ≤150 × 10⁹/L. Methods: Eligibility: Primary or post-ET/PV MF; DIPSS high risk, Int-2, or Int-1; total symptom score (TSS) ≥10; hemoglobin <10 g/dL; prior JAKi ≥90 days, or ≥28 days if RBC transfusions ≥4 units in 8 weeks or grade 3/4 thrombocytopenia, anemia, or hematoma; palpable spleen ≥5 cm; platelets ≥25 × 10⁹/L. JAKi taper/washout ≥21 days. Randomization 2:1 to MMB 200 mg or DAN 600 mg QD (+ placebo) for 24 weeks. Primary endpoint: TSS response (≥50% reduction from baseline) rate at week 24. Secondary endpoints at week 24: transfusion independence (TI) rate, splenic response rate (SRR; ≥25% volume reduction from baseline), TSS change from baseline, SRR (≥35% reduction), and rate of zero transfusions since baseline. Results: Mean baseline TSS: 29 MMB, 26 DAN; hemoglobin: 8.1 MMB, 7.8 DAN g/dL; and platelets: 74 × 10⁹/L MMB, 73 × 10⁹/L DAN. Efficacy results were consistent with the ITT analysis set for MMB vs DAN, respectively: TSS response rate (29.6% vs 11.6%), TI rate (32.1% vs 18.6%), SRR ≥25% (39.5% vs 7.0%), TSS change (-10.7 vs -3.8), SRR ≥35% (22.2% vs 4.7%), and rate of zero transfusions (30.9% vs 11.6%). The most common grade ≥3 TEAEs were thrombocytopenia (MMB, 31%; DAN, 16%) and anemia (MMB, 7%; DAN, 14%); grade ≥3 bleeding events: 9% MMB, 5% DAN. TEAEs leading to study drug discontinuation: 15% MMB, 19% DAN. A trend toward improved overall survival up to week 24 was seen with MMB vs DAN [HR (95% CI)=0.490 (0.195, 1.235)]. Analyses of patients with baseline platelets <100 × 10⁹/L (N=100) and baseline platelets <50 × 10⁹/L (N=31) show similar efficacy, safety, and survival profiles for MMB vs DAN.
Conclusions: In symptomatic, anemic, and thrombocytopenic MF patients, MMB was superior to DAN for symptom responses, transfusion requirements, and spleen responses with comparable safety and favorable survival. MMB may address a critical unmet need in thrombocytopenic MF patients. NCT04173494.
Background: Momelotinib (MMB), a JAK1/2 and ACVR1/ALK2 inhibitor, showed clinical activity in the SIMPLIFY trials in myelofibrosis (MF). The pivotal phase 3 MOMENTUM study of MF patients previously treated with a JAK inhibitor (JAKi) tested MMB versus danazol (DAN) on key symptom, anemia, and splenic endpoints. Methods: Eligibility: Primary or post-essential thrombocythemia (ET)/polycythemia vera (PV) MF; DIPSS High/Int-2/Int-1; MF symptom assessment form total symptom score (TSS) ≥10; hemoglobin <10 g/dL; prior JAKi ≥90 days, or ≥28 days if RBC transfusions ≥4 units in 8 weeks or grade 3/4 thrombocytopenia, anemia, or hematoma; palpable spleen ≥5 cm. Stratification: TSS, palpable spleen size, and RBC units transfused. JAKi taper/washout ≥21 days. Randomization: 2:1 to MMB 200 mg QD + DAN placebo or DAN 600 mg QD + MMB placebo for 24 weeks. Primary endpoint: TSS response (≥50% reduction from baseline) rate at week 24. Secondary endpoints, assessed sequentially at week 24: transfusion independence (TI) rate, splenic response rate (SRR; ≥25% volume reduction from baseline), TSS change from baseline, SRR (≥35% reduction), and rate of zero transfusions since baseline. Results: 94/130 (72%) MMB and 38/65 (58%) DAN patients completed randomized treatment (RT). Mean baseline TSS were 28 (MMB) and 26 (DAN), mean hemoglobin levels were 8.1 (MMB) and 7.9 (DAN) g/dL, and median platelets were 97 (MMB) and 94 (DAN) × 10⁹/L. Baseline TI was 13% (MMB) and 15% (DAN). Prior JAKi was ruxolitinib in 195 (100%) and fedratinib in 9 (5%) patients. All primary and key secondary endpoints were met: TSS response (24.6% vs 9.2%), TI (30.8% vs 20.0%), SRR25 (40.0% vs 6.2%), TSS change (-9.36 vs -3.13), SRR35 (23.1% vs 3.1%), and zero transfusions (35.4% vs 16.9%). The most common grade ≥3 TEAEs in RT were thrombocytopenia (MMB, 22%; DAN, 12%) and anemia (MMB, 8%; DAN, 11%). TEAEs led to study drug discontinuation in 18% of MMB and 23% of DAN patients in RT. A trend toward improved survival up to week 24 was seen with MMB vs DAN (HR=0.506, p=0.0719).
Conclusions: In symptomatic and anemic MF patients, MMB was superior to DAN for symptom responses, transfusion requirements, and spleen responses with comparable safety and favorable survival. MMB may address a critical unmet need, particularly in MF patients with anemia.
Dry needling (DN) is a standard procedure for treating musculoskeletal disorders. However, there are no clear recommendations for using DN in low back pain (LBP). Therefore, this study aimed to assess the effectiveness of a novel DN program for reducing pain intensity and improving functional efficiency in patients with chronic LBP. A group of 40 patients with chronic LBP due to L5-S1 discopathy were eligible and randomized into experimental (n = 20) and control (n = 20) groups. The DN program was performed in the experimental group according to the Five Regulatory Systems (FRS) concept. The control group received sham therapy using placebo needles. DN sessions were performed twice a week for 4 weeks. A single needling application lasted 60 min. Both groups received standard LBP treatment and physical exercise for 1 month. Subjective pain was measured by a visual analog scale (VAS), functional efficiency was assessed with the Oswestry Disability Index (ODI), and the lower spine range of motion was measured with the Schober test. There were significant differences in pain reduction (VAS) in both groups (p < 0.001). The analgesic effect in the DN group was strongest immediately after the therapy, with a reduction of 6.45 points, followed by 6.2 points after 1 month and 6 points after 3 months. The DN group achieved a greater VAS reduction than the control group (p < 0.001). There were significant differences in the functional state (ODI) in the experimental group (p < 0.001): a significant ODI decrease of 18.1 points immediately after the therapy, 18.9 points after 1 month, and 17.6 points after 3 months. No significant differences were found in the control group (p > 0.05). Intergroup differences were observed in functional efficiency (ODI) at all measurement time-points (p < 0.001). There were significant differences in the range of motion (Schober test) in the DN group (main effect: p < 0.001). For all measurements, differences (p < 0.001) were observed in favor of DN compared to the control group.
In conclusion, the DN program based on the FRS concept, supplemented by an exercise program, is a novel treatment method that effectively reduces pain and improves functional efficiency in LBP patients.
Background: The method of recruiting study subjects is an important element of study design and can strongly influence the results: different recruitment schedules can give a different picture of the studied phenomenon. Objectives: The aim of the study was to compare bone health in a group of female patients treated for osteoporosis with a population-based sample. Material and methods: A cohort of women from the GO Study, recruited from 1 outpatient osteoporotic clinic (n = 1442, mean age 65.8 ±6.7 years), and a population-based female sample from the RAC-OST-POL Study (n = 963, mean age 65.8 ±7.5 years) were studied. Mean age did not differ between groups. Mean weight, height and body mass index (BMI) in subjects from the GO Study and the RAC-OST-POL Study were 69.5 ±13.1 kg, 157.8 ±6.1 cm and 27.9 ±5.1 kg/m², and 74.2 ±13.7 kg, 156.0 ±6.0 cm and 30.5 ±5.4 kg/m², respectively, and differed significantly (p < 0.0001 for each variable). Data on clinical risk factors for osteoporosis and fractures were collected. Bone densitometry at the hip was performed using a Prodigy or Lunar DPX device (GE Healthcare, Waukesha, USA). Fracture risk was established using the FRAX, Garvan and POL-RISK calculators. Results: Mean femoral neck T-scores in subjects from the GO Study and the RAC-OST-POL Study were -1.67 ±0.91 and -1.27 ±0.91, respectively, and differed significantly (p < 0.0001). In the GO Study and the RAC-OST-POL Study, there were 518 (35.9%) and 280 (29.1%) subjects with fractures, respectively. The fracture frequency was significantly higher in the GO Study group (p < 0.001). Among clinical risk factors, only rheumatoid arthritis (p < 0.0001), secondary osteoporosis (p < 0.0001) and falls (p < 0.0001) were more frequent in the RAC-OST-POL Study. Fracture risk established using the FRAX, Garvan and POL-RISK calculators was significantly greater in patients enrolled in the GO Study than in subjects from the RAC-OST-POL population-based sample (p < 0.0001 for each variable).
Conclusions: The differences noted between female patients treated for osteoporosis and a population-based sample, especially with regard to fracture risk, reveal a strong influence of recruitment criteria on study results in the field of bone health and osteoporosis.
Introduction Atrial fibrillation (AF) is a prevalent disease contributing considerably to the worldwide cardiovascular burden. For patients at high thromboembolic risk (CHA₂DS₂-VASc ≥ 3) who are not suitable for chronic oral anticoagulation, owing to a history of major bleeding or other contraindications, left atrial appendage occlusion (LAAO) is indicated for stroke prevention, as it lowers the patient's ischaemic burden without intensifying their anticoagulation regimen. Methods and analysis The Stand-Alone Left Atrial appendage occlusion for throMboembolism prevention in nonvalvular Atrial fibrillatioN DiseasE Registry (SALAMANDER) will be conducted in 10 heart surgery and cardiology centres across Poland to assess the outcomes of LAAO performed by a fully thoracoscopic-epicardial, percutaneous-endocardial or hybrid endo-epicardial approach. The registry will include patients with nonvalvular AF at a high risk of thromboembolic and bleeding complications (CHA₂DS₂-VASc score ≥ 2 for males, ≥ 3 for females; HAS-BLED score ≥ 2) referred for LAAO. The first primary outcome is a composite of procedure-related complications, all-cause death or major bleeding at 12 months. The second primary outcome is a composite of ischaemic stroke or systemic embolism at 12 months. The third primary outcome is device-specific success assessed by an independent core laboratory at 3–6 weeks. Quality of life will be assessed based on the EQ-5D-5L questionnaire, and medication and drug adherence will also be assessed. Ethics and dissemination Before enrolment, a detailed explanation is provided by the investigator and patients are given time to make an informed decision. Patients' data will be protected according to the requirements of Polish law, the General Data Protection Regulation (GDPR) and hospital Standard Operating Procedures. The study will be conducted in accordance with the Declaration of Helsinki.
Ethical approval was granted by the local Bioethics Committee of the Upper-Silesian Medical Centre of the Silesian Medical University in Katowice (decision number KNW/0022/KB/284/19). The results will be published in peer-reviewed journals and presented during national and international conferences. Trial registration number: NCT05144958.
Soluble cell adhesion molecules (sCAMs) are involved in the development of neoplastic diseases. sCAMs can block lymphocytes and promote angiogenesis and migration of breast cancer (BC) cells. Interleukin 6 (IL-6) and tumor necrosis factor α (TNF-α) enhance metastatic potential via upregulation of CAMs. We assessed soluble interleukin-6 receptor subunit alpha (sIL-6Rα), sTNF-R1, sTNF-R2, E-selectin, P-selectin, VCAM-1, ICAM-1, and EpCAM in 89 women with stage I-III BC and 28 healthy women. Blood samples were obtained at the beginning of neoadjuvant/induction (N = 49) or adjuvant treatment (N = 40), and after 2 months. Surgery revealed a complete response in 29.4% of patients, a partial response in 67%, and stable disease in 5.9%. The odds of achieving a pathological response were 4 times greater for baseline sIL-6Rα levels > 5.63 ng/mL [odds ratio (OR) = 4.1, 95% confidence interval (CI): 0.8-20.4, P = 0.08] and more than 6 times greater for soluble tumor necrosis factor receptor 1 (sTNF-R1) ≥ 0.97 ng/mL (OR = 6.2, 95% CI: 1.2-32.3, P < 0.05). Compared with the control group, serum sP-selectin, soluble epithelial cell adhesion molecule (sEpCAM), and sTNF-R2 concentrations were significantly higher in patients starting adjuvant therapy (P < 0.05) and preoperative therapy (P < 0.01). Baseline serum sIL-6Rα concentrations were significantly higher in patients before surgery than in patients after tumor resection (P < 0.05), independent of the follow-up time. Baseline serum concentrations of the soluble receptors of IL-6 (sIL-6Rα) and TNF-α (sTNF-R1) have predictive value for preoperative therapy in patients with BC.