ABSTRACT
Sepsis induces immune alterations, which last for months after the resolution of illness. The effect of this immunological reprogramming on the risk of developing cancer is unclear. Here we use a national claims database to show that sepsis survivors had a lower cumulative incidence of cancers than matched nonsevere infection survivors. We identify a chemokine network released from sepsis-trained resident macrophages that triggers tissue residency of T cells via CCR2 and CXCR6 stimulation as the immune mechanism responsible for this decreased risk of de novo tumor development after sepsis cure. While nonseptic inflammation does not provoke this network, laminarin injection could therapeutically reproduce the protective sepsis effect. This chemokine network and CXCR6 tissue-resident T cell accumulation were detected in humans with sepsis and were associated with prolonged survival in humans with cancer. These findings identify a therapeutically relevant antitumor consequence of sepsis-induced trained immunity.
Subject(s)
Macrophages , Neoplasms , Sepsis , Humans , Sepsis/immunology , Macrophages/immunology , Female , Neoplasms/immunology , Neoplasms/therapy , Male , Receptors, CXCR6/metabolism , Animals , T-Lymphocytes/immunology , Receptors, CCR2/metabolism , Middle Aged , Mice , Aged , Chemokines/metabolism , Adult
ABSTRACT
BACKGROUND & AIMS: Beyond cardiovascular disease protection, the health consequences of very low concentrations of low-density lipoprotein-cholesterol (LDL-C) remain a matter of debate. In primary hypobetalipoproteinemia (HBL), liver steatosis and cirrhosis have occasionally been reported. Here, we aimed to investigate the association between HBL and the risk of hepatic complications (cirrhosis complications and/or primary liver cancer) in the general population. METHODS: A cohort study was conducted in the French population-based cohort CONSTANCES. Participants with primary HBL (LDL-C <5th percentile for age and sex, [HBL]) were compared with those with normal LDL-C concentrations (40th-60th percentile, [Control]). Participants on lipid-lowering therapies were excluded. For hepatic complications, follow-up events were compared by calculating the incidence density ratio (IDR). The same analyses were replicated in the UK Biobank (UKBB) cohort. RESULTS: In the CONSTANCES and UKBB cohorts, 34,653 and 94,666 participants were analyzed, with median ages of 45 and 56 years, mean LDL-C concentrations (HBL vs. control) of 71 vs. 128 mg/dl and 86 vs. 142 mg/dl, and mean follow-up durations of 5.0 and 11.5 years, respectively. The HBL group presented a higher incidence of hepatic complications than the control group: 0.32 vs. 0.07 per 1,000 person-years (IDR = 4.50, 95% CI 1.91-10.6) in CONSTANCES, and 0.69 vs. 0.21 per 1,000 person-years (IDR = 3.27, 95% CI 2.63-4.06) in the UKBB. This risk proved to be independent of classic risk factors for liver disease (obesity, alcohol consumption, diabetes, viral hepatitis), including in a 5-year landmark analysis excluding early events. Sensitivity analyses based on apolipoprotein B levels (instead of LDL-C levels) or genetically defined HBL showed similar results. CONCLUSIONS: HBL is associated with a markedly increased risk of hepatic complications. 
HBL must be considered a substantial independent risk factor for liver disease, justifying specific prevention and screening. IMPACT AND IMPLICATIONS: Hypobetalipoproteinemia (HBL) is a lipid disorder characterized by permanently low, inherited levels (below the 5th percentile) of low-density lipoprotein-cholesterol. While HBL is associated with a lower risk of cardiovascular events, some studies suggest that it may be associated with a potential risk of hepatic steatosis and hepatic complications. Here, we studied the association between HBL and hepatic complications (defined as cirrhosis complications and/or primary liver cancer) in two populations of several hundred thousand people, both in France (CONSTANCES cohort) and in the United Kingdom (UKBB). The results show that HBL is associated with a significant and independent excess risk of hepatic complications, including primary liver cancer. Thus, in people with HBL, the value of regular liver monitoring must be studied.
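The incidence density ratios above can be reproduced from event counts and person-years at risk. A minimal Python sketch, with hypothetical counts chosen only to illustrate rates of roughly 0.69 vs. 0.21 per 1,000 person-years (the study's exact counts are not restated here); the confidence interval is the standard Wald interval on the log scale:

```python
import math

def incidence_density_ratio(events_exposed, py_exposed, events_ref, py_ref, z=1.96):
    """Incidence density ratio (IDR) with a Wald confidence interval on the log scale."""
    rate_exposed = events_exposed / py_exposed
    rate_ref = events_ref / py_ref
    idr = rate_exposed / rate_ref
    # The standard error of log(IDR) depends only on the two event counts
    se_log = math.sqrt(1 / events_exposed + 1 / events_ref)
    lower = math.exp(math.log(idr) - z * se_log)
    upper = math.exp(math.log(idr) + z * se_log)
    return idr, lower, upper

# Hypothetical counts giving rates of ~0.69 vs. 0.21 per 1,000 person-years
idr, lower, upper = incidence_density_ratio(69, 100_000, 21, 100_000)
```

Note that person-years enter the point estimate but not the width of the interval, which is driven by the (small) event counts.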
Subject(s)
Cholesterol, LDL , Humans , Female , Male , Middle Aged , Cholesterol, LDL/blood , Adult , France/epidemiology , Risk Factors , Cohort Studies , Liver Neoplasms/epidemiology , Liver Neoplasms/blood , Liver Cirrhosis/epidemiology , Liver Cirrhosis/blood , Liver Cirrhosis/complications , Aged , Incidence
ABSTRACT
BACKGROUND: Long-term outcomes of lung transplantation (LTx) remain hampered by chronic lung allograft dysfunction (CLAD). Matrix metalloproteinase 9 (MMP-9) is a secretory endopeptidase identified as a key mediator in the fibrotic processes associated with CLAD. The objective of this study was to investigate whether plasma MMP-9 levels may be prognostic of CLAD development. METHODS: Participants were selected from the Cohort in Lung Transplantation (COLT), for which an associated biocollection is available. We considered two time points, year 1 (Y1) and year 2 (Y2) post-transplantation, for plasma MMP-9 measurements. We analysed recipients who were stable at these time points, comparing those who developed CLAD within the 2 years following the measurement with those who remained stable over the same period. RESULTS: MMP-9 levels at Y1 were not significantly different between the CLAD and stable groups (230 ng/ml vs. 160 ng/ml, p = 0.4). For the Y2 analysis, 129 recipients were included, of whom 50 developed CLAD and 79 remained stable within the following 2 years. Median plasma MMP-9 concentrations were higher in recipients who later developed CLAD than in the stable group (230 ng/ml vs. 118 ng/ml, p = 0.003). In the multivariate analysis, the Y2 MMP-9 level was independently associated with CLAD, with an average increase of 150 ng/ml (95% CI [0-253], p = 0.05) compared with the stable group. The Y2 ROC curve revealed a discriminating capacity of blood MMP-9 with an area under the curve of 66%. CONCLUSION: Plasma MMP-9 levels measured 2 years after lung transplantation have prognostic value for CLAD.
Subject(s)
Lung Transplantation , Matrix Metalloproteinase 9 , Humans , Prognosis , Allografts , Lung Transplantation/adverse effects , Lung , Biomarkers , Retrospective Studies
ABSTRACT
DYRK1A Syndrome (OMIM #614104) is caused by pathogenic variations in the DYRK1A gene located on 21q22. Haploinsufficiency of DYRK1A causes a syndrome with global psychomotor delay and intellectual disability. Low birth weight, growth restriction with feeding difficulties, short stature, and microcephaly are frequently reported. This study aims to create specific growth charts for individuals with DYRK1A Syndrome and identify parameters for size prognosis. Growth parameters were obtained for 92 individuals with DYRK1A Syndrome (49 males and 43 females). The data were obtained from pediatric records, parent reporting, and the scientific literature. Growth charts for height, weight, body mass index (BMI), and occipitofrontal circumference (OFC) were generated using generalized additive models through the R package gamlss. The growth curves include height, weight, and OFC measurements for patients aged 0-5 years. In accordance with the literature, the charts show that individuals are likely to present with intrauterine growth restriction, low birth weight, and microcephaly. Postnatal growth is then characterized by severe microcephaly, low weight, and short stature. This study proposes growth charts for widespread use in the management of patients with DYRK1A syndrome.
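In clinical use, growth references of this kind are typically summarized by age-specific parameters (in the classical LMS formulation: skewness L, median M, coefficient of variation S, a special case of the GAMLSS family used here), from which an individual measurement is converted to a z-score. A hedged sketch of Cole's LMS formula; the parameter values below are made up for illustration and are not taken from the DYRK1A charts:

```python
import math

def lms_zscore(x, L, M, S):
    """z-score of measurement x given LMS parameters (Cole's method):
    z = ((x/M)**L - 1) / (L*S) when L != 0, else ln(x/M) / S."""
    if abs(L) < 1e-8:
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

# Made-up parameters: a measurement equal to the median M gives z = 0
z_median = lms_zscore(75.0, L=1.0, M=75.0, S=0.035)
```

A z-score of -2 or below on a syndrome-specific chart then flags growth that is atypical even for the condition, which is the practical purpose of such charts.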
Subject(s)
Intellectual Disability , Microcephaly , Male , Female , Child , Humans , Microcephaly/diagnosis , Microcephaly/genetics , Growth Charts , Intellectual Disability/diagnosis , Intellectual Disability/genetics , Syndrome , Body Mass Index , Body Height/genetics
ABSTRACT
OBJECTIVES: Hemolysis is a contributor to cardiac surgery-associated acute kidney injury (CS-AKI). Biochemistry analyzers provide a hemolysis index to quantify in vitro hemolysis, a condition that can, for example, affect the accuracy of potassium concentration measurements. We aimed to assess whether the postoperative plasma level of the hemolysis index (HIpostoperative) could aid the early recognition of patients at risk of CS-AKI, and to evaluate other hemolysis indicators: plasma carboxyhemoglobin (COHbpostoperative) and methemoglobin (MetHbpostoperative). DESIGN: One-year retrospective study. SETTING: University hospital. PARTICIPANTS: Patients undergoing elective cardiac surgery. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: In 1090 patients, the median HIpostoperative was higher in patients who developed CS-AKI than in patients who did not (11 mg/dL [interquartile range (IQR), 5-38 mg/dL] vs. 7 mg/dL [IQR, 3-16 mg/dL]; p < 0.001). HIpostoperative refined the early recognition of CS-AKI: the area under the precision-recall curve (AUPRC) for HIpostoperative was 37% (95% confidence interval [CI], 31%-42%), whereas the AUPRC associated with no discriminative power, equal to the prevalence of CS-AKI in the whole population, was 21%. Among the 611 patients with measurements for all 3 biomarkers, the AUPRC of HIpostoperative was higher than that of COHbpostoperative or MetHbpostoperative (+6.6% and +7.4%, respectively; p < 0.0001 for both). Unlike COHbpostoperative and MetHbpostoperative, the incorporation of HIpostoperative into a model of early CS-AKI recognition (trained on one sample, then validated in another) significantly enhanced its performance, with a +1.9% (95% CI, 1.6%-2.1%) increase in AUPRC (p < 0.0001). CONCLUSIONS: An elevated HIpostoperative represents an early alert signal for the development of CS-AKI. Our findings support the incorporation of HIpostoperative, a readily available biomarker, into predictive scores for CS-AKI.
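The AUPRC comparison above relies on the fact that a no-skill classifier's area under the precision-recall curve equals the event prevalence (here 21%), unlike the ROC AUC, whose no-skill value is always 0.5. A small, dependency-free sketch of average precision, the step-wise AUPRC estimator; the labels and scores below are invented for illustration:

```python
def average_precision(y_true, scores):
    """Step-wise area under the precision-recall curve (average precision)."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    tp = fp = 0
    ap = 0.0
    for i in order:
        if y_true[i]:
            tp += 1
            ap += tp / (tp + fp)  # precision at each new recall step
        else:
            fp += 1
    return ap / sum(y_true)

# Invented labels/scores: ranking all positives first yields the maximum AUPRC of 1.0
perfect = average_precision([1, 0, 0, 0, 1], [0.9, 0.4, 0.3, 0.2, 0.8])
```

For a random score, the expected precision at every threshold is the prevalence, which is why prevalence is the appropriate no-skill reference quoted in the abstract.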
ABSTRACT
BACKGROUND: The prognostication of long-term functional outcomes remains challenging in patients with traumatic brain injury (TBI). Our aim was to demonstrate that intensive care unit (ICU) variables do not efficiently predict the 6-month functional outcome in survivors of moderate to severe TBI (msTBI) but are mostly associated with mortality, which leads to a mortality bias in models predicting a composite outcome of mortality and severe disability. METHODS: We analyzed data from the multicenter randomized controlled Continuous Hyperosmolar Therapy in Traumatic Brain-Injured Patients trial and developed predictive models using machine learning methods, baseline characteristics, and predictors collected during the ICU stay. We compared our models' predictions of the 6-month binary Glasgow Outcome Scale extended (GOS-E) score in all patients with msTBI (unfavorable GOS-E 1-4 vs. favorable GOS-E 5-8) with predictions of mortality (GOS-E 1 vs. GOS-E 2-8) and of the binary functional outcome in survivors with msTBI (severe disability GOS-E 2-4 vs. moderate to no disability GOS-E 5-8). We investigated the link between ICU variables and long-term functional outcomes in survivors with msTBI using predictive modeling and factor analysis of mixed data, and validated our hypotheses on the International Mission for Prognosis and Analysis of Clinical Trials in TBI (IMPACT) model. RESULTS: Based on data from 370 patients with msTBI and classically used ICU variables, the prediction of the 6-month outcome in survivors was inefficient (mean area under the receiver operating characteristic curve, 0.52). Using the factor analysis of mixed data graph, we demonstrated that high-variance ICU variables were not associated with outcome in survivors with msTBI (p = 0.15 for dimension 1, p = 0.53 for dimension 2) but mostly with mortality (p < 0.001 for dimension 1), leading to a mortality bias in models predicting a composite outcome of mortality and severe disability. 
Finally, we identified this mortality bias in the IMPACT model. CONCLUSIONS: Using machine learning-based predictive models, we demonstrated that classically used ICU variables are strongly associated with mortality but not with the 6-month outcome in survivors with msTBI, leading to a mortality bias when predicting a composite outcome of mortality and severe disability.
ABSTRACT
INTRODUCTION: In cases of sudden unexpected death in infancy (SUDI), eye examination is systematically performed to detect retinal hemorrhages (RH), a crucial hallmark of abusive head trauma (AHT). The aim of this study was to assess the ability of non-invasive post-mortem fundus photographs (PMFP) to detect RH in cases of SUDI. METHODS: Bicentric retrospective analysis of consecutive cases of SUDI in children under 2 years of age managed by two French SUDI referral centers, with PMFP acquired by RetCam (Clarity Medical Systems, USA). PMFP were reviewed twice, in random order, by three independent ophthalmologists blinded to clinical data. RESULTS: Thirty cases (60 eyes) were included. Median age was 3.5 months (interquartile range [1.6; 6.0]). No child died of AHT. Image quality was sufficient to assert the presence or absence of RH in 50 eyes (83%). The sufficient-quality rate was significantly higher when the post-mortem interval was under 18 h (91%, 42/46) than when it was over 18 h (57%, 8/14, p = 0.0096). RH were found in six eyes (10%) of four children (13%), with excellent inter- and intra-rater concordance (Cohen's kappa from 0.81 [0.56-1.00] to 1.00 [1.00-1.00]). CONCLUSION: PMFP can detect RH in cases of SUDI and is a relevant systematic screening test to be performed as soon as the deceased child arrives at the hospital. It can decrease the need for eye removal for pathological examination, but further studies are needed to define the best decision algorithm.
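The agreement statistic reported above, Cohen's kappa, corrects raw inter-rater agreement for the agreement expected by chance. A minimal sketch with hypothetical per-eye readings (1 = RH present, 0 = absent), not the study's actual ratings:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    labels = set(rater1) | set(rater2)
    # Chance agreement from each rater's marginal label frequencies
    expected = sum((rater1.count(c) / n) * (rater2.count(c) / n) for c in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical readings from two review rounds of the same six eyes
kappa = cohens_kappa([1, 1, 0, 0, 0, 0], [1, 1, 0, 0, 0, 0])
```

Values of 0.81-1.00, as in this study, are conventionally read as almost perfect agreement.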
Subject(s)
Craniocerebral Trauma , Sudden Infant Death , Infant , Humans , Retinal Hemorrhage , Retrospective Studies , Autopsy , Sudden Infant Death/pathology , Craniocerebral Trauma/diagnosis
ABSTRACT
BACKGROUND: Advanced non-small cell lung cancer (NSCLC) with a PD-L1 tumour proportion score ≥ 50% can be treated with pembrolizumab alone. Our aim was to assess the impact of baseline tumour size (BTS) on overall survival (OS) in NSCLC patients treated with pembrolizumab versus chemotherapy. METHODS: This retrospective, multicentre study included all patients with untreated advanced NSCLC receiving either pembrolizumab (PD-L1 ≥ 50%) or platinum-based chemotherapy (any PD-L1). The primary endpoint was the impact of BTS (defined as the sum of the dimensions of baseline target lesions according to RECIST v1.1 criteria) on OS. RESULTS: Between September 2016 and June 2020, 188 patients were included: 96 in the pembrolizumab group (P-group) and 92 in the chemotherapy group (CT-group). The median follow-up was 26.9 months (range 0.13-37.91) and 44.4 months (range 0.23-48.62), respectively, while the median BTS was similar, 85.5 mm (IQR 57.2-113.2) and 86.0 mm (IQR 53.0-108.5), respectively (p = 0.42). The median P-group OS was 18.2 months [95% CI 12.2-not reached (NR)] for BTS > 86 mm versus NR (95% CI 27.2-NR) for BTS ≤ 86 mm (p = 0.0026). A high BTS was associated with a shorter OS in univariate analyses (p = 0.009) as well as after adjustment for confounding factors (HR 2.16, [95% CI 1.01-4.65], p = 0.048). The CT-group OS was not statistically different between low and high BTS patients in univariate and multivariate analyses (p = 0.411). CONCLUSIONS: After adjustment for major baseline clinical prognostic factors, BTS was an independent prognostic factor for OS in PD-L1 ≥ 50% advanced NSCLC patients treated first-line with pembrolizumab.
Subject(s)
Carcinoma, Non-Small-Cell Lung , Lung Neoplasms , Antibodies, Monoclonal, Humanized , B7-H1 Antigen/therapeutic use , Carcinoma, Non-Small-Cell Lung/pathology , Humans , Lung Neoplasms/pathology , Prognosis , Retrospective Studies
ABSTRACT
BACKGROUND: Heart failure (HF) is a growing complication and one of the leading causes of mortality in people living with type 2 diabetes (T2D). Among the possible causes, excessive red meat consumption and insufficient vegetable consumption are suspected. Such a diet is associated with nutritional biomarkers, including trimethylamine N-oxide (TMAO) and its precursors. Here, we aimed to study these biomarkers as potential prognostic factors for HF in patients with T2D. METHODS: We used the SURDIAGENE (SURvival DIAbetes and GENEtics) study, a large, prospective, monocentric cohort study that included 1468 patients with T2D between 2001 and 2012. TMAO and its precursors (trimethylamine [TMA], betaine, choline, and carnitine) as well as sulfur-containing amino acids (cysteine, homocysteine and methionine) were measured by liquid chromatography-tandem mass spectrometry. The main outcome was HF requiring hospitalization (HFrH), defined as the first occurrence of acute HF leading to hospitalization and/or death, established by an adjudication committee based on hospital records until 31 December 2015. The secondary outcomes were the composite event of HFrH and/or cardiovascular death, and all-cause death. The association between the biomarkers and the outcomes was studied using cause-specific hazard models, adjusted for age, sex, history of coronary artery disease, NT-proBNP, CKD-EPI-derived eGFR and the urine albumin/creatinine ratio. Hazard ratios (HR) are expressed per one standard deviation. RESULTS: The data of interest were available for 1349/1468 SURDIAGENE participants (91.9%), including 569 (42.2%) women, with a mean age of 64.3 ± 10.7 years and a median follow-up of 7.3 years [25th-75th percentile, 4.7-10.8]. HFrH was reported in 209 patients (15.5%), HFrH and/or cardiovascular death in 341 (25.3%), and all-cause death in 447 (33.1%). 
In unadjusted hazard models, carnitine (HR = 1.20, 95% CI [1.05; 1.37]), betaine (HR = 1.34, [1.20; 1.50]), choline (HR = 1.35, [1.20; 1.52]), TMAO (HR = 1.32, [1.16; 1.50]), cysteine (HR = 1.38, [1.21; 1.58]) and homocysteine (HR = 1.28, [1.17; 1.39]) were associated with HFrH, but TMA and methionine were not. In the fully adjusted models, none of these associations remained significant, either for HFrH or for HFrH and/or CV death, whereas only homocysteine was positively associated with all-cause death (HR = 1.16, [1.06; 1.27]). CONCLUSIONS: TMAO and its precursors do not appear to be substantial prognostic factors for HFrH beyond the usual cardiac- and kidney-related risk factors, whereas homocysteine is an independent risk factor for all-cause death in patients with T2D.
Subject(s)
Diabetes Mellitus, Type 2 , Heart Failure , Aged , Betaine , Biomarkers , Carnitine , Choline , Cohort Studies , Cysteine , Diabetes Mellitus, Type 2/diagnosis , Female , Heart Failure/diagnosis , Homocysteine , Hospitalization , Humans , Male , Methionine , Middle Aged , Prospective Studies , Risk Factors
ABSTRACT
PURPOSE: Parathyroidectomy to treat tertiary hyperparathyroidism (THPT) is now on a par with calcimimetic treatment. The effects of cinacalcet and parathyroidectomy on kidney transplant function remain controversial. The aim of this study was to evaluate kidney transplant function in THPT patients treated by parathyroidectomy, treated by cinacalcet, or left untreated. METHODS: Between 2009 and 2019, 231 patients with functional grafts presenting with THPT, defined either by serum calcium above 2.5 mmol/L with an elevated PTH level or by hypercalcaemia with a non-suppressed PTH level 1 year after kidney transplantation, were included. Hyperparathyroid patients treated by cinacalcet and parathyroidectomy were matched for age, sex, graft rank, and baseline eGFR with cinacalcet-only and untreated patients. Conditional logistic regression models were used to compare eGFR variations 1 year after parathyroidectomy between operated patients and matched controls. Five-year survivals were compared with the Mantel-Cox test. RESULTS: Eleven patients treated with parathyroidectomy and cinacalcet were matched with 16 patients treated by cinacalcet only and 29 untreated patients. Demographic characteristics were comparable between groups. Estimated odds ratios for eGFR evolution in operated patients compared with cinacalcet-only and untreated patients were 0.92 [95% CI 0.83-1.02] and 0.99 [0.89-1.10], respectively, indicating no significant impairment of eGFR 1 year after surgery. Five-year allograft survival was not significantly impaired in operated patients. CONCLUSIONS: Parathyroidectomy did not appear to substantially alter or improve graft function 1 year after surgery, nor 5-year allograft survival. It can be hypothesized that, in addition to its known benefits, parathyroidectomy can be safely performed with respect to graft function in tertiary hyperparathyroidism.
Subject(s)
Hypercalcemia , Hyperparathyroidism, Secondary , Hyperparathyroidism , Kidney Transplantation , Calcimimetic Agents/therapeutic use , Calcium , Cinacalcet/therapeutic use , Humans , Hyperparathyroidism/etiology , Hyperparathyroidism/surgery , Hyperparathyroidism, Secondary/surgery , Kidney , Kidney Transplantation/adverse effects , Parathyroid Hormone , Parathyroidectomy
ABSTRACT
BACKGROUND: Bronchoalveolar lavage (BAL) is a major diagnostic tool in interstitial lung disease (ILD). Its use remains largely quantitative, usually focused on the cell differential ratio. However, cellular morphological features provide additional valuable information. The significance of the "immune alveolitis" cytological profile, characterized by lymphocytic alveolitis with activated lymphocytes and macrophages in epithelioid transformation or foamy macrophages desquamating in cohesive clusters with lymphocytes, remains unknown in ILD. Our objective was to describe the patient characteristics and diagnoses associated with an immune alveolitis profile in undiagnosed ILD. METHODS: We performed a monocentric retrospective observational study. Eligible patients were adults undergoing diagnostic exploration for ILD whose BAL fluid displayed an immune alveolitis profile. For each patient, we collected clinical, radiological and biological findings as well as the final etiology of the ILD. RESULTS: Between January 2012 and December 2018, 249 patients were included. Mean age was 57 ± 16 years; 140 patients (56%) were men, and 65% of patients were immunocompromised. The main etiological diagnosis was Pneumocystis pneumonia (PCP) (24%), followed by drug-induced lung disease (DILD) (20%), viral pneumonia (14%) and hypersensitivity pneumonitis (HP) (10%). All PCP cases were diagnosed in immunocompromised patients, while HP was found in only 8% of this subgroup. DILD and viral pneumonia were also commonly diagnosed in immunocompromised patients (94% and 80%, respectively). CONCLUSION: Our study highlights the additional value of qualitative BAL description in ILD. We suggest incorporating the immune alveolitis profile into the diagnosis and management of ILD, especially in immunocompromised patients, since it points towards specific diagnoses.
Subject(s)
Immunocompromised Host , Lung Diseases, Interstitial/complications , Lung Diseases, Interstitial/immunology , Pulmonary Alveoli , Adult , Aged , Female , Humans , Lung Diseases, Interstitial/pathology , Male , Middle Aged , Pulmonary Alveoli/pathology , Retrospective Studies
ABSTRACT
AIMS: Risk stratification of sudden cardiac arrest (SCA) in Brugada syndrome (BrS) remains the main challenge for physicians. Several scores have been suggested to improve risk stratification but have never been replicated. We aimed to investigate the accuracy of the BrS risk scores. METHODS AND RESULTS: A total of 1613 patients [mean age 45 ± 15 years, 69% male, 323 (20%) symptomatic] were prospectively enrolled from 1993 to 2016 in a multicentric database. All data described in the risk scores were double-reviewed for the study. All patients were evaluated with the Shanghai score and 461 (29%) with the Sieira score. After a mean follow-up of 6.5 ± 4.7 years, an arrhythmic event occurred in 75 (5%) patients, including 16 SCA, 11 symptomatic ventricular arrhythmias, and 48 appropriate therapies. The predictive capacity of the Shanghai score (n = 1613) and the Sieira score (n = 461) was estimated by an area under the curve of 0.73 (0.67-0.79) and 0.71 (0.61-0.81), respectively. For the Sieira score, the event rate at 10 years was significantly higher with a score of 5 (26.4%) than with a score of 0 (0.9%) or 1 (1.1%) (P < 0.01). No statistical difference was found in intermediate-risk patients (score 2-4). The Shanghai score did not allow better stratification of the risk of SCA. CONCLUSIONS: In the largest cohort of BrS patients described to date, risk scores do not allow stratification of the risk of arrhythmic events in intermediate-risk patients.
Subject(s)
Brugada Syndrome , Defibrillators, Implantable , Adult , Brugada Syndrome/complications , China , Death, Sudden, Cardiac/epidemiology , Death, Sudden, Cardiac/etiology , Electrocardiography , Female , Humans , Male , Middle Aged , Risk Assessment
ABSTRACT
OBJECTIVES: After subarachnoid hemorrhage (SAH), potential renal insults are numerous, but the burden of early acute kidney injury (AKI) is unclear. We determined its incidence, rate of persistence, risk factors, and impact on patient outcomes. MATERIALS AND METHODS: Patients with non-traumatic SAH were retrospectively included if they underwent catheter angiography within 48 h of admission to the intensive care unit. Early AKI was defined according to Kidney Disease: Improving Global Outcomes (KDIGO) criteria, analyzed from the time of catheter angiography. Early AKI was considered persistent if the KDIGO stage did not decrease between the 48th and the 60th hour. RESULTS: Among 499 consecutive patients, early AKI (mostly oliguria) occurred in 132 (26%): stage 1, 2 and 3 in 72 (14%), 44 (9%), and 16 (3%) patients, respectively. It persisted in 36% of cases. Early AKI was more likely when the SAH was severe or renal function was impaired at hospital admission: adjusted odds ratios of 2.76 [95% CI 1.77-4.30] and 3.32 [1.17-9.46], respectively. ICU and hospital lengths of stay were longer in patients who developed early AKI than in patients who did not: 16 [9-29] versus 12 [4-24] days (p = 0.0003) and 21 [14-43] versus 16 [11-32] days (p = 0.007), respectively. Early AKI was independently associated with renal outcome (n = 274 in the model) but not with hospital mortality (n = 453). CONCLUSIONS: One quarter of our population developed early AKI, mostly oliguria. It persisted beyond the 48th hour in one third of cases. The associated risk factors we identified were non-modifiable.
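KDIGO staging, as used above, grades AKI severity from the rise in serum creatinine and from urine output. A simplified sketch of the creatinine arm only (thresholds are the standard KDIGO values; the urine-output criteria, by which most events in this study were detected, are deliberately omitted here):

```python
def kdigo_stage_creatinine(scr, baseline, rise_48h=None, on_rrt=False):
    """Simplified KDIGO AKI stage from serum creatinine (mg/dL).
    Urine-output criteria are omitted in this sketch."""
    ratio = scr / baseline
    if on_rrt or ratio >= 3.0 or scr >= 4.0:
        return 3  # >=3.0x baseline, SCr >=4.0 mg/dL, or renal replacement therapy
    if ratio >= 2.0:
        return 2  # 2.0-2.9x baseline
    if ratio >= 1.5 or (rise_48h is not None and rise_48h >= 0.3):
        return 1  # 1.5-1.9x baseline, or >=0.3 mg/dL rise within 48 h
    return 0      # no AKI by the creatinine criteria
```

In practice the final stage is the worse of the creatinine and urine-output criteria, which is why a purely creatinine-based sketch would underestimate the oliguria-dominated incidence reported here.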
Subject(s)
Acute Kidney Injury , Oliguria , Acute Kidney Injury/diagnosis , Acute Kidney Injury/epidemiology , Acute Kidney Injury/etiology , Angiography/adverse effects , Catheters/adverse effects , Humans , Incidence , Intensive Care Units , Oliguria/complications , Retrospective Studies , Risk Factors
ABSTRACT
AIM: To investigate the association between routine use of dipeptidyl peptidase-4 (DPP-4) inhibitors and the severity of coronavirus disease 2019 (COVID-19) in patients with type 2 diabetes in a large multicentric study. MATERIALS AND METHODS: This study was a secondary analysis of the CORONADO study of 2449 patients with type 2 diabetes (T2D) hospitalized for COVID-19 in 68 French centres. The composite primary endpoint combined tracheal intubation for mechanical ventilation and death within 7 days of admission. Stabilized weights were computed for patients based on the propensity score (DPP-4 inhibitor users vs. non-users) and were used in multivariable logistic regression models to estimate the average treatment effect in the treated via inverse probability of treatment weighting (IPTW). RESULTS: Five hundred and ninety-six participants were taking DPP-4 inhibitors before admission to hospital (24.3%). The primary outcome occurred at similar rates in users and non-users of DPP-4 inhibitors (27.7% vs. 28.6%; p = .68). In the propensity analysis, the IPTW-adjusted models showed no significant association between the use of DPP-4 inhibitors and the primary outcome by Day 7 (OR [95% CI]: 0.95 [0.77-1.17]) or Day 28 (OR [95% CI]: 0.96 [0.78-1.17]). Similarly neutral findings were observed for the associations between use of DPP-4 inhibitors and the risks of tracheal intubation and death. CONCLUSIONS: These data support the safety of DPP-4 inhibitors for diabetes management during the COVID-19 pandemic and suggest that they should not be discontinued.
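Stabilized IPTW weights of the kind described divide the marginal probability of the treatment actually received by the individual's propensity score for that treatment, which balances measured confounders while keeping the weighted sample size close to the original. A minimal sketch (the propensity scores below are arbitrary illustrative values, not CORONADO data):

```python
def stabilized_weights(treated, propensity):
    """Stabilized IPTW weights: marginal probability of the received treatment
    divided by the individual probability of that treatment (propensity score)."""
    p_treated = sum(treated) / len(treated)
    return [
        p_treated / ps if t else (1 - p_treated) / (1 - ps)
        for t, ps in zip(treated, propensity)
    ]

# Arbitrary illustrative propensity scores for two treated and two untreated patients
weights = stabilized_weights([1, 1, 0, 0], [0.8, 0.5, 0.5, 0.2])
```

A patient whose propensity score equals the marginal treatment prevalence receives a weight of exactly 1; unstabilized weights (1/ps and 1/(1-ps)) would instead inflate the effective sample size.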
Subject(s)
COVID-19 Drug Treatment , COVID-19 , Diabetes Mellitus, Type 2 , Dipeptidyl-Peptidase IV Inhibitors , Aged , Aged, 80 and over , Angiotensin Receptor Antagonists , Angiotensin-Converting Enzyme Inhibitors , COVID-19/complications , Diabetes Mellitus, Type 2/complications , Diabetes Mellitus, Type 2/drug therapy , Diabetes Mellitus, Type 2/epidemiology , Dipeptidyl-Peptidase IV Inhibitors/adverse effects , Dipeptidyl-Peptidase IV Inhibitors/therapeutic use , Female , Humans , Male , Middle Aged , Pandemics , Prognosis , Propensity Score
ABSTRACT
BACKGROUND: Influenza generates a significant societal impact in terms of morbidity, mortality, and associated costs. The study objective was to identify factors associated with influenza-like-illness (ILI) episodes during seasonal influenza epidemics among the general population. METHODS: A prospective study was conducted with the GrippeNet.fr crowdsourced cohort between 2012/13 and 2017/18. After completing a yearly profile survey detailing socio-demographic, lifestyle and health characteristics, participants reported weekly data on symptoms. Factors associated with at least one ILI episode per influenza epidemic, using the European Centre for Disease Prevention and Control case definition, were analyzed through a conditional logistic regression model. RESULTS: From 2012/13 to 2017/18, 6992 individuals participated at least once, 61% of whom were women (n = 4258). Between 11% (n = 469/4140 in 2013/14) and 29% (n = 866/2943 in 2012/13) of individuals experienced at least one ILI during an influenza epidemic. Factors associated with a higher risk of ILI were: female sex (OR = 1.29, 95% CI [1.20; 1.40]), young age (< 5 years old: 3.12 [2.05; 4.68]; 5 to 14 years old: 1.53 [1.17; 2.00]), respiratory allergies (1.27 [1.18; 1.37]), receiving treatment for a chronic disease (1.20 [1.09; 1.32]), being overweight (1.18 [1.08; 1.29]) or obese (1.28 [1.14; 1.44]), using public transport (1.17 [1.07; 1.29]) and having contact with pets (1.18 [1.09; 1.27]). Older age (≥ 75 years old: 0.70 [0.56; 0.87]) and influenza vaccination (0.91 [0.84; 0.99]) were protective factors for ILI. CONCLUSIONS: This analysis of ILI risk factors confirms and extends the list of factors observed through traditional surveillance systems. It indicates that crowdsourced cohorts are effective for studying ILI determinants at the population level. 
These findings could be used to adapt influenza prevention messages at the population level to reduce the spread of the disease.
Subject(s)
Crowdsourcing , Influenza, Human/epidemiology , Adolescent , Adult , Aged , Child , Child, Preschool , Cohort Studies , Female , Humans , Infant , Infant, Newborn , Male , Middle Aged , Risk Factors , Young Adult
ABSTRACT
BACKGROUND: Vaccination policy in France was previously characterised by the coexistence of eight recommended and three mandatory vaccinations for children younger than 2 years old. These 11 vaccines are now mandatory for all children born after 1 January 2018. AIM: To study the French population's opinion about this new policy and to assess factors associated with a positive opinion during this transition phase. METHODS: A cross-sectional survey about vaccination was conducted from 16 November to 19 December 2017 among the GrippeNet.fr cohort. Data were weighted for age, sex and education according to the French population. Univariate and multivariate analyses were performed to identify factors associated with a favourable opinion on the extension of mandatory vaccines, as defined in the '3Cs' model by the World Health Organization Strategic Advisory Group of Experts working group on vaccine hesitancy. RESULTS: Of the 3,222 participants (response rate 50.5%), after adjustment, 64.5% agreed with the extension of mandatory vaccines. It was considered a necessary step by 68.7% of the study population, while 33.8% considered it unsafe for children and 56.9% saw it as authoritarian. Factors associated with a positive opinion about the extension of mandatory vaccines were components of the confidence, complacency and convenience dimensions of the '3Cs' model. CONCLUSIONS: In our sample, two thirds of the French population were in favour of the extension of mandatory vaccines for children. Perceptions of vaccine safety and benefits were major predictors of positive and negative opinions about this new policy.
Subject(s)
Health Knowledge, Attitudes, Practice , Immunization Programs , Mandatory Programs , Vaccination Refusal/psychology , Vaccination/psychology , Adolescent , Adult , Aged , Aged, 80 and over , Child , Cross-Sectional Studies , Female , France , Health Policy , Humans , Male , Middle Aged , Principal Component Analysis , Residence Characteristics , Vaccination/legislation & jurisprudence , Vaccines , Young Adult
ABSTRACT
Purpose: To assess the efficacy of the gelatin torpedoes embolization technique after percutaneous radiofrequency ablation (PRFA) of lung neoplastic lesions in reducing chest tube placement rate and hospital length of stay, and to assess the safety of this embolization technique. Materials and methods: A total of 114 PRFA procedures for lung neoplastic lesions performed in two centers between January 2017 and December 2022 were retrospectively reviewed. Two groups were compared: 42 procedures with the gelatin torpedoes embolization technique (gelatin group) and 72 procedures without (control group). Procedures were performed by one of seven interventional radiologists using the LeVeen CoAccess™ probe. Multivariate analyses were performed to identify risk factors for chest tube placement and hospital length of stay. Results: The chest tube placement rate was significantly lower in the gelatin group than in the control group (3 [7.1%] vs. 27 [37.5%], p < 0.001). Multivariate analysis showed a significant association between chest tube placement and the gelatin torpedoes embolization technique (OR: 0.09; 95% CI: 0.02-0.32; p = 0.0006). No significant difference was found in hospital length of stay between the two groups, and multivariate analysis did not show a significant relationship between hospital length of stay and the gelatin torpedoes embolization technique. No embolic complication occurred in the gelatin group. Conclusion: The gelatin torpedoes embolization technique after PRFA of lung neoplastic lesions significantly reduced the chest tube placement rate in our patient population. No significant reduction in hospital length of stay was observed, and no major complication occurred in the gelatin group.
ABSTRACT
BACKGROUND: First-line chemotherapy plus immunotherapy (CT-IO) has recently demonstrated survival benefits over CT alone in extensive-stage small-cell lung cancer (ES-SCLC), based on randomized phase III studies. This retrospective multicenter study assessed the real-world use and effectiveness of CT-IO in ES-SCLC patients. PATIENTS AND METHODS: All newly diagnosed ES-SCLC patients from 4 French hospitals treated with CT alone or CT-IO between May 2020 and December 2021 were included. Overall survival (OS) and real-world progression-free survival (rwPFS) were estimated using the Kaplan-Meier method. Cox proportional hazards models were used to estimate hazard ratios (HRs) with 95% confidence intervals (CIs) in univariate and multivariate models. The aim was not to compare efficacy between groups. RESULTS: Among 104 patients, 75 (72.1%) received CT-IO. Brain metastases were diagnosed in 28.3% of patients, and 29.8% had a performance status (PS) ≥ 2. At a median follow-up of 16.8 months (95%CI, 14.9-23.4), the median OS was 11.4 months (95%CI, 7.7-14.7) in the CT-IO group, and the 12-month OS rate was 43.6% (95%CI, 33.3-57.2). In the CT group, the median OS was 7.8 months (95%CI, 5.4-11.8) and the 12-month OS rate was 15.3% (95%CI, 5.7-41.0). In multivariate analyses, baseline brain and liver metastases were associated with shorter OS for patients treated in the CT-IO group (HR, 3.80 [95%CI, 1.90-7.60] and 3.12 [95%CI, 1.60-6.08], respectively; P < 0.001 for both). CONCLUSION: We showed that clinicians have chosen to use IO beyond the specific criteria defined in guidelines. Survival data appeared promising, with a median OS comparable to that previously reported in clinical trials.
Subject(s)
Brain Neoplasms , Lung Neoplasms , Small Cell Lung Carcinoma , Humans , Lung Neoplasms/drug therapy , Small Cell Lung Carcinoma/drug therapy , Brain , Immunotherapy
ABSTRACT
BACKGROUND: The 30-day readmission rate provides a standardised quantitative evaluation of some postoperative complications. It is widely used worldwide in many medical and surgical specialities, and the World Health Organization recommends its use for monitoring healthcare system performance. In ophthalmology, its measurement is biased by planned surgery frequently being performed on one eye and then, shortly afterwards, on the other, particularly in the case of cataract surgery. This study measures the unplanned 30-day readmission rate in ophthalmology, globally and by surgery subtype, and describes the causes of readmission. METHODS: All patients readmitted within 30 days of ophthalmic surgery at Nantes University Hospital between January 2017 and December 2020 were identified in the Medical Information System. An ophthalmologist examined each medical record and collected the following data: the reason for readmission, comorbidities, the pathology treated, surgery type, surgery duration, the surgeon's experience, anaesthesia type, severity and readmission morbidity. RESULTS: For the 8522 ophthalmic surgeries performed in the four-year study period, 282 unplanned 30-day readmissions were identified. The overall unplanned 30-day readmission rate was 2.07% for elective surgery, with high variability depending on the surgery type: 0.95% for phacoemulsification, 4.95% for vitreoretinal surgery (3.42% for non-elective vitreoretinal surgery, 5.44% for retinal detachment surgery), 5.66% for deep lamellar keratoplasty and 11.90% for trabeculectomy. The unplanned 30-day readmission rate for ocular trauma surgery (emergency care) was 11.0%. Seven percent of all unplanned 30-day readmissions were not associated with an ophthalmological problem. CONCLUSIONS: This study is the first to report unplanned 30-day readmission in ophthalmology, globally and by surgical subtype, for elective and urgent procedures. 
This indicator can be used longitudinally to detect an increase in risk, or cross-sectionally to compare the quality of care between different public or private hospitals.