Results 1 - 20 of 9,372
1.
Genet Epidemiol ; 2024 Sep 30.
Article in English | MEDLINE | ID: mdl-39350332

ABSTRACT

Most genome-wide association studies are based on case-control designs, which provide abundant resources for secondary phenotype analyses. However, such studies suffer from biased sampling of primary phenotypes, and traditional statistical methods can produce seriously distorted results when applied to secondary phenotypes without accounting for the biased sampling mechanism. To our knowledge, there are no statistical methods specifically tailored for rare variant association analysis with secondary phenotypes. In this article, we propose two novel joint test statistics for identifying secondary-phenotype-associated rare variants, based on a prospective likelihood and a retrospective likelihood, respectively. We also exploit the assumption of gene-environment independence in the retrospective likelihood to improve statistical power and adopt a two-step strategy to balance statistical power and robustness. Simulations and a real-data application demonstrate the superior performance of the proposed methods.
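
A rough illustration of the prospective-likelihood idea only, not the authors' proposed joint statistics: collapse rare variants into a burden score and regress the secondary phenotype on that burden while adjusting for the primary case-control phenotype. All data, names, and effect sizes below are simulated and hypothetical.

```python
# Hypothetical sketch: burden-style prospective analysis of a secondary
# (quantitative) phenotype in case-control data. NOT the paper's joint test.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, m = 2000, 20                                    # subjects, rare variants
G = rng.binomial(1, 0.01, size=(n, m))             # rare genotype indicators
burden = G.sum(axis=1)                             # collapsed burden score
primary = rng.binomial(1, 0.5, size=n)             # case-control (primary) phenotype
secondary = 0.2 * burden + 0.5 * primary + rng.normal(size=n)

X = sm.add_constant(np.column_stack([burden, primary]))
fit = sm.OLS(secondary, X).fit()
print(fit.params[1], fit.pvalues[1])               # burden effect and its p-value
```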

2.
J Neurosci ; 43(43): 7186-7197, 2023 10 25.
Article in English | MEDLINE | ID: mdl-37704373

ABSTRACT

Across species, neurons track time over the course of seconds to minutes, which may feed the sense of time passing. Here, we asked whether neural signatures of time-tracking could be found in humans. Participants stayed quietly awake for a few minutes while being recorded with magnetoencephalography (MEG). They were either unaware that they would be asked how long the recording lasted (retrospective timing) or instructed beforehand to estimate how long it would last (prospective timing). At rest, rhythmic brain activity is nonstationary and displays bursts of activity in the alpha range (α: 7-14 Hz). When participants were not instructed to attend to time, the relative duration of α bursts linearly predicted individuals' retrospective estimates of how long their quiet wakefulness lasted. The relative duration of α bursts was a better predictor than α power or burst amplitude. No other rhythmic or arrhythmic activity predicted retrospective duration. However, when participants timed prospectively, the relative duration of α bursts failed to predict their duration estimates. Consistent with this, the amount of α bursting discriminated between prospective and retrospective timing. Last, with a control experiment, we demonstrate that the relation between α bursts and retrospective time is preserved even when participants are engaged in a visual counting task. Thus, at the time scale of minutes, we report that the relative duration of spontaneous α bursts predicts conscious retrospective time. We conclude that in the absence of overt attention to time, α bursts embody discrete states of awareness constitutive of episodic timing. SIGNIFICANCE STATEMENT: The feeling that time passes is a core component of consciousness and episodic memory. A century ago, brain rhythms called "α" were hypothesized to embody an internal clock. However, rhythmic brain activity is nonstationary and displays on-and-off oscillatory bursts, which would serve as irregular ticks to the hypothetical clock. Here, we discovered that within a given lapse of time, the relative bursting time of α rhythms is a good indicator of how much time an individual will report to have elapsed. Remarkably, this relation only holds when the individual does not attend to time and vanishes when attending to it. Our observations suggest that at the scale of minutes, α brain activity tracks episodic time.
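
One common way to quantify the "relative duration of α bursts" mentioned above is the fraction of recording time during which the band-limited (7-14 Hz) amplitude envelope exceeds a threshold. The sketch below uses an arbitrary percentile threshold and toy data; the paper's actual burst-detection pipeline may differ.

```python
# Illustrative sketch, not the study's pipeline: fraction of time spent in
# alpha bursts, defined as the 7-14 Hz envelope exceeding a percentile threshold.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_burst_fraction(signal, fs, band=(7.0, 14.0), pct=75):
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    envelope = np.abs(hilbert(filtfilt(b, a, signal)))   # alpha amplitude envelope
    return np.mean(envelope > np.percentile(envelope, pct))

fs = 1000.0
t = np.arange(0, 60, 1 / fs)                             # 60 s of toy data
meg = np.sin(2 * np.pi * 10 * t) * (np.random.rand(t.size) > 0.5) \
      + 0.1 * np.random.randn(t.size)
print(alpha_burst_fraction(meg, fs))                     # relative time in alpha bursts
```

A per-participant value of this kind could then be entered into a simple linear regression against reported durations, which is the form of relation the abstract describes.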


Subject(s)
Alpha Rhythm , Brain , Humans , Retrospective Studies , Alpha Rhythm/physiology , Magnetoencephalography , Neurons/physiology
3.
J Neurosci ; 43(31): 5693-5709, 2023 08 02.
Article in English | MEDLINE | ID: mdl-37369587

ABSTRACT

The trial-unique nonmatching to location (TUNL) touchscreen task shows promise as a translational assay of working memory (WM) deficits in rodent models of autism, ADHD, and schizophrenia. However, the low-level neurocognitive processes that drive behavior in the TUNL task have not been fully elucidated. In particular, it is commonly assumed that the TUNL task predominantly measures spatial WM dependent on hippocampal pattern separation, but this proposition has not previously been tested. Here, we addressed this question using computational modeling of behavior from male and female mice performing the TUNL task (N = 163 across three datasets; 158,843 trials). Using this approach, we empirically tested whether TUNL behavior solely measures retrospective WM, or whether it can be deconstructed into additional neurocognitive subprocesses. Contrary to common assumptions, the modeling analyses revealed that behavior on the TUNL task did not primarily reflect retrospective spatial WM. Instead, behavior was best explained as a mixture of response strategies, including both retrospective WM (remembering the spatial location of a previous stimulus) and prospective WM (remembering an anticipated future behavioral response), as well as animal-specific response biases. These results suggest that retrospective spatial WM is just one of several cognitive subprocesses that contribute to choice behavior on the TUNL task. We suggest that these findings can be understood within a resource-rational framework, and we use computational model simulations to propose several task-design principles that we predict will maximize spatial WM and minimize alternative behavioral strategies in the TUNL task. SIGNIFICANCE STATEMENT: Touchscreen tasks represent a paradigm shift for the assessment of cognition in nonhuman animals by automating large-scale behavioral data collection. Their translational relevance, however, depends on the assumption of functional equivalence to cognitive domains in humans. The trial-unique, delayed nonmatching to location (TUNL) touchscreen task has revolutionized the study of rodent spatial working memory, but its assumed functional equivalence to human spatial working memory is untested. We leveraged previously untapped single-trial TUNL data to uncover a novel set of hierarchically ordered cognitive processes that underlie mouse behavior on this task. The strategies identified demonstrate that multiple cognitive approaches can produce a single behavioral outcome, and they highlight the need for more precise task design and more sophisticated data analysis when interpreting rodent spatial working memory.
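
To make the "mixture of response strategies" idea concrete, here is a purely illustrative maximum-likelihood fit of choice accuracy as a weighted mixture of a delay-sensitive retrospective strategy, a delay-insensitive prospective strategy, and guessing. The parameterization, accuracy functions, and data are invented for this sketch and are not the paper's model.

```python
# Toy mixture-of-strategies model fit by maximum likelihood (illustrative only).
import numpy as np
from scipy.optimize import minimize
from scipy.special import softmax

def p_correct(delays, params):
    w = softmax(params[:3])                  # mixture weights: retro, pro, guess
    decay = np.exp(params[3])                # retrospective memory decay rate (>0)
    p_retro = 0.5 + 0.5 * np.exp(-decay * delays)
    p_pro, p_guess = 0.9, 0.5
    return w[0] * p_retro + w[1] * p_pro + w[2] * p_guess

def neg_log_lik(params, delays, correct):
    p = np.clip(p_correct(delays, params), 1e-6, 1 - 1e-6)
    return -np.sum(correct * np.log(p) + (1 - correct) * np.log(1 - p))

rng = np.random.default_rng(1)
delays = rng.choice([2.0, 4.0, 8.0], size=500)                 # seconds
correct = rng.binomial(1, 0.5 + 0.4 * np.exp(-0.2 * delays))   # simulated choices
fit = minimize(neg_log_lik, x0=np.zeros(4), args=(delays, correct), method="Nelder-Mead")
print(softmax(fit.x[:3]))                    # recovered mixture weights
```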


Subject(s)
Hippocampus , Memory, Short-Term , Humans , Mice , Male , Female , Animals , Memory, Short-Term/physiology , Prospective Studies , Retrospective Studies , Hippocampus/physiology , Memory Disorders , Bias
4.
Circulation ; 147(24): 1854-1868, 2023 06 13.
Article in English | MEDLINE | ID: mdl-37194575

ABSTRACT

BACKGROUND: Basic life support education for schoolchildren has become a key initiative to increase bystander cardiopulmonary resuscitation rates. Our objective was to review the existing literature on teaching basic life support to schoolchildren and to identify best practices for providing this training. METHODS: After topics and subgroups were defined, a comprehensive literature search was conducted. Systematic reviews and controlled and uncontrolled prospective and retrospective studies containing data on students <20 years of age were included. RESULTS: Schoolchildren are highly motivated to learn basic life support. The CHECK-CALL-COMPRESS algorithm is recommended for all schoolchildren. Regular training in basic life support, regardless of age, consolidates long-term skills. Young children from 4 years of age are able to assess the first links in the chain of survival. By 10 to 12 years of age, effective chest compression depths and ventilation volumes can be achieved on training manikins. A combination of theoretical and practical training is recommended. Schoolteachers serve as effective basic life support instructors, and schoolchildren act as multipliers by passing on basic life support skills to others. The use of age-appropriate social media tools for teaching is a promising approach for schoolchildren of all ages. CONCLUSIONS: Basic life support training for schoolchildren has the potential to educate whole generations to respond to cardiac arrest and to increase survival after out-of-hospital cardiac arrest. Comprehensive legislation, curricula, and scientific assessment are crucial to further develop the education of schoolchildren in basic life support.


Subject(s)
Cardiopulmonary Resuscitation , Out-of-Hospital Cardiac Arrest , Child , Humans , Child, Preschool , Retrospective Studies , Prospective Studies , Cardiopulmonary Resuscitation/education , Educational Status
5.
Clin Infect Dis ; 2024 May 17.
Article in English | MEDLINE | ID: mdl-38759099

ABSTRACT

BACKGROUND: Aeromonas virulence may not be entirely dependent on the host immune status. Pathophysiologic determinants of disease progression and severity remain unclear. METHODS: One hundred five patients with Aeromonas infections and 112 isolates were identified, their clinical presentations and outcomes analyzed, and their antimicrobial resistance (AMR) patterns assessed. Two isolates (A and B) from fatal cases of Aeromonas dhakensis bacteremia were characterized using whole genome sequence analysis. Virulence factor- and AMR-encoding genes from these isolates were compared with those of a well-characterized diarrheal isolate, A. dhakensis SSU, and an environmental isolate, A. hydrophila ATCC 7966T. RESULTS: Skin and soft tissue infections, traumatic wound infections, sepsis, burns, and intraabdominal infections were common. Diabetes, malignancy, and cirrhosis were frequent comorbidities. Male sex, age ≥ 65 years, hospitalization, burns, and intensive care were associated with complicated disease. High rates of AMR to carbapenems and piperacillin-tazobactam were found. Treatment failure was observed in 25.7% of cases. Septic shock and hospital-acquired infections were predictors of treatment failure. All four isolates harbored assorted broad-spectrum AMR genes, including blaOXA, ampC, cphA, and efflux pumps. Only the clinical isolates possessed both polar and lateral flagellar genes, genes for various surface adhesion proteins, type 3 and type 6 secretion systems and their effectors, and toxin genes, including exotoxin A. Both isolates A and B were resistant to colistin and harbored the mobile colistin resistance-3 (mcr-3) gene. CONCLUSIONS: Empirical therapy tailored to local Aeromonas antibiograms may facilitate more favorable outcomes, while advanced diagnostic methods may aid in correctly identifying Aeromonas spp. of significant clinical importance.

6.
Stroke ; 55(1): 122-130, 2024 01.
Article in English | MEDLINE | ID: mdl-38063017

ABSTRACT

BACKGROUND: Limited data exist on the temporal relationship between new-onset atrial fibrillation (AF) and ischemic stroke and its impact on patients' clinical characteristics and mortality. METHODS: A population-based registry-linkage database included all patients with new-onset AF in Finland from 2007 to 2018. Ischemic stroke temporally associated with AF (ISTAF) was defined as an ischemic stroke occurring within ±30 days of the first AF diagnosis. Clinical factors associated with ISTAF were studied with logistic regression and 90-day survival with Cox proportional hazards analysis. RESULTS: Among 229 565 patients with new-onset AF (mean age, 72.7 years; 50% female), 204 774 (89.2%) experienced no ischemic stroke, 12 209 (5.3%) had a past ischemic stroke >30 days before AF, and 12 582 (5.8%) had ISTAF. The annual proportion of ISTAF among patients with AF decreased from 6.0% to 4.8% between 2007 and 2018. Factors positively associated with ISTAF were higher age, lower education level, and alcohol use disorder, whereas vascular disease, heart failure, chronic kidney disease, cancer, and psychiatric disorders were less probable with ISTAF. Compared with patients without ischemic stroke and those with past ischemic stroke, ISTAF was associated with ≈3-fold and 1.5-fold risks of death (adjusted hazard ratios, 2.90 [95% CI, 2.76-3.04] and 1.47 [95% CI, 1.39-1.57], respectively). The 90-day survival probability of patients with ISTAF increased from 0.79 (95% CI, 0.76-0.81) in 2007 to 0.89 (95% CI, 0.87-0.91) in 2018. CONCLUSIONS: ISTAF depicts the prominent temporal clustering of ischemic strokes around the time of AF diagnosis. Despite having fewer comorbidities, patients with ISTAF had worse, albeit improving, survival than patients with a history of or no ischemic stroke. REGISTRATION: URL: https://www.clinicaltrials.gov; Unique identifier: NCT04645537. URL: https://www.encepp.eu; Unique identifier: EUPAS29845.
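
The ±30-day window that defines ISTAF is straightforward to operationalize. Below is a hypothetical pandas sketch of the classification; the column names and example dates are invented and do not correspond to the actual registry schema.

```python
# Hypothetical sketch of the ISTAF definition: ischemic stroke within +/- 30 days
# of the first AF diagnosis; earlier strokes count as "past ischemic stroke".
import pandas as pd

df = pd.DataFrame({
    "af_date":     pd.to_datetime(["2012-03-01", "2015-06-10", "2018-01-20"]),
    "stroke_date": pd.to_datetime(["2012-03-15", "2014-01-05", None]),
})

delta = (df["stroke_date"] - df["af_date"]).dt.days
df["group"] = "no ischemic stroke"
df.loc[delta < -30, "group"] = "past ischemic stroke"   # stroke >30 days before AF
df.loc[delta.abs() <= 30, "group"] = "ISTAF"            # within +/- 30 days of AF
print(df)
```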


Subject(s)
Atrial Fibrillation , Ischemic Stroke , Stroke , Humans , Female , Aged , Male , Stroke/diagnosis , Atrial Fibrillation/complications , Ischemic Stroke/complications , Comorbidity , Registries , Risk Factors , Anticoagulants
7.
Stroke ; 55(4): 983-989, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38482715

ABSTRACT

BACKGROUND: There is limited research on outcomes of patients with posttraumatic stress disorder (PTSD) who also develop stroke, particularly regarding racial disparities. Our goal was to determine whether PTSD is associated with the risk of hospital readmission after stroke and whether racial disparities exist. METHODS: The analytical sample consisted of all veterans receiving care in the Veterans Health Administration who were identified as having a new stroke requiring inpatient admission based on International Classification of Diseases codes. PTSD and comorbidities were identified using International Classification of Diseases codes and assigned the date of first occurrence. The retrospective cohort data were obtained from the Veterans Affairs Corporate Data Warehouse. The main outcome was any readmission to the Veterans Health Administration with a stroke diagnosis. The hypothesis that PTSD is associated with readmission after stroke was tested using Cox regression adjusted for patient characteristics including age, sex, race, PTSD, smoking status, alcohol use, and comorbidities treated as time-varying covariates. RESULTS: Our final cohort consisted of 93 651 patients with an inpatient stroke diagnosis and no prior Veterans Health Administration codes for stroke, starting from 1999 with follow-up through August 6, 2022. Of these patients, 12 916 (13.8%) had comorbid PTSD. Of the final cohort, 16 896 patients (18.0%) with stroke were readmitted. Our fully adjusted model for readmission found an interaction between African American veterans and PTSD, with a hazard ratio of 1.09 ([95% CI, 1.00-1.20] P=0.047). In stratified models, PTSD had a significant hazard ratio of 1.10 ([95% CI, 1.02-1.18] P=0.01) for African American but not White veterans (1.05 [95% CI, 0.99-1.11]; P=0.10). CONCLUSIONS: Among African American veterans who experienced stroke, preexisting PTSD was associated with an increased risk of readmission; this association was not significant among White veterans. This study highlights the need to focus on high-risk groups to reduce readmissions after stroke.


Subject(s)
Stress Disorders, Post-Traumatic , Stroke , Veterans , Humans , United States/epidemiology , Stress Disorders, Post-Traumatic/epidemiology , Stress Disorders, Post-Traumatic/diagnosis , Retrospective Studies , Patient Readmission , Stroke/epidemiology , Stroke/therapy , Comorbidity
8.
Clin Infect Dis ; 78(4): 918-921, 2024 Apr 10.
Article in English | MEDLINE | ID: mdl-37882613

ABSTRACT

Evaluating 100 adult coronavirus disease 2019 (COVID-19) patients at a Madrid hospital, we identified a mismatch between current clinical trial designs and the evolving profile of hospitalized patients. Most patients were ineligible due to design constraints, suggesting a need to rethink trial criteria for a more accurate representation of the hospitalized COVID-19 cohort.


Subject(s)
COVID-19 , Adult , Humans , SARS-CoV-2 , Retrospective Studies , Clinical Trials as Topic , Cohort Studies
9.
Stroke ; 2024 Oct 11.
Article in English | MEDLINE | ID: mdl-39391984

ABSTRACT

BACKGROUND: In acute stroke, diffusion-weighted imaging (DWI) is used to assess the ischemic core. Dynamic-susceptibility contrast perfusion magnetic resonance imaging allows an estimation of the oxygen extraction fraction (OEF), but the outcome of DWI lesions with increased OEF postrecanalization is unclear. This study investigated the impact of OEF on the fate of DWI lesions in patients achieving recanalization after thrombectomy. METHODS: This was a retrospective analysis of the HIBISCUS-STROKE cohort (Cohort of Patients to Identify Biological and Imaging Markers of Cardiovascular Outcomes in Stroke; NCT: 03149705), a single-center observational study that prospectively enrolled patients who underwent magnetic resonance imaging triage for thrombectomy and a day-6 T2-fluid-attenuated inversion recovery (FLAIR) magnetic resonance imaging. Automated postprocessing of admission dynamic-susceptibility contrast perfusion magnetic resonance imaging generated OEF maps. At visual analysis, the OEF status within DWI lesions was assessed in comparison to the contralateral side and correlated with volume changes (difference of ischemic lesion between admission DWI and registered day-6 T2-FLAIR). At voxel-based analysis, recovered DWI regions (lesions present on the admission DWI but absent on the registered day-6 T2-FLAIR) and nonrecovered regions were segmented to extract semiquantitative OEF values. RESULTS: Of the participants enrolled from 2016 to 2022, 134 of 321 (41.7%) were included (median age, 71.0 years; 58.2% male; median baseline National Institutes of Health Scale score, 15.0). At visual analysis, 46 of 134 (34.3%) patients had increased OEF within DWI lesions. These patients were more likely to show a reduction in ischemic lesion volumes compared with those without increased OEF (median change, -4.0 versus 4.8 mL; P<0.0001). Multivariable analysis indicated that increased OEF within DWI lesions was associated with a reduction in ischemic lesion volumes from admission DWI to day-6 T2-FLAIR (odds ratio, 0.68 [95% CI, 0.49-0.87]; P=0.008). At voxel-based analysis, recovered DWI regions had increased OEF, while nonrecovered regions had decreased OEF (median, 126.9% versus -27.0%; P<0.0001). CONCLUSIONS: Increased OEF within hyperacute DWI lesions was associated with ischemic lesion recovery between admission DWI and day-6 T2-FLAIR in patients achieving recanalization after thrombectomy. REGISTRATION: URL: https://www.clinicaltrials.gov; Unique identifier: NCT03149705.
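
The voxel-based comparison described above amounts to intersecting lesion masks and summarizing OEF values in each region. The sketch below uses simulated masks and values purely to illustrate that logic; it is not the study's registration or segmentation pipeline.

```python
# Illustrative voxel-based comparison (toy data): "recovered" voxels lie inside
# the admission DWI lesion but outside the registered day-6 FLAIR lesion.
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64, 32)
dwi_lesion   = rng.random(shape) > 0.97                  # admission DWI lesion mask
flair_lesion = dwi_lesion & (rng.random(shape) > 0.4)    # day-6 FLAIR lesion (subset)
oef = rng.normal(0.0, 50.0, shape)                       # relative OEF map, % vs contralateral

recovered    = dwi_lesion & ~flair_lesion
nonrecovered = dwi_lesion & flair_lesion
print("median OEF, recovered voxels:",    np.median(oef[recovered]))
print("median OEF, nonrecovered voxels:", np.median(oef[nonrecovered]))
```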

10.
Stroke ; 55(2): 376-384, 2024 02.
Article in English | MEDLINE | ID: mdl-38126181

ABSTRACT

BACKGROUND: The aim of this study was to report the results of a subgroup analysis of the ASTER2 trial (Effect of Thrombectomy With Combined Contact Aspiration and Stent Retriever vs Stent Retriever Alone on Revascularization in Patients With Acute Ischemic Stroke and Large Vessel Occlusion) comparing the safety and efficacy of the combined technique (CoT) and stent retriever as a first-line approach in internal carotid artery (ICA) terminus±M1-middle cerebral artery (M1-MCA) and isolated M1-MCA occlusions. METHODS: Patients enrolled in the ASTER2 trial with ICA terminus±M1-MCA and isolated M1-MCA occlusions were included in this subgroup analysis. The effect of first-line CoT versus stent retriever according to the occlusion site was assessed on angiographic (first-pass effect, expanded Treatment in Cerebral Infarction score ≥2b50, and expanded Treatment in Cerebral Infarction score ≥2c grades at the end of the first-line strategy and at the end of the procedure) and clinicoradiological outcomes (24-hour National Institutes of Health Stroke Scale, ECASS-III [European Cooperative Acute Stroke Study] grades, and 3-month modified Rankin Scale). RESULTS: Three hundred sixty-two patients were included in the postsubgroup analysis according to the occlusion site: 299 were treated for isolated M1-MCA occlusion (150 with first-line CoT) and 63 were treated for ICA terminus±M1-MCA occlusion (30 with first-line CoT). Expanded Treatment in Cerebral Infarction score ≥2b50 (odds ratio, 11.83 [95% CI, 2.32-60.12]) and expanded Treatment in Cerebral Infarction score ≥2c (odds ratio, 4.09 [95% CI, 1.39-11.94]) were significantly higher in first-line CoT compared with first-line stent retriever in patients with ICA terminus±M1-MCA occlusion but not in patients with isolated M1-MCA. CONCLUSIONS: First-line CoT was associated with higher reperfusion grades in patients with ICA terminus±M1-MCA at the end of the procedure. REGISTRATION: URL: https://www.clinicaltrials.gov; Unique identifier: NCT03290885.


Subject(s)
Arterial Occlusive Diseases , Brain Ischemia , Carotid Artery Diseases , Endovascular Procedures , Ischemic Stroke , Stroke , Humans , Arterial Occlusive Diseases/complications , Brain Ischemia/surgery , Carotid Artery Diseases/diagnostic imaging , Carotid Artery Diseases/surgery , Carotid Artery Diseases/complications , Carotid Artery, Internal/diagnostic imaging , Carotid Artery, Internal/surgery , Endovascular Procedures/methods , Infarction, Middle Cerebral Artery/diagnostic imaging , Infarction, Middle Cerebral Artery/surgery , Infarction, Middle Cerebral Artery/complications , Ischemic Stroke/complications , Middle Cerebral Artery/surgery , Stents , Stroke/therapy , Thrombectomy/methods , Treatment Outcome
11.
Am J Epidemiol ; 193(2): 308-322, 2024 Feb 05.
Article in English | MEDLINE | ID: mdl-37671942

ABSTRACT

This study explores natural direct and joint natural indirect effects (JNIE) of prenatal opioid exposure on neurodevelopmental disorders (NDDs) in children mediated through pregnancy complications, major and minor congenital malformations, and adverse neonatal outcomes, using Medicaid claims linked to vital statistics in Rhode Island, United States, 2008-2018. A Bayesian mediation analysis with elastic net shrinkage prior was developed to estimate mean time to NDD diagnosis ratio using posterior mean and 95% credible intervals (CrIs) from Markov chain Monte Carlo algorithms. Simulation studies showed desirable model performance. Of 11,176 eligible pregnancies, 332 had ≥2 dispensations of prescription opioids anytime during pregnancy, including 200 (1.8%) having ≥1 dispensation in the first trimester (T1), 169 (1.5%) in the second (T2), and 153 (1.4%) in the third (T3). A significant JNIE of opioid exposure was observed in each trimester (T1, JNIE = 0.97, 95% CrI: 0.95, 0.99; T2, JNIE = 0.97, 95% CrI: 0.95, 0.99; T3, JNIE = 0.96, 95% CrI: 0.94, 0.99). The proportion of JNIE in each trimester was 17.9% (T1), 22.4% (T2), and 56.3% (T3). In conclusion, adverse pregnancy and birth outcomes jointly mediated the association between prenatal opioid exposure and accelerated time to NDD diagnosis. The proportion of JNIE increased as the timing of opioid exposure approached delivery.
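
For intuition only, and not the paper's Bayesian elastic-net mediation model: with a log-linear outcome model, an indirect effect through a mediator can be approximated by the classic "difference method", comparing the exposure coefficient before and after the mediator enters the model. All variable names and effect sizes below are simulated.

```python
# Simplified mediation illustration via the difference method (not the paper's method).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
exposure = rng.binomial(1, 0.03, n)                    # prenatal opioid exposure (toy)
mediator = rng.binomial(1, 0.05 + 0.20 * exposure)     # e.g., an adverse neonatal outcome
log_time = 3.0 - 0.02 * exposure - 0.10 * mediator + rng.normal(0, 0.3, n)

X_total  = sm.add_constant(exposure)
X_direct = sm.add_constant(np.column_stack([exposure, mediator]))
total  = sm.OLS(log_time, X_total).fit().params[1]
direct = sm.OLS(log_time, X_direct).fit().params[1]
print("total:", total, "direct:", direct, "indirect (difference):", total - direct)
```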


Subject(s)
Neurodevelopmental Disorders , Prenatal Exposure Delayed Effects , Pregnancy , Female , Infant, Newborn , Child , Humans , United States/epidemiology , Analgesics, Opioid/adverse effects , Mediation Analysis , Prenatal Exposure Delayed Effects/chemically induced , Prenatal Exposure Delayed Effects/epidemiology , Bayes Theorem , Neurodevelopmental Disorders/chemically induced , Neurodevelopmental Disorders/epidemiology , Neurodevelopmental Disorders/drug therapy
12.
Am J Epidemiol ; 193(10): 1451-1459, 2024 Oct 07.
Article in English | MEDLINE | ID: mdl-38806447

ABSTRACT

Polygenic risk scores (PRSs) are rapidly emerging as a way to measure disease risk by aggregating multiple genetic variants. Understanding the interplay of the PRS with environmental factors is critical for interpreting and applying PRSs in a wide variety of settings. We develop an efficient method for simultaneously modeling gene-environment correlations and interactions using the PRS in case-control studies. We use a logistic-normal regression modeling framework to specify the disease risk and the PRS distribution in the underlying population and propose joint inference across the two models using the retrospective likelihood of the case-control data. Extensive simulation studies demonstrate the flexibility of the method in trading off bias and efficiency for the estimation of various model parameters, compared with standard logistic regression or a case-only analysis for gene-environment interactions, or a control-only analysis for gene-environment correlations. Finally, using simulated case-control data sets within the UK Biobank study, we demonstrate the power of our method through its ability to recover results from the full prospective cohort for the detection of an interaction between long-term oral contraceptive use and the PRS on the risk of breast cancer. The method is computationally efficient and implemented in a user-friendly R package.
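
As a point of reference, here is a minimal sketch of one of the baseline analyses the abstract compares against: a standard prospective logistic regression with a PRS-by-environment interaction term. It is not the proposed retrospective-likelihood method, and the data are simulated.

```python
# Baseline comparator sketch: prospective logistic regression with PRS x E interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 20000
prs = rng.normal(size=n)
env = rng.binomial(1, 0.3, n)                                 # e.g., long-term OC use (toy)
logit = -2.0 + 0.4 * prs + 0.1 * env + 0.2 * prs * env
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"y": y, "prs": prs, "env": env})
fit = smf.logit("y ~ prs * env", data=df).fit(disp=0)
print(fit.params["prs:env"])                                  # interaction log-odds estimate
print(fit.conf_int().loc["prs:env"])                          # its 95% CI
```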


Subject(s)
Gene-Environment Interaction , Multifactorial Inheritance , Humans , Case-Control Studies , Multifactorial Inheritance/genetics , Breast Neoplasms/genetics , Female , Logistic Models , Genetic Predisposition to Disease , Computer Simulation , Risk Factors , Models, Genetic , Genetic Risk Score
13.
Cancer ; 130(6): 927-935, 2024 03 15.
Article in English | MEDLINE | ID: mdl-37985357

ABSTRACT

BACKGROUND: Despite histological and molecular differences between invasive lobular carcinoma (ILC) and invasive carcinoma of no special type, according to national treatment guidelines no distinction is made regarding the use of (neo)adjuvant chemotherapy. Studies on the long-term outcome of chemotherapy in patients with ILC are scarce and show inconclusive results. METHODS: All patients with estrogen receptor (ER)-positive, human epidermal growth factor receptor 2 (HER2)-negative ILC with an indication for chemotherapy treated with adjuvant endocrine therapy were selected from the Erasmus Medical Center Breast Cancer database. Cox proportional hazards models were used to estimate the effect of chemotherapy on recurrence-free survival (RFS), breast cancer-specific survival (BCSS), and overall survival (OS). RESULTS: A total of 520 patients were selected, of whom 379 were treated with chemotherapy and 141 were not. Patients in the chemotherapy group were younger (51 vs. 61 years old; p < .001), had a higher T status (T3+, 33% vs. 14%; p < .001), and more often had lymph node involvement (80% vs. 49%; p < .001) in comparison to the no-chemotherapy group. After adjusting for confounders, chemotherapy treatment was not associated with better RFS (hazard ratio [HR], 1.20; 95% confidence interval [CI], 0.63-2.31), BCSS (HR, 1.24; 95% CI, 0.60-2.58), or OS (HR, 0.97; 95% CI, 0.56-1.66). This was also reflected by adjusted Cox survival curves in the chemotherapy versus no-chemotherapy group for RFS (75% vs. 79%), BCSS (80% vs. 84%), and OS (72% vs. 71%). CONCLUSIONS: Chemotherapy is not associated with improved RFS, BCSS, or OS for patients with ER+/HER2- ILC treated with adjuvant endocrine therapy and with an indication for chemotherapy.
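
A generic sketch of the type of analysis described (Cox proportional hazards for recurrence-free survival, adjusted for confounders) is shown below. The column names, covariates, and data are made up for illustration; this is not the study's actual model specification.

```python
# Illustrative Cox proportional hazards fit with lifelines (simulated data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "chemo":      rng.binomial(1, 0.7, n),        # 1 = received chemotherapy
    "age":        rng.normal(58, 10, n),
    "node_pos":   rng.binomial(1, 0.6, n),        # lymph node involvement
    "rfs_months": rng.exponential(80, n),         # time to recurrence or censoring
    "event":      rng.binomial(1, 0.3, n),        # 1 = recurrence observed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="rfs_months", event_col="event")
print(cph.hazard_ratios_["chemo"])                # adjusted hazard ratio for chemotherapy
```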


Subject(s)
Breast Neoplasms , Carcinoma, Lobular , Humans , Middle Aged , Female , Breast Neoplasms/pathology , Carcinoma, Lobular/drug therapy , Carcinoma, Lobular/pathology , Retrospective Studies , Breast/pathology , Chemotherapy, Adjuvant , Receptor, ErbB-2/metabolism , Adjuvants, Immunologic , Immunologic Factors/therapeutic use
14.
Cancer ; 130(4): 530-540, 2024 02 15.
Article in English | MEDLINE | ID: mdl-37933916

ABSTRACT

BACKGROUND: This study aimed to describe treatment patterns and overall survival (OS) in patients with advanced non-small cell lung cancer (aNSCLC) in three countries between 2011 and 2020. METHODS: Three databases (US, Canada, Germany) were used to identify incident aNSCLC patients. OS was assessed from the date of incident aNSCLC diagnosis and, for patients who received at least a first line of therapy (1LOT), from the date of 1LOT initiation. In multivariable analyses, we analyzed the influence of index year and type of prescribed treatment on OS. FINDINGS: We included 51,318 patients with an incident aNSCLC diagnosis. The percentage of patients treated with a 1LOT differed substantially between countries, whereas the number of patients receiving immunotherapies/targeted treatments increased over time in all three countries. Median OS from the date of incident diagnosis was 9.9 months in the United States vs. 4.1 months in Canada. When measured from the start of 1LOT, patients had a median OS of 10.7 months in the United States, 10.9 months in Canada, and 10.9 months in Germany. OS from the start of 1LOT improved in all three countries from 2011 to 2020 by approximately 3 to 4 months. CONCLUSIONS: Observed continuous improvement in OS among patients receiving at least a 1LOT from 2011 to 2020 was likely driven by improved care and changes in the treatment landscape. The difference in the proportion of patients receiving a 1LOT in the observed countries requires further investigation.
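
Median OS from the start of a first line of therapy is typically read off a Kaplan-Meier curve. The sketch below shows that calculation on simulated data; the variable names and values are illustrative only.

```python
# Illustrative Kaplan-Meier estimate of median OS from 1LOT initiation.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
months_from_1lot = rng.exponential(11.0, size=1000)   # follow-up time in months (toy)
died = rng.binomial(1, 0.8, size=1000)                # 1 = death observed, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(months_from_1lot, event_observed=died)
print("median OS (months):", kmf.median_survival_time_)
```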


Subject(s)
Carcinoma, Non-Small-Cell Lung , Lung Neoplasms , Humans , United States/epidemiology , Carcinoma, Non-Small-Cell Lung/pathology , Lung Neoplasms/pathology , Retrospective Studies , Germany/epidemiology , Canada/epidemiology
15.
BMC Immunol ; 25(1): 8, 2024 01 24.
Article in English | MEDLINE | ID: mdl-38267897

ABSTRACT

PURPOSE: The objective of this study was to identify potential predictors of immune-related adverse events (irAEs) in cancer patients receiving immune checkpoint inhibitor therapy among serum indexes, case data, and liquid biopsy results. METHODS: We retrospectively analyzed 418 patients treated with anti-programmed cell death 1 (PD-1)/PD-1 ligand (PD-L1) inhibitors from January 2018 to May 2022 at our cancer center. We identified factors that correlated with the occurrence of irAEs and evaluated associations between irAEs and anti-PD-1/PD-L1 inhibitor responses. RESULTS: The incidence of irAEs was 42.1%, and pneumonitis (9.1%), thyroid toxicity (9.1%), cardiotoxicity (8.1%), and dermatologic toxicity (6.9%) were the four most common irAEs. Multivariate logistic analysis identified female sex, antibiotic use, higher post-treatment neutrophil-to-lymphocyte ratio (NLR), and higher baseline circulating tumor cell (CTC) level as predictive biomarkers for the occurrence of irAEs. A lower baseline prognostic nutritional index (PNI), body mass index (BMI) ≥ 25 kg/m2, and higher post-treatment lactate dehydrogenase (LDH) level were predictive factors for more severe irAEs (higher severity grade). Patients without irAEs had better overall survival than those with irAEs. Specifically, pneumonitis and cardiotoxicity were found to be significant predictors of poor prognosis in the irAE subgroup with different organ-related irAEs. Low-dose steroid (dexamethasone 10 mg) treatment had no significant effect on outcomes. CONCLUSIONS: Sex, antibiotic use, post-treatment NLR, and baseline CTC level are potential predictive biomarkers of irAEs, whereas baseline PNI, BMI, and post-treatment LDH may predict the severity of irAEs. The predictive effect of irAE occurrence on survival benefit may depend on the type of irAE.


Subject(s)
Neoplasms , Pneumonia , Humans , Female , Immune Checkpoint Inhibitors/adverse effects , Cardiotoxicity , Programmed Cell Death 1 Receptor , Retrospective Studies , Anti-Bacterial Agents , Biomarkers , Neoplasms/drug therapy
16.
J Clin Immunol ; 44(6): 126, 2024 May 22.
Article in English | MEDLINE | ID: mdl-38773000

ABSTRACT

Alemtuzumab is used with reduced-toxicity conditioning (RTC) in allogeneic hematopoietic cell transplantation (HCT) and has demonstrated efficacy and feasibility for patients with inborn errors of immunity (IEI) in Western countries; however, clinical experience in Asian patients with IEI is limited. We retrospectively analyzed patients with IEI who underwent a first allogeneic HCT with alemtuzumab combined with RTC regimens in Japan. A total of 19 patients were included and followed up for a median of 18 months. The donors were haploidentical parents (n = 10), matched siblings (n = 2), and unrelated bone marrow donors (n = 7). Most patients received RTC regimens containing fludarabine and busulfan and were treated with 0.8 mg/kg alemtuzumab at intermediate timing. Eighteen patients survived and achieved stable engraftment, and no grade 3-4 acute graft-versus-host disease was observed. Viral infections were observed in 11 patients (58%), 6 of whom were symptomatic. The median CD4+ T cell count was low at 6 months (241/µL) but improved by 1 year (577/µL) after HCT. Whole blood cells continued to exhibit >80% donor type in most cases; however, 3 of 10 patients exhibited poor donor chimerism only among T cells and also showed undetectable levels of T-cell receptor excision circles (TRECs) at 1 year post-HCT. This study demonstrates the efficacy and safety of alemtuzumab; however, patients frequently developed viral infections and showed slow reconstitution or low donor chimerism in T cells, emphasizing the importance of monitoring viral status and T-cell-specific chimerism.


Subject(s)
Alemtuzumab , Graft vs Host Disease , Hematopoietic Stem Cell Transplantation , Transplantation Conditioning , Transplantation, Homologous , Adolescent , Child , Child, Preschool , Female , Humans , Infant , Male , Alemtuzumab/therapeutic use , Asian People , Graft vs Host Disease/etiology , Hematopoietic Stem Cell Transplantation/methods , Retrospective Studies , Transplantation Conditioning/methods , Treatment Outcome , Japan , Immune System Diseases/genetics
17.
Oncologist ; 29(6): e750-e762, 2024 Jun 03.
Article in English | MEDLINE | ID: mdl-38431780

ABSTRACT

PURPOSE: Male breast cancer (MBC) is a rare but increasingly common disease that lacks prospective studies. Collaborative efforts are needed to understand and address MBC, including its prognosis, in different countries. METHODS: We retrospectively reviewed the clinical, histopathological, and molecular-genetic characteristics, treatments, and survival outcomes of MBC diagnosed between 2007 and 2017 in the Czech Republic. Prognostic factors for overall survival (OS), recurrence-free interval (RFi), and breast cancer-specific mortality (BCSM) were analyzed and indirectly compared with international data. RESULTS: We analyzed 256 patients with MBC (median age 66 years), including 12% with de novo metastatic (M1) disease. Of 201 non-metastatic (M0) patients, 6% were <40 years old, 29% had stage I disease, 55% were cN0, and 54% underwent genetic testing. Overall, 97% of tumors had estrogen receptor expression ≥10%, 61% had a high Ki67 index, 40% were high grade (G3), and 68% were luminal B-like (HER2-negative). Systemic therapies included endocrine therapy (90%) and chemotherapy (53%). Few (5%) patients discontinued adjuvant endocrine therapy for reasons other than disease relapse or death. Patients treated with aromatase inhibitors alone had a significantly shorter RFi (P < .001). OS, RFi, and BCSM were associated with disease stage, T stage, N stage, progesterone receptor expression, grade, and Ki67 index. Median OS reached 122 and 42 months in M0 and de novo M1 patients, respectively. CONCLUSION: Given the rarity of MBC, this study highlights important findings from real clinical practice. Although the number of patients with MBC with unfavorable features was higher in this Czech dataset than in international studies, the prognosis remains consistent with real-world evidence.


Subject(s)
Breast Neoplasms, Male , Humans , Breast Neoplasms, Male/pathology , Breast Neoplasms, Male/mortality , Breast Neoplasms, Male/therapy , Breast Neoplasms, Male/drug therapy , Male , Retrospective Studies , Aged , Prognosis , Czech Republic/epidemiology , Middle Aged , Adult , Aged, 80 and over
18.
Oncologist ; 2024 Sep 30.
Article in English | MEDLINE | ID: mdl-39349396

ABSTRACT

BACKGROUND: The landscape of small cell lung cancer (SCLC) has changed since the 2019 and 2020 approvals of anti-PD-L1 atezolizumab and durvalumab for first-line (1L) treatment in combination with chemotherapy. We studied treatment patterns and real-world overall survival (rwOS) following 1L-3L therapy. PATIENTS AND METHODS: A nationwide electronic health record (EHR)-derived de-identified database was used to describe treatment patterns, characteristics, and survival of patients with extensive-stage (ES)-SCLC by 1L anti-PD-L1 treatment. Patients with ES-SCLC who initiated ≥1 line of systemic therapy from 2013 to 2021, with potential follow-up through 2022, were included. RESULTS: Among 9952 patients with SCLC, there were 4308 patients with ES-SCLC treated during the study period who met eligibility criteria. Etoposide + platinum (EP) chemotherapy was most common in the 1L, with addition of anti-PD-L1 therapy to most regimens by 2019. Second-line regimens varied by platinum sensitivity status and shifted from topotecan to lurbinectedin over time. Median rwOS following 1L therapy was 8.3 months (95% CI, 7.9-8.8) in those treated with 1L anti-PD-L1 and 8.0 months (95% CI, 7.8-8.2) in those who were not. Following 2L and 3L, median rwOS was 5.6 (95% CI, 4.9-6.3) and 4.9 months (95% CI, 3.4-6.0), respectively, among 1L anti-PD-L1-treated, and 4.5 (95% CI, 4.2-4.9) and 4.0 months (95% CI, 3.7-4.5), respectively, among those who were not. CONCLUSION: Despite the introduction of frontline anti-PD-L1 therapy, survival remains dismal among patients with ES-SCLC treated in the real-world setting.

19.
Biostatistics ; 2023 Oct 26.
Article in English | MEDLINE | ID: mdl-37886808

ABSTRACT

The tree-based scan statistic is a data mining method used to identify signals of adverse drug reactions in databases of spontaneous reporting systems (SRS). It is particularly beneficial when dealing with hierarchical data structures. One may use a retrospective case-control study design within an SRS to investigate whether a specific adverse event of interest is associated with certain drugs. However, the existing Bernoulli model of the tree-based scan statistic may not be suitable, as it fails to adequately account for dependencies within matched pairs. In this article, we propose signal detection statistics for matched case-control data based on McNemar's test, the Wald test for conditional logistic regression, and the likelihood ratio test for a multinomial distribution. Through simulation studies, we demonstrate that our proposed methods outperform the existing approach in terms of type I error rate, power, sensitivity, and false detection rate. To illustrate our proposed approach, we applied the three methods and the existing method to detect drug signals for dizziness-related adverse events associated with antihypertensive drugs using the database of the Korea Adverse Event Reporting System.
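
A minimal example of one building block named in the abstract, McNemar's test on the discordant exposure counts of matched case-control pairs, is shown below. The counts are made up, and this is only one component, not the full tree-based scan procedure.

```python
# McNemar's test on a 2x2 table of drug exposure in matched case-control pairs.
from statsmodels.stats.contingency_tables import mcnemar

# rows: case exposed (yes/no); columns: matched control exposed (yes/no)
table = [[30, 65],     # both exposed / case-only exposed
         [25, 880]]    # control-only exposed / neither exposed
result = mcnemar(table, exact=False, correction=True)
print(result.statistic, result.pvalue)
```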

20.
Article in English | MEDLINE | ID: mdl-39402306

ABSTRACT

PURPOSE: Low-dose computed tomography lung cancer screening is effective for reducing lung cancer mortality. It is critical to understand lung cancer screening (LCS) practices for screen-eligible individuals living in Alabama and Georgia, where lung cancer is the leading cause of cancer death. High lung cancer incidence and mortality rates are attributed to high smoking rates among underserved, low-income, and rural populations. Therefore, the purpose of this study was to define the sociodemographic and clinical characteristics of patients who were screened for lung cancer at an Academic Medical Center (AMC) in Alabama and a Safety Net Hospital (SNH) in Georgia. METHODS: A retrospective cohort study of screen-eligible patients seen at the AMC and the SNH between 2015 and 2020 was constructed from electronic health records, with each site analyzed separately. Chi-square tests and Student t tests were used to compare screening uptake across patient demographic and clinical variables. Bivariate and multivariate logistic regressions identified significant predictors of lung cancer screening uptake. RESULTS: At the AMC, 67,355 patients were identified as eligible for LCS and 1,129 were screened. In bivariate analyses, there were several differences between those who were screened and those who were not. At the Alabama site, patients with active tobacco use were significantly more likely to be screened than former smokers (OR: 3.208, p < 0.01). For every 10-unit increase in distance from the screening site, the odds of screening decreased by about 15% (OR: 0.848, p < 0.01). For every 10-year increase in age, the odds of screening decreased by about 30% (OR: 0.704, p < 0.01). Each additional comorbidity increased the odds of screening by about 7.5% (OR: 1.075, p < 0.01). Patients with both private and public insurance had much higher odds of screening than those with only private insurance (OR: 5.403, p < 0.01), whereas those with only public insurance had lower odds of screening than those with private insurance (OR: 0.393, p < 0.01). At the SNH, each additional comorbidity increased the odds of screening by about 11.9% (OR: 1.119, p = 0.01). Notably, patients with public insurance had significantly higher odds of being screened than those with private insurance (OR: 2.566, p < 0.01). CONCLUSION: The study provides evidence that LCS has not reached all subgroups and that additional targeted efforts are needed to increase lung cancer screening uptake. Furthermore, a disparity was observed between adults living closer to screening institutions and those living farther away.
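
As a quick arithmetic check of how odds ratios scale with the unit of measurement: an odds ratio reported per 10-unit increase equals the per-unit odds ratio raised to the 10th power. Using the distance finding above (OR 0.848 per 10 units, interpretation only):

```python
# Converting an odds ratio per 10-unit increase to a per-unit odds ratio.
import math

or_per_10_units = 0.848
or_per_unit = math.exp(math.log(or_per_10_units) / 10)
print(round(or_per_unit, 4))          # ~0.9836 per unit
print(round(or_per_unit ** 10, 3))    # ~0.848, i.e. about a 15% decrease per 10 units
```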
