ABSTRACT
BACKGROUND: Recent therapeutic advances and screening technologies have improved survival among patients with lung cancer, who are now at high risk of developing second primary lung cancer (SPLC). Recently, an SPLC risk-prediction model (called SPLC-RAT) was developed and validated using data from population-based epidemiological cohorts and clinical trials, but real-world validation has been lacking. The predictive performance of SPLC-RAT was evaluated in a hospital-based cohort of lung cancer survivors. METHODS: The authors analyzed data from 8448 ever-smoking patients diagnosed with initial primary lung cancer (IPLC) in 1997-2006 at Mayo Clinic, with each patient followed for SPLC through 2018. They evaluated the predictive performance of SPLC-RAT and further explored the potential for improving SPLC detection through risk model-based surveillance using SPLC-RAT versus existing clinical surveillance guidelines. RESULTS: Of 8448 IPLC patients, 483 (5.7%) developed SPLC over 26,470 person-years. The application of SPLC-RAT showed high discrimination (area under the receiver operating characteristic curve: 0.81). When the cohort was stratified by a 10-year risk threshold of ≥5.6% (i.e., the 80th percentile from the SPLC-RAT development cohort), the observed SPLC incidence was significantly elevated in the high-risk versus the low-risk subgroup (13.1% vs. 1.1%, p < 1 × 10⁻⁶). Risk-based surveillance through SPLC-RAT (≥5.6% threshold) outperformed the National Comprehensive Cancer Network guidelines, with higher sensitivity (86.4% vs. 79.4%) and specificity (38.9% vs. 30.4%), and required 20% fewer computed tomography follow-ups to detect one SPLC (162 vs. 202). CONCLUSION: In a large, hospital-based cohort, the authors validated the predictive performance of SPLC-RAT in identifying survivors at high risk of SPLC and showed its potential to improve SPLC detection through risk-based surveillance. PLAIN LANGUAGE SUMMARY: Lung cancer survivors have a high risk of developing second primary lung cancer (SPLC). However, no evidence-based guidelines for SPLC surveillance are available for lung cancer survivors. Recently, an SPLC risk-prediction model was developed and validated using data from population-based epidemiological cohorts and clinical trials, but real-world validation has been lacking. Using a large, real-world cohort of lung cancer survivors, we showed the high predictive accuracy and risk-stratification ability of the SPLC risk-prediction model. Furthermore, we demonstrated the potential to enhance efficiency in detecting SPLC using risk model-based surveillance strategies compared with existing consensus-based clinical guidelines, including those from the National Comprehensive Cancer Network.
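The surveillance comparison reported above reduces to confusion-matrix arithmetic on per-patient predicted risks. Below is a minimal sketch (not the authors' code) of how sensitivity, specificity, and CT follow-ups per detected SPLC could be computed for a risk-threshold rule; the variable names, the synthetic data, and the one-CT-per-flagged-patient simplification are illustrative assumptions.

```python
import numpy as np

def surveillance_metrics(risk, developed_splc, threshold=0.056, cts_per_flagged=1.0):
    """Evaluate a risk-threshold surveillance rule against observed SPLC outcomes.

    risk            : predicted 10-year SPLC risks, one per survivor
    developed_splc  : boolean array, True if the survivor developed SPLC
    threshold       : risk cutoff that triggers intensified CT surveillance
    cts_per_flagged : assumed CT follow-ups per flagged survivor (simplification)
    """
    risk = np.asarray(risk, dtype=float)
    y = np.asarray(developed_splc, dtype=bool)
    flagged = risk >= threshold

    sensitivity = (flagged & y).sum() / y.sum()        # flagged among true SPLC cases
    specificity = (~flagged & ~y).sum() / (~y).sum()   # unflagged among non-cases
    detected = (flagged & y).sum()
    cts_per_detected = flagged.sum() * cts_per_flagged / max(detected, 1)
    return sensitivity, specificity, cts_per_detected

# Toy illustration with synthetic numbers (not study data)
rng = np.random.default_rng(0)
risk = rng.beta(1, 15, size=8448)                  # skewed risk distribution
y = rng.random(8448) < np.clip(risk * 2, 0, 1)     # outcome loosely tied to risk
print(surveillance_metrics(risk, y))
```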
Subject(s)
Cancer Survivors , Lung Neoplasms , Neoplasms, Second Primary , Humans , Lung Neoplasms/diagnosis , Lung Neoplasms/epidemiology , Lung Neoplasms/therapy , Risk , Smoking , Lung
ABSTRACT
PURPOSE: In advanced non-small cell lung cancer (NSCLC), immune checkpoint inhibitor (ICI) monotherapy is often preferred over intensive ICI treatment for frail patients and those with poor performance status (PS). Among those with poor PS, the additional effect of frailty on treatment selection and mortality is unknown. METHODS: Patients in the Veterans Affairs National Precision Oncology Program who received first-line ICI for advanced NSCLC from 1/2019 to 12/2021 were followed until death or study end (6/2022). The association of an electronic frailty index with treatment selection was examined using logistic regression stratified by PS. We also examined overall survival (OS) on intensive treatment using Cox regression stratified by PS. Intensive treatment was defined as concurrent use of platinum-doublet chemotherapy and/or dual checkpoint blockade, and non-intensive as ICI monotherapy. RESULTS: Of 1547 patients receiving any ICI, 66.2% were frail, 33.8% had poor PS (≥2), and 25.8% were both. Frail patients received less intensive treatment than non-frail patients in both PS subgroups (Good PS: odds ratio [OR] 0.67, 95% confidence interval [CI] 0.51-0.88; Poor PS: OR 0.69, 95% CI 0.44-1.10). Among 731 patients receiving intensive treatment, frailty was associated with lower OS for those with good PS (hazard ratio [HR] 1.53, 95% CI 1.2-1.96), but no association was observed for those with poor PS (HR 1.03, 95% CI 0.67-1.58). CONCLUSION: Frail patients with both good and poor PS received less intensive treatment. However, frailty had a limited effect on survival among those with poor PS. These findings suggest that PS, not frailty, drives survival on intensive treatment.
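The analysis described in METHODS pairs stratified logistic regression with stratified Cox models. Below is a minimal sketch of the survival piece, assuming the lifelines library and a synthetic analytic table with placeholder column names; it only illustrates the workflow of fitting a Cox model for frailty within each performance-status stratum, not the authors' pipeline.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 731  # size of the intensive-treatment subgroup in the abstract; data are synthetic
df = pd.DataFrame({
    "frail":   rng.integers(0, 2, n),     # electronic frailty index flag (placeholder)
    "poor_ps": rng.integers(0, 2, n),     # 1 = performance status >= 2
})
# Synthetic survival times: frailty shortens survival only to illustrate the workflow
df["os_months"] = rng.exponential(18, n) / np.where(df["frail"] == 1, 1.4, 1.0)
df["death"] = (rng.random(n) < 0.7).astype(int)

# Fit a Cox model for overall survival within each performance-status stratum
for ps_val, label in [(0, "Good PS"), (1, "Poor PS")]:
    sub = df[df["poor_ps"] == ps_val]
    cph = CoxPHFitter()
    cph.fit(sub[["os_months", "death", "frail"]],
            duration_col="os_months", event_col="death")
    print(label, "HR for frailty = %.2f" % cph.hazard_ratios_["frail"])
```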
Subject(s)
Carcinoma, Non-Small-Cell Lung , Immune Checkpoint Inhibitors , Immunotherapy , Lung Neoplasms , Humans , Carcinoma, Non-Small-Cell Lung/drug therapy , Carcinoma, Non-Small-Cell Lung/mortality , Carcinoma, Non-Small-Cell Lung/therapy , Lung Neoplasms/drug therapy , Lung Neoplasms/mortality , Lung Neoplasms/therapy , Male , Female , Aged , Immunotherapy/methods , Immune Checkpoint Inhibitors/therapeutic use , Middle Aged , Frailty , Aged, 80 and over
ABSTRACT
BACKGROUND: In their 2021 lung cancer screening recommendation update, the U.S. Preventive Services Task Force (USPSTF) evaluated strategies that select people based on their personal lung cancer risk (risk model-based strategies), highlighting the need for further research on the benefits and harms of risk model-based screening. OBJECTIVE: To evaluate and compare the cost-effectiveness of risk model-based lung cancer screening strategies versus the USPSTF recommendation and to explore optimal risk thresholds. DESIGN: Comparative modeling analysis. DATA SOURCES: National Lung Screening Trial; Surveillance, Epidemiology, and End Results program; U.S. Smoking History Generator. TARGET POPULATION: 1960 U.S. birth cohort. TIME HORIZON: 45 years. PERSPECTIVE: U.S. health care sector. INTERVENTION: Annual low-dose computed tomography in risk model-based strategies that start screening at age 50 or 55 years, stop at age 80 years, and use 6-year risk thresholds between 0.5% and 2.2% from the PLCOm2012 model. OUTCOME MEASURES: Incremental cost-effectiveness ratio (ICER) and cost-effectiveness efficiency frontier connecting strategies with the highest health benefit at a given cost. RESULTS OF BASE-CASE ANALYSIS: Risk model-based screening strategies were more cost-effective than the USPSTF recommendation and exclusively comprised the cost-effectiveness efficiency frontier. Among the strategies on the efficiency frontier, those with a 6-year risk threshold of 1.2% or greater were cost-effective, with an ICER less than $100 000 per quality-adjusted life-year (QALY). Specifically, the strategy with a 1.2% risk threshold had an ICER of $94 659 (model range, $72 639 to $156 774), yielding more QALYs for less cost than the USPSTF recommendation while achieving a similar level of screening coverage (21.7% of persons ever screened vs. 22.6% under the USPSTF recommendation). RESULTS OF SENSITIVITY ANALYSES: Risk model-based strategies were robustly more cost-effective than the 2021 USPSTF recommendation under varying modeling assumptions. LIMITATION: Risk models were restricted to age, sex, and smoking-related risk predictors. CONCLUSION: Risk model-based screening is more cost-effective than the USPSTF recommendation, thus warranting further consideration. PRIMARY FUNDING SOURCE: National Cancer Institute (NCI).
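Identifying an efficiency frontier and computing ICERs from per-strategy cost and QALY estimates is a standard exercise; the sketch below shows one way to prune dominated and extendedly dominated strategies. The strategy names and cost/QALY values are invented for illustration and are not the study's estimates.

```python
def efficiency_frontier(strategies):
    """strategies: list of (name, cost per person, QALYs per person).
    Returns frontier strategies with ICERs versus the previous frontier point."""
    pts = sorted(strategies, key=lambda s: s[1])                 # ascending cost
    # drop strongly dominated strategies (costlier but no more effective)
    frontier = []
    for s in pts:
        if not frontier or s[2] > frontier[-1][2]:
            frontier.append(list(s))
    # drop extendedly dominated strategies until ICERs increase monotonically
    changed = True
    while changed:
        changed = False
        for i in range(1, len(frontier) - 1):
            icer_prev = (frontier[i][1] - frontier[i-1][1]) / (frontier[i][2] - frontier[i-1][2])
            icer_next = (frontier[i+1][1] - frontier[i][1]) / (frontier[i+1][2] - frontier[i][2])
            if icer_prev > icer_next:
                del frontier[i]
                changed = True
                break
    return [(f[0], None if i == 0 else
             (f[1] - frontier[i-1][1]) / (f[2] - frontier[i-1][2]))
            for i, f in enumerate(frontier)]

# Invented per-person costs and QALYs, for illustration only
print(efficiency_frontier([
    ("No screening",    0.0, 18.000),
    ("USPSTF 2021",  1200.0, 18.020),
    ("Risk >= 1.2%", 1150.0, 18.022),
    ("Risk >= 0.8%", 1600.0, 18.025),
]))
```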
Subject(s)
Lung Neoplasms , Humans , Middle Aged , Aged, 80 and over , Lung Neoplasms/diagnostic imaging , Cost-Effectiveness Analysis , Early Detection of Cancer/methods , Cost-Benefit Analysis , Lung , Quality-Adjusted Life Years , Mass Screening/methods
ABSTRACT
BACKGROUND: Patients with a prior malignancy are at elevated risk of developing subsequent primary malignancies (SPMs). However, the risk of developing subsequent primary glioblastoma (SPGBM) in patients with a prior cancer history is poorly understood. METHODS: We used the Surveillance, Epidemiology, and End Results (SEER) database and identified patients diagnosed with a non-CNS malignancy between 2000 and 2018. We calculated a modified standardized incidence ratio (M-SIR), defined as the ratio of the incidence of SPGBM among patients with an initial non-CNS malignancy to the incidence of GBM in the general population, stratified by sex, latency, and initial tumor location. RESULTS: Of the 5,326,172 patients diagnosed with a primary non-CNS malignancy, 3559 (0.07%) developed SPGBM. Among patients with SPGBM, 2312 (65.0%) were men, compared with 2,706,933 (50.8%) men in the total primary non-CNS malignancy cohort. The median age at diagnosis of SPGBM was 65 years. The mean latency between a prior non-CNS malignancy and the development of SPGBM was 67.3 months (interquartile range [IQR] 27-100). Overall, patients with a primary non-CNS malignancy had a significantly elevated M-SIR (1.13, 95% CI 1.09-1.16), corresponding to a 13% higher incidence of SPGBM than the incidence of GBM in the age-matched general population. When stratified by non-CNS tumor location, patients diagnosed with primary melanoma, lymphoma, prostate, breast, renal, or endocrine malignancies had a higher M-SIR (M-SIR range: 1.09-2.15). Patients with lung cancers (M-SIR 0.82, 95% CI 0.68-0.99) or stomach cancers (M-SIR 0.47, 95% CI 0.24-0.82) had a lower M-SIR. CONCLUSION: Patients with a history of prior non-CNS malignancy are at an overall increased risk of developing SPGBM relative to the incidence of GBM in the general population. However, the incidence of SPGBM after a prior non-CNS malignancy varies by primary tumor location, with some non-CNS malignancies showing an increased or decreased predisposition for SPGBM depending on tumor origin. These findings merit future investigation into whether these relationships reflect treatment effects or a previously unknown shared predisposition for glioblastoma and non-CNS malignancy.
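The M-SIR is an observed-to-expected ratio, so its point estimate and an exact Poisson confidence interval take only a few lines; a minimal sketch follows. The expected count passed in below is a hypothetical value (in the study it would come from applying age-matched general-population GBM rates to the cohort's person-time).

```python
from scipy.stats import chi2

def sir_with_ci(observed, expected, alpha=0.05):
    """Standardized incidence ratio with an exact (Garwood) Poisson 95% CI.

    observed : SPGBM cases seen in the prior-malignancy cohort
    expected : cases expected from age-matched general-population GBM rates
    """
    sir = observed / expected
    lower = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected) if observed > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return sir, lower, upper

# Illustrative only: 3559 observed cases against a hypothetical expected count
print(sir_with_ci(observed=3559, expected=3150))
```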
Subject(s)
Glioblastoma , Lymphoma , Neoplasms, Second Primary , Male , Humans , Aged , Female , Glioblastoma/epidemiology , Glioblastoma/complications , SEER Program , Neoplasms, Second Primary/etiology , Lymphoma/complications , Incidence , Risk Factors
ABSTRACT
PURPOSE: Given the rarity of disseminated disease at the time of initial evaluation for pediatric brain tumor patients, we sought to identify clinical and radiographic predictors of spinal metastasis (SM) at the time of presentation. METHODS: We performed a single-institution retrospective chart review of pediatric brain tumor patients who first presented between 2004 and 2018. We extracted information regarding patient demographics, radiographic attributes, and presenting symptoms. Univariate and multivariate logistic regression was used to estimate the association between measured variables and SMs. RESULTS: We identified 281 patients who met our inclusion criteria, of whom 19 had SM at initial presentation (6.8%). The most common symptoms at presentation were headache (n = 12; 63.2%), nausea/vomiting (n = 16; 84.2%), and gait abnormalities (n = 8; 41.2%). Multivariate models demonstrated that intraventricular and posterior fossa tumors were more frequently associated with SM (OR: 5.28, 95% CI: 1.79-15.59, p = 0.003), with 4th ventricular (OR: 7.42, 95% CI: 1.77-31.11, p = 0.006) and cerebellar parenchymal tumor location (OR: 4.79, 95% CI: 1.17-19.63, p = 0.030) carrying the highest risk for disseminated disease. In addition, evidence of intracranial leptomeningeal enhancement on magnetic resonance imaging (OR: 46.85, 95% CI: 12.31-178.28, p < 0.001) and hydrocephalus (OR: 3.19; 95% CI: 1.06-9.58; p = 0.038) were associated with SM. CONCLUSIONS: Intraventricular tumors and the presence of intracranial leptomeningeal disease were most frequently associated with disseminated disease at presentation. These findings are consistent with current clinical expectations and offer empirical evidence that heightened suspicion for SM may be prospectively applied to certain subsets of pediatric brain tumor patients at the time of presentation.
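A minimal sketch of extracting adjusted odds ratios and 95% CIs from a multivariable logistic regression, as in the analysis above; statsmodels is assumed, and the data frame, predictor names, and outcomes are synthetic placeholders rather than study data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 281  # cohort size in the abstract; values below are synthetic placeholders
df = pd.DataFrame({
    "spinal_met":       rng.integers(0, 2, n),
    "fourth_ventricle": rng.integers(0, 2, n),
    "leptomeningeal":   rng.integers(0, 2, n),
    "hydrocephalus":    rng.integers(0, 2, n),
})

X = sm.add_constant(df[["fourth_ventricle", "leptomeningeal", "hydrocephalus"]])
fit = sm.Logit(df["spinal_met"], X).fit(disp=0)

# Exponentiate coefficients and confidence limits to the odds-ratio scale
odds_ratios = np.exp(fit.params).rename("OR")
ci = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios, ci], axis=1))
```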
Subject(s)
Brain Neoplasms , Hydrocephalus , Child , Humans , Retrospective Studies , Brain Neoplasms/diagnostic imaging , Brain Neoplasms/secondary , Headache , Magnetic Resonance Imaging
ABSTRACT
OBJECTIVE: The natural history of seizure risk after brain tumor resection is not well understood. Identifying seizure-naive patients at highest risk for postoperative seizure events remains a clinical need. In this study, the authors sought to develop a predictive modeling strategy for anticipating postcraniotomy seizures after brain tumor resection. METHODS: The IBM Watson Health MarketScan Claims Database was canvassed for antiepileptic drug (AED)- and seizure-naive patients who underwent brain tumor resection (2007-2016). The primary event of interest was short-term seizure risk (within 90 days postdischarge). The secondary event of interest was long-term seizure risk during the follow-up period. To model early-onset and long-term postdischarge seizure risk, a penalized logistic regression classifier and multivariable Cox regression model, respectively, were built, which integrated patient-, tumor-, and hospitalization-specific features. To compare empirical seizure rates, equally sized cohort tertiles were created and labeled as low risk, medium risk, and high risk. RESULTS: Of 5470 patients, 983 (18.0%) had a postdischarge-coded seizure event. The integrated binary classification approach for predicting early-onset seizures outperformed models using feature subsets (area under the curve [AUC] = 0.751, hospitalization features only AUC = 0.667, patient features only AUC = 0.603, and tumor features only AUC = 0.694). Held-out validation patient cases that were predicted by the integrated model to have elevated short-term risk more frequently developed seizures within 90 days of discharge (24.1% high risk vs 3.8% low risk, p < 0.001). Compared with those in the low-risk tertile by the long-term seizure risk model, patients in the medium-risk and high-risk tertiles had 2.13 (95% CI 1.45-3.11) and 6.24 (95% CI 4.40-8.84) times higher long-term risk for postdischarge seizures. Only patients predicted as high risk developed status epilepticus within 90 days of discharge (1.7% high risk vs 0% low risk, p = 0.003). CONCLUSIONS: The authors have presented a risk-stratified model that accurately predicted short- and long-term seizure risk in patients who underwent brain tumor resection, which may be used to stratify future study of postoperative AED prophylaxis in highest-risk patient subpopulations.
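Two steps of the short-term modeling described above — fitting a penalized logistic classifier and comparing empirical event rates across predicted-risk tertiles — can be sketched as follows. scikit-learn is assumed, and the features and outcomes are synthetic; this illustrates the workflow, not the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n, p = 5470, 20                         # cohort size from the abstract; 20 synthetic features
X = rng.normal(size=(n, p))
y = (rng.random(n) < 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.8)))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(penalty="l2", C=0.1, max_iter=1000).fit(X_tr, y_tr)
risk = clf.predict_proba(X_te)[:, 1]
print("AUC:", round(roc_auc_score(y_te, risk), 3))

# Empirical seizure rates by predicted-risk tertile (low / medium / high)
tertile = np.digitize(risk, np.quantile(risk, [1 / 3, 2 / 3]))
for t, label in enumerate(["low", "medium", "high"]):
    print(label, "risk tertile event rate:", round(y_te[tertile == t].mean(), 3))
```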
Subject(s)
Anticonvulsants , Brain Neoplasms , Aftercare , Anticonvulsants/adverse effects , Brain Neoplasms/drug therapy , Brain Neoplasms/surgery , Humans , Patient Discharge , Retrospective Studies , Seizures/etiology
ABSTRACT
Evaluating gene-by-environment (G × E) interaction under an additive risk model (i.e., additive interaction) has gained wider attention. Recently, statistical tests that do not rely on restrictive genetic models, such as dominant or recessive models, have been proposed for detecting additive interaction, utilizing the assumption of gene-environment (G-E) independence to boost power. However, a major limitation of these methods is a sharp increase in type I error when this assumption is violated. Our goal was to develop a robust test for additive G × E interaction under the trend effect of genotype, applying an empirical Bayes-type shrinkage estimator of the relative excess risk due to interaction. The proposed method uses a set of constraints to impose the trend effect of genotype and builds an estimator that data-adaptively shrinks an estimator of the relative excess risk due to interaction obtained under a general model for G-E dependence, using a retrospective likelihood framework. A numerical study under varying levels of departure from G-E independence shows that the proposed method is robust against violation of the independence assumption while providing an adequate balance between bias and efficiency compared with existing methods. We applied the proposed method to genetic data on Alzheimer disease and lung cancer.
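The shrinkage idea can be written compactly: an estimate of the relative excess risk due to interaction (RERI) obtained under a general G-E dependence model is pulled toward the more efficient independence-based estimate, with less shrinkage when the two disagree (evidence against independence). The sketch below uses a common empirical Bayes weighting form and toy numbers; it is not necessarily the authors' exact estimator.

```python
def reri(rr11, rr10, rr01):
    """Relative excess risk due to interaction on the additive scale."""
    return rr11 - rr10 - rr01 + 1.0

def eb_shrink(theta_dep, var_dep, theta_indep):
    """Empirical Bayes-type shrinkage (a common construction, not necessarily the
    authors' exact estimator): pull the robust, dependence-allowing estimate toward
    the efficient independence-based estimate; shrink less when the gap is large."""
    gap2 = (theta_dep - theta_indep) ** 2
    w = var_dep / (var_dep + gap2)        # weight on the independence-based estimate
    return theta_dep + w * (theta_indep - theta_dep)

# Toy numbers: RERI under the dependence-allowing model vs. under independence
theta_dep = reri(2.6, 1.4, 1.3)           # = 0.9
theta_indep = 1.4
print(eb_shrink(theta_dep, var_dep=0.25, theta_indep=theta_indep))
```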
Subject(s)
Bayes Theorem , Gene-Environment Interaction , Genotype , Alzheimer Disease/etiology , Alzheimer Disease/genetics , Apolipoprotein E4/genetics , Empirical Research , Genetic Predisposition to Disease/genetics , Humans , Likelihood Functions , Lung Neoplasms/etiology , Lung Neoplasms/genetics , Models, Statistical , Polymorphism, Single Nucleotide/genetics , Retrospective Studies , Risk Factors , Smoking/adverse effects
ABSTRACT
Several statistical methods have been proposed for testing gene-environment (G-E) interactions under additive risk models using data from genome-wide association studies. However, these approaches rely on strong assumptions about the underlying genetic model, such as dominant or recessive effects, which are known to be less robust when the true genetic model is unknown. We aimed to develop a robust trend test employing a likelihood ratio test for detecting G-E interaction under an additive risk model, while incorporating the G-E independence assumption to increase power. We used a constrained likelihood to impose 2 sets of constraints: 1) the linear trend effect of genotype and 2) the additive joint effects of gene and environment. To incorporate the G-E independence assumption, a retrospective likelihood was used versus a standard prospective likelihood. Numerical investigation suggests that the proposed tests are more powerful than tests assuming dominant, recessive, or general models under various parameter settings and under both likelihoods. Incorporation of the independence assumption enhances efficiency by 2.5-fold. We applied the proposed methods to examine the gene-smoking interaction for lung cancer and the gene-apolipoprotein E ε4 interaction for Alzheimer disease, which identified interactions of apolipoprotein E ε4 with 2 loci, the membrane-spanning 4-domains subfamily A (MS4A) and bridging integrator 1 (BIN1) genes, at genome-wide significance; these were replicated using independent data.
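One way to write the two sets of constraints down, under assumed notation (a sketch of the general form, not necessarily the authors' exact parameterization or likelihood):

```latex
% Relative risk for genotype g \in \{0,1,2\} (trend coding) and exposure e \in \{0,1\}.
% Null model: joint effects are additive and the genotype effect is linear in g.
\mathrm{RR}(g,e) \;=\; 1 + g\,\theta_G + e\,\theta_E \qquad (H_0)
% Alternative: an additive-scale interaction term is allowed.
\mathrm{RR}(g,e) \;=\; 1 + g\,\theta_G + e\,\theta_E + g\,e\,\theta_{GE} \qquad (H_1)
% Likelihood-ratio statistic, with L maximized retrospectively under G-E independence:
\Lambda \;=\; 2\left\{\max_{H_1}\log L \;-\; \max_{H_0}\log L\right\}
\;\sim\; \chi^2_1 \ \text{under } H_0 .
```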
Subject(s)
Gene-Environment Interaction , Likelihood Functions , Models, Genetic , Adaptor Proteins, Signal Transducing/genetics , Alzheimer Disease/genetics , Apolipoprotein E4/genetics , Genetic Predisposition to Disease , Genome-Wide Association Study , Genotype , Humans , Lung Neoplasms/genetics , Membrane Proteins/genetics , Nuclear Proteins/genetics , Research Design , Smoking/adverse effects , Tumor Suppressor Proteins/genetics
ABSTRACT
Importance: The US Preventive Services Task Force (USPSTF) is updating its 2013 lung cancer screening guidelines, which recommend annual screening for adults aged 55 through 80 years who have a smoking history of at least 30 pack-years and currently smoke or have quit within the past 15 years. Objective: To inform the USPSTF guidelines by estimating the benefits and harms associated with various low-dose computed tomography (LDCT) screening strategies. Design, Setting, and Participants: Comparative simulation modeling with 4 lung cancer natural history models for individuals from the 1950 and 1960 US birth cohorts who were followed up from age 45 through 90 years. Exposures: Screening with varying starting ages, stopping ages, and screening frequency. Eligibility criteria based on age, cumulative pack-years, and years since quitting smoking (risk factor-based) or on age and individual lung cancer risk estimation using risk prediction models with varying eligibility thresholds (risk model-based). A total of 1092 LDCT screening strategies were modeled. Full uptake and adherence were assumed for all scenarios. Main Outcomes and Measures: Estimated lung cancer deaths averted and life-years gained (benefits) compared with no screening. Estimated lifetime number of LDCT screenings, false-positive results, biopsies, overdiagnosed cases, and radiation-related lung cancer deaths (harms). Results: Efficient screening programs estimated to yield the most benefits for a given number of screenings were identified. Most of the efficient risk factor-based strategies started screening at age 50 or 55 years and stopped at age 80 years. The 2013 USPSTF-recommended criteria were not among the efficient strategies for the 1960 US birth cohort. Annual strategies with a minimum criterion of 20 pack-years of smoking were efficient and, compared with the 2013 USPSTF-recommended criteria, were estimated to increase screening eligibility (20.6%-23.6% vs 14.1% of the population ever eligible), lung cancer deaths averted (469-558 per 100 000 vs 381 per 100 000), and life-years gained (6018-7596 per 100 000 vs 4882 per 100 000). However, these strategies were estimated to result in more false-positive test results (1.9-2.5 per person screened vs 1.9 per person screened with the USPSTF strategy), overdiagnosed lung cancer cases (83-94 per 100 000 vs 69 per 100 000), and radiation-related lung cancer deaths (29.0-42.5 per 100 000 vs 20.6 per 100 000). Risk model-based vs risk factor-based strategies were estimated to be associated with more benefits and fewer radiation-related deaths but more overdiagnosed cases. Conclusions and Relevance: Microsimulation modeling studies suggested that LDCT screening for lung cancer, compared with no screening, may increase lung cancer deaths averted and life-years gained when optimally targeted and implemented. Screening individuals from age 50 or 55 years through age 80 years with 20 pack-years or more of smoking exposure was estimated to result in more benefits than the 2013 USPSTF-recommended criteria and less disparity in screening eligibility by sex and race/ethnicity.
Subject(s)
Early Detection of Cancer , Lung Neoplasms/diagnostic imaging , Practice Guidelines as Topic , Tomography, X-Ray Computed , Aged , Early Detection of Cancer/adverse effects , Early Detection of Cancer/standards , Humans , Lung/diagnostic imaging , Lung Neoplasms/mortality , Lung Neoplasms/prevention & control , Middle Aged , Models, Theoretical , Risk Assessment , Sensitivity and Specificity , Smoking , Smoking Cessation , Tomography, X-Ray Computed/adverse effects , Tomography, X-Ray Computed/methods
ABSTRACT
[This corrects the article DOI: 10.1371/journal.pmed.1002277.].
ABSTRACT
INTRODUCTION: Deep brain stimulation (DBS) has emerged as a safe and effective therapy for refractory Tourette syndrome (TS). Recent studies have identified several neural targets at which DBS effectively reduces TS symptoms, but, to our knowledge, none has compared the effectiveness of DBS with that of conservative therapy. METHODS: A literature review was performed to identify studies reporting adult patient outcomes as Yale Global Tic Severity Scale (YGTSS) scores after DBS surgery, pharmacotherapy, and psychotherapy. Data were pooled using a random-effects model of inverse variance-weighted meta-analysis (n = 168 for DBS, n = 131 for medications, and n = 154 for behavioral therapy). RESULTS: DBS resulted in a significantly greater reduction in YGTSS total score (49.9 ± 17.5%) than pharmacotherapy (22.5 ± 15.2%, p = 0.001) or psychotherapy (20.0 ± 11.3%, p < 0.001), with complication (adverse effect) rates of 0.15/case, 1.13/case, and 0.60/case, respectively. CONCLUSION: Our data suggest that adult patients with refractory TS undergoing DBS experience greater symptomatic improvement, with surprisingly low morbidity, than can be obtained with pharmacotherapy or psychotherapy.
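The pooling step named in METHODS — inverse variance-weighted random-effects meta-analysis — can be sketched with the DerSimonian-Laird estimator; the study-level effect sizes and variances below are placeholders, not the extracted data.

```python
import numpy as np

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of study-level effect sizes
    (e.g., percent reduction in YGTSS total score) with within-study variances."""
    effects, variances = np.asarray(effects, float), np.asarray(variances, float)
    w_fixed = 1.0 / variances
    mu_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
    q = np.sum(w_fixed * (effects - mu_fixed) ** 2)       # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    w_rand = 1.0 / (variances + tau2)
    mu = np.sum(w_rand * effects) / np.sum(w_rand)
    se = np.sqrt(1.0 / np.sum(w_rand))
    return mu, (mu - 1.96 * se, mu + 1.96 * se)

# Placeholder study-level estimates of % YGTSS reduction and their variances
print(random_effects_pool([52.0, 45.5, 58.3, 40.1], [30.0, 45.0, 25.0, 60.0]))
```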
Subject(s)
Conservative Treatment/methods , Deep Brain Stimulation/methods , Tourette Syndrome/diagnostic imaging , Tourette Syndrome/therapy , Clinical Trials as Topic/methods , Conservative Treatment/trends , Deep Brain Stimulation/trends , Humans , Treatment Outcome
Subject(s)
COVID-19 , Paraproteinemias , COVID-19/complications , COVID-19 Vaccines , Humans , Plasma Cells , Risk Factors
ABSTRACT
Background and Purpose- Deep vein thrombosis (DVT) is a common disease with high morbidity if it progresses to pulmonary embolus (PE). Anticoagulation is the treatment of choice; warfarin has long been the standard of care. Early experience with direct oral anticoagulants (DOACs) suggests that these agents may be a safer and equally effective alternative in the treatment of DVT/PE. Nontraumatic intracranial hemorrhage (ICH) is one of the most devastating potential complications of anticoagulation therapy. We sought to compare the rates of ICH in patients treated with DOACs versus those treated with warfarin for DVT/PE. Methods- The MarketScan Commercial Claims and Medicare Supplemental databases were used. Adult DVT/PE patients without known atrial fibrillation and with prescriptions for either a DOAC or warfarin were followed for the occurrence of inpatient admission for ICH. Coarsened exact matching was used to balance the treatment cohorts. Cox proportional-hazards regressions and Kaplan-Meier survival curves were used to estimate the association between DOACs and the risk of ICH compared with warfarin. Results- The combined cohort of 218 620 patients had a median follow-up of 3.0 months and a mean age of 55.4 years, and 52.1% were women. The DOAC cohort had 26 980 patients and 8 ICH events (1.0 cases per 1000 person-years), and the warfarin cohort had 191 640 patients and 324 ICH events (3.3 cases per 1000 person-years; P<0.0001). The DOAC cohort had a lower hazard ratio for ICH compared with warfarin in both the unmatched (hazard ratio=0.26; P=0.0002) and matched (hazard ratio=0.20; P=0.0001) Cox proportional-hazards regressions. Conclusions- DOACs show superior safety to warfarin in terms of risk of ICH in patients with DVT/PE.
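The matching step can be illustrated as follows: coarsened exact matching bins the covariates and keeps only strata containing both treatment groups. A toy sketch under assumed column names and synthetic data; a full CEM implementation would also compute stratum weights and balance diagnostics.

```python
import numpy as np
import pandas as pd

def coarsened_exact_match(df, treatment_col, coarsen):
    """Toy coarsened exact matching: bin the chosen covariates, then keep only
    strata (unique bin combinations) containing both DOAC and warfarin patients."""
    work = df.copy()
    for col, edges in coarsen.items():
        work[col + "_bin"] = np.digitize(work[col], edges)
    bin_cols = [c + "_bin" for c in coarsen]
    counts = work.groupby(bin_cols)[treatment_col].nunique()
    good = set(counts[counts == 2].index)          # strata with both groups present
    keep = work.set_index(bin_cols).index.isin(good)
    return df[keep]

rng = np.random.default_rng(4)
cohort = pd.DataFrame({
    "doac": rng.integers(0, 2, 500),               # 1 = DOAC, 0 = warfarin (placeholder)
    "age": rng.normal(55, 12, 500),
    "comorbidity_score": rng.integers(0, 6, 500),
})
matched = coarsened_exact_match(cohort, "doac",
                                {"age": [40, 50, 60, 70], "comorbidity_score": [2, 4]})
print(len(cohort), "->", len(matched), "patients retained in matched strata")
```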
Subject(s)
Anticoagulants/administration & dosage , Atrial Fibrillation/epidemiology , Intracranial Hemorrhages/epidemiology , Pulmonary Embolism/epidemiology , Venous Thrombosis/epidemiology , Warfarin/administration & dosage , Administration, Oral , Adult , Aged , Anticoagulants/adverse effects , Atrial Fibrillation/diagnostic imaging , Atrial Fibrillation/drug therapy , Cohort Studies , Female , Follow-Up Studies , Humans , Intracranial Hemorrhages/chemically induced , Intracranial Hemorrhages/diagnostic imaging , Male , Middle Aged , Pulmonary Embolism/diagnostic imaging , Pulmonary Embolism/drug therapy , Retrospective Studies , Venous Thrombosis/diagnostic imaging , Venous Thrombosis/drug therapy , Warfarin/adverse effects
ABSTRACT
BACKGROUND: The incidence of hepatocellular carcinoma (HCC) has been rising rapidly in the United States. California is an ethnically diverse state with the largest number of incident HCC cases in the country. Characterizing HCC disparities in California may inform priorities for HCC prevention. METHODS: By using data from the Surveillance, Epidemiology, and End Results 18-Registry Database and the California Cancer Registry, age-adjusted HCC incidence in California from 2009 through 2013 was calculated by race/ethnicity and neighborhood ethnic enclave status. A geographic analysis was conducted using Medical Service Study Areas (MSSAs) as the geographic unit, and race/ethnicity-specific standardized incidence ratios (SIRs) were calculated to identify MSSAs with higher-than-expected HCC incidence compared with the statewide average. RESULTS: During 2009 through 2013, the age-adjusted incidence of HCC in California was highest in Asians/Pacific Islanders (APIs) and Hispanics (>100% higher than whites), especially those living in neighborhoods with higher ethnic enclave status (20%-30% higher than in neighborhoods with lower enclave status). Of the 542 MSSAs statewide, 42 had elevated HCC incidence (SIR ≥ 1.5; lower bound of 95% confidence interval > 1) for whites, 14 for blacks, 24 for APIs, and 36 for Hispanics. These MSSAs had 24% to 52% higher proportions of individuals below the 100% federal poverty line than other MSSAs. CONCLUSIONS: APIs and Hispanics residing in ethnic enclave neighborhoods and individuals residing in lower-income neighborhoods require more extensive preventive efforts tailored toward their unique risk factor profiles. The current race/ethnicity-specific geographic analysis can be extended to other states to inform priorities for targeted HCC prevention at the subcounty level, eventually reducing the HCC burden in the country.
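Age adjustment here is direct standardization: a weighted average of age-specific rates using a standard population's age distribution. A minimal sketch; the age bands, rates, and weights below are placeholders, not the actual 2000 U.S. standard population weights or registry rates.

```python
import numpy as np

def age_adjusted_rate(age_specific_rates, standard_weights):
    """Directly age-standardized rate: a weighted average of age-specific rates,
    with weights taken from a standard population's age distribution."""
    r = np.asarray(age_specific_rates, float)
    w = np.asarray(standard_weights, float)
    return float(np.sum(r * w / w.sum()))

# Placeholder age-specific HCC rates per 100,000 (four broad age bands) and
# placeholder standard-population weights -- illustrative values only
rates_group_a = [0.5, 4.0, 25.0, 60.0]
rates_group_b = [0.3, 2.0, 12.0, 30.0]
weights = [0.35, 0.30, 0.25, 0.10]
print(age_adjusted_rate(rates_group_a, weights),
      age_adjusted_rate(rates_group_b, weights))
```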
Subject(s)
Carcinoma, Hepatocellular/ethnology , Health Status Disparities , Liver Neoplasms/ethnology , Adult , Aged , Aged, 80 and over , California/epidemiology , Carcinoma, Hepatocellular/epidemiology , Carcinoma, Hepatocellular/prevention & control , Ethnicity/statistics & numerical data , Female , Geography , Health Plan Implementation/organization & administration , Health Plan Implementation/statistics & numerical data , Humans , Incidence , Liver Neoplasms/epidemiology , Liver Neoplasms/prevention & control , Male , Medical Oncology/organization & administration , Medical Oncology/statistics & numerical data , Middle Aged , Preventive Medicine/organization & administration , Preventive Medicine/statistics & numerical data , Racial Groups/statistics & numerical data , Registries , Residence Characteristics/statistics & numerical data , SEER Program
ABSTRACT
OBJECTIVE: A reduction in glucose metabolism in the posterior cingulate cortex (PCC) predicts conversion to Alzheimer's disease (AD) and tracks disease progression, signifying its importance in AD. We aimed to use decline in PCC glucose metabolism as a proxy for the development and progression of AD to discover common genetic variants associated with disease vulnerability. METHODS: We performed a genome-wide association study (GWAS) of decline in PCC fludeoxyglucose F 18 ([18F]FDG) positron emission tomography measured in Alzheimer's Disease Neuroimaging Initiative participants (n = 606). We then performed follow-up analyses to assess the impact of significant single-nucleotide polymorphisms (SNPs) on disease risk and longitudinal cognitive performance in a large independent data set (n = 870). Last, we assessed whether the significant SNP influences gene expression using two RNA sequencing data sets (n = 210 and n = 159). RESULTS: We demonstrate a novel genome-wide significant association between rs2273647-T in the gene PPP4R3A and reduced [18F]FDG decline (p = 4.44 × 10⁻⁸). In a follow-up analysis using an independent data set, we demonstrate a protective effect of this variant against risk of conversion to MCI or AD (p = 0.038) and against cognitive decline in individuals who develop dementia (p = 3.41 × 10⁻¹⁵). Furthermore, this variant is associated with altered gene expression in peripheral blood and altered PPP4R3A transcript expression in temporal cortex, suggesting a role at the molecular level. INTERPRETATION: PPP4R3A is a gene involved in AD risk and progression. Given the protective effect of this variant, PPP4R3A should be further investigated as a gene of interest in neurodegenerative diseases and as a potential target for AD therapies. Ann Neurol 2017;82:900-911.
Subject(s)
Alzheimer Disease/genetics , Alzheimer Disease/metabolism , Disease Progression , Genetic Variation/genetics , Phosphoprotein Phosphatases/genetics , Aged , Aged, 80 and over , Alzheimer Disease/diagnostic imaging , Female , Follow-Up Studies , Genome-Wide Association Study/methods , Humans , Longitudinal Studies , Male , Positron-Emission Tomography/trends
ABSTRACT
BACKGROUND: Metabolomics is emerging as an important tool for detecting differences between diseased and non-diseased individuals. However, prospective studies are limited. METHODS: We examined the detectability, reliability, and distribution of metabolites measured in pre-diagnostic plasma samples in a pilot study of women enrolled in the Northern California site of the Breast Cancer Family Registry. The study included 45 cases diagnosed with breast cancer at least one year after the blood draw and 45 controls. Controls were matched on age (within 5 years), family status, BRCA status, and menopausal status. Duplicate samples were included for reliability assessment. We used a liquid chromatography/gas chromatography-mass spectrometry platform to measure metabolites. We calculated intraclass correlation coefficients (ICCs) among duplicate samples and coefficients of variation (CVs) across metabolites. RESULTS: Of the 661 named metabolites detected, 338 (51%) were found in all samples, and 490 (74%) in more than 80% of samples. The median ICC between duplicates was 0.96 (25th-75th percentile: 0.82-0.99). We observed a greater than 20% case-control difference in 24 metabolites (p < 0.05), although these associations were not significant after adjusting for multiple comparisons. CONCLUSIONS: These data show that the assays are reproducible for many metabolites, with minimal laboratory variation for the same sample and large between-person variation. Despite the small sample size, differences between cases and controls in some metabolites suggest that a well-powered, large-scale study is likely to detect biologically meaningful differences and provide a better understanding of breast cancer etiology.
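The reliability metrics reported — intraclass correlation across duplicate samples and coefficient of variation — reduce to short formulas. A minimal sketch for a single metabolite measured in duplicate (one-way random-effects ICC), with synthetic values standing in for the assay data.

```python
import numpy as np

def icc_oneway(rep1, rep2):
    """One-way random-effects ICC for duplicate measurements of one metabolite:
    (MS_between - MS_within) / (MS_between + (k - 1) * MS_within), with k = 2."""
    x = np.column_stack([rep1, rep2]).astype(float)
    n, k = x.shape
    grand = x.mean()
    ms_between = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_within = np.sum((x - x.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

def cv_percent(values):
    """Coefficient of variation (%) across samples for one metabolite."""
    values = np.asarray(values, float)
    return 100.0 * values.std(ddof=1) / values.mean()

rng = np.random.default_rng(5)
true_level = rng.lognormal(0, 0.5, 45)          # 45 subjects, as in the pilot
dup1 = true_level * rng.normal(1, 0.05, 45)     # small technical noise
dup2 = true_level * rng.normal(1, 0.05, 45)
print("ICC:", round(icc_oneway(dup1, dup2), 3), " CV%:", round(cv_percent(dup1), 1))
```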
Subject(s)
Breast Neoplasms/metabolism , Metabolomics/methods , Registries/statistics & numerical data , Adult , Aged , Breast Neoplasms/blood , Breast Neoplasms/epidemiology , Breast Neoplasms/etiology , California/epidemiology , Case-Control Studies , Chromatography, Liquid/methods , Female , Gas Chromatography-Mass Spectrometry/methods , Humans , Metabolome , Middle Aged , Pilot Projects , Prospective Studies , Reproducibility of Results
ABSTRACT
OBJECTIVE: The aim of this study was to evaluate contemporary practices and opinions among gynecologic oncologists regarding the use of total pelvic exenteration (TPE) for palliative intent. METHODS: This cross-sectional study of the membership of the Society of Gynecologic Oncology utilized an electronic survey to assess the opinions and practice patterns of gynecologic oncologists regarding TPEs. The primary outcome was willingness to consider a TPE for palliative intent, and demographic and practice characteristics were collected for correlation. Qualitative data were also collected. Descriptive statistics are presented, and χ² tests, Fisher exact tests, and logistic regression analyses were used. RESULTS: We included 315 surveys for analysis, for a completed response rate of 23.5%. Approximately half (52.4%, n = 165) of respondents indicated willingness to consider palliative TPE. When controlled for all variables, gynecologic oncologists who were more than 10 years out of fellowship were less likely to perform a palliative exenteration (odds ratio, 0.55; 95% confidence interval, 0.30-0.98), whereas those who reported experience with minimally invasive exenteration were more likely to offer it for palliation (odds ratio, 2.20; 95% confidence interval, 1.07-4.73). Fifty-three respondents (16.8%) provided qualitative data. The themes that emerged as considerations for TPE as palliation were (1) symptoms and quality of life, (2) surgical and perioperative morbidity, (3) anticipated overall survival, (4) counseling and informed consent, (5) functional status and comorbidities, (6) likelihood of residual disease, and (7) alternative procedures available for palliation. CONCLUSION: Half of gynecologic oncologists seem to be willing to offer a palliative TPE, although more-experienced gynecologic oncologists are more likely to reserve the procedure for curative intent.
Subject(s)
Endometrial Neoplasms/surgery , Adult , Aged , Aged, 80 and over , Chemotherapy, Adjuvant , Cross-Sectional Studies , Disease-Free Survival , Endometrial Neoplasms/drug therapy , Endometrial Neoplasms/pathology , Female , Guideline Adherence , Gynecology/methods , Gynecology/standards , Humans , Hysterectomy , Lymph Node Excision , Medical Oncology/methods , Medical Oncology/standards , Middle Aged , Neoplasm Staging , Palliative Care , Pelvic Exenteration , Retrospective Studies , Risk Factors , Salpingo-oophorectomy , Treatment Outcome
ABSTRACT
The U.S. Preventive Services Task Force (USPSTF) recently updated their national lung screening guidelines and recommended low-dose computed tomography (LDCT) for lung cancer (LC) screening through age 80. However, the risk of overdiagnosis among older populations is a concern. Using four comparative models from the Cancer Intervention and Surveillance Modeling Network, we evaluate the overdiagnosis of the screening program recommended by the USPSTF in the U.S. 1950 birth cohort. We estimate the number of LC deaths averted by screening (D) per overdiagnosed case (O), yielding the ratio D/O, to quantify the trade-off between the harms and benefits of LDCT. We analyze 576 hypothetical screening strategies that vary by age, smoking, and screening frequency and evaluate efficient screening strategies that maximize the D/O ratio and other metrics, including D and life-years gained (LYG) per overdiagnosed case. The estimated D/O ratio for the USPSTF screening program is 2.85 (model range: 1.5-4.5) in the 1950 birth cohort, implying that LDCT can prevent approximately 3 LC deaths per overdiagnosed case. This D/O ratio increases by 22% when the program stops screening at age 75 instead of 80. Efficiency frontier analysis shows that while the most efficient screening strategies that maximize the mortality reduction (D) irrespective of overdiagnosis screen through age 80, screening strategies that stop at age 75 versus 80 produce greater efficiency in increasing life-years gained per overdiagnosed case. Given the risk of overdiagnosis with LC screening, the stopping age of screening merits further consideration when balancing benefits and harms.
Subject(s)
Lung Neoplasms/diagnosis , Medical Overuse/statistics & numerical data , Aged , Aged, 80 and over , Early Detection of Cancer/methods , Female , Humans , Male , Mass Screening/methods , Models, Theoretical , Risk Assessment/methods , Time Factors , Tomography, X-Ray Computed
ABSTRACT
BACKGROUND: Selection of candidates for lung cancer screening based on individual risk has been proposed as an alternative to criteria based on age and cumulative smoking exposure (pack-years). Nine previously established risk models were assessed for their ability to identify those most likely to develop or die from lung cancer. All models considered age and various aspects of smoking exposure (smoking status, smoking duration, cigarettes per day, pack-years smoked, time since smoking cessation) as risk predictors. In addition, some models considered factors such as gender, race, ethnicity, education, body mass index, chronic obstructive pulmonary disease, emphysema, personal history of cancer, personal history of pneumonia, and family history of lung cancer. METHODS AND FINDINGS: Retrospective analyses were performed on 53,452 National Lung Screening Trial (NLST) participants (1,925 lung cancer cases and 884 lung cancer deaths) and 80,672 Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO) ever-smoking participants (1,463 lung cancer cases and 915 lung cancer deaths). Six-year lung cancer incidence and mortality risk predictions were assessed for (1) calibration (graphically) by comparing the agreement between the predicted and the observed risks, (2) discrimination (area under the receiver operating characteristic curve [AUC]) between individuals with and without lung cancer (death), and (3) clinical usefulness (net benefit in decision curve analysis) by identifying risk thresholds at which applying risk-based eligibility would improve lung cancer screening efficacy. To further assess performance, risk model sensitivities and specificities in the PLCO were compared to those based on the NLST eligibility criteria. Calibration was satisfactory, but discrimination ranged widely (AUCs from 0.61 to 0.81). The models outperformed the NLST eligibility criteria over a substantial range of risk thresholds in decision curve analysis, with a higher sensitivity for all models and a slightly higher specificity for some models. The PLCOm2012, Bach, and Two-Stage Clonal Expansion incidence models had the best overall performance, with AUCs >0.68 in the NLST and >0.77 in the PLCO. These three models had the highest sensitivity and specificity for predicting 6-y lung cancer incidence in the PLCO chest radiography arm, with sensitivities >79.8% and specificities >62.3%. In contrast, the NLST eligibility criteria yielded a sensitivity of 71.4% and a specificity of 62.2%. Limitations of this study include the lack of identification of optimal risk thresholds, as this requires additional information on the long-term benefits (e.g., life-years gained and mortality reduction) and harms (e.g., overdiagnosis) of risk-based screening strategies using these models. In addition, information on some predictor variables included in the risk prediction models was not available. CONCLUSIONS: Selection of individuals for lung cancer screening using individual risk is superior to selection criteria based on age and pack-years alone. The benefits, harms, and feasibility of implementing lung cancer screening policies based on risk prediction models should be assessed and compared with those of current recommendations.
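The "clinical usefulness" assessment named here is decision curve analysis, where net benefit at a risk threshold weighs true positives against false positives. Below is a minimal sketch comparing a risk-based rule against a fixed eligibility rule on synthetic data; the threshold and selection rules are illustrative, not the study's.

```python
import numpy as np

def net_benefit(selected, outcome, threshold):
    """Decision-curve net benefit of screening the selected individuals at a given
    risk threshold pt: TP/n - FP/n * (pt / (1 - pt))."""
    selected = np.asarray(selected, bool)
    outcome = np.asarray(outcome, bool)
    n = len(outcome)
    tp = np.sum(selected & outcome) / n
    fp = np.sum(selected & ~outcome) / n
    return tp - fp * threshold / (1 - threshold)

# Synthetic cohort: 6-year lung cancer risk scores and loosely linked outcomes
rng = np.random.default_rng(6)
risk = rng.beta(1.2, 40, 50_000)
cancer = rng.random(50_000) < risk
criteria_based = rng.random(50_000) < 0.25            # placeholder age/pack-year rule

pt = 0.015                                            # illustrative 1.5% risk threshold
print("risk-model selection :", net_benefit(risk >= pt, cancer, pt))
print("criteria-based rule  :", net_benefit(criteria_based, cancer, pt))
```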
Subject(s)
Early Detection of Cancer/methods , Lung Neoplasms/diagnosis , Models, Theoretical , Patient Selection , Aged , Female , Humans , Male , Mass Screening/methods , Middle Aged , Retrospective Studies , Risk Assessment , Risk Factors
ABSTRACT
BACKGROUND: The US Preventive Services Task Force (USPSTF) recently recommended that individuals aged 55-80 with a heavy smoking history be screened annually by low-dose computed tomography (LDCT), thereby extending the stopping age from 74 to 80 compared with the National Lung Screening Trial (NLST) entry criterion. This decision was made partly with model-based analyses from the Cancer Intervention and Surveillance Modeling Network (CISNET), which assumed perfect compliance with screening. METHODS: As part of CISNET, we developed a microsimulation model for lung cancer (LC) screening and calibrated and validated it using data from the NLST and the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial, respectively. We evaluated population-level outcomes of the lifetime screening program recommended by the USPSTF while varying screening compliance levels. RESULTS: Validation using PLCO shows that our model reproduces observed PLCO outcomes, predicting 884 LC cases [expected (E)/observed (O) = 0.99; CI 0.92-1.06] and 563 LC deaths (E/O = 0.94; CI 0.87-1.03) in the screening arm, which had an average compliance rate of 87.9% over four annual screening rounds. We predict that perfect compliance with the USPSTF recommendation saves 501 LC deaths per 100,000 persons in the 1950 U.S. birth cohort; however, under compliance behaviors extrapolated from PLCO and varied around them, the number of LC deaths avoided falls to 258, 230, and 175 as the average compliance rate over 26 annual screening rounds decreases from 100% to 46%, 39%, and 29%, respectively. CONCLUSION: The implementation of the USPSTF recommendation is expected to contribute to a reduction in LC deaths, but the magnitude of the reduction will likely be heavily influenced by screening compliance.