Results 1 - 20 of 150
1.
Am J Epidemiol ; 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38965750

ABSTRACT

In cohort studies, it can be infeasible to collect specimens on an entire cohort. For example, to estimate the sensitivity of multiple Multi-Cancer Detection (MCD) assays, we desire an extra 80 mL of cell-free DNA (cfDNA) blood, but this much extra blood is too expensive to collect from everyone. We propose a novel epidemiologic study design that efficiently oversamples those at highest baseline disease risk for specimen collection, to increase the number of future cases with cfDNA blood collected. The variance reduction ratio from our risk-based subsample versus a simple random (sub)sample (SRS) depends primarily on the ratio of risk model sensitivity to the fraction of the cohort selected for specimen collection, subject to a constraint on risk model specificity. In a simulation in which we chose the 34% of the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial cohort at highest risk of lung cancer for cfDNA blood collection, we enriched the number of lung cancers 2.42-fold and reduced the standard deviation of lung-cancer MCD sensitivity by 31-33% versus SRS. Risk-based collection of specimens on a subsample of the cohort could be a feasible and efficient approach to collecting extra specimens for molecular epidemiology.
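The enrichment logic of this design can be sketched numerically. Below is a minimal, illustrative simulation (hypothetical risk distribution and parameters, not the paper's PLCO data): cohort members at highest baseline risk are selected for specimen collection, and the case yield among the selected is compared against the expectation under simple random subsampling.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical baseline absolute-risk scores: higher score -> higher case probability.
risk = rng.beta(2, 50, size=n)
case = rng.random(n) < risk            # future cases, driven by baseline risk

# Risk-based design: collect specimens on the top 34% of the cohort by risk.
frac = 0.34
cut = np.quantile(risk, 1 - frac)
selected = risk >= cut

# Enrichment: case rate among the selected vs. the overall case rate,
# which is what a simple random subsample would yield in expectation.
enrichment = case[selected].mean() / case.mean()
print(f"case enrichment vs SRS: {enrichment:.2f}x")
```

The enrichment factor is bounded above by 1/frac (reached only if every future case falls in the selected group), which mirrors the paper's point that the gain is governed by risk model sensitivity relative to the fraction selected.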

2.
Vaccine ; 2024 Jul 13.
Article in English | MEDLINE | ID: mdl-39004528

ABSTRACT

Though widely applied in other epidemiological fields, the case-cohort study design has seen little application in vaccinology. Case-cohort studies use probabilistic sampling and reweighting to draw inferences about population-level effects (in this case, vaccine efficacy) in an efficient manner. The SARS-CoV-2 pandemic was met with high vaccine uptake and high rates of population testing prior to the emergence of Omicron variants of concern in Ontario, Canada, providing an ideal environment for application of case-cohort methodology. We combined a population-based case line list and vaccination database for the province of Ontario between December 2020 and October 2021. Risk of infection after vaccination was evaluated in all laboratory-confirmed vaccinated SARS-CoV-2 cases and a 2% sample of vaccinated controls, using survival analytic methods, including Cox proportional hazards models. Vaccination status was treated as a time-varying covariate. First and second doses of SARS-CoV-2 vaccine markedly reduced risk of infection (first-dose efficacy 68%, 95% CI 67%-69%; second-dose efficacy 88%, 95% CI 87%-88%). In multivariable models, extended dosing intervals were associated with the lowest risk of breakthrough infection (HR for redosing at 6-8 weeks: 0.64, 95% CI 0.61-0.67). Heterologous vaccine schedules that mixed viral-vector first doses with mRNA second doses were significantly more effective than mRNA-only schedules. Risk of infection largely vanished during the period 4-6 months after the second vaccine dose but rose markedly thereafter. We conclude that a case-cohort design provided an efficient means to identify strong protective effects associated with SARS-CoV-2 vaccination in real time, and also served to quantify the timing and magnitude of breakthrough infection risk in the same cohort. Heterologous vaccination and extended dosing intervals improved the durability of the immune response.

3.
Int J Epidemiol ; 53(4)2024 Jun 12.
Article in English | MEDLINE | ID: mdl-39008896

ABSTRACT

BACKGROUND: Epstein-Barr virus (EBV) is a major cause of nasopharyngeal carcinoma (NPC), and measurement of different EBV antibodies in blood may improve early detection of NPC. Prospective studies can help assess the roles of different EBV antibodies in predicting NPC risk over time. METHODS: A case-cohort study within the prospective China Kadoorie Biobank of 512 715 adults from 10 (including two NPC-endemic) areas included 295 incident NPC cases and 745 subcohort participants. A multiplex serology assay was used to quantify IgA and IgG antibodies against 16 EBV antigens in stored baseline plasma samples. Cox regression was used to estimate adjusted hazard ratios (HRs) for NPC, and C-statistics were used to assess the discriminatory ability of EBV markers, including two previously identified EBV-marker combinations, for predicting NPC. RESULTS: Seropositivity for 15 of the 16 EBV markers was significantly associated with higher NPC risk. IgA and IgG antibodies against the same three EBV antigens showed the most extreme HRs: BGLF2 (IgA: 124.2, 95% CI 63.3-243.9; IgG: 8.6, 5.5-13.5), LF2 (IgA: 67.8, 30.0-153.1; IgG: 10.9, 7.2-16.4), and BFRF1 (IgA: 26.1, 10.1-67.5; IgG: 6.1, 2.7-13.6). A two-marker combination (LF2/BGLF2 IgG) and a four-marker combination (LF2/BGLF2 IgG plus LF2/EA-D IgA) yielded C-statistics of 0.85 and 0.84, respectively, which persisted for at least 5 years after sample collection in both endemic and non-endemic areas. CONCLUSIONS: In Chinese adults, plasma EBV markers strongly predict NPC occurrence many years before clinical diagnosis. LF2 and BGLF2 IgG could identify individuals at high risk of NPC to improve early detection in community and clinical settings.


Subject(s)
Antibodies, Viral , Early Detection of Cancer , Epstein-Barr Virus Infections , Herpesvirus 4, Human , Immunoglobulin A , Immunoglobulin G , Nasopharyngeal Carcinoma , Nasopharyngeal Neoplasms , Humans , Male , China/epidemiology , Female , Middle Aged , Herpesvirus 4, Human/immunology , Prospective Studies , Antibodies, Viral/blood , Nasopharyngeal Carcinoma/virology , Nasopharyngeal Carcinoma/blood , Nasopharyngeal Carcinoma/immunology , Nasopharyngeal Carcinoma/epidemiology , Nasopharyngeal Neoplasms/virology , Nasopharyngeal Neoplasms/blood , Nasopharyngeal Neoplasms/immunology , Nasopharyngeal Neoplasms/epidemiology , Epstein-Barr Virus Infections/immunology , Epstein-Barr Virus Infections/epidemiology , Epstein-Barr Virus Infections/blood , Adult , Immunoglobulin A/blood , Early Detection of Cancer/methods , Immunoglobulin G/blood , Aged , Case-Control Studies , Proportional Hazards Models , East Asian People
4.
Environ Res ; 259: 119560, 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38971361

ABSTRACT

INTRODUCTION: Per- and polyfluoroalkyl substances (PFAS) are environmentally persistent, potentially carcinogenic chemicals. Previous studies investigating PFAS exposure and prostate cancer yielded mixed findings. We aimed to investigate associations between PFAS exposure and incident prostate cancer in a large cohort of U.S. men, overall and by selected demographic, lifestyle, and medical characteristics. METHODS: We conducted a case-cohort study among Cancer Prevention Study-II LifeLink Cohort participants who, at baseline (1998-2001), had serum specimens collected and no prior cancer diagnosis. The study included all men diagnosed with prostate cancer (n = 1610) during follow-up (baseline through June 30, 2015) and a random subcohort of 500 men. PFAS concentrations [perfluorohexane sulfonic acid (PFHxS), perfluorooctane sulfonate (PFOS), perfluorononanoic acid (PFNA), and perfluorooctanoic acid (PFOA)] were measured in stored serum specimens. We used multivariable Cox proportional hazards models to estimate associations between PFAS concentrations and prostate cancer, overall and by selected characteristics (grade, stage, family history, age, education, smoking status, and alcohol consumption). RESULTS: Prostate cancer hazards were slightly higher among men with concentrations in the highest (Q4) versus lowest quartile (Q1) for PFHxS [hazard ratio (HR) (95% CI): 1.18 (0.88-1.59)] and PFOS [HR (95% CI): 1.18 (0.89-1.58)], but not for PFNA or PFOA. However, we observed heterogeneous associations by age, family history of prostate cancer (PFHxS), alcohol consumption (PFHxS), and education (PFNA). For example, no meaningful associations were observed among men aged <70 years at serum collection, but among men aged ≥70 years, the HRs (95% CIs) comparing Q4 to Q1 were 1.54 (1.02-2.31) for PFHxS and 1.62 (1.08-2.44) for PFOS. No meaningful heterogeneity in associations was observed by tumor grade or stage.
CONCLUSIONS: Our findings do not clearly support an association between the PFAS considered and prostate cancer. However, the positive associations observed in some subgroups, and the consistently positive associations observed for PFHxS, warrant further investigation.

5.
Kidney Med ; 6(6): 100834, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38826568

ABSTRACT

Rationale & Objective: Tubulointerstitial damage is a feature of early chronic kidney disease (CKD), but current clinical tests capture it poorly. Urine biomarkers of tubulointerstitial health may identify risk of CKD. Study Design: Prospective cohort (Atherosclerosis Risk in Communities [ARIC]) and case-cohort (Multi-Ethnic Study of Atherosclerosis [MESA] and Reasons for Geographic and Racial Differences in Stroke [REGARDS]). Setting & Participants: Adults with estimated glomerular filtration rate (eGFR) ≥60 mL/min/1.73 m2 and without diabetes in the ARIC, REGARDS, and MESA studies. Exposures: Baseline urine monocyte chemoattractant protein-1 (MCP-1), alpha-1-microglobulin (α1m), kidney injury molecule-1, epidermal growth factor, and chitinase-3-like protein 1. Outcome: Incident CKD or end-stage kidney disease. Analytical Approach: Multivariable Cox proportional hazards regression for each cohort; meta-analysis of results from all 3 cohorts. Results: 872 ARIC participants (444 cases of incident CKD), 636 MESA participants (158 cases), and 924 REGARDS participants (488 cases) were sampled. Across cohorts, mean age ranged from 60 ± 10 to 63 ± 8 years, and baseline eGFR ranged from 88 ± 13 to 91 ± 14 mL/min/1.73 m2. In ARIC, higher concentrations of urine MCP-1, α1m, and kidney injury molecule-1 were associated with incident CKD. In MESA, higher concentration of urine MCP-1 and lower concentration of epidermal growth factor were each associated with incident CKD. In REGARDS, none of the biomarkers were associated with incident CKD. In meta-analysis of all 3 cohorts, each 2-fold increase in α1m concentration was associated with incident CKD (HR, 1.19; 95% CI, 1.08-1.31). Limitations: Observational design susceptible to confounding; competing risks during long follow-up period; meta-analysis limited to 3 cohorts. Conclusions: In 3 combined cohorts of adults without prevalent CKD or diabetes, higher urine α1m concentration was independently associated with incident CKD. Four biomarkers were associated with incident CKD in at least 1 of the cohorts when analyzed individually. Kidney tubule health markers might inform CKD risk independent of eGFR and albuminuria.


This study analyzed 3 cohorts (ARIC, MESA, and REGARDS) of adults without diabetes or prevalent chronic kidney disease (CKD) to determine the associations of 5 urinary biomarkers of kidney tubulointerstitial health with incident CKD, independent of traditional measures of kidney health. Meta-analysis of results from all 3 cohorts suggested that higher baseline levels of urine alpha-1-microglobulin were associated with incident CKD at follow-up. Results from individual cohorts suggested that in addition to alpha-1-microglobulin, monocyte chemoattractant protein-1, kidney injury molecule-1, and epidermal growth factor may also be associated with the development of CKD. These findings underscore the importance of kidney tubule interstitial health in defining risk of CKD independent of creatinine and urine albumin.
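The meta-analytic step described above can be sketched as fixed-effect inverse-variance pooling of log hazard ratios across cohorts. The per-cohort HRs and confidence intervals below are illustrative placeholders, not the study's estimates; the per-cohort standard error is backed out of each reported 95% CI.

```python
import math

# Illustrative per-cohort hazard ratios with 95% CIs (hypothetical values).
cohorts = {"A": (1.25, 1.05, 1.49), "B": (1.10, 0.90, 1.34), "C": (1.20, 1.02, 1.41)}

def pooled_hr(results):
    """Fixed-effect inverse-variance pooling on the log-HR scale."""
    num = den = 0.0
    for hr, lo, hi in results.values():
        log_hr = math.log(hr)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the CI width
        w = 1.0 / se**2                                   # inverse-variance weight
        num += w * log_hr
        den += w
    est = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(est),
            math.exp(est - 1.96 * se_pooled),
            math.exp(est + 1.96 * se_pooled))

hr, lo, hi = pooled_hr(cohorts)
print(f"pooled HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Because the pooling happens on the log scale, the pooled HR always lies between the smallest and largest cohort-specific HRs, weighted toward the most precise cohorts.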

6.
Scand J Trauma Resusc Emerg Med ; 32(1): 47, 2024 May 21.
Article in English | MEDLINE | ID: mdl-38773613

ABSTRACT

BACKGROUND: Care for injured patients in England is provided by inclusive regional trauma networks. Ambulance services use triage tools to identify patients with major trauma who would benefit from expedited Major Trauma Centre (MTC) care. However, there has been no investigation of triage performance, despite its role in ensuring effective and efficient MTC care. This study aimed to investigate the accuracy of prehospital major trauma triage in representative English trauma networks. METHODS: A diagnostic case-cohort study was performed between November 2019 and February 2020 in 4 English regional trauma networks as part of the Major Trauma Triage Study (MATTS). Consecutive patients with acute injury presenting to participating ambulance services were included, together with all reference standard positive cases, and matched to data from the English national major trauma database. The index test was prehospital provider triage decision making, with a positive result defined as patient transport with a pre-alert call to the MTC. The primary reference standard was a consensus definition of serious injury that would benefit from expedited major trauma centre care. Secondary analyses explored different reference standards and compared theoretical triage tool accuracy to real-life triage decisions. RESULTS: The complete-case case-cohort sample consisted of 2,757 patients, including 959 primary reference standard positive patients. The prevalence of major trauma meeting the primary reference standard definition was 3.1% (n=54/1,722, 95% CI 2.3 - 4.0). Observed prehospital provider triage decisions demonstrated overall sensitivity of 46.7% (n=446/959, 95% CI 43.5-49.9) and specificity of 94.5% (n=1,703/1,798, 95% CI 93.4-95.6) for the primary reference standard. There was a clear trend of decreasing sensitivity and increasing specificity from younger to older age groups. 
Prehospital provider triage decisions commonly differed from the theoretical triage tool result, with ambulance service clinician judgement resulting in higher specificity. CONCLUSIONS: Prehospital decision making for injured patients in English trauma networks demonstrated high specificity and low sensitivity, consistent with the targets for cost-effective triage defined in previous economic evaluations. Actual triage decisions differed from theoretical triage tool results, with a decreasing sensitivity and increasing specificity from younger to older ages.


Subject(s)
Emergency Medical Services , Trauma Centers , Triage , Humans , Triage/methods , England , Female , Male , Middle Aged , Adult , Trauma Centers/organization & administration , Wounds and Injuries/diagnosis , Wounds and Injuries/therapy , Aged , Cohort Studies , Injury Severity Score
7.
Lifetime Data Anal ; 30(3): 572-599, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38565754

ABSTRACT

The case-cohort design obtains complete covariate data only on cases and on a random sample (the subcohort) of the entire cohort. Subsequent publications described the use of stratification and weight calibration to increase efficiency of estimates of Cox model log-relative hazards, and there has been some work estimating pure risk. Yet there are few examples of these options in the medical literature, and we could not find programs currently online to analyze these various options. We therefore present a unified approach and R software to facilitate such analyses. We used influence functions adapted to the various design and analysis options together with variance calculations that take the two-phase sampling into account. This work clarifies when the widely used "robust" variance estimate of Barlow (Biometrics 50:1064-1072, 1994) is appropriate. The corresponding R software, CaseCohortCoxSurvival, facilitates analysis with and without stratification and/or weight calibration, for subcohort sampling with or without replacement. We also allow for phase-two data to be missing at random for stratified designs. We provide inference not only for log-relative hazards in the Cox model, but also for cumulative baseline hazards and covariate-specific pure risks. We hope these calculations and software will promote wider use of more efficient and principled design and analysis options for case-cohort studies.
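As a rough sketch of the weighting idea such software automates (this is not the CaseCohortCoxSurvival API, and the numbers are hypothetical): in a simple unstratified case-cohort analysis, cases carry weight 1 and non-case subcohort members are up-weighted by the inverse of the subcohort sampling fraction, so that the weighted phase-two sample stands in for the full cohort.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p_sub = 10_000, 0.05                 # cohort size and subcohort sampling fraction

is_case = rng.random(n) < 0.02          # observed failures over follow-up
in_subcohort = rng.random(n) < p_sub    # phase-two random sample (the subcohort)

# Phase-two covariate data are available only for cases and subcohort members.
phase2 = is_case | in_subcohort

# Horvitz-Thompson-style design weights for the phase-two sample:
# cases are included with probability 1; non-cases only via the subcohort (p_sub).
w = np.where(is_case, 1.0, 1.0 / p_sub)[phase2]

# The weighted phase-two sample should approximately reconstruct the cohort size.
print(round(w.sum()), "vs cohort of", n)
```

In a full analysis these weights would enter a weighted Cox partial likelihood, with the variance calculation accounting for the two-phase sampling, which is exactly the part the robust-variance discussion in the abstract concerns.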


Subject(s)
Proportional Hazards Models , Humans , Cohort Studies , Software , Calibration , Body Weight , Computer Simulation
8.
Front Oncol ; 14: 1306255, 2024.
Article in English | MEDLINE | ID: mdl-38571507

ABSTRACT

Objective: To assess the effectiveness and clinical value of the case-cohort design and to determine prognostic factors of breast cancer patients in Xinjiang on the basis of a case-cohort design. Methods: Survival data with different sample characteristics were simulated using Cox proportional hazards models. The effectiveness of the case-cohort, entire-cohort, and simple random sampling designs was evaluated by comparing the mean, coefficient of variation, etc., of the covariate parameter estimates. Furthermore, the prognostic factors of breast cancer patients in Xinjiang were determined based on case-cohort sampling designs. The models were comprehensively evaluated by the likelihood ratio test, the area under the receiver operating characteristic curve (AUC), and the Akaike Information Criterion (AIC). Results: In the simulation study, the case-cohort design showed better stability and improved estimation efficiency when the censoring rate was high. In the breast cancer data, molecular subtype, T stage, N stage, M stage, type of surgery, and postoperative chemotherapy were identified as prognostic factors of patients in Xinjiang. The models based on the different sampling designs all passed the likelihood ratio test (p<0.05). Moreover, the model constructed under the case-cohort design had a better fit (AIC=3,999.96) and better discrimination (AUC=0.807). Conclusion: The simulation study confirmed the effectiveness of the case-cohort design, and prognostic factors of breast cancer patients in Xinjiang were further determined based on this design, demonstrating the practicality of the case-cohort design with actual data.

9.
Diseases ; 12(2)2024 Feb 06.
Article in English | MEDLINE | ID: mdl-38391779

ABSTRACT

About half of the world's population is at risk of dengue infection. Epidemics of dengue fever have caused an increased risk of morbidity and mortality in recent years, which has led to the exploration of vaccines as a preventive measure. This systematic review and meta-analysis aimed to evaluate the efficacy, immune response, and safety of dengue vaccines in children by analyzing clinical trials. The review followed standard procedures for data extraction using PRISMA guidelines and searched multiple databases, including PubMed, CINAHL, Medline, Health Source, Science Direct, and Academic Search Premier. Eligible studies involved children (0-17 years old). Quality was assessed using the Cochrane Collaboration criteria, while data synthesis was conducted using thematic analysis and meta-analysis. Among the 38 selected studies, dengue vaccines showed varying efficacy against all four serotypes. The CYD-TDV (Dengvaxia®) and Takeda (TAK-003) vaccines showed strong protection against severe dengue, but their long-term efficacy varied. Vaccines triggered satisfactory immune responses, notably in those previously exposed to dengue. Safety profiles were mostly favorable, noting mild adverse events post-vaccination. Meta-analysis supported vaccine efficacy and immune response, but safety concerns warrant further exploration. In conclusion, dengue vaccines showed promising efficacy and immune response, particularly against severe manifestations.

10.
Biometrics ; 80(1)2024 Jan 29.
Article in English | MEDLINE | ID: mdl-38281769

ABSTRACT

The case-cohort study design provides a cost-effective study design for a large cohort study with competing risk outcomes. The proportional subdistribution hazards model is widely used to estimate direct covariate effects on the cumulative incidence function for competing risk data. In biomedical studies, left truncation often occurs and brings extra challenges to the analysis. Existing inverse probability weighting methods for case-cohort studies with competing risk data not only have not addressed left truncation, but also are inefficient in regression parameter estimation for fully observed covariates. We propose an augmented inverse probability-weighted estimating equation for left-truncated competing risk data to address these limitations of the current literature. We further propose a more efficient estimator when extra information from the other causes is available. The proposed estimators are consistent and asymptotically normally distributed. Simulation studies show that the proposed estimator is unbiased and leads to estimation efficiency gain in the regression parameter estimation. We analyze the Atherosclerosis Risk in Communities study data using the proposed methods.


Subject(s)
Cohort Studies , Humans , Proportional Hazards Models , Probability , Computer Simulation , Incidence
11.
J Epidemiol ; 34(1): 38-40, 2024 Jan 05.
Article in English | MEDLINE | ID: mdl-36642515

ABSTRACT

BACKGROUND: The logistic regression analysis proposed by Schouten et al (Stat Med. 1993;12:1733-1745) has been a standard method in the statistical analysis of case-cohort studies; it enables effective estimation of risk ratios from selected subsamples, with adjustment for potential confounding factors. Schouten et al (1993) also proposed that the standard error of the risk ratio estimator can be calculated using the robust variance estimator, and this method has been widely adopted. METHODS AND RESULTS: The robust variance estimator does not account for the duplication of case and subcohort samples and is generally biased (i.e., inaccurate confidence intervals and P-values may be obtained). To address this invalid-inference problem, we provide an alternative, valid bootstrap-based variance estimator. In simulation studies, the bootstrap method consistently provided more precise confidence intervals than the robust variance method, while retaining adequate coverage probabilities. CONCLUSION: The robust variance estimator is biased, and inadequate conclusions might be drawn from the resulting statistical analyses. The proposed bootstrap variance estimator provides more accurate and precise interval estimates and would be an effective alternative approach in practice.
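The bootstrap idea can be sketched as follows, on illustrative data rather than the paper's case-cohort setting: resample subjects with replacement, re-estimate the log risk ratio each time, and take the empirical standard deviation of the re-estimates as the standard error.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative exposed/unexposed groups with binary disease outcomes.
exposed = rng.random(400) < 0.15
unexposed = rng.random(800) < 0.08

def log_rr(e, u):
    """Log risk ratio: log of the ratio of observed risks."""
    return np.log(e.mean() / u.mean())

# Nonparametric bootstrap: resample each group with replacement.
boot = []
for _ in range(2000):
    e = rng.choice(exposed, size=exposed.size, replace=True)
    u = rng.choice(unexposed, size=unexposed.size, replace=True)
    boot.append(log_rr(e, u))

se = np.std(boot, ddof=1)               # bootstrap SE on the log scale
est = log_rr(exposed, unexposed)
lo, hi = np.exp(est - 1.96 * se), np.exp(est + 1.96 * se)
print(f"RR {np.exp(est):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

In the actual case-cohort setting the resampling would need to respect the two-phase sampling (resampling cases and subcohort members separately), which is the refinement the paper's estimator supplies.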


Subject(s)
Models, Statistical , Humans , Logistic Models , Japan , Computer Simulation , Bias , Probability
12.
Psychol Med ; 54(7): 1391-1402, 2024 May.
Article in English | MEDLINE | ID: mdl-37980927

ABSTRACT

BACKGROUND: This longitudinal register study aimed to investigate the association between gambling disorder (GD) and work disability and to map work disability in subgroups of individuals with GD, three years before and three years after diagnosis. METHODS: We included individuals aged 19-62 diagnosed with GD between 2005 and 2018 (n = 2830; 71.1% men, mean age: 35.1) and a matched comparison cohort (n = 28 300). Work disability was operationalized as the aggregated net days of sickness absence and disability pension. Generalized estimating equation models were used to calculate adjusted odds ratios (AORs) and 95% confidence intervals (CIs) for the risk of long-term work disability (>90 days of work disability/year). Secondly, we fitted group-based trajectory models on days of work disability. RESULTS: Individuals with GD showed a four-year increased risk of long-term work disability compared to the matched cohort, peaking at the time of diagnosis (AOR = 1.89; CI 1.67-2.13). Four trajectory groups of work disability days were identified: constant low (60.3%, 5.6-11.2 days), low and increasing (11.4%, 11.8-152.5 days), medium-high and decreasing (11.1%, 65.1-110 days), and constant high (17.1%, 264-331 days). Individuals who were female, older, had a prior psychiatric diagnosis, or had been dispensed psychotropic medication, particularly antidepressants, were more likely to be assigned to groups other than the constant low. CONCLUSION: Individuals with GD have an increased risk of work disability, which may add financial and social pressure and is an additional incentive for earlier detection and prevention of GD.


Subject(s)
Disabled Persons , Gambling , Male , Female , Humans , Adult , Cohort Studies , Sweden/epidemiology , Gambling/epidemiology , Longitudinal Studies , Pensions , Sick Leave
13.
Am J Epidemiol ; 2023 Dec 05.
Article in English | MEDLINE | ID: mdl-38055633

ABSTRACT

Studies have highlighted the potential importance of modeling interactions for suicide attempt prediction. This case-cohort study identified risk factors for suicide attempts among persons with depression in Denmark using statistical approaches that do (random forests) or do not (least absolute shrinkage and selection operator [LASSO] regression) model interactions. Cases made a non-fatal suicide attempt (n = 6,032) between 1995 and 2015. The comparison subcohort was a 5% random sample of all persons in Denmark on January 1, 1995 (n = 11,963). We used random forests and LASSO for sex-stratified prediction of suicide attempts from demographic variables, psychiatric and somatic diagnoses, and treatments. Poisonings, psychiatric disorders, and medications were important predictors for both sexes. Area under the receiver operating characteristic curve (AUC) values were higher in LASSO models (0.85 [95% CI = 0.84, 0.86] in men; 0.89 [95% CI = 0.88, 0.90] in women) than in random forest models (0.76 [95% CI = 0.74, 0.78] in men; 0.79 [95% CI = 0.78, 0.81] in women). Automatic detection of interactions via random forests did not yield better model performance than LASSO models without interactions. Given the complex nature of psychiatric comorbidity and suicide, modeling interactions may not always be the optimal statistical approach to enhancing suicide attempt prediction in high-risk samples.

14.
BMC Med Res Methodol ; 23(1): 287, 2023 12 07.
Article in English | MEDLINE | ID: mdl-38062377

ABSTRACT

BACKGROUND: Case-cohort studies are conducted within cohort studies, with the defining feature that collection of exposure data is limited to a subset of the cohort, leading to a large proportion of missing data by design. Standard analysis uses inverse probability weighting (IPW) to address this intended missing data, but little research has been conducted into how best to perform analysis when there is also unintended missingness. Multiple imputation (MI) has become a default standard for handling unintended missingness and is typically used in combination with IPW to handle the intended missingness due to the case-control sampling. Alternatively, MI could be used to handle both the intended and unintended missingness. While the performance of an MI-only approach has been investigated in the context of a case-cohort study with a time-to-event outcome, it is unclear how this approach performs with a binary outcome. METHODS: We conducted a simulation study to assess and compare the performance of approaches using only MI, only IPW, and a combination of MI and IPW, for handling intended and unintended missingness in the case-cohort setting. We also applied the approaches to a case study. RESULTS: Our results show that the combined approach is approximately unbiased for estimation of the exposure effect when the sample size is large, and was the least biased with small sample sizes, while MI-only and IPW-only exhibited larger biases in both sample size settings. CONCLUSIONS: These findings suggest that a combined MI/IPW approach should be preferred to handle intended and unintended missing data in case-cohort studies with binary outcomes.


Subject(s)
Cohort Studies , Humans , Data Interpretation, Statistical , Probability , Bias , Computer Simulation
15.
Cancers (Basel) ; 15(23)2023 Dec 03.
Article in English | MEDLINE | ID: mdl-38067398

ABSTRACT

Recent studies have shed light on alterations to the proinflammatory tumor microenvironment as a significant carcinogenic mechanism. Despite previous studies on associations between proinflammatory cytokines and lung cancer risk, few studies have been conducted in Asian populations. This study aimed to investigate associations between proinflammatory cytokines and lung cancer risk, considering histological types, in the Korean general population. We carried out a case-cohort study in the Korean National Cancer Center Community (KNCCC) cohort (lung cancer cases: 136, subcohort: 822). Pre-diagnostic serum levels of proinflammatory cytokines (i.e., IL-6, TNF-α, IL-1β, IFN-γ, and IL-10) were measured using Quantikine® ELISA. A Cox proportional hazards regression analysis was conducted. In this study, serum levels of IL-6, IL-1β, and IFN-γ were associated with lung cancer risk. IL-6 was associated with lung cancer regardless of the histological type. IL-1β was associated only with adenocarcinoma, while IFN-γ was associated only with squamous-cell carcinoma. This study shows associations between serum levels of IL-6, IL-1β, and IFN-γ and lung cancer risk, underscoring the potential of these cytokines to act as risk biomarkers. The use of these biomarkers for risk prediction may facilitate identification of the high-risk population.

16.
Cureus ; 15(9): e46238, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37908950

ABSTRACT

BACKGROUND: The neutrophil-to-lymphocyte ratio (NLR) has been studied as an indicator of systemic inflammation and as a prognostic tool in multiple areas of medicine. Previous research has suggested that higher NLR and a rapid increase to peak NLR are associated with poorer outcomes in patients with coronavirus disease 2019 (COVID-19), particularly in those experiencing acute respiratory distress syndrome (ARDS). Within vascular surgery, there are data to suggest a positive correlation between elevated pre-extracorporeal membrane oxygenation (ECMO) NLR and higher rates of mortality following major procedures. This study explores the prognostic value of peri-ECMO NLR in patients requiring veno-venous ECMO (VV-ECMO) therapy for COVID-19-related ARDS. The objective was to explore the utility of pre-ECMO NLR as an easily accessible prognostic factor for patients suffering from COVID-19-associated ARDS who require VV-ECMO. METHODS: This was a retrospective cohort study within a tertiary care hospital conducted between April 2020 and January 2021. Patients requiring VV-ECMO therapy for COVID-19-associated ARDS were included. Peri-ECMO NLR values, length of stay (LOS), duration on VV-ECMO, and discharge status were recorded. Receiver operating characteristic (ROC) curve analysis and Youden's J statistic were used to calculate cut-off values for pre-ECMO and on-ECMO NLR. Kaplan-Meier curves were generated for the two groups of patients above and below the NLR cutoff thresholds, for both pre-ECMO and on-ECMO NLR. A two-sample t-test was performed to test for significant differences in LOS and duration on VV-ECMO. RESULTS: Twenty-six patients were included in the study for final analyses. There was an overall mortality of 39% (n = 10). ROC curve analysis and Youden's J statistic revealed an optimal cut-off value of pre-ECMO NLR = 11.005 and on-ECMO NLR = 17.616.
Results showed that the patient group placed on VV-ECMO with a pre-ECMO NLR less than 11.005 experienced no mortality (n = 7) and a median LOS of 28 days (IQR = 14.5-64.5 days). The patient group on VV-ECMO with a pre-ECMO NLR greater than 11.005 (n = 19) included all mortality (n = 10) and had a median LOS of 49 days (IQR = 25.5-63.5 days). The patient group with on-ECMO NLR less than 17.616 also conferred a survival advantage. There was no significant difference in LOS or duration on VV-ECMO between the two groups, pre-ECMO or on-ECMO. CONCLUSIONS:  A pre-ECMO NLR cutoff was identified and offered statistically significant prognostic value in predicting mortality. A lower on-ECMO NLR value also indicated a survival advantage. Future studies should include NLR within multivariate models to better discern the effect of NLR and elucidate how it can be factored into clinical decision-making. Importantly, this data can be expanded to assess the predictive value of NLR pertaining to the COVID-19-induced ARDS population and matched cohorts.
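The cutoff-selection step can be sketched with synthetic NLR values (not the study's data): Youden's J picks the candidate threshold that maximizes sensitivity + specificity - 1.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic NLR values: non-survivors skewed higher than survivors.
nlr_survived = rng.lognormal(mean=2.0, sigma=0.5, size=16)
nlr_died = rng.lognormal(mean=2.9, sigma=0.5, size=10)

values = np.concatenate([nlr_survived, nlr_died])
died = np.concatenate([np.zeros(16, bool), np.ones(10, bool)])

def youden_cutoff(x, y):
    """Return the threshold maximizing J = sensitivity + specificity - 1."""
    best_j, best_t = -1.0, None
    for t in np.sort(np.unique(x)):
        pred = x >= t                               # classify "high NLR" at threshold t
        sens = (pred & y).sum() / y.sum()
        spec = (~pred & ~y).sum() / (~y).sum()
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

cutoff, j = youden_cutoff(values, died)
print(f"optimal cutoff {cutoff:.2f} (J = {j:.2f})")
```

With only 26 subjects, as in this study, such a cutoff is highly sample-dependent, which is consistent with the authors' call for validation in multivariate models and matched cohorts.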

17.
Environ Int ; 181: 108269, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37866238

ABSTRACT

BACKGROUND: Limited evidence suggests that antimony induces vascular inflammation and oxidative stress and may play a role in cardiovascular disease (CVD) risk. However, few studies have examined whether environmental antimony from sources other than tobacco smoking is related to CVD risk. The general population may be exposed through air, drinking water, and food containing antimony from natural and anthropogenic sources, such as mining, coal combustion, and manufacturing. OBJECTIVES: To examine the association of urine antimony with incident acute myocardial infarction (AMI), heart failure, and stroke among people who never smoked tobacco. METHODS: Between 1993 and 1997, the Danish Diet, Cancer and Health (DCH) cohort enrolled participants ages 50-64 years, including 19,394 who reported never smoking at baseline. Among these never smokers, we identified incident cases of AMI (N = 809), heart failure (N = 958), and stroke (N = 534) using the Danish National Patient Registry. We also randomly selected a subcohort of 600 men and 600 women. We quantified urine antimony concentrations in samples provided at enrollment. We used modified Cox proportional hazards models to estimate adjusted hazard ratios (HRs) for each incident CVD outcome in relation to urine antimony, adjusted for creatinine. We used a separate prospective cohort, the San Luis Valley Diabetes Study (SLVDS), to replicate these results. RESULTS: In the DCH cohort, urine antimony concentrations were positively associated with rates of AMI and heart failure (HR = 1.52; 95% CI = 1.12, 2.08 and HR = 1.58; 95% CI = 1.15, 2.18, respectively, comparing participants in the highest (>0.09 µg/L) with the lowest (<0.02 µg/L) quartile of antimony). In the SLVDS cohort, urinary antimony was positively associated with AMI, but not heart failure.
DISCUSSION: Among this sample of Danish people who never smoked, we found that urine antimony, even at low levels, was associated with incident CVD. These results were partially replicated in a smaller US cohort.
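The "modified Cox proportional hazards models" referenced above are the standard workaround for case-cohort data, where covariates exist only for cases and a random subcohort. One common scheme (Barlow-style inverse-sampling-fraction weights) can be sketched as follows; this is an illustrative assumption, since the abstract does not specify the exact weighting, and the function name and data are hypothetical.

```python
# Sketch of Barlow-style weights for a case-cohort Cox analysis.
# Hypothetical; the abstract does not detail the exact weighting used.

def case_cohort_weights(is_case, in_subcohort, sampling_fraction):
    """Cases get weight 1; subcohort non-cases get 1/sampling_fraction so the
    subcohort stands in for the full cohort; everyone else is excluded (None)
    because their covariates (here, urine antimony) were never measured."""
    weights = []
    for case, sub in zip(is_case, in_subcohort):
        if case:
            weights.append(1.0)                      # all cases contribute
        elif sub:
            weights.append(1.0 / sampling_fraction)  # subcohort represents the cohort
        else:
            weights.append(None)                     # unsampled: no assay data
    return weights

# Hypothetical example with a 10% subcohort sampling fraction
w = case_cohort_weights([1, 0, 0, 0], [0, 1, 1, 0], 0.10)
# w == [1.0, 10.0, 10.0, None]
```

The resulting weights would then be passed to a weighted Cox fitter (e.g., R's `survival::cch` or lifelines with `weights_col`).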


Subject(s)
Cardiovascular Diseases , Heart Failure , Myocardial Infarction , Stroke , Female , Humans , Male , Antimony , Cardiovascular Diseases/epidemiology , Cohort Studies , Denmark/epidemiology , Myocardial Infarction/epidemiology , Non-Smokers , Risk Factors , Stroke/epidemiology , Prospective Studies
18.
BMC Med Res Methodol ; 23(1): 119, 2023 05 19.
Article in English | MEDLINE | ID: mdl-37208600

ABSTRACT

BACKGROUND: Sub-cohort sampling designs such as the case-cohort study play a key role in studying biomarker-disease associations due to their cost effectiveness. Time-to-event outcomes are often the focus in cohort studies, and the research goal is to assess the association between event risk and risk factors. In this paper, we propose a novel goodness-of-fit two-phase sampling design for time-to-event outcomes when some covariates (e.g., biomarkers) can only be measured on a subgroup of study subjects. METHODS: Assuming that an external model is available to relate the outcome and complete covariates, which can be a well-established risk model such as the Gail model for breast cancer, the Gleason score for prostate cancer, or the Framingham risk model for heart disease, or can be built from preliminary data, we propose to oversample subjects with worse goodness-of-fit (GOF) based on the external survival model and the observed time-to-event. With the cases and controls sampled using the GOF two-phase design, the inverse sampling probability weighting method is used to estimate the log hazard ratios of both incomplete and complete covariates. We conducted extensive simulations to evaluate the efficiency gain of our proposed GOF two-phase sampling designs over case-cohort study designs. RESULTS: Through extensive simulations based on a dataset from the New York University Women's Health Study, we showed that the proposed GOF two-phase sampling designs were unbiased and generally had higher efficiency compared to standard case-cohort study designs. CONCLUSION: In cohort studies with rare outcomes, an important design question is how to select informative subjects to reduce sampling costs while maintaining statistical efficiency. Our proposed goodness-of-fit two-phase design provides an efficient alternative to standard case-cohort designs for assessing the association between time-to-event outcomes and risk factors. This method is conveniently implemented in standard software.
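The inverse-probability-weighted estimation step above can be sketched for a single covariate with no tied event times: a Newton-Raphson fit of the weighted Cox partial likelihood. This is a minimal illustration under stated assumptions, not the authors' implementation, and the data are hypothetical.

```python
import math

def weighted_cox_loghr(times, events, z, w, iters=50):
    """Newton-Raphson fit of a one-covariate weighted Cox model (no tied
    event times). Each subject's contribution to the score and information
    is scaled by its sampling weight w. Returns the estimated log HR."""
    beta, n = 0.0, len(times)
    for _ in range(iters):
        score, info = 0.0, 0.0
        for i in range(n):
            if not events[i]:
                continue
            risk = [j for j in range(n) if times[j] >= times[i]]  # risk set at t_i
            s0 = sum(w[j] * math.exp(beta * z[j]) for j in risk)
            s1 = sum(w[j] * z[j] * math.exp(beta * z[j]) for j in risk)
            s2 = sum(w[j] * z[j] ** 2 * math.exp(beta * z[j]) for j in risk)
            score += w[i] * (z[i] - s1 / s0)
            info += w[i] * (s2 / s0 - (s1 / s0) ** 2)
        if info == 0:
            break
        step = score / info
        beta += step
        if abs(step) < 1e-10:
            break
    return beta

# Hypothetical data: four subjects, all with events, distinct times, unit weights
beta = weighted_cox_loghr([1, 2, 3, 4], [1, 1, 1, 1], [1, 0, 1, 0], [1.0] * 4)
# beta ≈ 0.94 (log hazard ratio for z = 1 vs z = 0)
```

With unit weights this reduces to the ordinary Cox partial likelihood; under the GOF design the weights would be the reciprocals of each subject's phase-two sampling probability.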


Subject(s)
Breast Neoplasms , Male , Humans , Female , Cohort Studies , New York , Universities , Women's Health , Biomarkers
19.
J Comput Biol ; 30(6): 663-677, 2023 06.
Article in English | MEDLINE | ID: mdl-37140454

ABSTRACT

This study develops a sure joint feature screening method for the case-cohort design with ultrahigh-dimensional covariates. Our method is based on a sparsity-restricted Cox proportional hazards model. An iterative reweighted hard thresholding algorithm is proposed to approximate the sparsity-restricted, pseudo-partial likelihood estimator for joint screening. We rigorously show that our method possesses the sure screening property, with the probability of retaining all relevant covariates tending to 1 as the sample size goes to infinity. Our simulation results demonstrate that the proposed procedure has substantially improved screening performance over some existing feature screening methods for the case-cohort design, especially when some covariates are jointly correlated, but marginally uncorrelated, with the event time outcome. A real data illustration is provided using breast cancer data with high-dimensional genomic covariates. We have implemented the proposed method using MATLAB and made it available to readers through GitHub.
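The core of the iterative hard thresholding idea above — take a gradient step, then keep only the k largest-magnitude coefficients — can be illustrated on sparse least squares, as a simple stand-in for the paper's Cox pseudo-partial likelihood (which needs considerably more machinery). The orthonormal design below makes the correct answer transparent; the function name and data are illustrative, not the authors' code.

```python
# Sketch of iterative hard thresholding (IHT) for k-sparse least squares.
# Stand-in for the sparsity-restricted pseudo-partial likelihood in the paper.

def iht_sparse_lsq(X, y, k, step, iters=100):
    """Minimize 0.5 * ||y - X @ beta||^2 subject to beta having at most k
    nonzero entries: gradient step, then hard-threshold to the top k."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        resid = [y[i] - sum(X[i][j] * beta[j] for j in range(p)) for i in range(n)]
        grad = [-sum(X[i][j] * resid[i] for i in range(n)) for j in range(p)]
        beta = [beta[j] - step * grad[j] for j in range(p)]
        # keep only the k largest-magnitude coefficients
        keep = set(sorted(range(p), key=lambda j: abs(beta[j]), reverse=True)[:k])
        beta = [beta[j] if j in keep else 0.0 for j in range(p)]
    return beta

# Orthonormal (identity) design so the sparse solution is obvious:
X = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
y = [2.0, 0.0, -3.0, 0.0]               # generated by true beta = [2, 0, -3, 0]
beta = iht_sparse_lsq(X, y, k=2, step=1.0)
# beta == [2.0, 0.0, -3.0, 0.0]: the two relevant covariates are retained
```

For screening, the indices surviving the threshold (here 0 and 2) are the retained covariates; the sure screening property says this set contains all truly relevant covariates with probability tending to 1.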


Subject(s)
Algorithms , Breast Neoplasms , Humans , Female , Proportional Hazards Models , Computer Simulation , Breast Neoplasms/diagnosis , Breast Neoplasms/genetics
20.
Gastroenterology ; 165(2): 483-491.e7, 2023 08.
Article in English | MEDLINE | ID: mdl-37146913

ABSTRACT

BACKGROUND & AIMS: Because post-polypectomy surveillance uses a growing proportion of colonoscopy capacity, more targeted surveillance is warranted. We therefore compared surveillance burden and cancer detection using 3 different adenoma classification systems. METHODS: In a case-cohort study among individuals who had adenomas removed between 1993 and 2007, we included 675 individuals with colorectal cancer (cases) diagnosed a median of 5.6 years after adenoma removal and 906 randomly selected individuals (subcohort). We compared colorectal cancer incidence among high- and low-risk individuals defined according to the traditional (high-risk: diameter ≥10 mm, high-grade dysplasia, villous growth pattern, or 3 or more adenomas), European Society of Gastrointestinal Endoscopy (ESGE) 2020 (high-risk: diameter ≥10 mm, high-grade dysplasia, or 5 or more adenomas), and novel (high-risk: diameter ≥20 mm or high-grade dysplasia) classification systems. For each classification system, we calculated the number of individuals recommended for frequent surveillance colonoscopy and estimated the number of delayed cancer diagnoses. RESULTS: Four hundred and thirty individuals with adenomas (52.7%) were high risk based on the traditional classification, 369 (45.2%) were high risk based on the ESGE 2020 classification, and 220 (27.0%) were high risk based on the novel classification. Using the traditional, ESGE 2020, and novel classifications, the colorectal cancer incidences per 100,000 person-years were 479, 552, and 690 among high-risk individuals, and 123, 124, and 179 among low-risk individuals, respectively. Compared with the traditional classification, the number of individuals who needed frequent surveillance was reduced by 13.9% and 44.2% with the ESGE 2020 and novel classifications, respectively, while 1 (3.4%) and 7 (24.1%) cancer diagnoses were delayed.
CONCLUSIONS: Using the ESGE 2020 and novel risk classifications will substantially reduce resources needed for colonoscopy surveillance after adenoma removal.
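The two quantities compared above have a simple arithmetic form: incidence per 100,000 person-years and the fractional reduction in surveillance burden. The counts below are hypothetical round numbers, not the study's data.

```python
# Arithmetic behind the comparison: incidence rates and burden reduction.
# Hypothetical counts; not the study's actual figures.

def incidence_per_100k(events, person_years):
    """Incidence rate expressed per 100,000 person-years of follow-up."""
    return events / person_years * 100_000

def relative_reduction(n_old, n_new):
    """Fractional reduction in the number of individuals recommended
    frequent surveillance when switching classification systems."""
    return (n_old - n_new) / n_old

rate = incidence_per_100k(24, 5000)       # 24 cancers over 5,000 person-years -> 480
saved = relative_reduction(500, 430)      # 500 -> 430 high-risk individuals -> 0.14
```

In a case-cohort design the person-years denominator would itself be a weighted estimate from the subcohort rather than a direct count.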


Subject(s)
Adenoma , Colonic Polyps , Colorectal Neoplasms , Humans , Cohort Studies , Adenoma/epidemiology , Colonoscopy , Colorectal Neoplasms/epidemiology , Risk , Risk Factors