2.
Lung Cancer; 152: 58-65, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33352384

ABSTRACT

INTRODUCTION: The relationship between body mass index (BMI) and lung cancer prognosis is heterogeneous. We evaluated the impact of sex, smoking, and race on the relationship between BMI and overall survival (OS) in non-small cell lung cancer (NSCLC). METHODS: Data from 16 individual ILCCO studies were pooled to assess interactions between BMI and self-reported race, smoking status, and sex on OS, using Cox models with interaction terms (adjusted hazard ratios; aHR) and adjusted penalized smoothing spline plots in stratified analyses. RESULTS: Among 20,937 NSCLC patients with BMI values, 47% were female, 14% were never-smokers, and 76% were White. The association between BMI and survival differed by race: compared with normal-BMI patients, being underweight was associated with poor survival among White patients (OS, aHR = 1.66) but not among Black patients (aHR = 1.06; p-interaction = 0.02). Comparing overweight/obese with normal-weight patients, Black patients who were overweight/obese also had relatively better OS than White patients (p-interaction = 0.06). BMI was least associated with survival in Asian patients and never-smokers. Relative to the normal-BMI category, female ever-smokers at the extremes of BMI had worse outcomes than male ever-smokers in both the underweight (p-interaction < 0.001) and obese (p-interaction = 0.004) categories. CONCLUSION: Among White patients, being underweight or obese was associated with worse outcomes in female ever-smokers. These BMI associations were not observed in Asian patients and never-smokers. Black patients had more favorable outcomes at the extremes of BMI than White patients. Body composition in Black patients, and NSCLC subtypes more commonly seen in Asian patients and never-smokers, may account for differences in these BMI-OS relationships.
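For readers who want to see the shape of such an interaction analysis, the sketch below fits a Cox proportional hazards model with a BMI-by-race interaction term. It is a minimal illustration, not the ILCCO analysis: the lifelines package, the column names, and the simulated data are all assumptions.

```python
# Minimal sketch (assumed setup): Cox model with an underweight-by-race interaction,
# fit on simulated data with the lifelines package.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "underweight": rng.integers(0, 2, n),   # vs. normal BMI (reference)
    "black": rng.integers(0, 2, n),         # vs. White (reference)
    "age": rng.normal(65, 8, n),
})
df["underweight_x_black"] = df["underweight"] * df["black"]

# Simulate survival times whose hazard depends on the covariates (illustration only)
linpred = (0.5 * df["underweight"] - 0.4 * df["underweight_x_black"]
           + 0.02 * (df["age"] - 65))
df["os_months"] = rng.exponential(scale=24 * np.exp(-linpred.to_numpy()))
df["died"] = rng.integers(0, 2, n)          # crude random event/censoring indicator

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="died")
# exp(coef) for "underweight_x_black" is the ratio of underweight hazard ratios
# (Black vs. White); its p-value plays the role of the p-interaction above.
print(cph.summary[["exp(coef)", "p"]])
```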

3.
Hypertension; 77(1): 94-102, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33190561

ABSTRACT

Since 2003, US hypertension guidelines have recommended ACE (angiotensin-converting enzyme) inhibitors or ARBs (angiotensin receptor blockers) as first-line antihypertensive therapy in the presence of albuminuria (urine albumin/creatinine ratio ≥300 mg/g). To examine national trends in guideline-concordant ACE inhibitor/ARB utilization, we studied adults participating in the National Health and Nutrition Examination Surveys 2001 to 2018 with hypertension (defined by self-report of high blood pressure, systolic blood pressure ≥140 mm Hg or diastolic ≥90 mm Hg, or use of antihypertensive medications). Among 20 538 included adults, the prevalence of albuminuria ≥300 mg/g was 2.8% in 2001 to 2006, 2.8% in 2007 to 2012, and 3.2% in 2013 to 2018. Among those with albuminuria ≥300 mg/g, no consistent trends were observed for the proportion receiving ACE inhibitor/ARB treatment from 2001 to 2018 among persons with diabetes, without diabetes, or overall. In 2013 to 2018, ACE inhibitor/ARB usage in the setting of albuminuria ≥300 mg/g was 55.3% (95% CI, 46.8%-63.6%) among adults with diabetes and 33.4% (95% CI, 23.1%-45.5%) among those without diabetes. Based on US population counts, these estimates represent 1.6 million adults with albuminuria ≥300 mg/g currently not receiving ACE inhibitor/ARB therapy, nearly half of whom do not have diabetes. ACE inhibitor/ARB underutilization represents a significant gap in preventive care delivery for adults with hypertension and albuminuria that has not substantially changed over time.

4.
Clin Epidemiol; 12: 1249-1260, 2020.
Article in English | MEDLINE | ID: mdl-33204166

ABSTRACT

Background: Reproducibility of clinical and epidemiologic research is important to generalize findings and has increasingly been scrutinized. A recently published randomized trial, PIVOTAL, evaluated high vs low intravenous iron dosing strategies to manage anemia in hemodialysis patients in the UK. Our objective was to assess the reproducibility of the PIVOTAL trial findings using data from a well-established cohort study, the Dialysis Outcomes and Practice Patterns Study (DOPPS). Methods: To overcome the absence of randomization in the DOPPS, we applied the parametric g-formula, an extension of standardization to longitudinal data. We estimated the effect of a proactive high-dose vs reactive low-dose iron supplementation strategy on all-cause mortality (primary outcome), hemoglobin, two measures of iron concentration (ferritin and TSAT), and erythropoiesis-stimulating agent dose over 12 months of follow-up in 6325 DOPPS patients. Results: Comparing high- vs low-iron dose strategies, the 1-year mortality risk difference was 0.020 (95% CI: 0.008, 0.031) and risk ratio was 1.20 (95% CI: 1.07, 1.33), compared with null 1-year findings in the PIVOTAL trial. Differences in secondary outcomes were directionally consistent but of lesser magnitude than in the PIVOTAL trial. Conclusion: Our findings are somewhat consistent with the recent PIVOTAL trial, with discrepancies potentially attributable to model misspecification and differences between the two study populations. In addition to the importance of our results to nephrologists and hence hemodialysis patients, our analysis illustrates the utility of the parametric g-formula for generalizing results and comparing complex and dynamic treatment strategies using observational data.
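The longitudinal parametric g-formula used in the paper is involved; as a flavor of the standardization idea it extends, here is a single-time-point sketch on simulated data. The outcome model, variable names, and data are assumptions, and the real analysis iterates this over time-varying treatments and confounders.

```python
# Minimal sketch of g-formula-style standardization for one time point, not the
# full longitudinal algorithm used in the paper.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
L = rng.normal(size=n)                               # baseline confounder
A = rng.binomial(1, 1 / (1 + np.exp(-0.5 * L)), n)   # iron strategy depends on L
Y = rng.binomial(1, 1 / (1 + np.exp(-(-2 + 0.3 * A + 0.8 * L))), n)  # death by 1 year
df = pd.DataFrame({"A": A, "L": L, "Y": Y})

# Step 1: fit an outcome model conditional on treatment and confounders
outcome_model = smf.logit("Y ~ A + L", data=df).fit(disp=0)

# Step 2: predict risk for everyone under each fixed strategy, then average
risk_high = outcome_model.predict(df.assign(A=1)).mean()
risk_low = outcome_model.predict(df.assign(A=0)).mean()
print(f"risk difference: {risk_high - risk_low:.3f}, risk ratio: {risk_high / risk_low:.2f}")
```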

5.
Cancer Epidemiol; 69: 101824, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33039726

ABSTRACT

BACKGROUND: Although there is some evidence of positive associations between both the glycemic index (GI) and glycemic load (GL) with cancer risk, the relationships with lung cancer risk remain largely unexplored. We evaluated the associations between GI and GL with lung cancer. METHODS: The analyses were performed using data from a population-based case-control study recruited between 1999 and 2004 in Los Angeles County. Dietary factors were collected from 593 incident lung cancer cases and 1026 controls using a modified food frequency questionnaire. GI and GL were estimated using a food composition table. Adjusted odds ratios (ORs) and 95 % confidence intervals (CI) were estimated using unconditional logistic regression adjusting for potential confounders. RESULTS: Dietary GI was positively associated with lung cancer (OR for upper vs. lower tertile = 1.62; 95 % CI: 1.17, 2.25). For histologic subtypes, positive associations were observed between GI and adenocarcinoma (OR for upper vs. lower tertile = 1.82; 95 % CI: 1.22, 2.70) and small cell carcinoma (OR for upper vs. lower tertile = 2.68; 95 % CI: 1.25, 5.74). No clear association between GL and lung cancer was observed. CONCLUSION: These findings suggest that high dietary GI was associated with increased lung cancer risk, and the positive associations were observed for both lung adenocarcinoma and small cell lung carcinoma. Replication in an independent dataset is merited for a broader interpretation of our results.
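A hedged sketch of the kind of model behind such odds ratios: unconditional logistic regression of case status on glycemic-index tertiles with confounder adjustment. The variable names and the simulated data are assumptions, not the study's analysis file.

```python
# Assumed setup: logistic regression of case status on GI tertiles plus confounders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1600
df = pd.DataFrame({
    "glycemic_index": rng.normal(55, 8, n),
    "age": rng.normal(62, 10, n),
    "pack_years": rng.gamma(2.0, 10.0, n),
})
logit_p = -1.5 + 0.02 * (df["glycemic_index"] - 55) + 0.02 * (df["pack_years"] - 20)
df["case"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)), n)
# Tertiles of GI, lowest tertile as the reference category
df["gi_tertile"] = pd.qcut(df["glycemic_index"], 3, labels=["T1", "T2", "T3"]).astype(str)

model = smf.logit("case ~ C(gi_tertile, Treatment('T1')) + age + pack_years",
                  data=df).fit(disp=0)

# Odds ratios and 95% CIs on the exponentiated scale
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table.filter(like="gi_tertile", axis=0))
```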

6.
Am J Kidney Dis; 2020 Sep 02.
Article in English | MEDLINE | ID: mdl-32890592

ABSTRACT

Kidney disease is a common, complex, costly, and life-limiting condition. Most kidney disease registries or information systems have been limited to single institutions or regions. A national US Department of Veterans Affairs (VA) Renal Information System (VA-REINS) was recently developed. We describe its creation and present key initial findings related to chronic kidney disease (CKD) without kidney replacement therapy (KRT). Data from the VA's Corporate Data Warehouse were processed and linked with national Medicare data for patients with CKD receiving KRT. Operational definitions for VA user, CKD, acute kidney injury, and kidney failure were developed. Among 7 million VA users in fiscal year 2014, CKD was identified using either a strict or liberal operational definition in 1.1 million (16.4%) and 2.5 million (36.3%) veterans, respectively. Most were identified using an estimated glomerular filtration rate laboratory phenotype, some through proteinuria assessment, and very few through International Classification of Diseases, Ninth Revision coding. The VA spent ∼$18 billion for the care of patients with CKD without KRT, most of which was for CKD stage 3, with higher per-patient costs by CKD stage. VA-REINS can be leveraged for disease surveillance, population health management, and improving the quality and value of care, thereby enhancing VA's capacity as a patient-centered learning health system for US veterans.
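Most CKD cases in VA-REINS were identified from an eGFR laboratory phenotype. As an illustration of what such a phenotype can look like, the sketch below computes eGFR with the 2009 CKD-EPI creatinine equation and flags CKD when two estimates below 60 mL/min/1.73 m² occur at least 90 days apart; the abstract does not state which equation or persistence rule VA-REINS used, so both are assumptions here.

```python
# Assumed phenotype: CKD-EPI 2009 creatinine eGFR, CKD = two values < 60 at least
# 90 days apart. Not necessarily the VA-REINS definition.
def ckd_epi_2009(scr_mg_dl: float, age: int, female: bool, black: bool) -> float:
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def has_ckd(labs, age, female, black, min_days_apart=90):
    """labs: list of (day, serum creatinine in mg/dL) pairs sorted by day."""
    low = [day for day, scr in labs if ckd_epi_2009(scr, age, female, black) < 60]
    return bool(low) and (max(low) - min(low) >= min_days_apart)

print(round(ckd_epi_2009(1.4, 68, female=False, black=False), 1))
print(has_ckd([(0, 1.6), (120, 1.7)], age=68, female=False, black=False))
```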

7.
CMAJ; 192(35): E995-E1002, 2020 Aug 31.
Article in English | MEDLINE | ID: mdl-32868271

ABSTRACT

BACKGROUND: Decisions about dialysis for advanced kidney disease are often strongly shaped by sociocultural and system-level factors rather than the priorities and values of individual patients. We examined international variation in the uptake of conservative approaches to the care of patients with advanced kidney disease, in particular discontinuation of dialysis. METHODS: We employed an observational cohort study design using data collected from patients maintained on long-term hemodialysis between 1996 and 2015 in facilities across 12 developed countries participating in the Dialysis Outcomes and Practice Patterns Study (DOPPS). The main outcome was discontinuation of dialysis therapy. We analyzed the association between several patient characteristics and time to dialysis discontinuation by country and phase of study entry. RESULTS: A total of 259 343 DOPPS patients contributed data to the study, of whom 48 519 (18.7%) died during the study period. Of the decedents, 5808 (12.0%) discontinued dialysis before death. Rates of discontinuation were higher within the first few months after initiation of dialysis, among older adults, among those with a greater number of comorbidities and among those living in an institution. After adjustment for age, sex, dialysis duration, diabetes and dialysis era, rates of discontinuation were highest in Canada, the United States and Australia/New Zealand (33.8, 31.4 and 21.5 per 1000/yr, respectively) and lowest in Japan and Italy (< 0.1 per 1000/yr). Crude discontinuation rates were highest in dialysis facilities that were more likely to offer comprehensive conservative renal care to older adults. INTERPRETATION: We found persistent international variation in average rates of dialysis discontinuation not explained by differences in patient case-mix. These differences may reflect physician-, facility- and society-level differences in clinical practice. There may be opportunities for international cross-collaboration to improve support for patients with end-stage renal disease who prefer a more conservative approach.
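The discontinuation rates above are crude rates per 1000 patient-years. A minimal sketch of that calculation, with a log-scale normal-approximation confidence interval and invented counts:

```python
# Crude event rate per 1000 person-years with a log-scale 95% CI.
# Event and person-time counts below are made up for illustration.
import math

def rate_per_1000_py(events: int, person_years: float, z: float = 1.96):
    rate = events / person_years * 1000
    se_log = 1 / math.sqrt(events)   # SE of log(rate) under a Poisson model
    return rate, rate * math.exp(-z * se_log), rate * math.exp(z * se_log)

rate, lo, hi = rate_per_1000_py(events=480, person_years=14_200.0)
print(f"{rate:.1f} per 1000 person-years (95% CI {lo:.1f}-{hi:.1f})")
```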

8.
Kidney Med; 2(3): 286-296, 2020.
Article in English | MEDLINE | ID: mdl-32734248

ABSTRACT

Rationale & Objective: Previous studies of inflammation and anemia management in hemodialysis (HD) patients may be biased due to patient differences. We used a self-matched longitudinal design to test whether new inflammation, defined as an acute increase in C-reactive protein (CRP) level, reduces hemoglobin response to erythropoiesis-stimulating agent (ESA) treatment. Study Design: Self-matched longitudinal design. Setting & Participants: 3,568 new inflammation events, defined as CRP level > 10 mg/L following a 3-month period with CRP level ≤ 5 mg/L, were identified from 12,389 HD patients in the Dialysis Outcomes and Practice Patterns Study (DOPPS) phases 4 to 6 (2009-2018) in 10 countries in which CRP is routinely measured. Predictor: "After" (vs "before") observing a high CRP level. Outcomes: Within-patient changes in hemoglobin level, ESA dose, and ESA hyporesponsiveness (hemoglobin < 10 g/dL and ESA dose > 6,000 [Japan] or >8,000 [Europe] U/wk). Analytical Approach: Linear mixed models and modified Poisson regression. Results: Comparing before with after periods, mean hemoglobin level decreased from 11.2 to 10.9 g/dL (adjusted mean change, -0.26 g/dL), while mean ESA dose increased from 6,320 to 6,960 U/wk (adjusted relative change, 8.4%). The prevalence of ESA hyporesponsiveness increased from 7.6% to 12.3%. Both the unadjusted and adjusted prevalence ratios of ESA hyporesponsiveness were 1.68 (95% CI, 1.48-1.91). These associations were consistent in sensitivity analyses varying CRP thresholds and were stronger when the CRP level increase was sustained over the 3-month after period. Limitations: Residual confounding by unmeasured time-varying risk factors for ESA hyporesponsiveness. Conclusions: In the 3 months after HD patients experienced an increase in CRP levels, hemoglobin levels declined quickly, ESA doses increased, and the prevalence of ESA hyporesponsiveness increased appreciably. Routine CRP measurement could identify inflammation as a cause of worsened anemia. In turn, these findings speak to a potentially important role for anemia therapies that are less susceptible to the effects of inflammation.

9.
Br J Cancer; 123(9): 1456-1463, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32830199

ABSTRACT

BACKGROUND: Alcohol is a well-established risk factor for head and neck cancer (HNC). This study aims to explore the effect of alcohol intensity and duration, as joint continuous exposures, on HNC risk. METHODS: Data from 26 case-control studies in the INHANCE Consortium were used, including never and current drinkers who drank ≤10 drinks/day for ≤54 years (24,234 controls; 4,085 oral cavity, 3,359 oropharyngeal, 983 hypopharyngeal and 3,340 laryngeal cancers). The dose-response relationship between the risk and the joint exposure to drinking intensity and duration was investigated through bivariate regression spline models, adjusting for potential confounders, including tobacco smoking. RESULTS: For all subsites, cancer risk steeply increased with increasing drinks/day, with no appreciable threshold effect at lower intensities. For each intensity level, the risk of oral cavity, hypopharyngeal and laryngeal cancers did not vary according to years of drinking, suggesting no effect of duration. For oropharyngeal cancer, the risk increased with durations up to 28 years, flattening thereafter. The risk peaked at the higher levels of intensity and duration for all subsites (odds ratio = 7.95 for oral cavity, 12.86 for oropharynx, 24.96 for hypopharynx and 6.60 for larynx). CONCLUSIONS: Present results further encourage the reduction of alcohol intensity to mitigate HNC risk.

10.
Clin Kidney J; 13(3): 425-433, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32699623

ABSTRACT

Background: Anemia at hemodialysis (HD) initiation is common. Correcting low hemoglobin (Hgb) before HD initiation may improve survival by avoiding potential harms of chronic anemia, high doses of erythropoiesis-stimulating agents (ESAs) and intravenous (IV) iron in the early HD period, and/or rapid Hgb rise. Methods: We included 4604 incident HD patients from 21 countries in the Dialysis Outcomes and Practice Patterns Study Phases 4-5 (2009-15). Because low Hgb at HD start may reflect comorbidity or ESA hyporesponse, we restricted our analysis to the 80% of patients who achieved Hgb ≥10 g/dL 91-120 days after HD start (Month 4). Results: About 53% of these patients had Hgb <10 g/dL in Month 1 (<30 days after HD start); they were younger with a similar comorbidity profile (versus Hgb ≥10 g/dL). Month 1 Hgb was associated with first-year HD mortality (adjusted hazard ratio for 1 g/dL higher Hgb was 0.89; 95% confidence interval: 0.81-0.97), despite minimal differences in Month 4 Hgb. Patients with lower Hgb in Month 1 received higher doses of ESA, but not IV iron, over the first 3 months of HD. Results were consistent when excluding catheter users or adjusting for IV iron and ESA dose over the first 3 months. Conclusions: Even among patients with Hgb ≥10 g/dL 3 months later, anemia at HD initiation was common and associated with elevated mortality. A more proactive approach to anemia management in advanced chronic kidney disease (CKD) may thus improve survival on HD, though long-term prospective studies of non-dialysis CKD patients are needed.

11.
Am J Kidney Dis; 76(3): 340-349.e1, 2020 Sep.
Article in English | MEDLINE | ID: mdl-32387021

ABSTRACT

RATIONALE & OBJECTIVE: Native Hawaiians and Pacific Islanders (NHPI) have been reported to have the highest rates of incident end-stage kidney disease (ESKD) compared with other races in the United States. However, these estimates were likely biased upward due to the exclusion of nearly half the NHPI population that reports multiple races in the US Census. We sought to estimate the incidence rate of ESKD, including individuals reporting multiple races, and describe the clinical characteristics of incident cases by race and location. STUDY DESIGN: Health care database study. SETTING & PARTICIPANTS: US residents of the 50 states and 3 Pacific Island territories of the United States whose ESKD was recorded in the US Renal Data System (USRDS) between 2007 and 2016, as well as US residents recorded in the 2010 Census. PREDICTORS: Age, sex, race, body mass index, primary cause of ESKD, comorbid conditions, estimated glomerular filtration rate, pre-ESKD nephrology care, and hemoglobin A1c level among ESKD cases. OUTCOME: Initiation of maintenance dialysis or transplantation for kidney failure. ANALYTICAL APPROACH: Crude ESKD incidence rates (cases/person-years) were estimated using both single- and multiple-race reporting. RESULTS: Even after inclusion of multirace reporting, NHPI had the highest ESKD incidence rate among all races in the 50 states (921 [95% CI, 904-938] per million population per year)-2.7 times greater than whites and 1.2 times greater than blacks. Also using multirace reporting, the NHPI ESKD incident rate in the US territories was 941 (95% CI, 895-987) per million population per year. Diabetes was listed as the primary cause of ESKD most frequently for NHPI and American Indians/Alaska Natives. Sensitivity analysis adjusting for age and sex demonstrated greater differences in rates between NHPI and other races. Diabetes was the primary cause of ESKD in 60% of incident NHPI cases. Patients with ESKD living in the territories had received less pre-ESKD nephrology care than had patients living in the 50 states. LIMITATIONS: Different methods of race classification in the USRDS versus the US Census. CONCLUSIONS: NHPI living in the 50 US states and Pacific territories had the highest rates of ESKD incidence compared with other races. Further research and efforts are required to understand the reasons for and define how best to address this racial disparity.


Subjects
Kidney Failure, Chronic/ethnology, Oceanic Ancestry Group/statistics & numerical data, Adult, Aged, Body Mass Index, Comorbidity, Diabetic Nephropathies/ethnology, Female, Glomerular Filtration Rate, Glycated Hemoglobin A/analysis, Hawaii/epidemiology, Humans, Incidence, Male, Middle Aged, Pacific Islands/epidemiology, Socioeconomic Factors, United States/epidemiology
12.
Clin Transl Gastroenterol; 11(4): e00151, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32251017

ABSTRACT

INTRODUCTION: We aimed to estimate the effects of a family history of colorectal cancer (CRC) or esophageal cancer on the risk of Barrett's esophagus (BE) and identify variants in cancer genes that may explain the association. METHODS: Men scheduled for screening colonoscopy were recruited to undergo upper endoscopy. Cases and noncases were screenees with and without BE, respectively. The effects of family histories on BE were estimated with logistic regression, adjusting for the potential confounders. We additionally recruited men recently diagnosed with BE by clinically indicated endoscopies. Banked germline DNA from cases of BE with ≥2 first-degree relatives (FDRs) with CRC and/or an FDR with esophageal cancer underwent next-generation sequencing using a panel of 275 cancer genes. RESULTS: Of the 822 men screened for CRC who underwent upper endoscopy, 70 were newly diagnosed with BE (8.5%). BE was associated with family histories of esophageal cancer (odds ratio = 2.63; 95% confidence interval = 1.07-6.47) and CRC in ≥2 vs 0 FDRs (odds ratio = 3.73; 95% confidence interval = 0.898-15.4). DNA analysis of subjects with both BE and a family history of cancer identified one or more germline variants of interest in genes associated with cancer predisposition in 10 of 14 subjects, including the same novel variant in EPHA5 in 2 unrelated individuals. DISCUSSION: We found an increased risk for BE associated with a family history of esophageal cancer or CRC. Although analysis of germline DNA yielded no clinically actionable findings, discovery of the same EPHA5 variant of uncertain significance in 2 of 14 cases merits additional investigation.

13.
Clin Epidemiol; 12: 235-243, 2020.
Article in English | MEDLINE | ID: mdl-32161503

ABSTRACT

Purpose: Due to complex medical profiles, adults with neurodevelopmental disabilities (NDDs) may have a heightened risk for early development of chronic kidney disease (CKD) and accelerated CKD progression to advanced stages and kidney failure. The purpose of this study was to estimate the incidence rate of advanced CKD for adults with NDDs and compare the incidence rate to adults without NDDs. Patients and Methods: Data were used from the Optum Clinformatics® Data Mart to conduct this retrospective cohort study. The calendar year 2013 was used to identify eligible participants: individuals ≥18 years of age and without advanced CKD. Participants were followed from 01/01/2014 to advanced CKD, loss to follow-up, death, or end of the study period (12/31/2017), whichever came first. Diagnostic, procedure, and diagnosis-related group codes identified NDDs (intellectual disabilities, cerebral palsy, autism spectrum disorders), incident cases of advanced CKD (CKD stages 4+), diabetes, cardiovascular diseases, and hypertension present in the year 2013. Crude incidence rates (IR) of advanced CKD and IR ratios (IRR), comparing adults with vs without NDDs (with 95% CI) were estimated. Then, Cox regression estimated the hazard ratio (HR and 95% CI) for advanced CKD, comparing adults with NDDs to adults without NDDs while adjusting for covariates. Results: Adults with NDDs (n=33,561) had greater crude IR of advanced CKD (IRR=1.32; 95% CI=1.24-1.42) compared to adults without NDDs (n=6.5M). The elevated rate of advanced CKD among adults with NDDs increased after adjusting for demographics (HR=2.19; 95% CI=2.04-2.34) and remained elevated with further adjustment for hypertension and diabetes (HR=2.01; 95% CI=1.87-2.15) plus cardiovascular disease (HR=1.84; 95% CI=1.72-1.97). Stratified analyses showed that the risk of advanced CKD was greater for all NDD subgroups. Conclusion: Study findings suggest that adults with NDDs have a greater risk of advanced CKD than do adults without NDDs, and that difference is not explained by covariates used in our analysis.
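The crude incidence rate ratio reported above can be computed directly from event and person-time counts; a back-of-the-envelope sketch with invented counts follows (the adjusted hazard ratios additionally require the Cox model, which this does not reproduce).

```python
# Crude incidence rate ratio with a log-scale 95% CI. Counts are invented.
import math

def irr(events_1: int, py_1: float, events_0: int, py_0: float, z: float = 1.96):
    ratio = (events_1 / py_1) / (events_0 / py_0)
    se_log = math.sqrt(1 / events_1 + 1 / events_0)   # SE of log(IRR)
    return ratio, ratio * math.exp(-z * se_log), ratio * math.exp(z * se_log)

# Hypothetical counts: adults with NDDs (group 1) vs without NDDs (group 0)
ratio, lo, hi = irr(events_1=950, py_1=120_000.0, events_0=39_000, py_0=6_500_000.0)
print(f"IRR {ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```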

14.
Clin Epidemiol; 12: 51-60, 2020.
Article in English | MEDLINE | ID: mdl-32021471

ABSTRACT

Purpose: Mortality among first-year hemodialysis (HD) patients remains unacceptably high. To address this problem, we estimate the proportions of early HD deaths that are potentially preventable by modifying known risk factors. Methods: We included 15,891 HD patients (within 60 days of starting HD) from 21 countries in the Dialysis Outcomes and Practice Patterns Study (1996-2015), a prospective cohort study. Using Cox regression adjusted for potential confounders, we estimated the fraction of first-year deaths attributable to one or more of twelve modifiable risk factors (the population attributable fraction, AF) identified from the published literature by comparing predicted survival based on risk factors observed vs counterfactually set to reference levels. Results: The highest AFs were for catheter use (22%), albumin <3.5 g/dL (19%), and creatinine <6 mg/dL (12%). AFs were 5%-9% for no pre-HD nephrology care, no residual urine volume, systolic blood pressure <130 or ≥160 mm Hg, phosphorus <3.5 or ≥5.5 mg/dL, hemoglobin <10 or ≥12 g/dL, and white blood cell count >10,000/µL. AFs for ferritin, calcium, and PTH were <3%. Overall, 65% (95% CI: 59%-71%) of deaths were attributable to these 12 risk factors. Additionally, the AF for C-reactive protein >10 mg/L was 21% in facilities where it was routinely measured. Conclusion: A substantial proportion of first-year HD deaths could be prevented by successfully modifying a few risk factors. Highest priorities should be decreasing catheter use and limiting malnutrition/inflammation whenever possible.
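The attributable-fraction calculation described in the Methods compares predicted risk under observed risk factors with predicted risk after setting modifiable factors to their reference levels. The sketch below does this with a logistic model for first-year death on simulated data; the substitution of a logistic model for the paper's Cox-based survival predictions, the variable names, and the data are all simplifying assumptions.

```python
# Assumed setup: attributable fraction from observed vs counterfactual predicted risk.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 5000
df = pd.DataFrame({
    "catheter": rng.binomial(1, 0.3, n),
    "low_albumin": rng.binomial(1, 0.25, n),
    "age": rng.normal(65, 12, n),
})
logit_p = -2.2 + 0.7 * df["catheter"] + 0.6 * df["low_albumin"] + 0.03 * (df["age"] - 65)
df["died_year1"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)), n)

model = smf.logit("died_year1 ~ catheter + low_albumin + age", data=df).fit(disp=0)

risk_observed = model.predict(df).sum()
# Counterfactual: modifiable factors set to their reference levels
risk_reference = model.predict(df.assign(catheter=0, low_albumin=0)).sum()
af = 1 - risk_reference / risk_observed
print(f"attributable fraction for catheter use + low albumin: {af:.0%}")
```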

15.
Br J Cancer; 122(6): 745-748, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31929514

ABSTRACT

High dietary glycaemic index (GI) and glycaemic load (GL) may increase cancer risk. However, limited information was available on GI and/or GL and head and neck cancer (HNC) risk. We conducted a pooled analysis on 8 case-control studies (4081 HNC cases; 7407 controls) from the International Head and Neck Cancer Epidemiology (INHANCE) consortium. We estimated the odds ratios (ORs) and 95% confidence intervals (CIs) of HNC, and its subsites, from fixed- or mixed-effects logistic models including centre-specific quartiles of GI or GL. GI, but not GL, had a weak positive association with HNC (ORQ4 vs. Q1 = 1.16; 95% CI = 1.02-1.31). In subsites, we found a positive association between GI and laryngeal cancer (ORQ4 vs. Q1 = 1.60; 95% CI = 1.30-1.96) and an inverse association between GL and oropharyngeal cancer (ORQ4 vs. Q1 = 0.78; 95% CI = 0.63-0.97). This pooled analysis indicates a modest positive association between GI and HNC, mainly driven by laryngeal cancer.

16.
Am J Epidemiol; 189(4): 330-342, 2020 Apr 02.
Article in English | MEDLINE | ID: mdl-31781743

ABSTRACT

Head and neck cancer (HNC) risk prediction models based on risk factor profiles have not yet been developed. We took advantage of the large database of the International Head and Neck Cancer Epidemiology (INHANCE) Consortium, including 14 US studies from 1981-2010, to develop HNC risk prediction models. Seventy percent of the data were used to develop the risk prediction models; the remaining 30% were used to validate the models. We used competing-risk models to calculate absolute risks. The predictors included age, sex, education, race/ethnicity, alcohol drinking intensity, cigarette smoking duration and intensity, and/or family history of HNC. The 20-year absolute risk of HNC was 7.61% for a 60-year-old woman who smoked more than 20 cigarettes per day for over 20 years, consumed 3 or more alcoholic drinks per day, was a high school graduate, had a family history of HNC, and was non-Hispanic white. The 20-year risk for men with a similar profile was 6.85%. The absolute risks of oropharyngeal and hypopharyngeal cancers were generally lower than those of oral cavity and laryngeal cancers. Statistics for the area under the receiver operating characteristic curve (AUC) were 0.70 or higher, except for oropharyngeal cancer in men. This HNC risk prediction model may be useful in promoting healthier behaviors such as smoking cessation or in aiding persons with a family history of HNC to evaluate their risks.
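The validation step described above holds out 30% of the data and checks discrimination with the AUC. A simplified sketch on simulated data follows; a plain logistic model stands in for the competing-risk absolute-risk models actually used, and the predictors are assumptions.

```python
# Assumed setup: 70/30 split, logistic risk model, AUC on the held-out 30%.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 4000
X = np.column_stack([
    rng.normal(60, 10, n),        # age
    rng.gamma(2.0, 12.0, n),      # cigarette pack-years
    rng.gamma(1.5, 1.0, n),       # alcoholic drinks/day
])
logit_p = -4 + 0.02 * (X[:, 0] - 60) + 0.03 * X[:, 1] + 0.3 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"held-out AUC: {auc:.2f}")
```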


Subjects
Head and Neck Neoplasms/epidemiology, Models, Theoretical, Aged, Case-Control Studies, Female, Humans, Male, Middle Aged, Risk Assessment, United States/epidemiology
18.
Clin Nephrol; 93(1): 113-119, 2020.
Article in English | MEDLINE | ID: mdl-31496516

ABSTRACT

We hypothesized that high incidence rates of end-stage renal disease (ESRD) in certain counties of the U.S. are partly due to patients with a type of ESRD resembling chronic kidney disease of uncertain etiology (CKDu), which has been observed in Central America and other countries. Using data on 338,126 incident ESRD patients from the United States Renal Data System (USRDS) (2011 - 2013) and the Behavioral Risk Factor Surveillance System (BRFSS) Supplement on county-level variables (2006), we describe both patient-level and county-level characteristics in counties with the highest quartile of ESRD incidence rate standardized for age, sex, and race (> 420 cases/million population/year) compared to the rest of the U.S. and two specific "hotspots" of ESRD: the San Joaquin Valley and the Rio Grande Valley. Logistic regression was used to examine characteristics associated with patients who had either missing cause of ESRD or "unknown" listed as the primary cause of ESRD. High incidence rates of ESRD were observed in southern Texas, the Southeast and parts of California (including the San Joaquin Valley area), while low rates were seen in the Northwest and the Mountain Regions. The median crude incidence rate of ESRD was 335 (range 0 - 2,341) new cases per million population per year among counties. Significant predictors of missing/unknown primary cause of ESRD included: older age, white or unknown race, non-Hispanic ethnicity, lack of comorbidities at ESRD onset, lower estimated glomerular filtration rate (eGFR) at initiation, and lack of pre-dialysis care. Large areas of the U.S. have very high rates of ESRD incidence. We cannot confirm that CKDu is present in the U.S. based on this preliminary work. This topic therefore requires further investigation, as many of these patients may well be undocumented immigrants working as farm laborers and therefore not registered in the USRDS.


Subjects
Kidney Failure, Chronic/epidemiology, Female, Humans, Incidence, Logistic Models, Male, Middle Aged, Renal Insufficiency, Chronic/epidemiology, United States/epidemiology
19.
J Neurosurg Pediatr; 1-8, 2019 Nov 01.
Article in English | MEDLINE | ID: mdl-31675690

ABSTRACT

OBJECTIVE: Thickened or fatty filum terminale is an occult lesion that can cause tethered cord syndrome requiring surgical untethering. This study's objectives were to estimate the incidence of tethered fibrofatty filum terminale (TFFT) in a large insured pediatric population, identify predictors of surgery among those TFFT patients, and assess a diagnostic algorithm. METHODS: TFFT was defined according to the ICD-9-CM code for cord tethering (742.59), after excluding codes for diastematomyelia, lipomyelomeningocele, terminal myelocystocele, meningocele, and myelomeningocele. Utilizing the Optum Insight database for 2001-2014, the authors identified pediatric patients (< 21 years) in the US who were diagnosed with a tethered cord and estimated the TFFT incidence rates in that source population and the surgical untethering probability among TFFT patients over the 14-year period. Logistic regression was used to estimate the effects (adjusted OR and 95% CI) of age at diagnosis, sex, Charlson Comorbidity Index (CCI) score, diagnosis of Chiari malformation type I, diagnosis of syrinx, and the probability of surgery by US census region. Lastly, to evaluate their algorithm for identifying TFFT from ICD-9 codes, the authors estimated its positive predictive value (PPV) among 50 children who were diagnosed at their institution and met the ICD-9-CM criteria. RESULTS: There were 3218 diagnoses of TFFT, with 482 of these pediatric patients undergoing tethered cord release during the study period. The estimated incidence rate was 12.0 per 100,000/year (95% CI 11.6-12.4 per 100,000/year). The incidence rate was slightly higher in females than in males (12.7 vs 11.4 per 100,000/year). The probability of surgery in the total pediatric TFFT population was 15.0% (95% CI 13.8%-16.2%) and was greater in children with a syrinx (OR 2.2, 95% CI 1.6-3.0), children 7-11 years of age at diagnosis versus < 1 year (OR 1.5, 95% CI 1.1-2.0), CCI score ≥ 3 versus 0 (OR 2.3, 95% CI 1.4-3.8), and residents of the Western vs Northeastern US (OR 2.3, 95% CI 1.6-3.5). In the authors' own institution's database, the PPV of TFFT was 35/50 (70.0%, 95% CI 57.3%-82.7%) for identifying tethered cord due to fibrofatty filum terminale among childhood positives. CONCLUSIONS: Patients with comorbidities or an associated syrinx showed a higher risk of untethering procedures for TFFT. Also, surgery was appreciably more frequent in the Western US. These findings signify the need for a collaborative prospective cohort study of long-term outcomes for TFFT patients with and without surgery to determine which patients should have surgery.
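The reported PPV of 35/50 (70.0%, 95% CI 57.3%-82.7%) matches a simple Wald interval for a proportion; the sketch below reproduces that calculation (the choice of interval is an inference from the numbers, not stated in the abstract).

```python
# PPV of the ICD-9-based TFFT algorithm with a Wald 95% CI, assumed interval type.
import math

def ppv_wald_ci(true_positives: int, flagged: int, z: float = 1.96):
    p = true_positives / flagged
    half_width = z * math.sqrt(p * (1 - p) / flagged)
    return p, p - half_width, p + half_width

p, lo, hi = ppv_wald_ci(35, 50)
print(f"PPV {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # 70.0% (57.3%-82.7%)
```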

20.
Cancer Epidemiol; 63: 101615, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31586822

ABSTRACT

BACKGROUND: Tobacco use is a well-established risk factor for head and neck cancer (HNC). However, less is known about the potential impact of exposure to tobacco at an early age on HNC risk. METHODS: We analyzed individual-level data on ever tobacco smokers from 27 case-control studies (17,146 HNC cases and 17,449 controls) in the International Head and Neck Cancer Epidemiology (INHANCE) consortium. Adjusted odds ratios (ORs) and 95% confidence intervals (CIs) were estimated using random-effects logistic regression models. RESULTS: Without adjusting for tobacco packyears, we observed that younger age at starting tobacco use was associated with an increased HNC risk for ever smokers (OR<10 years vs. ≥30 years: 1.64, 95% CI: 1.35, 1.97). However, the observed association between age at starting tobacco use and HNC risk became null after adjusting for tobacco packyears (OR<10 years vs. ≥30 years: 0.97, 95% CI: 0.80, 1.19). In the stratified analyses on HNC subsites by tobacco packyears or years since quitting, no difference in the association between age at start and HNC risk was observed. CONCLUSIONS: Results from this pooled analysis suggest that increased HNC risks observed with earlier age at starting tobacco smoking are largely due to longer duration and higher cumulative tobacco exposures.


Subjects
Head and Neck Neoplasms/epidemiology, Head and Neck Neoplasms/etiology, Tobacco/adverse effects, Adult, Age Factors, Aged, Case-Control Studies, Female, Humans, Male, Middle Aged, Risk Factors