Results 1 - 20 of 27
1.
BMC Nephrol ; 25(1): 197, 2024 Jun 17.
Article in English | MEDLINE | ID: mdl-38886636

ABSTRACT

BACKGROUND: Hyperphosphatemia is associated with increased morbidity and mortality in patients with end-stage kidney disease (ESKD). Although clinical and observational studies have demonstrated the effectiveness of sucroferric oxyhydroxide (SO) in controlling serum phosphorus (sP) in ESKD, data on the real-world impact of switching to SO in patients on peritoneal dialysis (PD) are limited. In this retrospective database analysis, we examine the impact of SO on sP management over a 1-year period among PD patients prescribed SO as part of routine clinical care. METHODS: We analyzed de-identified data from adults on PD in Fresenius Kidney Care clinics who were prescribed SO monotherapy between May 2018 and December 2019 as part of routine clinical management. Changes from baseline in sP levels, phosphate binder (PB) pill burden, and laboratory parameters were evaluated during the four consecutive 91-day intervals of SO treatment. RESULTS: The mean age of the 402 patients who completed 1 year of SO was 55.2 years at baseline, and they had been on PD for an average of 19.9 months. SO was initiated with no baseline PB recorded in 36.1% of patients, whereas the remaining 257 patients were switched to SO from sevelamer (39.7%), calcium acetate (30.4%), lanthanum (1.2%), ferric citrate (14.0%), or more than one PB (14.8%). Mean sP at baseline was 6.26 mg/dL. After being prescribed SO, the percentage of patients achieving sP ≤ 5.5 mg/dL increased from 32.1% (baseline) to 46.5-54.0% during the 1-year follow-up, whereas the mean number of PB pills taken per day decreased from 7.7 at baseline (among patients on a baseline PB) to 4.6-5.4. Serum phosphorus and PB pill burden decreased regardless of changes in residual kidney function over the 12-month period. Similar results were observed for the full cohort (976 patients who either completed or discontinued SO during the 1-year follow-up). CONCLUSIONS: Patients on PD who were prescribed SO as part of routine care for phosphorus management experienced significant reductions in sP and PB pills per day and improvements in sP target achievement, suggesting that SO is effective for sP management while concurrently reducing pill burden.
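For readers who want to reproduce this kind of quarterly summary, a minimal sketch follows; it assumes a hypothetical long-format table with columns patient_id, quarter, sp_mg_dl, and pills_per_day, and it is not the study's actual analysis code.

import pandas as pd

# Hypothetical long-format records; one row per patient per interval.
records = pd.DataFrame({
    "patient_id":    [1, 1, 2, 2, 3, 3],
    "quarter":       ["baseline", "Q4", "baseline", "Q4", "baseline", "Q4"],
    "sp_mg_dl":      [6.3, 5.2, 7.1, 5.8, 5.4, 4.9],
    "pills_per_day": [9, 5, 6, 4, 8, 5],
})

# Percent achieving sP <= 5.5 mg/dL and mean daily pill burden, per interval.
summary = records.groupby("quarter").agg(
    pct_at_target=("sp_mg_dl", lambda s: 100 * (s <= 5.5).mean()),
    mean_pills=("pills_per_day", "mean"),
)
print(summary)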


Subject(s)
Ferric Compounds , Hyperphosphatemia , Kidney Failure, Chronic , Peritoneal Dialysis , Phosphorus , Humans , Middle Aged , Male , Retrospective Studies , Female , Ferric Compounds/therapeutic use , Phosphorus/blood , Hyperphosphatemia/drug therapy , Hyperphosphatemia/etiology , Hyperphosphatemia/blood , Kidney Failure, Chronic/therapy , Kidney Failure, Chronic/blood , Follow-Up Studies , Sucrose/therapeutic use , Drug Combinations , Aged , Adult
2.
JAMA Netw Open ; 7(3): e241848, 2024 Mar 04.
Article in English | MEDLINE | ID: mdl-38488798

ABSTRACT

This cross-sectional study uses Surveillance, Epidemiology, and End Results registry data to analyze stage-specific colorectal adenocarcinoma incidence among patients aged 46 to 49 years from 2000 to 2020.


Subject(s)
Adenocarcinoma , Colorectal Neoplasms , Humans , United States/epidemiology , Incidence , Colorectal Neoplasms/epidemiology , Colorectal Neoplasms/pathology , Adenocarcinoma/epidemiology , Adenocarcinoma/pathology
3.
Am J Nephrol ; 55(2): 127-135, 2024.
Article in English | MEDLINE | ID: mdl-38091973

ABSTRACT

INTRODUCTION: Sucroferric oxyhydroxide (SO), a non-calcium, chewable, iron-based phosphate binder (PB), effectively lowers serum phosphorus (sP) concentrations while reducing pill burden relative to other PBs. To date, SO studies have largely examined treatment-experienced, prevalent hemodialysis populations. We aimed to explore the role of first-line SO initiated during the first year of dialysis. METHODS: We retrospectively analyzed deidentified data from adults receiving in-center hemodialysis who were prescribed SO monotherapy within the first year of hemodialysis as part of routine clinical care. All patients continuing SO monotherapy for 12 months were included. Changes from baseline in sP, achievement of sP ≤5.5 and ≤4.5 mg/dL, and other laboratory parameters were analyzed quarterly for 1 year. RESULTS: The overall cohort included 596 patients, 286 of whom had a dialysis vintage ≤3 months. In the 3 months preceding SO initiation, sP rapidly increased (mean increases of 1.02 and 1.65 mg/dL in the overall cohort and incident cohort, respectively). SO treatment was associated with significant decreases in quarterly sP (mean decreases of 0.26-0.36 mg/dL; p < 0.0001 for each quarter and overall). While receiving SO, 55-60% of patients achieved sP ≤5.5 mg/dL and 21-24% achieved sP ≤4.5 mg/dL (p < 0.0001 for each quarter and overall vs. baseline). Daily PB pill burden was approximately 4 pills. Serum calcium concentrations increased and intact parathyroid hormone concentrations decreased during SO treatment (p < 0.0001 vs. baseline). CONCLUSIONS: Among patients on hemodialysis, initiating SO as a first-line PB resulted in significant reductions in sP while maintaining a relatively low PB pill burden.


Subject(s)
Hyperphosphatemia , Phosphorus , Adult , Humans , Hyperphosphatemia/drug therapy , Hyperphosphatemia/etiology , Retrospective Studies , Renal Dialysis/adverse effects , Renal Dialysis/methods , Ferric Compounds/therapeutic use , Sucrose , Phosphates , Drug Combinations
4.
Environ Sci Technol ; 57(23): 8768-8775, 2023 06 13.
Article in English | MEDLINE | ID: mdl-37232460

ABSTRACT

Constant efforts have been devoted to exploring new disinfection byproducts in drinking water causally related to adverse health outcomes. In this study, five halogenated nucleobases were identified as emerging disinfection byproducts in drinking water: 5-chlorouracil, 6-chlorouracil, 2-chloroadenine, 6-chloroguanine, and 5-bromouracil. We developed a solid phase extraction-ultraperformance liquid chromatography-tandem mass spectrometry method with limits of detection (LOD) of 0.04-0.86 ng/L and recoveries of 54-93%. The detection frequency of the five halogenated nucleobases ranged from 73 to 100%, with a maximum concentration of up to 65.3 ng/L in the representative drinking water samples. The cytotoxicity of the five identified halogenated nucleobases in Chinese hamster ovary (CHO-K1) cells varied widely; the cytotoxicity of 2-chloroadenine (IC50 = 9.4 µM) is approximately three times higher than that of the emerging DBP 2,6-dichloro-1,4-benzoquinone (IC50 = 42.4 µM), indicating the significant toxicological risk of halogenated nucleobase-DBPs. To the best of our knowledge, this study reports the analytical method, occurrence, and toxicity of halogenated nucleobase-DBPs for the first time. These findings provide a basis for further research probing the relationship between the mutagenicity of these DBPs and human health risk.
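As a rough illustration of how a method LOD of this kind is commonly estimated (LOD ≈ 3.3 × σ_blank / calibration slope), a short sketch follows; the calibration responses and blank standard deviation below are made-up placeholders, not values from this study.

import numpy as np

# Hypothetical calibration data: spiked concentration vs. instrument response.
conc_ng_per_L = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
peak_area     = np.array([110, 205, 410, 1030, 2050, 4100])

slope, intercept = np.polyfit(conc_ng_per_L, peak_area, 1)

sigma_blank = 2.5          # standard deviation of blank responses (placeholder)
lod = 3.3 * sigma_blank / slope
print(f"estimated LOD ≈ {lod:.2f} ng/L")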


Subject(s)
Disinfectants , Drinking Water , Water Pollutants, Chemical , Water Purification , Cricetinae , Animals , Humans , Disinfection/methods , Drinking Water/analysis , Drinking Water/chemistry , CHO Cells , Halogenation , Cricetulus , Water Pollutants, Chemical/analysis , Water Purification/methods , Disinfectants/analysis
5.
J Hazard Mater ; 452: 131242, 2023 06 15.
Article in English | MEDLINE | ID: mdl-36963195

ABSTRACT

Identification of emerging disinfection byproducts (DBPs) of health relevance is important for uncovering the health risks of drinking water observed in epidemiology studies. In this study, mutagenic chlorinated nucleotides were proposed as potential DBPs in drinking water, and the formation and transformation pathways of these DBPs during chlorination of nucleotides were carefully investigated. A total of eleven chlorinated nucleotides and analogs were provisionally identified as potential DBPs, such as monochloro uridine/cytidine/adenosine acid and dichloro cytidine acid, and the formation mechanisms involved chlorination, decarbonization, hydrolysis, oxidation, and decarboxylation. The active sites of nucleotides that reacted with chlorine were on the aromatic heterocyclic rings of the nucleobases, and the carbon between the two nitrogen atoms in the nucleobases tended to be transformed into a carboxyl group or be eliminated, further forming ring-opening or rearrangement products. Approximately 0.2-4.0% (mol/mol) of these chlorinated nucleotides and analogs ultimately decomposed to small-molecule aliphatic DBPs, primarily haloacetic acids, trichloromethane, and trichloroacetaldehyde. Eight intermediates, particularly chlorinated imino-D-ribose and imino-D-ribose, were tentatively identified in the chlorination of uridine. This study provides the first preliminary evidence indicating the likely occurrence of chlorinated nucleotides and analogs as potential toxicologically relevant DBPs after disinfection of drinking water.


Subject(s)
Disinfectants , Drinking Water , Water Pollutants, Chemical , Water Purification , Disinfection , Drinking Water/chemistry , Disinfectants/analysis , Nucleotides , Ribose , Water Pollutants, Chemical/chemistry , Chlorine/chemistry , Halogenation , Cytidine
6.
Environ Sci Technol ; 57(9): 3581-3589, 2023 03 07.
Article in English | MEDLINE | ID: mdl-36802564

ABSTRACT

Xenobiotics are generally detoxified in organisms through interaction with endogenous molecules, but these interactions may also generate metabolites of increased toxicity. Halobenzoquinones (HBQs), a group of highly toxic emerging disinfection byproducts (DBPs), can be metabolized by reacting with glutathione (GSH) to form various glutathionylated conjugates (SG-HBQs). In this study, the cytotoxicity of HBQs in CHO-K1 cells followed a wave-shaped curve as a function of increasing GSH dosage, which was inconsistent with the commonly recognized progressive detoxification curve. We hypothesized that the formation and cytotoxicity of GSH-mediated HBQ metabolites contribute to this unusual wave-shaped cytotoxicity curve. Results showed that glutathionyl-methoxyl HBQs (SG-MeO-HBQs) were the primary metabolites significantly correlated with the unusual cytotoxicity variation of HBQs. The formation pathway was initiated by stepwise metabolism via hydroxylation and glutathionylation to produce detoxified hydroxyl HBQs (OH-HBQs) and SG-HBQs, followed by methylation to generate SG-MeO-HBQs of potentiated toxicity. To verify that this metabolism also occurs in vivo, SG-HBQs and SG-MeO-HBQs were detected in the liver, kidney, spleen, testis, bladder, and feces of HBQ-exposed mice, with the highest concentrations quantified in the liver. This study supports the view that co-occurring metabolic pathways can act antagonistically, which enhances our understanding of the toxicity and metabolic mechanisms of HBQs.


Subject(s)
Drinking Water , Cricetinae , Animals , Mice , Drinking Water/analysis , Disinfection , Halogenation , Glutathione , Cricetulus
8.
Int Urol Nephrol ; 55(2): 377-387, 2023 Feb.
Article in English | MEDLINE | ID: mdl-35953565

ABSTRACT

OBJECTIVE: Despite the growing number of elderly hemodialysis patients, the influence of age on nutritional parameters, serum phosphorus (sP), and use of phosphate-binder (PB) medications has not been well characterized. We aimed to describe age-related differences in patient characteristics in a large, real-world cohort of maintenance hemodialysis patients, and to examine the impact of age on sP management with sucroferric oxyhydroxide (SO). METHODS: We retrospectively analyzed de-identified data from 2,017 adult, in-center hemodialysis patients who switched from another PB to SO monotherapy as part of routine clinical care. Changes in baseline PB pill burden, sP levels, and nutritional and dialytic clearance parameters were assessed across age groups through 6 months. RESULTS: At baseline, older patients had lower mean sP, serum albumin, and pre-dialysis weights compared with younger patients. Prescription of SO was associated with a 62% increase in the proportion of patients achieving sP ≤ 5.5 mg/dl and a 42% reduction in daily pill burden. The proportion of patients achieving sP ≤ 5.5 mg/dl after transitioning to SO increased by 113%, 96%, 68%, 77%, 61%, 37%, and 40% among those aged 19-29, 30-39, 40-49, 50-59, 60-69, 70-79, and ≥ 80 years, respectively. CONCLUSIONS: Older patients had worse nutritional parameters, lower pill burden, and lower sP at baseline than their younger counterparts. Prescription of SO was associated with improved sP control and reduced pill burden across all ages.


Subject(s)
Hyperphosphatemia , Phosphorus , Adult , Aged , Humans , Hyperphosphatemia/drug therapy , Hyperphosphatemia/etiology , Retrospective Studies , Renal Dialysis , Drug Combinations
9.
Int J Nephrol Renovasc Dis ; 15: 139-149, 2022.
Article in English | MEDLINE | ID: mdl-35431567

ABSTRACT

Purpose: In prior analyses of real-world cohorts of hemodialysis patients switched from one phosphate binder (PB) to sucroferric oxyhydroxide (SO), SO therapy has been associated with improvements in serum phosphorus (sP) and reductions in daily PB pill burden. To characterize how SO initiation patterns have changed over time, we examined the long-term effectiveness of SO in a contemporary (2018-2019) cohort. Patients and Methods: Adult Fresenius Kidney Care hemodialysis patients first prescribed SO monotherapy as part of routine care between May 2018 and May 2019 (N = 1792) were followed for 1 year. All patients received a non-SO PB during a 91-day baseline period before SO prescription. Mean PB pills/day and laboratory parameters were compared before and during SO treatment. Results were divided into consecutive 91-day intervals (Q1-Q4) and analyzed using linear mixed-effects regression and Cochran's Q test. These results were contrasted with findings from a historical (2014-2015) cohort (N = 530). Results: The proportion of patients achieving sP ≤5.5 mg/dl increased after switching to SO (from 27.0% at baseline to 37.8%, 45.1%, 44.7%, and 44.0% at Q1, Q2, Q3, and Q4, respectively; P < 0.0001 for all). The mean daily PB pill burden decreased from a baseline of 7.7 to 4.4, 4.6, 4.8, and 4.9, respectively, across quarters (P < 0.0001 for all). Patients in the contemporary cohort had better sP control (27.0% achieving sP ≤5.5 mg/dl vs 17.7%) and a lower daily PB pill burden (mean 7.7 vs 8.5 pills/day) at baseline than those in the historical cohort. Overall use of active vitamin D was similar between cohorts, although higher use of oral active vitamin D (63.9% vs 15.7%) and lower use of IV active vitamin D (23.4% vs 74.2%) were observed in the contemporary cohort. Conclusion: Despite evolving treatment patterns, switching to SO resulted in improved sP control with fewer pills per day in this contemporary hemodialysis cohort.
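As an illustration of the repeated-measures model named in the Methods, here is a minimal linear mixed-effects sketch using statsmodels with a random intercept per patient; the column names and the small data frame are hypothetical, not the study data.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical repeated measures: serum phosphorus for four patients over three intervals.
df = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "quarter":    [0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1, 2],   # 0 = baseline
    "sp_mg_dl":   [6.8, 5.9, 5.6, 7.2, 6.4, 6.1, 5.9, 5.5, 5.3, 6.5, 6.0, 5.7],
})

# Fixed effect for quarter, random intercept for each patient.
model = smf.mixedlm("sp_mg_dl ~ quarter", df, groups=df["patient_id"])
result = model.fit()
print(result.summary())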

10.
Cancer Epidemiol Biomarkers Prev ; 31(2): 334-341, 2022 02.
Article in English | MEDLINE | ID: mdl-35082122

ABSTRACT

BACKGROUND: Carcinoids, frequently classified as "colorectal cancer," contribute to rising early-onset colorectal cancer (EOCRC) incidence rates (IRs) and have distinct staging distributions compared with adenocarcinomas (the screening target), which are often diagnosed at an advanced stage. Thus, assessing temporal shifts in early-onset distant-stage adenocarcinoma can inform public health efforts. METHODS: 2000-2016 Surveillance, Epidemiology, and End Results (SEER) 18 yearly adenocarcinoma IRs were stratified by stage (in situ, localized, regional, distant), age (20-29, 30-39, 40-49, and 50-54 years), subsite (colorectal, rectal-only, colon-only), and race [non-Hispanic Whites, non-Hispanic Blacks (NHB), Hispanics] in 103,975 patients. Three-year average annual IR changes (pooled 2000-2002 IRs compared with 2014-2016) and cancer stage proportions (percent contribution of each cancer stage) were calculated. RESULTS: Comparing 2000-2002 with 2014-2016, the steepest percent increases are in distant-stage cancers. Colon-only distant adenocarcinoma increased most in 30-39-year-olds (49%, 0.75/100,000→1.12/100,000, P < 0.05). Rectal-only distant-stage increases were steepest in 20-29-year-olds (133%, 0.06/100,000→0.14/100,000, P < 0.05), followed by 30-39-year-olds (97%, 0.39/100,000→0.77/100,000, P < 0.05) and 40-49-year-olds (48%, 1.38/100,000→2.04/100,000, P < 0.05). Distant-stage proportions (2000-2002 to 2014-2016) increased for colon-only and rectal-only subsites in young patients, with the largest increases for rectal-only in 20-29-year-olds (18%→31%) and 30-39-year-olds (20%→29%). By race, distant-stage proportion increases were largest for rectal-only in 20-29-year-old NHBs (0%→46%) and Hispanics (28%→41%). The distant colon proportion increased most in 20-29-year-old NHBs (20%→34%). CONCLUSIONS: The youngest patients show the greatest burdens of distant colorectal adenocarcinoma. Although all races are affected, burdens are higher in NHB and Hispanic subgroups, although case counts remain relatively low. IMPACT: Optimizing earlier screening initiatives and risk-stratifying younger patients by symptoms and family history are critical to counteract rising distant-stage disease.
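As a small worked example of the "3-year average annual IR change" comparison used here, the snippet below takes the percent change between two pooled period rates; the two values are illustrative and simply mirror the rectal-only, distant-stage figures for 20-29-year-olds quoted above.

# Pooled 3-year average annual incidence rates (per 100,000), illustrative values.
pooled_2000_2002 = 0.06   # rectal-only, distant stage, ages 20-29
pooled_2014_2016 = 0.14

pct_change = 100 * (pooled_2014_2016 - pooled_2000_2002) / pooled_2000_2002
print(f"{pct_change:.0f}% change")   # ≈ 133%, matching the reported increase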


Subject(s)
Adenocarcinoma/epidemiology , Colorectal Neoplasms/epidemiology , Adenocarcinoma/diagnosis , Adult , Age Factors , Colorectal Neoplasms/diagnosis , Female , Humans , Male , Mass Screening/standards , Mass Screening/statistics & numerical data , Middle Aged , Neoplasm Staging , Risk Assessment , SEER Program , United States/epidemiology
11.
ASAIO J ; 68(1): 96-102, 2022 01 01.
Article in English | MEDLINE | ID: mdl-34172639

ABSTRACT

There is little research on factors that influence the choice of dialyzer in patients undergoing hemodialysis. In patients at risk for poorer outcomes, including those with hypoalbuminemia, understanding how this choice impacts clinical parameters could inform patient management. The objective of this real-world analysis was to evaluate the use and performance of four single-use (i.e., nonreuse [NR]), high-flux Optiflux dialyzers with varying surface areas (F160NR [1.5 m2], F180NR [1.7 m2], F200NR [1.9 m2], and F250NR [2.5 m2]) in patients (N = 271) with baseline hypoalbuminemia (≤3.5 g/dl) receiving hemodialysis at a medium-sized dialysis organization. Thrice weekly, in-center dialysis was delivered for 6 months without adjustments to the hemodialysis prescription. Larger dialyzers were more frequently used in men, patients with higher body mass indices, and those with diabetes. Increases in serum albumin from baseline (month 1) to month 6 (p < 0.05) were observed with all dialyzer sizes. A mean increase in hemoglobin of 0.31 g/dl was also observed (p < 0.001). Among patients exhibiting increased serum albumin levels (n = 177), reductions in the neutrophil-to-lymphocyte ratio, a marker of inflammation, were observed (mean: 0.90; p < 0.001). These results support the use of high-flux dialyzers in patients with hypoalbuminemia.


Subject(s)
Hypoalbuminemia , Hemoglobins , Humans , Hypoalbuminemia/etiology , Male , Membranes, Artificial , Renal Dialysis/adverse effects , Serum Albumin
12.
Clin Colorectal Cancer ; 21(2): e62-e75, 2022 06.
Article in English | MEDLINE | ID: mdl-34756680

ABSTRACT

BACKGROUND: The National Comprehensive Cancer Network (NCCN) guidelines have recommended tailored chemotherapy for stage III high-risk (T4 and/or N2) and low-risk (T1-T3 and N1) colon cancer since 2018. Studies have investigated the effect of the relative dose intensity (RDI) of FOLFOX on stage III colon cancer survival; however, none has performed a stratified analysis by risk profile. This study aims to identify the optimal FOLFOX RDI for high-risk and low-risk stage III colon cancer patients. METHODS: Data on 407 eligible patients diagnosed with stage III colon cancer in 2011 who received FOLFOX were collected by 8 population-based cancer registries. A multivariable Cox model and a Fine-Gray competing risks model were employed to explore the optimal RDI, defined as the lowest RDI administered without significant differences in either overall or cause-specific mortality. RESULTS: Among the 168 high-risk patients, the optimal RDI cut-off was 70% (HR = 1.59, 95% CI: 0.69-3.66 for overall mortality; HR = 1.24, 95% CI: 0.42-3.64 for cause-specific mortality, for RDI < 70% vs. RDI ≥ 70%). Among the 239 low-risk patients, none of the evaluated cut-offs was associated with significant differences in risk of death between comparison groups. The lowest assessed RDI was 45% (HR = 0.80, 95% CI: 0.24-2.73 for overall mortality; HR = 0.53, 95% CI: 0.06-4.95 for cause-specific mortality, for RDI < 45% vs. RDI ≥ 45%). CONCLUSIONS: For high-risk patients, reducing RDI by less than 30% (i.e., maintaining RDI ≥ 70%) was not associated with a significant increase in the risk of death. For low-risk patients, an RDI as low as 45% did not significantly affect the risk of death.
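To make the cut-off evaluation concrete, here is a hedged sketch (not the study's code) of fitting a Cox proportional hazards model for a dichotomized RDI using the lifelines package; the data frame and its columns (time_months, death, rdi, age) are hypothetical.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical survival data: follow-up time, death indicator, RDI, and age.
df = pd.DataFrame({
    "time_months": [12, 30, 45, 18, 60, 24, 36, 50],
    "death":       [1, 0, 1, 1, 0, 0, 0, 1],
    "rdi":         [0.55, 0.80, 0.95, 0.60, 0.85, 0.65, 0.75, 0.90],
    "age":         [62, 55, 48, 70, 66, 59, 51, 63],
})
df["rdi_below_70"] = (df["rdi"] < 0.70).astype(int)   # candidate cut-off

cph = CoxPHFitter()
cph.fit(df[["time_months", "death", "rdi_below_70", "age"]],
        duration_col="time_months", event_col="death")
cph.print_summary()   # hazard ratio for RDI < 70% vs. >= 70%, adjusted for age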


Subject(s)
Antineoplastic Combined Chemotherapy Protocols , Colonic Neoplasms , Antineoplastic Combined Chemotherapy Protocols/adverse effects , Chemotherapy, Adjuvant , Colonic Neoplasms/pathology , Humans , Neoplasm Staging , Proportional Hazards Models , Retrospective Studies
13.
JAMA Netw Open ; 4(11): e2130433, 2021 11 01.
Article in English | MEDLINE | ID: mdl-34751760

ABSTRACT

Importance: Early-onset colorectal cancer incidence rates are rising faster in White individuals than Black individuals. However, prior National Cancer Institute Surveillance, Epidemiology, and End Results (SEER) racial stratification analyses used smaller SEER 13 databases, combined patients under age 50 years, did not stratify by sex, and did not focus on adenocarcinoma histologic subtypes (screening target). Objective: To perform a race- and sex-stratified adenocarcinoma incidence rate analysis in individuals aged 40 to 49 years using larger SEER 18 databases with expanded race data to better understand the colorectal cancer burden in those at or approaching screening age. Design, Setting, and Participants: This cross-sectional study used 2000 to 2017 SEER 18 annual age-adjusted colorectal cancer incidence rates stratified by anatomic subsite (colon or rectum), adenocarcinoma histology, race (non-Hispanic Black or non-Hispanic White), and sex for individuals aged 40 to 49 years, and yearly annual percent change (APC) incidence rates were calculated. Annual rate ratios (ARRs) between subgroups were determined. Statistical analysis was performed from January to March 2021. Main Outcomes and Measurements: Early-onset colorectal cancer incidence rates, APCs, and ARRs. Results: In this study, a total of 46 728 colorectal cancer cases were identified in 45 429 patients aged 40 to 49 years from 2000 to 2017. Among the 45 429 patients included in this study, 6480 (14.2%) were Black and 27 426 (60.4%) were White; the mean (SD) age was 45.5 (2.8) years. Among White individuals aged 40 to 49 years, colorectal adenocarcinoma incidence rates increased from 19.6 per 100 000 person-years in 2000 to 25.2 per 100 000 person-years in 2017 (APC, 1.6; 95% CI, 1.3 to 1.9). Among Black individuals aged 40 to 49 years, colorectal adenocarcinoma incidence rates were not significantly changed (26.4 per 100 000 person-years in 2000 and 25.8 per 100 000 person-years in 2017 [APC, -0.03; 95% CI, -0.5 to 0.5]). There were no significant differences in ARRs of absolute colorectal incidence rates between White and Black individuals from 2014 to 2017. Rectal-only absolute adenocarcinoma incidence rates in Black and White individuals remained similar from 2000 to 2008 but significantly diverged in 2009. As of 2017, rectal absolute incidence rates were 39% higher among White individuals than among Black individuals with increasing APC (APC, 2.2; 95% CI, 1.6 to 2.8) whereas rectal adenocarcinoma incidence rates among Black individuals were decreasing, although the APC was not statistically significant (APC, -1.4; 95% CI, -2.6 to 0.1). Absolute colonic adenocarcinoma incidence rates remained higher in Black individuals. The study subgroups with the largest divergence in APCs were rectal adenocarcinoma in White vs Black women (APC of 2.2 [95% CI, 1.6 to 2.8] vs APC of -1.7 [95% CI, -3.6 to 0.3], respectively). Conclusions and Relevance: This study found that colorectal adenocarcinoma incidence rates in people aged 40 to 49 years were increasing among White individuals but stabilized among Black individuals with absolute incidence rates becoming equivalent. Absolute rectal adenocarcinoma incidence rates were 39% lower in Black individuals with a widening disparity in rectal cancer between White and Black women. Possible contributors include introduction of a screening threshold of age 45 years in Black individuals in 2008. 
Although the average-risk screening age has now shifted to age 45 years in all racial groups, these data can help motivate real-world implementation of guidelines to maximize screening rates that have historically been suboptimal in younger individuals.
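For context on the APC figures reported above, a brief sketch of the usual calculation follows: fit a linear trend to log(rate) over calendar year and convert the slope, APC = (exp(slope) - 1) × 100. The yearly rates below are illustrative placeholders approximating the White 40-49-year-old trend, not the underlying SEER data.

import numpy as np

years = np.arange(2000, 2018)                       # calendar years 2000-2017
rates = np.linspace(19.6, 25.2, len(years))         # illustrative IRs per 100,000

slope, intercept = np.polyfit(years, np.log(rates), 1)
apc = (np.exp(slope) - 1) * 100
print(f"APC ≈ {apc:.1f}% per year")                 # ≈ 1.5%, similar to the reported 1.6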


Subject(s)
Adenocarcinoma/epidemiology , Black People/statistics & numerical data , Colorectal Neoplasms/epidemiology , White People/statistics & numerical data , Adult , Cross-Sectional Studies , Early Detection of Cancer , Female , Humans , Incidence , Male , Middle Aged , Rectal Neoplasms/diagnosis , Rectal Neoplasms/epidemiology , SEER Program , United States/epidemiology
14.
Ann Intern Med ; 174(2): 157-166, 2021 02.
Article in English | MEDLINE | ID: mdl-33315473

ABSTRACT

BACKGROUND: Early-onset colorectal cancer (EOCRC) incidence rates (IRs) are rising, according to previous cancer registry analyses. However, analysis of histologic subtypes, including adenocarcinoma (the focus of CRC screening and diagnostic testing) and carcinoid tumors (which are classified as "colorectal cancer" in SEER [Surveillance, Epidemiology, and End Results] databases but have a distinct pathogenesis and are managed differently from adenocarcinoma), has not been reported. OBJECTIVE: To assess EOCRC IRs and changes in IRs over time, stratified by histology. DESIGN: Retrospective analysis. SETTING: Yearly IRs according to SEER 18 data from 2000 to 2016 on age-specific colon-only, rectal-only, and combined-site CRC cases, stratified by histology ("overall" CRC [all histologic subtypes], adenocarcinoma, and carcinoid tumors) and age. PATIENTS: 119 624 patients with CRC. MEASUREMENTS: IRs per 100 000 population, changes in 3-year average annual IRs (pooled IRs from 2000 to 2002 vs. those from 2014 to 2016), and annual percentage change (APC) in persons aged 20 to 29, 30 to 39, 40 to 49, and 50 to 54 years. RESULTS: The steepest changes in adenocarcinoma 3-year average annual IRs were for rectal-only cases in persons aged 20 to 29 years (+39% [0.33 to 0.46 per 100 000]; P < 0.050) and 30 to 39 years (+39% [1.92 to 2.66 per 100 000]; P < 0.050) and colon-only cases in those aged 30 to 39 years (+20% [3.30 to 3.97 per 100 000]; P < 0.050). Corresponding APCs were 1.6% (P < 0.050), 2.2% (P < 0.050), and 1.2% (P < 0.050), respectively. In persons aged 40 to 49 years, 3-year average annual IRs increased in both colon-only (+13% [12.21 to 13.85 per 100 000]; P < 0.050) and rectal-only (+16% [7.50 to 8.72 per 100 000]; P < 0.050) subsites. Carcinoid tumors were common, representing approximately 4% to 20% of all colorectal and 8% to 34% of all rectal cancer cases, depending on age group and calendar year. Colon-only carcinoid tumors were rare. Colorectal carcinoid tumor IRs increased more steeply than adenocarcinoma in all age groups, thus affecting the contribution of carcinoid tumors to overall cancer cases over time. These changes were driven by rectal subsites and were most pronounced in persons aged 50 to 54 years, in whom rectal carcinoid tumors increased by 159% (2.36 to 6.10 per 100 000) between 2000 to 2002 and 2014 to 2016, compared with 10% for adenocarcinoma (18.07 to 19.84 per 100 000), ultimately accounting for 22.6% of all rectal cancer cases. LIMITATION: Population-based data. CONCLUSION: These findings underscore the importance of assessing histologic CRC subtypes independently. Doing so may lead to a better understanding of the drivers of temporal changes in overall CRC incidence and a more accurate measurement of outcomes from efforts to reduce adenocarcinoma risk, and can guide future research. PRIMARY FUNDING SOURCE: None.


Subject(s)
Adenocarcinoma/epidemiology , Carcinoid Tumor/epidemiology , Colorectal Neoplasms/epidemiology , Adenocarcinoma/pathology , Adult , Age Factors , Age of Onset , Carcinoid Tumor/pathology , Colonic Neoplasms/epidemiology , Colonic Neoplasms/pathology , Colorectal Neoplasms/pathology , Female , Humans , Incidence , Male , Middle Aged , Rectal Neoplasms/epidemiology , Rectal Neoplasms/pathology , Retrospective Studies , Risk Factors , SEER Program , United States/epidemiology , Young Adult
15.
Int J Nephrol Renovasc Dis ; 14: 475-486, 2021.
Article in English | MEDLINE | ID: mdl-34992426

ABSTRACT

BACKGROUND: It has been proposed that substituting citrate-acidified dialysate (CAD) solutions for acetate-acidified dialysate (AAD) could improve hemodynamics and dialysis tolerance and reduce the requirement for systemic anticoagulation. Citrate chelates ionized calcium, but long-term effects of CAD use during maintenance hemodialysis have not been well studied. While many studies of the effects of CAD on serum calcium and intact parathyroid hormone (iPTH) have been short-term or have been limited by sample size, we aimed to determine if there are any long-term (i.e., 6-month) changes from pre-dialysis iPTH levels when patients are switched from AAD to CAD. METHODS: This retrospective cohort study compared various clinical parameters, including pre-dialysis iPTH and serum calcium as well as single pool Kt/V, from eligible patients who received in-center hemodialysis thrice-weekly in geographically matched CAD (n=3) or AAD clinics (n=12). CAD clinics were defined as clinics converting from AAD to CAD if >85% of the patients were prescribed CAD after implementation of CAD within the clinic. RESULTS: Pre-dialysis iPTH was not significantly different from baseline to 6-month follow-up within either CAD or AAD clinics. Moreover, the mean change from baseline to month 6 in iPTH between patients (n=142) in CAD clinics (-17 pg/mL) and patients (n=671) in AAD clinics (13 pg/mL) was similar (p = 0.24). Likewise, the differences in the mean change in serum calcium concentrations and dialysis adequacy (single pool Kt/V) were not significant between CAD and AAD clinics. For subgroups of patients who were never prescribed cinacalcet or calcium-based phosphate binders, there were no significantly different categorical shifts in iPTH between CAD and AAD clinics. CONCLUSION: Similar trends in single pool Kt/V, iPTH, and serum calcium levels were observed in clinics that switched from AAD to CAD versus the geographically matched AAD clinics. These results support CAD as a potential alternative to AAD in hemodialysis.

16.
Cancer Med ; 9(23): 9150-9159, 2020 12.
Article in English | MEDLINE | ID: mdl-33094553

ABSTRACT

BACKGROUND: Although early-onset colorectal cancer (EOCRC) incidence rates (IRs) are increasing, geographic and intra-racial IR disparities are not well defined. METHODS: A 2000-2015 Surveillance, Epidemiology, and End Results (SEER) program CRC IR analysis (170,434 cases) was performed for ages 30 to 60 in four US regions, 18 individual registries, and metropolitan and nonmetropolitan locations, stratified by race. Analyses were conducted in 1-year and 5-year age increments. RESULTS: Wide US regional EOCRC IR variations exist: for example, age 45 IRs in the South are 26.8/100,000, 36.0% higher than in the West (19.7/100,000; p < 0.0001). Disparities magnify between individual registries: EOCRC IRs in the highest-risk registries were 177-348% (Alaska Natives), 75-200% (Hawaii), 76-128% (Louisiana), and 61-125% (Kentucky) higher than in the lowest-risk registries, depending on age. EOCRC IRs are 18.2-25.6% higher in nonmetropolitan versus metropolitan settings. Wide geographic intra-racial disparities exist. Within the White population, the greatest IR difference (78.8%) was between Kentucky (5.9/100,000) and Los Angeles (3.3/100,000) in 30- to 34-year-olds (p < 0.0001). Within the Black population, the greatest difference (136.2%) was between rural Georgia (30.7/100,000) and California excluding San Francisco-Oakland/San Jose-Monterey/Los Angeles (13.0/100,000) in 40- to 44-year-olds (p = 0.0003). CONCLUSION: Marked geographic EOCRC disparities exist, with disproportionately high IRs in Alaska Natives, Hawaii, and southern registries. Geographic intra-racial disparities are present within White and Black populations. In Blacks, there are disproportionately high EOCRC IRs in rural Georgia. Although vigilance is required in all populations, attention must be paid to these higher-risk populations. Potential interventions include assuring early investigation of symptoms, targeting modifiable risk factors, and utilizing the earlier age-45 screening options supported by some guidelines.


Subject(s)
Colorectal Neoplasms/ethnology , Health Status Disparities , Racial Groups , Adult , Age of Onset , Colorectal Neoplasms/diagnosis , Female , Humans , Incidence , Male , Middle Aged , Race Factors , Risk Assessment , Risk Factors , SEER Program , United States/epidemiology
17.
JAMA Netw Open ; 3(1): e1920407, 2020 01 03.
Article in English | MEDLINE | ID: mdl-32003823

ABSTRACT

Importance: Early-onset colorectal cancer incidence rates among patients aged 45 to 49 years have been considered much lower compared with the rates among patients aged 50 to 54 years, prompting debate about earlier screening benefits at 45 years. However, the observed incidence rates in the Surveillance, Epidemiology, and End Results (SEER) registries may underestimate colorectal cancer case burdens in those younger than 50 years compared with those older than 50 years because average-risk screening is generally not performed to detect preclinical cases of colorectal cancer. Finding steep incidence increases of invasive stage (beyond in situ) cases of colorectal cancer from age 49 to 50 years would be consistent with high rates of preexisting, undetected cancers in younger patients ultimately receiving a diagnosis of colorectal cancer after undergoing screening at 50 years. Objective: To assess the preclinical burden of colorectal cancer by analyzing its incidence in 1-year age increments, focusing on the transition between ages 49 and 50 years. Design, Setting, and Participants: Data from the SEER 18 registries, representing 28% of the US population, were used to conduct a cross-sectional study of colorectal cancer incidence rates from January 1, 2000, to December 31, 2015, in 1-year age increments (ages 30-60 years) stratified by US region (South, West, Northeast, and Midwest), sex, race, disease stage, and tumor location. Statistical analysis was conducted from November 1, 2018, to December 15, 2019. Main Outcomes and Measures: Incidence rates of colorectal cancer. Results: A total of 170 434 cases of colorectal cancer were analyzed among 165 160 patients (92 247 men [55.9%]; mean [SD] age, 51.6 [6.7] years). Steep increases in the incidence of colorectal cancer in the SEER 18 registries were found from 49 to 50 years of age (46.1% increase: 34.9 [95% CI, 34.1-35.8] to 51.0 [95% CI, 50.0-52.1] per 100 000 population). Steep rate increases from 49 to 50 years of age were also seen in all US regions, men and women, white and black populations, and in colon and rectal cancers. The rate ratio incidence increase in the SEER 18 registries from 49 to 50 years of age (1.46 [95% CI, 1.43-1.51]) was significantly higher than earlier 1-year age transitions. Steep rate increases in the SEER 18 registries were found from 49 to 50 years of age in localized-stage (75.9% increase: 11.2 [95% CI, 10.7-11.7] to 19.7 [95% CI, 19.0-20.3] per 100 000) and regional-stage (30.3% increase: 13.2 [95% CI, 12.7-13.8] to 17.2 [95% CI, 16.7-17.8] per 100 000) colorectal cancers. A total of 8799 of the 9474 cases (92.9%) of colorectal cancer in the SEER 18 registries from 2000 to 2015 that were diagnosed among individuals aged 50 years were invasive. Conclusions and Relevance: Steep incidence increases between 49 and 50 years of age are consistent with previously undetected colorectal cancers diagnosed via screening uptake at 50 years. These cancers are not reflected in observed rates of colorectal cancer in the SEER registries among individuals younger than 50 years. Hence, using observed incidence rates from 45 to 49 years of age alone to assess potential outcomes of earlier screening may underestimate cancer prevention benefits.
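To illustrate the rate-ratio comparison between adjacent ages, here is a small sketch that computes an incidence rate ratio with an approximate log-normal 95% CI; the case counts and person-years are hypothetical placeholders chosen only to roughly mirror the reported ratio, not SEER 18 figures.

import math

# Hypothetical counts: colorectal cancer cases and person-years at ages 49 and 50.
cases_49, person_years_49 = 3_500, 10_000_000
cases_50, person_years_50 = 5_100, 10_000_000

rate_49 = cases_49 / person_years_49 * 100_000      # per 100,000
rate_50 = cases_50 / person_years_50 * 100_000
irr = rate_50 / rate_49

se_log_irr = math.sqrt(1 / cases_49 + 1 / cases_50)  # Poisson approximation
low  = irr * math.exp(-1.96 * se_log_irr)
high = irr * math.exp(+1.96 * se_log_irr)
print(f"IRR = {irr:.2f} (95% CI {low:.2f}-{high:.2f})")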


Subject(s)
Colorectal Neoplasms/diagnosis , Colorectal Neoplasms/epidemiology , Early Detection of Cancer/statistics & numerical data , Early Detection of Cancer/trends , Adult , Age Factors , Cross-Sectional Studies , Female , Forecasting , Humans , Incidence , Male , Middle Aged , United States/epidemiology
18.
Anticancer Res ; 38(9): 5253-5260, 2018 Sep.
Article in English | MEDLINE | ID: mdl-30194175

ABSTRACT

BACKGROUND/AIM: Re-excision for positive margin(s) following a lumpectomy for invasive breast cancer is a standard recommendation. However, for elderly women with stage I estrogen receptor-positive (ER+) tumors, who may be at higher surgical risk, it is not known whether radiation therapy without re-excision is adequate. PATIENTS AND METHODS: We evaluated a cohort of 53,950 women aged ≥70 years with stage I, ER+ breast cancer diagnosed between 2004 and 2011 in the National Cancer Data Base who underwent lumpectomy and antihormonal therapy. Patients were divided into four groups: 1) negative margins without radiation (XRT), 2) negative margins with XRT, 3) positive margins without XRT, and 4) positive margins with XRT. Clinicopathological and sociodemographic variables were compared among these groups. Univariable and multivariable analyses were employed. RESULTS: The 5-year overall survival (OS) rates for the groups were as follows: 1) negative margins without XRT, 77.1%; 2) negative margins with XRT, 90.0%; 3) positive margins without XRT, 62.9%; and 4) positive margins with XRT, 86.8% (p<0.0001). Significant predictors (p<0.01) of OS included treatment group, age, income status, facility type, facility location, tumor size, tumor grade, and comorbidities. CONCLUSION: Radiation therapy for positive surgical margins without re-excision may be a viable option for elderly women with stage I, ER+ tumors treated with lumpectomy and hormonal therapy.
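For readers who want to reproduce a 5-year overall survival estimate of this kind, a minimal Kaplan-Meier sketch using lifelines follows; the follow-up times and event indicators are fabricated placeholders, and a real analysis would fit the curve separately for each of the four margin/XRT groups.

import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical follow-up data for one treatment group (months, death indicator).
df = pd.DataFrame({
    "months": [20, 35, 60, 72, 15, 48, 66, 80, 30, 55],
    "died":   [1, 0, 1, 0, 1, 0, 0, 0, 1, 0],
})

kmf = KaplanMeierFitter()
kmf.fit(durations=df["months"], event_observed=df["died"])
print(kmf.predict(60))   # estimated overall survival probability at 60 months (5 years)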


Subject(s)
Breast Neoplasms/radiotherapy , Breast Neoplasms/surgery , Receptors, Estrogen/metabolism , Aged , Aged, 80 and over , Antineoplastic Agents, Hormonal/therapeutic use , Breast Neoplasms/metabolism , Breast Neoplasms/pathology , Female , Humans , Margins of Excision , Mastectomy, Segmental/methods , Neoplasm Staging , Survival Analysis , Treatment Outcome
19.
Clin Transl Gastroenterol ; 9(9): 185, 2018 09 20.
Article in English | MEDLINE | ID: mdl-30237431

ABSTRACT

OBJECTIVE: Although widely recommended, Lynch syndrome (LS) testing with tumor microsatellite instability (MSI) and/or immunohistochemistry (IHC) is infrequently performed in early-onset colorectal cancer (CRC), and in CRC generally. The reasons are poorly understood. Hence, we conducted a national survey focusing on gastroenterologists, as they are frequently the first to diagnose CRC, assessing testing barriers and which specialist is felt to be responsible for ordering MSI/IHC. Additionally, we assessed factors influencing the timing of MSI/IHC ordering; testing on colonoscopy biopsy specimens, as opposed to postoperative surgical specimens, assists decisions on preoperative germline genetic testing and the extent of colonic resection (ECR). METHODS: A 21-question web-based survey was distributed through an American College of Gastroenterology email listing. RESULTS: In total, 509 respondents completed the survey, and 442 confirmed gastroenterologists were analyzed. Only 33.4% felt gastroenterologists were responsible for ordering MSI/IHC; pathologists were believed most responsible (38.6%). Cost, unfamiliarity with interpreting results, and unavailable genetic counseling most commonly prevented routine ordering (33.3%, 29.2%, and 24.9%, respectively). In multivariable analysis, non-academic and rural settings were associated with cost and genetic counseling barriers. Only 46.1% felt MSI/IHC should always be performed on colonoscopy biopsy specimens. Guideline familiarity predicted whether respondents felt surgical resection should be delayed until results returned, given the potential effect on ECR decisions. CONCLUSION: Inconsistency in who is felt to be responsible for ordering MSI/IHC may lead to diffusion of responsibility, preventing consistent testing, including preoperatively. Ensuring that institutional universal testing protocols are in place, with a focus on the timing of testing, can optimize care. Strategies addressing cost barriers and genomic service availability in rural and non-academic settings can enhance testing, and greater emphasis on guideline familiarity is required.


Subject(s)
Colorectal Neoplasms, Hereditary Nonpolyposis/diagnosis , Genetic Testing , Immunohistochemistry , Practice Patterns, Physicians' , Age of Onset , Colorectal Neoplasms, Hereditary Nonpolyposis/genetics , Colorectal Neoplasms, Hereditary Nonpolyposis/surgery , Female , Gastroenterologists , Genetic Testing/economics , Health Care Costs , Humans , Immunohistochemistry/economics , Male , Microsatellite Instability , Pathologists , Physician's Role , Rural Population , Surveys and Questionnaires
20.
Surgery ; 163(6): 1213-1219, 2018 06.
Article in English | MEDLINE | ID: mdl-29525735

ABSTRACT

BACKGROUND: The Cancer and Leukemia Group B 9343 trial demonstrated that postoperative radiation can be safely omitted in women ≥70 years who underwent breast-conserving therapy for clinical stage I (T1N0M0) estrogen receptor-positive breast cancer treated with antihormonal therapy. Whether such results are observed in a real-world population is unknown. Using these hospital-based data, we report the survival outcomes of patients who received adjuvant radiation therapy versus those who did not. METHODS: Using the National Cancer Data Base, we evaluated a cohort of 47,358 women with newly diagnosed breast cancer between 2004 and 2011 who underwent lumpectomy and antihormonal therapy and met the following criteria: age ≥70 years, clinical stage I, estrogen receptor positive, and negative margins. Patients were stratified into 2 groups: (1) radiation therapy and (2) no radiation therapy. Propensity score matching was used to compensate for differences in demographic and clinical characteristics of the patients. Univariable and multivariable survival analyses were employed to determine factors associated with overall survival. RESULTS: The 5-year overall survival after propensity score matching was 87.2% for radiation therapy and 79.4% for no radiation therapy (P < .0001). The median survival time was 113.7 months for radiation therapy and 105.2 months for no radiation therapy. After adjusting for sociodemographic and clinical factors, the risk of death was significantly higher for those not receiving radiation therapy (hazard ratio = 1.66; 95% confidence interval, 1.54-1.79). Other significant adjusted predictors (P < .05) of poorer overall survival were advanced age, treatment at a comprehensive community cancer program, facility location, poorly differentiated tumor, and high comorbidity index. CONCLUSION: Patients who received radiation therapy had better survival outcomes than those who did not, revealing discordance between the results of randomized trials and the real-world setting.
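For readers unfamiliar with the matching step, a simplified sketch of 1:1 nearest-neighbor propensity score matching follows; the covariates, treatment indicator, and data are hypothetical, and a production analysis would add calipers, matching without replacement, and balance diagnostics.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.normal(75, 5, n),        # age
    rng.integers(0, 2, n),       # comorbidity flag
])
treated = rng.integers(0, 2, n)  # 1 = received radiation therapy (hypothetical)

# Step 1: propensity score = P(treatment | covariates) from a logistic model.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: match each treated patient to the untreated patient with the closest score.
control_idx = np.where(treated == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[control_idx].reshape(-1, 1))
_, matches = nn.kneighbors(ps[treated == 1].reshape(-1, 1))
matched_controls = control_idx[matches.ravel()]
print(f"{(treated == 1).sum()} treated patients matched to {len(set(matched_controls))} unique controls")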


Subject(s)
Breast Neoplasms/mortality , Breast Neoplasms/therapy , Radiotherapy, Adjuvant , Age Factors , Aged , Aged, 80 and over , Breast Neoplasms/pathology , Cohort Studies , Female , Humans , Margins of Excision , Mastectomy, Segmental , Neoplasm Staging , Patient Selection , Propensity Score , Survival Analysis , Survival Rate