Results 1-20 of 25
1.
J Hepatol ; 2024 Apr 05.
Article in English | MEDLINE | ID: mdl-38583491

ABSTRACT

BACKGROUND & AIMS: Functional cure (FC) for chronic hepatitis B (CHB) requires finite treatment. Two agents under investigation aimed at achieving FC are the small interfering RNA JNJ-73763989 (JNJ-3989) and the capsid assembly modulator JNJ-56136379 (JNJ-6379; bersacapavir). METHODS: REEF-2, a phase 2b, double-blind, placebo-controlled, randomized study (ClinicalTrials.gov identifier: NCT04129554), enrolled 130 nucleos(t)ide analog (NA)-suppressed, hepatitis B e-antigen (HBeAg)-negative CHB patients who received JNJ-3989 (200 mg subcutaneously every 4 weeks) + JNJ-6379 (250 mg oral daily) + NA (oral daily; active arm) or placebos for JNJ-3989 and JNJ-6379 + active NA (control arm) for 48 weeks, followed by 48 weeks of off-treatment follow-up. RESULTS: At Follow-up Week 24, no patients achieved the primary endpoint of FC (off-treatment hepatitis B surface antigen [HBsAg] seroclearance). No patients achieved FC at Follow-up Week 48. There was a pronounced on-treatment reduction in mean HBsAg from baseline at Week 48 in the active arm versus no decline in the control arm (1.89 vs 0.06 log10 IU/mL; P = 0.001). At Follow-up Week 48, reductions from baseline were >1 log10 IU/mL in 81.5% versus 12.5% of patients in the active and control arms, respectively, and 38/81 (46.9%) patients in the active arm achieved HBsAg <100 IU/mL versus 6/40 (15.0%) patients in the control arm. Off-treatment HBV DNA relapse and alanine aminotransferase (ALT) increases were less frequent in the active arm, with 7/77 (9.1%) and 11/41 (26.8%) patients in the active and control arms, respectively, restarting NA during follow-up. CONCLUSIONS: Finite 48-week treatment with JNJ-3989 + JNJ-6379 + NA resulted in fewer and less severe posttreatment HBV DNA increases and ALT flares, and a higher proportion of patients with off-treatment HBV DNA suppression, with or without HBsAg suppression, but did not result in FC. CLINICALTRIALS.GOV IDENTIFIER: NCT04129554.

2.
Lipids Health Dis ; 22(1): 67, 2023 May 25.
Article in English | MEDLINE | ID: mdl-37231413

ABSTRACT

BACKGROUND: In contrast to guidelines for lipid therapy in other areas, the 2012 Kidney Disease: Improving Global Outcomes (KDIGO) guidelines recommend obtaining a lipid profile upon diagnosis of chronic kidney disease (CKD) and treating all patients older than 50 years, without defining a target for lipid levels. We evaluated multinational practice patterns for lipid management in patients with advanced CKD under nephrology care. METHODS: We analyzed lipid-lowering therapy (LLT), LDL-cholesterol (LDL-C) levels, and nephrologist-specified LDL-C goal upper limits in adult patients with eGFR <60 mL/min from nephrology clinics in Brazil, France, Germany, and the United States (2014-2019). Models were adjusted for CKD stage, country, cardiovascular risk indicators, sex, and age. RESULTS: LLT use differed significantly by country, ranging from 51% in Germany to 61% in the US and France for statin monotherapy (p = 0.002), and from 0.3% in Brazil to 9% in France for ezetimibe with or without statins (p < 0.001). Compared with patients not receiving LLT, LDL-C was lower among treated patients (p < 0.0001) and differed significantly by country (p < 0.0001). At the patient level, LDL-C levels and statin prescription did not vary significantly by CKD stage (p = 0.09 for LDL-C and p = 0.24 for statin use). Between 7% and 23% of untreated patients in each country had LDL-C ≥160 mg/dL. Only 7-17% of nephrologists believed that LDL-C should be <70 mg/dL. CONCLUSION: There is substantial variation in LLT practice patterns across countries but not across CKD stages. Treated patients appear to benefit from LDL-C lowering, yet a substantial proportion of patients with hyperlipidemia under nephrologist care are not receiving treatment.


Subject(s)
Dyslipidemias, Hydroxymethylglutaryl-CoA Reductase Inhibitors, Nephrology, Chronic Renal Insufficiency, Adult, Humans, United States, LDL Cholesterol, Dyslipidemias/epidemiology, Chronic Renal Insufficiency/drug therapy, Treatment Outcome
3.
Clin Kidney J ; 16(1): 176-183, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36726438

ABSTRACT

Background: Hyperkalemia (HK) is a frequent condition in patients with chronic kidney disease (CKD) and is associated with high morbidity and mortality. Patiromer has recently been introduced as a potassium binder. Data on patiromer use in patients with CKD in the real-world setting in Europe are lacking. We describe time to discontinuation and changes in serum potassium levels among German patients with CKD stages 3-5 who started patiromer. Methods: Duration of patiromer use was estimated by a Kaplan-Meier curve, starting at patiromer initiation and censoring for death, dialysis, transplant, or loss to follow-up. Serum potassium levels and renin-angiotensin-aldosterone system inhibitor (RAASi) use are described at baseline and during follow-up, restricted to patients remaining on patiromer. Results: We identified 140 patiromer users in our analysis sample [81% CKD stage 4/5, 83% receiving RAASi, median K+ 5.7 (5.4, 6.3) mmol/L]. Thirty percent of patiromer users had a prior history of polystyrene sulfonate use. Overall, 95% of patiromer users stayed on treatment past 1 month, and 53% continued for over a year. Mean serum potassium levels decreased after patiromer initiation and remained stable under treatment during follow-up (up to 180 days). Among these patients, 73%-82% used RAASi during the periods before and after patiromer initiation, with no obvious trend toward discontinuation. Conclusion: Real-world evidence of patiromer use in Germany shows that, in line with observations from clinical trials, serum potassium decreases and remains lower with long-term patiromer use. Moreover, most patients on patiromer do not discontinue treatment within 1 year of initiation.
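The time-to-discontinuation analysis described above can be outlined with a standard Kaplan-Meier estimator. The snippet below is a minimal sketch, not the study's code; the data frame, column names, and values are hypothetical.

```python
# Minimal Kaplan-Meier sketch of time to patiromer discontinuation,
# censoring at death, dialysis, transplant, or loss to follow-up.
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical data: days from patiromer initiation to discontinuation or
# censoring, plus an event flag (1 = discontinued, 0 = censored).
df = pd.DataFrame({
    "days_on_patiromer": [45, 400, 200, 30, 365, 500],
    "discontinued":      [1,   0,   1,   1,  0,   0],
})

kmf = KaplanMeierFitter()
kmf.fit(df["days_on_patiromer"], event_observed=df["discontinued"])

# Estimated probability of still being on patiromer at 30 and 365 days.
print(kmf.predict([30, 365]))
```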

4.
Kidney Med ; 4(6): 100475, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35637925

ABSTRACT

Rationale & Objective: Some US hemodialysis (HD) facilities switched from oral cinacalcet to intravenous etelcalcetide as the primary calcimimetic therapy to control parathyroid hormone (PTH) levels after the introduction of etelcalcetide in 2017. Although clinical trials have demonstrated the superior efficacy of etelcalcetide versus cinacalcet, evidence comparing real-world effectiveness is lacking. Study Design: Prospective cohort. Setting & Participants: Patients receiving HD enrolled in US Dialysis Outcomes and Practice Patterns Study facilities. Exposure: We classified HD facilities on the basis of whether >75% of calcimimetic users were prescribed etelcalcetide ("etelcalcetide-first") or cinacalcet ("cinacalcet-first") from March-August 2019. Outcomes: PTH, calcium, and phosphorus levels among calcimimetic users, all averaged in the 6 months after the exposure assessment period. Analytical Approach: We used adjusted linear regression to compare outcomes using 2 approaches: (1) cross-sectional comparison of etelcalcetide-first and cinacalcet-first HD facilities; (2) pre-post comparison of HD facilities that switched from cinacalcet-first to etelcalcetide-first using facilities that remained cinacalcet-first as a comparison group. Results: We identified 45 etelcalcetide-first and 67 cinacalcet-first HD facilities; etelcalcetide-first (vs cinacalcet-first) facilities were more likely to be from small or independent dialysis organizations (86% vs 22%) and had higher total calcimimetic use (43% vs 29%) and lower active vitamin D use (66% vs 82%). In the cross-sectional analysis comparing etelcalcetide-first and cinacalcet-first HD facilities, the adjusted mean difference in PTH levels was -115 pg/mL (95% CI, -196 to -34) and the prevalence of a PTH level of >600 pg/mL was lower (prevalence difference, -11.4%; 95% CI, -19.3% to -3.5%). Among facilities that switched to etelcalcetide-first, the mean PTH level decreased from 671 to 484 pg/mL and the prevalence of a PTH level of >600 pg/mL decreased from 39% to 21%. Among facilities that remained cinacalcet-first, the mean PTH level increased from 632 to 698 pg/mL and the prevalence of a PTH level of >600 pg/mL increased from 37% to 43%. The adjusted difference-in-difference between the switch to etelcalcetide-first and the continuation of cinacalcet-first was -169 pg/mL (-249 to -90 pg/mL) for the mean PTH and -14.4% (-22.0% to -6.8%) for a PTH level of >600 pg/mL. We also observed slightly lower serum calcium levels and minimal differences in serum phosphorus levels between the etelcalcetide-first and the cinacalcet-first facilities. Limitations: Residual confounding. Conclusions: We observed better PTH control in HD facilities that switched from using cinacalcet to etelcalcetide as the primary calcimimetic therapy. Further research is needed to investigate how the greater real-world effectiveness of intravenous etelcalcetide (vs oral cinacalcet) may affect clinical outcomes.
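The pre-post comparison with a contemporaneous control group described above is, in form, a difference-in-differences contrast. The sketch below shows one way such a facility-level contrast could be set up with statsmodels; the data frame, column names, and numbers are hypothetical illustrations, not study data.

```python
# Difference-in-differences sketch: facility mean PTH regressed on switch
# status, period, and their interaction (hypothetical facility-level data).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "mean_pth": [671, 484, 632, 698, 655, 470, 640, 690],
    "switched": [1,   1,   0,   0,   1,   1,   0,   0],   # switched to etelcalcetide-first
    "post":     [0,   1,   0,   1,   0,   1,   0,   1],   # 0 = pre period, 1 = post period
})

# The coefficient on the interaction term is the difference-in-differences estimate.
model = smf.ols("mean_pth ~ switched + post + switched:post", data=df).fit()
print(model.params["switched:post"])
```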

5.
Kidney Med ; 4(2): 100395, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35243307

ABSTRACT

RATIONALE & OBJECTIVE: Potential surrogate end points for kidney failure have been proposed in chronic kidney disease (CKD); however, they must be evaluated to ensure accurate, powerful, and harmonized research, particularly among patients with advanced CKD. The aim of the current study was to investigate the power and predictive ability of surrogate kidney failure end points in a population with moderate-to-advanced CKD. STUDY DESIGN: Analysis of longitudinal data of a large multinational CKD observational study (Chronic Kidney Disease Outcomes and Practice Patterns Study). SETTING & PARTICIPANTS: CKD stage 3-5 patients from Brazil, France, Germany, and the United States. OUTCOMES: Reaching an estimated glomerular filtration rate (eGFR) < 15 mL/min/1.73 m2 or eGFR decline of ≥40%, and composite end points of these individual end points. ANALYTICAL APPROACH: Each end point was used as a time-varying indicator in the Cox model to predict the time to kidney replacement therapy (KRT; dialysis or transplant) and was compared by the number of events and prediction accuracy. RESULTS: 8,211 patients had a median baseline eGFR of 27 mL/min/1.73 m2 (interquartile range, 21-36 mL/min/1.73 m2) and 1,448 KRT events over a median follow-up of 2.7 years (interquartile range, 1.2-3.0 years). Among CKD stage 4 patients, the eGFR < 15 mL/min/1.73 m2 end point had higher prognostic ability than 40% eGFR decline, but the end points were similar for CKD stage 3 patients. The combination of eGFR < 15 mL/min/1.73 m2 and 40% eGFR decline had the highest prognostic ability for predicting KRT, regardless of the CKD stage. Including KRT in the composite can increase the number of events and, therefore, the power. LIMITATIONS: Variable visit frequency resulted in variable eGFR measurement frequency. CONCLUSIONS: The composite end point can be useful for CKD progression studies among patients with advanced CKD. Harmonized use of this approach has the potential to accelerate the translation of new discoveries to clinical practice by identifying risk factors and treatments for kidney failure.
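The analytical approach above enters each surrogate endpoint as a time-varying indicator in a Cox model for time to KRT. A minimal sketch of that setup, assuming a hypothetical long-format data frame with one row per patient interval (lifelines is used here only for illustration):

```python
# Cox model with a time-varying surrogate-endpoint indicator predicting
# time to kidney replacement therapy (KRT). Data are hypothetical.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# One row per patient-interval; 'surrogate' switches from 0 to 1 at the
# visit where the surrogate endpoint (e.g., eGFR < 15 or >=40% eGFR decline)
# is first reached. 'krt' flags whether KRT occurred at the interval's end.
long_df = pd.DataFrame({
    "id":        [1, 1, 2, 3, 3, 4],
    "start":     [0, 12, 0, 0, 10, 0],
    "stop":      [12, 20, 24, 10, 36, 30],
    "surrogate": [0, 1, 0, 0, 1, 0],
    "krt":       [0, 1, 1, 0, 0, 0],
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="krt", start_col="start", stop_col="stop")
ctv.print_summary()
```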

6.
Am J Kidney Dis ; 79(3): 362-373, 2022 Mar.
Article in English | MEDLINE | ID: mdl-34273436

ABSTRACT

RATIONALE & OBJECTIVE: Clinical trial data have demonstrated the efficacy of etelcalcetide for reducing parathyroid hormone (PTH) levels in hemodialysis (HD) patients. We provide a real-world summary of etelcalcetide utilization, dosing, effectiveness, and discontinuation since its US introduction in April 2017. STUDY DESIGN: New-user design within a prospective cohort. SETTING & PARTICIPANTS: 2,596 new users of etelcalcetide from April 2017 through August 2019 in a national sample of adult maintenance HD patients in the US Dialysis Outcomes and Practice Patterns Study (DOPPS). PREDICTORS: Baseline PTH, prior cinacalcet use, initial etelcalcetide dose. OUTCOME: Trajectories of etelcalcetide dose, chronic kidney disease-mineral and bone disease (CKD-MBD) medications, and levels of PTH, serum calcium, and phosphorus in the 12 months after etelcalcetide initiation. ANALYTICAL APPROACH: Cumulative incidence methods for etelcalcetide discontinuation and linear generalized estimating equations for trajectory analyses. RESULTS: By August 2019, etelcalcetide prescriptions had increased to 6% of HD patients from first use in April 2017. The starting etelcalcetide dose was 15 mg/wk in 70% of patients and 7.5 mg/wk in 27% of patients; 49% of new users had been prescribed cinacalcet in the prior 3 months. Etelcalcetide discontinuation was 9%, 17%, and 27% by 3, 6, and 12 months after initiation. One year after etelcalcetide initiation, mean PTH levels had declined by 40%, from 948 to 566 pg/mL, and the proportion of patients with PTH within target (150-599 pg/mL) increased from 33% to 64% overall, from 0% to 60% among patients with baseline PTH ≥600 pg/mL, and from 30% to 63% among patients with prior cinacalcet use. The proportion of patients with serum phosphorus >5.5 mg/dL decreased from 55% to 45%, while the prevalence of albumin-corrected serum calcium <7.5 mg/dL remained at 1%-2%. Use of active vitamin D (from 77% to 87%) and calcium-based phosphate binders (from 41% to 50%) increased in the 12 months after etelcalcetide initiation. LIMITATIONS: Data are unavailable on provider dosing protocols, dose holds, or reasons for discontinuation. CONCLUSIONS: In the 12 months after etelcalcetide initiation, patients had large and sustained reductions in PTH levels. These results support the utility of etelcalcetide as an effective therapy for achieving KDIGO-recommended targets for CKD-MBD markers in HD patients.


Subject(s)
Bone Diseases, Chronic Kidney Disease-Mineral and Bone Disorder, Secondary Hyperparathyroidism, Chronic Renal Insufficiency, Adult, Bone Diseases/complications, Calcium, Chronic Kidney Disease-Mineral and Bone Disorder/drug therapy, Chronic Kidney Disease-Mineral and Bone Disorder/etiology, Cohort Studies, Humans, Secondary Hyperparathyroidism/etiology, Minerals, Parathyroid Hormone, Peptides, Prospective Studies, Renal Dialysis/adverse effects, Chronic Renal Insufficiency/complications, Chronic Renal Insufficiency/therapy
7.
J Am Soc Nephrol ; 32(8): 2020-2030, 2021 08.
Article in English | MEDLINE | ID: mdl-34244326

ABSTRACT

BACKGROUND: Approximately 30%-45% of patients with nondialysis CKD have iron deficiency. Iron therapy in CKD has focused primarily on supporting erythropoiesis. In patients with or without anemia, there has not been a comprehensive approach to estimating the association between serum biomarkers of iron stores and the risks of mortality and cardiovascular events. METHODS: The study included 5145 patients from Brazil, France, the United States, and Germany enrolled in the Chronic Kidney Disease Outcomes and Practice Patterns Study, with the first available transferrin saturation (TSAT) and ferritin levels as exposure variables. We used Cox models to estimate hazard ratios (HRs) for all-cause mortality and major adverse cardiovascular events (MACE), with progressive adjustment for potentially confounding variables. We also used linear spline models to further evaluate the functional forms of the exposure-outcome associations. RESULTS: Compared with patients with a TSAT of 26%-35%, those with a TSAT ≤15% had the highest adjusted risks for all-cause mortality and MACE. Spline analysis found the lowest risk at a TSAT of 40% for all-cause mortality and MACE. The risk of all-cause mortality, but not MACE, was also elevated at TSAT ≥46%. Effect estimates were similar after adjustment for hemoglobin. For ferritin, no directional associations were apparent, except for elevated all-cause mortality at ferritin ≥300 ng/mL. CONCLUSIONS: Iron deficiency, as captured by TSAT, is associated with higher risks of all-cause mortality and MACE in patients with nondialysis CKD, with or without anemia. Interventional studies evaluating the effect of iron supplementation and of therapies with alternative targets on clinical outcomes are needed to better inform strategies for administering exogenous iron.
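The linear spline modeling mentioned above can be illustrated with a simple piecewise-linear basis for TSAT entered into a Cox model. The sketch below is a toy example under stated assumptions: the knot locations, simulated data, and column names are all hypothetical.

```python
# Piecewise-linear (linear spline) term for TSAT in a Cox model, allowing the
# log-hazard to change slope at chosen knots. Data are simulated for illustration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def linear_spline_basis(x, knots):
    """Return x plus (x - k)+ for each knot k, as a data frame."""
    cols = {"tsat": x}
    for k in knots:
        cols[f"tsat_gt_{k}"] = np.clip(x - k, 0, None)
    return pd.DataFrame(cols)

rng = np.random.default_rng(0)
tsat = rng.uniform(5, 60, size=200)

df = linear_spline_basis(tsat, knots=[20, 40])
df["years"] = rng.exponential(5, size=200)
df["died"] = rng.integers(0, 2, size=200)

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
# Coefficients: 'tsat' is the log-hazard slope below the first knot; the slope
# in each higher segment adds the corresponding 'tsat_gt_*' coefficient.
print(cph.params_)
```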


Subject(s)
Iron-Deficiency Anemia/blood, Cardiovascular Diseases/epidemiology, Ferritins/blood, Chronic Renal Insufficiency/blood, Transferrin/metabolism, Aged, Aged 80 and over, Iron-Deficiency Anemia/etiology, Biomarkers/blood, Brazil/epidemiology, Female, France/epidemiology, Germany/epidemiology, Humans, Male, Mortality, Proportional Hazards Models, Chronic Renal Insufficiency/complications, Risk Factors, United States/epidemiology
8.
Clin Kidney J ; 14(5): 1436-1442, 2021 May.
Article in English | MEDLINE | ID: mdl-33959272

ABSTRACT

BACKGROUND: Beta-2 microglobulin (β2M) accumulates in hemodialysis (HD) patients, but its consequences are controversial, particularly in the current era of high-flux dialyzers. High-flux HD improves β2M removal, yet β2M and other middle molecules may still contribute to adverse events. We investigated patient factors associated with serum β2M, evaluated trends in β2M levels and in hospitalizations due to dialysis-related amyloidosis (DRA), and estimated the effect of β2M on mortality. METHODS: We studied European and Japanese participants in the Dialysis Outcomes and Practice Patterns Study. Analysis of DRA-related hospitalizations spanned 1998-2018 (n = 23,976), and analysis of β2M and mortality in centers routinely measuring β2M spanned 2011-18 (n = 5332). We evaluated time trends with linear and Poisson regression and mortality with Cox regression. RESULTS: Median β2M changed nonsignificantly from 2.71 to 2.65 mg/dL during 2011-18 (P = 0.87). Patients in the highest β2M tertile (>2.9 mg/dL) had longer dialysis vintage, higher C-reactive protein, and lower urine volume than patients in the lowest tertile (≤2.3 mg/dL). DRA-related hospitalization rates [95% confidence interval (CI)] decreased from 1998 to 2018 from 3.10 (2.55-3.76) to 0.23 (0.13-0.42) per 100 patient-years. Compared with the lowest β2M tertile, adjusted mortality hazard ratios (95% CI) were 1.16 (0.94-1.43) and 1.38 (1.13-1.69) for the middle and highest tertiles. Mortality risk increased monotonically with β2M modeled continuously, with no indication of a threshold. CONCLUSIONS: DRA-related hospitalizations decreased more than 10-fold from 1998 to 2018. Serum β2M remains positively associated with mortality, even in the current high-flux HD era.
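The Poisson regression used for the hospitalization-rate trend can be sketched as a rate model with patient-years as the exposure. The counts, exposures, and column names below are hypothetical stand-ins, not DOPPS data.

```python
# Poisson rate-trend sketch for DRA-related hospitalizations per patient-year.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "year":          [1998, 2003, 2008, 2013, 2018],
    "events":        [93,   60,   35,   15,   7],
    "patient_years": [3000, 3200, 3100, 3300, 3050],
})

model = smf.glm(
    "events ~ year",
    data=df,
    family=sm.families.Poisson(),
    exposure=df["patient_years"],
).fit()

# exp(coefficient on year) approximates the annual rate ratio.
print(np.exp(model.params["year"]))
```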

9.
Clin Kidney J ; 14(3): 820-830, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33777365

ABSTRACT

BACKGROUND: Dialysis adequacy, as measured by single pool Kt/V, is an important parameter for assessing hemodialysis (HD) patients' health. Guidelines have recommended Kt/V of 1.2 as the minimum dose for thrice-weekly HD. We describe Kt/V achievement, its predictors and its relationship with mortality in the Gulf Cooperation Council (GCC) (Bahrain, Kuwait, Oman, Qatar, Saudi Arabia and the United Arab Emirates). METHODS: We analyzed data (2012-18) from the prospective cohort Dialysis Outcomes and Practice Patterns Study for 1544 GCC patients ≥18 years old and on dialysis >180 days. RESULTS: Thirty-four percent of GCC HD patients had low Kt/V (<1.2) versus 5%-17% in Canada, Europe, Japan and the USA. Across the GCC countries, low Kt/V prevalence ranged from 10% to 54%. In multivariable logistic regression, low Kt/V was more common (P < 0.05) with larger body weight and height, being male, shorter treatment time (TT), lower blood flow rate (BFR), greater comorbidity burden and using HD versus hemodiafiltration. In adjusted Cox models, low Kt/V was strongly related to higher mortality in women [hazard ratio (HR) = 1.91, 95% confidence interval (CI) 1.09-3.34] but not in men (HR = 1.16, 95% CI 0.70-1.92). Low BFR (<350 mL/min) and TT (<4 h) were common; 41% of low Kt/V cases were attributable to low BFR or TT (52% for women and 36% for men). CONCLUSION: Relatively large proportions of GCC HD patients have low Kt/V. Increasing BFR to ≥350 mL/min and TT to ≥4 h thrice weekly will reduce low Kt/V prevalence and may improve survival in GCC HD patients-particularly among women.
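For readers unfamiliar with the adequacy metric, single-pool Kt/V is commonly computed with the second-generation Daugirdas formula; the small sketch below uses hypothetical session values to make the Kt/V ≥ 1.2 threshold concrete. The study does not specify which formula its facilities used, so treating this formula as the one applied here is an assumption.

```python
# Second-generation Daugirdas estimate of single-pool Kt/V (illustrative only).
import math

def sp_ktv(pre_bun, post_bun, session_hours, uf_liters, post_weight_kg):
    # R is the post/pre urea (BUN) ratio; UF is ultrafiltration volume in liters.
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * session_hours) + (4 - 3.5 * r) * uf_liters / post_weight_kg

# Hypothetical example: 4-hour session, 70% urea reduction, 2 L ultrafiltration.
print(round(sp_ktv(pre_bun=60, post_bun=18, session_hours=4, uf_liters=2.0, post_weight_kg=70), 2))
```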

10.
Kidney Int Rep ; 6(2): 437-448, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33615069

ABSTRACT

INTRODUCTION: The uptake of the Kidney Disease: Improving Global Outcomes (KDIGO) 2012 chronic kidney disease (CKD) Guideline is not fully described in real-world nephrology practice across the world. METHODS: We used baseline data from the CKD Outcomes and Practice Patterns Study (2013-2017), a 4-country cohort of patients with estimated glomerular filtration rate <60 ml/min per 1.73 m2 recruited from national samples of nephrology clinics, to describe adherence to measures for monitoring and delaying CKD progression. Data were collected as in clinical practice, except laboratory measures per protocol in France. RESULTS: The mean age ranged from 65 years in Brazil to 72 years in Germany. Albuminuria (mostly proteinuria) was measured routinely in 36% to 43% of patients in Brazil, Germany, and the United States. Blood pressure control (≤140/90 mm Hg) ranged from 49% in France to 76% in Brazil; <40% of patients had blood pressure ≤130/80 mm Hg everywhere but Brazil (52%). More than 40% of nephrologists in Brazil reported a systolic blood pressure target ≤130 mm Hg for nondiabetic patients without proteinuria, but only 19% to 24% elsewhere. Prescription of renin-angiotensin aldosterone system inhibitors ranged from 52% in the United States to 81% in Germany. Dietary advice was more frequent for salt than protein intake; dietitian visits were uncommon. In nondiabetic patients, achievement of all 3 targets including blood pressure ≤130/80 mm Hg, renin-angiotensin aldosterone system inhibition, and dietary advice, ranged from 10% in the United States to 32% in Brazil; in treated diabetic patients, this ranged from 6% to 11% after including hemoglobin A1c target. CONCLUSION: Adherence to recommendations to slow CKD progression is low in typical practice settings, and substantial variation among countries for some indicates opportunities for improvement.

11.
Nephrol Dial Transplant ; 36(9): 1694-1703, 2021 08 27.
Article in English | MEDLINE | ID: mdl-33624825

ABSTRACT

BACKGROUND: Iron deficiency (ID) is a common condition in nondialysis-dependent chronic kidney disease (NDD-CKD) patients and is associated with poorer clinical outcomes. However, the effect of ID on health-related quality of life (HRQoL) in this population is unknown. We analyzed data from a multinational cohort of NDD-CKD stage 3-5 patients to test the association of the transferrin saturation (TSAT) index and ferritin with HRQoL. METHODS: Patients from Brazil (n = 205), France (n = 2015) and the USA (n = 293) in the Chronic Kidney Disease Outcomes and Practice Patterns Study (CKDopps, 2013-2019) were included. We evaluated the association of TSAT and ferritin (and of functional and absolute ID, defined as TSAT ≤20% with ferritin ≥300 ng/mL or <50 ng/mL, respectively) with pre-specified HRQoL measures, including the 36-item Kidney Disease Quality of Life physical component summary (PCS) and mental component summary (MCS) as the primary outcomes. Models were adjusted for confounders including hemoglobin (Hb). RESULTS: TSAT ≤15% and ferritin <50 ng/mL and ≥300 ng/mL were associated with worse PCS scores, but not with MCS scores. Patients with the composite of TSAT ≤20% and ferritin <50 or ≥300 ng/mL had lower functional status and worse PCS scores than those with a TSAT of 20%-30% and ferritin of 50-299 ng/mL. Patients with a lower TSAT were less likely to perform intense physical activity. Adjustment for Hb only slightly attenuated the observed effects. CONCLUSIONS: Low TSAT, as well as low TSAT combined with either low or high ferritin, is associated with worse physical HRQoL in NDD-CKD patients, even after accounting for Hb level. Interventional studies of the effect of iron therapy on HRQoL in NDD-CKD patients are needed to confirm these findings.


Subject(s)
Iron-Deficiency Anemia, Anemia, Chronic Renal Insufficiency, Anemia/etiology, Iron-Deficiency Anemia/etiology, Biomarkers, Humans, Iron, Quality of Life, Chronic Renal Insufficiency/therapy
12.
J Ren Nutr ; 30(5): 404-414, 2020 09.
Article in English | MEDLINE | ID: mdl-31980326

ABSTRACT

OBJECTIVE: Conflicting findings and knowledge gaps exist regarding links between anemia, physical activity, health-related quality of life (HRQOL), chronic kidney disease (CKD) progression, and mortality in moderate-to-advanced CKD. Using the CKD Outcomes and Practice Patterns Study, we report associations of hemoglobin (Hgb) with HRQOL and physical activity, and associations of Hgb and physical activity with CKD progression and mortality, in stage 3-5 nondialysis (ND)-CKD patients. DESIGN AND METHODS: Prospectively collected data were analyzed from 2,121 ND-CKD stage 3-5 patients, aged ≥18 years, at 43 nephrologist-run US and Brazil clinics participating in the CKD Outcomes and Practice Patterns Study. Cross-sectional associations of Hgb levels with HRQOL and physical activity levels (from the validated Kidney Disease Quality of Life Instrument and Rapid Assessment of Physical Activity surveys) were assessed. Associations of Hgb and physical activity levels with CKD progression (the first of ≥40% estimated glomerular filtration rate [eGFR] decline, eGFR <10 mL/min/1.73 m2, or end-stage kidney disease) and all-cause mortality were also evaluated. Linear, logistic, and Cox regression analyses were adjusted for country, demographics, smoking, eGFR, serum albumin, very high proteinuria, and 13 comorbidities. RESULTS: HRQOL was worse with severe anemia (Hgb <10 g/dL), and decrements were also evident for mild/moderate anemia (Hgb 10-12 g/dL), relative to Hgb >12 g/dL. The odds of being highly physically active were substantially greater at Hgb >10.5 g/dL. Lower Hgb was strongly associated with greater CKD progression and mortality, even after extensive adjustment. Physical inactivity was strongly associated with greater mortality and weakly associated with CKD progression. Possible residual confounding is a limitation. CONCLUSION: This multicenter international study provides real-world observational evidence of better HRQOL, greater physical activity, slower CKD progression, and greater survival in ND-CKD patients with Hgb levels >12 g/dL, exceeding current treatment guideline recommendations. These findings help inform future studies aimed at understanding the impact of new anemia therapies and physical activity regimens on improving particular dimensions of ND-CKD patient well-being and clinical outcomes.


Subject(s)
Exercise/physiology, Hemoglobins/physiology, Quality of Life, Chronic Renal Insufficiency/mortality, Chronic Renal Insufficiency/physiopathology, Aged, Brazil/epidemiology, Cohort Studies, Disease Progression, Female, Humans, Male, Prospective Studies, United States/epidemiology
13.
Pract Radiat Oncol ; 10(1): e27-e36, 2020.
Article in English | MEDLINE | ID: mdl-31382026

ABSTRACT

PURPOSE: The heart has been identified as a potential significant organ at risk in patients with locally advanced non-small cell lung cancer treated with radiation. Practice patterns and radiation dose delivered to the heart in routine practice in academic and community settings are unknown. METHODS AND MATERIALS: Between 2012 and 2017, 746 patients with stage III non-small cell lung cancer were treated with radiation within the statewide Michigan Radiation Oncology Quality Consortium (MROQC). Cardiac radiation dose was characterized, including mean and those exceeding historical or recently proposed Radiation Therapy Oncology Group and NRG Oncology constraints. Sites were surveyed to determine dose constraints used in practice. Patient-, anatomic-, and treatment-related associations with cardiac dose were analyzed using multivariable regression analysis and inverse probability weighting. RESULTS: Thirty-eight percent of patients had a left-sided primary, and 80% had N2 or N3 disease. Median prescription was 60 Gy (interquartile range, 60-66 Gy). Twenty-two percent of patients were prescribed 60 Gy in 2012, which increased to 62% by 2017 (P < .001). Median mean heart dose was 12 Gy (interquartile range, 5-19 Gy). The volume receiving 30 Gy (V30 Gy) exceeded 50% in 5% of patients, and V40 Gy was >35% in 3% of cases. No heart dose constraint was uniformly applied. Intensity modulated radiation therapy (IMRT) usage increased from 33% in 2012 to 86% in 2017 (P < .001) and was significantly associated with more complex cases (larger planning target volume, higher stage, and preexisting cardiac disease). In multivariable regression analysis, IMRT was associated with a lower percent of the heart receiving V30 Gy (absolute reduction = 3.0%; 95% confidence interval, 0.5%-5.4%) and V50 Gy (absolute reduction = 3.6%; 95% confidence interval, 2.4%-4.8%) but not mean dose. In inverse probability weighting analysis, IMRT was associated with 29% to 48% relative reduction in percent of the heart receiving V40-V60 Gy without increasing lung or esophageal dose or compromising planning target volume coverage. CONCLUSIONS: Within MROQC, historical cardiac constraints were met in most cases, yet 1 in 4 patients received a mean heart dose exceeding 20 Gy. Future work is required to standardize heart dose constraints and to develop treatment approaches that allow for constraints to be met without compromising other planning goals.
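The dose metrics reported above (mean heart dose, V30 Gy, V40 Gy) are simple summaries of the voxel-level heart dose. The sketch below shows that computation on a hypothetical dose array; it is not the MROQC analysis code.

```python
# Computing mean dose and Vx metrics from a per-voxel heart dose array (hypothetical data).
import numpy as np

rng = np.random.default_rng(1)
heart_dose_gy = rng.gamma(shape=2.0, scale=6.0, size=50_000)  # stand-in for a planning-system export

def v_x(dose, threshold_gy):
    """Percent of the structure volume receiving at least threshold_gy."""
    return 100.0 * np.mean(dose >= threshold_gy)

print("mean heart dose (Gy):", round(heart_dose_gy.mean(), 1))
print("V30 Gy (%):", round(v_x(heart_dose_gy, 30), 1))
print("V40 Gy (%):", round(v_x(heart_dose_gy, 40), 1))
```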


Subject(s)
Non-Small-Cell Lung Carcinoma/radiotherapy, Heart/radiation effects, Lung Neoplasms/radiotherapy, Radiation Injuries/prevention & control, Intensity-Modulated Radiotherapy/adverse effects, Age Factors, Aged, Non-Small-Cell Lung Carcinoma/pathology, Radiation Dose-Response Relationship, Female, Humans, Lung Neoplasms/pathology, Male, Michigan/epidemiology, Middle Aged, Neoplasm Staging, Organs at Risk/radiation effects, Practice Guidelines as Topic, Physicians' Practice Patterns/standards, Physicians' Practice Patterns/statistics & numerical data, Radiation Injuries/epidemiology, Radiation Injuries/etiology, Radiation Oncology/standards, Radiation Oncology/statistics & numerical data, Radiotherapy Dosage/standards, Computer-Assisted Radiotherapy Planning/methods, Computer-Assisted Radiotherapy Planning/standards, Intensity-Modulated Radiotherapy/standards, Sex Factors
14.
Adv Radiat Oncol ; 3(4): 662-672, 2018.
Article in English | MEDLINE | ID: mdl-30370368

ABSTRACT

PURPOSE: This study aimed to analyze the potential clinical impact of the differences between planned and accumulated doses on the development and use of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: Thirty patients who were previously treated with stereotactic body radiation therapy for liver cancer and for whom the accumulated dose was computed were assessed retrospectively. The linear quadratic equivalent dose at 2 Gy per fraction and the generalized equivalent uniform dose were calculated for planned and accumulated doses. Stomach and duodenal Lyman-Kutcher-Burman NTCP models (α/β = 2.5; n = 0.09) were developed on the basis of planned and accumulated generalized equivalent uniform doses, and the differences between the models were assessed. In addition, the error in determining the probability of toxicity on the basis of the planned dose was evaluated by comparing planned doses in the NTCP model created from accumulated doses. RESULTS: The standard, planned-dose NTCP model overestimates toxicity risk for both the duodenal and stomach models at doses below approximately 20 Gy (6 fractions) and underestimates toxicity risk at doses above approximately 20 Gy (6 fractions). Building NTCP models with accumulated rather than planned doses changes the predicted risk by up to 16% (mean: 6%; standard deviation: 7%) for duodenal toxicity and 6% (mean: 2%; standard deviation: 2%) for stomach toxicity. For a protocol that plans a 10% iso-toxicity risk to the duodenum, a 15.7 Gy (6 fractions) maximum dose constraint would be necessary when using standard NTCP models based on the planned dose, whereas a 17.6 Gy (6 fractions) maximum dose would be allowed when using NTCP models based on accumulated doses. CONCLUSIONS: Assuming that the accumulated dose is a more accurate representation of the true delivered dose than the planned dose, this simulation study indicates the need for prospective clinical trials to evaluate the impact of building NTCP models on the basis of accumulated doses.
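For context, a Lyman-Kutcher-Burman NTCP calculation combines a generalized equivalent uniform dose (gEUD) with a probit link. The sketch below uses the volume parameter n = 0.09 quoted above, but the DVH bins, TD50, and m are hypothetical illustrations, not the fitted parameters from this study.

```python
# Lyman-Kutcher-Burman NTCP sketch: gEUD from a DVH, then a normal CDF (probit) link.
import numpy as np
from scipy.stats import norm

def geud(doses_gy, volumes, n):
    """Generalized equivalent uniform dose; volumes are fractional and are renormalized to sum to 1."""
    volumes = np.asarray(volumes, dtype=float)
    volumes = volumes / volumes.sum()
    return np.sum(volumes * np.asarray(doses_gy, dtype=float) ** (1.0 / n)) ** n

def lkb_ntcp(doses_gy, volumes, n, td50_gy, m):
    t = (geud(doses_gy, volumes, n) - td50_gy) / (m * td50_gy)
    return norm.cdf(t)

# Hypothetical duodenal DVH bins (Gy) and fractional volumes; TD50 and m are illustrative.
doses = [5, 10, 15, 20, 25]
vols = [0.40, 0.30, 0.15, 0.10, 0.05]
print(lkb_ntcp(doses, vols, n=0.09, td50_gy=25.0, m=0.2))
```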

15.
Clin Trials ; 15(4): 386-397, 2018 08.
Article in English | MEDLINE | ID: mdl-29779418

ABSTRACT

Background/Aims: The goal of phase I clinical trials for cytotoxic agents is to find the maximum dose with an acceptable risk of severe toxicity. The most common designs for these dose-finding trials use a binary outcome indicating whether a patient had a dose-limiting toxicity. However, a patient may experience multiple toxicities, with each toxicity assigned an ordinal severity score. The binary response is then obtained by dichotomizing a patient's richer set of data. We contribute to the growing literature on new models that exploit this richer toxicity data, with the goal of improving the efficiency of estimating the maximum tolerated dose. Methods: We develop three new, related models that make use of the total number of dose-limiting and low-level toxicities a patient experiences. We use these models to estimate the probability of having at least one dose-limiting toxicity as a function of dose. In a simulation study, we evaluate how often our models select the true maximum tolerated dose, and we compare our models with the continual reassessment method, which uses binary data. Results: Across a variety of simulation settings, we find that our models compare well against the continual reassessment method in terms of selecting the true optimal dose. In particular, one of our models, which uses dose-limiting and low-level toxicity counts, beats or ties the other models, including the continual reassessment method, in all scenarios except the one in which the true optimal dose is the highest dose available. We also find that our models, when not selecting the true optimal dose, tend to err by picking lower, safer doses, while the continual reassessment method errs more toward toxic doses. Conclusion: Using dose-limiting and low-level toxicity counts, which are easily obtained from data already routinely collected, is a promising way to improve the efficiency of finding the true maximum tolerated dose in phase I trials.
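The binary-outcome comparator mentioned above, the continual reassessment method, can be sketched as a one-parameter power model updated on a grid. Everything below (skeleton, prior, target, and trial data) is a hypothetical illustration, not the authors' implementation or the proposed count-based models.

```python
# One-parameter power-model CRM sketch with a normal prior on the parameter a,
# posterior computed by grid approximation.
import numpy as np

skeleton = np.array([0.05, 0.12, 0.25, 0.40])  # prior DLT probability guesses per dose level
target = 0.25
prior_sd = 1.34

# Hypothetical data so far: dose level given and whether each patient had a DLT.
doses_given = np.array([0, 0, 1, 1, 2, 2])
dlt = np.array([0, 0, 0, 1, 1, 0])

# P(DLT at dose i) = skeleton[i] ** exp(a), with a ~ N(0, prior_sd^2).
a_grid = np.linspace(-4, 4, 2001)
prior = np.exp(-0.5 * (a_grid / prior_sd) ** 2)

p = skeleton[doses_given][:, None] ** np.exp(a_grid)[None, :]
likelihood = np.prod(np.where(dlt[:, None] == 1, p, 1 - p), axis=0)

posterior = prior * likelihood
posterior /= posterior.sum()

# Posterior-mean DLT probability at each dose; recommend the dose closest to target.
dose_probs = (skeleton[:, None] ** np.exp(a_grid)[None, :] * posterior).sum(axis=1)
print("estimated DLT probabilities:", np.round(dose_probs, 3))
print("recommended dose level:", int(np.argmin(np.abs(dose_probs - target))))
```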


Subject(s)
Phase I Clinical Trials as Topic, Cytotoxins/toxicity, Drug-Related Side Effects and Adverse Reactions, Maximum Tolerated Dose, Bayes Theorem, Computer Simulation, Drug Dose-Response Relationship, Humans, Research Design
16.
Med Phys ; 45(4): 1369-1378, 2018 Apr.
Article in English | MEDLINE | ID: mdl-29474748

ABSTRACT

PURPOSE: To investigate the impact on prostate orientation caused by use and removal of a Foley catheter, and the dosimetric impact on men prospectively treated with prostate stereotactic body radiotherapy (SBRT). METHODS: Twenty-two men underwent a CT simulation with a Foley in place (FCT), followed immediately by a second treatment planning simulation without the Foley (TPCT). The change in prostate orientation was determined by rigid registration of three implanted transponders between the FCT and TPCT and compared to measured orientation changes during treatment. The impact on treatment planning and delivery was investigated by analyzing the measured rotations during treatment relative to both CT scans, and by introducing rotations of ±15° in the treatment plan to determine the maximum impact of allowed rotations. RESULTS: Removing the Foley caused a statistically significant prostate rotation (P < 0.0028), compared to normal biological motion, in 60% of patients. The largest change in rotation due to removing a Foley occurs about the left-right axis (tilt), which has a standard deviation two to five times larger than changes in rotation about the Sup-Inf (roll) and Ant-Post (yaw) axes. The change in tilt due to removing a Foley for prone and supine patients was -1.1° ± 6.0° and 0.3° ± 7.4°, respectively, showing no strong directional bias. The average tilt during treatment was -1.6° ± 7.1° compared to the TPCT and would have been -2.0° ± 7.1° had the FCT been used as the reference. The TPCT was a better or equivalent representation of prostate tilt in 82% of patients, versus 50% had the FCT been used for treatment planning. However, 92.7% of fractions would still have been within the ±15° rotation limit if only the FCT were used for treatment planning. When plans were rotated ±15°, the urethra constraint (V105% = 38.85 Gy, <20%) was exceeded in 27% of instances, and prostate (CTV) coverage was maintained above D95% > 37 Gy in all but one instance. CONCLUSIONS: Removing a Foley catheter can cause large prostate rotations. There does not appear to be a clear dosimetric benefit to obtaining the CT scan with a Foley catheter to define the urethra, given the changes in urethral position after removing the Foley catheter. If urethral sparing is desired without the use of a Foley, an MRI to define the urethra may be necessary, or a pseudo-urethral planning organ at risk volume (PRV) may be used to limit dosimetric hot spots.


Subject(s)
Artifacts, Catheters, Movement, Prostatic Neoplasms/radiotherapy, Radiosurgery, Clinical Trials as Topic, Humans, Male, Prostatic Neoplasms/diagnostic imaging, Prostatic Neoplasms/physiopathology, Radiometry, Computer-Assisted Radiotherapy Planning, Rotation, X-Ray Computed Tomography/instrumentation
17.
Obstet Gynecol ; 129(4): 621-628, 2017 04.
Article in English | MEDLINE | ID: mdl-28277349

ABSTRACT

OBJECTIVE: To identify missed opportunities for prevention of mother-to-child transmission of human immunodeficiency virus (HIV). METHODS: Data regarding HIV-infected children born between 2002 and 2009 to HIV-infected women enrolled in the U.S. International Maternal Pediatric Adolescent AIDS Clinical Trials prospective cohort study (protocol P1025) were reviewed. The characteristics of the HIV-infected infants and their mothers and the mothers' clinical management are described. RESULTS: Twelve cases of mother-to-child transmission of HIV occurred among 1,857 liveborn neonates, for a prevalence of 0.65 per 100 live births to HIV-infected women (95% confidence interval 0.33-1.13). Four transmissions occurred in utero, three were peripartum transmissions, and the timing of transmission for five neonates was unable to be determined. None were breastfed. Seven women had plasma viral loads greater than 400 copies/mL near delivery. Six women had less than 11 weeks of antiretroviral therapy during pregnancy; three of these women had premature deliveries. One woman received no antiretroviral therapy during pregnancy because she was diagnosed with HIV postpartum. Six had poor to moderate adherence to antiretroviral therapy. Four of the five mothers with viral loads greater than 1,000 copies/mL delivered preterm neonates. There were five women who delivered by cesarean; four were nonelective cesarean deliveries and only one was an elective cesarean delivery for HIV prevention. CONCLUSION: Despite access to high-level care and follow-up, a small proportion of HIV-infected women transmitted the virus to their neonates. This case series provides insight into factors contributing to HIV perinatal transmission and can inform the development of new strategies for prevention of mother-to-child transmission of HIV. CLINICAL TRIAL REGISTRATION: ClinicalTrials.gov, https://clinicaltrials.gov, NCT00028145.


Subject(s)
Anti-Retroviral Agents/therapeutic use, HIV Infections, Health Services Misuse, Vertical Infectious Disease Transmission, Perinatal Care, Infectious Pregnancy Complications, Adult, Obstetric Delivery/adverse effects, Obstetric Delivery/methods, Obstetric Delivery/statistics & numerical data, Female, HIV Infections/congenital, HIV Infections/diagnosis, HIV Infections/drug therapy, HIV Infections/transmission, Health Services Accessibility/standards, Health Services Misuse/prevention & control, Health Services Misuse/statistics & numerical data, Humans, Newborn Infant, Vertical Infectious Disease Transmission/prevention & control, Vertical Infectious Disease Transmission/statistics & numerical data, Needs Assessment, Perinatal Care/methods, Perinatal Care/statistics & numerical data, Pregnancy, Infectious Pregnancy Complications/diagnosis, Infectious Pregnancy Complications/drug therapy, Quality Improvement, United States/epidemiology, Viral Load/methods
18.
Laryngoscope ; 127(4): 971-976, 2017 04.
Article in English | MEDLINE | ID: mdl-27796047

ABSTRACT

OBJECTIVES/HYPOTHESIS: The aim of this study was to determine if drug-induced sleep endoscopy (DISE) was predictive of success for patients undergoing transoral robotic surgery (TORS) and multilevel procedures for sleep apnea. STUDY DESIGN: Retrospective case series of patients who underwent TORS for sleep apnea. METHODS: Pre- and postoperative polysomnograms were analyzed to assess improvement, success, and cure. Improvement was defined as any decrease in the apnea-hypopnea index (AHI), success as an AHI <20 with a decrease >50%, and cure as an AHI <5. DISE videos were scored using the NOHL (nose, oropharynx, hypopharynx, larynx) and VOTE (velum, oropharynx, tongue, epiglottis) classification systems. RESULTS: One hundred one patients were available for analysis. Eighty-seven percent of patients had an improvement in their AHI. Fifty-one percent met criteria for success, whereas 17% were cured. The degree of collapse at individual NOHL and VOTE subsites, as well as total additive scores, did not predict improvement, success, or cure. Patients with no oropharyngeal lateral collapse in the VOTE classification system were more likely to improve following surgery (P = .001); however, this effect did not hold for success or cure. Multivariate analysis of DISE variables was not predictive of success. CONCLUSIONS: In obstructive sleep apnea patients, there was a 51% success rate and a 17% cure rate. DISE, as scored by the NOHL and VOTE systems, did not readily identify patients who would benefit most from surgery. Patients with lateral oropharyngeal collapse may be poorer candidates. Prospective, larger studies are required to further evaluate the use of DISE in predicting success following TORS. LEVEL OF EVIDENCE: 4. Laryngoscope, 127:971-976, 2017.
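The outcome definitions above (improvement, success, cure) are simple AHI rules; the short sketch below makes them explicit. The function name and example values are hypothetical.

```python
# Outcome classification from pre- and postoperative AHI, per the definitions above:
# improvement = any AHI decrease; success = AHI < 20 with > 50% decrease; cure = AHI < 5.
def classify_outcome(ahi_pre: float, ahi_post: float) -> dict:
    decrease = ahi_pre - ahi_post
    return {
        "improvement": decrease > 0,
        "success": ahi_post < 20 and decrease > 0.5 * ahi_pre,
        "cure": ahi_post < 5,
    }

print(classify_outcome(ahi_pre=42.0, ahi_post=12.0))  # improved and successful, not cured
```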


Subject(s)
Endoscopy/methods, Hypnotics and Sedatives/administration & dosage, Natural Orifice Endoscopic Surgery/methods, Robotic Surgical Procedures/methods, Obstructive Sleep Apnea/surgery, Adult, Aged, Cohort Studies, Female, Follow-Up Studies, Humans, Logistic Models, Male, Middle Aged, Multivariate Analysis, Polysomnography/methods, Predictive Value of Tests, Preoperative Care/methods, Prospective Studies, Severity of Illness Index, Sleep/drug effects, Obstructive Sleep Apnea/diagnosis, Treatment Outcome
19.
Otolaryngol Head Neck Surg ; 155(1): 106-12, 2016 07.
Article in English | MEDLINE | ID: mdl-26980915

ABSTRACT

OBJECTIVE: To determine biomarkers of recurrence and survival in patients with spindle cell variant squamous cell carcinoma (SpSCC) of the head and neck. STUDY DESIGN: Retrospective case control study. SETTING: Tertiary academic center. SUBJECTS AND METHODS: Thirty-two SpSCC patients (mean age, 68.8) between 1987 and 2009 were identified and reviewed. A tissue microarray (TMA) was constructed from tumor specimens. Tumor biomarkers under study included p16, epidermal growth factor receptor (EGFR), p53, EZH2, cyclin D1, CD104, HGFa, p21, and cMET. An additional TMA was constructed from patients with non-SpSCC oral cavity squamous cell carcinoma for comparative purposes. The main outcomes were overall survival (OS), disease-specific survival (DSS), and recurrence-free survival (RFS). RESULTS: In the SpSCC cohort, tumors positive for cMet had worse OS (P < .001). Patients positive for cMet (P = .007), cyclin D1 (P = .019), and p16 (P = .004) had worse DSS. Recurrence-free survival was also worse in patients with tumors positive for cMet (P = .037), cyclin D1 (P = .012), and p16 (P < .001). Compared with the oral cavity cohort, there was a significantly larger proportion of patients in the SpSCC group with tumors staining positive for cMet and a lower proportion of tumors positive for cyclin D1. CONCLUSION: cMet, cyclin D1, and p16 are predictive tumor biomarkers for risk of recurrence and worse DSS in patients with SpSCC.


Subject(s)
Tumor Biomarkers/analysis, Squamous Cell Carcinoma/pathology, Head and Neck Neoplasms/pathology, Sarcoma/pathology, Adult, Aged, Aged 80 and over, Case-Control Studies, Female, Humans, Male, Microarray Analysis, Middle Aged, Local Neoplasm Recurrence, Retrospective Studies, Risk Factors, Squamous Cell Carcinoma of Head and Neck, Survival Rate
20.
J Surg Res ; 201(1): 196-201, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26850202

ABSTRACT

BACKGROUND: Recent data show patients with advanced colorectal cancer (CRC) are surviving longer. What is unknown is how specific treatment modalities affect long-term survival. Conditional survival, or survival prognosis based on time already survived, is becoming an acceptable means of estimating prognosis for long-term survivors. We evaluated the impact of cancer-directed surgery on long-term survival in patients with advanced CRC. METHODS: We used Surveillance, Epidemiology, and End Results data to identify 64,956 patients with advanced (Stage IV) CRC diagnosed from 2000-2009. Conditional survival estimates by stage, age, and cancer-directed surgery were obtained based on Cox proportional hazards regression model of disease-specific survival. RESULTS: A total of 64,956 (20.1%) patients had advanced disease at the time of diagnosis. The proportion of those patients who underwent cancer-directed surgery was 65.1% (n = 42,176). Cancer-directed surgery for patients with advanced stage disease was associated with a significant improvement in traditional survival estimates compared to patients who did not undergo surgery (hazard ratio = 2.22 [95% confidence interval, 2.17-2.27]). Conditional survival estimates show improvement in conditional 5-y disease-specific survival across all age groups, demonstrating sustained survival benefits for selected patients with advanced CRC. CONCLUSIONS: Five-year disease-specific conditional survival improves dramatically over time for selected patients with advanced CRC who undergo cancer-directed surgery. This information is important in determining long-term prognosis and will help inform treatment planning for advanced CRC.
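Conditional survival as used above is simply the ratio of survival probabilities, S(t + x) / S(t): the probability of surviving an additional x years given survival to time t. A minimal sketch with hypothetical survival values, not SEER estimates:

```python
# Conditional survival: S(t + x | survived to t) = S(t + x) / S(t).
def conditional_survival(s_t: float, s_t_plus_x: float) -> float:
    return s_t_plus_x / s_t

# Hypothetical example: if 5-year disease-specific survival from diagnosis is 15%
# and 8-year survival is 9%, the conditional 3-year survival at year 5 is 60%.
print(conditional_survival(s_t=0.15, s_t_plus_x=0.09))
```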


Subject(s)
Colorectal Neoplasms/mortality, Colorectal Neoplasms/surgery, Aged, Aged 80 and over, Colon/pathology, Colorectal Neoplasms/pathology, Female, Humans, Male, Middle Aged, Neoplasm Metastasis, Rectum/pathology, SEER Program, United States/epidemiology