Results 1 - 20 of 71
1.
Stat Med ; 34(7): 1150-68, 2015 Mar 30.
Article in English | MEDLINE | ID: mdl-25546152

ABSTRACT

Treatment preferences of groups (e.g., clinical centers) have often been proposed as instruments to control for unmeasured confounding-by-indication in instrumental variable (IV) analyses. However, formal evaluations of these group-preference-based instruments are lacking. Unique challenges include the following: (i) correlations between outcomes within groups; (ii) the multi-value nature of the instruments; and (iii) unmeasured confounding occurring both between and within groups. We introduce the framework of between-group and within-group confounding to assess the assumptions required for group-preference-based IV analyses. Our work illustrates that, when unmeasured confounding effects exist only within groups but not between groups, preference-based IVs can satisfy the assumptions required for valid instruments. We then derive a closed-form expression for the asymptotic bias of the two-stage generalized ordinary least squares estimator when the IVs are valid. Simulations demonstrate that the asymptotic bias formula approximates bias in finite samples quite well, particularly when the number of groups is moderate to large. The bias formula shows that when the cluster size is finite, the IV estimator is asymptotically biased; the bias disappears only when both the number of groups and the cluster size go to infinity. However, the IV estimator remains advantageous in reducing bias from confounding-by-indication. The bias assessment provides practical guidance for preference-based IV analyses. To improve their performance, one should adjust for as many measured confounders as possible, consider groups that have the most random variation in treatment assignment, and increase cluster size. To minimize the likelihood that these IVs are invalid, one should minimize unmeasured between-group confounding.
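The group-preference IV idea in this abstract can be illustrated with a minimal simulation (not from the paper; all sizes, coefficients, and variable names are invented for illustration). Center preference shifts treatment assignment but, with no between-group confounding, is unrelated to the unmeasured within-group confounder, so two-stage least squares using the center-level treatment rate as the fitted first stage reduces the confounding bias that naive ordinary least squares suffers:

```python
import numpy as np

rng = np.random.default_rng(0)
n_groups, m = 200, 50                          # centers and patients per center (illustrative)
g = np.repeat(np.arange(n_groups), m)          # center label for each patient
pref = rng.uniform(0.2, 0.8, n_groups)         # center treatment preference = the instrument
u = rng.normal(size=n_groups * m)              # unmeasured WITHIN-group confounder
p_treat = np.clip(pref[g] + 0.3 * u, 0.05, 0.95)
treat = (rng.uniform(size=n_groups * m) < p_treat).astype(float)
y = 2.0 * treat + 1.5 * u + rng.normal(size=n_groups * m)  # true treatment effect = 2.0

# Naive OLS slope: biased upward because u drives both treatment and outcome
beta_ols = np.cov(y, treat)[0, 1] / np.var(treat)

# Two-stage least squares: stage 1 projects treatment onto center dummies
# (fitted values = each center's mean treatment rate); stage 2 regresses y on them
t_hat = np.array([treat[g == k].mean() for k in range(n_groups)])[g]
beta_iv = np.cov(y, t_hat)[0, 1] / np.var(t_hat)

print(beta_ols, beta_iv)  # the IV estimate lands much closer to 2.0
```

Consistent with the abstract, the IV estimate retains a small finite-cluster bias (the center-mean of `u` is not exactly zero with 50 patients per center), but it is far smaller than the confounding bias in the naive estimate.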


Subject(s)
Biostatistics/methods, Confounding Factors, Epidemiologic, Models, Statistical, Anemia/blood, Anemia/drug therapy, Bias, Causality, Clinical Trials, Phase III as Topic/statistics & numerical data, Cluster Analysis, Computer Simulation, Hematinics/administration & dosage, Hemoglobins/metabolism, Humans, Least-Squares Analysis, Likelihood Functions, Observational Studies as Topic/statistics & numerical data, Renal Dialysis
2.
Clin J Am Soc Nephrol ; 7(12): 1977-87, 2012 Dec.
Article in English | MEDLINE | ID: mdl-22977208

ABSTRACT

BACKGROUND AND OBJECTIVES: When hemodialysis dose is scaled to body water (V), women typically receive a greater dose than men, but their survival is not better given a similar dose. This study sought to determine whether rescaling dose to body surface area (SA) might reveal different associations among dose, sex, and mortality. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: Single-pool Kt/V (spKt/V), equilibrated Kt/V, and standard Kt/V (stdKt/V) were computed using urea kinetic modeling on a prevalent cohort of 7229 patients undergoing thrice-weekly hemodialysis. Data were obtained from the Centers for Medicare & Medicaid Services 2008 ESRD Clinical Performance Measures Project. SA-normalized stdKt/V (SAN-stdKt/V) was calculated as stdKt/V × (V/SA)/17.5, that is, stdKt/V multiplied by the ratio of anthropometric volume to SA and divided by the normalizing constant 17.5. Patients were grouped into sex-specific dose quintiles (reference: quintile 1 for men). Adjusted hazard ratios (HRs) for 1-year mortality were calculated using Cox regression. RESULTS: spKt/V was higher in women (1.7 ± 0.3) than in men (1.5 ± 0.2; P<0.001), but SAN-stdKt/V was lower (women: 2.3 ± 0.2; men: 2.5 ± 0.3; P<0.001). For both sexes, mortality decreased as spKt/V increased, until spKt/V was 1.6-1.7 (quintile 4 for men: HR, 0.62; quintile 3 for women: HR, 0.64); no benefit was observed with higher spKt/V. HR for mortality decreased further at higher SAN-stdKt/V in both sexes (quintile 5 for men: HR, 0.69; quintile 5 for women: HR, 0.60). CONCLUSIONS: SA-based dialysis dose results in dose-mortality relationships substantially different from those with volume-based dosing. SAN-stdKt/V analyses suggest women may be relatively underdosed when treated by V-based dosing. SAN-stdKt/V as a measure for dialysis dose may warrant further study.
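The rescaling in this abstract is simple arithmetic. A minimal sketch follows; the abstract does not specify which anthropometric equations the authors used, so the Watson total-body-water and DuBois body-surface-area formulas below are standard stand-ins, not necessarily the paper's choices. The SAN-stdKt/V line follows the abstract's definition:

```python
def dubois_bsa(height_cm: float, weight_kg: float) -> float:
    """DuBois body surface area (m^2) -- an illustrative SA estimate."""
    return 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425

def watson_volume(height_cm: float, weight_kg: float, age_yr: float, male: bool) -> float:
    """Watson total body water (L) -- an illustrative anthropometric V."""
    if male:
        return 2.447 - 0.09156 * age_yr + 0.1074 * height_cm + 0.3362 * weight_kg
    return -2.097 + 0.1069 * height_cm + 0.2466 * weight_kg

def san_std_ktv(std_ktv: float, volume_l: float, bsa_m2: float) -> float:
    """SA-normalized stdKt/V per the abstract: stdKt/V x (V/SA)/17.5."""
    return std_ktv * (volume_l / bsa_m2) / 17.5

# Example: a 60-year-old man, 175 cm, 75 kg, with stdKt/V = 2.2
v = watson_volume(175, 75, 60, male=True)   # about 41 L
sa = dubois_bsa(175, 75)                    # about 1.9 m^2
print(san_std_ktv(2.2, v, sa))              # roughly 2.7
```

Because V/SA is systematically lower in women, the same stdKt/V maps to a lower SAN-stdKt/V for women, which is the mechanism behind the abstract's finding.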


Subject(s)
Body Surface Area, Kidney Failure, Chronic/mortality, Kidney Failure, Chronic/therapy, Renal Dialysis/methods, Aged, Chi-Square Distribution, Humans, Kidney Failure, Chronic/blood, Middle Aged, Proportional Hazards Models, Sex Factors, Statistics, Nonparametric, Time Factors, Urea/blood
3.
Kidney Int ; 82(5): 570-80, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22718187

ABSTRACT

KDOQI practice guidelines recommend predialysis blood pressure <140/90 mm Hg; however, most prior studies have found elevated mortality with low, not high, systolic blood pressure. This is possibly due to unmeasured confounders affecting systolic blood pressure and mortality. To lessen this bias, we analyzed 24,525 patients by Cox regression models adjusted for patient and facility characteristics. Compared with predialysis systolic blood pressure of 130-159 mm Hg, mortality was 13% higher in facilities with 20% more patients at systolic blood pressure of 110-129 mm Hg and 16% higher in facilities with 20% more patients at systolic blood pressure of ≥160 mm Hg. For patient-level systolic blood pressure, mortality was elevated at low (<130 mm Hg), not high (≥180 mm Hg), systolic blood pressure. For predialysis diastolic blood pressure, mortality was lowest at 60-99 mm Hg, a wide range implying less chance to improve outcomes. Higher mortality at systolic blood pressure of <130 mm Hg is consistent with prior studies and may be due to excessive blood pressure lowering during dialysis. The lowest-risk facility systolic blood pressure of 130-159 mm Hg indicates this range may be optimal, but may have been influenced by unmeasured facility practices. While additional study is needed, our findings contrast with KDOQI blood pressure targets and provide guidance on the optimal blood pressure range in the absence of definitive clinical trial data.


Subject(s)
Blood Pressure, Hypertension/mortality, Kidney Failure, Chronic/therapy, Practice Patterns, Physicians'/statistics & numerical data, Renal Dialysis/mortality, Australia, Comorbidity, Cross-Sectional Studies, Europe, Female, Humans, Hypertension/physiopathology, Japan, Kidney Failure, Chronic/mortality, Linear Models, Male, Middle Aged, New Zealand, Practice Guidelines as Topic, Practice Patterns, Physicians'/standards, Proportional Hazards Models, Prospective Studies, Renal Dialysis/adverse effects, Renal Dialysis/standards, Risk Assessment, Risk Factors, Treatment Outcome, United States
4.
JAMA ; 306(17): 1891-901, 2011 Nov 02.
Article in English | MEDLINE | ID: mdl-22045767

ABSTRACT

CONTEXT: Solid organ transplant recipients have elevated cancer risk due to immunosuppression and oncogenic viral infections. Because most prior research has concerned kidney recipients, large studies that include recipients of differing organs can inform cancer etiology. OBJECTIVE: To describe the overall pattern of cancer following solid organ transplantation. DESIGN, SETTING, AND PARTICIPANTS: Cohort study using linked data on solid organ transplant recipients from the US Scientific Registry of Transplant Recipients (1987-2008) and 13 state and regional cancer registries. MAIN OUTCOME MEASURES: Standardized incidence ratios (SIRs) and excess absolute risks (EARs) assessing relative and absolute cancer risk in transplant recipients compared with the general population. RESULTS: The registry linkages yielded data on 175,732 solid organ transplants (58.4% for kidney, 21.6% for liver, 10.0% for heart, and 4.0% for lung). The overall cancer risk was elevated with 10,656 cases and an incidence of 1375 per 100,000 person-years (SIR, 2.10 [95% CI, 2.06-2.14]; EAR, 719.3 [95% CI, 693.3-745.6] per 100,000 person-years). Risk was increased for 32 different malignancies, some related to known infections (eg, anal cancer, Kaposi sarcoma) and others unrelated (eg, melanoma, thyroid and lip cancers). The most common malignancies with elevated risk were non-Hodgkin lymphoma (n = 1504; incidence: 194.0 per 100,000 person-years; SIR, 7.54 [95% CI, 7.17-7.93]; EAR, 168.3 [95% CI, 158.6-178.4] per 100,000 person-years) and cancers of the lung (n = 1344; incidence: 173.4 per 100,000 person-years; SIR, 1.97 [95% CI, 1.86-2.08]; EAR, 85.3 [95% CI, 76.2-94.8] per 100,000 person-years), liver (n = 930; incidence: 120.0 per 100,000 person-years; SIR, 11.56 [95% CI, 10.83-12.33]; EAR, 109.6 [95% CI, 102.0-117.6] per 100,000 person-years), and kidney (n = 752; incidence: 97.0 per 100,000 person-years; SIR, 4.65 [95% CI, 4.32-4.99]; EAR, 76.1 [95% CI, 69.3-83.3] per 100,000 person-years). 
Lung cancer risk was most elevated in lung recipients (SIR, 6.13 [95% CI, 5.18-7.21]) but also increased among other recipients (kidney: SIR, 1.46 [95% CI, 1.34-1.59]; liver: SIR, 1.95 [95% CI, 1.74-2.19]; and heart: SIR, 2.67 [95% CI, 2.40-2.95]). Liver cancer risk was elevated only among liver recipients (SIR, 43.83 [95% CI, 40.90-46.91]), who manifested exceptional risk in the first 6 months (SIR, 508.97 [95% CI, 474.16-545.66]) and a 2-fold excess risk for 10 to 15 years thereafter (SIR, 2.22 [95% CI, 1.57-3.04]). Among kidney recipients, kidney cancer risk was elevated (SIR, 6.66 [95% CI, 6.12-7.23]) and bimodal in onset time. Kidney cancer risk also was increased in liver recipients (SIR, 1.80 [95% CI, 1.40-2.29]) and heart recipients (SIR, 2.90 [95% CI, 2.32-3.59]). CONCLUSION: Compared with the general population, recipients of a kidney, liver, heart, or lung transplant have an increased risk for diverse infection-related and unrelated cancers.


Subject(s)
Neoplasms/epidemiology, Organ Transplantation/adverse effects, Adolescent, Adult, Aged, Child, Child, Preschool, Cohort Studies, Female, Humans, Immune Tolerance, Immunocompromised Host, Incidence, Infant, Male, Middle Aged, Registries/statistics & numerical data, Risk, United States/epidemiology, Young Adult
6.
Am J Kidney Dis ; 57(2): 266-75, 2011 Feb.
Article in English | MEDLINE | ID: mdl-21251541

ABSTRACT

BACKGROUND: Hemodialysis patients with larger hemoglobin level fluctuations have higher mortality rates. We describe facility-level interpatient hemoglobin variability, its relation to patient mortality, and factors associated with facility-level hemoglobin variability or achieving hemoglobin levels of 10.5-12.0 g/dL. Facility-level hemoglobin variability may reflect within-patient hemoglobin variability and facility-level anemia-control practices. STUDY DESIGN: Prospective cohort study. SETTING & PARTICIPANTS: Data from the Dialysis Outcomes and Practice Patterns Study (DOPPS; 26,510 hemodialysis patients, 930 facilities, 12 countries, 1996-2008) and from the Centers for Medicare & Medicaid Services (CMS; 193,291 hemodialysis patients, 3,741 US facilities, 2002). PREDICTORS: Standard deviation (SD) in single-measurement hemoglobin levels in hemodialysis patients in facility cross-sections (facility-level hemoglobin SD); patient characteristics; facility practices. OUTCOMES: Patient-level mortality; additionally, facility practices correlated with facility-level hemoglobin SD or patient hemoglobin levels of 10.5-12.0 g/dL. RESULTS: Facility-level hemoglobin SD varied more than 5-fold across DOPPS facilities (range, 0.5-2.7 g/dL; mean, 1.3 g/dL) and by country (range, 1.1 in Japan-DOPPS [2005/2006] to 1.7 g/dL in Spain-DOPPS [1998/1999]), with substantial decreases seen in many countries from 1998 to 2007. Facility-level hemoglobin SD was related inversely to patient age, but was associated minimally with more than 30 other patient characteristics and facility mean hemoglobin levels. Several anemia management practices were associated strongly with facility-level hemoglobin SD and having a hemoglobin level of 10.5-12.0 g/dL. When examined in CMS data, facility-level hemoglobin SD was positively associated with within-patient hemoglobin SD during the prior 6 months. 
Patient mortality rates were higher with greater facility-level hemoglobin SD (DOPPS: HR, 1.08 per 0.5-g/dL greater facility-level hemoglobin SD [95% CI, 1.02-1.15; P = 0.006]; CMS: HR, 1.16 per 0.5-g/dL greater facility-level hemoglobin SD [95% CI, 1.11-1.21; P < 0.001]). LIMITATIONS: Residual confounding. CONCLUSIONS: Facility-level hemoglobin SD was associated strongly and positively with patient mortality, not tightly linked to numerous patient characteristics, but related strongly to facility anemia management practices. Facility-level hemoglobin variability may be modifiable, and its optimization may improve hemodialysis patient survival.


Subject(s)
Hemoglobins/metabolism, Kidney Failure, Chronic/mortality, Kidney Failure, Chronic/therapy, Practice Patterns, Physicians', Renal Dialysis, Severity of Illness Index, Anemia/prevention & control, Cohort Studies, Dose-Response Relationship, Drug, Female, Hematinics/therapeutic use, Humans, Japan, Kidney Failure, Chronic/blood, Male, Middle Aged, Prospective Studies, Retrospective Studies, Spain, Survival Rate, United States
7.
Transplantation ; 88(2): 231-6, 2009 Jul 27.
Article in English | MEDLINE | ID: mdl-19623019

ABSTRACT

BACKGROUND: We propose a continuous kidney donor risk index (KDRI) for deceased donor kidneys, combining donor and transplant variables to quantify graft failure risk. METHODS: By using national data from 1995 to 2005, we analyzed 69,440 first-time, kidney-only, deceased donor adult transplants. Cox regression was used to model the risk of death or graft loss, based on donor and transplant factors, adjusting for recipient factors. The proposed KDRI includes 14 donor and transplant factors, each found to be independently associated with graft failure or death: donor age, race, history of hypertension, history of diabetes, serum creatinine, cerebrovascular cause of death, height, weight, donation after cardiac death, hepatitis C virus status, human leukocyte antigen-B and DR mismatch, cold ischemia time, and double or en bloc transplant. The KDRI reflects the rate of graft failure relative to that of a healthy 40-year-old donor. RESULTS: Transplants of kidneys in the highest KDRI quintile (>1.45) had an adjusted 5-year graft survival of 63%, compared with 82% and 79% in the two lowest KDRI quintiles (<0.79 and 0.79-<0.96, respectively). There is a considerable overlap in the KDRI distribution by expanded and nonexpanded criteria donor classification. CONCLUSIONS: The graded impact of KDRI on graft outcome makes it a useful decision-making tool at the time of the deceased donor kidney offer.


Subject(s)
Kidney Transplantation/adverse effects, Risk Assessment, Tissue Donors, Adolescent, Adult, Cadaver, Creatinine/blood, Female, Graft Rejection/epidemiology, Graft Rejection/mortality, Graft Survival, History, 16th Century, Humans, Kidney Transplantation/mortality, Male, Middle Aged, Proportional Hazards Models, Retrospective Studies, Young Adult
8.
J Am Soc Nephrol ; 20(5): 1094-101, 2009 May.
Article in English | MEDLINE | ID: mdl-19357257

ABSTRACT

Recent studies have associated rosiglitazone, a thiazolidinedione drug, with adverse cardiovascular outcomes in the general population with diabetes. Using data from the Dialysis Outcomes and Practice Patterns Study in the United States, we examined cardiovascular hospitalization and mortality associated with prescription of rosiglitazone, compared with other oral hypoglycemic agents, among 2393 long-term hemodialysis patients who were followed for a median of 1.1 yr. We assessed mortality risk using Cox models in patient-level and dialysis facility-level analyses that used the facility proportion of patients on rosiglitazone as the predictor (instrumental variable approach) and adjusted the models for demographics, comorbid conditions, laboratory values, and achieved dialysis dosage. Compared with patients prescribed other oral hypoglycemic agents, patients prescribed rosiglitazone had significantly higher all-cause (hazard ratio [HR] 1.38; 95% confidence interval [CI] 1.05 to 1.82) and cardiovascular (HR 1.59; 95% CI 1.14 to 2.22) mortality, and their adjusted HR for hospitalization with myocardial infarction was 3.5-fold higher (P = 0.02). We did not observe similar associations in a secondary analysis evaluating pioglitazone. By the instrumental variable approach, facilities with more than the median adjusted percentage (6.2%) of patients who had diabetes and were prescribed rosiglitazone had significantly higher all-cause mortality (HR 1.36; 95% CI 1.15 to 1.62) and cardiovascular mortality (HR 1.42; 95% CI 1.07 to 1.88) than facilities with less than the median expected percentage prescribed rosiglitazone. Our practice-based findings suggest significant associations of rosiglitazone use with higher cardiovascular and all-cause mortality among hemodialysis patients with diabetes.


Subject(s)
Diabetic Nephropathies/therapy, Hypoglycemic Agents/toxicity, Kidney Failure, Chronic/therapy, Renal Dialysis/mortality, Thiazolidinediones/toxicity, Aged, Cardiovascular Diseases/mortality, Diabetic Angiopathies/mortality, Diabetic Nephropathies/drug therapy, Diabetic Nephropathies/mortality, Female, Hospitalization/statistics & numerical data, Humans, Kidney Failure, Chronic/mortality, Male, Middle Aged, Proportional Hazards Models, Rosiglitazone
9.
Lifetime Data Anal ; 15(3): 343-56, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19184420

ABSTRACT

Relative risk frailty models are used extensively in analyzing clustered and/or recurrent time-to-event data. In this paper, Laplace's approximation for integrals is applied to marginal distributions of data arising from parametric relative risk frailty models. Under regularity conditions, the approximate maximum likelihood estimators (MLE) are consistent with a rate of convergence that depends on both the number of subjects and number of members per subject. We compare the approximate MLE against alternative estimators using limited simulation and demonstrate the utility of Laplace's approximation approach by analyzing U.S. patient waiting time to deceased kidney transplant data.
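For readers unfamiliar with the technique, Laplace's approximation replaces an intractable integral of an exponentiated log-integrand with a Gaussian integral centered at the mode. A minimal sketch, where the gamma integral stands in for a frailty-style marginal likelihood (this toy example is ours, not the paper's model):

```python
import math

def laplace_integral(h, h2, u_hat):
    """Laplace approximation to the integral of exp(h(u)) du around the mode u_hat:
    exp(h(u_hat)) * sqrt(2*pi / -h''(u_hat))."""
    return math.exp(h(u_hat)) * math.sqrt(2 * math.pi / -h2(u_hat))

# Toy check: integral of u^n * e^(-u) over (0, inf) equals n!
n = 10
h = lambda u: n * math.log(u) - u        # log-integrand
h2 = lambda u: -n / u ** 2               # its second derivative
approx = laplace_integral(h, h2, n)      # the mode is at u = n
exact = math.factorial(n)
print(approx / exact)                    # ratio near 0.99: accurate already at n = 10
```

The ratio approaches 1 as n grows, mirroring the paper's point that the approximation improves with the number of members per subject.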


Subject(s)
Likelihood Functions, Risk, Algorithms, Humans, Kidney Failure, Chronic/surgery, Kidney Transplantation/statistics & numerical data, Models, Statistical, Multivariate Analysis, Poisson Distribution, Proportional Hazards Models, Statistics, Nonparametric, Time Factors, Tissue and Organ Procurement/statistics & numerical data, United States, Waiting Lists
10.
Am J Kidney Dis ; 53(3): 475-91, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19150158

ABSTRACT

BACKGROUND: Previously, the Dialysis Outcomes and Practice Patterns Study (DOPPS) has shown large international variations in vascular access practice. Greater mortality risks have been seen for hemodialysis (HD) patients dialyzing with a catheter or graft versus a native arteriovenous fistula (AVF). To further understand the relationship between vascular access practice and outcomes, we have applied practice-based analyses (using an instrumental variable approach) to decrease the treatment-by-indication bias of prior patient-level analyses. STUDY DESIGN: A prospective observational study of HD practices. SETTING & PARTICIPANTS: Data collected from 1996 to 2004 from 28,196 HD patients from more than 300 dialysis units participating in the DOPPS in 12 countries. PREDICTOR OR FACTOR: Patient-level or case-mix-adjusted facility-level vascular access use. OUTCOMES/MEASUREMENTS: Mortality and hospitalization risks. RESULTS: After adjusting for demographics, comorbid conditions, and laboratory values, greater mortality risk was seen for patients using a catheter (relative risk, 1.32; 95% confidence interval, 1.22 to 1.42; P < 0.001) or graft (relative risk, 1.15; 95% confidence interval, 1.06 to 1.25; P < 0.001) versus an AVF. Every 20% greater case-mix-adjusted catheter use within a facility was associated with 20% greater mortality risk (versus facility AVF use, P < 0.001); and every 20% greater facility graft use was associated with 9% greater mortality risk (P < 0.001). Greater facility catheter and graft use were both associated with greater all-cause and infection-related hospitalization. Catheter and graft use were greater in the United States than in Japan and many European countries. More than half the 36% to 43% greater case-mix-adjusted mortality risk for HD patients in the United States versus the 5 European countries from the DOPPS I and II was attributable to differences in vascular access practice, even after adjusting for other HD practices. 
Vascular access practice differences accounted for nearly 30% of the greater US mortality compared with Japan. LIMITATIONS: Possible existence of unmeasured facility- and patient-level confounders that could impact the relationship of vascular access use with outcomes. CONCLUSIONS: Facility-based analyses diminish treatment-by-indication bias and suggest that less catheter and graft use improves patient survival.


Subject(s)
Arteriovenous Shunt, Surgical/mortality, Arteriovenous Shunt, Surgical/statistics & numerical data, Catheters, Indwelling/statistics & numerical data, Kidney Failure, Chronic/therapy, Renal Dialysis/mortality, Renal Dialysis/methods, Female, Humans, Male, Middle Aged, Prospective Studies
11.
Nephrol Dial Transplant ; 24(3): 963-72, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19028748

ABSTRACT

BACKGROUND: Retrospective studies of haemodialysis patients from large dialysis organizations in the United States have indicated that intravenous vitamin D may be associated with a survival benefit. However, patients prescribed vitamin D are generally healthier than those who are not, suggesting that treatment by indication may have biased previous findings. Additionally, no survival benefit associated with vitamin D has been shown in a recent meta-analysis in CKD patients. Because treatment-by-indication bias due to both measured and unmeasured confounders cannot be completely accounted for in standard regression or marginal structural models (MSMs), this study evaluates the association between vitamin D and mortality among participants in the Dialysis Outcomes and Practice Patterns Study (DOPPS) using standard regression and MSMs with an expanded set of covariates, as well as by instrumental variable models to minimize potential bias due to unmeasured confounders. METHODS: Data from 38 066 DOPPS participants from 12 countries between 1996 and 2007 were analysed. Mortality risk was assessed using standard baseline and time-varying Cox regression models, adjusted for demographics and detailed comorbidities, and MSMs. In models similar to instrumental variable analysis, the facility percentage of patients prescribed vitamin D, adjusted for the patient case mix, was used to predict patient-level mortality. RESULTS: Vitamin D prescription was significantly higher in the USA compared to other countries. On average, patients prescribed vitamin D had fewer comorbidities compared to those who were not. Vitamin D therapy was associated with lower mortality in adjusted time-varying standard regression models [relative ratio (RR) = 0.92 (95% confidence interval: 0.87-0.96)] and baseline MSMs [RR = 0.84 (0.78-0.98)] and time-varying MSMs [RR = 0.78 (0.73-0.84)]. 
No significant differences in mortality were observed in adjusted baseline standard regression models for patients with or without vitamin D prescription [RR = 0.98 (0.93-1.02)] or for patients in facility practices where vitamin D prescription was more frequent [RR for facilities in 75th versus 25th percentile of vitamin D prescription = 0.99 (0.94-1.04)]. CONCLUSIONS: Vitamin D was associated with a survival benefit in models prone to bias due to unmeasured confounding. In agreement with a meta-analysis of randomized controlled studies, no difference in mortality was observed in instrumental variable models that tend to be more independent of unmeasured confounding. These findings indicate that a randomized controlled trial of vitamin D and clinical outcomes in haemodialysis patients is needed and can be ethically conducted.


Subject(s)
Kidney Diseases/mortality, Kidney Diseases/therapy, Renal Dialysis, Vitamin D/therapeutic use, Vitamins/therapeutic use, Adult, Aged, Chronic Disease, Confounding Factors, Epidemiologic, Female, Humans, Kidney Diseases/complications, Male, Middle Aged, Patient Selection, Practice Patterns, Physicians', Retrospective Studies, Selection Bias, Survival Rate
13.
Nephrol Dial Transplant ; 23(10): 3227-33, 2008 Oct.
Article in English | MEDLINE | ID: mdl-18424461

ABSTRACT

BACKGROUND: The Dialysis Outcomes and Practice Patterns Study (DOPPS) database was used to develop and validate a practice-related risk score (PRS) based on modifiable practices to help facilities assess potential areas for improving patient care. METHODS: Relative risks (RRs) from a multivariable Cox mortality model, based on observational haemodialysis (HD) patient data from DOPPS I (1996-2001, seven countries), were used. The four practices were the percent of patients with Kt/V ≥1.2, haemoglobin ≥11 g/dl (110 g/l), albumin ≥4.0 g/dl (40 g/l) and catheter use, and were significantly related to mortality when modelled together. DOPPS II data (2002-2004, 12 countries) were used to evaluate the relationship between PRS and mortality risk using Cox regression. RESULTS: For facilities in DOPPS I and II, changes in PRS over time were significantly correlated with changes in the standardized mortality ratio (SMR). The PRS ranged from 1.0 to 2.1. Overall, the adjusted RR of death was 1.05 per 0.1 points higher PRS (P < 0.0001). For facilities in both DOPPS I and II (N = 119), a 0.2 decrease in PRS was associated with a 0.19 decrease in SMR (P = 0.005). On average, facilities that improved PRS practices showed significantly reduced mortality over the same time frame. CONCLUSIONS: The PRS assesses modifiable HD practices that are linked to improved patient survival. Further refinements might lead to improvements in the PRS and will address regional variations in the PRS/mortality relationship.
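Because Cox relative risks compound multiplicatively, the reported gradient of 1.05 per 0.1-point higher PRS can be chained across the observed PRS range. A quick sketch (the function name is ours, for illustration only):

```python
def relative_risk(prs_a: float, prs_b: float, rr_per_tenth: float = 1.05) -> float:
    """Adjusted RR of death for facility A vs facility B, compounding the
    abstract's 1.05-per-0.1-point PRS gradient multiplicatively (Cox scale)."""
    return rr_per_tenth ** ((prs_a - prs_b) / 0.1)

# Spread between the worst (2.1) and best (1.0) observed PRS:
print(round(relative_risk(2.1, 1.0), 2))  # 1.71, i.e. ~71% higher adjusted mortality risk
```

This makes concrete why even a 0.2-point improvement in PRS corresponds to a meaningful mortality difference.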


Subject(s)
Ambulatory Care Facilities/standards, Renal Dialysis/standards, Adult, Ambulatory Care Facilities/statistics & numerical data, Databases, Factual, Humans, Kidney Failure, Chronic/therapy, Proportional Hazards Models, Quality Assurance, Health Care/statistics & numerical data, Renal Dialysis/mortality, Renal Dialysis/statistics & numerical data, Risk Assessment
14.
Med Care ; 46(2): 120-6, 2008 Feb.
Article in English | MEDLINE | ID: mdl-18219239

ABSTRACT

BACKGROUND: In developing "pay-for-performance" and capitation systems that provide incentives for improving the quality and efficiency of care, policymakers need to determine which healthcare providers to evaluate and reward. OBJECTIVES: This study demonstrates methods for determining and understanding the relative contributions of facilities and physicians to the quality and cost of care. Specifically, this study distinguishes levels of variation in resource utilization (RU), based on research to support the development of an expanded Medicare dialysis prospective payment system. RESEARCH DESIGN: Mixed models were used to estimate the variation in RU across institutional providers, physicians, patients, and months (within patients), after adjusting for case-mix. SUBJECTS: The study includes 10,367 Medicare hemodialysis patients treated in a 4.2% stratified random sample of dialysis facilities in 2003. MEASURES: Monthly RU was measured by the average Medicare allowable charge per dialysis session for separately billable dialysis-related services (mainly injectable medications and laboratory tests) from Medicare claims. RESULTS: There was financially significant variation in RU across institutional providers and to a lesser degree across physicians, after adjusting for differences in case-mix. The remaining variation in RU reflects unexplained differences across patients that persist over time and transitory fluctuations for individual patients. CONCLUSIONS: The greater variation in RU occurring across dialysis facilities than across physicians is consistent with targeting payments to facilities, but alignment of incentives between facilities and physicians remains an important goal. Similar analytic methods may be useful in designing payment policies that reward providers for improving the quality of care.


Subject(s)
Ambulatory Care Facilities/economics, Medicare Part B/standards, Physician Incentive Plans, Quality Assurance, Health Care/economics, Reimbursement, Incentive, Renal Dialysis/economics, Adolescent, Adult, Aged, Aged, 80 and over, Ambulatory Care Facilities/standards, Diagnosis-Related Groups, Health Resources/statistics & numerical data, Humans, Medicare Part B/economics, Middle Aged, Models, Econometric, Prospective Payment System, Renal Dialysis/standards, Risk Adjustment, United States
15.
Liver Transpl ; 13(12): 1678-83, 2007 Dec.
Article in English | MEDLINE | ID: mdl-18044787

ABSTRACT

Obese patients are at higher risk for morbidity and mortality after liver transplantation (LT) than nonobese recipients. However, there are no reports assessing the survival benefit of LT according to recipient body mass index (BMI). A retrospective cohort of liver transplant candidates who were initially wait-listed between September 2001 and December 2004 was identified in the Scientific Registry of Transplant Recipients database. Adjusted Cox regression models were fitted to assess the association between BMI and liver transplant survival benefit (posttransplantation vs. waiting list mortality). During the study period, 25,647 patients were placed on the waiting list. Of these, 4,488 (17%) underwent LT by December 31, 2004. At wait-listing and transplantation, similar proportions were morbidly obese (BMI ≥40; 3.8% vs. 3.4%, respectively) and underweight (BMI <20; 4.5% vs. 4.0%, respectively). Underweight patients experienced a significantly higher covariate-adjusted risk of death on the waiting list (hazard ratio [HR]=1.61; P<0.0001) compared to normal weight candidates (BMI 20 to <25), but underweight recipients had a similar risk of posttransplantation death (HR=1.28; P=0.15) compared to recipients of normal weight. In conclusion, compared to patients on the waiting list with a similar BMI, all subgroups of liver transplant recipients demonstrated a significant (P<0.0001) survival benefit, including morbidly obese and underweight recipients. Our results suggest that high or low recipient BMI should not be a contraindication for LT.


Subject(s)
Body Mass Index, Liver Failure/mortality, Liver Transplantation, Obesity/complications, Thinness/complications, Adult, Female, Follow-Up Studies, Humans, Liver Failure/complications, Liver Failure/physiopathology, Liver Failure/surgery, Male, Middle Aged, Obesity/mortality, Obesity/physiopathology, Obesity/surgery, Proportional Hazards Models, Retrospective Studies, Risk Assessment, Thinness/mortality, Thinness/physiopathology, Thinness/surgery, Time Factors, Treatment Outcome, Waiting Lists
16.
Transplantation ; 83(8): 1069-74, 2007 Apr 27.
Article in English | MEDLINE | ID: mdl-17452897

ABSTRACT

BACKGROUND: Elderly patients (ages 70 yr and older) are among the fastest-growing group starting renal-replacement therapy in the United States. The outcomes of elderly patients who receive a kidney transplant have not been well studied compared with those of their peers on the waiting list. METHODS: Using the Scientific Registry of Transplant Recipients, we analyzed data from 5667 elderly renal transplant candidates who initially were wait-listed from January 1, 1990 to December 31, 2004. Of these candidates, 2078 received a deceased donor transplant, and 360 received a living donor transplant by December 31, 2005. Time-to-death was studied using Cox regression models with transplant as a time-dependent covariate. Mortality hazard ratios (RRs) of transplant versus waiting list were adjusted for recipient age, sex, race, ethnicity, blood type, panel reactive antibody, year of placement on the waiting list, dialysis modality, comorbidities, donation service area, and time from first dialysis to first placement on the waiting list. RESULTS: Elderly transplant recipients had a 41% lower overall risk of death compared with wait-listed candidates (RR=0.59; P<0.0001). Recipients of nonstandard, that is, expanded criteria donor, kidneys also had a significantly lower mortality risk (RR=0.75; P<0.0001). Elderly patients with diabetes and those with hypertension as a cause of end-stage renal disease also experienced a large benefit. CONCLUSIONS: Transplantation offers a significant reduction in mortality compared with dialysis in the wait-listed elderly population with end-stage renal disease.
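Treating transplant as a time-dependent covariate in a Cox model, as this study does, is typically implemented by splitting each candidate's follow-up at the transplant date into start-stop (counting-process) intervals. A minimal sketch, with illustrative field names and times in days from wait-listing:

```python
def split_at_transplant(followup_time, event, tx_time=None):
    """Split one candidate's follow-up into (start, stop, transplant, event)
    rows, the counting-process format used to code transplant as a
    time-dependent covariate in a Cox model. Field names are illustrative.
    """
    if tx_time is None or tx_time >= followup_time:
        # Never transplanted during follow-up: one waiting-list interval.
        return [(0.0, followup_time, 0, event)]
    # Waiting-list interval (event-free), then post-transplant interval.
    return [(0.0, tx_time, 0, 0), (tx_time, followup_time, 1, event)]
```

Each row then enters the partial likelihood with its own covariate value, so a subject contributes to the "dialysis" risk set before transplant and to the "transplant" risk set afterward.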


Subject(s)
Kidney Transplantation/statistics & numerical data , Patients/statistics & numerical data , Registries , Age Distribution , Aged , Female , Graft Rejection/mortality , Graft Rejection/pathology , Graft Survival , Humans , Kidney Transplantation/adverse effects , Male , Risk Factors , Survival Rate , Time Factors
17.
Stat Med ; 26(1): 139-55, 2007 Jan 15.
Article in English | MEDLINE | ID: mdl-16526006

ABSTRACT

In this paper, we propose a model for medical costs recorded at regular time intervals, e.g. every month, as repeated measures in the presence of a terminating event, such as death. Prior models have related monthly medical costs to time since entry, with extra costs at the final observations at the time of death. Our joint model for monthly medical costs and survival time incorporates two important new features. First, medical cost and survival may be correlated because more 'frail' patients tend to accumulate medical costs faster and die earlier. A joint random effects model is proposed to account for the correlation between medical costs and survival by a shared random effect. Second, monthly medical costs usually increase during the time period prior to death because of the intensive care for dying patients. We present a method for estimating the pattern of cost prior to death, which is applicable if the pattern can be characterized as an additive effect that is limited to a fixed time interval, say b units of time before death. This 'turn back time' method for censored observations censors cost data b units of time before the actual censoring time, while keeping the actual censoring time for the survival data. Time-dependent covariates can be included. Maximum likelihood estimation and inference are carried out through a Monte Carlo EM algorithm with a Metropolis-Hastings sampler in the E-step. An analysis of monthly outpatient EPO medical cost data for dialysis patients is presented to illustrate the proposed methods.
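The "turn back time" device described above censors the cost series b units before the actual censoring time while leaving the survival data untouched, so that the additive pre-death cost pattern (confined to the b months before death) cannot contaminate the censored records. A minimal sketch, with illustrative names and monthly time units:

```python
def turn_back_time(cost_by_month, censor_month, died, b):
    """Apply the 'turn back time' device: a censored subject's cost series
    is truncated b months before the censoring time, while the survival
    censoring time itself is kept unchanged. Observed deaths keep their
    full cost history. Names and units are illustrative.
    """
    if died:
        return cost_by_month[:censor_month], censor_month
    kept = max(censor_month - b, 0)
    return cost_by_month[:kept], censor_month  # survival time unchanged
```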


Subject(s)
Health Care Costs/statistics & numerical data , Models, Statistical , Mortality , Algorithms , Biometry , Cost of Illness , Drug Costs/statistics & numerical data , Epoetin Alfa , Erythropoietin/economics , Humans , Likelihood Functions , Monte Carlo Method , Proportional Hazards Models , Recombinant Proteins , Renal Dialysis/economics , Survival Analysis
18.
Biometrics ; 62(3): 910-7, 2006 Sep.
Article in English | MEDLINE | ID: mdl-16984335

ABSTRACT

Survival analysis is often used to compare experimental and conventional treatments. In observational studies, the therapy may change during follow-up, and such crossovers can be summarized by time-dependent covariates. Given the ever-increasing donor organ shortage, higher-risk kidneys from expanded criteria donors (ECD) are being transplanted. Transplant candidates can choose whether to accept an ECD organ (experimental therapy), or to remain on dialysis and wait for a possible non-ECD transplant later (conventional therapy). A three-group time-dependent analysis of such data involves estimating parameters corresponding to two time-dependent indicator covariates representing ECD transplant and non-ECD transplant, each compared to remaining on dialysis on the waitlist. However, the ECD hazard ratio estimated by this time-dependent analysis fails to account for the fact that patients who forego an ECD transplant are not destined to remain on dialysis forever, but could subsequently receive a non-ECD transplant. We propose a novel method of estimating the survival benefit of ECD transplantation relative to conventional therapy (waitlist with possible subsequent non-ECD transplant). Compared to the time-dependent analysis, the proposed method more accurately characterizes the data structure and yields a more direct estimate of the relative outcome with an ECD transplant.
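The three-group coding described above uses two time-dependent indicators, ECD transplant and non-ECD transplant, with remaining on dialysis as the reference state. A minimal sketch of evaluating that indicator pair at a given time, with illustrative names (at most one transplant time would be set per patient):

```python
def transplant_indicators(t, ecd_tx_time=None, non_ecd_tx_time=None):
    """Return the (ECD, non-ECD) time-dependent indicator pair at time t,
    with remaining on dialysis on the waitlist as the reference state
    (0, 0). Argument names are illustrative.
    """
    ecd = int(ecd_tx_time is not None and t >= ecd_tx_time)
    non_ecd = int(non_ecd_tx_time is not None and t >= non_ecd_tx_time)
    return ecd, non_ecd
```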


Subject(s)
Biometry/methods , Cross-Over Studies , Data Interpretation, Statistical , Humans , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/surgery , Kidney Failure, Chronic/therapy , Kidney Transplantation , Proportional Hazards Models , Renal Replacement Therapy , Survival Analysis , Time Factors
19.
Am J Kidney Dis ; 47(4): 666-71, 2006 Apr.
Article in English | MEDLINE | ID: mdl-16564944

ABSTRACT

In April 2005, Medicare began adjusting payments to dialysis providers for composite-rate services for a limited set of patient characteristics, including age, body surface area, and low body mass index. We present analyses intended to help the end-stage renal disease community understand the empirical reasons behind the new composite-rate basic case-mix adjustment. The U-shaped relationship between age and composite-rate cost that is reflected in the basic case-mix adjustment has generated significant discussion within the end-stage renal disease community. Whereas greater costs among older patients are consistent with conventional wisdom, greater costs among younger patients are caused in part by more skipped sessions and a greater incidence of certain costly comorbidities. Longer treatment times for patients with a greater body surface area, combined with the largely fixed cost structure of dialysis facilities, explain much of the greater cost for larger patients. The basic case-mix adjustment reflects an initial and partial adjustment for the cost of providing composite-rate services.


Subject(s)
Kidney Failure, Chronic/economics , Kidney Failure, Chronic/therapy , Renal Dialysis/economics , Risk Adjustment , Adult , Aged , Aged, 80 and over , Costs and Cost Analysis , Female , Humans , Male , Medicare , Middle Aged , United States
20.
JAMA ; 294(21): 2726-33, 2005 Dec 07.
Article in English | MEDLINE | ID: mdl-16333008

ABSTRACT

CONTEXT: Transplantation using kidneys from deceased donors who meet the expanded criteria donor (ECD) definition (age ≥60 years, or 50 to 59 years with at least 2 of the following: history of hypertension, serum creatinine level >1.5 mg/dL [132.6 micromol/L], and cerebrovascular cause of death) is associated with 70% higher risk of graft failure compared with non-ECD transplants. However, if ECD transplants offer improved overall patient survival, inferior graft outcome may represent an acceptable trade-off. OBJECTIVE: To compare mortality after ECD kidney transplantation vs that in a combined standard-therapy group of non-ECD recipients and those still receiving dialysis. DESIGN, SETTING, AND PATIENTS: Retrospective cohort study using data from a US national registry of mortality and graft outcomes among kidney transplant candidates and recipients. The cohort included 109,127 patients receiving dialysis and added to the kidney waiting list between January 1, 1995, and December 31, 2002, and followed up through July 31, 2004. MAIN OUTCOME MEASURE: Long-term (3-year) relative risk of mortality for ECD kidney recipients vs those receiving standard therapy, estimated using time-dependent Cox regression models. RESULTS: By end of follow-up, 7790 ECD kidney transplants were performed. Because of excess ECD recipient mortality in the perioperative period, cumulative survival did not equal that of standard-therapy patients until 3.5 years posttransplantation. Long-term relative mortality risk was 17% lower for ECD recipients (relative risk, 0.83; 95% confidence interval, 0.77-0.90; P<.001). Subgroups with significant ECD survival benefit included patients older than 40 years, both sexes, non-Hispanics, all races, unsensitized patients, and those with diabetes or hypertension.
In organ procurement organizations (OPOs) with long median waiting times (>1350 days), ECD recipients had a 27% lower risk of death (relative risk, 0.73; 95% confidence interval, 0.64-0.83; P<.001). In areas with shorter waiting times, only recipients with diabetes demonstrated an ECD survival benefit. CONCLUSIONS: ECD kidney transplants should be offered principally to candidates older than 40 years in OPOs with long waiting times. In OPOs with shorter waiting times, in which non-ECD kidney transplant availability is higher, candidates should be counseled that ECD survival benefit is observed only for patients with diabetes.
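The ECD definition quoted in this abstract is itself a simple decision rule, which can be sketched directly. Parameter names are illustrative; the cutoffs are the ones stated in the abstract.

```python
def is_ecd(age, hypertension, creatinine_mg_dl, cerebrovascular_death):
    """Expanded criteria donor (ECD) per the definition in the abstract:
    age >= 60 years, or age 50-59 with at least two of: history of
    hypertension, serum creatinine > 1.5 mg/dL, and cerebrovascular
    cause of death. Parameter names are illustrative.
    """
    if age >= 60:
        return True
    if 50 <= age <= 59:
        risk_count = sum([bool(hypertension),
                          creatinine_mg_dl > 1.5,
                          bool(cerebrovascular_death)])
        return risk_count >= 2
    return False
```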


Subject(s)
Donor Selection/standards , Kidney Transplantation/mortality , Adolescent , Adult , Aged , Algorithms , Child , Child, Preschool , Cohort Studies , Female , Humans , Infant , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Male , Middle Aged , Proportional Hazards Models , Renal Dialysis , Retrospective Studies , Survival Analysis , Waiting Lists