Results 1-20 of 285
1.
Clin Transplant; 38(8): e15435, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39158946

ABSTRACT

BACKGROUND: Delayed graft function (DGF) after kidney transplantation is associated with adverse patient and allograft outcomes. A longer duration of DGF is predictive of worse graft outcomes than a shorter duration. Posttransplant serum β2-microglobulin (B2M) is associated with long-term graft outcomes, but its relationship with DGF recovery is unknown. METHODS: We included all kidney-only transplant recipients with DGF enrolled in the E-DGF trial. Duration of DGF was defined as the interval between the transplant and the last dialysis session. We analyzed the association of standardized serum creatinine (Scr) and B2M on postoperative days (POD) 1-7 and during the subsequent days of DGF with the recovery of DGF. RESULTS: A total of 97 recipients with DGF were included. The mean duration of DGF was 11.0 ± 11.2 days. Higher Scr was not associated with the duration of DGF in unadjusted or adjusted models. Higher standardized B2M, in contrast, was associated with a prolonged duration of DGF. This association remained in models adjusting for baseline characteristics from POD 2 (3.19 days longer; 95% CI: 0.46-5.93; p = 0.02) through Day 6 of DGF (4.97 days longer; 95% CI: 0.75-9.20; p = 0.02). Among recipients with DGF, there was minimal change in mean Scr (0.01 ± 0.10 mg/dL per day; p = 0.32), while B2M decreased significantly as the time to recovery approached (-0.14 ± 0.05 mg/L per day; p = 0.006). CONCLUSION: B2M is more strongly associated with DGF recovery than Scr. Posttransplant B2M may be an important biomarker to monitor during DGF. TRIAL REGISTRATION: ClinicalTrials.gov identifier: NCT03864926.


Subject(s)
Biomarkers , Delayed Graft Function , Glomerular Filtration Rate , Graft Survival , Kidney Transplantation , beta 2-Microglobulin , Humans , Kidney Transplantation/adverse effects , Delayed Graft Function/blood , Delayed Graft Function/etiology , Female , Male , beta 2-Microglobulin/blood , Middle Aged , Prognosis , Biomarkers/blood , Follow-Up Studies , Adult , Risk Factors , Graft Rejection/etiology , Graft Rejection/blood , Graft Rejection/diagnosis , Kidney Failure, Chronic/surgery , Kidney Failure, Chronic/blood , Recovery of Function , Kidney Function Tests , Postoperative Complications/blood , Time Factors , Transplant Recipients/statistics & numerical data
2.
Article in English | MEDLINE | ID: mdl-39187461

ABSTRACT

BACKGROUND: Hydroxychloroquine (HCQ) nonadherence is associated with a 3-fold higher risk of lupus-related hospitalization. Monitoring HCQ blood levels could improve adherence and efficacy. Yet HCQ level monitoring is not routinely done, partly because of cost and coverage concerns. To establish the cost-effectiveness of HCQ level monitoring, we report: 1) the risk of acute care utilization by HCQ blood level; and 2) the cost of HCQ monitoring vs. acute care visits. METHODS: HCQ blood levels were measured during routine lupus visits. HCQ levels were categorized as: a) subtherapeutic (<750 ng/mL), b) therapeutic (750-1200 ng/mL), or c) supratherapeutic (>1200 ng/mL). All lupus-related acute care visits (ER visits/hospitalizations) after the index clinic visit and before the next follow-up were abstracted. In our primary analysis, we examined associations between HCQ levels and time to first acute care visit in all patients and in subgroups with higher acute care utilization. RESULTS: A total of 39 lupus-related acute care visits were observed in 181 patients. Therapeutic HCQ blood levels were associated with 66% lower acute care utilization. In our cohort, two groups, people of Black race or Hispanic ethnicity and those with public insurance, faced 3- to 4-fold higher acute care utilization. Levels within 750-1200 ng/mL were associated with 95% lower acute care utilization in subgroups with higher acute care utilization. CONCLUSION: HCQ blood levels within 750-1200 ng/mL are associated with lower acute care utilization in all patients with lupus, including groups with higher acute care utilization. Future clinical trials should establish the causal association between HCQ level monitoring and acute care utilization in lupus.

3.
Clin Transplant; 38(6): e15368, 2024 Jun.
Article in English | MEDLINE | ID: mdl-39031705

ABSTRACT

Describing risk factors and outcomes in kidney transplant recipients with oxalate nephropathy (ON) may help elucidate the pathogenesis and guide treatment strategies. We used a large single-center database to identify patients with ON and categorized them into delayed graft function with ON (DGF-ON) and late ON. Incidence density sampling was used to select controls. A total of 37 ON cases were diagnosed between 1/2011 and 1/2021. DGF-ON (n = 13) was diagnosed in 1.05% of the DGF population. Pancreatic atrophy on imaging (36.4% vs. 2.9%; p = 0.002) and a history of gastric bypass (7.7% vs. 0%; p = 0.06) were more common in DGF-ON than in controls with DGF who required biopsy but had no evidence of ON. DGF-ON was not associated with worse graft survival (p = 0.98) or death-censored graft survival (p = 0.48). Late ON (n = 24) was diagnosed after a mean of 78.2 months. Late ON patients were older (mean age 55.1 vs. 48.4 years; p = 0.02) and more likely to be women (61.7% vs. 37.5%; p = 0.03), to have a history of gastric bypass (8.3% vs. 0.8%; p = 0.02), and to have pancreatic atrophy on imaging (38.9% vs. 13.3%; p = 0.02). Late ON was associated with an increased risk of graft failure (HR 2.0; p = 0.07) and death-censored graft loss (HR 2.5; p = 0.10). We describe two phenotypes of ON after kidney transplantation: DGF-ON and late ON. Our study is, to our knowledge, the first to evaluate DGF-ON against DGF controls without ON. Although limited by small sample size, DGF-ON was not associated with adverse outcomes when compared with controls. Late ON predicted worse allograft outcomes.


Subject(s)
Graft Survival , Kidney Transplantation , Phenotype , Postoperative Complications , Humans , Kidney Transplantation/adverse effects , Female , Male , Middle Aged , Risk Factors , Prognosis , Follow-Up Studies , Postoperative Complications/diagnosis , Postoperative Complications/etiology , Glomerular Filtration Rate , Delayed Graft Function/etiology , Retrospective Studies , Oxalates/metabolism , Kidney Function Tests , Kidney Diseases/etiology , Kidney Diseases/surgery , Kidney Failure, Chronic/surgery , Adult , Case-Control Studies , Graft Rejection/etiology , Graft Rejection/diagnosis , Graft Rejection/pathology , Survival Rate
4.
Am J Kidney Dis; 2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38876272

ABSTRACT

RATIONALE & OBJECTIVE: Exposure to extreme heat events has been linked to increased morbidity and mortality in the general population. Patients receiving maintenance dialysis may be vulnerable to greater risks from these events, but this is not well understood. We characterized the association between extreme heat events and the risk of death among patients receiving dialysis in the United States. STUDY DESIGN: Retrospective cohort study. SETTING & PARTICIPANTS: Data from the US Renal Data System were used to identify adults living in US urban settlements prone to extreme heat who initiated maintenance dialysis between 1997 and 2016. EXPOSURE: An extreme heat event, defined as a time-updated heat index (a humid-heat metric) exceeding 40.6°C for ≥2 days or 46.1°C for ≥1 day. OUTCOME: Death. ANALYTICAL APPROACH: Cox proportional hazards regression to estimate the elevation in risk of death during a humid-heat event, adjusted for age, sex, year of dialysis initiation, dialysis modality, poverty level, and climate region. Interactions between humid-heat and these same factors were explored. RESULTS: Among 945,251 adults in 245 urban settlements, the mean age was 63 years, and 44% were female. During a median follow-up of 3.6 years, 498,049 adults were exposed to at least 1 of 7,154 extreme humid-heat events, and 500,025 deaths occurred. In adjusted models, there was an increased risk of death (hazard ratio 1.18 [95% CI, 1.15-1.20]) during extreme humid-heat exposure. The relative mortality risk was higher among patients living in the Southeast (P<0.001) compared with the Southwest. LIMITATIONS: Possibility of exposure misclassification; land use and air pollution co-exposures were not accounted for. CONCLUSIONS: This study suggests that patients receiving dialysis face an increased risk of death during extreme humid-heat exposure.
PLAIN-LANGUAGE SUMMARY: Patients who receive dialysis are vulnerable to extreme weather events, and rising global temperatures may bring more frequent extreme heat events. We sought to determine whether extreme heat exposure was associated with an increased risk of death in urban-dwelling patients receiving dialysis across the United States. We found that people receiving dialysis were more likely to die during extreme humid-heat events, defined by a heat index exceeding 40.6°C (105°F) for ≥2 days or 46.1°C (115°F) for ≥1 day. These findings inform the nephrology community about the potential importance of protecting patients receiving maintenance dialysis from the risks associated with extreme heat.
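The exposure definition in this study is mechanical enough to sketch in code. The following toy function (not the authors' pipeline; the function name and data layout are invented) flags the days of a daily heat-index series that fall inside an extreme humid-heat event under that definition: heat index above 40.6°C on at least 2 consecutive days, or above 46.1°C on any single day.

```python
# Toy sketch of the paper's extreme humid-heat event definition:
# heat index > 40.6 C for >= 2 consecutive days, or > 46.1 C for >= 1 day.
# Not the study's actual code; input is a list of daily heat-index values.

def extreme_heat_days(heat_index):
    """Return the set of day indices that fall inside an extreme event."""
    flagged = set()
    run = []  # consecutive days exceeding the lower threshold
    for day, hi in enumerate(heat_index):
        if hi > 46.1:            # a single very hot day qualifies on its own
            flagged.add(day)
        if hi > 40.6:
            run.append(day)
        else:
            if len(run) >= 2:    # a sustained run above 40.6 C qualifies
                flagged.update(run)
            run = []
    if len(run) >= 2:            # run reaching the end of the series
        flagged.update(run)
    return flagged
```

For example, a series with two consecutive days at 41-42°C and one isolated day at 47°C yields three flagged exposure days, while a lone 41°C day yields none.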

5.
Arthritis Care Res (Hoboken); 76(9): 1232-1245, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38693617

ABSTRACT

OBJECTIVE: Social determinants of health (SDoH) likely contribute to outcome disparities in lupus nephritis (LN). Understanding the overall burden and the contribution of each domain could guide future health equity-focused interventions to improve outcomes and reduce disparities in LN. The objectives of this meta-analysis were to 1) determine the association of overall SDoH and specific SDoH domains with LN outcomes and 2) develop a framework for the multidimensional impact of SDoH on LN outcomes. METHODS: We performed a comprehensive search of studies measuring associations between SDoH and LN outcomes. We examined pooled odds of poor LN outcomes, including death, end-stage kidney disease, or cardiovascular disease, in patients with and without adverse SDoH. Additionally, we calculated pooled odds ratios of outcomes by four SDoH domains: individual (eg, insurance), health care (eg, fragmented care), community (eg, neighborhood socioeconomic status), and health behaviors (eg, smoking). RESULTS: Among 531 screened studies, 31 met inclusion criteria and 13 with raw data were included in the meta-analysis. Pooled odds of poor outcomes were 1.47-fold higher in patients with any adverse SDoH. Patients with adverse SDoH in the individual and health care domains had 1.64-fold and 1.77-fold higher odds of poor outcomes, respectively. We found a multiplicative impact of having two or more adverse SDoH on LN outcomes. Black patients with public insurance and fragmented care had 12-fold higher odds of poor LN outcomes. CONCLUSION: Adverse SDoH are associated with poor LN outcomes. Having two or more adverse SDoH, specifically in different SDoH domains, had a multiplicative impact leading to worse LN outcomes, widening disparities.


Subject(s)
Lupus Nephritis , Social Determinants of Health , Humans , Health Status Disparities , Healthcare Disparities , Lupus Nephritis/therapy , Risk Factors
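As background on the pooling step such a meta-analysis relies on, here is a minimal fixed-effect, inverse-variance sketch for combining study-level odds ratios on the log scale. The inputs in the example are invented and are not the studies analyzed above.

```python
# Hedged illustration of fixed-effect inverse-variance pooling of odds
# ratios, as used in standard meta-analysis. Inputs are invented.
import math

def pooled_odds_ratio(studies):
    """studies: list of (odds_ratio, standard_error_of_log_odds_ratio).
    Pools on the log scale with inverse-variance weights."""
    weights = [1.0 / se ** 2 for _, se in studies]        # w_i = 1 / SE_i^2
    log_ors = [math.log(or_) for or_, _ in studies]
    pooled_log = sum(w * l for w, l in zip(weights, log_ors)) / sum(weights)
    return math.exp(pooled_log)
```

Pooling two made-up studies with OR 2.0 and OR 0.5 and equal standard errors returns a pooled OR of 1.0, since their log odds ratios cancel.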
6.
Am J Nephrol; 1-10, 2024 May 16.
Article in English | MEDLINE | ID: mdl-38754385

ABSTRACT

INTRODUCTION: The Centers for Medicare & Medicaid Services introduced an End-Stage Renal Disease Prospective Payment System (PPS) in 2011 to increase the utilization of home dialysis modalities, including peritoneal dialysis (PD). Several studies have shown a significant increase in PD utilization after PPS implementation. However, its impact on patients with kidney allograft failure remains unknown. METHODS: We conducted an interrupted time series analysis using data from the US Renal Data System (USRDS), including all adult kidney transplant recipients with allograft failure who started dialysis between 2005 and 2019. We compared PD utilization in the pre-PPS period (2005-2010) to the fully implemented post-PPS period (2014-2019) for early (within 90 days) and late (91-365 days) PD experience. RESULTS: A total of 27,507 adult recipients with allograft failure started dialysis during the study period. There was no difference in early PD utilization between the pre-PPS and post-PPS periods in either immediate change (0.3% increase; 95% CI: -1.95%, 2.54%; p = 0.79) or rate of change over time (0.28% increase per year; 95% CI: -0.16%, 0.72%; p = 0.18). Subgroup analyses revealed a trend toward higher PD utilization post-PPS in for-profit and large-volume dialysis units. There was a significant increase in PD utilization in the post-PPS period in units with low PD experience in the pre-PPS period. Similar findings were seen for the late PD experience. CONCLUSION: PPS did not significantly increase the overall utilization of PD in patients initiating dialysis after allograft failure.
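An interrupted time series of this kind is typically analyzed with segmented regression: a pre-period slope plus a level change and a slope change at the policy break. The sketch below fits that model by ordinary least squares via the normal equations, using noise-free toy data rather than USRDS data; it is illustrative, not the study's model.

```python
# Minimal segmented-regression sketch for an interrupted time series:
# y ~ b0 + b1*t + b2*post + b3*(t - break)*post. Pure stdlib; toy data.

def its_fit(y, break_idx):
    """Return [intercept, pre-slope, level change, slope change]."""
    X = [[1.0, float(t), float(t >= break_idx),
          float((t - break_idx) * (t >= break_idx))]
         for t in range(len(y))]
    k = 4
    # Normal equations: (X'X) b = X'y
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for j in range(c, k):
                A[r][j] -= f * A[c][j]
            b[r] -= f * b[c]
    coef = [0.0] * k
    for r in range(k - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][j] * coef[j] for j in range(r + 1, k))) / A[r][r]
    return coef
```

With exactly generated data (intercept 10, slope 1, level jump 5, slope change 2 at t = 5), the fit recovers those four coefficients.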

7.
Lupus; 33(8): 804-815, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38631342

ABSTRACT

OBJECTIVE: In systemic lupus erythematosus, poor disease outcomes occur in young adults, patients identifying as Black or Hispanic, and socioeconomically disadvantaged patients. These identities and social factors differentially shape care access and quality that contribute to lupus health disparities in the US. Thus, our objective was to measure markers of care access and quality, including rheumatology visits (longitudinal care retention) and lupus-specific serology testing, by race and ethnicity, neighborhood disadvantage, and geographic context. METHODS: This cohort study used a geo-linked 20% national sample of young adult Medicare beneficiaries (ages 18-35) with lupus-coded encounters and a 1-year assessment period. Retention in lupus care required a rheumatology visit in each 6-month period, and serology testing required ≥1 complement or dsDNA antibody test within the year. Multivariable logistic regression models were fit for visit-based retention and serology testing to determine associations with race and ethnicity, neighborhood disadvantage, and geography. RESULTS: Among 1,036 young adults with lupus, 39% saw a rheumatologist every 6 months and 28% had serology testing. White beneficiaries from the least disadvantaged quintile of neighborhoods had higher visit-based retention than other beneficiaries (64% vs 30%-60%). Serology testing decreased with increasing neighborhood disadvantage quintile (aOR 0.80; 95% CI 0.71, 0.90) and in the Midwest (aOR 0.46; 0.30, 0.71). CONCLUSION: Disparities in care, measured by rheumatology visits and serology testing, exist by neighborhood disadvantage, race and ethnicity, and region among young adults with lupus, despite uniform Medicare coverage. Findings support evaluating lupus care quality measures and their impact on US lupus outcomes.


Subject(s)
Healthcare Disparities , Lupus Erythematosus, Systemic , Medicare , Rheumatology , Adolescent , Adult , Female , Humans , Male , Young Adult , Black or African American/statistics & numerical data , Cohort Studies , Health Services Accessibility/statistics & numerical data , Healthcare Disparities/statistics & numerical data , Logistic Models , Lupus Erythematosus, Systemic/therapy , Retention in Care/statistics & numerical data , United States , Hispanic or Latino , White
8.
Transplant Direct; 10(4): e1607, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38464426

ABSTRACT

Background: Posttransplant erythrocytosis (PTE) is a well-known complication of kidney transplantation. However, the risk and outcomes of PTE among simultaneous pancreas-kidney transplant (SPKT) recipients are poorly described. Methods: We analyzed all SPKT recipients at our center between 1998 and 2021. PTE was defined as at least 2 consecutive hematocrit levels of >51% within the first 2 y of transplant. Controls were selected at a ratio of 3:1 at the time of PTE occurrence using event density sampling. Risk factors for PTE and post-PTE graft survival were identified. Results: Of 887 SPKT recipients, 108 (12%) developed PTE at a median of 273 d (interquartile range, 160-393) after transplantation. The incidence rate of PTE was 7.5 per 100 person-years. Multivariate analysis found pretransplant dialysis (hazard ratio [HR]: 3.15; 95% confidence interval [CI], 1.67-5.92; P < 0.001), non-White donor (HR: 2.14; 95% CI, 1.25-3.66; P = 0.01), female donor (HR: 1.50; 95% CI, 1.0-2.26; P = 0.05), and male recipient (HR: 2.33; 95% CI, 1.43-3.70; P = 0.001) to be associated with increased risk. The 108 cases of PTE were compared with 324 controls. PTE was not associated with subsequent pancreas graft failure (HR: 1.36; 95% CI, 0.51-3.68; P = 0.53) or kidney graft failure (HR: 1.16; 95% CI, 0.40-3.42; P = 0.78). Conclusions: PTE is a common complication among SPKT recipients, even in the modern era of immunosuppression. PTE among SPKT recipients was not associated with adverse graft outcomes, likely due to appropriate management.
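The control-selection step this abstract describes, event density (incidence-density) sampling, can be sketched briefly: for each case, controls are drawn at random from subjects still event-free and under follow-up at the case's event time (subjects who become cases later remain eligible). The function and data layout below are invented for illustration.

```python
# Sketch of incidence-density (event density) sampling of controls,
# here at the abstract's 3:1 ratio. Cohort data are hypothetical.
import random

def incidence_density_controls(case_id, cohort, k=3, rng=None):
    """cohort: {pid: (event_time or None, censor_time)}. Controls are
    subjects still at risk (no event yet, still followed) at the case's
    event time; later cases stay eligible as controls now."""
    rng = rng or random.Random(0)
    t = cohort[case_id][0]
    risk_set = [pid for pid, (ev, cens) in cohort.items()
                if pid != case_id and cens >= t and (ev is None or ev > t)]
    return rng.sample(risk_set, min(k, len(risk_set)))
```

If the risk set at the case's event time holds exactly three eligible subjects, all three are selected regardless of the random seed.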

9.
Transplant Direct; 10(4): e1600, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38550773

ABSTRACT

Background: Recurrence of glomerulonephritis (GN) is a significant contributor to long-term allograft failure among kidney transplant recipients (KTRs) with kidney failure because of GN. Accumulating evidence has revealed the role of vitamin D in both innate and adaptive immunity. Although vitamin D deficiency is common among KTRs, the association between 25-hydroxyvitamin D (25[OH]D) and GN recurrence in KTRs remains unclear. Methods: We analyzed data from KTRs with kidney failure caused by GN who received a transplant at our center from 2000 to 2019 and had at least 1 valid posttransplant serum 25(OH)D measurement. Survival analyses were performed using a competing risk regression model considering other causes of allograft failure, including death, as competing risk events. Results: A total of 67 cases of GN recurrence were identified in 947 recipients with GN followed for a median of 7.0 y after transplant. Each 1 ng/mL lower serum 25(OH)D was associated with a 4% higher hazard of recurrence (subdistribution hazard ratio [HR]: 1.04; 95% confidence interval [CI], 1.01-1.06). Vitamin D deficiency (≤20 ng/mL) was associated with a 2.99-fold (subdistribution HR: 2.99; 95% CI, 1.56-5.73) higher hazard of recurrence compared with vitamin D sufficiency (≥30 ng/mL). Results were similar after further adjusting for concurrent urine protein-creatinine ratio, serum albumin, and estimated glomerular filtration rate (eGFR). Conclusions: Posttransplant vitamin D deficiency is associated with a higher hazard of GN recurrence in KTRs. Further prospective observational studies and clinical trials are needed to determine any causal role of vitamin D in the recurrence of GN after kidney transplantation. More in vitro and in vivo experiments would be helpful to understand its effects on autoimmune and inflammation processes.
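The competing-risk analysis this abstract uses estimates cumulative incidence rather than the naive Kaplan-Meier complement, so that deaths and other-cause failures are not treated as censoring. A toy Aalen-Johansen-style estimator illustrating the idea, with invented data and no claim to match the study's implementation:

```python
# Toy cumulative-incidence estimator for the event of interest (e.g., GN
# recurrence) in the presence of a competing event (e.g., death with a
# functioning graft). Data and horizon are invented for illustration.

def cumulative_incidence(records, horizon):
    """records: list of (time, status), status 1 = event of interest,
    2 = competing event, 0 = censored. Returns CIF at `horizon`."""
    times = sorted({t for t, s in records if s in (1, 2) and t <= horizon})
    surv = 1.0   # overall event-free survival just before current time
    cif = 0.0
    for t in times:
        at_risk = sum(1 for ti, _ in records if ti >= t)
        d1 = sum(1 for ti, s in records if ti == t and s == 1)
        d_all = sum(1 for ti, s in records if ti == t and s in (1, 2))
        cif += surv * d1 / at_risk          # mass assigned to event type 1
        surv *= 1 - d_all / at_risk         # all event types deplete survival
    return cif
```

With no competing events and no censoring the estimate reduces to 1 minus the Kaplan-Meier curve, which is a quick sanity check on the code.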

10.
Transplant Direct; 10(2): e1575, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38264296

ABSTRACT

Background: Kidney transplant outcomes have improved dramatically since the first successful transplant in 1954. In its early years, kidney transplantation was viewed more skeptically. Today it is considered the treatment of choice for patients with end-stage kidney disease. Methods: Our program performed its first kidney transplant in 1966 and recently performed our 12,000th kidney transplant. Here, we review and describe our experience with these 12,000 transplants. Transplant recipients were analyzed by decade of transplant: 1966-1975, 1976-1985, 1986-1995, 1996-2005, 2006-2015, and 2016-2022. Death-censored graft failure and mortality were the outcomes of interest. Results: Of the 12,000 kidneys, 247 were transplanted from 1966 to 1975, 1,147 from 1976 to 1985, 2,194 from 1986 to 1995, 3,147 from 1996 to 2005, 3,046 from 2006 to 2015, and 2,219 from 2016 to 2022. Compared with 1966-1975, there were statistically significant and progressively lower risks of death-censored graft failure at 1 y, 5 y, and at last follow-up in all subsequent eras. Although mortality at 1 y was lower in all eras after 1986-1995, there was no difference in mortality at 5 y or at last follow-up between eras. Conclusions: In this large cohort of 12,000 kidneys from a single center, we observed significant improvement in outcomes over time. Kidney transplantation remains a robust, ever-growing, and improving field.

11.
Arthritis Care Res (Hoboken); 76(2): 241-250, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37667434

ABSTRACT

OBJECTIVE: Recent data show that lower hydroxychloroquine (HCQ) doses are associated with a two- to six-fold higher risk of lupus flares. Thus, establishing an effective reference range of HCQ blood levels, with upper and lower bounds for efficacy, may support individualizing HCQ dosing to prevent flares. METHODS: HCQ levels in whole blood and the Systemic Lupus Erythematosus Disease Activity Index (SLEDAI) were measured during the baseline visit and again during a standard-of-care routine follow-up visit. Active lupus at baseline was defined cross-sectionally as SLEDAI ≥6; a within-subject flare was defined as a subsequent three-point increase in SLEDAI with clinical symptoms requiring a therapy change. We examined associations between active lupus and HCQ blood levels at baseline, and between flares and HCQ levels during 6- to 12-month routine lupus follow-up visits, using mixed regression analysis. RESULTS: Among 158 baseline patient visits, 19% had active lupus. Odds of active lupus were 71% lower in patients with levels within a 750 to 1,200 ng/mL range (adjusted odds ratio 0.29, 95% confidence interval 0.08-0.96). Using a convenience sampling strategy during the pandemic, we longitudinally followed 42 patients. Among those patients, 17% flared during their follow-up visit. Maintaining HCQ levels within 750 to 1,200 ng/mL reduced the odds of a flare by 26% over a nine-month median follow-up. CONCLUSION: An effective reference range of HCQ blood levels, 750 to 1,200 ng/mL, was associated with 71% lower odds of active lupus, and maintaining levels within this range reduced the odds of flares by 26%. These findings could guide clinicians to individualize HCQ doses to maintain levels within this range to maximize efficacy.


Subject(s)
Antirheumatic Agents , Lupus Erythematosus, Systemic , Humans , Hydroxychloroquine , Cross-Sectional Studies , Reference Values , Lupus Erythematosus, Systemic/diagnosis , Lupus Erythematosus, Systemic/drug therapy
12.
Clin Transplant; 38(1): e15217, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38078682

ABSTRACT

BACKGROUND: While presumably less common with modern molecular diagnostic and imaging techniques, fever of unknown origin (FUO) remains a challenge in kidney transplant recipients (KTRs). Additionally, the impact of FUO on patient and graft survival is poorly described. METHODS: A cohort of adult KTRs transplanted between January 1, 1995 and December 31, 2018 was followed at the University of Wisconsin Hospital. Patients transplanted from January 1, 1995 to December 31, 2005 were included in the "early era"; patients transplanted from January 1, 2006 to December 31, 2018 were included in the "modern era." The primary objective was to describe the epidemiology and etiology of FUO diagnoses over time. Secondary outcomes included rejection, graft survival, and patient survival. RESULTS: There were 5,590 kidney transplants at our center during the study window. FUO was identified in 323 patients, with an overall incidence rate of 0.8/100 person-years. Considering only the first 3 years after transplant, the incidence of FUO was significantly lower in the modern era than in the early era, with an incidence rate ratio (IRR) per 100 person-years of 0.48 (95% CI: 0.35-0.63; p < 0.001). A total of 102 (31.9%) of the 323 patients had an etiology determined within 90 days of FUO diagnosis: 100 were infectious and two were malignancies. In the modern era, FUO remained significantly associated with rejection (HR = 44.1; 95% CI: 16.6-102; p < 0.001) but not with graft failure (HR = 1.21; 95% CI: 0.68-2.18; p = 0.52), total graft loss (HR = 1.17; 95% CI: 0.85-1.62; p = 0.34), or death (HR = 1.17; 95% CI: 0.79-1.76; p = 0.43). CONCLUSIONS: FUO is less common in KTRs in the modern era. Our study suggests infection remains the most common etiology. FUO remains associated with a significant increase in the risk of rejection, warranting further inquiry into the management of immunosuppressive medications in solid organ transplant recipients in the setting of FUO.


Subject(s)
Fever of Unknown Origin , Kidney Transplantation , Neoplasms , Adult , Humans , Incidence , Kidney Transplantation/adverse effects , Fever of Unknown Origin/epidemiology , Fever of Unknown Origin/etiology , Fever of Unknown Origin/diagnosis
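The incidence rates and incidence rate ratio quoted in this abstract are simple functions of event counts and person-years of follow-up. A back-of-envelope sketch, with counts invented for illustration (not taken from the study):

```python
# Incidence rate per 100 person-years and an incidence rate ratio
# between two eras, as reported for FUO. Counts below are invented.

def rate_per_100py(events, person_years):
    """Incidence rate expressed per 100 person-years of follow-up."""
    return 100.0 * events / person_years

def rate_ratio(events_a, py_a, events_b, py_b):
    """Ratio of two incidence rates (era A vs. era B)."""
    return rate_per_100py(events_a, py_a) / rate_per_100py(events_b, py_b)
```

For instance, 24 events over 5,000 person-years versus 50 events over 5,000 person-years gives rates of 0.48 and 1.0 per 100 person-years, hence a rate ratio of 0.48.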
13.
Transplant Direct; 9(9): e1526, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37654682

ABSTRACT

Background: Delayed graft function (DGF) among deceased donor kidney transplant recipients (DDKTRs) is a well-known risk factor for allograft rejection, decreased graft survival, and increased cost. Although DGF is associated with an increased risk of rejection, it is unclear whether it also increases the risk of infection. Methods: We reviewed all adult DDKTRs at our center between 2010 and 2018. The primary outcomes of interest were BK viremia, cytomegalovirus viremia, pneumonia, and urinary tract infection (UTI) within the first year of transplant. An additional analysis censored follow-up at the time of allograft rejection. Results: A total of 1,512 DDKTRs were included, of whom 468 (31%) had DGF. As expected, several recipient, donor, and baseline immunological characteristics differed by DGF status. After adjustment, DGF was significantly associated with an increased risk of BK viremia (hazard ratio: 1.34; 95% confidence interval, 1.00-1.81; P = 0.049) and UTI (hazard ratio: 1.70; 95% confidence interval, 1.31-2.19; P < 0.001) but not of cytomegalovirus viremia or pneumonia. Associations were similar in models censored at the time of rejection. Conclusions: DGF is associated with an increased risk of early infectious complications, mainly UTI and BK viremia. Close monitoring and appropriate management are warranted for better outcomes in this unique population.

15.
Nat Med; 29(5): 1211-1220, 2023 May.
Article in English | MEDLINE | ID: mdl-37142762

ABSTRACT

For three decades, the international Banff classification has been the gold standard for diagnosing kidney allograft rejection, but the system has grown complex over time with the integration of multimodal data and rules, leading to misclassifications that can have deleterious therapeutic consequences for patients. To improve diagnosis, we developed a decision-support system, based on an algorithm covering all classification rules and diagnostic scenarios, that automatically assigns kidney allograft diagnoses. We then tested its ability to reclassify rejection diagnoses for adult and pediatric kidney transplant recipients in three international multicentric cohorts and two large prospective clinical trials, comprising 4,409 biopsies from 3,054 patients (62.05% male and 37.95% female) followed in 20 transplant referral centers in Europe and North America. In the adult kidney transplant population, the Banff Automation System reclassified 83 of 279 (29.75%) antibody-mediated rejection cases and 57 of 105 (54.29%) T cell-mediated rejection cases, whereas 237 of 3,239 (7.32%) biopsies diagnosed as non-rejection by pathologists were reclassified as rejection. In the pediatric population, the reclassification rates were 8 of 26 (30.77%) for antibody-mediated rejection and 12 of 39 (30.77%) for T cell-mediated rejection. Finally, we found that reclassification of the initial diagnoses by the Banff Automation System was associated with improved risk stratification of long-term allograft outcomes. This study demonstrates the potential of automated histological classification to improve transplant patient care by correcting diagnostic errors and standardizing allograft rejection diagnoses. ClinicalTrials.gov registration: NCT05306795.


Subject(s)
Kidney Transplantation , Kidney , Adult , Humans , Male , Female , Child , Prospective Studies , Kidney/pathology , Kidney Transplantation/adverse effects , Transplantation, Homologous , Allografts , Graft Rejection/diagnosis , Biopsy
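The decision-support idea above, encoding classification rules so diagnoses are assigned deterministically, can be conveyed with a deliberately tiny rule engine. Everything below (score names, thresholds, labels) is invented for illustration; these are NOT the actual Banff rules, which are far richer.

```python
# Hypothetical, toy rule engine in the spirit of a rule-based diagnostic
# decision-support system. Thresholds and labels are invented and do not
# reproduce the Banff classification.

def classify(biopsy):
    """biopsy: dict with toy scores 'g' (glomerulitis), 'ptc'
    (peritubular capillaritis), 'dsa' (donor-specific antibody, bool),
    't' (tubulitis), 'i' (interstitial inflammation)."""
    if biopsy.get("dsa") and biopsy.get("g", 0) + biopsy.get("ptc", 0) >= 2:
        return "antibody-mediated rejection (toy rule)"
    if biopsy.get("t", 0) >= 2 and biopsy.get("i", 0) >= 2:
        return "T cell-mediated rejection (toy rule)"
    return "no rejection (toy rule)"
```

The point of such a system is that the same inputs always yield the same label, which is what makes misclassification auditable and correctable at scale.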
16.
Prostate; 83(11): 1046-1059, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37154584

ABSTRACT

BACKGROUND: Cholesterol reduction is considered a mechanism through which cholesterol-lowering drugs, including statins, are associated with a reduced risk of aggressive prostate cancer. While prior cohort studies found positive associations between total cholesterol and more advanced stage and grade in White men, it is unknown whether associations for total cholesterol, low (LDL)- and high (HDL)-density lipoprotein cholesterol, apolipoprotein B (LDL particle) and A1 (HDL particle), and triglycerides are similar for fatal prostate cancer and in Black men, who experience a disproportionate burden of total and fatal prostate cancer. METHODS: We conducted a prospective study of 1,553 Black and 5,071 White cancer-free men attending visit 1 (1987-1989) of the Atherosclerosis Risk in Communities Study. A total of 885 incident prostate cancer cases were ascertained through 2015, and 128 prostate cancer deaths through 2018. We estimated multivariable-adjusted hazard ratios (HRs) of total and fatal prostate cancer per 1-standard-deviation increment and for tertiles (T1-T3) of time-updated lipid biomarkers, overall and in Black and White men. RESULTS: Greater total cholesterol (HR per 1 SD = 1.25; 95% CI = 1.00-1.58) and LDL cholesterol (HR per 1 SD = 1.26; 95% CI = 0.99-1.60) were associated with higher fatal prostate cancer risk in White men only. Apolipoprotein B was nonlinearly associated with fatal prostate cancer overall (T2 vs. T1: HR = 1.66; 95% CI = 1.05-2.64) and in Black men (HR = 3.59; 95% CI = 1.53-8.40), but not in White men (HR = 1.13; 95% CI = 0.65-1.97). Tests for interaction by race were not statistically significant. CONCLUSIONS: These findings may improve the understanding of lipid metabolism in prostate carcinogenesis by disease aggressiveness and by race, while emphasizing the importance of cholesterol control.


Subject(s)
Cholesterol , Prostatic Neoplasms , Male , Humans , Triglycerides , Cholesterol, HDL , Prospective Studies , Apolipoproteins , Prostatic Neoplasms/epidemiology , Risk Factors
17.
Stat Med; 42(13): 2101-2115, 2023 Jun 15.
Article in English | MEDLINE | ID: mdl-36938960

ABSTRACT

Joint modeling and landmark modeling are the two mainstream approaches to dynamic prediction in longitudinal studies, that is, the prediction of a clinical event using longitudinally measured predictor variables available up to the time of prediction. Understanding which approach produces more accurate predictions is an important question both for methodological researchers and for practical users. There have been few previous studies on this topic, and the majority of results seemed to favor joint modeling. However, these studies were conducted in scenarios where the data were simulated from joint models, partly due to a widely recognized methodological difficulty: whether there exists a general joint distribution of longitudinal and survival data such that the landmark models, which consist of infinitely many working regression models for survival, hold simultaneously. As a result, the landmark models always worked under misspecification, which made the comparison difficult to interpret. In this paper, we solve this problem by using a novel algorithm to generate longitudinal and survival data that satisfy the working assumptions of the landmark models. This innovation makes possible a "fair" comparison of joint modeling and landmark modeling in terms of model specification. Our simulation results demonstrate that the relative performance of the two modeling approaches depends on the data settings, and one does not always dominate the other in prediction accuracy. These findings stress the importance of methodological development for both approaches. The methodology is illustrated with a kidney transplantation dataset.


Subject(s)
Models, Statistical; Humans; Computer Simulation; Longitudinal Studies
18.
Arthritis Care Res (Hoboken) ; 75(9): 1886-1896, 2023 Sep.
Article in English | MEDLINE | ID: mdl-36752354

ABSTRACT

OBJECTIVE: Patients with systemic lupus erythematosus experience the sixth highest rate of 30-day readmissions among chronic diseases. Timely postdischarge follow-up is a marker of ambulatory care quality that can reduce readmissions in other chronic conditions. Our objective was to test the hypotheses that 1) beneficiaries from populations experiencing health disparities, including patients from disadvantaged neighborhoods, would have lower odds of completed follow-up, and 2) follow-up would predict a longer time without acute care use (readmission, observation stay, or emergency department visit) or mortality. METHODS: This observational cohort study included hospitalizations in January-November 2014 from a 20% random sample of Medicare adults. Included hospitalizations had a lupus code, discharge to home without hospice, and continuous Medicare A/B coverage for 1 year before and 1 month after hospitalization. Timely follow-up included visits with primary care or rheumatology within 30 days. Thirty-day survival outcomes were acute care use and mortality, adjusted for sociodemographic information and comorbidities. RESULTS: Over one-third (35%) of lupus hospitalizations lacked 30-day follow-up. Younger age, living in a disadvantaged neighborhood, and rurality were associated with lower odds of follow-up. Follow-up was not associated with subsequent acute care use or mortality in beneficiaries age <65 years. In contrast, follow-up was associated with a 27% higher hazard of acute care use (adjusted hazard ratio [HR] 1.27 [95% confidence interval (95% CI) 1.09-1.47]) and 65% lower mortality (adjusted HR 0.35 [95% CI 0.19-0.67]) among beneficiaries age ≥65 years. CONCLUSION: One-third of lupus hospitalizations lacked follow-up, with significant disparities for rural and disadvantaged neighborhoods. Follow-up was associated with increased acute care use but 65% lower mortality in older systemic lupus erythematosus patients. Further development of lupus-specific postdischarge strategies is needed.


Subject(s)
Aftercare; Patient Discharge; Adult; Humans; Aged; United States/epidemiology; Cohort Studies; Medicare; Hospitalization; Patient Readmission; Retrospective Studies
19.
Ann Am Thorac Soc ; 20(8): 1107-1115, 2023 Aug.
Article in English | MEDLINE | ID: mdl-36812384

ABSTRACT

Rationale: Population-based data on the epidemiology of nontuberculous mycobacterial (NTM) infections are limited, particularly with respect to variation in NTM infection among racial groups and socioeconomic strata. Wisconsin is one of a handful of states where mycobacterial disease is notifiable, allowing large, population-based analyses of the epidemiology of NTM infection in this state. Objectives: To estimate the incidence of NTM infection in Wisconsin adults, describe the geographic distribution of NTM infection across the state, identify the frequency and type of infection caused by different NTM species, and investigate associations between NTM infection and demographics and socioeconomic status. Methods: We conducted a retrospective cohort study using laboratory reports of all NTM isolates from Wisconsin residents submitted to the Wisconsin Electronic Disease Surveillance System from 2011 to 2018. For the analyses of NTM frequency, multiple reports from the same individual were enumerated as separate isolates when nonidentical, collected from different sites, or collected more than one year apart. Results: A total of 8,135 NTM isolates from 6,811 adults were analyzed. Mycobacterium avium complex accounted for 76.4% of respiratory isolates. The M. chelonae-abscessus group was the most common species group isolated from skin and soft tissue. The annual incidence of NTM infection was stable over the study period (22.1 per 100,000 to 22.4 per 100,000). The cumulative incidence of NTM infection among Black (224 per 100,000) and Asian (244 per 100,000) individuals was significantly higher than among their White counterparts (97 per 100,000). Total NTM infections were significantly more frequent (P < 0.001) in individuals from disadvantaged neighborhoods, and racial disparities in the incidence of NTM infection generally remained consistent when stratified by measures of neighborhood disadvantage. Conclusions: More than 90% of NTM infections were from respiratory sites, with the vast majority caused by M. avium complex. Rapidly growing mycobacteria predominated as skin and soft tissue pathogens and were important minor respiratory pathogens. We found a stable annual incidence of NTM infection in Wisconsin between 2011 and 2018. NTM infection occurred more frequently in non-White racial groups and in individuals experiencing social disadvantage, suggesting that NTM disease may also be more frequent in these groups.


Subject(s)
Mycobacterium Infections, Nontuberculous; Nontuberculous Mycobacteria; Adult; Humans; Wisconsin/epidemiology; Retrospective Studies; Mycobacterium Infections, Nontuberculous/epidemiology; Mycobacterium Infections, Nontuberculous/microbiology; Mycobacterium avium Complex
20.
BMC Med Res Methodol ; 23(1): 5, 2023 Jan 7.
Article in English | MEDLINE | ID: mdl-36611147

ABSTRACT

BACKGROUND: In developing prediction models for a clinical event, it is common to use static prediction modeling (SPM), a regression model that relates baseline predictors to the time to event. In many situations, the data used for training and validation come from longitudinal studies, in which predictor variables are time-varying and measured at clinical visits, but SPM does not use these postbaseline measurements. The landmark analysis (LA), previously proposed for dynamic prediction with longitudinal data, is difficult to interpret when the baseline is not a risk-changing clinical milestone, as is often the case in observational studies of chronic disease without intervention. METHODS: This paper studies the generalized landmark analysis (GLA), a statistical framework for developing prediction models from longitudinal data. The GLA includes the LA as a special case and generalizes it, with a more useful interpretation, to situations where the baseline is not a risk-changing clinical milestone. Unlike in the LA, the landmark variable in the GLA does not have to be time since baseline; it can be any time-varying prognostic variable. The GLA can also be viewed as a longitudinal generalization of localized prediction, which has been studied in the context of low-dimensional cross-sectional data. We studied the GLA using data from the Chronic Renal Insufficiency Cohort (CRIC) Study and the Wisconsin Allograft Replacement Database (WisARD) and compared the prediction performance of SPM and GLA. RESULTS: In various validation populations drawn from the longitudinal data, the GLA generally had similar or better predictive performance than SPM, with notable improvement when the validation population deviated from the baseline population. The GLA also demonstrated similar or better predictive performance than the LA, owing to its more general model specification. CONCLUSIONS: The GLA is a generalization of the LA in which the landmark variable does not have to be the time since baseline. It has a better interpretation when the baseline is not a risk-changing clinical milestone. The GLA is more adaptive to the validation population than SPM and more flexible than the LA, which may help produce more accurate predictions.
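As a rough illustration of the localized-prediction idea that the abstract says the GLA generalizes, the sketch below estimates an event probability for a new subject using only training subjects whose landmark variable lies near the new subject's value, rather than conditioning on time since baseline. The function name, the uniform window, and the toy data are assumptions for illustration, not the paper's implementation.

```python
# Localized prediction keyed on a prognostic landmark variable `z`
# (e.g., a biomarker level) instead of time since baseline.

def localized_event_rate(train, z0, bandwidth):
    """Estimate the event probability for a subject whose landmark
    variable equals z0, using only training subjects whose landmark
    variable falls within +/- bandwidth of z0 (a uniform kernel)."""
    local = [r for r in train if abs(r["z"] - z0) <= bandwidth]
    if not local:
        raise ValueError("no training subjects near z0; widen the bandwidth")
    return sum(r["event"] for r in local) / len(local)

# Toy training records: landmark-variable value and event indicator.
train = [{"z": 1.0, "event": 0}, {"z": 1.2, "event": 0},
         {"z": 2.9, "event": 1}, {"z": 3.1, "event": 1},
         {"z": 3.0, "event": 0}]
rate = localized_event_rate(train, z0=3.0, bandwidth=0.5)  # 2 events among 3 nearby subjects
```

The bandwidth controls the adaptivity-versus-stability trade-off that the abstract's comparison with SPM hints at: a narrower window tracks the validation population more closely but uses fewer training subjects per prediction.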


Subject(s)
Cross-Sectional Studies; Humans; Prognosis; Longitudinal Studies; Risk Factors