Results 1-20 of 281
1.
Am J Nephrol ; 2024 May 16.
Article in English | MEDLINE | ID: mdl-38754385

ABSTRACT

INTRODUCTION: The Center for Medicare and Medicaid Services (CMS) introduced an End Stage Renal Disease (ESRD) Prospective Payment System (PPS) in 2011 to increase the utilization of home dialysis modalities, including peritoneal dialysis (PD). Several studies have shown a significant increase in PD utilization after PPS implementation. However, its impact on patients with kidney allograft failure remains unknown. METHODS: We conducted an interrupted time series (ITS) analysis using data from the United States Renal Data System (USRDS) that included all adult kidney transplant recipients with allograft failure who started dialysis between 2005 and 2019. We compared PD utilization in the pre-PPS period (2005-2010) to the fully implemented post-PPS period (2014-2019) for early (within 90 days) and late (91-365 days) PD experience. RESULTS: 27,507 adult recipients with allograft failure started dialysis during the study period. There was no difference in early PD utilization between the pre-PPS and post-PPS periods in either immediate change (0.3% increase; 95% CI: -1.95%, 2.54%; p = 0.79) or rate of change over time (0.28% increase per year; 95% CI: -0.16%, 0.72%; p = 0.18). Subgroup analyses revealed a trend toward higher PD utilization post-PPS in for-profit and large-volume dialysis units. There was a significant increase in PD utilization in the post-PPS period in units with low PD experience in the pre-PPS period. Similar findings were seen for the late PD experience. CONCLUSION: PPS did not significantly increase the overall utilization of PD in patients initiating dialysis after allograft failure.
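The interrupted time series design described above is, at its core, a segmented regression with a level-change term and a slope-change term at the intervention date. The sketch below illustrates that design on simulated data; the effect sizes and the pure-Python OLS solver are illustrative only, not the study's actual USRDS analysis.

```python
def fit_ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination. Pure Python; fine for tiny designs."""
    n, p = len(X), len(X[0])
    A = [[sum(X[i][j] * X[i][k] for i in range(n)) for k in range(p)] for j in range(p)]
    b = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for k in range(col, p):
                A[r][k] -= f * A[col][k]
            b[r] -= f * b[col]
    coef = [0.0] * p
    for r in range(p - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][k] * coef[k] for k in range(r + 1, p))) / A[r][r]
    return coef

def its_design(n_periods, t0):
    """Segmented-regression design matrix for an interrupted time series:
    [intercept, time, post-intervention level change, time since intervention]."""
    rows = []
    for t in range(n_periods):
        post = 1 if t >= t0 else 0
        rows.append([1.0, float(t), float(post), float(post * (t - t0))])
    return rows

# Simulated utilization series with a known +3 level change and +0.2 slope
# change at period 8 -- invented numbers, noise-free for illustration.
T0 = 8
X = its_design(16, T0)
y = [10 + 0.5 * t + 3 * (t >= T0) + 0.2 * (t >= T0) * (t - T0) for t in range(16)]
coef = fit_ols(X, y)
```

On noise-free data the fit recovers the intercept, pre-trend, level change, and slope change exactly; a real analysis would add confidence intervals and account for autocorrelation.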

2.
Article in English | MEDLINE | ID: mdl-38693617

ABSTRACT

BACKGROUND: Social determinants of health (SDoH) likely contribute to outcome disparities in lupus nephritis (LN). Understanding the overall burden and contribution of each domain could guide future health-equity focused interventions to improve outcomes and reduce disparities in LN. Objectives of this meta-analysis were to: 1) determine the association of overall SDoH and specific SDoH domains with LN outcomes, and 2) develop a framework for the multidimensional impact of SDoH on LN outcomes. METHODS: We performed a comprehensive search of studies measuring associations between SDoH and LN outcomes. We examined pooled odds of poor LN outcomes, including mortality, end-stage kidney disease, or cardiovascular disease, in patients with and without adverse SDoH. Additionally, we calculated the pooled odds ratios of outcomes by four SDoH domains: individual (e.g., insurance), healthcare (e.g., fragmented care), community (e.g., neighborhood socioeconomic status), and health behaviors (e.g., smoking). RESULTS: Among 531 screened studies, 31 met inclusion criteria, and 13 studies with raw data were included in the meta-analysis. Pooled odds of poor outcomes were 1.47-fold higher in patients with any adverse SDoH. Patients with adverse SDoH in the individual and healthcare domains had 1.64-fold and 1.77-fold higher odds of poor outcomes, respectively. We found a multiplicative impact of having ≥2 adverse SDoH on LN outcomes. Black patients with public insurance and fragmented care had 12-fold higher odds of poor LN outcomes. CONCLUSION: Adverse SDoH is associated with poor LN outcomes. Having ≥2 adverse SDoH, specifically in different SDoH domains, had a multiplicative impact leading to worse LN outcomes, widening disparities.
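The pooled odds ratios reported above come from meta-analysis. A minimal inverse-variance fixed-effect pooling of 2×2 tables looks like the sketch below; the tables are made up for illustration, and a real analysis like this one would typically also report random-effects estimates and heterogeneity statistics.

```python
import math

def pooled_odds_ratio(tables):
    """Fixed-effect (inverse-variance) pooled odds ratio.
    Each table is (a, b, c, d): exposed cases, exposed non-cases,
    unexposed cases, unexposed non-cases."""
    num = den = 0.0
    for a, b, c, d in tables:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of the log OR
        num += log_or / var                   # weight = 1 / variance
        den += 1 / var
    return math.exp(num / den)

# Two hypothetical studies, each with an odds ratio of 2
pooled = pooled_odds_ratio([(20, 10, 10, 10), (40, 20, 20, 20)])
```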

3.
Lupus ; 33(8): 804-815, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38631342

ABSTRACT

OBJECTIVE: In systemic lupus erythematosus, poor disease outcomes occur in young adults, patients identifying as Black or Hispanic, and socioeconomically disadvantaged patients. These identities and social factors differentially shape care access and quality that contribute to lupus health disparities in the US. Thus, our objective was to measure markers of care access and quality, including rheumatology visits (longitudinal care retention) and lupus-specific serology testing, by race and ethnicity, neighborhood disadvantage, and geographic context. METHODS: This cohort study used a geo-linked 20% national sample of young adult Medicare beneficiaries (ages 18-35) with lupus-coded encounters and a 1-year assessment period. Retention in lupus care required a rheumatology visit in each 6-month period, and serology testing required ≥1 complement or dsDNA antibody test within the year. Multivariable logistic regression models were fit for visit-based retention and serology testing to determine associations with race and ethnicity, neighborhood disadvantage, and geography. RESULTS: Among 1,036 young adults with lupus, 39% saw a rheumatologist every 6 months and 28% had serology testing. White beneficiaries from the least disadvantaged quintile of neighborhoods had higher visit-based retention than other beneficiaries (64% vs 30%-60%). Serology testing decreased with increasing neighborhood disadvantage quintile (aOR 0.80; 95% CI 0.71, 0.90) and in the Midwest (aOR 0.46; 0.30, 0.71). CONCLUSION: Disparities in care, measured by rheumatology visits and serology testing, exist by neighborhood disadvantage, race and ethnicity, and region among young adults with lupus, despite uniform Medicare coverage. Findings support evaluating lupus care quality measures and their impact on US lupus outcomes.


Subjects
Healthcare Disparities, Systemic Lupus Erythematosus, Medicare, Rheumatology, Humans, Systemic Lupus Erythematosus/therapy, United States, Adult, Male, Female, Young Adult, Adolescent, Healthcare Disparities/statistics & numerical data, Retention in Care/statistics & numerical data, Health Services Accessibility/statistics & numerical data, Cohort Studies, Logistic Models, Black or African American/statistics & numerical data
4.
Transplant Direct ; 10(4): e1600, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38550773

ABSTRACT

Background: Recurrence of glomerulonephritis (GN) is a significant contributor to long-term allograft failure among kidney transplant recipients (KTRs) with kidney failure because of GN. Accumulating evidence has revealed the role of vitamin D in both innate and adaptive immunity. Although vitamin D deficiency is common among KTRs, the association between 25-hydroxyvitamin D (25[OH]D) and GN recurrence in KTRs remains unclear. Methods: We analyzed data from KTRs with kidney failure caused by GN who received a transplant at our center from 2000 to 2019 and had at least 1 valid posttransplant serum 25(OH)D measurement. Survival analyses were performed using a competing risk regression model considering other causes of allograft failure, including death, as competing risk events. Results: A total of 67 cases of GN recurrence were identified in 947 recipients with GN followed for a median of 7.0 y after transplant. Each 1 ng/mL lower serum 25(OH)D was associated with a 4% higher hazard of recurrence (subdistribution hazard ratio [HR]: 1.04; 95% confidence interval [CI], 1.01-1.06). Vitamin D deficiency (≤20 ng/mL) was associated with a 2.99-fold (subdistribution HR: 2.99; 95% CI, 1.56-5.73) higher hazard of recurrence compared with vitamin D sufficiency (≥30 ng/mL). Results were similar after further adjusting for concurrent urine protein-creatinine ratio, serum albumin, and estimated glomerular filtration rate (eGFR). Conclusions: Posttransplant vitamin D deficiency is associated with a higher hazard of GN recurrence in KTRs. Further prospective observational studies and clinical trials are needed to determine any causal role of vitamin D in the recurrence of GN after kidney transplantation. More in vitro and in vivo experiments would be helpful to understand its effects on autoimmune and inflammation processes.
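The survival analysis above treats death and other causes of allograft failure as competing risks, so cumulative incidence, not the Kaplan-Meier complement, is the right quantity. A bare-bones Aalen-Johansen estimator for one cause is sketched below with invented data; the study itself used a competing risk *regression* (subdistribution hazards), which this sketch does not reproduce.

```python
def cumulative_incidence(times, causes, cause):
    """Aalen-Johansen cumulative incidence function for one event type.
    times: follow-up times; causes: 0 = censored, 1, 2, ... = event types."""
    data = sorted(zip(times, causes))
    n = len(data)
    at_risk = n
    surv = 1.0   # overall Kaplan-Meier survival, all causes combined
    cif = 0.0
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        d_all = sum(1 for tt, c in data if tt == t and c != 0)
        d_cause = sum(1 for tt, c in data if tt == t and c == cause)
        cif += surv * d_cause / at_risk   # mass added at this event time
        surv *= 1 - d_all / at_risk
        while i < n and data[i][0] == t:  # drop everyone leaving at t
            at_risk -= 1
            i += 1
        curve.append((t, cif))
    return curve

# Toy data: four recipients; cause 1 = GN recurrence, cause 2 = death,
# 0 = censored. Times and causes are invented for illustration.
curve = cumulative_incidence([1, 2, 3, 4], [1, 0, 2, 1], cause=1)
```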

5.
Transplant Direct ; 10(4): e1607, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38464426

ABSTRACT

Background: Posttransplant erythrocytosis (PTE) is a well-known complication of kidney transplantation. However, the risk and outcomes of PTE among simultaneous pancreas-kidney transplant (SPKT) recipients are poorly described. Methods: We analyzed all SPKT recipients at our center between 1998 and 2021. PTE was defined as at least 2 consecutive hematocrit levels of >51% within the first 2 y of transplant. Controls were selected at a ratio of 3:1 at the time of PTE occurrence using event density sampling. Risk factors for PTE and post-PTE graft survival were identified. Results: Of 887 SPKT recipients, 108 (12%) developed PTE at a median of 273 d (interquartile range, 160-393) after transplantation. The incidence rate of PTE was 7.5 per 100 person-years. Multivariate analysis found pretransplant dialysis (hazard ratio [HR]: 3.15; 95% confidence interval [CI], 1.67-5.92; P < 0.001), non-White donor (HR: 2.14; 95% CI, 1.25-3.66; P = 0.01), female donor (HR: 1.50; 95% CI, 1.0-2.26; P = 0.05), and male recipient (HR: 2.33; 95% CI, 1.43-3.70; P = 0.001) to be associated with increased risk. The 108 cases of PTE were compared with 324 controls. PTE was not associated with subsequent pancreas graft failure (HR: 1.36; 95% CI, 0.51-3.68; P = 0.53) or kidney graft failure (HR: 1.16; 95% CI, 0.40-3.42; P = 0.78). Conclusions: PTE is a common complication among SPKT recipients, even in the modern era of immunosuppression. PTE among SPKT recipients was not associated with adverse graft outcomes, likely due to appropriate management.
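Controls in the study above were chosen by event density (risk-set) sampling: for each PTE case occurring at time t, controls are drawn from recipients still event-free and under observation at t. A minimal sketch under those assumptions, with hypothetical follow-up times:

```python
import random

def risk_set_controls(followup, case_id, ratio, seed=0):
    """Event-density (risk-set) sampling: for a case failing at time t,
    draw up to `ratio` controls from subjects still under observation at t.
    followup: subject id -> follow-up (event or censoring) time."""
    t_case = followup[case_id]
    risk_set = [i for i, t in followup.items() if i != case_id and t >= t_case]
    rng = random.Random(seed)  # seeded for reproducibility
    return rng.sample(risk_set, min(ratio, len(risk_set)))

# Hypothetical cohort: recipient id -> follow-up time in years
followup = {1: 5, 2: 10, 3: 3, 4: 8}
controls = risk_set_controls(followup, case_id=3, ratio=3)
```

Because subject 3 fails earliest, all three remaining recipients are still at risk and are eligible as controls.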

6.
Transplant Direct ; 10(2): e1575, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38264296

ABSTRACT

Background: Kidney transplant outcomes have dramatically improved since the first successful transplant in 1954. In its early years, kidney transplantation was viewed more skeptically. Today it is considered the treatment of choice among patients with end-stage kidney disease. Methods: Our program performed its first kidney transplant in 1966 and recently performed our 12 000th kidney transplant. Here, we review and describe our experience with these 12 000 transplants. Transplant recipients were analyzed by decade of date of transplant: 1966-1975, 1976-1985, 1986-1995, 1996-2005, 2006-2015, and 2016-2022. Death-censored graft failure and mortality were the outcomes of interest. Results: Of 12 000 kidneys, 247 were transplanted from 1966 to 1975, 1147 from 1976 to 1985, 2194 from 1986 to 1995, 3147 from 1996 to 2005, 3046 from 2006 to 2015, and 2219 from 2016 to 2022. Compared with 1966-1975, there were statistically significant and progressively lower risks of death-censored graft failure at 1 y, 5 y, and at last follow-up in all subsequent eras. Although mortality at 1 y was lower in all eras after 1986-1995, there was no difference in mortality at 5 y or at last follow-up between eras. Conclusions: In this large cohort of 12 000 kidneys from a single center, we observed significant improvement in outcomes over time. Kidney transplantation remains a robust, ever-growing, and improving field.

7.
Arthritis Care Res (Hoboken) ; 76(2): 241-250, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37667434

ABSTRACT

OBJECTIVE: Recent data show that lower hydroxychloroquine (HCQ) doses are associated with a two- to six-fold higher risk of lupus flares. Thus, establishing an effective reference range of HCQ blood levels with upper and lower bounds for efficacy may support individualizing HCQ dosing to prevent flares. METHODS: HCQ levels in whole blood and the Systemic Lupus Erythematosus Disease Activity Index (SLEDAI) were measured at the baseline visit and again at a standard-of-care routine follow-up visit. Active cross-sectional lupus at baseline was defined as SLEDAI ≥6; a within-subject flare was defined as a subsequent three-point increase in SLEDAI with clinical symptoms requiring a therapy change. We examined associations between active lupus and HCQ blood levels at baseline, and between flares and HCQ levels during 6- to 12-month routine lupus follow-up visits, using mixed regression analysis. RESULTS: Among 158 baseline patient visits, 19% had active lupus. Odds of active lupus were 71% lower in patients with levels within a 750 to 1,200 ng/mL range (adjusted odds ratio 0.29, 95% confidence interval 0.08-0.96). Using a convenience sampling strategy during the pandemic, we longitudinally followed 42 patients. Among those patients, 17% flared during their follow-up visit. Maintaining HCQ levels within 750 to 1,200 ng/mL reduced the odds of a flare by 26% over a nine-month median follow-up. CONCLUSION: An effective reference range of HCQ blood levels, 750 to 1,200 ng/mL, was associated with 71% lower odds of active lupus, and maintaining levels within this range reduced the odds of flares by 26%. These findings could guide clinicians to individualize HCQ doses to maintain levels within this range and maximize efficacy.


Subjects
Antirheumatic Agents, Systemic Lupus Erythematosus, Humans, Hydroxychloroquine, Cross-Sectional Studies, Reference Values, Systemic Lupus Erythematosus/diagnosis, Systemic Lupus Erythematosus/drug therapy
8.
Clin Transplant ; 38(1): e15217, 2024 01.
Article in English | MEDLINE | ID: mdl-38078682

ABSTRACT

BACKGROUND: While presumably less common with modern molecular diagnostic and imaging techniques, fever of unknown origin (FUO) remains a challenge in kidney transplant recipients (KTRs). Additionally, the impact of FUO on patient and graft survival is poorly described. METHODS: A cohort of adult KTRs between January 1, 1995 and December 31, 2018 was followed at the University of Wisconsin Hospital. Patients transplanted from January 1, 1995 to December 31, 2005 were included in the "early era"; patients transplanted from January 1, 2006 to December 31, 2018 were included in the "modern era". The primary objective was to describe the epidemiology and etiology of FUO diagnoses over time. Secondary outcomes included rejection and graft and patient survival. RESULTS: There were 5590 kidney transplants at our center during the study window. FUO was identified in 323 patients, with an overall incidence rate of .8/100 person-years. Considering only the first 3 years after transplant, the incidence of FUO was significantly lower in the modern era than in the early era, with an incidence rate ratio (IRR) per 100 person-years of .48; 95% CI: .35-.63; p < .001. A total of 102 (31.9%) of 323 patients had an etiology determined within 90 days after FUO diagnosis: 100 were infectious, and two were malignancies. In the modern era, FUO remained significantly associated with rejection (HR = 44.1; 95% CI: 16.6-102; p < .001) but not graft failure (HR = 1.21; 95% CI: .68-2.18; p = .52), total graft loss (HR = 1.17; 95% CI: .85-1.62; p = .34), or death (HR = 1.17; 95% CI: .79-1.76; p = .43). CONCLUSIONS: FUO is less common in KTRs during the modern era. Our study suggests infection remains the most common etiology. FUO remains associated with significant increases in the risk of rejection, warranting further inquiry into the management of immunosuppressive medications in solid organ transplant recipients in the setting of FUO.


Subjects
Fever of Unknown Origin, Kidney Transplantation, Neoplasms, Adult, Humans, Incidence, Kidney Transplantation/adverse effects, Fever of Unknown Origin/epidemiology, Fever of Unknown Origin/etiology, Fever of Unknown Origin/diagnosis
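The era comparison in the FUO study above is summarized as an incidence rate ratio. The IRR and a Wald confidence interval on the log scale can be computed as below; the event counts and person-time here are hypothetical, not the study's data.

```python
import math

def irr_with_ci(events1, pyears1, events2, pyears2, z=1.96):
    """Incidence rate ratio (group 1 vs group 2) with a Wald 95% CI.
    On the log scale, SE(log IRR) = sqrt(1/events1 + 1/events2)."""
    irr = (events1 / pyears1) / (events2 / pyears2)
    se = math.sqrt(1 / events1 + 1 / events2)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical counts: 50 events over 10,000 person-years in the modern
# era vs 100 events over 10,000 person-years in the early era
irr, lo, hi = irr_with_ci(50, 10000, 100, 10000)
```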
9.
Transplant Direct ; 9(9): e1526, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37654682

ABSTRACT

Background: Delayed graft function (DGF) among deceased donor kidney transplant recipients (DDKTRs) is a well-known risk factor for allograft rejection, decreased graft survival, and increased cost. Although DGF is associated with an increased risk of rejection, it is unclear whether it also increases the risk of infection. Methods: We reviewed all adult DDKTRs at our center between 2010 and 2018. The primary outcomes of interest were BK viremia, cytomegalovirus viremia, pneumonia, and urinary tract infection (UTI) within the first year of transplant. An additional analysis censored follow-up at the time of allograft rejection. Results: A total of 1512 DDKTRs were included, of whom 468 (31%) had DGF. As expected, several recipient, donor, and baseline immunological characteristics differed by DGF status. After adjustment, DGF was significantly associated with an increased risk of BK viremia (hazard ratio: 1.34; 95% confidence interval, 1.0-1.81; P = 0.049) and UTI (hazard ratio: 1.70; 95% confidence interval, 1.31-2.19; P < 0.001) but not cytomegalovirus viremia or pneumonia. Associations were similar in models censored at the time of rejection. Conclusions: DGF is associated with an increased risk of early infectious complications, mainly UTI and BK viremia. Close monitoring and appropriate management are warranted for better outcomes in this unique population.

11.
Nat Med ; 29(5): 1211-1220, 2023 05.
Article in English | MEDLINE | ID: mdl-37142762

ABSTRACT

For three decades, the international Banff classification has been the gold standard for kidney allograft rejection diagnosis, but this system has become complex over time with the integration of multimodal data and rules, leading to misclassifications that can have deleterious therapeutic consequences for patients. To improve diagnosis, we developed a decision-support system, based on an algorithm covering all classification rules and diagnostic scenarios, that automatically assigns kidney allograft diagnoses. We then tested its ability to reclassify rejection diagnoses for adult and pediatric kidney transplant recipients in three international multicenter cohorts and two large prospective clinical trials, including 4,409 biopsies from 3,054 patients (62.05% male and 37.95% female) followed in 20 transplant referral centers in Europe and North America. In the adult kidney transplant population, the Banff Automation System reclassified 83 out of 279 (29.75%) antibody-mediated rejection cases and 57 out of 105 (54.29%) T cell-mediated rejection cases, whereas 237 out of 3,239 (7.32%) biopsies diagnosed as non-rejection by pathologists were reclassified as rejection. In the pediatric population, the reclassification rates were 8 out of 26 (30.77%) for antibody-mediated rejection and 12 out of 39 (30.77%) for T cell-mediated rejection. Finally, we found that reclassification of the initial diagnoses by the Banff Automation System was associated with an improved risk stratification of long-term allograft outcomes. This study demonstrates the potential of an automated histological classification to improve transplant patient care by correcting diagnostic errors and standardizing allograft rejection diagnoses. ClinicalTrials.gov registration: NCT05306795.


Subjects
Kidney Transplantation, Kidney, Adult, Humans, Male, Female, Child, Prospective Studies, Kidney/pathology, Kidney Transplantation/adverse effects, Transplantation, Homologous, Allografts, Graft Rejection/diagnosis, Biopsy
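The Banff Automation System above encodes classification rules as an explicit decision algorithm. The toy sketch below shows only the general pattern of such a rule engine; the feature names, thresholds, and rules are invented placeholders and are NOT the actual Banff criteria.

```python
def classify_biopsy(f):
    """Assign a diagnosis from biopsy lesion scores with threshold rules.
    All rules and cutoffs here are illustrative placeholders, not Banff."""
    # Placeholder rule: donor-specific antibody plus microvascular inflammation
    if f.get('dsa_positive') and f.get('g', 0) + f.get('ptc', 0) >= 2:
        return 'antibody-mediated rejection'
    # Placeholder rule: interstitial inflammation plus tubulitis
    if f.get('i', 0) >= 2 and f.get('t', 0) >= 2:
        return 'T cell-mediated rejection'
    return 'no rejection'

diagnosis = classify_biopsy({'dsa_positive': True, 'g': 1, 'ptc': 1})
```

The value of encoding rules this way, as the paper argues, is that every biopsy is classified the same way every time, which is what enables systematic reclassification audits.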
12.
Prostate ; 83(11): 1046-1059, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37154584

ABSTRACT

BACKGROUND: Cholesterol reduction is considered one mechanism through which cholesterol-lowering drugs, including statins, are associated with a reduced risk of aggressive prostate cancer. While prior cohort studies found positive associations between total cholesterol and more advanced stage and grade in White men, whether associations for total cholesterol, low (LDL)- and high (HDL)-density lipoprotein cholesterol, apolipoprotein B (LDL particle) and A1 (HDL particle), and triglycerides are similar for fatal prostate cancer and in Black men, who experience a disproportionate burden of total and fatal prostate cancer, is unknown. METHODS: We conducted a prospective study of 1553 Black and 5071 White cancer-free men attending visit 1 (1987-1989) of the Atherosclerosis Risk in Communities Study. A total of 885 incident prostate cancer cases were ascertained through 2015, and 128 prostate cancer deaths through 2018. We estimated multivariable-adjusted hazard ratios (HRs) of total and fatal prostate cancer per 1-standard deviation increments and for tertiles (T1-T3) of time-updated lipid biomarkers overall and in Black and White men. RESULTS: Greater total cholesterol concentration (HR per 1 SD = 1.25; 95% CI = 1.00-1.58) and LDL cholesterol (HR per 1 SD = 1.26; 95% CI = 0.99-1.60) were associated with higher fatal prostate cancer risk in White men only. Apolipoprotein B was nonlinearly associated with fatal prostate cancer overall (T2 vs. T1: HR = 1.66; 95% CI = 1.05-2.64) and in Black men (HR = 3.59; 95% CI = 1.53-8.40) but not White men (HR = 1.13; 95% CI = 0.65-1.97). Tests for interaction by race were not statistically significant. CONCLUSIONS: These findings may improve the understanding of lipid metabolism in prostate carcinogenesis by disease aggressiveness and by race, while emphasizing the importance of cholesterol control.


Subjects
Cholesterol, Prostatic Neoplasms, Male, Humans, Triglycerides, Cholesterol, HDL, Prospective Studies, Apolipoproteins, Prostatic Neoplasms/epidemiology, Risk Factors
13.
Stat Med ; 42(13): 2101-2115, 2023 06 15.
Article in English | MEDLINE | ID: mdl-36938960

ABSTRACT

Joint modeling and landmark modeling are two mainstream approaches to dynamic prediction in longitudinal studies, that is, the prediction of a clinical event using longitudinally measured predictor variables available up to the time of prediction. Understanding which approach produces more accurate predictions is an important question for both the methodological research field and practical users. There were few previous studies on this topic, and the majority of results seemed to favor joint modeling. However, these studies were conducted in scenarios where the data were simulated from the joint models, partly due to the widely recognized methodological difficulty of whether there exists a general joint distribution of longitudinal and survival data such that the landmark models, which consist of infinitely many working regression models for survival, hold simultaneously. As a result, the landmark models always worked under misspecification, which caused difficulty in interpreting the comparison. In this paper, we solve this problem by using a novel algorithm to generate longitudinal and survival data that satisfy the working assumptions of the landmark models. This innovation makes possible a "fair" comparison of joint modeling and landmark modeling in terms of model specification. Our simulation results demonstrate that the relative performance of these two modeling approaches depends on the data settings, and one does not always dominate the other in terms of prediction accuracy. These findings stress the importance of methodological development for both approaches. The related methodology is illustrated with a kidney transplantation dataset.


Subjects
Models, Statistical, Humans, Computer Simulation, Longitudinal Studies
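Landmark modeling, one of the two approaches compared above, starts by building a stacked dataset: at each landmark time s, subjects still at risk contribute their last observed covariate value, and follow-up is administratively censored at s plus a prediction horizon. A minimal constructor under those assumptions (field names are illustrative):

```python
def build_landmark_data(subjects, landmarks, horizon):
    """Stack landmark-specific datasets for landmark survival analysis.
    subjects: dicts with 'event_time', 'event' (1/0), and 'measurements'
    mapping measurement time -> covariate value."""
    rows = []
    for s in landmarks:
        for subj in subjects:
            if subj['event_time'] <= s:
                continue  # no longer at risk at landmark s
            past = {t: v for t, v in subj['measurements'].items() if t <= s}
            if not past:
                continue  # no covariate observed yet
            x = past[max(past)]  # last observation carried forward
            t = min(subj['event_time'], s + horizon)   # admin. censoring
            e = subj['event'] if subj['event_time'] <= s + horizon else 0
            rows.append({'landmark': s, 'x': x, 'time': t, 'event': e})
    return rows

# One hypothetical subject, measured at times 0 and 2, event at time 5
subjects = [{'event_time': 5.0, 'event': 1, 'measurements': {0.0: 1.0, 2.0: 2.0}}]
rows = build_landmark_data(subjects, landmarks=[1.0, 3.0], horizon=10.0)
```

A working survival model (e.g., Cox) would then be fit to the stacked rows, typically with the landmark time as a covariate or stratum.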
14.
Ann Am Thorac Soc ; 20(8): 1107-1115, 2023 08.
Article in English | MEDLINE | ID: mdl-36812384

ABSTRACT

Rationale: Population-based data on the epidemiology of nontuberculous mycobacterial (NTM) infections are limited, particularly with respect to variation in NTM infection among racial groups and socioeconomic strata. Wisconsin is one of a handful of states where mycobacterial disease is notifiable, allowing large, population-based analyses of the epidemiology of NTM infection in this state. Objectives: To estimate the incidence of NTM infection in Wisconsin adults, describe the geographic distribution of NTM infection across the state, identify the frequency and type of infection caused by different NTM species, and investigate associations between NTM infection and demographics and socioeconomic status. Methods: We conducted a retrospective cohort study using laboratory reports of all NTM isolates from Wisconsin residents submitted to the Wisconsin Electronic Disease Surveillance System from 2011 to 2018. For the analyses of NTM frequency, multiple reports from the same individual were enumerated as separate isolates when nonidentical, collected from different sites, or collected more than one year apart. Results: A total of 8,135 NTM isolates from 6,811 adults were analyzed. Mycobacterium avium complex accounted for 76.4% of respiratory isolates. The M. chelonae-abscessus group was the most common species isolated from skin and soft tissue. The annual incidence of NTM infection was stable over the study period (from 22.1 per 100,000 to 22.4 per 100,000). The cumulative incidence of NTM infection among Black (224 per 100,000) and Asian (244 per 100,000) individuals was significantly higher than among their White counterparts (97 per 100,000). Total NTM infections were significantly more frequent (P < 0.001) in individuals from disadvantaged neighborhoods, and racial disparities in the incidence of NTM infection generally remained consistent when stratified by measures of neighborhood disadvantage.
Conclusions: More than 90% of NTM infections were from respiratory sites, with the vast majority caused by M. avium complex. Rapidly growing mycobacteria predominated as skin and soft tissue pathogens and were important minor respiratory pathogens. We found a stable annual incidence of NTM infection in Wisconsin between 2011 and 2018. NTM infection occurred more frequently in non-White racial groups and in individuals experiencing social disadvantage, suggesting that NTM disease may be more frequent in these groups as well.


Subjects
Mycobacterium Infections, Nontuberculous, Nontuberculous Mycobacteria, Adult, Humans, Wisconsin/epidemiology, Retrospective Studies, Mycobacterium Infections, Nontuberculous/epidemiology, Mycobacterium Infections, Nontuberculous/microbiology, Mycobacterium avium Complex
15.
Arthritis Care Res (Hoboken) ; 75(9): 1886-1896, 2023 09.
Article in English | MEDLINE | ID: mdl-36752354

ABSTRACT

OBJECTIVE: Patients with systemic lupus erythematosus experience the sixth highest rate of 30-day readmissions among chronic diseases. Timely postdischarge follow-up is a marker of ambulatory care quality that can reduce readmissions in other chronic conditions. Our objective was to test the hypotheses that 1) beneficiaries from populations experiencing health disparities, including patients from disadvantaged neighborhoods, will have lower odds of completed follow-up, and that 2) follow-up will predict longer time without acute care use (readmission, observation stay, or emergency department visit) or mortality. METHODS: This observational cohort study included hospitalizations in January-November 2014 from a 20% random sample of Medicare adults. Included hospitalizations had a lupus code, discharge to home without hospice, and continuous Medicare A/B coverage for 1 year before and 1 month after hospitalization. Timely follow-up included visits with primary care or rheumatology within 30 days. Thirty-day outcomes were acute care use and mortality, adjusted for sociodemographic information and comorbidities. RESULTS: Over one-third (35%) of lupus hospitalizations lacked 30-day follow-up. Younger age, living in disadvantaged neighborhoods, and rurality were associated with lower odds of follow-up. Follow-up was not associated with subsequent acute care use or mortality in beneficiaries age <65 years. In contrast, follow-up was associated with a 27% higher hazard of acute care use (adjusted hazard ratio [HR] 1.27 [95% confidence interval (95% CI) 1.09-1.47]) and 65% lower mortality (adjusted HR 0.35 [95% CI 0.19-0.67]) among beneficiaries age ≥65 years. CONCLUSION: One-third of lupus hospitalizations lacked follow-up, with significant disparities in rural and disadvantaged neighborhoods. Follow-up was associated with increased acute care use but 65% lower mortality in older systemic lupus erythematosus patients.
Further development of lupus-specific postdischarge strategies is needed.


Subjects
Aftercare, Patient Discharge, Adult, Humans, Aged, United States/epidemiology, Cohort Studies, Medicare, Hospitalization, Patient Readmission, Retrospective Studies
16.
BMC Med Res Methodol ; 23(1): 5, 2023 01 07.
Article in English | MEDLINE | ID: mdl-36611147

ABSTRACT

BACKGROUND: In the development of prediction models for a clinical event, it is common to use static prediction modeling (SPM), a regression model that relates baseline predictors to the time to event. In many situations, the data used in training and validation are from longitudinal studies, where predictor variables are time-varying and measured at clinical visits, but these data are not used in SPM. The landmark analysis (LA), previously proposed for dynamic prediction with longitudinal data, has interpretational difficulty when the baseline is not a risk-changing clinical milestone, as is often the case in observational studies of chronic disease without intervention. METHODS: This paper studies the generalized landmark analysis (GLA), a statistical framework for developing prediction models from longitudinal data. The GLA includes the LA as a special case and generalizes it, with a more useful interpretation, to situations where the baseline is not a risk-changing clinical milestone. Unlike in the LA, the landmark variable does not have to be time since baseline in the GLA, but can be any time-varying prognostic variable. The GLA can also be viewed as a longitudinal generalization of localized prediction, which has been studied in the context of low-dimensional cross-sectional data. We studied the GLA using data from the Chronic Renal Insufficiency Cohort (CRIC) Study and the Wisconsin Allograft Replacement Database (WisARD) and compared the prediction performance of SPM and GLA. RESULTS: In various validation populations from longitudinal data, the GLA generally had similar or better predictive performance than SPM, with notable improvement seen when the validation population deviated from the baseline population. The GLA also demonstrated similar or better predictive performance than LA, due to its more general model specification. CONCLUSIONS: GLA is a generalization of the LA such that the landmark variable does not have to be the time since baseline.
It has better interpretation when the baseline is not a risk-changing clinical milestone. The GLA is more adaptive to the validation population than SPM and is more flexible than LA, which may help produce more accurate prediction.


Subjects
Cross-Sectional Studies, Humans, Prognosis, Longitudinal Studies, Risk Factors
17.
J Vasc Access ; 24(6): 1398-1406, 2023 Nov.
Article in English | MEDLINE | ID: mdl-35259945

ABSTRACT

BACKGROUND: Arteriovenous fistulae (AVF) are considered the preferred hemodialysis access, but up to 50% of all AVF created in the United States never mature. Doppler ultrasound (DUS) is useful for predicting fistula maturity and impending fistula failure, but it is resource-intensive and associated with poor compliance rates in dialysis patients, ranging from 12% to 33%. METHODS: EchoSure is an FDA-cleared 3D Doppler ultrasound device that automatically delivers quantitative blood-flow and anatomic vascular information. The technology can be used at the bedside by personnel without formal sonographic training, circumventing the limitations of traditional duplex ultrasound imaging. This study compared the EchoSure system, operated by inexperienced personnel, against expert-operated DUS for rapid assessment of a benchtop vascular model with the flow, diameter, and depth expected in a human AVF. RESULTS: Both Duplex and EchoSure performed within the expected tolerance of ultrasound readings (35%) for volume flow, with an average error (AE) between the observed measurement and the ground truth of 8% for both modalities. However, the average coefficient of variation (CV) for Duplex, pooled over all flow-rate measurements, was 17% versus 4% for EchoSure. For diameter, Duplex measurements had an AE of 15% with an average CV of 6% across all measurements, versus an EchoSure AE of 4% and average CV of 2%. Duplex and EchoSure measurements over all depths had the same AE of 2%. The two modalities were not statistically different for depth measurement (p = 0.05), but EchoSure measured closer to the ground truth for flow rate and vessel diameter (flow: p = 0.028, ρ = -0.07; diameter: p < 0.001, ρ = 0.69). The inexperienced personnel using EchoSure acquired data 62% faster than the expert sonographers using Duplex ultrasound (141 min for Duplex vs 87 min for EchoSure). CONCLUSIONS: EchoSure may offer an accurate and convenient alternative for imaging fistulas in the clinic.
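The two accuracy metrics in the abstract, average error against a known ground truth and coefficient of variation across repeated readings, can be made concrete with a small sketch. This is illustrative only: the readings below are made-up phantom measurements, not data from the study.

```python
from statistics import mean, stdev

def average_error_pct(readings, ground_truth):
    """Mean absolute deviation from the known ground truth, as a percentage."""
    return 100 * mean(abs(r - ground_truth) for r in readings) / ground_truth

def coefficient_of_variation_pct(readings):
    """Spread of repeated readings relative to their mean, as a percentage.
    A low CV means the device is consistent even if it is biased."""
    return 100 * stdev(readings) / mean(readings)

# Hypothetical repeated volume-flow readings (mL/min) against a 600 mL/min phantom
readings = [560, 640, 590, 610]
print(round(average_error_pct(readings, 600), 1))        # average error, %
print(round(coefficient_of_variation_pct(readings), 1))  # coefficient of variation, %
```

By these definitions a device can have the same average error as its comparator (8% vs 8% for volume flow above) while being far more repeatable (CV 4% vs 17%).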


Subjects
Arteriovenous Fistula , Arteriovenous Shunt, Surgical , Humans , Arteriovenous Shunt, Surgical/adverse effects , Ultrasonography , Renal Dialysis/methods , Ultrasonography, Doppler, Duplex , Vascular Patency , Treatment Outcome
18.
J Rheumatol ; 50(3): 359-367, 2023 03.
Article in English | MEDLINE | ID: mdl-35970523

ABSTRACT

OBJECTIVE: Recent studies suggest young adults with systemic lupus erythematosus (SLE) have high 30-day readmission rates, which may necessitate tailored readmission-reduction strategies. To aid risk stratification for future strategies, we measured 30-day rehospitalization and mortality rates among Medicare beneficiaries with SLE and determined rehospitalization predictors by age. METHODS: In a 2014 20% national Medicare sample of hospitalizations, rehospitalization risk and mortality within 30 days of discharge were calculated for young (aged 18-35 yrs), middle-aged (aged 36-64 yrs), and older (aged 65+ yrs) beneficiaries with and without SLE. Multivariable generalized estimating equation models using patient, hospital, and geographic factors were used to predict rehospitalization rates among patients with SLE by age group. RESULTS: Among 1.39 million Medicare hospitalizations, 10,868 involved beneficiaries with SLE. Hospitalized young adult beneficiaries with SLE were more racially diverse, lived in more disadvantaged areas, and had more comorbidities than older beneficiaries with SLE and those without SLE. Thirty-day rehospitalization was 36% among young adult beneficiaries with SLE: 40% higher than among peers without SLE and 85% higher than among older beneficiaries with SLE. Longer length of stay and a higher comorbidity risk score increased the odds of rehospitalization in all age groups, whereas specific comorbid-condition predictors and their effects varied. Our models, which incorporated neighborhood-level socioeconomic disadvantage, had moderate-to-good predictive value (C statistics 0.67-0.77), outperforming administrative-data models lacking comprehensive social determinants in other conditions. CONCLUSION: Young adults with SLE on Medicare had a very high 30-day rehospitalization rate of 36%. Considering socioeconomic disadvantage and comorbidities provided good prediction of rehospitalization risk, particularly in young adults. Young beneficiaries with SLE and comorbidities should be a focus of programs aimed at reducing rehospitalizations.
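The C statistic quoted above has a direct interpretation: the probability that a randomly chosen patient who was rehospitalized received a higher predicted risk than a randomly chosen patient who was not. A minimal sketch of that computation (with hypothetical risks and outcomes, not the study's data or its GEE models):

```python
from itertools import product

def c_statistic(risks, outcomes):
    """Concordance (C) statistic for a binary outcome: the fraction of
    (event, non-event) pairs in which the event case has the higher
    predicted risk; ties count as half."""
    pos = [r for r, y in zip(risks, outcomes) if y == 1]  # rehospitalized
    neg = [r for r, y in zip(risks, outcomes) if y == 0]  # not rehospitalized
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p, n in product(pos, neg))
    return wins / (len(pos) * len(neg))

# Hypothetical predicted 30-day rehospitalization risks and observed outcomes
risks = [0.10, 0.25, 0.40, 0.55, 0.70, 0.85]
outcomes = [0, 0, 1, 0, 1, 1]
print(round(c_statistic(risks, outcomes), 2))  # prints 0.89
```

A value of 0.5 is no better than chance, so the 0.67-0.77 range reported above corresponds to moderate-to-good discrimination.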


Subjects
Lupus Erythematosus, Systemic , Patient Readmission , Middle Aged , Young Adult , Humans , Aged , United States , Medicare , Cohort Studies , Retrospective Studies , Hospitalization
19.
Clin Transplant ; 37(2): e14862, 2023 02.
Article in English | MEDLINE | ID: mdl-36380446

ABSTRACT

INTRODUCTION: Serum albumin is an indicator of overall health status, but it remains unclear how pre-transplant hypoalbuminemia is associated with early post-transplant outcomes. METHODS: This study included all adult kidney transplant recipients (KTRs) at our center from 01/01/2001 to 12/31/2017 with serum albumin measured within 30 days before transplantation. KTRs were grouped by pre-transplant albumin level: normal (≥4.0 g/dL), mild (3.5 to <4.0 g/dL), moderate (3.0 to <3.5 g/dL), or severe hypoalbuminemia (<3.0 g/dL). Outcomes of interest included length of hospital stay (LOS), readmission within 30 days, delayed graft function (DGF), and re-operation related to post-transplant surgical complications. We also analyzed rejection, graft failure, and death within 6 months post-transplant. RESULTS: A total of 2807 KTRs were included: 43.6% had normal serum albumin, 35.3% mild, 16.6% moderate, and 4.5% severe hypoalbuminemia. Mild and moderate hypoalbuminemia were associated with a shorter LOS by 1.22 days (p < 0.001) and 0.80 days (p = 0.01), respectively, compared to normal albumin. Moderate (HR: 0.58; 95% CI: 0.37-0.91; p = 0.02) and severe hypoalbuminemia (HR: 0.21; 95% CI: 0.07-0.68; p = 0.01) were associated with significantly lower rates of acute rejection within 6 months post-transplant. CONCLUSION: Patients with pre-transplant hypoalbuminemia have post-transplant outcomes similar to those with normal serum albumin, but with a lower risk of acute rejection that scales with the degree of hypoalbuminemia.
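The grouping described in the methods is a simple threshold scheme. As a sketch (cutoffs taken from the abstract; the function name is ours):

```python
def albumin_group(albumin_g_dl: float) -> str:
    """Classify pre-transplant serum albumin (g/dL) into the study's four groups."""
    if albumin_g_dl >= 4.0:
        return "normal"
    if albumin_g_dl >= 3.5:
        return "mild"
    if albumin_g_dl >= 3.0:
        return "moderate"
    return "severe"

print(albumin_group(3.7))  # prints mild
```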


Subjects
Hypoalbuminemia , Kidney Transplantation , Adult , Humans , Hypoalbuminemia/complications , Kidney Transplantation/adverse effects , Retrospective Studies , Serum Albumin , Transplant Recipients , Risk Factors , Graft Rejection/etiology
20.
Clin Transplant ; 37(1): e14852, 2023 01.
Article in English | MEDLINE | ID: mdl-36354280

ABSTRACT

PURPOSE: Studies conducted in the northern United States found that cytomegalovirus (CMV) disease after liver transplantation follows a seasonal pattern, with increased incidence in fall and winter. This has not been evaluated in kidney transplant recipients. Improved understanding of CMV seasonality may help guide the use of preventative therapies. METHODS: We evaluated adult patients receiving a kidney transplant at our center in Wisconsin from January 1, 1995 to December 31, 2018. A CMV event was defined as quantifiable viral replication with clinical signs or symptoms suspicious for CMV per current consensus recommendations. Seasons were divided as follows: winter (December-February), spring (March-May), summer (June-August), and fall (September-November). The primary objective was to evaluate the annual distribution of CMV disease and determine whether it differed by season. RESULTS: There were 6151 kidney transplants in the study period. A total of 913 patients had 1492 episodes of CMV. Median time from transplant to first detection was 5.51 months (interquartile range [IQR] 2.87-11.7). The observed overall incidence exceeded the expected incidence in winter (+0.7%), spring (+5.5%), and fall (+3.4%) and was less than expected in summer (-9.5%) (p = 0.18). The incidence of CMV during summer, however, was 21% less than expected (p = 0.001) in recipients who were CMV seropositive (R+) at the time of transplantation. No such difference was observed in CMV-negative recipients (R-; p = 0.58). CONCLUSION: CMV after kidney transplant appears to be less common during the summer in patients who were R+ at transplant, but shows no seasonal variation in R- patients. The reasons are unclear but are likely related to CMV-specific cell-mediated immunity. These findings may have clinical implications, particularly for the use of non-pharmacologic strategies to improve response to antiviral therapy.
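The observed-versus-expected seasonal comparison above is a goodness-of-fit question: do 1492 episodes depart from a uniform seasonal distribution? A minimal sketch with a Pearson chi-square statistic, using hypothetical seasonal counts chosen to roughly match the percentage deviations reported (not the study's actual counts):

```python
def chi_square_gof(observed, expected):
    """Pearson chi-square goodness-of-fit statistic: sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts for winter, spring, fall, summer (sum = 1492 episodes)
observed = [375, 393, 386, 338]
expected = [1492 / 4] * 4  # uniform expectation: 373 per season

stat = chi_square_gof(observed, expected)
print(round(stat, 2))
# With 3 degrees of freedom the 5% critical value is about 7.81; a statistic
# below that is consistent with the non-significant overall seasonality
# (p = 0.18) reported in the abstract.
```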


Subjects
Cytomegalovirus Infections , Kidney Transplantation , Adult , Humans , Seasons , Cytomegalovirus , Kidney Transplantation/adverse effects , Antiviral Agents/therapeutic use , Cytomegalovirus Infections/drug therapy , Cytomegalovirus Infections/epidemiology , Cytomegalovirus Infections/etiology , Transplant Recipients