Results 1 - 20 of 170
1.
Am J Transplant; 24(5): 839-849, 2024 May.
Article in English | MEDLINE | ID: mdl-38266712

ABSTRACT

Lung transplantation lags behind other solid organ transplants in donor lung utilization due, in part, to uncertainty regarding donor quality. We sought to develop an easy-to-use donor risk metric that, unlike existing metrics, accounts for a rich set of donor factors. Our study population consisted of n = 26 549 adult lung transplant recipients abstracted from the United Network for Organ Sharing Standard Transplant Analysis and Research file. We used Cox regression to model graft failure (GF; earliest of death or retransplant) risk based on donor and transplant factors, adjusting for recipient factors. We then derived and validated a Lung Donor Risk Index (LDRI) and developed a pertinent online application (https://shiny.pmacs.upenn.edu/LDRI_Calculator/). We found 12 donor/transplant factors that were independently predictive of GF: age, race, insulin-dependent diabetes, the difference between donor and recipient height, smoking, cocaine use, cytomegalovirus seropositivity, creatinine, human leukocyte antigen (HLA) mismatch, ischemia time, and donation after circulatory death. Validation showed the LDRI to have GF risk discrimination that was reasonable (C = 0.61) and higher than any of its predecessors. The LDRI is intended for use by transplant centers, organ procurement organizations, and regulatory agencies and to benefit patients in decision-making. Unlike its predecessors, the proposed LDRI could gain wide acceptance because of its granularity and similarity to the Kidney Donor Risk Index.
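A donor risk index of this kind is, at heart, a Cox linear predictor exponentiated against a reference donor. The sketch below is illustrative only: the coefficients and covariate names are invented, not the published LDRI weights (those live in the linked calculator).

```python
import math

# Hypothetical coefficients for a Cox-style donor risk score.
# These values are made up for illustration; the real LDRI uses
# coefficients fitted on the UNOS cohort described above.
COEFS = {
    "donor_age_per_10yr": 0.10,    # per decade of donor age
    "smoking_history": 0.15,       # ever-smoker indicator
    "dcd_donor": 0.20,             # donation after circulatory death
    "ischemia_time_per_hr": 0.05,  # per hour of ischemia time
}

def linear_predictor(donor):
    """Cox linear predictor: sum of coefficient * covariate value."""
    return sum(COEFS[k] * v for k, v in donor.items())

def relative_hazard(donor, reference):
    """Graft-failure hazard of `donor` relative to `reference`:
    exp(LP_donor - LP_reference)."""
    return math.exp(linear_predictor(donor) - linear_predictor(reference))

ref = {"donor_age_per_10yr": 3.0, "smoking_history": 0,
       "dcd_donor": 0, "ischemia_time_per_hr": 5.0}
risky = {"donor_age_per_10yr": 5.0, "smoking_history": 1,
         "dcd_donor": 1, "ischemia_time_per_hr": 7.0}
print(round(relative_hazard(risky, ref), 2))  # 1.92: ~92% higher GF hazard
```

The published calculator reports the fitted version of exactly this kind of score; the point here is only the mechanics of combining donor factors into one number.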


Subjects
Graft Rejection; Graft Survival; Lung Transplantation; Tissue Donors; Tissue and Organ Procurement; Humans; Lung Transplantation/adverse effects; Female; Male; Tissue Donors/supply & distribution; Middle Aged; Risk Factors; Adult; Graft Rejection/etiology; Follow-Up Studies; Prognosis; Risk Assessment
2.
Article in English | MEDLINE | ID: mdl-38599308

ABSTRACT

BACKGROUND & AIMS: Greater availability of less invasive biliary imaging to rule out choledocholithiasis should reduce the need for diagnostic endoscopic retrograde cholangiopancreatography (ERCP) in patients with a remote history of cholecystectomy. The primary aims were to determine the incidence, characteristics, and outcomes of individuals who undergo first-time ERCP >1 year after cholecystectomy (late-ERCP). METHODS: Data from a commercial insurance claims database (Optum Clinformatics) identified 583,712 adults who underwent cholecystectomy, 4274 of whom underwent late-ERCP, defined as first-time ERCP for nonmalignant indications >1 year after cholecystectomy. Outcomes were exposure and temporal trends in late-ERCP, biliary imaging utilization, and post-ERCP outcomes. Multivariable logistic regression was used to examine patient characteristics associated with undergoing late-ERCP. RESULTS: Despite a temporal increase in the use of noninvasive biliary imaging (35.9% in 2004 to 65.6% in 2021; P < .001), the rate of late-ERCP increased 8-fold (0.5 to 4.2 per 1000 person-years from 2005 to 2021; P < .001). Although only 44% of patients who underwent late-ERCP had gallstone removal, there were high rates of post-ERCP pancreatitis (7.1%), hospitalization (13.1%), and new chronic opioid use (9.7%). Factors associated with late-ERCP included concomitant disorders of gut-brain interaction (odds ratio [OR], 6.48; 95% confidence interval [CI], 5.88-6.91) and metabolic dysfunction-associated steatotic liver disease (OR, 3.27; 95% CI, 2.79-3.55), along with use of anxiolytics (OR, 3.45; 95% CI, 3.19-3.58), antispasmodics (OR, 1.60; 95% CI, 1.53-1.72), and chronic opioids (OR, 6.24; 95% CI, 5.79-6.52). CONCLUSIONS: The rate of late-ERCP after cholecystectomy is increasing significantly, particularly in patients with comorbidities associated with disorders of gut-brain interaction and mimickers of choledocholithiasis. Late-ERCP is associated with disproportionately higher rates of adverse events, including initiation of chronic opioid use.

3.
Gastroenterology; 165(5): 1197-1205.e2, 2023 11.
Article in English | MEDLINE | ID: mdl-37481117

ABSTRACT

BACKGROUND & AIMS: We sought to estimate the incidence, prevalence, and racial-ethnic distribution of physician-diagnosed inflammatory bowel disease (IBD) in the United States. METHODS: The study used 4 administrative claims data sets: a 20% random sample of national fee-for-service Medicare data (2007 to 2017); Medicaid data from Florida, New York, Pennsylvania, Ohio, and California (1999 to 2012); and commercial health insurance data from Anthem beneficiaries (2006 to 2018) and Optum's deidentified Clinformatics Data Mart (2000 to 2017). We used validated combinations of medical diagnoses, diagnostic procedures, and prescription medications to identify incident and prevalent diagnoses. We computed pooled age-, sex-, and race/ethnicity-specific insurance-weighted estimates and pooled estimates standardized to 2018 United States Census estimates with 95% confidence intervals (CIs). RESULTS: The age- and sex-standardized incidence of IBD per 100,000 person-years was 10.9 (95% CI, 10.6-11.2). The incidence of IBD peaked in the third decade of life, decreased to a relatively stable level across the fourth to eighth decades, and declined further. The age-, sex- and insurance-standardized prevalence of IBD was 721 per 100,000 population (95% CI, 717-726). Extrapolated to the 2020 United States Census, an estimated 2.39 million Americans are diagnosed with IBD. The prevalence of IBD per 100,000 population was 812 (95% CI, 802-823) in White, 504 (95% CI, 482-526) in Black, 403 (95% CI, 373-433) in Asian, and 458 (95% CI, 440-476) in Hispanic Americans. CONCLUSIONS: IBD is diagnosed in >0.7% of Americans. The incidence peaks in early adulthood and then plateaus at a lower rate. The disease is less commonly diagnosed in Black, Asian, and Hispanic Americans.
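The extrapolation in the abstract can be checked with one line of arithmetic. The Census figure below is our approximation (~331.4 million), not a number taken from the paper.

```python
# Arithmetic check of the abstract's extrapolation: a prevalence of
# 721 per 100,000, applied to a 2020 US Census population of roughly
# 331.4 million, gives about 2.39 million diagnosed individuals.
prevalence_per_100k = 721
us_population_2020 = 331_400_000  # approximate 2020 Census count (assumption)

diagnosed = prevalence_per_100k * us_population_2020 / 100_000
print(f"{diagnosed / 1e6:.2f} million")  # 2.39 million
```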


Subjects
Inflammatory Bowel Diseases; Medicare; Humans; United States/epidemiology; Aged; Adult; Prevalence; Incidence; Inflammatory Bowel Diseases/diagnosis; Inflammatory Bowel Diseases/epidemiology; Florida
4.
Liver Transpl; 30(7): 689-698, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38265295

ABSTRACT

Given the scarcity of organs for liver transplantation, selection of recipients and donors to maximize post-transplant benefit is paramount. Several scores predict post-transplant outcomes by isolating elements of donor and recipient risk, including the donor risk index, the Balance of Risk score, the survival outcomes following liver transplantation (SOFT) score and its pre-allocation version (P-SOFT), the improved donor-to-recipient allocation scores for deceased donors only (ID2EAL-D) and for both deceased and living donors (ID2EAL-DR), and survival benefit (SB) models. No studies have examined the performance of these models over time, which is critical in an ever-evolving transplant landscape. This was a retrospective cohort study of liver transplantation events in the UNOS database from 2002 to 2021. We used Cox regression to evaluate model discrimination (Harrell's C) and calibration (testing of calibration curves) for post-transplant patient and graft survival at specified post-transplant timepoints. Sub-analyses were performed in the modern transplant era (post-2014) and for key donor-recipient characteristics. A total of 112,357 transplants were included. The SB and SOFT scores had the highest discrimination for short-term patient and graft survival, including in the modern transplant era, where only the SB model had good discrimination (C ≥ 0.60) for all patient and graft outcome timepoints. However, these models showed evidence of poor calibration at the 3- and 5-year patient survival timepoints. The ID2EAL-DR score had lower discrimination but adequate calibration at all patient survival timepoints. In stratified analyses, the SB and SOFT scores performed better in younger (<40 y) and higher Model for End-Stage Liver Disease score (≥25) patients. All prediction scores had declining discrimination over time, and scores relying on donor factors alone performed poorly. Although the SB and SOFT scores had the best overall performance, all models demonstrated declining performance over time. This underscores the importance of periodically updating and/or developing new prediction models to reflect the evolving transplant field. Scores relying on donor factors alone do not meaningfully inform post-transplant risk.
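Harrell's C, the discrimination measure used above, can be sketched in a few lines. This is a naive O(n²) implementation on toy data, not the study's code.

```python
def harrells_c(times, events, risk_scores):
    """Harrell's concordance index: among comparable pairs (the subject
    with the earlier time had an observed event), the fraction in which
    the earlier failure also carries the higher risk score; score ties
    count as 0.5. C = 0.5 is chance, 1.0 is perfect ranking."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        if events[i] != 1:
            continue  # a censored subject cannot be the earlier failure
        for j in range(n):
            if times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable

# 4 recipients: follow-up times, event indicator (1 = failure), risk scores
print(harrells_c([2, 4, 6, 8], [1, 1, 0, 1], [0.9, 0.4, 0.5, 0.6]))  # 0.6
```

A value of 0.6, as in the toy example, is roughly the level of discrimination the abstract labels "good" (C ≥ 0.60).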


Assuntos
Doença Hepática Terminal , Sobrevivência de Enxerto , Transplante de Fígado , Humanos , Transplante de Fígado/efeitos adversos , Estudos Retrospectivos , Feminino , Masculino , Pessoa de Meia-Idade , Medição de Risco/estatística & dados numéricos , Medição de Risco/métodos , Doença Hepática Terminal/cirurgia , Doença Hepática Terminal/mortalidade , Doença Hepática Terminal/diagnóstico , Adulto , Fatores de Risco , Fatores de Tempo , Doadores Vivos/estatística & dados numéricos , Seleção do Doador/normas , Seleção do Doador/métodos , Seleção do Doador/estatística & dados numéricos , Idoso , Modelos de Riscos Proporcionais , Obtenção de Tecidos e Órgãos/estatística & dados numéricos , Obtenção de Tecidos e Órgãos/métodos , Obtenção de Tecidos e Órgãos/normas , Resultado do Tratamento , Doadores de Tecidos/estatística & dados numéricos , Bases de Dados Factuais/estatística & dados numéricos
5.
Stat Med; 2024 May 23.
Article in English | MEDLINE | ID: mdl-38780593

ABSTRACT

In evaluating the performance of different facilities or centers on survival outcomes, the standardized mortality ratio (SMR), which compares observed to expected mortality, has been widely used, particularly in the evaluation of kidney transplant centers. Despite its utility, the SMR may exaggerate center effects in settings where survival probability is relatively high. An example is one-year graft survival among U.S. kidney transplant recipients. We propose a novel approach to estimate center effects in terms of differences in survival probability (i.e., each center versus a reference population). An essential component of the method is a prognostic score weighting technique, which permits accurately evaluating centers without necessarily specifying a correct survival model. Advantages of our approach over existing facility-profiling methods include a metric based on survival probability (greater clinical relevance than ratios of counts/rates); direct standardization (valid for comparisons between centers, unlike indirect standardization-based methods such as the SMR); and less reliance on correct model specification (since the assumed model is used to generate risk classes, as opposed to fitted-value-based 'expected' counts). We establish the asymptotic properties of the proposed weighted estimator and evaluate its finite-sample performance under a diverse set of simulation settings. The method is then applied to evaluate U.S. kidney transplant centers with respect to graft survival probability.
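The motivating problem can be shown with toy numbers (illustrative, not from the paper): when survival is high, a modest absolute difference produces a dramatic-looking SMR.

```python
# Why the SMR can exaggerate center effects in high-survival settings,
# motivating the survival-probability-difference metric proposed above.
n_recipients = 1000
expected_deaths = 10   # deaths expected at 1 year under the reference
observed_deaths = 15   # deaths observed at the center

smr = observed_deaths / expected_deaths              # 1.5: "50% excess mortality"
surv_center = 1 - observed_deaths / n_recipients     # 0.985
surv_reference = 1 - expected_deaths / n_recipients  # 0.990
surv_diff = surv_center - surv_reference             # only -0.5 percentage points

print(smr, round(surv_diff, 3))
```

The same center looks 50% worse by SMR but only half a percentage point worse on the survival-probability scale, which is the contrast the paper's metric is designed to convey.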

6.
Clin Transplant; 38(5): e15319, 2024 May.
Article in English | MEDLINE | ID: mdl-38683684

ABSTRACT

OBJECTIVE: Longer end-stage renal disease time has been associated with inferior kidney transplant outcomes. However, the contribution of transplant evaluation is uncertain. We explored the relationship between time from evaluation to listing (ELT) and transplant outcomes. METHODS: This retrospective study included 2535 adult kidney transplants from 2000 to 2015. Kaplan-Meier survival curves, log-rank tests, and Cox regression models were used to compare transplant outcomes. RESULTS: Patient survival for both deceased donor (DD) recipients (p < .001) and living donor (LD) recipients (p < .0001) was significantly higher when ELT was less than 3 months. The risks of ELT appeared to be mediated by other risks in DD recipients, as adjusted models showed no associated risk of graft loss or death in DD recipients. For LD recipients, ELT remained a risk factor for patient death after covariate adjustment. Each month of ELT was associated with an increased risk of death (HR = 1.021, p = .04) but not graft loss in LD recipients in adjusted models. CONCLUSIONS: Kidney transplant recipients with longer ELT times had higher rates of death after transplant, and ELT was independently associated with an increased risk of death for LD recipients. Investigations on the impact of pretransplant evaluation on post-transplant outcomes can inform transplant policy and practice.
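Illustrative arithmetic (not from the paper's models): under the proportional hazards assumption, the reported per-month hazard ratio compounds multiplicatively over longer evaluation-to-listing intervals.

```python
# HR = 1.021 per additional month of ELT for living-donor recipients
# (from the abstract). Under proportional hazards, a year of extra ELT
# corresponds to the per-month HR raised to the 12th power.
hr_per_month = 1.021
hr_per_year = hr_per_month ** 12
print(round(hr_per_year, 2))  # 1.28, i.e., ~28% higher hazard of death
```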


Assuntos
Sobrevivência de Enxerto , Falência Renal Crônica , Transplante de Rim , Listas de Espera , Humanos , Transplante de Rim/mortalidade , Transplante de Rim/efeitos adversos , Feminino , Masculino , Estudos Retrospectivos , Pessoa de Meia-Idade , Falência Renal Crônica/cirurgia , Seguimentos , Fatores de Risco , Listas de Espera/mortalidade , Prognóstico , Taxa de Sobrevida , Adulto , Rejeição de Enxerto/etiologia , Rejeição de Enxerto/mortalidade , Doadores de Tecidos/provisão & distribuição , Taxa de Filtração Glomerular , Testes de Função Renal , Doadores Vivos/provisão & distribuição , Obtenção de Tecidos e Órgãos , Fatores de Tempo , Complicações Pós-Operatórias
7.
Transpl Infect Dis; : e14317, 2024 Jun 09.
Article in English | MEDLINE | ID: mdl-38852064

ABSTRACT

BACKGROUND: Opportunistic infections (OIs) are a significant cause of morbidity and mortality after organ transplantation, though data in the liver transplant (LT) population are limited. METHODS: We performed a retrospective cohort study of LT recipients between January 1, 2007 and December 31, 2016 using Medicare claims data linked to the Organ Procurement and Transplantation Network database. Multivariable Cox regression models evaluated factors independently associated with hospitalizations for early (≤1 year post transplant) and late (>1 year) OIs, with a particular focus on immunosuppression. RESULTS: There were 11 320 LT recipients included in the study, of whom 13.2% had at least one OI hospitalization during follow-up. Of the 2638 OI hospitalizations, 61.9% occurred early post-LT. Cytomegalovirus was the most common OI (45.4% overall), although its relative frequency decreased after the first year (25.3%). Neither induction nor maintenance immunosuppression was associated with early OI hospitalization (all p > .05). The highest risk of early OI was seen with primary sclerosing cholangitis (aHR 1.74; p = .003 overall). Steroid-based and mechanistic target of rapamycin inhibitor-based immunosuppression at 1 year post LT were independently associated with increased late OIs (p < .001 overall). CONCLUSION: This study found OI hospitalizations to be relatively common among LT recipients and to frequently occur later than previously reported. Immunosuppression regimen may be an important modifiable risk factor for late OIs.

8.
Pediatr Crit Care Med; 25(1): e41-e46, 2024 Jan 01.
Article in English | MEDLINE | ID: mdl-37462429

ABSTRACT

OBJECTIVE: To determine the association of venovenous extracorporeal membrane oxygenation (VV-ECMO) initiation with changes in vasoactive-inotropic scores (VISs) in children with pediatric acute respiratory distress syndrome (PARDS) and cardiovascular instability. DESIGN: Retrospective cohort study. SETTING: Single academic pediatric ECMO center. PATIENTS: Children (1 mo to 18 yr) treated with VV-ECMO (2009-2019) for PARDS with need for vasopressor or inotropic support at ECMO initiation. MEASUREMENTS AND MAIN RESULTS: Arterial blood gas values, VIS, mean airway pressure (mPaw), and oxygen saturation (SpO2) values were recorded hourly, relative to the start of ECMO flow, for the 24 hours before and after VV-ECMO cannulation. A sharp kink discontinuity regression analysis clustered by patient tested the difference in VISs and regression line slopes immediately surrounding cannulation. Thirty-two patients met inclusion criteria: median age 6.6 years (interquartile range [IQR], 1.5-11.7), 22% immunocompromised, and 75% had pneumonia or sepsis as the cause of PARDS. Pre-ECMO characteristics included: median oxygenation index 45 (IQR, 35-58), mPaw 32 cm H2O (IQR, 30-34), 97% on inhaled nitric oxide, and 81% on an advanced mode of ventilation. Median VIS immediately before VV-ECMO cannulation was 13 (IQR, 8-25), with an overall increasing VIS trajectory over the hours before cannulation. VISs decreased and the slope of the regression line reversed immediately surrounding the time of cannulation (robust p < 0.0001). There were pre- to post-cannulation decreases in mPaw (32 vs 20 cm H2O, p < 0.001) and arterial PCO2 (64.1 vs 50.1 mm Hg, p = 0.007), and increases in arterial pH (7.26 vs 7.38, p = 0.001), arterial base excess (2.5 vs 5.2, p = 0.013), and SpO2 (91% vs 95%, p = 0.013). CONCLUSIONS: Initiation of VV-ECMO was associated with an immediate and sustained reduction in VIS in PARDS patients with cardiovascular instability. This VIS reduction was associated with decreased mPaw and reduced respiratory and/or metabolic acidosis, as well as improved oxygenation.
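A sharp-kink (segmented) regression of the kind described can be sketched on simulated data. All numbers below are invented, and the study's version was additionally clustered by patient, which this toy omits.

```python
import numpy as np

# Model: VIS ~ b0 + b1*t + b2*post + b3*(t*post), with post = 1[t >= 0]
# and t = 0 at ECMO cannulation. b2 captures the level shift at
# cannulation and b3 the change in slope.
rng = np.random.default_rng(0)
t = np.arange(-24, 25, dtype=float)          # hours around cannulation
post = (t >= 0).astype(float)
# simulated truth: VIS rising toward 13 pre-ECMO, dropping and declining after
vis = 13 + 0.4 * t - 4 * post - 0.7 * t * post + rng.normal(0, 0.5, t.size)

X = np.column_stack([np.ones_like(t), t, post, t * post])
beta, *_ = np.linalg.lstsq(X, vis, rcond=None)
print(beta.round(2))  # roughly [13, 0.4, -4, -0.7]: a drop plus a slope reversal
```

The fitted `beta[2]` (level drop) and `beta[3]` (slope change) are the quantities analogous to what the study tested at cannulation.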


Assuntos
Oxigenação por Membrana Extracorpórea , Síndrome do Desconforto Respiratório , Insuficiência Respiratória , Humanos , Criança , Estudos Retrospectivos , Síndrome do Desconforto Respiratório/terapia , Insuficiência Respiratória/terapia , Artérias
9.
Clin Gastroenterol Hepatol; 21(5): 1214-1222.e14, 2023 05.
Article in English | MEDLINE | ID: mdl-35750248

ABSTRACT

BACKGROUND: Patients with acute pancreatitis (AP) have at least a 2-fold higher risk of developing postpancreatitis diabetes mellitus (PPDM). No therapies have been shown to prevent PPDM. Statins have been reported to possibly lower the incidence and severity of AP but have not been studied for prevention of PPDM. METHODS: Data from a commercial insurance claims database (Optum Clinformatics) were used to assess the impact of statins in 118,479 patients without pre-existing DM admitted for a first episode of AP. Regular statin usage was defined as filled statin prescriptions for at least 80% of the year prior to AP. The primary outcome was PPDM. We constructed a propensity score and applied inverse probability of treatment weighting to balance baseline characteristics between groups. Using Cox proportional hazards regression modeling, we estimated the risk of PPDM, accounting for competing events. RESULTS: With a median of 3.5 years of follow-up, the 5-year cumulative incidence of PPDM was 7.5% (95% confidence interval [CI], 6.9% to 8.0%) among regular statin users and 12.7% (95% CI, 12.4% to 12.9%) among nonusers. Regular statin users had a 42% lower risk of developing PPDM compared with nonusers (hazard ratio, 0.58; 95% CI, 0.52 to 0.65; P < .001). Irregular statin users had a 15% lower risk of PPDM (hazard ratio, 0.85; 95% CI, 0.81 to 0.89; P < .001). Similar benefits were seen with low, moderate, and high statin doses. CONCLUSIONS: In a large claims-based study, statin usage was associated with a reduced risk of developing DM after acute pancreatitis. Further prospective studies with long-term follow-up are needed to study the impact of statins on acute pancreatitis and prevention of PPDM.
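The propensity-score / inverse-probability-of-treatment-weighting (IPTW) step can be sketched on simulated data. All variables below are invented; the study additionally fit Cox models with competing risks on top of such weights.

```python
import numpy as np

# IPTW sketch: weight treated subjects by 1/ps and untreated by 1/(1-ps),
# so the confounder's distribution is balanced across groups.
rng = np.random.default_rng(1)
n = 50_000
x = rng.binomial(1, 0.5, n)                            # binary confounder
treated = rng.binomial(1, np.where(x == 1, 0.7, 0.2))  # treatment depends on x
# outcome risk: confounder doubles risk; treatment multiplies it by 0.6
y = rng.binomial(1, 0.10 * 2.0 ** x * np.where(treated == 1, 0.6, 1.0))

# propensity score estimated within levels of x (a saturated model)
ps = np.array([treated[x == v].mean() for v in (0, 1)])[x]
w = np.where(treated == 1, 1 / ps, 1 / (1 - ps))       # IPT weights

rate_treated = np.average(y[treated == 1], weights=w[treated == 1])
rate_untreated = np.average(y[treated == 0], weights=w[treated == 0])
rr = rate_treated / rate_untreated
print(round(rr, 2))  # close to the true risk ratio of 0.6 once x is balanced
```

The naive unweighted comparison would be confounded by `x`; weighting recovers the treatment effect that was built into the simulation.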


Assuntos
Diabetes Mellitus , Inibidores de Hidroximetilglutaril-CoA Redutases , Pancreatite , Humanos , Pancreatite/epidemiologia , Pancreatite/prevenção & controle , Inibidores de Hidroximetilglutaril-CoA Redutases/uso terapêutico , Estudos Prospectivos , Doença Aguda , Diabetes Mellitus/tratamento farmacológico , Diabetes Mellitus/epidemiologia , Estudos Retrospectivos
10.
Clin Gastroenterol Hepatol; 21(11): 2817-2824.e4, 2023 10.
Article in English | MEDLINE | ID: mdl-36967101

ABSTRACT

BACKGROUND & AIMS: Antibiotic exposure leads to changes in the gut microbiota. Our objective was to evaluate the association between antibiotic exposure and esophageal adenocarcinoma (EAC) risk. METHODS: We performed a nested case-control study using data from the Veterans Health Administration from 2004 through 2020. The case group consisted of patients who received an incident diagnosis of EAC. For each case, up to 20 matched controls were selected using incidence density sampling. Our primary exposure of interest was any oral or intravenous antibiotic use. Our secondary exposures included cumulative number of days of exposure and classification of antibiotics by various subgroups. Conditional logistic regression was used to estimate the crude and adjusted odds ratios (aORs) for the risk of EAC associated with antibiotic exposure. RESULTS: The case-control analysis included 8226 EAC cases and 140,670 matched controls. Exposure to any antibiotic was associated with an aOR for EAC of 1.74 (95% confidence interval [CI], 1.65-1.83) vs no antibiotic exposure. Compared with no antibiotic exposure, the aOR for EAC was 1.63 (95% CI, 1.52-1.74; P < .001) for cumulative exposure to any antibiotic for 1 to 15 days; 1.77 (95% CI, 1.65-1.89; P < .001) for 16 to 47 days; and 1.87 (95% CI, 1.75-2.01; P < .001) for ≥48 days (P for trend < .001). CONCLUSION: Exposure to any antibiotic is associated with an increased risk of EAC, and this risk increases as the cumulative days of exposure increase. This novel finding is hypothesis-generating for potential mechanisms that may play a role in the development or progression of EAC.


Assuntos
Adenocarcinoma , Esôfago de Barrett , Neoplasias Esofágicas , Humanos , Antibacterianos/efeitos adversos , Estudos de Casos e Controles , Neoplasias Esofágicas/induzido quimicamente , Neoplasias Esofágicas/epidemiologia , Adenocarcinoma/induzido quimicamente , Adenocarcinoma/epidemiologia , Fatores de Risco , Esôfago de Barrett/complicações
11.
Clin Gastroenterol Hepatol; 21(5): 1233-1242.e14, 2023 05.
Article in English | MEDLINE | ID: mdl-36075501

ABSTRACT

BACKGROUND & AIMS: The Cotton Consensus (CC) criteria for post-endoscopic retrograde cholangiopancreatography (ERCP) pancreatitis (PEP) may not capture post-ERCP morbidity. PAN-PROMISE, a patient-reported outcome measure (PROM), was developed to quantify acute pancreatitis-related morbidity. This study aimed to determine the value of PAN-PROMISE in independently defining ERCP-related morbidity. METHODS: We conducted a prospective cohort study of patients undergoing ERCP at 2 academic centers from September 2021 to August 2022. We administered PAN-PROMISE and assessed quality of life and work productivity at baseline, 48 to 72 hours, 7 days, and 30 days following ERCP. PEP was adjudicated by a 3-physician committee using the CC criteria. We defined high morbidity following ERCP (elevated PROM) as an increase in PAN-PROMISE score of >7 at 7 days post-procedure. The McNemar test assessed discordance between PEP and elevated PROM. RESULTS: A total of 679 patients were enrolled. Choledocholithiasis (30%) and malignant biliary obstruction (29%) were the main indications for ERCP. Thirty-two patients (4.7%) developed PEP. One hundred forty-seven patients (21.6%) had an elevated PROM, whereas only 20 of them (13.4%) had PEP by the CC criteria (P < .001 for discordance). An elevated PROM strongly correlated with lower physical quality of life and increased direct and indirect health care costs ($80 and $25 per point increase in PAN-PROMISE, respectively). Patients with pancreatic cancer (odds ratio, 4.52; 95% confidence interval, 1.68-10.74) and primary sclerosing cholangitis (odds ratio, 1.79; 95% confidence interval, 1.29-2.45) had the highest odds of an elevated PROM. CONCLUSIONS: A substantial number of patients experience significant morbidity after ERCP despite not developing PEP or other adverse events. Future studies are needed to better characterize the reasons behind this increase in symptoms and potential interventions to reduce the post-ERCP symptom burden. ClinicalTrials.gov number: NCT05310409.
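The McNemar comparison of the two morbidity definitions can be reconstructed from the counts in the abstract (a sketch; the authors' exact analysis may differ in details such as continuity correction).

```python
# Counts reconstructed from the abstract: 147 patients had an elevated
# PROM, 20 of whom also met Cotton Consensus PEP criteria; 32 patients
# had PEP overall. McNemar's test uses only the discordant cells.
b = 147 - 20   # elevated PROM without PEP
c = 32 - 20    # PEP without an elevated PROM

chi2 = (b - c) ** 2 / (b + c)   # McNemar chi-square, 1 degree of freedom
print(round(chi2, 1))  # 95.1, far above the 10.83 cutoff for P < .001
```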


Assuntos
Colangiopancreatografia Retrógrada Endoscópica , Pancreatite , Humanos , Colangiopancreatografia Retrógrada Endoscópica/efeitos adversos , Colangiopancreatografia Retrógrada Endoscópica/métodos , Pancreatite/diagnóstico , Pancreatite/epidemiologia , Pancreatite/etiologia , Estudos Prospectivos , Doença Aguda , Qualidade de Vida , Morbidade , Medidas de Resultados Relatados pelo Paciente , Fatores de Risco , Estudos Retrospectivos
12.
BMC Oral Health; 23(1): 763, 2023 10 17.
Article in English | MEDLINE | ID: mdl-37848867

ABSTRACT

BACKGROUND: Long-term antiretroviral therapy (ART) durably suppresses HIV load and has dramatically altered the prognosis of HIV infection, such that HIV is now regarded as a chronic disease. Side effects of ART in patients with HIV (PWH) have introduced new challenges, including "metabolic" (systemic) and oral complications. Furthermore, inflammation persists despite excellent viral load suppression and normal CD4+ cell counts. The impact of ART on the spectrum of oral diseases among PWH is often overlooked relative to other systemic complications, and there is a paucity of data on oral complications associated with ART use in PWH. This is in part due to the limited number of prospective longitudinal studies designed to better understand the range of oral abnormalities observed in PWH on ART. METHODS: We describe here the study design, including processes associated with subject recruitment and retention, study visit planning, oral health assessments, biospecimen collection and preprocessing procedures, and the data management and statistical plan. DISCUSSION: We present a procedural roadmap that could serve as a model for assessing the extent and progression of oral diseases associated with ART in PWH. We also highlight the rigors and challenges associated with our ongoing participant recruitment and retention. A rigorous prospective longitudinal study requires proper planning and execution; a major benefit is that the resulting large data sets and biospecimen repository can be used to answer further questions in future studies, including genetic, microbiome, and metabolome-based studies. TRIAL REGISTRATION: National Institutes of Health clinical trials registration number: NCT04645693.


Assuntos
Fármacos Anti-HIV , Infecções por HIV , Humanos , Infecções por HIV/complicações , Infecções por HIV/tratamento farmacológico , Fármacos Anti-HIV/efeitos adversos , Estudos Longitudinais , Estudos Prospectivos , Carga Viral , Avaliação de Resultados em Cuidados de Saúde
13.
Cancer; 128(1): 150-159, 2022 01 01.
Article in English | MEDLINE | ID: mdl-34541673

ABSTRACT

BACKGROUND: Solid organ transplant recipients have an elevated risk of cancer. Quantifying the life-years lost (LYL) due to cancer provides a complementary view of the burden of cancer distinct from other metrics and may identify subgroups of transplant recipients who are most affected. METHODS: Linked transplant and cancer registry data were used to identify incident cancers and deaths among solid organ transplant recipients in the United States (1987-2014). Data on LYL due to cancer within 10 years posttransplant were derived using mean survival estimates from Cox models. RESULTS: Among 221,962 transplant recipients, 13,074 (5.9%) developed cancer within 10 years of transplantation. During this period, the mean LYL due to cancer were 0.16 years per transplant recipient and 2.7 years per cancer case. Cancer was responsible for a loss of 1.9% of the total life-years expected in the absence of cancer in this population. Lung recipients had the highest proportion of total LYL due to cancer (0.45%) followed by heart recipients (0.29%). LYL due to cancer increased with age, from 0.5% among those aged birth to 34 years at transplant to 3.2% among those aged 50 years and older. Among recipients overall, lung cancer was the largest contributor, accounting for 24% of all LYL due to cancer, and non-Hodgkin lymphoma had the next highest contribution (15%). CONCLUSIONS: Transplant recipients have a shortened lifespan after developing cancer. Lung cancer and non-Hodgkin lymphoma contribute strongly to LYL due to cancer within the first 10 years after transplant, highlighting opportunities to reduce cancer mortality through prevention and screening.
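The abstract's per-case and per-recipient life-years-lost figures can be cross-checked with simple arithmetic:

```python
# Consistency check of the abstract's figures: 2.7 life-years lost per
# cancer case across 13,074 cases, averaged over all 221,962 recipients,
# should reproduce the reported 0.16 life-years lost per recipient.
lyl_per_case = 2.7
n_cases = 13_074
n_recipients = 221_962

lyl_per_recipient = lyl_per_case * n_cases / n_recipients
print(round(lyl_per_recipient, 2))  # 0.16
```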


Assuntos
Neoplasias Pulmonares , Linfoma não Hodgkin , Transplante de Órgãos , Adolescente , Adulto , Criança , Pré-Escolar , Humanos , Incidência , Lactente , Recém-Nascido , Linfoma não Hodgkin/epidemiologia , Pessoa de Meia-Idade , Transplante de Órgãos/efeitos adversos , Sistema de Registros , Fatores de Risco , Transplantados , Estados Unidos/epidemiologia , Adulto Jovem
14.
Am J Transplant; 22(2): 599-609, 2022 02.
Article in English | MEDLINE | ID: mdl-34613666

ABSTRACT

Kidney transplantation (KT) from deceased donors with hepatitis C virus (HCV) into HCV-negative recipients has become more common. However, the risk of complications such as BK polyomavirus (BKPyV) infection remains unknown. We assembled a retrospective cohort at four centers. We matched recipients of HCV-viremic kidneys to highly similar recipients of HCV-aviremic kidneys on established risk factors for BKPyV. To limit bias, matches were within the same center. The primary outcome was BKPyV viremia ≥1000 copies/ml or biopsy-proven BKPyV nephropathy; a secondary outcome was BKPyV viremia ≥10 000 copies/ml or nephropathy. Outcomes were analyzed using weighted and stratified Cox regression. The median time to peak BKPyV viremia was 119 days (IQR 87-182). HCV-viremic KT was not associated with increased risk of the primary BKPyV outcome (HR 1.26, p = .22), but was significantly associated with the secondary outcome of BKPyV ≥10 000 copies/ml (HR 1.69, p = .03). One-year eGFR was similar between the matched groups. Only one HCV-viremic kidney recipient had primary graft loss. In summary, HCV-viremic KT was not significantly associated with the primary outcome of BKPyV viremia, but the data suggested that donor HCV might elevate the risk of more severe BKPyV viremia ≥10 000 copies/ml. Nonetheless, one-year graft function for HCV-viremic recipients was reassuring.


Assuntos
Vírus BK , Transplante de Rim , Infecções por Polyomavirus , Infecções Tumorais por Vírus , Hepacivirus , Humanos , Transplante de Rim/efeitos adversos , Estudos Retrospectivos , Infecções Tumorais por Vírus/etiologia , Viremia
15.
Liver Transpl; 28(3): 454-465, 2022 03.
Article in English | MEDLINE | ID: mdl-34365719

ABSTRACT

Transplant center performance and practice variation for pediatric post-liver transplantation (LT) outcomes other than survival are understudied. This was a retrospective cohort study of pediatric LT recipients who received transplants between January 1, 2006, and May 31, 2017, using United Network for Organ Sharing (UNOS) data that were merged with the Pediatric Health Information System database. Center effects for the acute rejection rate at 1 year after LT (AR1) using UNOS coding and the biliary complication rate at 1 year after LT (BC1) using inpatient billing claims data were estimated by center-specific rescaled odds ratios that accounted for potential differences in recipient and donor characteristics. There were 2216 pediatric LT recipients at 24 freestanding children's hospitals in the United States during the study period. The median unadjusted center rate of AR1 was 36.92% (interquartile range [IQR], 22.36%-44.52%), whereas that of BC1 was 32.29% (IQR, 26.14%-40.44%). Accounting for recipient case mix and donor factors, 5/24 centers performed better than expected with regard to AR1, whereas 3/24 centers performed worse than expected. There was less heterogeneity across the center effects for BC1 than for AR1. There was no relationship observed between the center effects for AR1 or BC1 and center volume. Beyond recipient and allograft factors, differences in transplant center management are an important driver of center AR1 performance, and less so of BC1 performance. Further research is needed to identify the sources of variability so as to implement the most effective solutions to broadly enhance outcomes for pediatric LT recipients.


Subjects
Liver Transplantation, Child, Factual Databases, Graft Survival, Humans, Liver Transplantation/adverse effects, Retrospective Studies, Tissue Donors, Transplant Recipients, United States/epidemiology
16.
Biometrics ; 78(1): 192-201, 2022 03.
Article in English | MEDLINE | ID: mdl-33616953

ABSTRACT

Restricted mean survival time (RMST) is a clinically interpretable and meaningful survival metric that has gained popularity in recent years. Several methods are available for regression modeling of RMST, most based on pseudo-observations or on what is essentially an inverse-weighted complete-case analysis. No existing RMST regression method allows the covariate effects to be expressed as functions over time, a considerable limitation in light of the many hazard regression methods that do accommodate such effects. To address this void in the literature, we propose RMST methods that permit the estimation of time-varying effects. In particular, we propose an inference framework for directly modeling RMST as a continuous function of the restriction time L. Large-sample properties are derived, and simulation studies are performed to evaluate the finite-sample performance of the methods. The proposed framework is applied to kidney transplant data obtained from the Scientific Registry of Transplant Recipients.
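As context for the entry above: the RMST at a restriction time L is the area under the survival curve on [0, L]. A minimal, dependency-light sketch (numpy only; a nonparametric Kaplan-Meier version, not the authors' time-varying regression estimator) is:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates at each distinct event time.

    times  : array of follow-up times
    events : 1 = event observed, 0 = censored
    """
    order = np.argsort(times)
    times, events = times[order], events[order]
    event_times = np.unique(times[events == 1])
    surv, s = [], 1.0
    for t in event_times:
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk
        surv.append(s)
    return event_times, np.array(surv)

def rmst(times, events, tau):
    """Restricted mean survival time: area under S(t) on [0, tau]."""
    t, s = kaplan_meier(times, events)
    # S(t) is a step function equal to 1 before the first event time
    grid = np.concatenate(([0.0], t[t < tau], [tau]))
    vals = np.concatenate(([1.0], s[t < tau]))
    return float(np.sum(np.diff(grid) * vals))
```

With four subjects failing at times 1, 2, 3, 4 and tau = 4, the survival curve steps down by 0.25 at each event time and `rmst` returns 2.5, i.e. the average event-free time over the first 4 time units.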


Subjects
Survival Rate, Proportional Hazards Models, Regression Analysis, Sample Size, Survival Analysis
17.
Transpl Int ; 35: 10345, 2022.
Article in English | MEDLINE | ID: mdl-35356400

ABSTRACT

Optimal kidney graft outcomes after simultaneous liver-kidney (SLK) transplant may be threatened by the increased cold ischemia time and hemodynamic perturbations of dual organ transplantation. Hypothermic machine perfusion (MP) of kidney allografts may mitigate these effects. We analyzed U.S. trends and renal outcomes of hypothermic non-oxygenated MP vs. static cold storage (CS) of kidney grafts from 6,689 SLK transplants performed between 2005 and 2020 using the United Network for Organ Sharing database. Outcomes included delayed graft function (DGF), primary non-function (PNF), and kidney graft survival (GS). Overall, 17.2% of kidney allografts were placed on MP. Kidney cold ischemia time was longer in the MP group (median 12.8 vs. 10.0 h; p < 0.001). Nationally, MP utilization in SLK increased from <3% in 2005 to >25% by 2019. Center preference was the primary determinant of whether a graft underwent MP vs. CS (intraclass correlation coefficient 65.0%). MP reduced DGF (adjusted OR 0.74; p = 0.008), but not PNF (p = 0.637). Improved GS with MP was only observed with Kidney Donor Profile Index <20% (HR 0.71; p = 0.030). Kidney MP has increased significantly in SLK in the U.S. in a heterogeneous manner and with variable short-term benefits. Additional studies are needed to determine the ideal utilization for MP in SLK.
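For the intraclass correlation quoted in the entry above (center preference explaining 65% of the variation in MP vs. CS use), a common construction for a binary outcome is the latent-scale ICC of a random-intercept logistic model, with the residual variance fixed at pi^2/3. A small sketch, assuming a hypothetical between-center variance (the abstract does not report one):

```python
import math

def latent_scale_icc(center_var):
    """ICC on the latent (log-odds) scale for a random-intercept
    logistic model: between-center variance over total latent
    variance, with the logistic residual variance fixed at pi^2/3."""
    residual_var = math.pi ** 2 / 3
    return center_var / (center_var + residual_var)

# Hypothetical: a between-center random-intercept variance of about
# 6.11 on the log-odds scale yields an ICC near the reported 65%.
print(round(latent_scale_icc(6.11), 2))
```

The design choice here is that the logistic distribution's fixed variance (pi^2/3 = 3.29) plays the role of the within-group variance, so the ICC depends only on the estimated random-intercept variance.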


Subjects
Kidney Transplantation, Allografts, Humans, Kidney, Liver, Organ Preservation, Perfusion, United States
18.
Am J Transplant ; 21(3): 1092-1099, 2021 03.
Article in English | MEDLINE | ID: mdl-32741074

ABSTRACT

Transplant centers coordinate complex care in acute liver failure (ALF), for which liver transplant (LT) can be lifesaving. We studied associations between waitlist outcomes and center (1) ALF waitlist volume (low: <20; medium: 20-39; high: 40+ listings) and (2) total LT volume (<600, 600-1199, 1200+ LTs) in a retrospective cohort of 3248 adults with ALF listed for LT at 92 centers nationally from 2002 to 2019. Predicted outcome probabilities (LT, died/too sick, spontaneous survival [SS]) were obtained with multinomial regression, and observed-to-expected ratios were calculated. Median center outcome rates were 72.6% LT, 18.2% died/too sick, and 6.1% SS. SS was significantly higher with greater center ALF volume (median 0% for low-, 5.9% for medium-, and 8.6% for high-volume centers; P = .039), while waitlist mortality was highest at low-volume centers (median 21.4%, IQR: 16.1%-26.7%; P = .042). Significant heterogeneity in center performance was observed for waitlist mortality (observed-to-expected ratio range: 0-4.1) and particularly for SS (0-6.4), which persisted despite accounting for recipient case mix. This novel study demonstrates that increased center experience is associated with greater SS and reduced waitlist mortality for ALF. More-focused management pathways are needed to improve ALF outcomes at less-experienced centers and to identify opportunities for improvement at large.
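The observed-to-expected ratios in the entry above can be illustrated with a toy calculation: a center's expected event count is the sum of its patients' case-mix-adjusted predicted probabilities (here from any fitted model, such as the multinomial regression the study used), and O/E compares the observed count against that sum. All numbers below are hypothetical:

```python
import numpy as np

def observed_to_expected(observed, predicted_probs):
    """O/E ratio for one center: observed event count divided by
    the sum of each listed patient's predicted event probability."""
    expected = float(np.sum(predicted_probs))
    return float(np.sum(observed)) / expected

# Hypothetical center: 5 waitlisted patients, 2 of whom died/became
# too sick; predicted probabilities come from a case-mix model.
obs = np.array([1, 0, 1, 0, 0])
pred = np.array([0.30, 0.10, 0.25, 0.20, 0.15])  # expected count = 1.0
print(observed_to_expected(obs, pred))  # 2 observed / 1.0 expected = 2.0
```

An O/E above 1 means the center saw more events than its case mix predicts; the abstract's range of 0 to 4.1 for waitlist mortality is this quantity computed per center.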


Subjects
Acute Liver Failure, Liver Transplantation, Adult, Humans, Acute Liver Failure/surgery, Registries, Retrospective Studies, Waiting Lists
19.
Liver Transpl ; 27(8): 1154-1164, 2021 08.
Article in English | MEDLINE | ID: mdl-33733570

ABSTRACT

Black race is a risk factor for end-stage renal disease (ESRD). Racial disparities in the risks of early and long-term renal complications after liver transplantation (LT) have not been systematically studied. This study evaluated racial differences in the natural history of acute and chronic renal insufficiency after LT. This was a retrospective single-center cohort study of 763 non-Hispanic White and 181 Black LT recipients between 2008 and 2017. Black race was investigated as an independent predictor of the following outcomes: (1) receipt and duration of early post-LT hemodialysis and (2) time to post-LT ESRD. The interaction of race and post-LT ESRD on survival was also studied. Black recipients had higher rates of pre-LT hypertension (P < 0.001), but diabetes mellitus and renal function before LT were not different by race (all P > 0.05). Overall, 15.2% of patients required early hemodialysis immediately after LT with no difference by race (covariate-adjusted odds ratio, 0.89; P = 0.71). Early dialysis discontinuation was lower among Black recipients (covariate-adjusted hazard ratio [aHR], 0.47; P = 0.02), whereas their rate of post-LT ESRD was higher (aHR, 1.91; P = 0.005). Post-LT survival after ESRD was markedly worse for Black (aHR, 11.18; P < 0.001) versus White recipients (aHR, 5.83; P < 0.001; interaction P = 0.08). Although Black and White LT recipients had comparable pretransplant renal function, post-LT renal outcomes differed considerably, and the impact of ESRD on post-LT survival was greater for Black recipients. This study highlights the need for an individualized approach to post-LT management to improve outcomes for all patients.


Subjects
Chronic Kidney Failure, Liver Transplantation, Cohort Studies, Humans, Chronic Kidney Failure/surgery, Liver Transplantation/adverse effects, Renal Dialysis/adverse effects, Retrospective Studies, Risk Factors
20.
Am J Kidney Dis ; 78(3): 369-379.e1, 2021 09.
Article in English | MEDLINE | ID: mdl-33857533

ABSTRACT

RATIONALE & OBJECTIVE: As the proportion of arteriovenous fistulas (AVFs) relative to arteriovenous grafts (AVGs) in the United States has increased, there has been a concurrent increase in interventions. We explored the AVF and AVG maturation and maintenance procedural burden in the first year of hemodialysis. STUDY DESIGN: Observational cohort study. SETTING & PARTICIPANTS: Patients initiating hemodialysis from July 1, 2012, to December 31, 2014, who had a first-time AVF or AVG placement between dialysis initiation and 1 year (N = 73,027), identified using the US Renal Data System (USRDS). PREDICTORS: Patient characteristics. OUTCOME: Successful AVF/AVG use and intervention procedure burden. ANALYTICAL APPROACH: For each group, we analyzed interventional procedure rates during the maturation and maintenance phases using Poisson regression. We used proportional rate modeling for the covariate-adjusted analysis of interventional procedure rates during the maintenance phase. RESULTS: During the maturation phase, 13,989 of 57,275 patients (24.4%) in the AVF group required intervention, with a therapeutic interventional requirement of 0.36 per person; in the AVG group, 2,904 of 15,572 patients (18.4%) required intervention, with a therapeutic interventional requirement of 0.28 per person. During the maintenance phase, 12,732 of 32,115 patients (39.6%) in the AVF group required intervention, with a therapeutic intervention rate of 0.93 per person-year, and 5,928 of 10,271 patients (57.7%) in the AVG group required intervention, with a therapeutic intervention rate of 1.87 per person-year. For both phases, intervention rates for AVFs tended to be higher on the East Coast, while those for AVGs were more uniform geographically. LIMITATIONS: This study relies on administrative data, with monthly recording of access use. CONCLUSIONS: During maturation, interventions for both AVFs and AVGs were relatively common.
Once successfully matured, AVFs had lower maintenance interventional requirements. During the maturation and maintenance phases, there were geographic variations in AVF intervention rates that warrant additional study.
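The per-person-year figures in the entry above are crude Poisson rates (event count divided by total follow-up time). A minimal sketch with a standard large-sample confidence interval, using hypothetical counts chosen to mirror the ~0.93-per-person-year AVF maintenance rate:

```python
import math

def event_rate(events, person_years):
    """Crude event rate per person-year with a Wald 95% CI computed
    on the log scale (large-sample Poisson approximation)."""
    rate = events / person_years
    se_log = 1.0 / math.sqrt(events)  # SE of log(rate) for a Poisson count
    lo = rate * math.exp(-1.96 * se_log)
    hi = rate * math.exp(1.96 * se_log)
    return rate, lo, hi

# Hypothetical: 93 therapeutic interventions over 100 person-years
# of AVF maintenance follow-up.
rate, lo, hi = event_rate(93, 100.0)
print(round(rate, 2))  # 0.93
```

Comparing two such rates (e.g., AVF 0.93 vs. AVG 1.87 per person-year) is what a Poisson regression with follow-up time as an offset formalizes, with covariate adjustment added on top.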


Subjects
Surgical Arteriovenous Shunt/adverse effects, Vascular Graft Occlusion/epidemiology, Renal Dialysis/adverse effects, Vascular Patency/physiology, Adult, Aged, Female, Follow-Up Studies, Vascular Graft Occlusion/etiology, Vascular Graft Occlusion/physiopathology, Humans, Incidence, Chronic Kidney Failure/therapy, Male, Middle Aged, Retrospective Studies, Risk Factors, Time Factors, United States/epidemiology, Young Adult