1.
Am J Transplant ; 2024 Aug 23.
Article in English | MEDLINE | ID: mdl-39182615

ABSTRACT

Lung size measurements play an important role in transplantation, as optimal donor-recipient size matching is necessary to ensure the best possible outcome. Although several strategies for size matching are currently used, all have limitations, and none has proven superior. In this pilot study, we leveraged deep learning and computer vision to develop an automated system for generating standardized lung size measurements using portable chest radiographs to improve accuracy, reduce variability, and streamline donor/recipient matching. We developed a 2-step framework involving lung mask extraction from chest radiographs followed by feature point detection to generate 6 distinct lung height and width measurements, which we validated against measurements reported by 2 radiologists (M.K.I. and R.R.) for 50 lung transplant recipients. Our system demonstrated <2.5% error (<7.0 mm) with robust interrater and intrarater agreement compared with an expert radiologist review. This is especially promising given that the radiographs used in this study were purposely chosen to include images with technical challenges such as consolidations, effusions, and patient rotation. Although validation in a larger cohort is necessary, this study highlights artificial intelligence's potential to both provide reproducible lung size assessment in real patients and enable studies on the effect of lung size matching on transplant outcomes in large data sets.
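The measurement step of a pipeline like the one described can be illustrated with a minimal sketch (a hypothetical helper, not the authors' system): given a binary lung mask and a known pixel spacing, bounding-box height and width in millimetres follow directly.

```python
import numpy as np

def lung_extent_mm(mask: np.ndarray, pixel_spacing_mm: float = 1.0):
    """Height and width of a binary lung mask in millimetres.

    mask: 2D boolean array (True = lung pixel).
    Returns (height_mm, width_mm) of the mask's bounding box.
    """
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    if not rows.any():
        return 0.0, 0.0
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    height = (r1 - r0 + 1) * pixel_spacing_mm
    width = (c1 - c0 + 1) * pixel_spacing_mm
    return height, width

# Toy example: a 4x3 block of "lung" pixels at 0.5 mm spacing
demo = np.zeros((10, 10), dtype=bool)
demo[2:6, 4:7] = True
print(lung_extent_mm(demo, pixel_spacing_mm=0.5))  # (2.0, 1.5)
```

The study's six height/width measurements are derived from detected feature points rather than a raw bounding box; this sketch only shows the geometry-from-mask step.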

2.
Biom J ; 66(6): e202200371, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39149839

ABSTRACT

Analysis of the restricted mean survival time (RMST) has become increasingly common in biomedical studies during the last decade as a means of estimating treatment or covariate effects on survival. Advantages of RMST over the hazard ratio (HR) include increased interpretability and lack of reliance on the often tenuous proportional hazards assumption. Some authors have argued that RMST regression should generally be the frontline analysis as opposed to methods based on counting process increments. However, in order for the use of the RMST to be more mainstream, it is necessary to broaden the range of data structures to which pertinent methods can be applied. In this report, we address this issue from two angles. First, most of existing methodological development for directly modeling RMST has focused on multiplicative models. An additive model may be preferred due to goodness of fit and/or parameter interpretation. Second, many settings encountered nowadays feature high-dimensional categorical (nuisance) covariates, for which parameter estimation is best avoided. Motivated by these considerations, we propose stratified additive models for direct RMST analysis. The proposed methods feature additive covariate effects. Moreover, nuisance factors can be factored out of the estimation, akin to stratification in Cox regression, such that focus can be appropriately awarded to the parameters of chief interest. Large-sample properties of the proposed estimators are derived, and a simulation study is performed to assess finite-sample performance. In addition, we provide techniques for evaluating a fitted model with respect to risk discrimination and predictive accuracy. The proposed methods are then applied to liver transplant data to estimate the effects of donor characteristics on posttransplant survival time.
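As a rough illustration of the RMST quantity discussed above (not the authors' stratified additive model), the RMST up to a truncation time tau is the area under the Kaplan-Meier curve on [0, tau], computable on toy data as:

```python
import numpy as np

def km_rmst(times, events, tau):
    """Restricted mean survival time up to tau via the Kaplan-Meier estimator.

    times:  event/censoring times
    events: 1 = event observed, 0 = censored
    RMST = area under the KM survival curve on [0, tau].
    """
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]

    s = 1.0          # current survival probability
    t_prev = 0.0
    area = 0.0
    n = len(times)
    for i, (t, d) in enumerate(zip(times, events)):
        if t > tau:
            break
        area += s * (t - t_prev)   # survival curve is flat between times
        t_prev = t
        if d:                      # step down only at observed events
            s *= 1.0 - 1.0 / (n - i)
    area += s * (tau - t_prev)     # last flat stretch out to tau
    return area

# No censoring, events at t=1 and t=3, tau=4:
# S = 1 on [0,1), 0.5 on [1,3), 0 on [3,4) -> RMST = 1 + 1 = 2
print(km_rmst([1, 3], [1, 1], 4))  # 2.0
```

Direct RMST regression, as in the paper, models this quantity as a function of covariates rather than computing it nonparametrically per group.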


Subject(s)
Models, Statistical; Humans; Survival Analysis; Liver Transplantation; Proportional Hazards Models; Biometry/methods
3.
Liver Transpl ; 2024 Aug 26.
Article in English | MEDLINE | ID: mdl-39177579

ABSTRACT

End-stage renal disease (ESRD) after liver transplantation (LT) is associated with high morbidity and mortality. The consequences of hospitalizations for post-LT acute kidney injury (AKI) are poorly understood. Using linked Medicare claims and transplant registry data, we analyzed adult liver alone recipients not receiving pre-transplant dialysis between 1/1/2007-12/31/2016. Covariate-adjusted Cox proportional hazards models stratified by center evaluated factors associated with AKI readmission during the first post-LT year, and whether AKI readmission was associated with de novo early (<1 y) or late (≥1 y) ESRD post-LT. The cohort included 10,559 patients and was 64.5% male, 72.5% White, 8.1% Black and 14.0% Hispanic with median age 62 years. Overall, 2,875 (27.2%) patients had ≥1 AKI hospitalization during the first year. eGFR at LT was associated with AKI readmission (aHR 1.16 per 10 mL/min/1.73m2 decrease; p<0.001). The aHR for early ESRD in patients with ≥1 AKI readmission <90 days post-LT was 1.90 (p<0.001). The aHRs for late ESRD with 1 and ≥2 prior AKI readmissions were 1.57 and 2.80 respectively (p<0.001). AKI readmissions in the first post-LT year impact over one-quarter of recipients. These increase the risk of subsequent ESRD, but may represent an opportunity to intervene and mitigate further renal dysfunction.

4.
Transpl Infect Dis ; 26(4): e14317, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38852064

ABSTRACT

BACKGROUND: Opportunistic infections (OIs) are a significant cause of morbidity and mortality after organ transplantation, though data in the liver transplant (LT) population are limited. METHODS: We performed a retrospective cohort study of LT recipients between January 1, 2007 and December 31, 2016 using Medicare claims data linked to the Organ Procurement and Transplantation Network database. Multivariable Cox regression models evaluated factors independently associated with hospitalizations for early (≤1 year post transplant) and late (>1 year) OIs, with a particular focus on immunosuppression. RESULTS: There were 11 320 LT recipients included in the study, of whom 13.2% had at least one OI hospitalization during follow-up. Of the 2638 OI hospitalizations, 61.9% were early post-LT. Cytomegalovirus was the most common OI (45.4% overall), although relative frequency decreased after the first year (25.3%). Neither induction nor maintenance immunosuppression was associated with early OI hospitalization (all p > .05). The highest risk of early OI was seen with primary sclerosing cholangitis (aHR 1.74; p = .003 overall). Steroid-based and mechanistic target of rapamycin inhibitor-based immunosuppression at 1 year post LT were independently associated with increased late OI (p < .001 overall). CONCLUSION: This study found OI hospitalizations to be relatively common among LT recipients and frequently occur later than previously reported. Immunosuppression regimen may be an important modifiable risk factor for late OIs.


Subject(s)
Hospitalization; Liver Transplantation; Medicare; Opportunistic Infections; Humans; United States/epidemiology; Male; Medicare/statistics & numerical data; Female; Hospitalization/statistics & numerical data; Retrospective Studies; Risk Factors; Aged; Opportunistic Infections/epidemiology; Middle Aged; Liver Transplantation/adverse effects; Immunosuppressive Agents/adverse effects; Immunosuppressive Agents/therapeutic use; Immunosuppression Therapy/adverse effects; Cytomegalovirus Infections/epidemiology
5.
JAMA Netw Open ; 7(6): e2417107, 2024 Jun 03.
Article in English | MEDLINE | ID: mdl-38916893

ABSTRACT

Importance: Centralizing deceased organ donor management and organ recovery into donor care units (DCUs) may mitigate the critical organ shortage by positively impacting donation and recipient outcomes. Objective: To compare donation and lung transplant outcomes between 2 common DCU models: independent (outside of acute-care hospitals) and hospital-based. Design, Setting, and Participants: This is a retrospective cohort study of Organ Procurement and Transplantation Network deceased donor registry and lung transplant recipient files from 21 US donor service areas with an operating DCU. Characteristics and lung donation rates among deceased donors cared for in independent vs hospital-based DCUs were compared. Eligible participants included deceased organ donors (aged 16 years and older) after brain death, who underwent organ recovery procedures between April 26, 2017, and June 30, 2022, and patients who received lung transplants from those donors. Data analysis was conducted from May 2023 to March 2024. Exposure: Organ recovery in an independent DCU (vs hospital-based DCU). Main Outcome and Measures: The primary outcome was duration of transplanted lung survival (through December 31, 2023) among recipients of lung(s) transplanted from cohort donors. A Cox proportional hazards model stratified by transplant year and program, adjusting for donor and recipient characteristics was used to compare graft survival. Results: Of 10 856 donors in the starting sample (mean [SD] age, 42.8 [15.2] years; 6625 male [61.0%] and 4231 female [39.0%]), 5149 (primary comparison group) underwent recovery procedures in DCUs including 1466 (28.4%) in 11 hospital-based DCUs and 3683 (71.5%) in 10 independent DCUs. Unadjusted lung donation rates were higher in DCUs than local hospitals, but lower in hospital-based vs independent DCUs (418 donors [28.5%] vs 1233 donors [33.5%]; P < .001). Among 1657 transplant recipients, 1250 (74.5%) received lung(s) from independent DCUs. 
Median (range) duration of follow-up after transplant was 734 (0-2292) days. Grafts recovered from independent DCUs had shorter restricted mean (SE) survival times than grafts from hospital-based DCUs (1548 [27] days vs 1665 [50] days; P = .04). After adjustment, graft failure remained higher among lungs recovered from independent DCUs than hospital-based DCUs (hazard ratio, 1.85; 95% CI, 1.28-2.65). Conclusions and Relevance: In this retrospective analysis of national donor and transplant recipient data, although lung donation rates were higher from deceased organ donors after brain death cared for in independent DCUs, lungs recovered from donors in hospital-based DCUs survived longer. These findings suggest that further work is necessary to understand which factors (eg, donor transfer, management, or lung evaluation and acceptance practices) differ between DCU models and may contribute to these differences.


Subject(s)
Lung Transplantation; Tissue and Organ Procurement; Humans; Lung Transplantation/statistics & numerical data; Male; Female; Retrospective Studies; Middle Aged; Adult; Tissue and Organ Procurement/statistics & numerical data; Tissue and Organ Procurement/methods; Tissue Donors/statistics & numerical data; Tissue Donors/supply & distribution; Transplant Recipients/statistics & numerical data; United States; Registries; Graft Survival
6.
Stat Med ; 43(16): 3036-3050, 2024 Jul 20.
Article in English | MEDLINE | ID: mdl-38780593

ABSTRACT

In evaluating the performance of different facilities or centers on survival outcomes, the standardized mortality ratio (SMR), which compares the observed to expected mortality has been widely used, particularly in the evaluation of kidney transplant centers. Despite its utility, the SMR may exaggerate center effects in settings where survival probability is relatively high. An example is one-year graft survival among U.S. kidney transplant recipients. We propose a novel approach to estimate center effects in terms of differences in survival probability (ie, each center versus a reference population). An essential component of the method is a prognostic score weighting technique, which permits accurately evaluating centers without necessarily specifying a correct survival model. Advantages of our approach over existing facility-profiling methods include a metric based on survival probability (greater clinical relevance than ratios of counts/rates); direct standardization (valid to compare between centers, unlike indirect standardization based methods, such as the SMR); and less reliance on correct model specification (since the assumed model is used to generate risk classes as opposed to fitted-value based 'expected' counts). We establish the asymptotic properties of the proposed weighted estimator and evaluate its finite-sample performance under a diverse set of simulation settings. The method is then applied to evaluate U.S. kidney transplant centers with respect to graft survival probability.
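The contrast the abstract draws, indirect standardization (the SMR) versus direct standardization on the survival-probability scale, can be illustrated with hypothetical numbers (all counts and probabilities below are invented for illustration):

```python
# Indirect standardization (SMR): observed deaths / expected deaths,
# where "expected" applies reference-population rates to this
# center's own case mix.
observed_deaths = 12
expected_deaths = 8.0
smr = observed_deaths / expected_deaths
print(f"SMR = {smr:.2f}")  # 1.50: 50% more deaths than expected

# Direct standardization on the survival-probability scale: weight the
# center's risk-class-specific survival by the REFERENCE population's
# case mix, then take the difference from the reference survival.
# When survival is high, a striking SMR can correspond to a small
# absolute difference in survival probability, the concern the
# abstract raises about one-year graft survival.
ref_mix = [0.6, 0.4]          # reference proportion of low-/high-risk
center_surv = [0.97, 0.90]    # center survival by risk class
ref_surv = 0.96               # reference population survival
center_std = sum(w * s for w, s in zip(ref_mix, center_surv))
print(f"difference = {center_std - ref_surv:+.3f}")  # -0.018
```

The paper's method additionally uses prognostic-score weighting to form the risk classes without relying on a correctly specified survival model; that machinery is beyond this arithmetic sketch.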


Subject(s)
Graft Survival; Kidney Transplantation; Models, Statistical; Kidney Transplantation/mortality; Humans; Prognosis; Survival Analysis; United States; Probability; Computer Simulation
7.
Kidney Med ; 6(5): 100814, 2024 May.
Article in English | MEDLINE | ID: mdl-38689836

ABSTRACT

Rationale & Objective: Limited data exist on longitudinal kidney outcomes after nonsurgical obesity treatments. We investigated the effects of intensive lifestyle intervention on kidney function over 10 years. Study Design: Post hoc analysis of Action for Health in Diabetes (Look AHEAD) randomized controlled trial. Setting & Participants: We studied 4,901 individuals with type 2 diabetes and body mass index of ≥25 kg/m2 enrolled in Look AHEAD (2001-2015). The original Look AHEAD trial excluded individuals with 4+ urine dipstick protein, serum creatinine level of >1.4 mg/dL (women), 1.5 mg/dL (men), or dialysis dependence. Exposures: Intensive lifestyle intervention versus diabetes support and education (ie, usual care). Outcome: Primary outcome was estimated glomerular filtration rate (eGFR, mL/min/1.73 m2) slope. Secondary outcomes were mean eGFR and the slope and mean of the urine albumin to creatinine ratio (UACR, mg/mg). Analytical Approach: Linear mixed-effects models with random slopes and intercepts to evaluate the association between randomization arms and within-individual repeated measures of eGFR and UACR. We tested for effect modification by baseline eGFR. Results: At baseline, mean eGFR was 89, and 83% had a normal UACR. Over 10 years, there was no difference in eGFR slope (+0.064 per year; 95% CI: -0.036 to 0.16; P = 0.21) between arms. Neither the slope nor the mean of UACR differed between arms. Baseline eGFR, categorized as eGFR of <80, 80-100, or >100, did not modify the intervention's effect on eGFR slope or mean. Limitations: Loss of muscle may confound creatinine-based eGFR. Conclusions: In patients with type 2 diabetes and preserved kidney function, intensive lifestyle intervention did not change eGFR slope over 10 years. Among participants with baseline eGFR <80, lifestyle intervention had a slightly higher longitudinal mean eGFR than usual care. Further studies evaluating the effects of intensive lifestyle intervention in people with kidney disease are needed.
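As a simplified stand-in for the mixed-effects slope analysis described above (per-subject least-squares eGFR slopes on synthetic data, not the trial's model or its data):

```python
import numpy as np

def egfr_slopes(times, egfr_by_subject):
    """Per-subject eGFR slope (units per year) by ordinary least squares.

    A crude surrogate for a random-slope mixed model: fit a line per
    subject, then compare mean slopes between randomization arms.
    """
    return [np.polyfit(times, y, 1)[0] for y in egfr_by_subject]

years = [0, 2, 4, 6, 8, 10]
# Two synthetic subjects per arm (invented values for illustration)
intervention = [[89, 88, 88, 87, 86, 86], [92, 91, 90, 90, 89, 88]]
usual_care = [[90, 89, 87, 86, 85, 83], [88, 86, 85, 83, 82, 80]]

diff = np.mean(egfr_slopes(years, intervention)) - np.mean(
    egfr_slopes(years, usual_care))
print(f"slope difference (intervention - usual care): {diff:+.3f} per year")
```

A real analysis would fit both arms jointly with random intercepts and slopes (e.g. a linear mixed model) so that within-subject correlation and unbalanced follow-up are handled properly.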


Lifestyle interventions can improve chronic kidney disease risk factors, specifically diabetes, hypertension, and obesity. But, the effects of lifestyle intervention on change in kidney function (estimated glomerular filtration rate [eGFR]) over time are not well established. We studied Action for Health in Diabetes (Look AHEAD) trial data because all participants were affected by diabetes and overweight or obesity. Look AHEAD randomized participants to intensive lifestyle intervention or diabetes support and education (ie, usual care). We compared eGFR change over 10 years between groups, but found no difference. However, the intervention group maintained slightly higher eGFR than usual care, especially if eGFR was relatively low at baseline. Our study suggests lifestyle intervention may preserve eGFR, but dedicated studies in individuals with chronic kidney disease are needed.

8.
Clin Gastroenterol Hepatol ; 22(8): 1618-1627.e4, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38599308

ABSTRACT

BACKGROUND & AIMS: Greater availability of less invasive biliary imaging to rule out choledocholithiasis should reduce the need for diagnostic endoscopic retrograde cholangiopancreatography (ERCP) in patients who have a remote history of cholecystectomy. The primary aims were to determine the incidence, characteristics, and outcomes of individuals who undergo first-time ERCP >1 year after cholecystectomy (late-ERCP). METHODS: Data from a commercial insurance claim database (Optum Clinformatics) identified 583,712 adults who underwent cholecystectomy, 4274 of whom underwent late-ERCP, defined as first-time ERCP for nonmalignant indications >1 year after cholecystectomy. Outcomes were exposure and temporal trends in late-ERCP, biliary imaging utilization, and post-ERCP outcomes. Multivariable logistic regression was used to examine patient characteristics associated with undergoing late-ERCP. RESULTS: Despite a temporal increase in the use of noninvasive biliary imaging (35.9% in 2004 to 65.6% in 2021; P < .001), the rate of late-ERCP increased 8-fold (0.5-4.2/1000 person-years from 2005 to 2021; P < .001). Although only 44% of patients who underwent late-ERCP had gallstone removal, there were high rates of post-ERCP pancreatitis (7.1%), hospitalization (13.1%), and new chronic opioid use (9.7%). Factors associated with late-ERCP included concomitant disorder of gut-brain interaction (odds ratio [OR], 6.48; 95% confidence interval [CI], 5.88-6.91) and metabolic dysfunction steatotic liver disease (OR, 3.27; 95% CI, 2.79-3.55) along with use of anxiolytic (OR, 3.45; 95% CI, 3.19-3.58), antispasmodic (OR, 1.60; 95% CI, 1.53-1.72), and chronic opioids (OR, 6.24; 95% CI, 5.79-6.52). CONCLUSIONS: The rate of late-ERCP postcholecystectomy is increasing significantly, particularly in patients with comorbidities associated with disorder of gut-brain interaction and mimickers of choledocholithiasis. 
Late-ERCPs are associated with disproportionately higher rates of adverse events, including initiation of chronic opioid use.


Subject(s)
Cholangiopancreatography, Endoscopic Retrograde; Cholecystectomy; Humans; Male; Female; Cholangiopancreatography, Endoscopic Retrograde/adverse effects; Middle Aged; Cholecystectomy/adverse effects; Cholecystectomy/statistics & numerical data; Adult; Aged; Young Adult; Retrospective Studies; Adolescent; Postoperative Complications/epidemiology; Aged, 80 and over; Incidence
9.
Clin Transplant ; 38(5): e15319, 2024 May.
Article in English | MEDLINE | ID: mdl-38683684

ABSTRACT

OBJECTIVE: Longer end-stage renal disease time has been associated with inferior kidney transplant outcomes. However, the contribution of transplant evaluation is uncertain. We explored the relationship between time from evaluation to listing (ELT) and transplant outcomes. METHODS: This retrospective study included 2535 adult kidney transplants from 2000 to 2015. Kaplan-Meier survival curves, log-rank tests, and Cox regression models were used to compare transplant outcomes. RESULTS: Patient survival for both deceased donor (DD) recipients (p < .001) and living donor (LD) recipients (p < .0001) was significantly higher when ELT was less than 3 months. The risks of ELT appeared to be mediated by other risks in DD recipients, as adjusted models showed no associated risk of graft loss or death in DD recipients. For LD recipients, ELT remained a risk factor for patient death after covariate adjustment. Each month of ELT was associated with an increased risk of death (HR = 1.021, p = .04) but not graft loss in LD recipients in adjusted models. CONCLUSIONS: Kidney transplant recipients with longer ELT times had higher rates of death after transplant, and ELT was independently associated with an increased risk of death for LD recipients. Investigations on the impact of pretransplant evaluation on post-transplant outcomes can inform transplant policy and practice.


Subject(s)
Graft Survival; Kidney Failure, Chronic; Kidney Transplantation; Waiting Lists; Humans; Kidney Transplantation/mortality; Kidney Transplantation/adverse effects; Female; Male; Retrospective Studies; Middle Aged; Kidney Failure, Chronic/surgery; Follow-Up Studies; Risk Factors; Waiting Lists/mortality; Prognosis; Survival Rate; Adult; Graft Rejection/etiology; Graft Rejection/mortality; Tissue Donors/supply & distribution; Glomerular Filtration Rate; Kidney Function Tests; Living Donors/supply & distribution; Tissue and Organ Procurement; Time Factors; Postoperative Complications
10.
Am J Transplant ; 24(5): 839-849, 2024 May.
Article in English | MEDLINE | ID: mdl-38266712

ABSTRACT

Lung transplantation lags behind other solid organ transplants in donor lung utilization due, in part, to uncertainty regarding donor quality. We sought to develop an easy-to-use donor risk metric that, unlike existing metrics, accounts for a rich set of donor factors. Our study population consisted of n = 26 549 adult lung transplant recipients abstracted from the United Network for Organ Sharing Standard Transplant Analysis and Research file. We used Cox regression to model graft failure (GF; earliest of death or retransplant) risk based on donor and transplant factors, adjusting for recipient factors. We then derived and validated a Lung Donor Risk Index (LDRI) and developed a pertinent online application (https://shiny.pmacs.upenn.edu/LDRI_Calculator/). We found 12 donor/transplant factors that were independently predictive of GF: age, race, insulin-dependent diabetes, the difference between donor and recipient height, smoking, cocaine use, cytomegalovirus seropositivity, creatinine, human leukocyte antigen (HLA) mismatch, ischemia time, and donation after circulatory death. Validation showed the LDRI to have GF risk discrimination that was reasonable (C = 0.61) and higher than any of its predecessors. The LDRI is intended for use by transplant centers, organ procurement organizations, and regulatory agencies and to benefit patients in decision-making. Unlike its predecessors, the proposed LDRI could gain wide acceptance because of its granularity and similarity to the Kidney Donor Risk Index.
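The discrimination figure quoted above (C = 0.61) is Harrell's concordance index. A minimal implementation on toy data (ties in event time handled simplistically; real survival packages are more careful) might look like:

```python
from itertools import combinations

def harrell_c(times, events, risk_scores):
    """Harrell's concordance index for a risk score against survival data.

    A pair (i, j) is comparable when the shorter observed time is an
    event. Concordant: the subject who failed earlier has the higher
    risk score. Ties in risk score count as 0.5.
    """
    concordant = comparable = 0.0
    for i, j in combinations(range(len(times)), 2):
        if times[j] < times[i]:
            i, j = j, i            # order so i has the shorter time
        if times[i] == times[j] or not events[i]:
            continue               # tie in time, or earlier time censored
        comparable += 1
        if risk_scores[i] > risk_scores[j]:
            concordant += 1
        elif risk_scores[i] == risk_scores[j]:
            concordant += 0.5
    return concordant / comparable

# Perfectly ranked toy data -> C = 1.0
print(harrell_c([2, 4, 6], [1, 1, 1], [3.0, 2.0, 1.0]))  # 1.0
```

A C of 0.5 is chance-level ranking, so 0.61 indicates modest but real discrimination, consistent with the abstract's characterization of the LDRI as "reasonable".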


Subject(s)
Graft Rejection; Graft Survival; Lung Transplantation; Tissue Donors; Tissue and Organ Procurement; Humans; Lung Transplantation/adverse effects; Female; Male; Tissue Donors/supply & distribution; Middle Aged; Risk Factors; Adult; Graft Rejection/etiology; Follow-Up Studies; Prognosis; Risk Assessment
11.
Liver Transpl ; 30(7): 689-698, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38265295

ABSTRACT

Given liver transplantation organ scarcity, selection of recipients and donors to maximize post-transplant benefit is paramount. Several scores predict post-transplant outcomes by isolating elements of donor and recipient risk, including the donor risk index, Balance of Risk, pre-allocation score to predict survival outcomes following liver transplantation/survival outcomes following liver transplantation (SOFT), improved donor-to-recipient allocation score for deceased donors only/improved donor-to-recipient allocation score for both deceased and living donors (ID2EAL-D/-DR), and survival benefit (SB) models. No studies have examined the performance of these models over time, which is critical in an ever-evolving transplant landscape. This was a retrospective cohort study of liver transplantation events in the UNOS database from 2002 to 2021. We used Cox regression to evaluate model discrimination (Harrell's C) and calibration (testing of calibration curves) for post-transplant patient and graft survival at specified post-transplant timepoints. Sub-analyses were performed in the modern transplant era (post-2014) and for key donor-recipient characteristics. A total of 112,357 transplants were included. The SB and SOFT scores had the highest discrimination for short-term patient and graft survival, including in the modern transplant era, where only the SB model had good discrimination (C ≥ 0.60) for all patient and graft outcome timepoints. However, these models had evidence of poor calibration at 3- and 5-year patient survival timepoints. The ID2EAL-DR score had lower discrimination but adequate calibration at all patient survival timepoints. In stratified analyses, SB and SOFT scores performed better in younger (< 40 y) and higher Model for End-Stage Liver Disease (≥ 25) patients. All prediction scores had declining discrimination over time, and scores relying on donor factors alone had poor performance. 
Although the SB and SOFT scores had the best overall performance, all models demonstrated declining performance over time. This underscores the importance of periodically updating and/or developing new prediction models to reflect the evolving transplant field. Scores relying on donor factors alone do not meaningfully inform post-transplant risk.


Subject(s)
End Stage Liver Disease; Graft Survival; Liver Transplantation; Humans; Liver Transplantation/adverse effects; Retrospective Studies; Female; Male; Middle Aged; Risk Assessment/statistics & numerical data; Risk Assessment/methods; End Stage Liver Disease/surgery; End Stage Liver Disease/mortality; End Stage Liver Disease/diagnosis; Adult; Risk Factors; Time Factors; Living Donors/statistics & numerical data; Donor Selection/standards; Donor Selection/methods; Donor Selection/statistics & numerical data; Aged; Proportional Hazards Models; Tissue and Organ Procurement/statistics & numerical data; Tissue and Organ Procurement/methods; Tissue and Organ Procurement/standards; Treatment Outcome; Tissue Donors/statistics & numerical data; Databases, Factual/statistics & numerical data
12.
Blood Adv ; 8(5): 1272-1280, 2024 Mar 12.
Article in English | MEDLINE | ID: mdl-38163322

ABSTRACT

ABSTRACT: Hospitalized patients with inflammatory bowel disease (IBD) are at increased risk of venous thromboembolism (VTE). We aimed to evaluate the effectiveness and safety of prophylactic anticoagulation compared with no anticoagulation in hospitalized patients with IBD. We conducted a retrospective cohort study using a hospital-based database. We included patients with IBD who had a length of hospital stay ≥2 days between 1 January 2016 and 31 December 2019. We excluded patients who had other indications for anticoagulation, users of direct oral anticoagulants, warfarin, therapeutic-intensity heparin, and patients admitted for surgery. We defined exposure to prophylactic anticoagulation using charge codes. The primary effectiveness outcome was VTE. The primary safety outcome was bleeding. We used propensity score matching to reduce potential differences between users and nonusers of anticoagulants and Cox proportional-hazards regression to estimate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs). The analysis included 56 194 matched patients with IBD (users of anticoagulants, n = 28 097; nonusers, n = 28 097). In the matched sample, prophylactic use of anticoagulants (vs no use) was associated with a lower rate of VTE (HR, 0.62; 95% CI, 0.41-0.94) and with no difference in the rate of bleeding (HR, 1.05; 95% CI, 0.87-1.26). In this study of hospitalized patients with IBD, prophylactic use of heparin was associated with a lower rate of VTE without increasing bleeding risk compared with no anticoagulation. Our results suggest potential benefits of prophylactic anticoagulation to reduce the burden of VTE in hospitalized patients with IBD.
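The matching step central to this design can be sketched as greedy 1:1 nearest-neighbour matching within a caliper on precomputed propensity scores (the scores and caliper below are invented; the study's exact matching algorithm is not specified in the abstract):

```python
def greedy_ps_match(treated_ps, control_ps, caliper=0.05):
    """Greedy 1:1 nearest-neighbour propensity-score matching.

    Each treated unit is matched to the closest not-yet-used control
    whose score lies within the caliper; treated units with no eligible
    control are dropped. Returns (treated_index, control_index) pairs.
    """
    used = set()
    pairs = []
    for i, p in enumerate(treated_ps):
        best, best_d = None, caliper
        for j, q in enumerate(control_ps):
            d = abs(p - q)
            if j not in used and d <= best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            pairs.append((i, best))
    return pairs

treated = [0.30, 0.55, 0.90]
controls = [0.28, 0.52, 0.10, 0.57]
print(greedy_ps_match(treated, controls))  # [(0, 0), (1, 3)]
```

Note the third treated unit (score 0.90) finds no control within the caliper and is dropped, which is why matched analyses like this one report the matched sample size (here 28,097 pairs) rather than the full cohort.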


Subject(s)
Inflammatory Bowel Diseases; Venous Thromboembolism; Humans; Venous Thromboembolism/prevention & control; Venous Thromboembolism/complications; Retrospective Studies; Anticoagulants/adverse effects; Hemorrhage/chemically induced; Heparin/adverse effects; Inflammatory Bowel Diseases/complications; Inflammatory Bowel Diseases/drug therapy
14.
Transplantation ; 108(3): 713-723, 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-37635282

ABSTRACT

BACKGROUND: Outcomes after living-donor liver transplantation (LDLT) at high Model for End-stage Liver Disease (MELD) scores are not well characterized in the United States. METHODS: This was a retrospective cohort study using Organ Procurement and Transplantation Network data in adults listed for their first liver transplant alone between 2002 and 2021. Cox proportional hazards models evaluated the association of MELD score (<20, 20-24, 25-29, and ≥30) and patient/graft survival after LDLT and the association of donor type (living versus deceased) on outcomes stratified by MELD. RESULTS: There were 4495 LDLTs included with 5.9% at MELD 25-29 and 1.9% at MELD ≥30. LDLTs at MELD 25-29 and ≥30 LDLT have substantially increased since 2010 and 2015, respectively. Patient survival at MELD ≥30 was not different versus MELD <20: adjusted hazard ratio 1.67 (95% confidence interval, 0.96-2.88). However, graft survival was worse: adjusted hazard ratio (aHR) 1.69 (95% confidence interval, 1.07-2.68). Compared with deceased-donor liver transplant, LDLT led to superior patient survival at MELD <20 (aHR 0.92; P = 0.024) and 20-24 (aHR 0.70; P < 0.001), equivalent patient survival at MELD 25-29 (aHR 0.97; P = 0.843), but worse graft survival at MELD ≥30 (aHR 1.68, P = 0.009). CONCLUSIONS: Although patient survival remains acceptable, the benefits of LDLT may be lost at MELD ≥30.
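The MELD thresholds above follow the original (pre-MELD-Na) UNOS formula; a sketch of that published formula with its standard lab bounds (allocation variants in use today differ, e.g. MELD-Na and MELD 3.0):

```python
import math

def meld(bilirubin_mg_dl, inr, creatinine_mg_dl, on_dialysis=False):
    """Original (pre-MELD-Na) UNOS MELD score.

    Labs below 1.0 are floored at 1.0; creatinine is capped at 4.0
    (and set to 4.0 for patients on dialysis). The final score is
    rounded and bounded to the 6-40 range used for allocation.
    """
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    creat = 4.0 if on_dialysis else min(max(creatinine_mg_dl, 1.0), 4.0)
    raw = (3.78 * math.log(bili) + 11.2 * math.log(inr)
           + 9.57 * math.log(creat) + 6.43)
    return min(max(round(raw), 6), 40)

print(meld(1.0, 1.0, 1.0))    # all logs are 0 -> floored at 6
print(meld(10.0, 2.5, 2.0))   # 32, in the >=30 stratum studied above
```

So the study's MELD ≥30 stratum corresponds to severely deranged labs, which is where the abstract finds the graft-survival benefit of LDLT attenuating.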


Subject(s)
End Stage Liver Disease; Liver Transplantation; Adult; Humans; United States; Living Donors; Liver Transplantation/adverse effects; End Stage Liver Disease/diagnosis; End Stage Liver Disease/surgery; Retrospective Studies; Severity of Illness Index; Graft Survival; Treatment Outcome
15.
Pediatr Crit Care Med ; 25(1): e41-e46, 2024 Jan 01.
Article in English | MEDLINE | ID: mdl-37462429

ABSTRACT

OBJECTIVE: To determine the association of venovenous extracorporeal membrane oxygenation (VV-ECMO) initiation with changes in vasoactive-inotropic scores (VISs) in children with pediatric acute respiratory distress syndrome (PARDS) and cardiovascular instability. DESIGN: Retrospective cohort study. SETTING: Single academic pediatric ECMO center. PATIENTS: Children (1 mo to 18 yr) treated with VV-ECMO (2009-2019) for PARDS with need for vasopressor or inotropic support at ECMO initiation. MEASUREMENTS AND MAIN RESULTS: Arterial blood gas values, VIS, mean airway pressure (mPaw), and oxygen saturation (SpO2) values were recorded hourly relative to the start of ECMO flow for 24 hours pre-VV-ECMO and post-VV-ECMO cannulation. A sharp kink discontinuity regression analysis clustered by patient tested the difference in VISs and regression line slopes immediately surrounding cannulation. Thirty-two patients met inclusion criteria: median age 6.6 years (interquartile range [IQR] 1.5-11.7), 22% immunocompromised, and 75% had pneumonia or sepsis as the cause of PARDS. Pre-ECMO characteristics included: median oxygenation index 45 (IQR 35-58), mPaw 32 cm H2O (IQR 30-34), 97% on inhaled nitric oxide, and 81% on an advanced mode of ventilation. Median VIS immediately before VV-ECMO cannulation was 13 (IQR 8-25) with an overall increasing VIS trajectory over the hours before cannulation. VISs decreased and the slope of the regression line reversed immediately surrounding the time of cannulation (robust p < 0.0001). There were pre-ECMO to post-ECMO cannulation decreases in mPaw (32 vs 20 cm H2O, p < 0.001) and arterial PCO2 (64.1 vs 50.1 mm Hg, p = 0.007) and increases in arterial pH (7.26 vs 7.38, p = 0.001), arterial base excess (2.5 vs 5.2, p = 0.013), and SpO2 (91% vs 95%, p = 0.013). CONCLUSIONS: Initiation of VV-ECMO was associated with an immediate and sustained reduction in VIS in PARDS patients with cardiovascular instability. 
This VIS reduction was associated with decreased mPaw and reduced respiratory and/or metabolic acidosis as well as improved oxygenation.
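The vasoactive-inotropic score used in this abstract is conventionally computed as a dose-weighted sum of vasoactive infusions (the weighting of Gaies et al.). The sketch below is illustrative only; the example doses are hypothetical and not taken from the study:

```python
def vasoactive_inotropic_score(dopamine=0.0, dobutamine=0.0, epinephrine=0.0,
                               norepinephrine=0.0, milrinone=0.0, vasopressin=0.0):
    """VIS per the commonly used weighting: dopamine, dobutamine,
    epinephrine, norepinephrine, and milrinone doses in ug/kg/min,
    vasopressin in U/kg/min."""
    return (dopamine
            + dobutamine
            + 100 * epinephrine
            + 100 * norepinephrine
            + 10 * milrinone
            + 10_000 * vasopressin)

# A hypothetical patient on dopamine 5, epinephrine 0.05, and milrinone 0.3
# ug/kg/min scores 5 + 5 + 3 = 13, matching the median pre-cannulation VIS.
print(vasoactive_inotropic_score(dopamine=5, epinephrine=0.05, milrinone=0.3))
```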


Subject(s)
Extracorporeal Membrane Oxygenation, Respiratory Distress Syndrome, Respiratory Insufficiency, Humans, Child, Retrospective Studies, Respiratory Distress Syndrome/therapy, Respiratory Insufficiency/therapy, Arteries
16.
Hepatol Commun ; 7(10)2023 Oct 01.
Article in English | MEDLINE | ID: mdl-37916863

ABSTRACT

Liver transplantation is a life-saving option for decompensated cirrhosis. Liver transplant recipients require advanced self-management skills, intact cognitive skills, and care partner support to achieve good long-term outcomes. Gaps remain in understanding post-liver transplant cognitive and health trajectories and patient factors such as self-management skills, care partner support, and sleep. Our aims are to (1) assess pre-liver transplant to post-liver transplant cognitive trajectories and identify risk factors for persistent cognitive impairment; (2) evaluate associations between cognitive function and self-management skills, health behaviors, functional health status, and post-transplant outcomes; and (3) investigate potential mediators and moderators of associations between cognitive function and post-liver transplant outcomes. LivCog is a longitudinal, prospective observational study that will enroll 450 adult liver transplant recipients and their caregivers/care partners. The duration of the study is 5 years, with 24 additional months of patient follow-up. Data will be collected from participants at 1, 3, 12, and 24 months post-transplant. Limited pre-liver transplant data will also be collected from waitlisted candidates. Data collection methods include interviews, surveys, cognitive assessments, and actigraphy/sleep diary measures. Patient measurements include sociodemographic characteristics, pretransplant health status, cognitive function, physical function, perioperative measures, medical history, transplant history, self-management skills, patient-reported outcomes, health behaviors, and clinical outcomes. Caregiver measures assess sociodemographic variables, health literacy, health care navigation skills, self-efficacy, care partner preparedness, nature and intensity of care, care partner burden, and community participation. By elucidating various health trajectories from pre-liver transplant to 2 years post-liver transplant, LivCog will be able to better characterize recipients at higher risk of cognitive impairment and compromised self-management. Findings will inform interventions targeting health behaviors, self-management, and caregiver support to optimize outcomes.


Subject(s)
Cognitive Dysfunction, Liver Transplantation, Self-Management, Adult, Humans, Liver Transplantation/adverse effects, Prospective Studies, Cognition, Cognitive Dysfunction/etiology
17.
Prog Transplant ; 33(4): 283-292, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37941335

ABSTRACT

Introduction: Organ recovery facilities address the logistical challenges of hospital-based deceased organ donor management. Although more organs are now transplanted from donors managed in facilities, differences in donor management and donation processes are not fully characterized. Research Question: Do deceased donor management and organ transport distance differ between organ procurement organization (OPO)-based recovery facilities and hospitals? Design: Retrospective analysis of Organ Procurement and Transplant Network data, including adults after brain death in 10 procurement regions (April 2017-June 2021). The primary outcomes were ischemic times of transplanted hearts, kidneys, livers, and lungs. Secondary outcomes included transport distances (between the facility or hospital and the transplant program) for each transplanted organ. Results: Among 5010 deceased donors, 51.7% underwent recovery in an OPO-based recovery facility. After adjustment for recipient and system factors, mean differences in ischemic times of any transplanted organ were not significantly different between donors in facilities and hospitals. Transplanted hearts recovered from donors in facilities were transported further than hearts from hospital donors (median 255 mi [IQR 27, 475] versus 174 [IQR 42, 365], P = .002); transport distances for livers and kidneys were significantly shorter (P < .001 for both). Conclusion: Organ recovery procedures performed in OPO-based recovery facilities were not associated with differences in ischemic times of transplanted organs compared with organs recovered in hospitals, but differences in organ transport distances exist. Further work is needed to determine whether other observed differences in donor management and organ distribution meaningfully impact donation and transplantation outcomes.


Subject(s)
Organ Transplantation, Tissue and Organ Procurement, Adult, Humans, Retrospective Studies, Tissue Donors, Hospitals
18.
Stat Methods Med Res ; 32(12): 2386-2404, 2023 12.
Article in English | MEDLINE | ID: mdl-37965684

ABSTRACT

The hazard ratio (HR) remains the most frequently employed metric in assessing treatment effects on survival times. However, the difference in restricted mean survival time (RMST) has become a popular alternative to the HR when the proportional hazards assumption is considered untenable. Moreover, independent of the proportional hazards assumption, many comparative effectiveness studies aim to base contrasts on survival probability rather than on the hazard function. Causal effects based on RMST are often estimated via inverse probability of treatment weighting (IPTW). However, this approach generally yields biased estimates when the assumed propensity score model is misspecified. Motivated by the need for more robust techniques, we propose an empirical likelihood-based weighting approach that allows for specifying a set of propensity score models. The resulting estimator is consistent when the postulated model set contains a correct model; this property has been termed multiple robustness. In this report, we derive and evaluate a multiply robust estimator of the causal between-treatment difference in RMST. Simulation results confirm its robustness. Compared with the IPTW estimator from a correct model, the proposed estimator tends to be less biased and more efficient in finite samples. Additional simulations reveal biased results from a direct application of machine learning estimation of propensity scores. Finally, we apply the proposed method to evaluate the impact of intrapartum group B streptococcus antibiotic prophylaxis on the risk of childhood allergic disorders using data derived from electronic medical records from the Children's Hospital of Philadelphia and census data from the American Community Survey.
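For readers unfamiliar with the quantity being contrasted, the RMST is the area under the survival curve from time 0 to a truncation time tau. A minimal pure-Python sketch using the Kaplan-Meier estimator follows; this is only the unweighted building block, not the IPTW or multiply robust estimator proposed in the abstract:

```python
def rmst(times, events, tau):
    """Restricted mean survival time: area under the Kaplan-Meier
    survival curve from 0 to the truncation time tau.

    times  -- follow-up times
    events -- 1 if the event was observed, 0 if censored
    tau    -- truncation time
    """
    # Sort by time; at ties, process events before censorings (KM convention).
    data = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))
    n_at_risk = len(data)
    surv, prev_t, area = 1.0, 0.0, 0.0
    for t, d in data:
        if t > tau:
            break
        area += surv * (t - prev_t)        # survival is constant between steps
        if d:
            surv *= 1.0 - 1.0 / n_at_risk  # KM step down at an observed event
        n_at_risk -= 1
        prev_t = t
    return area + surv * (tau - prev_t)
```

A between-treatment contrast is then the difference `rmst(times1, events1, tau) - rmst(times0, events0, tau)`; the weighting schemes discussed in the abstract effectively replace the simple at-risk counts with sums of subject-specific weights.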


Subject(s)
Models, Statistical, Child, Humans, Likelihood Functions, Survival Rate, Proportional Hazards Models, Computer Simulation, Propensity Score
19.
Res Sq ; 2023 Oct 04.
Article in English | MEDLINE | ID: mdl-37886466

ABSTRACT

Long-term antiretroviral therapy (ART) durably suppresses HIV load and has dramatically altered the prognosis of HIV infection, such that HIV is now regarded as a chronic disease. Side effects of ART in people with HIV (PWH) have introduced new challenges, including "metabolic" (systemic) and oral complications. Furthermore, inflammation persists despite effective viral load suppression and normal CD4+ cell counts. The impact of ART on the spectrum of oral diseases among PWH is often overlooked relative to other systemic complications. There is a paucity of data on oral complications associated with ART use in PWH, in part because of the limited number of prospective longitudinal studies designed to better understand the range of oral abnormalities observed in PWH on ART. Our group designed and implemented a prospective observational longitudinal study to address this gap. We present a procedural roadmap that could be modeled to assess the extent and progression of oral diseases associated with ART in PWH. We describe the processes associated with subject recruitment and retention, study visit planning, oral health assessments, biospecimen collection and preprocessing procedures, and data management. We also highlight the rigors and challenges associated with participant recruitment and retention.

20.
BMC Oral Health ; 23(1): 763, 2023 10 17.
Article in English | MEDLINE | ID: mdl-37848867

ABSTRACT

BACKGROUND: Long-term antiretroviral therapy (ART) durably suppresses HIV load and has dramatically altered the prognosis of HIV infection, such that HIV is now regarded as a chronic disease. Side effects of ART in people with HIV (PWH) have introduced new challenges, including "metabolic" (systemic) and oral complications. Furthermore, inflammation persists despite effective viral load suppression and normal CD4+ cell counts. The impact of ART on the spectrum of oral diseases among PWH is often overlooked relative to other systemic complications. There is a paucity of data on oral complications associated with ART use in PWH, in part because of the limited number of prospective longitudinal studies designed to better understand the range of oral abnormalities observed in PWH on ART. METHODS: We describe here the study design, including the processes associated with subject recruitment and retention, study visit planning, oral health assessments, biospecimen collection and preprocessing procedures, and the data management and statistical plan. DISCUSSION: We present a procedural roadmap that could be modeled to assess the extent and progression of oral diseases associated with ART in PWH. We also highlight the rigors and challenges associated with our ongoing participant recruitment and retention. A rigorous prospective longitudinal study requires proper planning and execution; a major benefit is that the resulting large data sets and biospecimen repository can be used to answer further questions in future studies, including genetic, microbiome, and metabolome-based studies. TRIAL REGISTRATION: National Institutes of Health Clinical Trials Registration (NCT) #: NCT04645693.


Subject(s)
Anti-HIV Agents, HIV Infections, Humans, HIV Infections/complications, HIV Infections/drug therapy, Anti-HIV Agents/adverse effects, Longitudinal Studies, Prospective Studies, Viral Load, Outcome Assessment, Health Care