Results 1 - 20 of 168
1.
Stat Med ; 2024 May 23.
Article in English | MEDLINE | ID: mdl-38780593

ABSTRACT

In evaluating the performance of different facilities or centers on survival outcomes, the standardized mortality ratio (SMR), which compares observed with expected mortality, has been widely used, particularly in the evaluation of kidney transplant centers. Despite its utility, the SMR may exaggerate center effects in settings where survival probability is relatively high. An example is one-year graft survival among U.S. kidney transplant recipients. We propose a novel approach to estimate center effects in terms of differences in survival probability (ie, each center versus a reference population). An essential component of the method is a prognostic score weighting technique, which permits accurate evaluation of centers without necessarily specifying a correct survival model. Advantages of our approach over existing facility-profiling methods include a metric based on survival probability (greater clinical relevance than ratios of counts/rates); direct standardization (valid for comparisons between centers, unlike indirect standardization based methods such as the SMR); and less reliance on correct model specification (since the assumed model is used to generate risk classes as opposed to fitted-value based 'expected' counts). We establish the asymptotic properties of the proposed weighted estimator and evaluate its finite-sample performance under a diverse set of simulation settings. The method is then applied to evaluate U.S. kidney transplant centers with respect to graft survival probability.
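
As a schematic of the contrast drawn above (a reading of the abstract, not the paper's actual estimator), the SMR indirectly standardizes a center's observed death count against an expected count from a reference model, whereas the proposed metric directly standardizes a weighted survival probability against a reference population:

```latex
% Indirect standardization (current practice):
\mathrm{SMR}_c \;=\; \frac{O_c}{E_c},
\qquad E_c = \text{expected deaths for center } c\text{'s case mix under a reference model}

% Direct standardization (schematic form of the proposed contrast):
\Delta_c(t) \;=\; \widehat{S}^{\,w}_c(t) \;-\; \widehat{S}_{\mathrm{ref}}(t),
\qquad \widehat{S}^{\,w}_c(t) = \text{prognostic-score-weighted survival estimate for center } c
```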

2.
Kidney Med ; 6(5): 100814, 2024 May.
Article in English | MEDLINE | ID: mdl-38689836

ABSTRACT

Rationale & Objective: Limited data exist on longitudinal kidney outcomes after nonsurgical obesity treatments. We investigated the effects of intensive lifestyle intervention on kidney function over 10 years. Study Design: Post hoc analysis of the Action for Health in Diabetes (Look AHEAD) randomized controlled trial. Setting & Participants: We studied 4,901 individuals with type 2 diabetes and body mass index of ≥25 kg/m2 enrolled in Look AHEAD (2001-2015). The original Look AHEAD trial excluded individuals with 4+ urine dipstick protein, serum creatinine level of >1.4 mg/dL (women) or >1.5 mg/dL (men), or dialysis dependence. Exposures: Intensive lifestyle intervention versus diabetes support and education (ie, usual care). Outcome: The primary outcome was estimated glomerular filtration rate (eGFR, mL/min/1.73 m2) slope. Secondary outcomes were mean eGFR and the slope and mean of the urine albumin-to-creatinine ratio (UACR, mg/mg). Analytical Approach: Linear mixed-effects models with random slopes and intercepts to evaluate the association between randomization arms and within-individual repeated measures of eGFR and UACR. We tested for effect modification by baseline eGFR. Results: At baseline, mean eGFR was 89, and 83% had a normal UACR. Over 10 years, there was no difference in eGFR slope (+0.064 per year; 95% CI: -0.036 to 0.16; P = 0.21) between arms. Neither the slope nor the mean of UACR differed between arms. Baseline eGFR, categorized as eGFR of <80, 80-100, or >100, did not modify the intervention's effect on eGFR slope or mean. Limitations: Loss of muscle mass may confound creatinine-based eGFR. Conclusions: In patients with type 2 diabetes and preserved kidney function, intensive lifestyle intervention did not change eGFR slope over 10 years. Among participants with baseline eGFR <80, lifestyle intervention was associated with a slightly higher longitudinal mean eGFR than usual care. Further studies evaluating the effects of intensive lifestyle intervention in people with kidney disease are needed.


Lifestyle interventions can improve chronic kidney disease risk factors, specifically diabetes, hypertension, and obesity. However, the effects of lifestyle intervention on change in kidney function (estimated glomerular filtration rate [eGFR]) over time are not well established. We studied Action for Health in Diabetes (Look AHEAD) trial data because all participants had diabetes and overweight or obesity. Look AHEAD randomized participants to intensive lifestyle intervention or diabetes support and education (ie, usual care). We compared eGFR change over 10 years between groups and found no difference. However, the intervention group maintained slightly higher eGFR than usual care, especially if eGFR was relatively low at baseline. Our study suggests lifestyle intervention may preserve eGFR, but dedicated studies in individuals with chronic kidney disease are needed.
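
A minimal sketch of the analytic approach described above (a linear mixed-effects model with random intercepts and slopes, comparing eGFR trajectories between randomization arms); the file and column names (id, years, arm, egfr) are assumptions for illustration, not the study's code:

```python
# Sketch only: between-arm difference in eGFR slope from a linear
# mixed-effects model with a random intercept and slope per participant.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("look_ahead_egfr_long.csv")  # hypothetical long-format file

model = smf.mixedlm(
    "egfr ~ years * arm",      # fixed effects: time, arm, and their interaction
    data=df,
    groups=df["id"],           # repeated measures clustered within participant
    re_formula="~years",       # random intercept + random slope for time
)
fit = model.fit(reml=True)

# The 'years:arm' interaction term estimates the difference in eGFR slope
# (mL/min/1.73 m^2 per year) between intervention and usual care.
print(fit.summary())
```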

3.
Article in English | MEDLINE | ID: mdl-38599308

ABSTRACT

BACKGROUND & AIMS: Greater availability of less invasive biliary imaging to rule out choledocholithiasis should reduce the need for diagnostic endoscopic retrograde cholangiopancreatography (ERCP) in patients who have a remote history of cholecystectomy. The primary aims were to determine the incidence, characteristics, and outcomes of individuals who undergo first-time ERCP >1 year after cholecystectomy (late-ERCP). METHODS: Data from a commercial insurance claims database (Optum Clinformatics) identified 583,712 adults who underwent cholecystectomy, 4274 of whom underwent late-ERCP, defined as first-time ERCP for nonmalignant indications >1 year after cholecystectomy. Outcomes were exposure to and temporal trends in late-ERCP, biliary imaging utilization, and post-ERCP outcomes. Multivariable logistic regression was used to examine patient characteristics associated with undergoing late-ERCP. RESULTS: Despite a temporal increase in the use of noninvasive biliary imaging (35.9% in 2004 to 65.6% in 2021; P < .001), the rate of late-ERCP increased 8-fold (0.5-4.2/1000 person-years from 2005 to 2021; P < .001). Although only 44% of patients who underwent late-ERCP had gallstone removal, there were high rates of post-ERCP pancreatitis (7.1%), hospitalization (13.1%), and new chronic opioid use (9.7%). Factors associated with late-ERCP included concomitant disorders of gut-brain interaction (odds ratio [OR], 6.48; 95% confidence interval [CI], 5.88-6.91) and metabolic dysfunction-associated steatotic liver disease (OR, 3.27; 95% CI, 2.79-3.55), along with use of anxiolytics (OR, 3.45; 95% CI, 3.19-3.58), antispasmodics (OR, 1.60; 95% CI, 1.53-1.72), and chronic opioids (OR, 6.24; 95% CI, 5.79-6.52). CONCLUSIONS: The rate of late-ERCP after cholecystectomy is increasing significantly, particularly in patients with comorbidities associated with disorders of gut-brain interaction and mimickers of choledocholithiasis. Late-ERCP is associated with disproportionately higher rates of adverse events, including initiation of chronic opioid use.
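
A sketch of the multivariable logistic regression reported above, expressed as odds ratios with 95% confidence intervals; the data file and predictor names are hypothetical stand-ins for the claims-derived covariates:

```python
# Sketch only: factors associated with undergoing late-ERCP, as odds ratios.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cholecystectomy_cohort.csv")  # hypothetical analytic file

fit = smf.logit(
    "late_ercp ~ dgbi + masld + anxiolytic + antispasmodic + chronic_opioid",
    data=df,
).fit()

ci = fit.conf_int()  # columns 0 and 1 are lower/upper bounds on the log-odds scale
odds_ratios = pd.DataFrame({
    "OR": np.exp(fit.params),
    "CI_lower": np.exp(ci[0]),
    "CI_upper": np.exp(ci[1]),
})
print(odds_ratios.round(2))
```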

4.
Clin Transplant ; 38(5): e15319, 2024 May.
Article in English | MEDLINE | ID: mdl-38683684

ABSTRACT

OBJECTIVE: Longer end-stage renal disease time has been associated with inferior kidney transplant outcomes. However, the contribution of the transplant evaluation period is uncertain. We explored the relationship between time from evaluation to listing (ELT) and transplant outcomes. METHODS: This retrospective study included 2535 adult kidney transplants from 2000 to 2015. Kaplan-Meier survival curves, log-rank tests, and Cox regression models were used to compare transplant outcomes. RESULTS: Patient survival for both deceased donor (DD) recipients (p < .001) and living donor (LD) recipients (p < .0001) was significantly higher when ELT was less than 3 months. The risks associated with ELT appeared to be mediated by other factors in DD recipients, as adjusted models showed no associated risk of graft loss or death in this group. For LD recipients, ELT remained a risk factor for patient death after covariate adjustment: each month of ELT was associated with an increased risk of death (HR = 1.021, p = .04) but not graft loss in adjusted models. CONCLUSIONS: Kidney transplant recipients with longer ELT had higher rates of death after transplant, and ELT was independently associated with an increased risk of death for LD recipients. Investigations of the impact of pretransplant evaluation on post-transplant outcomes can inform transplant policy and practice.
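
A sketch of the survival workflow named in the methods (Kaplan-Meier curves, log-rank test, Cox regression) using the lifelines package; the file, variable names, and covariates are assumptions, not the study's analytic code:

```python
# Sketch only: unadjusted and adjusted comparisons of survival by ELT.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("kidney_tx_elt.csv")  # hypothetical analytic file
short = df[df["elt_months"] < 3]
long_ = df[df["elt_months"] >= 3]

# Kaplan-Meier curve for the short-ELT group and a log-rank comparison
kmf = KaplanMeierFitter()
kmf.fit(short["years_to_death"], short["died"], label="ELT < 3 months")
result = logrank_test(short["years_to_death"], long_["years_to_death"],
                      short["died"], long_["died"])
print(f"log-rank p-value: {result.p_value:.4f}")

# Cox model: hazard of death per month of ELT, adjusted for illustrative covariates
cph = CoxPHFitter()
cph.fit(df[["years_to_death", "died", "elt_months", "age", "living_donor"]],
        duration_col="years_to_death", event_col="died")
cph.print_summary()
```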


Subjects
Graft Survival; Kidney Failure, Chronic; Kidney Transplantation; Waiting Lists; Humans; Kidney Transplantation/mortality; Kidney Transplantation/adverse effects; Female; Male; Retrospective Studies; Middle Aged; Kidney Failure, Chronic/surgery; Follow-Up Studies; Risk Factors; Waiting Lists/mortality; Prognosis; Survival Rate; Adult; Graft Rejection/etiology; Graft Rejection/mortality; Tissue Donors/supply & distribution; Glomerular Filtration Rate; Kidney Function Tests; Living Donors/supply & distribution; Tissue and Organ Procurement; Time Factors; Postoperative Complications
5.
Am J Transplant ; 24(5): 839-849, 2024 May.
Article in English | MEDLINE | ID: mdl-38266712

ABSTRACT

Lung transplantation lags behind other solid organ transplants in donor lung utilization due, in part, to uncertainty regarding donor quality. We sought to develop an easy-to-use donor risk metric that, unlike existing metrics, accounts for a rich set of donor factors. Our study population consisted of n = 26 549 adult lung transplant recipients abstracted from the United Network for Organ Sharing Standard Transplant Analysis and Research file. We used Cox regression to model graft failure (GF; earliest of death or retransplant) risk based on donor and transplant factors, adjusting for recipient factors. We then derived and validated a Lung Donor Risk Index (LDRI) and developed a pertinent online application (https://shiny.pmacs.upenn.edu/LDRI_Calculator/). We found 12 donor/transplant factors that were independently predictive of GF: age, race, insulin-dependent diabetes, the difference between donor and recipient height, smoking, cocaine use, cytomegalovirus seropositivity, creatinine, human leukocyte antigen (HLA) mismatch, ischemia time, and donation after circulatory death. Validation showed the LDRI to have GF risk discrimination that was reasonable (C = 0.61) and higher than any of its predecessors. The LDRI is intended for use by transplant centers, organ procurement organizations, and regulatory agencies and to benefit patients in decision-making. Unlike its predecessors, the proposed LDRI could gain wide acceptance because of its granularity and similarity to the Kidney Donor Risk Index.


Subjects
Graft Rejection; Graft Survival; Lung Transplantation; Tissue Donors; Tissue and Organ Procurement; Humans; Lung Transplantation/adverse effects; Female; Male; Tissue Donors/supply & distribution; Middle Aged; Risk Factors; Adult; Graft Rejection/etiology; Follow-Up Studies; Prognosis; Risk Assessment
6.
Liver Transpl ; 2024 Jan 24.
Article in English | MEDLINE | ID: mdl-38265295

ABSTRACT

Given the scarcity of organs for liver transplantation, selection of recipients and donors to maximize post-transplant benefit is paramount. Several scores predict post-transplant outcomes by isolating elements of donor and recipient risk, including the donor risk index, Balance of Risk, pre-allocation score to predict survival outcomes following liver transplantation/survival outcomes following liver transplantation (SOFT), improved donor-to-recipient allocation score for deceased donors only/improved donor-to-recipient allocation score for both deceased and living donors (ID2EAL-D/-DR), and survival benefit (SB) models. No studies have examined the performance of these models over time, which is critical in an ever-evolving transplant landscape. This was a retrospective cohort study of liver transplants in the UNOS database from 2002 to 2021. We used Cox regression to evaluate model discrimination (Harrell's C) and calibration (testing of calibration curves) for post-transplant patient and graft survival at specified post-transplant timepoints. Sub-analyses were performed in the modern transplant era (post-2014) and for key donor-recipient characteristics. A total of 112,357 transplants were included. The SB and SOFT scores had the highest discrimination for short-term patient and graft survival, including in the modern transplant era, where only the SB model had good discrimination (C ≥ 0.60) for all patient and graft outcome timepoints. However, these models showed evidence of poor calibration at the 3- and 5-year patient survival timepoints. The ID2EAL-DR score had lower discrimination but adequate calibration at all patient survival timepoints. In stratified analyses, the SB and SOFT scores performed better in younger patients (<40 y) and those with higher Model for End-Stage Liver Disease scores (≥25). All prediction scores had declining discrimination over time, and scores relying on donor factors alone had poor performance. Although the SB and SOFT scores had the best overall performance, all models demonstrated declining performance over time. This underscores the importance of periodically updating and/or developing new prediction models to reflect the evolving transplant field. Scores relying on donor factors alone do not meaningfully inform post-transplant risk.
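
A sketch of the discrimination check described above: Harrell's C for a pre-computed risk score (here an assumed soft_score column) against observed post-transplant survival. lifelines' concordance_index expects larger values to indicate longer survival, so a higher-risk score is negated:

```python
# Sketch only: Harrell's C for one risk score against graft survival.
import pandas as pd
from lifelines.utils import concordance_index

df = pd.read_csv("unos_liver_scores.csv")  # hypothetical file of scores + outcomes

c = concordance_index(
    event_times=df["years_to_graft_failure"],
    predicted_scores=-df["soft_score"],   # negate: higher score = higher risk
    event_observed=df["graft_failure"],
)
print(f"Harrell's C: {c:.3f}")  # the study treats C >= 0.60 as good discrimination
```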

7.
Blood Adv ; 8(5): 1272-1280, 2024 Mar 12.
Article in English | MEDLINE | ID: mdl-38163322

ABSTRACT

ABSTRACT: Hospitalized patients with inflammatory bowel disease (IBD) are at increased risk of venous thromboembolism (VTE). We aimed to evaluate the effectiveness and safety of prophylactic anticoagulation compared with no anticoagulation in hospitalized patients with IBD. We conducted a retrospective cohort study using a hospital-based database. We included patients with IBD who had a length of hospital stay ≥2 days between 1 January 2016 and 31 December 2019. We excluded patients with other indications for anticoagulation, users of direct oral anticoagulants, warfarin, or therapeutic-intensity heparin, and patients admitted for surgery. We defined exposure to prophylactic anticoagulation using charge codes. The primary effectiveness outcome was VTE. The primary safety outcome was bleeding. We used propensity score matching to reduce potential differences between users and nonusers of anticoagulants and Cox proportional-hazards regression to estimate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs). The analysis included 56 194 matched patients with IBD (users of anticoagulants, n = 28 097; nonusers, n = 28 097). In the matched sample, prophylactic use of anticoagulants (vs no use) was associated with a lower rate of VTE (HR, 0.62; 95% CI, 0.41-0.94) and with no difference in the rate of bleeding (HR, 1.05; 95% CI, 0.87-1.26). In this study of hospitalized patients with IBD, prophylactic use of heparin was associated with a lower rate of VTE without an increase in bleeding risk compared with no anticoagulation. Our results suggest potential benefits of prophylactic anticoagulation to reduce the burden of VTE in hospitalized patients with IBD.
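
A sketch of the propensity-score-matched Cox analysis described above (1:1 nearest-neighbor matching on the propensity-score logit, then a Cox model for VTE in the matched sample); covariates, column names, and the matching details (no caliper, matching with replacement) are simplifying assumptions:

```python
# Sketch only: propensity score matching followed by a Cox model for VTE.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

df = pd.read_csv("ibd_admissions.csv")  # hypothetical hospital-based data
covars = ["age", "female", "crohns", "icu_admit", "charlson"]

# 1) Propensity score: probability of receiving prophylactic anticoagulation
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["ppx_anticoag"])
ps = ps_model.predict_proba(df[covars])[:, 1]
df["ps_logit"] = np.log(ps / (1 - ps))

# 2) 1:1 nearest-neighbor matching of treated to untreated on the PS logit
treated = df[df["ppx_anticoag"] == 1]
control = df[df["ppx_anticoag"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps_logit"]])
_, idx = nn.kneighbors(treated[["ps_logit"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])

# 3) Cox model in the matched sample: hazard of VTE, users vs nonusers
cph = CoxPHFitter()
cph.fit(matched[["days_to_vte", "vte", "ppx_anticoag"]],
        duration_col="days_to_vte", event_col="vte")
cph.print_summary()
```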


Subjects
Inflammatory Bowel Diseases; Venous Thromboembolism; Humans; Venous Thromboembolism/prevention & control; Venous Thromboembolism/complications; Retrospective Studies; Anticoagulants/adverse effects; Hemorrhage/chemically induced; Heparin/adverse effects; Inflammatory Bowel Diseases/complications; Inflammatory Bowel Diseases/drug therapy
8.
Pediatr Crit Care Med ; 25(1): e41-e46, 2024 Jan 01.
Article in English | MEDLINE | ID: mdl-37462429

ABSTRACT

OBJECTIVE: To determine the association of venovenous extracorporeal membrane oxygenation (VV-ECMO) initiation with changes in vasoactive-inotropic scores (VISs) in children with pediatric acute respiratory distress syndrome (PARDS) and cardiovascular instability. DESIGN: Retrospective cohort study. SETTING: Single academic pediatric ECMO center. PATIENTS: Children (1 mo to 18 yr) treated with VV-ECMO (2009-2019) for PARDS with need for vasopressor or inotropic support at ECMO initiation. MEASUREMENTS AND MAIN RESULTS: Arterial blood gas values, VIS, mean airway pressure (mPaw), and oxygen saturation (SpO2) values were recorded hourly relative to the start of ECMO flow for 24 hours pre-VV-ECMO and post-VV-ECMO cannulation. A sharp kink discontinuity regression analysis clustered by patient tested the difference in VISs and regression line slopes immediately surrounding cannulation. Thirty-two patients met inclusion criteria: median age 6.6 years (interquartile range [IQR] 1.5-11.7), 22% immunocompromised, and 75% had pneumonia or sepsis as the cause of PARDS. Pre-ECMO characteristics included: median oxygenation index 45 (IQR 35-58), mPaw 32 cm H2O (IQR 30-34), 97% on inhaled nitric oxide, and 81% on an advanced mode of ventilation. Median VIS immediately before VV-ECMO cannulation was 13 (IQR 8-25) with an overall increasing VIS trajectory over the hours before cannulation. VISs decreased and the slope of the regression line reversed immediately surrounding the time of cannulation (robust p < 0.0001). There were pre-ECMO to post-ECMO cannulation decreases in mPaw (32 vs 20 cm H2O, p < 0.001) and arterial PCO2 (64.1 vs 50.1 mm Hg, p = 0.007) and increases in arterial pH (7.26 vs 7.38, p = 0.001), arterial base excess (2.5 vs 5.2, p = 0.013), and SpO2 (91% vs 95%, p = 0.013). CONCLUSIONS: Initiation of VV-ECMO was associated with an immediate and sustained reduction in VIS in PARDS patients with cardiovascular instability. This VIS reduction was associated with decreased mPaw and reduced respiratory and/or metabolic acidosis as well as improved oxygenation.
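
A sketch of the segmented regression implied by the analysis above: VIS as a function of hours relative to cannulation, allowing a level shift and a slope change at hour 0, with standard errors clustered by patient. Variable names are assumptions, and the authors' exact sharp-kink specification may differ:

```python
# Sketch only: level and slope change in VIS at VV-ECMO cannulation,
# with cluster-robust standard errors by patient.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("vis_hourly.csv")  # one row per patient-hour, hour in [-24, 24]
df["post"] = (df["hour"] >= 0).astype(int)

fit = smf.ols("vis ~ hour + post + hour:post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["patient_id"]}
)
# 'post' tests the immediate jump in VIS at cannulation;
# 'hour:post' tests the change in slope of the VIS trajectory afterward.
print(fit.summary())
```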


Subjects
Extracorporeal Membrane Oxygenation; Respiratory Distress Syndrome; Respiratory Insufficiency; Humans; Child; Retrospective Studies; Respiratory Distress Syndrome/therapy; Respiratory Insufficiency/therapy; Arteries
10.
Transplantation ; 108(3): 713-723, 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-37635282

ABSTRACT

BACKGROUND: Outcomes after living-donor liver transplantation (LDLT) at high Model for End-stage Liver Disease (MELD) scores are not well characterized in the United States. METHODS: This was a retrospective cohort study using Organ Procurement and Transplantation Network data on adults listed for their first liver transplant alone between 2002 and 2021. Cox proportional hazards models evaluated the association of MELD score (<20, 20-24, 25-29, and ≥30) with patient/graft survival after LDLT and the association of donor type (living versus deceased) with outcomes stratified by MELD. RESULTS: There were 4495 LDLTs included, with 5.9% at MELD 25-29 and 1.9% at MELD ≥30. LDLT at MELD 25-29 and MELD ≥30 has increased substantially since 2010 and 2015, respectively. Patient survival at MELD ≥30 was not different versus MELD <20: adjusted hazard ratio (aHR) 1.67 (95% confidence interval, 0.96-2.88). However, graft survival was worse: aHR 1.69 (95% confidence interval, 1.07-2.68). Compared with deceased-donor liver transplant, LDLT led to superior patient survival at MELD <20 (aHR 0.92; P = 0.024) and 20-24 (aHR 0.70; P < 0.001), equivalent patient survival at MELD 25-29 (aHR 0.97; P = 0.843), but worse graft survival at MELD ≥30 (aHR 1.68, P = 0.009). CONCLUSIONS: Although patient survival remains acceptable, the benefits of LDLT may be lost at MELD ≥30.


Subjects
End Stage Liver Disease; Liver Transplantation; Adult; Humans; United States; Living Donors; Liver Transplantation/adverse effects; End Stage Liver Disease/diagnosis; End Stage Liver Disease/surgery; Retrospective Studies; Severity of Illness Index; Graft Survival; Treatment Outcome
11.
Prog Transplant ; 33(4): 283-292, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37941335

ABSTRACT

Introduction: Organ recovery facilities address the logistical challenges of hospital-based deceased organ donor management. While more organs are transplanted from donors managed in facilities, differences in donor management and donation processes are not fully characterized. Research Question: Do deceased donor management and organ transport distance differ between organ procurement organization (OPO)-based recovery facilities and hospitals? Design: Retrospective analysis of Organ Procurement and Transplant Network data, including adults after brain death in 10 procurement regions (April 2017-June 2021). The primary outcomes were ischemic times of transplanted hearts, kidneys, livers, and lungs. Secondary outcomes included transport distances (between the facility or hospital and the transplant program) for each transplanted organ. Results: Among 5010 deceased donors, 51.7% underwent recovery in an OPO-based recovery facility. After adjustment for recipient and system factors, ischemic times of transplanted organs did not differ significantly between donors in facilities and donors in hospitals. Transplanted hearts recovered from donors in facilities were transported farther than hearts from hospital donors (median 255 mi [IQR 27, 475] versus 174 [IQR 42, 365], P = .002); transport distances for livers and kidneys were significantly shorter (P < .001 for both). Conclusion: Organ recovery procedures performed in OPO-based recovery facilities were not associated with differences in ischemic times of transplanted organs relative to organs recovered in hospitals, but differences in organ transport distances exist. Further work is needed to determine whether other observed differences in donor management and organ distribution meaningfully impact donation and transplantation outcomes.


Subjects
Organ Transplantation; Tissue and Organ Procurement; Adult; Humans; Retrospective Studies; Tissue Donors; Hospitals
12.
Hepatol Commun ; 7(10)2023 Oct 01.
Article in English | MEDLINE | ID: mdl-37916863

ABSTRACT

Liver transplantation is a life-saving option for decompensated cirrhosis. Liver transplant recipients require advanced self-management skills, intact cognitive skills, and care partner support to improve long-term outcomes. Gaps remain in understanding post-liver transplant cognitive and health trajectories, and patient factors such as self-management skills, care partner support, and sleep. Our aims are to (1) assess pre-liver transplant to post-liver transplant cognitive trajectories and identify risk factors for persistent cognitive impairment; (2) evaluate associations between cognitive function and self-management skills, health behaviors, functional health status, and post-transplant outcomes; and (3) investigate potential mediators and moderators of associations between cognitive function and post-liver transplant outcomes. LivCog is a longitudinal, prospective observational study that will enroll 450 adult liver transplant recipients and their caregivers/care partners. The duration of the study is 5 years with 24 additional months of patient follow-up. Data will be collected from participants at 1, 3, 12, and 24 months post-transplant. Limited pre-liver transplant data will also be collected from waitlisted candidates. Data collection methods include interviews, surveys, cognitive assessments, and actigraphy/sleep diary measures. Patient measurements include sociodemographic characteristics, pretransplant health status, cognitive function, physical function, perioperative measures, medical history, transplant history, self-management skills, patient-reported outcomes, health behaviors, and clinical outcomes. Caregiver measures assess sociodemographic variables, health literacy, health care navigation skills, self-efficacy, care partner preparedness, nature and intensity of care, care partner burden, and community participation. By elucidating various health trajectories from pre-liver transplant to 2 years post-liver transplant, LivCog will be able to better characterize recipients at higher risk of cognitive impairment and compromised self-management. Findings will inform interventions targeting health behaviors, self-management, and caregiver supports to optimize outcomes.


Subjects
Cognitive Dysfunction; Liver Transplantation; Self-Management; Adult; Humans; Liver Transplantation/adverse effects; Prospective Studies; Cognition; Cognitive Dysfunction/etiology
13.
Stat Methods Med Res ; 32(12): 2386-2404, 2023 12.
Article in English | MEDLINE | ID: mdl-37965684

ABSTRACT

The hazard ratio (HR) remains the most frequently employed metric in assessing treatment effects on survival times. However, the difference in restricted mean survival time (RMST) has become a popular alternative to the HR when the proportional hazards assumption is considered untenable. Moreover, independent of the proportional hazards assumption, many comparative effectiveness studies aim to base contrasts on survival probability rather than on the hazard function. Causal effects based on RMST are often estimated via inverse probability of treatment weighting (IPTW). However, this approach generally yields biased results when the assumed propensity score model is misspecified. Motivated by the need for more robust techniques, we propose an empirical likelihood-based weighting approach that allows for specifying a set of propensity score models. The resulting estimator is consistent when the postulated model set contains a correct model; this property has been termed multiple robustness. In this report, we derive and evaluate a multiply robust estimator of the causal between-treatment difference in RMST. Simulation results confirm its robustness. Compared with the IPTW estimator from a correctly specified model, the proposed estimator tends to be less biased and more efficient in finite samples. Additional simulations reveal biased results from a direct application of machine learning estimation of propensity scores. Finally, we apply the proposed method to evaluate the impact of intrapartum group B streptococcus antibiotic prophylaxis on the risk of childhood allergic disorders, using data derived from electronic medical records from the Children's Hospital of Philadelphia and census data from the American Community Survey.
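
For context, a sketch of the standard single-model IPTW estimator of the RMST difference that the paper uses as a comparator (the multiply robust empirical-likelihood weighting itself is specific to the paper and not reproduced here); file, covariate, and column names are hypothetical:

```python
# Sketch only: IPTW-weighted Kaplan-Meier curves and the RMST difference at tau.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import KaplanMeierFitter
from lifelines.utils import restricted_mean_survival_time

df = pd.read_csv("gbs_prophylaxis_cohort.csv")  # hypothetical analytic file
covars = ["maternal_age", "preterm", "cesarean"]
tau = 5.0  # restriction time, in years

# Inverse probability of treatment weights from a single propensity score model
ps = LogisticRegression(max_iter=1000).fit(df[covars], df["treated"]).predict_proba(df[covars])[:, 1]
df["w"] = df["treated"] / ps + (1 - df["treated"]) / (1 - ps)

def weighted_rmst(group: pd.DataFrame) -> float:
    kmf = KaplanMeierFitter()
    kmf.fit(group["time"], group["event"], weights=group["w"])
    return restricted_mean_survival_time(kmf, t=tau)

diff = weighted_rmst(df[df["treated"] == 1]) - weighted_rmst(df[df["treated"] == 0])
print(f"IPTW RMST difference at tau = {tau}: {diff:.3f}")
```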


Subjects
Models, Statistical; Child; Humans; Likelihood Functions; Survival Rate; Proportional Hazards Models; Computer Simulation; Propensity Score
14.
BMC Oral Health ; 23(1): 763, 2023 10 17.
Article in English | MEDLINE | ID: mdl-37848867

ABSTRACT

BACKGROUND: Long-term antiretroviral therapy (ART) perpetually suppresses HIV load and has dramatically altered the prognosis of HIV infection, such that HIV is now regarded as a chronic disease. Side effects of ART in patients with HIV (PWH) have introduced new challenges, including "metabolic" (systemic) and oral complications. Furthermore, inflammation persists despite effective viral load suppression and normal CD4+ cell counts. The impact of ART on the spectrum of oral diseases among PWH is often overlooked relative to other systemic complications. There is a paucity of data on oral complications associated with ART use in PWH, in part because of the limited number of prospective longitudinal studies designed to better understand the range of oral abnormalities observed in PWH on ART. METHODS: We describe here the study design, including processes associated with subject recruitment and retention, study visit planning, oral health assessments, bio-specimen collection and preprocessing procedures, and the data management and statistical plan. DISCUSSION: We present a procedural roadmap that could be modelled to assess the extent and progression of oral diseases associated with ART in PWH. We also highlight the rigors and challenges associated with our ongoing participant recruitment and retention. A rigorous prospective longitudinal study requires proper planning and execution. A major benefit is that large data sets are collected and the biospecimen repository can be used to answer additional questions in future studies, including genetic, microbiome, and metabolome-based studies. TRIAL REGISTRATION: National Institutes of Health Clinical Trials Registration (NCT) #: NCT04645693.


Subjects
Anti-HIV Agents; HIV Infections; Humans; HIV Infections/complications; HIV Infections/drug therapy; Anti-HIV Agents/adverse effects; Longitudinal Studies; Prospective Studies; Viral Load; Outcome Assessment, Health Care
15.
Res Sq ; 2023 Oct 04.
Article in English | MEDLINE | ID: mdl-37886466

ABSTRACT

Long-term antiretroviral therapy (ART) perpetually suppresses HIV load and has dramatically altered the prognosis of HIV infection, such that HIV is now regarded as a chronic disease. Side effects of ART in patients with HIV (PWH) have introduced new challenges, including "metabolic" (systemic) and oral complications. Furthermore, inflammation persists despite effective viral load suppression and normal CD4+ cell counts. The impact of ART on the spectrum of oral diseases among PWH is often overlooked relative to other systemic complications. There is a paucity of data on oral complications associated with ART use in PWH, in part because of the limited number of prospective longitudinal studies designed to better understand the range of oral abnormalities observed in PWH on ART. Our group designed and implemented a prospective observational longitudinal study to address this gap. We present a procedural roadmap that could be modelled to assess the extent and progression of oral diseases associated with ART in PWH. We describe here the processes associated with subject recruitment and retention, study visit planning, oral health assessments, bio-specimen collection and preprocessing procedures, and data management. We also highlight the rigors and challenges associated with participant recruitment and retention.

16.
BMJ Open ; 13(9): e075172, 2023 09 18.
Article in English | MEDLINE | ID: mdl-37723108

ABSTRACT

BACKGROUND AND AIMS: Liver transplantation is a life-saving procedure for end-stage liver disease. However, post-transplant medication regimens are complex and non-adherence is common. Post-transplant medication non-adherence is associated with graft rejection, which can have long-term adverse consequences. Transplant centres are equipped with clinical staff who monitor patients post-transplant; however, digital health tools and proactive immunosuppression adherence monitoring have the potential to improve outcomes. METHODS AND ANALYSIS: This is a patient-randomised prospective clinical trial at three transplant centres in the Northeast, Midwest and South to investigate the effects of a remotely administered adherence programme compared with usual care. The programme monitors potential non-adherence, largely leveraging text message prompts, and phenotypes the nature of the non-adherence as cognitive, psychological, medical, social or economic. Additional reminders for medications, clinical appointments and routine self-management support are incorporated to promote adherence to the entire medical regimen. The primary study outcome is medication adherence via 24-hour recall; secondary outcomes include additional medication adherence measures (ASK-12 self-reported scale, regimen knowledge scales, tacrolimus values), quality of life, functional health status and clinical outcomes (eg, days hospitalised). Study implementation, acceptability, feasibility, costs and potential cost-effectiveness will also be evaluated. ETHICS AND DISSEMINATION: The University of Pennsylvania Institutional Review Board has approved the study as the single IRB of record (protocol # 849575, V.1.4). Results will be published in peer-reviewed journals and summaries will be provided to study funders. TRIAL REGISTRATION NUMBER: NCT05260268.


Subjects
End Stage Liver Disease; Liver Transplantation; Humans; Prospective Studies; Quality of Life; Treatment Adherence and Compliance
17.
J Heart Lung Transplant ; 42(12): 1735-1742, 2023 12.
Article in English | MEDLINE | ID: mdl-37437825

ABSTRACT

BACKGROUND: Whether functional status is associated with survival to pediatric lung transplant is unknown. We hypothesized that completely dependent functional status at waitlist registration, defined using Lansky Play Performance Scale (LPPS), would be associated with worse outcomes. METHODS: Retrospective cohort study of pediatric lung transplant registrants utilizing United Network for Organ Sharing's Standard Transplant Analysis and Research files (2005-2020). Primary exposure was completely dependent functional status, defined as LPPS score of 10-40. Primary outcome was waitlist removal for death/deterioration with cause-specific hazard ratio (CSHR) regression. Subdistribution hazard regression (SHR, Fine and Gray) was used for the secondary outcome of waitlist removal due to transplant/improvement with a competing risk of death/deterioration. Confounders included: sex, age, race, diagnosis, ventilator dependence, extracorporeal membrane oxygenation, year, and listing center volume. RESULTS: A total of 964 patients were included (63.5% ≥ 12 years, 50.2% cystic fibrosis [CF]). Median waitlist days were 95; 20.1% were removed for death/deterioration and 68.2% for transplant/improvement. Completely dependent functional status was associated with removal due to death/deterioration (adjusted CSHR 5.30 [95% CI 2.86-9.80]). This association was modified by age (interaction p = 0.0102), with a larger effect for age ≥12 years, and particularly strong for CF. In the Fine and Gray model, completely dependent functional status did not affect the risk of removal due to transplant/improvement with a competing risk of death/deterioration (adjusted SHR 1.08 [95% CI 0.77-1.49]). CONCLUSIONS: Pediatric lung transplant registrants with the worst functional status had worse pretransplant outcomes, especially for adolescents and CF patients. Functional status at waitlist registration may be a modifiable risk factor to improve survival to lung transplant.
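
A sketch of the cause-specific hazard analysis described above, in which removal for death/deterioration is the event and removal for transplant/improvement is treated as censoring (the Fine-Gray subdistribution model used for the secondary outcome is not shown); the outcome coding and covariate names are assumptions:

```python
# Sketch only: cause-specific Cox model for waitlist removal due to
# death/deterioration, censoring the competing event.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("peds_lung_waitlist.csv")  # hypothetical analytic file
# outcome codes: 0 = still waiting, 1 = died/deteriorated, 2 = transplanted/improved
df["event_death"] = (df["outcome"] == 1).astype(int)  # competing event -> censored

cph = CoxPHFitter()
cph.fit(
    df[["waitlist_days", "event_death", "fully_dependent",
        "age_ge12", "cystic_fibrosis", "ventilator", "ecmo"]],
    duration_col="waitlist_days",
    event_col="event_death",
)
cph.print_summary()  # cause-specific HR for completely dependent functional status
```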


Subjects
Cystic Fibrosis; Lung Transplantation; Adolescent; Humans; Child; Retrospective Studies; Functional Status; Risk Factors; Waiting Lists
18.
Gastroenterology ; 165(5): 1197-1205.e2, 2023 11.
Article in English | MEDLINE | ID: mdl-37481117

ABSTRACT

BACKGROUND & AIMS: We sought to estimate the incidence, prevalence, and racial-ethnic distribution of physician-diagnosed inflammatory bowel disease (IBD) in the United States. METHODS: The study used 4 administrative claims data sets: a 20% random sample of national fee-for-service Medicare data (2007 to 2017); Medicaid data from Florida, New York, Pennsylvania, Ohio, and California (1999 to 2012); and commercial health insurance data from Anthem beneficiaries (2006 to 2018) and Optum's deidentified Clinformatics Data Mart (2000 to 2017). We used validated combinations of medical diagnoses, diagnostic procedures, and prescription medications to identify incident and prevalent diagnoses. We computed pooled age-, sex-, and race/ethnicity-specific insurance-weighted estimates and pooled estimates standardized to 2018 United States Census estimates, with 95% confidence intervals (CIs). RESULTS: The age- and sex-standardized incidence of IBD was 10.9 per 100,000 person-years (95% CI, 10.6-11.2). The incidence of IBD peaked in the third decade of life, decreased to a relatively stable level across the fourth to eighth decades, and declined further thereafter. The age-, sex-, and insurance-standardized prevalence of IBD was 721 per 100,000 population (95% CI, 717-726). Extrapolating to the 2020 United States Census, an estimated 2.39 million Americans are diagnosed with IBD. The prevalence of IBD per 100,000 population was 812 (95% CI, 802-823) in White, 504 (95% CI, 482-526) in Black, 403 (95% CI, 373-433) in Asian, and 458 (95% CI, 440-476) in Hispanic Americans. CONCLUSIONS: IBD is diagnosed in >0.7% of Americans. The incidence peaks in early adulthood and then plateaus at a lower rate. The disease is less commonly diagnosed in Black, Asian, and Hispanic Americans.
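
A toy sketch of the direct standardization step described above: stratum-specific prevalence estimates are pooled with US Census population weights. The strata and numbers are illustrative only, not the study's data:

```python
# Sketch only: census-weighted (directly standardized) prevalence.
import pandas as pd

strata = pd.DataFrame({
    "stratum": ["F 18-44", "F 45-64", "M 18-44", "M 45-64"],
    "prevalence_per_100k": [520.0, 910.0, 430.0, 780.0],  # illustrative estimates
    "census_population":   [58e6, 42e6, 59e6, 40e6],      # illustrative weights
})

weights = strata["census_population"] / strata["census_population"].sum()
standardized = (strata["prevalence_per_100k"] * weights).sum()
print(f"Standardized prevalence: {standardized:.0f} per 100,000")
```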


Subjects
Inflammatory Bowel Diseases; Medicare; Humans; United States/epidemiology; Aged; Adult; Prevalence; Incidence; Inflammatory Bowel Diseases/diagnosis; Inflammatory Bowel Diseases/epidemiology; Florida
19.
J Heart Lung Transplant ; 42(10): 1455-1463, 2023 10.
Article in English | MEDLINE | ID: mdl-37290569

ABSTRACT

BACKGROUND: Lung transplant (LT) centers are increasingly evaluating patients with multiple risk factors for adverse outcomes. The effects of these stacked risks remain unclear. Our aim was to determine the relationship between the number of comorbidities and post-transplant outcomes. METHODS: We performed a retrospective cohort study using the National Inpatient Sample (NIS) and UNOS Starfile (USF). We applied a probabilistic matching algorithm using 7 variables (transplant: month, year, and type; recipient: age, sex, race, payer) to match recipients in the USF to transplant patients in the NIS between 2016 and 2019. The Elixhauser methodology was used to identify comorbidities present on admission. We determined the associations of mortality, length of stay (LOS), total charges, and disposition with the number of comorbidities using penalized cubic splines, Kaplan-Meier, and linear and logistic regression methods. RESULTS: From 28,484,087 NIS admissions, we identified 1,821 LT recipients. Matches were exact in 76.8% of the cohort, and the remaining cohort had a match probability of ≥0.94. Penalized splines of the Elixhauser comorbidity number identified 3 knots defining 3 groups of stacked risk: low (<3), medium (3-6), and high risk (>6). Inpatient mortality increased across the low-, medium-, and high-risk categories (1.6%, 3.9%, and 7.0%; p < 0.001), as did LOS (16, 21, 29 days; p < 0.001), total charges ($553,057, $666,791, $821,641.5; p = 0.004), and discharge to a skilled nursing facility (15%, 20%, 31%; p < 0.001). CONCLUSIONS: Stacked risks adversely affect post-LT mortality, LOS, charges, and discharge disposition. Further study to understand the details of specific stacked risks is warranted.
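
A sketch of the spline step described above: inpatient mortality modeled as a smooth function of the Elixhauser comorbidity count, the kind of fit one would inspect when choosing cut points such as <3, 3-6, and >6. This uses an unpenalized cubic regression spline via patsy's cr() rather than the authors' penalized spline, and all variable names are assumptions:

```python
# Sketch only: smooth relationship between comorbidity count and mortality.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("nis_unos_matched.csv")  # hypothetical matched NIS/USF file

fit = smf.glm(
    "died ~ cr(n_elixhauser, df=4)",   # cr(): natural cubic regression spline (patsy)
    data=df,
    family=sm.families.Binomial(),
).fit()

# Predicted mortality across the comorbidity range; inflections in this curve
# motivate grouping counts into low/medium/high risk bands.
grid = pd.DataFrame({"n_elixhauser": range(0, 16)})
print(fit.predict(grid))
```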


Subjects
Hospitalization; Patient Discharge; Humans; Retrospective Studies; Length of Stay; Risk Factors
20.
JAMA Netw Open ; 6(5): e239739, 2023 05 01.
Article in English | MEDLINE | ID: mdl-37155170

ABSTRACT

Importance: Although racial and ethnic minority patients with sepsis and acute respiratory failure (ARF) experience worse outcomes, how patient presentation characteristics, processes of care, and hospital resource delivery are associated with outcomes is not well understood. Objective: To measure disparities in hospital length of stay (LOS) among patients at high risk of adverse outcomes who present with sepsis and/or ARF and do not immediately require life support and to quantify associations with patient- and hospital-level factors. Design, Setting, and Participants: This matched retrospective cohort study used electronic health record data from 27 acute care teaching and community hospitals across the Philadelphia metropolitan and northern California areas between January 1, 2013, and December 31, 2018. Matching analyses were performed between June 1 and July 31, 2022. The study included 102 362 adult patients who met clinical criteria for sepsis (n = 84 685) or ARF (n = 42 008) with a high risk of death at the time of presentation to the emergency department but without an immediate requirement for invasive life support. Exposures: Racial or ethnic minority self-identification. Main Outcomes and Measures: Hospital LOS, defined as the time from hospital admission to the time of discharge or inpatient death. Matches were stratified by racial and ethnic minority patient identity, comparing Asian and Pacific Islander patients, Black patients, Hispanic patients, and multiracial patients with White patients in stratified analyses. Results: Among 102 362 patients, the median (IQR) age was 76 (65-85) years; 51.5% were male. A total of 10.2% of patients self-identified as Asian American or Pacific Islander, 13.7% as Black, 9.7% as Hispanic, 60.7% as White, and 5.7% as multiracial. After matching racial and ethnic minority patients to White patients on clinical presentation characteristics, hospital capacity strain, initial intensive care unit admission, and the occurrence of inpatient death, Black patients experienced longer LOS relative to White patients in fully adjusted matches (sepsis: 1.26 [95% CI, 0.68-1.84] days; ARF: 0.97 [95% CI, 0.05-1.89] days). Length of stay was shorter among Asian American and Pacific Islander patients with ARF (-0.61 [95% CI, -0.88 to -0.34] days) and Hispanic patients with sepsis (-0.22 [95% CI, -0.39 to -0.05] days) or ARF (-0.47 [-0.73 to -0.20] days). Conclusions and Relevance: In this cohort study, Black patients with severe illness who presented with sepsis and/or ARF experienced longer LOS than White patients. Hispanic patients with sepsis and Asian American and Pacific Islander and Hispanic patients with ARF both experienced shorter LOS. Because matched differences were independent of commonly implicated clinical presentation-related factors associated with disparities, identification of additional mechanisms that underlie these disparities is warranted.


Subjects
Respiratory Insufficiency; Sepsis; Adult; Humans; Male; Aged; Aged, 80 and over; Female; Ethnicity; Length of Stay; Cohort Studies; Retrospective Studies; Minority Groups; Sepsis/therapy; White People