Results 1 - 20 of 192
1.
Liver Transpl ; 2024 May 13.
Article in English | MEDLINE | ID: mdl-38727618

ABSTRACT

BACKGROUND: There is no recent update on the clinical course of retransplantation (re-LT) after living-donor liver transplantation (LDLT) in the US using recent national data. METHODS: The UNOS database (2002-2023) was used to explore patient characteristics in initial LT, comparing deceased-donor liver transplantation (DDLT) and LDLT for graft survival (GS), reasons for graft failure, and GS after re-LT. Waitlist dropout and re-LT likelihood were assessed, with the re-LT cohort categorized by time to re-listing as acute or chronic (≤ or >1 mo). RESULTS: Of 132,323 DDLT and 5,955 LDLT initial transplants, 3,848 DDLT and 302 LDLT recipients underwent re-LT. Of the 302 re-LT following LDLT, 156 were acute and 146 chronic. Primary non-function (PNF) was more common in DDLT, although the difference was not statistically significant (17.4% vs 14.8% for LDLT; p=0.52). Vascular complications were significantly higher in LDLT (12.5% vs 8.3% for DDLT; p<0.01). Acute re-LT showed a larger difference in PNF between DDLT and LDLT (49.7% vs 32.0%; p<0.01). Status 1 patients were more common in DDLT (51.3% vs 34.0% in LDLT; p<0.01). In the acute cohort, Kaplan-Meier curves indicated superior GS post-re-LT for initial LDLT recipients in both the short term and long term (p=0.02 and <0.01, respectively), with no significant difference in the chronic cohort. No significant differences in waitlist dropout were observed, but the initial LDLT group had a higher re-LT likelihood in the acute cohort (sHR 1.40, p<0.01). A sensitivity analysis focusing on the most recent 10-year cohort revealed trends consistent with the overall study findings. CONCLUSION: LDLT recipients had better GS in re-LT than DDLT recipients. Despite a higher severity of illness, the DDLT cohort was less likely to undergo re-LT.
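The graft survival comparisons above rest on Kaplan-Meier estimates. As a minimal illustration of the product-limit estimator (the follow-up times below are hypothetical, not the UNOS cohort):

```python
def kaplan_meier(times, events):
    """Product-limit estimate of survival S(t) at each distinct event time.

    times  : follow-up time for each subject
    events : 1 if graft loss was observed, 0 if censored
    Returns a list of (time, survival_probability) pairs.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # subjects with an observed event at time t, and all subjects leaving at t
        deaths = sum(e for tt, e in data if tt == t)
        leaving = sum(1 for tt, _ in data if tt == t)
        if deaths > 0:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= leaving
        i += leaving
    return curve

# Hypothetical follow-up (months) and graft-loss indicators:
# survival steps down only at observed events, not at censorings
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 1, 0])
```

Censored subjects (events of 0) still shrink the risk set, which is what distinguishes this estimator from a naive event proportion.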

3.
Liver Transpl ; 2024 Apr 17.
Article in English | MEDLINE | ID: mdl-38625836

ABSTRACT

BACKGROUND: The use of older donors after circulatory death (DCD) for liver transplantation (LT) has increased over the past decade. This study examined whether outcomes of LT using older DCD donors (≥50 y) have improved with advancements in surgical/perioperative care and normothermic machine perfusion (NMP) technology. METHODS: 7,602 DCD LT cases from the UNOS database (2003-2022) were reviewed. The impact of older DCD donors on graft survival (GS) was assessed using Kaplan-Meier and hazard ratio (HR) analyses. RESULTS: 1,447 LT cases (19.0%) involved older DCD donors. Although there was a decrease in their use from 2003 to 2014, a resurgence was noted post-2015, reaching 21.9% of all LT in the last four years (2019-2022). Initially, 90-day and one-year GS for older DCDs were worse than for younger DCDs, but this difference decreased over time, with no statistical difference after 2015. Similarly, HRs for graft loss in older DCD have recently become insignificant. In older DCD LT, NMP usage has increased recently, especially in cases with extended donor-recipient distances, while the median time from asystole to aortic cross-clamp has decreased. Multivariable Cox regression analyses revealed that in the early phase, asystole to cross-clamp time had the highest HR for graft loss in older DCD LT without NMP, while in the later phases, cold ischemic time (CIT) >5.5 h was a significant predictor. CONCLUSION: LT outcomes using older DCD donors have become comparable to those from young DCD donors, with recent HRs for graft loss becoming insignificant. The strategic approach in the recent period could mitigate risks, including managing CIT (≤5.5 h), reducing asystole to cross-clamp time, and adopting NMP for longer distances. Optimal use of older DCD donors may alleviate the donor shortage.

4.
Am J Audiol ; : 1-16, 2024 Apr 23.
Article in English | MEDLINE | ID: mdl-38652004

ABSTRACT

PURPOSE: Military-affiliated individuals (MIs) are at a higher risk of developing hearing loss and tinnitus. While these disorders are well-studied in MIs, their impact relative to non-military-affiliated individuals (non-MIs) remains understudied. Our study compared hearing, speech-in-noise (SIN) perception, and tinnitus characteristics between MIs and non-MIs. METHOD: MIs (n = 84) and non-MIs (n = 193) underwent hearing threshold assessment and the Quick Speech-in-Noise Test. Participants with tinnitus completed psychoacoustic tinnitus matching, a numeric rating scale (NRS) for loudness and annoyance, and the Tinnitus Functional Index. Comorbid conditions such as anxiety, depression, and hyperacusis were assessed. We used a linear mixed-effects model to compare hearing thresholds and SIN scores between MIs and non-MIs. A multivariate analysis of variance compared tinnitus characteristics between MIs and non-MIs, and a stepwise regression was performed to identify predictors of tinnitus severity. RESULTS: MIs exhibited better hearing sensitivity than non-MIs; however, their SIN scores were similar. MIs matched their tinnitus loudness to a lower intensity than non-MIs, but their loudness ratings (NRS) were comparable. MIs reported greater tinnitus annoyance and severity on the relaxation subscale, indicating increased difficulty engaging in restful activities. Tinnitus severity was influenced by hyperacusis and depression in both MIs and non-MIs; however, hearing loss uniquely contributed to severity in MIs. CONCLUSIONS: Our findings suggest that while MIs may exhibit better or comparable listening abilities, they were significantly more affected by tinnitus than non-MIs. Furthermore, our study highlights the importance of assessing tinnitus-related distress across multiple dimensions, facilitating customization of management strategies for both MIs and non-MIs.

5.
Pediatr Transplant ; 28(4): e14763, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38682750

ABSTRACT

BACKGROUND: Epstein-Barr virus (EBV)-associated post-transplant lymphoproliferative disorders (PTLD) is the most common malignancy in children after transplant; however, difficulties for early detection may worsen the prognosis. METHODS: The prospective, multicenter, study enrolled 944 children (≤21 years of age). Of these, 872 received liver, heart, kidney, intestinal, or multivisceral transplants in seven US centers between 2014 and 2019 (NCT02182986). In total, 34 pediatric EBV+ PTLD (3.9%) were identified by biopsy. Variables included sex, age, race, ethnicity, transplanted organ, EBV viral load, pre-transplant EBV serology, immunosuppression, response to chemotherapy and rituximab, and histopathological diagnosis. RESULTS: The uni-/multivariable competing risk analyses revealed the combination of EBV-seropositive donor and EBV-naïve recipient (D+R-) was a significant risk factor for PTLD development (sub-hazard ratio: 2.79 [1.34-5.78], p = .006) and EBV DNAemia (2.65 [1.72-4.09], p < .001). Patients with D+R- were significantly more associated with monomorphic/polymorphic PTLD than those with the other combinations (p = .02). Patients with monomorphic/polymorphic PTLD (n = 21) had significantly more EBV DNAemia than non-PTLD patients (p < .001) and an earlier clinical presentation of PTLD than patients with hyperplasias (p < .001), within 6-month post-transplant. Among non-liver transplant recipients, monomorphic/polymorphic PTLD were significantly more frequent than hyperplasias in patients ≥5 years of age at transplant (p = .01). CONCLUSIONS: D+R- is a risk factor for PTLD and EBV DNAemia and associated with the incidence of monomorphic/polymorphic PTLD. Intensive follow-up of EBV viral load within 6-month post-transplant, especially for patients with D+R- and/or non-liver transplant recipients ≥5 years of age at transplant, may help detect monomorphic/polymorphic PTLD early in pediatric transplant.
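The sub-hazard ratios above come from a competing-risks framework, where the quantity of interest is the cumulative incidence of PTLD with other events (e.g., death) treated as competing causes. A minimal sketch of the nonparametric cumulative incidence estimator on hypothetical data (codes: 0 = censored, 1 = cause of interest, 2 = competing event), not the study's Fine-Gray model itself:

```python
def cumulative_incidence(times, causes, cause_of_interest):
    """Nonparametric (Aalen-Johansen-type) cumulative incidence function.

    times  : follow-up times
    causes : 0 = censored; 1, 2, ... = event causes
    Returns a list of (time, CIF) pairs for the cause of interest.
    """
    data = sorted(zip(times, causes))
    n_at_risk = len(data)
    overall_surv = 1.0  # probability of being free of *any* event just before t
    cif = 0.0
    out = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_interest = sum(1 for tt, c in data if tt == t and c == cause_of_interest)
        d_any = sum(1 for tt, c in data if tt == t and c != 0)
        leaving = sum(1 for tt, _ in data if tt == t)
        if d_interest > 0:
            # increment by P(event-free just before t) * hazard of this cause at t
            cif += overall_surv * d_interest / n_at_risk
            out.append((t, cif))
        if d_any > 0:
            overall_surv *= 1 - d_any / n_at_risk
        n_at_risk -= leaving
        i += leaving
    return out

# Hypothetical cohort: PTLD at t=1 and t=3, a competing event at t=2, censoring at t=4
curve = cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 0], cause_of_interest=1)
```

Note that competing events reduce the event-free probability but do not add to the PTLD curve, which is why 1 − Kaplan-Meier would overestimate incidence here.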


Subjects
Epstein-Barr Virus Infections , Lymphoproliferative Disorders , Organ Transplantation , Postoperative Complications , Humans , Lymphoproliferative Disorders/etiology , Lymphoproliferative Disorders/epidemiology , Lymphoproliferative Disorders/virology , Epstein-Barr Virus Infections/epidemiology , Male , Prospective Studies , Child , Female , United States/epidemiology , Child, Preschool , Adolescent , Infant , Organ Transplantation/adverse effects , Postoperative Complications/epidemiology , Postoperative Complications/virology , Postoperative Complications/etiology , Risk Factors , Herpesvirus 4, Human , Young Adult
6.
Animals (Basel) ; 14(6)2024 Mar 10.
Article in English | MEDLINE | ID: mdl-38539948

ABSTRACT

The aim of this study was to assess the carcass and meat quality of female Lidia cattle slaughtered at different ages, in order to deepen our understanding of the breed's unique characteristics. The effect of slaughter age on carcass traits and meat quality attributes of m. Longissimus was investigated in Lidia heifers (n = 200) and cows (n = 100) reared and finished in an extensive system. The animals were slaughtered at 24-36 months (Heifer I), 36-48 months (Heifer II) or >48 months (Cull cow). The carcasses (~120 kg) presented poor conformation (O, O+) and medium fatness (2, 2+). The dissection of the 6th rib yielded mean values of 58.6%, 14.3% and 24.8% for lean, fat and bone, respectively. The cows had a higher proportion of dissectible fat (p < 0.05). Subcutaneous fat was classified as dark and yellowish, and meat (aged for 21 days) as dark (L* = 25.5), reddish (a* = 14.4) and moderately yellowish (b* = 12.9), with acceptable water-holding capacity (TL = 5.34%; DL = 0.97%; PL = 8.9%; CL = 22.1%) and intermediate tenderness (WBSF = 4.6 kg/cm2). The meat of cull cows was more yellowish (higher b* value; p < 0.05) and obtained higher scores for flavor (p < 0.05), juiciness (p < 0.01), overall tenderness (p < 0.001) and overall acceptance (p < 0.001).
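The tissue percentages from the 6th-rib dissection are simple proportions of the dissected joint weight. A small sketch with hypothetical weights (not reported in the abstract, chosen only to reproduce the published means):

```python
def tissue_percentages(weights_g):
    """Convert dissected tissue weights (g) into percentages of the joint weight."""
    total = sum(weights_g.values())
    return {tissue: round(100 * w / total, 1) for tissue, w in weights_g.items()}

# Hypothetical 6th-rib dissection; 'other' covers connective tissue and dissection losses,
# which is why the three published percentages sum to 97.7% rather than 100%
rib = {"lean": 586, "fat": 143, "bone": 248, "other": 23}
composition = tissue_percentages(rib)
```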

7.
Bioengineering (Basel) ; 11(2)2024 Jan 25.
Article in English | MEDLINE | ID: mdl-38391603

ABSTRACT

INTRODUCTION: The vestibular system, essential for gaze and postural stability, can be damaged by threats on the battlefield. Technology can aid in vestibular assessment and rehabilitation; however, not all devices are conducive to the delivery of healthcare in an austere setting. This scoping review aimed to examine the literature for technologies that can be utilized for vestibular assessment and rehabilitation in operational environments. MATERIALS AND METHODS: A comprehensive search of PubMed was performed. Articles were included if they related to central or peripheral vestibular disorders, addressed assessment or rehabilitation, leveraged technology, and were written in English. Articles were excluded if they discussed health conditions other than vestibular disorders, focused on devices or techniques not conducive to the operational environment, or were written in a language other than English. RESULTS: Our search strategy yielded 32 articles: 8 met our inclusion and exclusion criteria, whereas the other 24 were excluded. DISCUSSION: There is untapped potential for leveraging technology for vestibular assessment and rehabilitation in the operational environment. Few studies were found in the peer-reviewed literature that described the application of technology to improve the identification of central and/or peripheral vestibular system impairments; triage of acutely injured patients; diagnosis; delivery and monitoring of rehabilitation; and determination of readiness for return to duty. CONCLUSIONS: This scoping review highlighted technology for vestibular assessment and rehabilitation feasible for use in an austere setting. Such technology may be leveraged for prevention; monitoring exposure to mechanisms of injury; vestibular-ocular motor evaluation; assessment, treatment, and monitoring of rehabilitation progress; and return-to-duty determination after vestibular injury.
FUTURE DIRECTIONS: The future of vestibular assessment and rehabilitation may be shaped by austere manufacturing and 3D printing; artificial intelligence; drug delivery in combination with vestibular implantation; organ-on-chip and organoids; cell and gene therapy; and bioprinting.

8.
Microbiol Res ; 281: 127621, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38295679

ABSTRACT

Trichoderma spp. are free-living fungi present in virtually all terrestrial ecosystems. These soil fungi can stimulate plant growth and increase plant acquisition of macro- and micronutrients and water. Generally, plant growth promotion by Trichoderma is a consequence of the activity of potent fungal signaling metabolites diffused in soil with hormone-like activity, including indolic compounds such as indole-3-acetic acid (IAA), produced at concentrations ranging from 14 to 234 µg l-1, and volatile organic compounds such as sesquiterpene isoprenoids (C15), 6-pentyl-2H-pyran-2-one (6-PP) and ethylene (ET), produced at levels from 10 to 120 ng over a period of six days, which, in turn, might impact plant endogenous signaling mechanisms orchestrated by plant hormones. Plant growth stimulation occurs without the need for physical contact between the two organisms and/or during root colonization. When associated with plants, Trichoderma may cause significant biochemical changes in plant content of carbohydrates, amino acids, organic acids and lipids, as detected in Arabidopsis thaliana, maize (Zea mays), tomato (Lycopersicon esculentum) and barley (Hordeum vulgare), which may improve the plant health status during the complete life cycle. Trichoderma-induced plant beneficial effects, such as mechanisms of defense and growth, are likely to be inherited by the next generations. Depending on the environmental conditions perceived by the fungus during its interaction with plants, Trichoderma can reprogram and/or activate molecular mechanisms commonly modulated by IAA, ET and abscisic acid (ABA) to induce an adaptive physiological response to abiotic stress, including drought, salinity, or environmental pollution.
This review provides a state-of-the-art overview of the canonical mechanisms of these beneficial fungi involved in plant growth promotion under different environmental scenarios and shows new insights on Trichoderma metabolites from different chemical classes that can modulate specific aspects of plant growth. We also suggest new research directions on Trichoderma spp. and their secondary metabolites with biological activity on plant growth.


Subjects
Arabidopsis , Ethylenes , Trichoderma , Ecosystem , Trichoderma/metabolism , Plant Development , Plant Growth Regulators/metabolism , Plants/metabolism , Soil , Plant Roots/microbiology
9.
Transplant Proc ; 56(1): 161-168, 2024.
Article in English | MEDLINE | ID: mdl-38195284

ABSTRACT

BACKGROUND: This study aims to evaluate patient outcomes of simultaneous triple organ transplants, which may provide insight into optimal donor allocation while maximizing recipient benefit. METHODS: Triple organ transplants and their corollary dual organ transplants were identified using the United Network for Organ Sharing database. Triple organ transplants evaluated included heart-lung-kidney (n = 12) and heart-liver-kidney (n = 37). Heart-lung-kidney recipients were compared with heart-lung (n = 325), lung-kidney (n = 91), and heart-kidney (n = 2022) groups. Heart-liver-kidney recipients were compared with heart-liver (n = 451), liver-kidney (n = 10422), and heart-kidney (n = 2517) recipients. Patient survival outcomes were calculated using the Kaplan-Meier method and compared using log-rank tests. RESULTS: Patients undergoing triple organ transplants showed similar 10-year survival to their corresponding dual organ transplant cohorts. The patient survival estimate at 10 years for the heart-lung-kidney group was 45%, with no statistically significant difference in survival when compared with dual organ groups (P = .16). The survival estimate at 10 years for the heart-liver-kidney group was 49%, with no statistically significant difference in survival when compared with dual organ groups (P = .06). CONCLUSION: Despite the surgical burden of adding a third organ transplant, heart-liver-kidney and heart-lung-kidney transplants have similar survival outcomes to their dual organ equivalents and represent a reasonable allocation option in well-selected patients.


Subjects
Heart Transplantation , Organ Transplantation , Tissue and Organ Procurement , Humans , United States , Heart Transplantation/adverse effects , Incidence , Retrospective Studies , Organ Transplantation/adverse effects , Kidney , Tissue Donors , Graft Survival
10.
J Clin Oncol ; 42(7): 790-799, 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-38175991

ABSTRACT

PURPOSE: There are limited data on antiviral treatment utilization and its impact on long-term outcomes of hepatitis B virus (HBV)- and hepatitis C virus (HCV)-related hepatocellular carcinoma (HCC) after hepatic resection. We aimed to determine the utilization and impact of antivirals in HBV- and HCV-related HCC. METHODS: This cohort study included 1,906 participants (1,054 HBV-related HCC and 852 HCV-related HCC) from 12 international sites. All participants had HBV- or HCV-related HCC and underwent curative surgical resection. The primary outcome was the utilization of antiviral therapy, and the secondary outcome was long-term overall survival (OS). RESULTS: The mean (±standard deviation [SD]) age was 62.1 (±11.3) years, 74% were male, and 84% were Asian. Overall, 47% of the cohort received antiviral therapy during a mean (±SD) follow-up of 5.0 (±4.3) years. The overall antiviral utilization for participants with HBV-related HCC was 57% and declined over time, from 65% before 2010, to 60% from 2010 to 2015, to 47% beyond 2015, P < .0001. The overall utilization of antivirals for HCV-related HCC was 35% and increased over time, from 24% before 2015 to 74% from 2015 and beyond, P < .0001. The 10-year OS was lower in untreated participants for both HBV (58% v 61%) and HCV (38% v 82%) participants; both P < .0001. On multivariable Cox regression analysis adjusted for relevant confounders, antiviral therapy initiated before or within 6 months of HCC diagnosis was independently associated with lower mortality in both HBV- (adjusted hazard ratio [aHR], 0.60 [95% CI, 0.43 to 0.83]; P = .002) and HCV-related HCC (aHR, 0.18 [95% CI, 0.11 to 0.31]; P < .0001). CONCLUSION: Antiviral therapy is associated with long-term survival in people with HBV- or HCV-related HCC who undergo curative resection but is severely underutilized.


Subjects
Carcinoma, Hepatocellular , Hepatitis B , Hepatitis C , Liver Neoplasms , Male , Humans , Middle Aged , Aged , Female , Carcinoma, Hepatocellular/pathology , Hepatitis B virus , Liver Neoplasms/pathology , Hepacivirus , Cohort Studies , Hepatitis C/complications , Hepatitis C/drug therapy , Antiviral Agents/therapeutic use , Hepatitis B/complications , Hepatitis B/drug therapy , Retrospective Studies
11.
Transplantation ; 108(2): 464-472, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38259179

ABSTRACT

BACKGROUND: Children are removed from the liver transplant waitlist because of death or progressive illness. Size mismatch accounts for 30% of organ refusal. This study aimed to demonstrate that 3-dimensional (3D) technology is a feasible and accurate adjunct to the organ allocation and living donor selection process. METHODS: This prospective multicenter study included pediatric liver transplant candidates and living donors from January 2020 to February 2023. Patient-specific, 3D-printed liver models were used for anatomic planning, real-time evaluation during organ procurement, and surgical navigation. The primary outcome was to determine model accuracy. The secondary outcome was to determine the impact on outcomes of living donor hepatectomy. Study groups were analyzed using propensity score matching with a retrospective cohort. RESULTS: Twenty-eight recipients were included. The median percentage error was -0.6% for 3D models, which had the highest correlation with the actual liver explant (Pearson's r = 0.96, P < 0.001) compared with other volume calculation methods. Patient and graft survival were comparable. From 41 living donors, the median percentage error of the allograft was 12.4%. The donor-matched study group had lower central line utilization (21.4% versus 75%, P = 0.045), shorter length of stay (4 versus 7 d, P = 0.003), and lower mean comprehensive complication index (3 versus 21, P = 0.014). CONCLUSIONS: Three-dimensional model volume is highly correlated with actual liver explant volume and may vary across different allografts for living donation. The addition of 3D-printed liver models during the transplant evaluation and organ procurement process is a feasible and safe adjunct to the perioperative decision-making process.
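The accuracy metrics above (median signed percentage error and Pearson correlation between model and explant volumes) are straightforward to compute. A sketch on hypothetical volumes, not the study data:

```python
from statistics import median

def percentage_errors(model_vols, actual_vols):
    """Signed percentage error of each model volume relative to the explant volume."""
    return [100 * (m - a) / a for m, a in zip(model_vols, actual_vols)]

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical 3D-model vs explant volumes (mL)
model = [310, 455, 520, 610, 700]
actual = [315, 450, 530, 600, 710]
errs = percentage_errors(model, actual)
```

Using the median of signed errors, as the study does, keeps over- and underestimates from canceling less than a mean would while staying robust to a single badly segmented model.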


Subjects
Liver Transplantation , Models, Anatomic , Child , Humans , Liver , Living Donors , Prospective Studies , Retrospective Studies , Printing, Three-Dimensional
12.
Pediatr Transplant ; 28(1): e14471, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37294621

ABSTRACT

The International Pediatric Transplant Association convened an expert consensus conference to assess current evidence and develop recommendations for various aspects of care relating to post-transplant lymphoproliferative disorders (PTLD) after solid organ transplantation (SOT) in children. In this report from the Viral Load and Biomarker Monitoring Working Group, we reviewed the existing literature regarding the role of Epstein-Barr virus (EBV) load and other biomarkers in peripheral blood for predicting the development of PTLD, for PTLD diagnosis, and for monitoring of response to treatment. Key recommendations from the group included the strong recommendation to use the term EBV DNAemia instead of "viremia" to describe EBV DNA levels in peripheral blood, as well as concerns with comparing EBV DNAemia measurement results obtained at different institutions, even when tests are calibrated using the WHO international standard. The working group concluded that either whole blood or plasma could be used as matrices for EBV DNA measurement; the optimal specimen type may be clinical context dependent. Whole blood testing has some advantages for surveillance to inform pre-emptive interventions, while plasma testing may be preferred in the setting of clinical symptoms and treatment monitoring. However, EBV DNAemia testing alone was not recommended for PTLD diagnosis. Quantitative EBV DNAemia surveillance to identify patients at risk for PTLD and to inform pre-emptive interventions in patients who are EBV seronegative pre-transplant was recommended. In contrast, with the exception of intestinal transplant recipients or those with recent primary EBV infection prior to SOT, surveillance was not recommended in pediatric SOT recipients who are EBV seropositive pre-transplant. Implications of viral load kinetic parameters, including peak load and viral set point, for pre-emptive PTLD prevention monitoring algorithms were discussed.
The use of additional markers, including measurements of EBV-specific cell-mediated immunity, was discussed but not recommended, though the importance of obtaining additional data from prospective multicenter studies was highlighted as a key research priority.


Subjects
Epstein-Barr Virus Infections , Lymphoproliferative Disorders , Organ Transplantation , Humans , Child , Herpesvirus 4, Human/genetics , Epstein-Barr Virus Infections/complications , Epstein-Barr Virus Infections/diagnosis , Prospective Studies , Lymphoproliferative Disorders/diagnosis , Lymphoproliferative Disorders/etiology , Lymphoproliferative Disorders/prevention & control , DNA, Viral , Organ Transplantation/adverse effects , Biomarkers , Viral Load
13.
Transplantation ; 108(3): 703-712, 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-37635278

ABSTRACT

BACKGROUND: Technical variant liver transplantation (TVLT) is a strategy to mitigate persistent pediatric waitlist mortality in the United States, although its implementation remains stagnant. This study investigated the relationship between TVLT utilization, transplant center volume, and graft survival. METHODS: Pediatric liver transplant recipients from 2010 to 2020 (n = 5208) were analyzed using the Scientific Registry of Transplant Recipients database. Transplant centers were categorized according to the average number of pediatric liver transplants performed per year (high-volume, ≥5; low-volume, <5). Graft survival rates were compared using Kaplan-Meier curves and log-rank tests. Cox proportional hazards models were used to identify predictors of graft failure. RESULTS: High-volume centers demonstrated equivalent whole liver transplant and TVLT graft survival (P = 0.057) and significantly improved TVLT graft survival compared with low-volume centers (P < 0.001). Transplantation at a low-volume center was significantly associated with graft failure (adjusted hazard ratio, 1.6; 95% confidence interval, 1.14-2.24; P = 0.007 in patients <12 y old and 1.8; 95% confidence interval, 1.13-2.87; P = 0.013 in patients ≥12 y old). A subset of high-volume centers with a significantly higher rate of TVLT use demonstrated a 23% reduction in waitlist mortality. CONCLUSIONS: Prompt transplantation with increased TVLT utilization at high-volume centers may reduce pediatric waitlist mortality without compromising graft survival.


Subjects
Liver Transplantation , Humans , Child , United States , Liver Transplantation/adverse effects , Graft Survival , Proportional Hazards Models , Transplant Recipients , Survival Rate , Retrospective Studies
14.
Liver Transpl ; 30(4): 376-385, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-37616509

ABSTRACT

With increasing metabolic dysfunction-associated steatotic liver disease, the use of steatotic grafts in liver transplantation (LT) and their impact on postoperative graft survival (GS) need further exploration. Analyzing adult LT recipient data (2002-2022) from the United Network for Organ Sharing database, outcomes of LT using steatotic (≥30% macrosteatosis) and nonsteatotic donor livers, donors after circulatory death, and standard-risk older donors (age 45-50) were compared. GS predictors were evaluated using Kaplan-Meier and Cox regression analyses. Of the 35,345 donor livers, 8.9% (3,155) were fatty livers. The initial 30-day postoperative period revealed significant challenges with fatty livers, demonstrating inferior GS. However, the GS discrepancy between fatty and nonfatty livers subsided over time (p = 0.10 at 5 y). Long-term GS outcomes showed comparable or even superior results in fatty livers relative to nonsteatotic livers, conditional on surviving the initial 90 postoperative days (p = 0.90 at 1 y) or 1 year (p = 0.03 at 5 y). In the multivariable Cox regression analysis, a high body surface area (BSA) ratio (≥1.1) (HR 1.42, p = 0.02), calculated as donor BSA divided by recipient BSA, long cold ischemic time (≥6.5 h) (HR 1.72, p < 0.01), and recipient medical condition (intensive care unit hospitalization) (HR 2.53, p < 0.01) emerged as significant adverse prognostic factors. For young (<40 y) fatty donors, a high BSA ratio, diabetes, and intensive care unit hospitalization were significant indicators of a worse prognosis (p < 0.01). Our study emphasizes the initial postoperative 30-day survival challenge in LT using fatty livers.
However, with careful donor-recipient matching, for example, avoiding the use of steatotic donors with long cold ischemic times and high BSA ratios for recipients in the intensive care unit, it is possible to enhance immediate GS, and over the longer term, outcomes comparable to those using nonfatty livers, donation after circulatory death livers, or standard-risk older donors can be anticipated. These novel insights into decision-making criteria for steatotic liver use provide valuable guidance for clinicians.
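The BSA ratio flagged above is simply donor BSA divided by recipient BSA, with ≥1.1 the adverse threshold. The abstract does not state which BSA formula the analysis used; the Du Bois formula below is one common convention, and the donor/recipient measurements are hypothetical:

```python
def bsa_du_bois(height_cm, weight_kg):
    """Body surface area (m^2) by the Du Bois formula (one common convention)."""
    return 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425

def bsa_ratio(donor_hw, recipient_hw, threshold=1.1):
    """Donor BSA / recipient BSA; a ratio >= threshold was adverse in the Cox model."""
    ratio = bsa_du_bois(*donor_hw) / bsa_du_bois(*recipient_hw)
    return ratio, ratio >= threshold

# Hypothetical large donor (185 cm, 95 kg) matched to a small recipient (160 cm, 55 kg)
ratio, size_mismatch = bsa_ratio((185, 95), (160, 55))
```

A large-donor/small-recipient pairing like this one lands well above 1.1, which is the size-mismatch scenario the model penalizes.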


Subjects
Fatty Liver , Liver Transplantation , Humans , Middle Aged , Liver Transplantation/methods , Prognosis , Fatty Liver/etiology , Liver/metabolism , Tissue Donors , Graft Survival
15.
Surgery ; 175(2): 513-521, 2024 02.
Article in English | MEDLINE | ID: mdl-37980203

ABSTRACT

BACKGROUND: Long-distance-traveling liver grafts in liver transplantation present challenges due to prolonged cold ischemic time and increased risk of ischemia-reperfusion injury. We identified long-distance-traveling liver graft donor and recipient characteristics and risk factors associated with long-distance-traveling liver graft use. METHODS: We conducted a retrospective analysis of deceased donor liver transplantations registered from 2014 to 2020 in the United Network for Organ Sharing registry database. Donor, recipient, and transplant factors affecting graft survival were compared between short-travel grafts and long-distance-traveling liver grafts (traveled >500 miles). RESULTS: During the study period, 28,265 patients received a donation after brainstem death liver transplantation and 3,250 a donation after circulatory death liver transplantation. The long-distance-traveling liver graft rate was 6.2% in donation after brainstem death liver transplantation and 7.1% in donation after circulatory death liver transplantation. The 90-day graft survival rates were significantly worse for long-distance-traveling liver grafts (donation after brainstem death: 95.7% vs 94.5%; donation after circulatory death: 94.5% vs 93.9%). The 3-year graft survival rates were similar for long-distance-traveling liver grafts (donation after brainstem death: 85.5% vs 85.1%; donation after circulatory death: 81.0% vs 80.4%). Cubic spline regression analyses revealed that travel distance did not linearly worsen the prognosis of 3-year graft survival. On the other hand, younger donor age, lower donor body mass index, and shorter cold ischemic time mitigated the negative impact on 90-day graft survival in long-distance-traveling liver grafts. CONCLUSION: The use of long-distance-traveling liver grafts negatively impacts 90-day graft survival but not 3-year graft survival.
Moreover, long-distance-traveling liver grafts are more feasible when appropriate donor and recipient factors offset the extended cold ischemic time. Machine perfusion can improve long-distance-traveling liver graft use. Enhanced collaboration between organ procurement organizations and transplant centers and optimized transportation systems are essential for increasing long-distance-traveling liver graft use, ultimately expanding the donor pool.


Subjects
Liver Transplantation , Tissue and Organ Procurement , Humans , Liver Transplantation/adverse effects , Retrospective Studies , Living Donors , Tissue Donors , Liver , Risk Factors , Graft Survival
16.
Clin Transplant ; 38(1): e15155, 2024 01.
Article in English | MEDLINE | ID: mdl-37812571

ABSTRACT

BACKGROUND: Donors with hyperbilirubinemia are often not utilized for liver transplantation (LT) due to concerns about potential liver dysfunction and graft survival. The potential to mitigate organ shortages using such donors remains unclear. METHODS: This study analyzed adult deceased donor data from the United Network for Organ Sharing database (2002-2022). Hyperbilirubinemia was categorized as high total bilirubin (3.0-5.0 mg/dL) and very high bilirubin (≥5.0 mg/dL) in brain-dead donors. We assessed the impact of donor hyperbilirubinemia on 3-month and 3-year graft survival, comparing these outcomes to donors after circulatory death (DCD). RESULTS: Of 138,622 donors, 3,452 (2.5%) had high bilirubin and 1,999 (1.4%) had very high bilirubin levels. Utilization rates for the normal, high, and very high bilirubin groups were 73.5%, 56.4%, and 29.2%, respectively. No significant differences were found in 3-month and 3-year graft survival between groups. Donors with high bilirubin had superior 3-year graft survival compared to DCD donors (hazard ratio 0.83, p = .02). Factors associated with inferior short-term graft survival included recipient medical condition in the intensive care unit (ICU) and longer cold ischemic time; factors associated with inferior long-term graft survival included older donor age, recipient medical condition in the ICU, older recipient age, and longer cold ischemic time. Donors with ≥10% macrosteatosis in the very high bilirubin group were also associated with worse 3-year graft survival (p = .04). DISCUSSION: The study suggests that, despite many grafts with hyperbilirubinemia going unutilized, acceptable post-LT outcomes can be achieved using donors with hyperbilirubinemia. Careful selection may increase utilization and expand the donor pool without negatively affecting graft outcomes.


Subjects
Liver; Tissue and Organ Procurement; Adult; Humans; Prognosis; Tissue Donors; Graft Survival; Hyperbilirubinemia/etiology; Bilirubin; Retrospective Studies
17.
Transplantation; 108(3): 742-749, 2024 Mar 1.
Article in English | MEDLINE | ID: mdl-37899485

ABSTRACT

BACKGROUND: The selection of liver transplant (LT) candidates with alcohol-related liver disease (ALD) is influenced by the risk of alcohol relapse (AR), yet the ability to predict AR is limited. We evaluated psychosocial factors associated with post-LT AR and compared the performance of the high-risk alcoholism relapse (HRAR), sustained alcohol use post-LT (SALT), and Stanford Integrated Psychosocial Assessment for Transplantation (SIPAT) scores in predicting relapse. METHODS: A retrospective analysis of ALD patients undergoing LT from 2015 to 2021 at a single US transplant center was performed. Risk factors associated with post-LT AR were evaluated, and the test characteristics of the 3 prediction models were compared. RESULTS: Of 219 ALD LT recipients, 23 (11%) had AR during a median study follow-up of 37.5 mo. On multivariate analysis, comorbid psychiatric illness (odds ratio 5.22) and continued alcohol use after advice from a health care provider (odds ratio 3.8) were significantly associated with post-LT AR. On sensitivity analysis, a SIPAT cutoff of 30 was optimal for discriminating between ALD LT recipients with and without post-LT AR. SIPAT outperformed both the HRAR and SALT scores (c-statistic 0.67 versus 0.59 and 0.62, respectively) in identifying post-LT AR; however, all scores had poor positive predictive value (<25%). CONCLUSIONS: AR after LT is associated with comorbid psychiatric illness and failure to heed health care provider advice to abstain from alcohol. Although SIPAT outperformed the HRAR and SALT scores in predicting AR, all are poor predictors. The current tools to predict post-LT AR should not be used to exclude LT candidacy.


Subjects
Alcoholism; Liver Diseases, Alcoholic; Liver Diseases; Liver Transplantation; Humans; Liver Transplantation/adverse effects; Retrospective Studies; Alcohol Drinking/adverse effects; Alcoholism/complications; Alcoholism/diagnosis; Alcoholism/epidemiology; Chronic Disease; Recurrence; Liver Diseases, Alcoholic/complications; Liver Diseases, Alcoholic/diagnosis; Liver Diseases, Alcoholic/surgery
18.
Mil Med; 188(Suppl 6): 511-519, 2023 Nov 8.
Article in English | MEDLINE | ID: mdl-37948221

ABSTRACT

INTRODUCTION: Dizziness is prevalent in the general population, but little is known about its prevalence in the U.S. military population. Dizziness is commonly associated with blast exposure and traumatic brain injury (TBI), but the potential independent contributions of blast and TBI have yet to be evaluated. This study's goal was to estimate the prevalence of dizziness among post-9/11 service members and Veterans and to examine independent and joint associations between military TBI history, blast exposure, and self-reported dizziness. MATERIALS AND METHODS: The study sample consisted of service members (n = 424) and recently separated (within approximately 2.5 years) Veterans (n = 492) enrolled in the Noise Outcomes in Service members Epidemiology (NOISE) Study. We examined associations between self-reported history of probable TBI, blast exposure, and recent dizziness using logistic regression. Models were stratified by service member versus Veteran status and adjusted for potentially confounding demographic and military characteristics. RESULTS: Overall, 22% of service members and 31% of Veterans self-reported dizziness. Compared to those with neither TBI nor blast exposure history, both service members and Veterans with TBI (with or without blast exposure) were three to four times more likely to self-report dizziness. Those with blast exposure but no TBI history were no more likely to self-report dizziness. There was no evidence of an interaction effect between blast exposure and a history of TBI on the occurrence of dizziness. CONCLUSION: Self-reported dizziness was prevalent in this sample of service members and Veterans. Probable TBI history, with or without blast exposure, was associated with dizziness, but blast exposure without TBI history was not. This suggests that treatment guidelines for TBI-related dizziness may not need to be tailored to the injury mechanism. However, future efforts should be directed toward understanding how the pathophysiology of TBI contributes to self-reported dizziness, which is fundamental to the design of treatment strategies.


Subjects
Blast Injuries; Brain Injuries, Traumatic; Military Personnel; Stress Disorders, Post-Traumatic; Veterans; Humans; Self Report; Dizziness/epidemiology; Dizziness/etiology; Prevalence; Blast Injuries/complications; Blast Injuries/epidemiology; Brain Injuries, Traumatic/complications; Brain Injuries, Traumatic/epidemiology; Brain Injuries, Traumatic/therapy; Risk Factors; Vertigo; Stress Disorders, Post-Traumatic/complications
19.
Mil Med; 2023 Oct 19.
Article in English | MEDLINE | ID: mdl-37856686

ABSTRACT

INTRODUCTION: The Department of Defense Medical Examination Review Board (DoDMERB) plays a pivotal role in assessing the medical fitness of aspiring military officers. A crucial component of this process is the screening audiogram, designed to evaluate hearing capabilities. However, recent observations of high disqualification rates following screening audiograms raised concerns about their accuracy. MATERIALS AND METHODS: This quality improvement project, conducted between 2017 and 2019, aimed to assess the concordance between screening audiograms and reference-standard audiometry and to investigate the relationship between disqualification status and hearing thresholds at different frequencies. A sample of 134 candidates, drawn from various locations across the United States, was analyzed. RESULTS: Mean screening audiogram thresholds were twice those of the reference-standard audiogram, particularly at the lower frequencies. Additionally, 84% of candidates were incorrectly disqualified by the screening exam, as revealed by follow-up reference-standard audiometry. Overall, Bland-Altman analysis revealed significant disagreement between the two tests. This discrepancy prompted a fundamental policy shift in 2020: candidates who fail screening audiograms now automatically undergo reference-standard audiometry before any disqualification decision. This policy change reflects the commitment of DoDMERB to refining the medical screening process; it reduces the burden on candidates, provides a more comprehensive assessment, and ensures that qualified individuals are not erroneously disqualified. In addition to policy changes, this quality improvement project explored potential courses of action to enhance the screening audiogram process. Among these, improving contract specifications for testing facilities to minimize ambient noise emerged as the most practical and cost-effective approach.
CONCLUSION: This project underscores the importance of refining medical screening processes to accurately assess candidates' qualifications while retaining the utility of screening audiograms. These efforts not only benefit aspiring military officers but also contribute to maintaining the high standards required for military service.

20.
Transplantation; 107(10): 2087-2097, 2023 Oct 1.
Article in English | MEDLINE | ID: mdl-37750781

ABSTRACT

BACKGROUND: Over 16 000 children under the age of 15 died worldwide in 2017 because of liver disease. Pediatric liver transplantation (PLT) is currently the standard of care for these patients. The aim of this study is to describe global PLT activity and identify variations between regions. METHODS: A survey was conducted from May 2018 to August 2019 to determine the current state of PLT. Transplant centers were categorized into quintile categories according to the year they performed their first PLT. Countries were classified according to gross national income per capita. RESULTS: One hundred eight programs from 38 countries were included (68% response rate). A total of 10 619 PLTs were performed within the last 5 y. High-income countries performed 4992 (46.4%) PLTs, followed by upper-middle- (4704 [44.3%]) and lower-middle (993 [9.4%])-income countries. The most frequently used grafts worldwide are living donor grafts. A higher proportion of lower-middle-income countries (68.7%) performed ≥25 living donor liver transplants over the last 5 y compared to high-income countries (36%; P = 0.019). A greater proportion of programs from high-income countries have performed ≥25 whole liver transplants (52.4% versus 6.2%; P = 0.001) and ≥25 split/reduced liver transplants (53.2% versus 6.2%; P < 0.001) compared to lower-middle-income countries. CONCLUSIONS: This study represents, to our knowledge, the most geographically comprehensive report on PLT activity and a first step toward global collaboration and data sharing for the greater good of children with liver disease; it is imperative that these centers share the lead in PLT.


Subjects
Liver Diseases; Liver Transplantation; Child; Humans; Liver Transplantation/adverse effects; Censuses; Living Donors; Death