Results 1 - 20 of 652
1.
Nature; 575(7781): 180-184, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31695210

ABSTRACT

Methane is a powerful greenhouse gas and is targeted for emissions mitigation by the US state of California and other jurisdictions worldwide [1,2]. Unique opportunities for mitigation are presented by point-source emitters: surface features or infrastructure components that are typically less than 10 metres in diameter and emit plumes of highly concentrated methane [3]. However, data on point-source emissions are sparse and typically lack sufficient spatial and temporal resolution to guide their mitigation and to accurately assess their magnitude [4]. Here we survey more than 272,000 infrastructure elements in California using an airborne imaging spectrometer that can rapidly map methane plumes [5-7]. We conduct five campaigns over several months from 2016 to 2018, spanning the oil and gas, manure-management and waste-management sectors, resulting in the detection, geolocation and quantification of emissions from 564 strong methane point sources. Our remote-sensing approach enables the rapid and repeated assessment of large areas at high spatial resolution for a poorly characterized population of methane emitters that often appear intermittently and stochastically. We estimate net methane point-source emissions in California to be 0.618 teragrams per year (95 per cent confidence interval 0.523-0.725), equivalent to 34-46 per cent of the state's methane inventory for 2016 [8]. Methane 'super-emitter' activity occurs in every sector surveyed, with 10 per cent of point sources contributing roughly 60 per cent of point-source emissions, consistent with a study of the US Four Corners region that had a different sectoral mix [9]. The largest methane emitters in California are a subset of landfills, which exhibit persistent anomalous activity. Methane point-source emissions in California are dominated by landfills (41 per cent), followed by dairies (26 per cent) and the oil and gas sector (26 per cent). Our data have enabled the identification of the 0.2 per cent of California's infrastructure that is responsible for these emissions. Sharing these data with collaborating infrastructure operators has led to the mitigation of anomalous methane-emission activity [10].
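The "super-emitter" statistic above (10 per cent of sources contributing roughly 60 per cent of point-source emissions) is straightforward to reproduce from ranked plume data. A minimal sketch, using invented emission rates rather than the survey's measurements:

```python
# Illustrative sketch (not the paper's code): how a heavy-tailed
# population of point sources concentrates emissions in a few
# "super-emitters". The plume rates below are made up.
def top_share(emissions, top_fraction=0.10):
    """Fraction of total emissions contributed by the largest
    `top_fraction` of sources."""
    ranked = sorted(emissions, reverse=True)
    k = max(1, int(round(top_fraction * len(ranked))))
    return sum(ranked[:k]) / sum(ranked)

# Hypothetical plume rates in kg CH4 per hour for 20 sources:
rates = [510, 340, 95, 80, 60, 55, 40, 35, 30, 25,
         22, 20, 18, 15, 12, 10, 9, 8, 7, 5]
share = top_share(rates, 0.10)  # top 2 of 20 sources, ~0.61 here
```

With a distribution this skewed, the top decile carries about 60 per cent of the total, mirroring the pattern the survey reports.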


Subjects
Environmental Monitoring, Methane/analysis, Waste Management, California, Greenhouse Effect, Manure, Methane/chemistry, Methane/metabolism, Natural Gas, Oil and Gas Industry/methods, Petroleum, Wastewater
2.
Proc Natl Acad Sci U S A; 119(38): e2202338119, 2022 Sep 20.
Article in English | MEDLINE | ID: mdl-36099297

ABSTRACT

Understanding, prioritizing, and mitigating methane (CH4) emissions requires quantifying CH4 budgets from facility scales to regional scales with the ability to differentiate between source sectors. We deployed a tiered observing system for multiple basins in the United States (San Joaquin Valley, Uinta, Denver-Julesburg, Permian, Marcellus). We quantify strong point-source emissions (>10 kg CH4 per hour) using airborne imaging spectrometers, attribute them to sectors, and assess their intermittency with multiple revisits. We compare these point-source emissions to total basin CH4 fluxes derived from inversion of Sentinel-5P satellite CH4 observations. Across basins, point sources make up on average 40% of the regional flux. We sampled some basins several times across multiple months and years and find a distinct bimodal structure to emission timescales: the total point-source budget is split nearly in half between short-lasting and long-lasting emission events. With the increasing airborne and satellite observing capabilities planned for the near future, tiered observing systems will more fully quantify and attribute CH4 emissions from facility to regional scales, which is needed to effectively and efficiently reduce methane emissions.


Subjects
Air Pollutants, Methane, Air Pollutants/analysis, Methane/analysis, United States
3.
Am J Transplant; 2024 May 15.
Article in English | MEDLINE | ID: mdl-38951053

ABSTRACT

Obesity is a risk factor for kidney, liver, heart, and pulmonary diseases, as well as end-stage organ failure. Solid organ transplantation remains the definitive treatment for the end-stage presentation of these diseases. Among the many criteria for organ transplant, efficient management of obesity is required for patients to achieve transplant eligibility. End-stage organ failure and obesity are 2 complex pathologies that are often entwined. Metabolic and bariatric surgery before, during, or after organ transplant has been studied to determine the long-term effect of bariatric surgery on transplant outcomes. In this review, a multidisciplinary group of surgeons from the Society of American Gastrointestinal and Endoscopic Surgeons and the American Society for Transplant Surgery presents the current published literature on metabolic and bariatric surgery as a therapeutic option for patients with obesity awaiting solid organ transplantation. This manuscript details the most recent recommendations, pharmacologic considerations, and psychological considerations for this specific cohort of patients. Since level-one evidence is not available on many of the topics covered by this review, expert opinion was incorporated in several instances. Additional high-quality research in this area will allow for better recommendations and, therefore, treatment strategies for these complex patients.

4.
Ann Surg; 2024 Mar 11.
Article in English | MEDLINE | ID: mdl-38489660

ABSTRACT

OBJECTIVE: Assess factors affecting the cumulative lifespan of a transplanted liver. SUMMARY BACKGROUND DATA: Liver ageing differs from that of other solid organs. It is unknown how old a liver can actually get after liver transplantation (LT). METHODS: Deceased donor liver transplants from 1988-2021 were queried from the United States (US) UNOS registry. Cumulative liver age was calculated as donor age + recipient graft survival. RESULTS: In total, 184,515 livers were included. Most were from DBD donors (n=175,343). The percentage of livers achieving >70, 80, 90 and 100 years cumulative age was 7.8% (n=14,392), 1.9% (n=3,576), 0.3% (n=528), and 0.01% (n=21), respectively. The youngest donor age contributing to a cumulative liver age >90 years was 59 years, with post-transplant survival of 34 years. In pediatric recipients, 736 (4.4%) and 282 livers (1.7%) survived >50 and >60 years overall, respectively. Transplanted livers achieved cumulative age >90 years in 2.86 per 1000 and >100 years in 0.1 per 1000. The US population at large has a cumulative "liver age" >90 years in 5.35 per 1000 persons, and >100 years in 0.2 per 1000. Livers aged >60 years at transplant experienced both improved cumulative survival (P<0.0001) and, interestingly, improved survival after transplantation (P<0.0001). Recipient warm ischemia time of >30 minutes was most predictive of reduced cumulative liver survival overall (n=184,515, HR=1.126, P<0.001) and after excluding patients with mortality in the first 6 months (n=151,884, HR=0.973, P<0.001). CONCLUSIONS: In summary, transplanted livers frequently get as old as those in the average population despite ischemia-reperfusion injury and immunosuppression. The presented results justify using older donor livers regardless of donation type, even in sicker recipients with limited options.
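The definition in the methods (cumulative liver age = donor age + recipient graft survival) is simple arithmetic; a minimal sketch with hypothetical donor/survival pairs, not UNOS records:

```python
# Sketch of the abstract's definition: cumulative liver age is the
# donor's age at transplant plus the recipient's graft survival.
# The records below are invented for illustration.
def cumulative_liver_age(donor_age, graft_survival_years):
    return donor_age + graft_survival_years

grafts = [(59, 34), (45, 20), (70, 25), (30, 15)]  # (donor age, survival)
ages = [cumulative_liver_age(d, s) for d, s in grafts]

over_90 = sum(a > 90 for a in ages)          # livers exceeding 90 years
rate_per_1000 = 1000 * over_90 / len(ages)   # rate per 1000 grafts
```

The first pair reproduces the abstract's extreme case: a 59-year-old donor liver surviving 34 post-transplant years reaches a cumulative age of 93 years.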

5.
Ann Surg; 2024 Jun 11.
Article in English | MEDLINE | ID: mdl-38860385

ABSTRACT

OBJECTIVE: Describe the utility of circulating tumor DNA (ctDNA) in the post-operative surveillance of hepatocellular carcinoma (HCC). SUMMARY BACKGROUND DATA: Current biomarkers for HCC, such as alpha-fetoprotein (AFP), are lacking. ctDNA has shown promise in colorectal and lung cancers, but its utility in HCC remains relatively unknown. METHODS: Patients with HCC undergoing curative-intent resection from 11/1/2020-7/1/2023 received ctDNA testing using the Guardant360 platform. Tumor mutational burden (TMB) was calculated as the number of somatic mutations per megabase of genomic material identified. RESULTS: Forty-seven patients had post-operative ctDNA testing. Mean follow-up was 27 months and maximum was 43.2 months. Twelve patients (26%) experienced recurrence. Most (n=41/47, 87.2%) had identifiable ctDNA post-operatively; 55.3% (n=26) were TMB-not-detected versus 44.7% (n=21) TMB-detectable. Post-operative identifiable ctDNA was not associated with recurrence-free survival (RFS) (P=0.518). Detectable TMB was associated with reduced RFS (6.9 vs. 14.7 months, P=0.049). There was a higher rate of recurrence in patients with detectable TMB (n=9/21, 42.9%, vs. n=3/26, 11.5%, P=0.02). The area under the curve (AUC) for TMB prediction of recurrence was 0.752 versus 0.550 for AFP. ROC analysis established a TMB cutoff of 4.8 mut/Mb for predicting post-operative recurrence (P=0.002) and RFS (P=0.025). AFP was not correlated with RFS using the lab-normal cutoff (<11 ng/mL, P=0.682) or the cutoff established by ROC analysis (>4.6 ng/mL, P=0.494). TMB-high was associated with poorer RFS on Cox regression analysis (HR=5.386, 95% CI 1.109-26.160, P=0.037), while micro-vascular invasion (P=0.853) and AFP (P=0.439) were not. CONCLUSIONS: Identifiable TMB on post-operative ctDNA predicts HCC recurrence and outperformed AFP in this cohort. Perioperative ctDNA may be a useful surveillance tool following curative-intent hepatectomy. Larger-scale studies are needed to confirm this utility and investigate additional applications in HCC patients, including the potential for prophylactic treatment in patients with residual TMB after resection.
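The ROC-derived TMB cutoff described above can be illustrated with a small self-contained sketch. The rank-based AUC and Youden-index cutoff below operate on synthetic TMB values and recurrence labels, not the study cohort:

```python
# Hedged sketch of ROC analysis for a biomarker cutoff: rank-based
# AUC plus a Youden-index cutoff. Data are synthetic.
def roc_auc(scores, labels):
    """Rank-based AUC: probability a positive case outranks a negative."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def youden_cutoff(scores, labels):
    """Cutoff maximizing sensitivity + specificity - 1 (Youden's J)."""
    best_c, best_j = None, -1.0
    for c in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= c and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < c and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < c and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= c and y == 0)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_c, best_j = c, j
    return best_c

tmb = [1.2, 2.0, 3.5, 4.8, 5.1, 6.0, 7.3, 9.9]  # mut/Mb, synthetic
rec = [0,   0,   0,   1,   0,   1,   1,   1]    # recurrence labels
```

On this toy data the AUC is 0.9375 and the Youden cutoff lands at 4.8 mut/Mb; real cohorts require confidence intervals and validation, as the abstract notes.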

6.
Ann Surg; 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38557793

ABSTRACT

OBJECTIVE: Assess cost and complication outcomes after liver transplantation (LT) using normothermic machine perfusion (NMP). SUMMARY BACKGROUND DATA: End-ischemic NMP is often used to aid logistics, yet its impact on outcomes after LT remains unclear, as does its true impact on the costs associated with transplantation. METHODS: Deceased donor liver recipients at two centers (1/1/2019-6/30/2023) were included. Retransplants, splits and combined grafts were excluded. End-ischemic NMP (OrganOx-Metra®) was implemented 10/2022 for extended-criteria DBDs, all DCDs and logistics. NMP cases were matched 1:2 with static cold storage (SCS) controls using the Balance-of-Risk score (DBD grafts) and UK-DCD Score (DCD grafts). RESULTS: Overall, 803 transplantations were included, 174 (21.7%) receiving NMP. Matching was achieved between 118 NMP-DBDs and 236 SCS controls, and between 37 NMP-DCDs and 74 corresponding SCS controls. For both graft types, median inpatient comprehensive complication index (CCI) values were comparable between groups. DCD-NMP grafts experienced reduced cumulative 90-day CCI (27.6 vs. 41.9, P=0.028). NMP also reduced the need for early relaparotomy and renal replacement therapy, with subsequently less frequent major complications (Clavien-Dindo >IVa). This effect was more pronounced in DCD transplants. NMP had no protective effect on early biliary complications. Organ acquisition/preservation costs were higher with NMP, yet NMP-treated grafts had lower 90-day pre-transplant costs in the context of shorter waiting-list times. Overall costs were comparable for both cohorts. CONCLUSIONS: This is the first risk-adjusted outcome and cost analysis comparing NMP and SCS. In addition to logistical benefits, NMP was associated with a reduction in relaparotomy and bleeding in DBD grafts, and in overall complications and post-LT renal replacement for DCDs. While organ acquisition/preservation was more costly with NMP, overall 90-day healthcare costs per transplantation were comparable.

7.
Liver Transpl; 2024 Jun 05.
Article in English | MEDLINE | ID: mdl-38833290

ABSTRACT

BACKGROUND: Ex-situ normothermic machine perfusion (NMP) helps increase the use of extended criteria donor livers. However, the impact of an NMP program on waitlist times and mortality has not been evaluated. METHODS: Adult patients listed for liver transplant (LT) at two academic centers from 1/1/2015-9/1/2023 were included (n=2773), allowing all patients >6 months of follow-up from listing. Routine NMP was implemented on 10/14/2022. Waitlist outcomes were compared across the pre-NMP, pre-acuity-circles era (n=1460), the pre-NMP era with acuity circles (n=842), and the NMP era (n=381). RESULTS: Median waitlist time was 79 days (IQR 20-232) at baseline, 49 days (7-182) with acuity circles, and 14 days (5-56) with NMP (p<0.001). The transplant rate improved from 61 per 100 person-years to 99 per 100 person-years with acuity circles, and to 194 per 100 person-years with NMP (p<0.001). Crude mortality without transplant decreased from 18.3% (n=268/1460), to 13.3% (n=112/842), to 6.3% (n=24/381) with NMP (p<0.001). The incidence of mortality without LT was 15 per 100 person-years before acuity circles, 19 per 100 person-years with acuity circles, and 9 per 100 person-years after NMP (p<0.001). Median MELD at LT was lowest with NMP, but MELD at listing was highest in this era (p<0.0001). Median DRI of transplanted livers was 1.54 (1.27-1.82) at baseline, 1.66 (1.42-2.16) with acuity circles, and 2.06 (1.63-2.46) with NMP (p<0.001). Six-month post-LT survival did not differ between eras (p=0.322). The total cost of healthcare while waitlisted was lowest in the NMP era ($53,683 vs. $32,687 vs. $23,688, p<0.001); cost per day did not differ between eras (p=0.152). CONCLUSION: Implementation of a routine NMP program was associated with reduced waitlist time and mortality, without compromising short-term survival after liver transplant, despite increased use of riskier grafts. Routine NMP use enables better waitlist management with reduced healthcare costs.
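The rate figures above are standard incidence arithmetic: events divided by accumulated person-years of waitlist follow-up, scaled to 100. A small sketch with invented follow-up totals:

```python
# Sketch of incidence-rate arithmetic (events per 100 person-years).
# The event count and person-years below are invented, chosen only
# so the example reproduces a rate of the same magnitude as the
# abstract's NMP-era figure.
def rate_per_100_person_years(events, person_years):
    return 100.0 * events / person_years

# e.g. 97 transplants accrued over 50 person-years of waitlist time:
r = rate_per_100_person_years(97, 50.0)
```

Shorter waitlist times shrink the person-years denominator, which is why the same number of transplants yields a much higher rate in the NMP era.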

8.
Liver Transpl; 2024 Jun 05.
Article in English | MEDLINE | ID: mdl-38833301

ABSTRACT

BACKGROUND: We describe a novel pre-liver-transplant (LT) approach in colorectal liver metastasis (CRLM), allowing for improved monitoring of tumor biology and reduction of disease burden before committing a patient to transplantation. METHODS: Patients undergoing LT for CRLM at Cleveland Clinic were included. The described protocol involves intensive locoregional therapy with systemic chemotherapy, aiming to reach minimal disease burden as revealed by PET scan and CEA. Patients with no detectable disease or irreversible treatment-induced liver injury undergo transplant. RESULTS: Nine patients received a liver transplant out of 27 who were evaluated (33.3%). Median follow-up was 700 days. Seven patients (77.8%) received a living donor LT. Five had no detectable disease and four had treatment-induced cirrhosis. Pre-transplant management included chemotherapy (n=9) with or without bevacizumab (n=6) and/or anti-EGFR therapy (n=6). The median number of pre-LT chemotherapy cycles was 16 (range 10-40). Liver-directed therapy included yttrium-90 (n=5), ablation (n=4), resection (n=4), and hepatic artery infusion (HAI) pump (n=3). Three patients recurred after LT. Actuarial 1- and 2-year recurrence-free survival (RFS) was 75% (n=6/8) and 60% (n=3/5), respectively. Recurrence occurred in the lungs (n=1), the liver graft (n=1), and the lungs plus paraaortic nodes (n=1). Patients with pre-LT detectable disease had reduced RFS (p=0.04). All patients with recurrence had histologically viable tumor in the liver explant. Patients treated in our protocol (n=16) demonstrated improved survival versus those who were not candidates (n=11), regardless of transplant status (p=0.01). CONCLUSION: A protocol defined by aggressive pre-transplant liver-directed treatment and transplant for patients with undetectable disease or treatment-induced liver injury may help prevent tumor recurrence.

9.
Hepatology; 78(3): 835-846, 2023 Sep 01.
Article in English | MEDLINE | ID: mdl-36988381

ABSTRACT

BACKGROUND AND AIMS: Acute cellular rejection (ACR) is a frequent complication after liver transplantation. By reducing ischemia and graft damage, dynamic preservation techniques may diminish ACR. We performed a systematic review to assess the effect of currently tested organ perfusion (OP) approaches versus static cold storage (SCS) on post-transplant ACR rates. APPROACH AND RESULTS: A systematic search of Medline, Embase, Cochrane Library, and Web of Science was conducted. Studies reporting ACR rates between OP and SCS and comprising at least 10 liver transplants performed with either hypothermic oxygenated perfusion (HOPE), normothermic machine perfusion, or normothermic regional perfusion were included. Studies with mixed perfusion approaches were excluded. Eight studies were identified (226 patients in OP and 330 in SCS). Six studies were on HOPE, one on normothermic machine perfusion, and one on normothermic regional perfusion. At meta-analysis, OP was associated with a reduction in ACR compared with SCS [OR: 0.55 (95% CI, 0.33-0.91), p=0.02]. This effect remained significant when considering HOPE alone [OR: 0.54 (95% CI, 0.29-1), p=0.05], in a subgroup analysis of studies including only grafts from donation after cardiac death [OR: 0.43 (0.20-0.91), p=0.03], and in HOPE studies with only donation after cardiac death grafts [OR: 0.37 (0.14-1), p=0.05]. CONCLUSIONS: Dynamic OP techniques are associated with a reduction in ACR after liver transplantation compared with SCS. PROSPERO registration: CRD42022348356.


Subjects
Liver Transplantation, Humans, Liver Transplantation/adverse effects, Liver Transplantation/methods, Organ Preservation/methods, Perfusion/methods, Graft Rejection/prevention & control, Death, Liver, Graft Survival
10.
Ann Surg Oncol; 31(2): 697-700, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37996635

ABSTRACT

Colorectal cancer is the second most common cause of cancer-related death worldwide, and half of patients present with colorectal liver metastasis (CRLM). Liver transplant (LT) has emerged as a treatment modality for otherwise unresectable CRLM. Since the publication of the Lebeck-Lee systematic review in 2022, additional evidence has come to light supporting LT for CRLM in highly selected patients. This includes reports of >10-year follow-up with over 80% survival rates in low-risk patients. As these updated reports have significantly changed our collective knowledge, this article is intended to serve as an update to the 2022 systematic review, incorporating the most recent evidence on the subject.


Subjects
Colorectal Neoplasms, Liver Neoplasms, Liver Transplantation, Humans, Antineoplastic Combined Chemotherapy Protocols, Colorectal Neoplasms/pathology, Hepatectomy, Liver Neoplasms/secondary, Systematic Reviews as Topic
11.
Am J Obstet Gynecol; 2024 Jun 30.
Article in English | MEDLINE | ID: mdl-38955323

ABSTRACT

BACKGROUND: Elagolix, an approved oral treatment for endometriosis-associated pain, has been associated with hypoestrogenic effects when used as monotherapy. Hormonal add-back therapy has the potential to mitigate these effects. OBJECTIVE: To evaluate efficacy, tolerability, and bone density outcomes of elagolix 200 mg twice daily with 1 mg estradiol/0.5 mg norethindrone acetate (add-back) therapy once daily compared with placebo in premenopausal women with moderate-to-severe endometriosis-associated pain. STUDY DESIGN: This ongoing, 48-month, phase 3 study consists of a 12-month, double-blind period, with 4:1:2 randomization to elagolix 200 mg twice daily with add-back therapy, elagolix 200 mg twice daily monotherapy for 6 months followed by elagolix with add-back therapy, or placebo. The co-primary endpoints were the proportions of patients with clinical improvement (termed "responders") in dysmenorrhea and nonmenstrual pelvic pain at month 6. We report 12-month results on the efficacy of elagolix with add-back therapy versus placebo in reducing dysmenorrhea, nonmenstrual pelvic pain, dyspareunia, and fatigue. Tolerability assessments include adverse events and change from baseline in bone mineral density. RESULTS: A total of 679 patients were randomized to elagolix with add-back therapy (n=389), elagolix monotherapy (n=97), or placebo (n=193). Compared with patients randomized to placebo, a significantly greater proportion of patients randomized to elagolix with add-back therapy responded with clinical improvement in dysmenorrhea (62.8% versus 23.7%; P≤.001) and nonmenstrual pelvic pain (51.3% versus 36.8%; P≤.001) at 6 months. Compared with placebo, elagolix with add-back therapy produced significantly greater improvement from baseline in 7 hierarchically ranked secondary endpoints, including dysmenorrhea (months 12, 6, 3), nonmenstrual pelvic pain (months 12, 6, 3), and fatigue (month 6) (all P<.01). Overall, the incidence of adverse events was 73.8% with elagolix plus add-back therapy and 66.8% with placebo. The rate of severe and serious adverse events did not meaningfully differ between treatment groups. Study drug discontinuations associated with adverse events were low in patients receiving elagolix with add-back therapy (12.6%) and in those receiving placebo (9.8%). Patients randomized to elagolix monotherapy exhibited decreases from baseline in bone mineral density of -2.43% (lumbar spine), -1.54% (total hip), and -1.78% (femoral neck) at month 6. When add-back therapy was added to elagolix at month 6, the change from baseline in bone mineral density remained in a similar range, -1.58% to -1.83%, at month 12. However, patients who received elagolix plus add-back therapy from baseline exhibited little change from baseline in bone mineral density (<1% change) at months 6 and 12. CONCLUSION: Compared with placebo, elagolix with add-back therapy resulted in significant, clinically meaningful improvement in dysmenorrhea, nonmenstrual pelvic pain, and fatigue at 6 months that continued through month 12 for both dysmenorrhea and nonmenstrual pelvic pain. Elagolix with add-back therapy was generally well tolerated. Loss of bone mineral density at 12 months was greater in patients who received elagolix with add-back therapy than in those who received placebo. However, the change in bone mineral density with elagolix plus add-back therapy was <1% and was attenuated compared with the bone loss observed with elagolix monotherapy.

12.
Surg Endosc; 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38951240

ABSTRACT

Obesity is a risk factor for kidney, liver, heart, and pulmonary diseases, as well as end-stage organ failure. Solid organ transplantation remains the definitive treatment for the end-stage presentation of these diseases. Among the many criteria for organ transplant, efficient management of obesity is required for patients to achieve transplant eligibility. End-stage organ failure and obesity are 2 complex pathologies that are often entwined. Metabolic and bariatric surgery before, during, or after organ transplant has been studied to determine the long-term effect of bariatric surgery on transplant outcomes. In this review, a multidisciplinary group of surgeons from the Society of American Gastrointestinal and Endoscopic Surgeons and the American Society for Transplant Surgery presents the current published literature on metabolic and bariatric surgery as a therapeutic option for patients with obesity awaiting solid organ transplantation. This manuscript details the most recent recommendations, pharmacologic considerations, and psychological considerations for this specific cohort of patients. Since level-one evidence is not available on many of the topics covered by this review, expert opinion was incorporated in several instances. Additional high-quality research in this area will allow for better recommendations and, therefore, treatment strategies for these complex patients.

13.
Graefes Arch Clin Exp Ophthalmol; 262(3): 753-758, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37847267

ABSTRACT

PURPOSE: To evaluate whether sodium-glucose co-transporter 2 (SGLT2) inhibitors affect progression of non-proliferative diabetic retinopathy (NPDR) compared to standard of care. METHODS: A retrospective cohort study compared subjects enrolled in a commercial and Medicare Advantage medical claims database who filled a prescription for an SGLT2 inhibitor between 2013 and 2020 to unexposed controls, matched up to a 1:3 ratio. Patients were excluded if they were enrolled for less than 2 years in the plan, had no prior ophthalmologic exam, had no diagnosis of NPDR, had a diagnosis of diabetic macular edema (DME) or proliferative diabetic retinopathy (PDR), had received treatment for vision-threatening diabetic retinopathy (VTDR), or were younger than 18 years. To balance covariates of interest between the cohorts, an inverse probability treatment weighting (IPTW) propensity score for SGLT2 inhibitor exposure was used. Multivariate Cox proportional hazard regression modeling was employed to assess the hazard ratio (HR) for VTDR, PDR, or DME relative to SGLT2 exposure. RESULTS: A total of 6,065 patients who initiated an SGLT2 inhibitor were matched to 12,890 controls. There were 734 (12%), 657 (10.8%), and 72 (1.18%) cases of VTDR, DME, and PDR, respectively, in the SGLT2 inhibitor cohort. By comparison, there were 1,479 (11.4%), 1,331 (10.3%), and 128 (0.99%) cases of VTDR, DME, and PDR, respectively, among controls. After IPTW, Cox regression analysis showed no difference in hazard for VTDR, PDR, or DME in the SGLT2 inhibitor-exposed cohort relative to the unexposed group [HR = 1.04, 95% CI 0.94 to 1.15 for VTDR; HR = 1.03, 95% CI 0.93 to 1.14 for DME; HR = 1.22, 95% CI 0.89 to 1.67 for PDR]. CONCLUSION: Exposure to SGLT2 inhibitor therapy was not associated with progression of NPDR compared to patients receiving other diabetic therapies.
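The inverse probability treatment weighting (IPTW) step described in the methods can be sketched in a few lines: each subject is weighted by the inverse of the probability of the exposure actually received. The propensity scores below are assumed values for illustration, not estimates from the claims data:

```python
# Hedged sketch of IPTW: treated subjects get weight 1/p, controls
# 1/(1-p), where p is the propensity score P(exposure | covariates).
# The scores here are invented; in the study they would come from a
# propensity model fit to the claims covariates.
def iptw_weights(treated, propensity):
    """Unstabilized IPTW weights for a binary exposure."""
    return [1.0 / p if t else 1.0 / (1.0 - p)
            for t, p in zip(treated, propensity)]

treated    = [1, 1, 0, 0, 0]                 # SGLT2 exposure indicator
propensity = [0.8, 0.4, 0.5, 0.2, 0.25]      # assumed propensity scores
w = iptw_weights(treated, propensity)
# roughly [1.25, 2.5, 2.0, 1.25, 1.33]
```

Weighting by `w` creates a pseudo-population in which measured covariates are balanced between exposed and unexposed groups, which is what allows the subsequent Cox model to be interpreted as covariate-adjusted.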


Subjects
Diabetes Mellitus, Diabetic Retinopathy, Macular Edema, Sodium-Glucose Transporter 2 Inhibitors, United States/epidemiology, Humans, Aged, Diabetic Retinopathy/diagnosis, Diabetic Retinopathy/drug therapy, Retrospective Studies, Sodium-Glucose Transporter 2, Macular Edema/diagnosis, Macular Edema/drug therapy, Macular Edema/etiology, Medicare
14.
Ann Vasc Surg; 99: 422-433, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37922958

ABSTRACT

BACKGROUND: The objective of the present effort was to use an international blunt thoracic aortic injury (BTAI) registry to create a prediction model identifying important preoperative and intraoperative factors associated with postoperative mortality, and to develop and validate a simple risk prediction tool that could assist with patient selection and risk stratification in this population. METHODS: All patients undergoing thoracic endovascular aortic repair (TEVAR) for BTAI and registered in the Aortic Trauma Foundation (ATF) database from January 2016 to June 2022 were identified. Patients undergoing medical management or open repair were excluded. The primary outcome was binary in-hospital all-cause mortality. Two predictive models were generated: a preoperative model (i.e., including only variables available before TEVAR, or intention-to-treat) and a full model (i.e., also including variables after TEVAR, or per-protocol). RESULTS: Of a total of 944 cases included in the ATF registry through June 2022, 448 underwent TEVAR and were included in the study population. TEVAR for BTAI was associated with an 8.5% in-hospital all-cause mortality in the ATF dataset. The study subjects were subsequently divided using 3:1 random sampling into a derivation cohort (336; 75.0%) and a validation cohort (112; 25.0%). The median age was 38 years, and the majority of patients were male (350; 78%). A total of 38 variables were included in the final analysis. Of these, 17 variables were considered in the preoperative model, 9 variables were integrated in the full model, and 12 variables were excluded owing to either extremely low variance or strong correlation with other variables. The calibration graphs showed that both models from the ATF dataset tended to underestimate risk, mainly in intermediate-risk cases. The discriminative capacity was moderate in all models; the best-performing model was the full model from the ATF dataset, as evident from both the receiver operating characteristic curve (area under the curve 0.84; 95% CI 0.74-0.91) and the density graph. CONCLUSIONS: In this study, we developed and validated a contemporary risk prediction model that incorporates several preoperative and postoperative variables and is strongly predictive of early mortality. While this model can reasonably predict in-hospital all-cause mortality, thereby assisting physicians with risk stratification and informing patients and their caregivers, its intrinsic limitations must be taken into account, and it should be considered only an adjunctive tool that may complement clinical judgment and shared decision-making.
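The 3:1 random sampling into derivation and validation cohorts can be sketched as below, using synthetic case IDs and a seeded shuffle (the registry's actual sampling procedure is not specified beyond the ratio):

```python
# Sketch of a 3:1 derivation/validation split with a seeded shuffle.
# The cohort is a list of synthetic IDs; only the 448-case count is
# taken from the abstract.
import random

def split_3_to_1(ids, seed=0):
    """Shuffle reproducibly, then cut at 75% for the derivation set."""
    rng = random.Random(seed)
    shuffled = ids[:]
    rng.shuffle(shuffled)
    cut = int(round(0.75 * len(shuffled)))
    return shuffled[:cut], shuffled[cut:]

cohort = list(range(448))  # 448 TEVAR cases, as in the abstract
derivation, validation = split_3_to_1(cohort)
# len(derivation) == 336, len(validation) == 112
```

Seeding the shuffle makes the split reproducible, which matters when a model tuned on the derivation cohort is later scored once on the held-out validation cohort.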


Subjects
Aortic Diseases, Endovascular Procedures, Thoracic Injuries, Vascular System Injuries, Wounds, Nonpenetrating, Humans, Male, Female, Adult, Endovascular Aneurysm Repair, Aorta, Thoracic/diagnostic imaging, Aorta, Thoracic/surgery, Aorta, Thoracic/injuries, Hospital Mortality, Risk Factors, Treatment Outcome, Time Factors, Aortic Diseases/surgery, Wounds, Nonpenetrating/diagnostic imaging, Wounds, Nonpenetrating/surgery, Thoracic Injuries/surgery, Vascular System Injuries/diagnostic imaging, Vascular System Injuries/surgery, Retrospective Studies
15.
Curr Opin Organ Transplant; 29(4): 228-238, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-38726745

ABSTRACT

PURPOSE OF REVIEW: Machine perfusion has been adopted into clinical practice in Europe since the mid-2010s and, more recently, in the United States (US) following approval of normothermic machine perfusion (NMP). We aim to review recent advances, discuss potential future directions, and summarize challenges currently facing the field. RECENT FINDINGS: Both NMP and hypothermic oxygenated perfusion (HOPE) improve overall outcomes after liver transplantation versus traditional static cold storage (SCS) and offer improved logistical flexibility. HOPE offers additional protection to the biliary system stemming from its protection of mitochondria and lessening of ischemia-reperfusion injury. Normothermic regional perfusion (NRP) is touted to offer similar protective effects on the biliary system, though this has not been studied prospectively. The most critical question remaining is the optimal use case for each of the three techniques (NMP, HOPE, and NRP), particularly as HOPE and NRP become more available in the US. There are additional questions regarding the most effective criteria for viability assessment and the true economic impact of these techniques. Finally, with each technique purported to allow well-tolerated use of riskier grafts, there is an urgent need to define terminology for graft risk, as baseline population differences make comparison of current data challenging. SUMMARY: Machine perfusion is now widely available in all western countries and has become an essential tool in liver transplantation. Identification of the ideal technique for each graft, optimization of viability assessment, cost-effectiveness analyses, and proper definition of graft risk are the next steps to maximizing the utility of these powerful tools.


Subjects
Graft Survival, Liver Transplantation, Organ Preservation, Perfusion, Humans, Liver Transplantation/adverse effects, Liver Transplantation/methods, Liver Transplantation/trends, Perfusion/methods, Perfusion/adverse effects, Perfusion/trends, Perfusion/instrumentation, Organ Preservation/methods, Organ Preservation/trends, Organ Preservation/adverse effects, Reperfusion Injury/prevention & control, Reperfusion Injury/etiology, Treatment Outcome, Risk Factors, Cold Ischemia/adverse effects, Animals
16.
Ann Surg ; 278(4): 479-488, 2023 10 01.
Article in English | MEDLINE | ID: mdl-37436876

ABSTRACT

OBJECTIVE: Evaluate the outcome of a left-lobe graft (LLG) first approach combined with purely laparoscopic donor hemihepatectomy (PLDH) as a strategy to minimize donor risk. BACKGROUND: An LLG-first approach and PLDH are 2 methods used to reduce surgical stress for donors in adult living donor liver transplantation (LDLT). But the risk associated with applying an LLG-first approach combined with PLDH is not known. METHODS: From 2012 to 2023, 186 adult LDLTs were performed with hemiliver grafts, procured by open surgery in 95 and PLDH in 91 cases. LLGs were considered first when graft-to-recipient weight ratio ≥0.6%. Following a 4-month adoption process, all donor hepatectomies, since December 2019, were performed laparoscopically. RESULTS: There was one intraoperative conversion to open (1%). Mean operative times were similar in laparoscopic and open cases (366 vs 371 minutes). PLDH provided shorter hospital stays, lower blood loss, and lower peak aspartate aminotransferase. Peak bilirubin was lower in LLG donors compared with right-lobe graft donors (1.4 vs 2.4 mg/dL, P < 0.01), and PLDH further improved the bilirubin levels in LLG donors (1.2 vs 1.6 mg/dL, P < 0.01). PLDH also afforded a low rate of early complications (Clavien-Dindo grade ≥ II, 8% vs 22%, P = 0.007) and late complications, including incisional hernia (0% vs 13.7%, P < 0.001), compared with open cases. LLG was more likely to have a single duct than a right-lobe graft (89% vs 60%, P < 0.01). Importantly, with the aggressive use of LLG in 47% of adult LDLT, favorable graft survival was achieved without any differences between the type of graft and surgical approach. CONCLUSIONS: The LLG-first with PLDH approach minimizes surgical stress for donors in adult LDLT without compromising recipient outcomes. This strategy can lighten the burden for living donors, which could help expand the donor pool.


Subjects
Laparoscopy , Liver Transplantation , Adult , Humans , Liver Transplantation/methods , Living Donors , Liver/surgery , Hepatectomy/methods , Bilirubin , Treatment Outcome
17.
Ann Surg ; 277(3): 520-527, 2023 03 01.
Article in English | MEDLINE | ID: mdl-34334632

ABSTRACT

OBJECTIVE: To determine if risk-adjusted survival of patients with CDH has improved over the last 25 years within centers that are long-term, consistent participants in the CDH Study Group (CDHSG). SUMMARY BACKGROUND DATA: The CDHSG is a multicenter collaboration focused on evaluation of infants with CDH. Despite advances in pediatric surgical and intensive care, CDH mortality has appeared to plateau. Herein, we studied CDH mortality rates amongst long-term contributors to the CDHSG. METHODS: We divided registry data into 5-year intervals, with Era 1 (E1) beginning in 1995, and analyzed multiple variables (operative strategy, defect size, and mortality) to assess evolution of disease characteristics and severity over time. For mortality analyses, patients were risk stratified using a validated prediction score based on 5-minute Apgar (Apgar5) and birth weight. A risk-adjusted, observed to expected (O:E) mortality model was created using E1 as a reference. RESULTS: 5203 patients from 23 centers with >22 years of participation were included. Birth weight, Apgar5, diaphragmatic agenesis, and repair rate were unchanged over time (all P > 0.05). In E5 compared to E1, minimally invasive and patch repair were more prevalent, and timing of diaphragmatic repair was later (all P < 0.01). Overall mortality decreased over time: E1 (30.7%), E2 (30.3%), E3 (28.7%), E4 (26.0%), E5 (25.8%) ( P = 0.03). Risk-adjusted mortality showed a significant improvement in E5 compared to E1 (OR 0.78, 95% CI 0.62-0.98; P = 0.03). O:E mortality improved over time, with the greatest improvement in E5. CONCLUSIONS: Risk-adjusted and observed-to-expected CDH mortality have improved over time.
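An observed-to-expected (O:E) mortality ratio compares actual deaths against the deaths a baseline risk model predicted; a value below 1 means survival was better than the reference era's model expected. A minimal sketch of that calculation with hypothetical per-patient risk predictions (not the CDHSG's actual model):

```python
def oe_ratio(observed_deaths, predicted_risks):
    """O:E ratio: actual deaths divided by the deaths the risk
    model expected (the sum of per-patient death probabilities)."""
    expected = sum(predicted_risks)
    return observed_deaths / expected

# Hypothetical era: 100 patients, each with a 30% predicted risk,
# but only 26 observed deaths -> O:E < 1 (risk-adjusted improvement).
risks = [0.30] * 100
print(round(oe_ratio(26, risks), 3))  # 0.867
```

The per-patient risks would in practice come from a prediction score (here, one based on Apgar5 and birth weight) fitted on the reference era.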


Subjects
Hernias, Diaphragmatic, Congenital , Infant , Child , Humans , Hernias, Diaphragmatic, Congenital/surgery , Birth Weight , Registries
18.
Ann Surg ; 2023 Dec 05.
Article in English | MEDLINE | ID: mdl-38050733

ABSTRACT

OBJECTIVE: We aim to report our institutional outcomes of single-staged combined liver transplantation (LT) and cardiac surgery (CS). SUMMARY BACKGROUND DATA: Concurrent LT and CS is a potential treatment for combined cardiac dysfunction and end-stage liver disease, yet only 54 cases have been previously reported in the literature. Thus, the outcomes of this approach are relatively unknown, and this approach has been previously regarded as extremely risky. METHODS: Thirty-one patients at our institution underwent combined cardiac surgery and liver transplant. Patients with at least one-year follow-up were included. The Leave-One-Out Cross-Validation (LOOCV) machine-learning approach was used to generate a model for mortality. RESULTS: Median follow-up was 8.2 years (IQR 4.6-13.6 y). One- and five-year survival was 74.2% (N=23) and 55% (N=17), respectively. Negative predictive factors of survival included recipient age >60 years (P=0.036), NASH-cirrhosis (P=0.031), coronary artery bypass graft (CABG)-based CS (P=0.046), and pre-operative renal dysfunction (P=0.024). The final model demonstrated that renal dysfunction had a relative weighted impact of 3.2 versus CABG (1.7), age ≥60 y (1.7), or NASH (1.3). Elevated LT+CS risk score was associated with an increased five-year mortality after surgery (AUC=0.731, P<0.001). Conversely, the widely accepted STS-PROM calculator was unable to successfully stratify patients according to 1- (P>0.99) or 5-year (P=0.695) survival rates. CONCLUSIONS: This is the largest series describing combined LT+CS, with joint surgical management appearing feasible in highly selected patients. CABG and pre-operative renal dysfunction are important negative predictors of mortality. The four-variable LT+CS score may help predict patients at high risk for post-operative mortality.
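Leave-one-out cross-validation fits the model n times, each time holding out one patient, training on the remaining n−1, and predicting the held-out outcome; the pooled held-out predictions estimate out-of-sample performance, which matters in a series of only 31 patients. A generic sketch with a hypothetical threshold classifier (not the authors' four-variable model):

```python
def loocv_accuracy(data, fit, predict):
    """Generic leave-one-out CV: hold out each sample, fit on the
    rest, and score the prediction for the held-out sample."""
    correct = 0
    for i in range(len(data)):
        train = data[:i] + data[i + 1:]
        model = fit(train)
        x, y = data[i]
        correct += predict(model, x) == y
    return correct / len(data)

# Hypothetical one-feature mortality data: (risk_score, died).
data = [(0.2, 0), (0.3, 0), (0.4, 0), (0.7, 1), (0.8, 1), (0.9, 1)]

def fit(train):
    # Toy model: threshold at the midpoint of the two class means.
    died = [x for x, y in train if y == 1]
    lived = [x for x, y in train if y == 0]
    return (sum(died) / len(died) + sum(lived) / len(lived)) / 2

def predict(threshold, x):
    return int(x >= threshold)

print(loocv_accuracy(data, fit, predict))  # 1.0 on this separable toy data
```

Because each fold's model never sees the patient it is scored on, LOOCV gives a nearly unbiased (if high-variance) performance estimate for very small cohorts.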

19.
Liver Transpl ; 29(3): 279-289, 2023 03 01.
Article in English | MEDLINE | ID: mdl-36811877

ABSTRACT

The utilization of split liver grafts can increase access to liver transplantation (LT) for adult patients, particularly when liver grafts are shared between 2 adult recipients. However, it is yet to be determined whether split liver transplantation (SLT) increases the risk of biliary complications (BCs) compared with whole liver transplantation (WLT) in adult recipients. This retrospective study enrolled 1441 adult patients who underwent deceased donor LT at a single site between January 2004 and June 2018. Of those, 73 patients underwent SLTs. Graft type for SLT includes 27 right trisegment grafts, 16 left lobes, and 30 right lobes. A propensity score matching analysis selected 97 WLTs and 60 SLTs. Biliary leakage was more frequently seen in SLTs (13.3% vs. 0%; p <0.001), whereas the frequency of biliary anastomotic stricture was comparable between SLTs and WLTs (11.7% vs. 9.3%; p=0.63). Graft and patient survival rates of patients undergoing SLTs were comparable to those undergoing WLTs (p=0.42 and 0.57, respectively). In the analysis of the entire SLT cohort, BCs were seen in 15 patients (20.5%) including biliary leakage in 11 patients (15.1%) and biliary anastomotic stricture in 8 patients (11.0%) [both in 4 patients (5.5%)]. The survival rates of recipients who developed BCs were significantly inferior to those without BCs (p <0.01). By multivariate analysis, split grafts without a common bile duct increased the risk of BCs. In conclusion, SLT increases the risk of biliary leakage compared with WLT. Biliary leakage can still lead to fatal infection and thus should be managed appropriately in SLT.
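Propensity score matching pairs each SLT recipient with the WLT recipient whose estimated probability of receiving a split graft is closest, so that outcomes are compared between similar patients. A minimal greedy nearest-neighbor sketch over hypothetical, precomputed scores (the study's actual covariates, model, and matching ratio are not given in the abstract):

```python
def greedy_match(treated, controls, caliper=0.05):
    """Match each treated score to the nearest unused control score
    within the caliper; returns (treated, control) score pairs."""
    available = dict(enumerate(controls))  # index -> score, unused controls
    pairs = []
    for t in treated:
        if not available:
            break
        j, c = min(available.items(), key=lambda kv: abs(kv[1] - t))
        if abs(c - t) <= caliper:
            pairs.append((t, c))
            del available[j]  # each control is used at most once
    return pairs

slt = [0.31, 0.47, 0.90]        # hypothetical SLT propensity scores
wlt = [0.30, 0.33, 0.50, 0.10]  # hypothetical WLT propensity scores
print(greedy_match(slt, wlt))   # [(0.31, 0.3), (0.47, 0.5)]; 0.90 has no match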


Subjects
Biliary Tract Diseases , Liver Transplantation , Adult , Humans , Retrospective Studies , Matched-Pair Analysis , Constriction, Pathologic , Treatment Outcome , Graft Survival
20.
Glob Chang Biol ; 29(7): 1870-1889, 2023 04.
Article in English | MEDLINE | ID: mdl-36647630

ABSTRACT

Arctic-boreal landscapes are experiencing profound warming, along with changes in ecosystem moisture status and disturbance from fire. This region is of global importance in terms of carbon feedbacks to climate, yet the sign (sink or source) and magnitude of the Arctic-boreal carbon budget within recent years remains highly uncertain. Here, we provide new estimates of recent (2003-2015) vegetation gross primary productivity (GPP), ecosystem respiration (Reco), net ecosystem CO2 exchange (NEE; Reco - GPP), and terrestrial methane (CH4) emissions for the Arctic-boreal zone using a satellite data-driven process-model for northern ecosystems (TCFM-Arctic), calibrated and evaluated using measurements from >60 tower eddy covariance (EC) sites. We used TCFM-Arctic to obtain daily 1-km2 flux estimates and annual carbon budgets for the pan-Arctic-boreal region. Across the domain, the model indicated an overall average NEE sink of -850 Tg CO2-C year-1. Eurasian boreal zones, especially those in Siberia, contributed to a majority of the net sink. In contrast, the tundra biome was relatively carbon neutral (ranging from small sink to source). Regional CH4 emissions from tundra and boreal wetlands (not accounting for aquatic CH4) were estimated at 35 Tg CH4-C year-1. Accounting for additional emissions from open water aquatic bodies and from fire, using available estimates from the literature, reduced the total regional NEE sink by 21% and shifted many far northern tundra landscapes, and some boreal forests, to a net carbon source. This assessment, based on in situ observations and models, improves our understanding of the high-latitude carbon status and also indicates a continued need for integrated site-to-regional assessments to monitor the vulnerability of these ecosystems to climate change.
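The abstract's carbon bookkeeping follows NEE = Reco − GPP, with negative values denoting a net sink, and the literature-based aquatic and fire emissions then eating into that sink. A minimal sketch of the arithmetic; the component GPP and Reco totals below are hypothetical values chosen only to reproduce the reported net figure, since the abstract does not give them:

```python
def nee(reco, gpp):
    """Net ecosystem exchange (Tg C/yr): ecosystem respiration minus
    gross primary productivity; negative = net CO2 uptake (sink)."""
    return reco - gpp

# Hypothetical component fluxes reproducing the reported pan-Arctic-boreal
# net sink of -850 Tg CO2-C per year.
print(nee(reco=20_000, gpp=20_850))      # -850

# Adding literature estimates of open-water aquatic and fire emissions
# reduced the total regional NEE sink by 21%.
print(round(-850 * (1 - 0.21), 1))       # -671.5 Tg CO2-C per year
```

The sign convention (negative NEE = sink) matches the abstract's statement that extra emission terms shift some tundra landscapes from sink to source.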


Subjects
Ecosystem , Taiga , Carbon , Carbon Dioxide , Tundra , Methane , Carbon Cycle