ABSTRACT
Decreasing the graft size in living donor liver transplantation (LDLT) increases the risk of early allograft dysfunction. A graft-to-recipient weight ratio (GRWR) of 0.8 is considered the threshold. There is evidence that smaller-volume grafts may also provide equally good outcomes, but the lower cut-off remains unknown. In this retrospective multicenter study, 92 adult LDLTs with a final GRWR ≤0.6 performed at 12 international liver transplant centers over a 3-year period were included. Perioperative data, including preoperative status, portal flow hemodynamics (PFH) and portal flow modulation, development of small-for-size syndrome (SFSS), morbidity, and mortality, were collated and analyzed. Thirty-two (36.7%) patients developed SFSS, which was associated with increased 30-day, 90-day, and 1-year mortality. The preoperative model for end-stage liver disease (MELD) score and inpatient status were independent predictors of SFSS (P < .05). Pre-liver transplant renal dysfunction was an independent predictor of survival (hazard ratio 3.1; 95% confidence interval 1.1-8.9; P = .035). Neither PFH nor portal flow modulation was predictive of SFSS or survival. We report the largest multicenter study to date of LDLT outcomes using ultralow-GRWR grafts and, for the first time, validate the International Liver Transplantation Society-International Living Donor Liver Transplantation Study Group-Liver Transplantation Society of India consensus definition and grading of SFSS. Preoperative recipient condition, rather than GRWR or PFH, independently predicted SFSS. Algorithms to predict SFSS and LT outcomes should incorporate recipient factors along with GRWR.
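The GRWR cut-offs above are simple ratios of graft weight to recipient body weight; a minimal sketch of the arithmetic, with an invented helper name and illustrative weights (not study data):

```python
def grwr(graft_weight_g: float, recipient_weight_kg: float) -> float:
    """Graft-to-recipient weight ratio (%): graft weight (kg) / body weight (kg) x 100."""
    return (graft_weight_g / 1000.0) / recipient_weight_kg * 100.0

# A 420 g graft in a 70 kg recipient sits exactly at the ultralow
# cut-off studied here (0.6); the conventional threshold is 0.8.
print(round(grwr(420, 70), 2))  # 0.6
```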
ABSTRACT
Liver transplantation is often the only lifesaving option for acute liver failure (ALF); however, the predictors of short-term mortality (death within 1 year) after living donor liver transplantation (LDLT) for ALF have yet to be defined. We retrospectively collected data on patients ≥18 years old who underwent LDLT for ALF between 2010 and 2020 at 35 centers in Asia. Univariate and multivariate logistic regression analyses were conducted to identify the clinical variables related to short-term mortality and establish a novel scoring system. The Kaplan-Meier method was used to explore the association between the score and overall survival. Of the 339 recipients, 46 (13.6%) died within 1 year after LDLT. Multivariate analyses revealed 4 independent risk factors for death: use of vasopressors, mechanical ventilation, a higher model for end-stage liver disease (MELD) score, and a lower graft-to-recipient weight ratio. The internally validated c-statistic of the short-term mortality after transplant (SMT) score derived from these 4 variables was 0.80 (95% confidence interval: 0.74-0.87). The SMT score successfully stratified recipients into low-, intermediate-, and high-risk groups with 1-year overall survival rates of 96%, 80%, and 50%, respectively. In conclusion, our novel SMT score based on these 4 predictors will guide ALF recipient and living donor selection.
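As an illustration of how a 4-variable additive score stratifies recipients into risk groups, here is a hypothetical sketch; the point weights and cut-offs below are invented for demonstration and are NOT the published SMT score coefficients:

```python
# Hypothetical additive risk score over the 4 reported predictors;
# weights and thresholds are invented, not the published SMT score.
def smt_like_score(vasopressor: bool, ventilated: bool,
                   meld: float, grwr: float) -> int:
    score = 0
    score += 2 if vasopressor else 0
    score += 2 if ventilated else 0
    score += 1 if meld >= 30 else 0   # higher MELD -> higher risk
    score += 1 if grwr < 0.8 else 0   # lower GRWR -> higher risk
    return score

def risk_group(score: int) -> str:
    if score <= 1:
        return "low"          # the study's low group had ~96% 1-year survival
    if score <= 3:
        return "intermediate"
    return "high"

print(risk_group(smt_like_score(False, False, 25, 1.0)))  # low
print(risk_group(smt_like_score(True, True, 35, 0.6)))    # high
```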
Subject(s)
Graft Survival; Liver Failure, Acute; Liver Transplantation; Living Donors; Humans; Liver Transplantation/mortality; Male; Female; Retrospective Studies; Liver Failure, Acute/surgery; Liver Failure, Acute/mortality; Adult; Risk Factors; Middle Aged; Survival Rate; Follow-Up Studies; Prognosis; Postoperative Complications/mortality
ABSTRACT
In living donor liver transplantation, biliary complications, including bile leaks and biliary anastomotic strictures, remain significant challenges, with incidences varying across centers. This multicenter retrospective study (2016-2020) included 3633 adult patients from 18 centers and aimed to identify risk factors for these biliary complications and their impact on patient survival. Incidences of bile leaks and biliary strictures were 11.4% and 20.6%, respectively. Key risk factors for bile leaks included multiple bile duct anastomoses (odds ratio [OR], 1.8), Roux-en-Y hepaticojejunostomy (OR, 1.4), and a history of major abdominal surgery (OR, 1.4). For biliary anastomotic strictures, risk factors were ABO incompatibility (OR, 1.4), blood loss >1 L (OR, 1.4), and previous abdominal surgery (OR, 1.7). Patients experiencing biliary complications had extended hospital stays, a higher incidence of major complications, and higher comprehensive complication index scores. The impact on graft survival became evident after accounting for immortal time bias using time-dependent covariate survival analysis: bile leaks and biliary anastomotic strictures were associated with adjusted hazard ratios of 1.7 and 1.8 for graft survival, respectively. The study underscores the importance of minimizing these risks through careful donor selection and preoperative planning, as biliary complications significantly affect graft survival despite the availability of effective treatments.
Subject(s)
Graft Survival; Liver Transplantation; Living Donors; Postoperative Complications; Humans; Liver Transplantation/adverse effects; Male; Female; Retrospective Studies; Middle Aged; Adult; Risk Factors; Postoperative Complications/etiology; Follow-Up Studies; Prognosis; Anastomotic Leak/etiology; Biliary Tract Diseases/etiology; Incidence; Survival Rate
ABSTRACT
BACKGROUND: Post-transplant HCC recurrence significantly impacts survival, yet its management is challenging due to limited evidence. With recent advancements in HCC treatment, updated data on managing recurrent disease are needed. METHODS: In this retrospective study across six centers (2000-2022), we employed Cox proportional-hazards regression and log-rank tests to assess survival differences. A prognostic score model was developed to categorize patient survival. Efficacy of tyrosine kinase inhibitors was evaluated through propensity score matching. RESULTS: In our study, 431 of 3349 (14%) transplanted HCC patients developed recurrence within a median interval of 18 (IQR: 9-32) months. Of these, 147 (34%) underwent curative-intent treatments, 207 (48%) received palliative treatments, and 77 (18%) were given best supportive care. Patients undergoing curative-intent treatments had better survival from the time of recurrence, with a median survival of 45 (95% CI: 36-63) months and 1-/3-/5-year survival of 90%/56%/43%, compared with those receiving non-curative treatments (median: 11 (95% CI: 10-13) months; 1-/3-/5-year survival of 46%/10%/7%; log-rank p<0.001). Patients with recurrence diagnosed in 2018-2022 showed improved survival over the previous era (HR 0.64 (95% CI: 0.47-0.86)). Multivariable analysis identified 5 prognostic factors: ineligibility for curative-intent treatment (HR 3.5 (95% CI: 2.7-4.6)), recurrence within 1 year (HR 1.7 (95% CI: 1.3-2.1)), poor tumor differentiation (HR 1.5 (95% CI: 1.1-1.9)), RETREAT score ≥3 (HR 1.4 (95% CI: 1.1-1.8)), and AFP at recurrence ≥400 ng/mL (HR 1.4 (95% CI: 1.1-1.9)). These factors contributed to a prognostic scoring system (0-9) that stratified patients into three prognosis groups. Both propensity score-matched analysis and multivariable regression indicated that lenvatinib was not statistically superior to sorafenib in terms of efficacy.
CONCLUSION: Curative-intent treatments should be advocated for patients with post-transplant recurrence whenever possible. Prognostic factors linked to aggressive tumor biology significantly influence survival. Advancements in HCC management have improved survival outcomes over the past five years.
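The tyrosine kinase inhibitor comparison above relies on propensity score matching; a minimal sketch of the 1:1 nearest-neighbor matching step follows (the propensity scores themselves would come from a fitted logistic model; the IDs, scores, and caliper below are invented):

```python
# Minimal 1:1 nearest-neighbor propensity-score matching without
# replacement, with an optional caliper; sketches the matching step only.
def match_pairs(treated, control, caliper=0.1):
    """treated/control: lists of (patient_id, propensity_score) tuples."""
    available = dict(control)                     # id -> score, still unmatched
    pairs = []
    for t_id, t_ps in sorted(treated, key=lambda x: x[1]):
        if not available:
            break
        # closest unmatched control by propensity score
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]                   # match without replacement
    return pairs

pairs = match_pairs([("T1", 0.30), ("T2", 0.70)],
                    [("C1", 0.32), ("C2", 0.55), ("C3", 0.71)])
print(pairs)  # [('T1', 'C1'), ('T2', 'C3')]
```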
ABSTRACT
BACKGROUND AND AIM: Limited data are available regarding pre-liver transplantation (LT) bacteremia in adults with end-stage liver disease. In this study, we investigated the risk factors independently associated with pre-LT bacteremia and their effects on clinical outcomes of LT. METHODS: This retrospective study performed between 2010 and 2021 included 1287 LT recipients. The study population was categorized into patients with pre-LT bacteremia and those without pre-LT infection. Pre-LT bacteremia was defined as bacteremia detected within 90 days before LT. RESULTS: Among 1287 LT recipients, 92 (7.1%) developed pre-LT bacteremia. The mean interval between bacteremia and LT was 28.3 ± 19.5 days. Of these 92 patients, seven (7.6%) died after LT. Of the 99 microorganisms isolated in this study, Gram-negative bacteria were the most common (72.7%). Bacteremia was mainly attributed to spontaneous bacterial peritonitis. The most common pathogen isolated was Escherichia coli (25.2%), followed by Klebsiella pneumoniae (18.2%) and Staphylococcus aureus (15.1%). Multivariate analysis showed that massive ascites (adjusted odds ratio [OR] 1.67, 95% confidence interval [CI] 1.048-2.687) and a prolonged international normalized ratio for prothrombin time (adjusted OR 1.13, 95% CI 1.074-1.257) were independent risk factors for pre-LT bacteremia in patients with end-stage liver disease. Intensive care unit and in-hospital stays were significantly longer, and in-hospital mortality was significantly higher, among LT recipients with pre-LT bacteremia than among those without pre-LT infection. CONCLUSIONS: This study highlights predictors of pre-LT bacteremia in patients with end-stage liver disease. Pre-LT bacteremia increases the post-transplantation mortality risk.
Subject(s)
Bacteremia; End Stage Liver Disease; Liver Transplantation; Adult; Humans; Liver Transplantation/adverse effects; Retrospective Studies; End Stage Liver Disease/complications; End Stage Liver Disease/surgery; Risk Factors; Bacteremia/epidemiology
ABSTRACT
BACKGROUND: Although the outcomes of living donor liver transplantation (LDLT) for pediatric acute liver failure (PALF) have improved, patient survival remains lower than in patients with chronic liver disease. We investigated whether the poor outcomes of LDLT for PALF persisted in the contemporary transplant era. METHODS: We analyzed 193 patients who underwent LDLT between December 2000 and December 2020. The outcomes of patients managed in 2000-2010 (era 1) and 2011-2020 (era 2) were compared. RESULTS: The median age at the time of LDLT was 1.2 years in both eras. An unknown etiology was the major cause in both groups. Patients in era 1 were more likely to have surgical complications, including hepatic artery and biliary complications (p = 0.001 and p = 0.013, respectively). The era had no impact on the infection rate after LDLT (cytomegalovirus, Epstein-Barr virus, and sepsis). The patient and graft mortality rates in era 1 were significantly higher (p = 0.03 and p = 0.047, respectively). The 1- and 5-year survival rates were 76.4% and 70.9%, respectively, in era 1, versus 88.3% and 81.9% in era 2 (p = 0.042). Rejection was the most common cause of graft loss in both groups. In the multivariate analysis, sepsis during the 30 days after LDLT was independently associated with graft loss (p = 0.002). CONCLUSIONS: The survival of patients with PALF has improved in the contemporary transplant era. The early detection and proper management of rejection, while remaining vigilant for sepsis, should be recommended to improve outcomes further.
Subject(s)
Liver Failure, Acute; Liver Transplantation; Living Donors; Postoperative Complications; Humans; Male; Female; Retrospective Studies; Infant; Child, Preschool; Liver Failure, Acute/surgery; Child; Postoperative Complications/epidemiology; Treatment Outcome; Graft Survival; Survival Rate; Adolescent
ABSTRACT
PURPOSE: Current guidelines recommend bone mineral density (BMD) testing after fragility fractures in patients aged 50 years or older. This study aimed to assess BMD testing and subsequent fragility fractures after low-energy distal radius fractures (DRFs) among patients aged 50-59 years. METHODS: We used the 2010-2020 MarketScan dataset to identify patients aged 50-59 years with initial DRFs. We assessed the 1-year BMD testing rate and the 3-year non-DRF fragility fracture rate. We created Kaplan-Meier plots to depict fragility fracture-free probabilities over time and used log-rank tests to compare the Kaplan-Meier curves. RESULTS: Among 78,389 patients aged 50-59 years with DRFs, 24,589 met our inclusion criteria, and most were women (n = 17,580, 71.5%). The BMD testing rate within 1 year after the initial DRF was 12.7% (95% CI, 12.3% to 13.2%). The 1-year BMD testing rates for the age groups of 50-54 and 55-59 years were 10.4% (95% CI, 9.9% to 11.0%) and 14.9% (95% CI, 14.2% to 15.6%), respectively. Only 1.8% (95% CI, 1.5% to 2.1%) of men, compared with 17.1% (95% CI, 16.5% to 17.7%) of women, underwent BMD testing within 1 year after the initial fracture. The overall 3-year fragility fracture rate was 6.0% (95% CI, 5.6% to 6.3%). The subsequent fragility fracture rate was lower for those with any BMD testing (4.4%; 95% CI, 3.7% to 5.2%) than for those without BMD testing (6.2%; 95% CI, 5.9% to 6.6%; P < .05). CONCLUSIONS: We report a low BMD testing rate for patients aged 50-59 years after initial isolated DRFs, especially among men and patients aged 50-54 years. Patients who received BMD testing had a lower rate of subsequent fracture within 3 years. We recommend that providers follow published guidelines and initiate an osteoporosis work-up for patients with low-energy DRFs to ensure early diagnosis.
This provides an opportunity to initiate treatment that may prevent subsequent fractures. TYPE OF STUDY/LEVEL OF EVIDENCE: Prognosis II.
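The fracture-free probabilities above come from Kaplan-Meier estimation; a plain-Python sketch of the product-limit estimator on invented follow-up data (times in months; event = 1 for a subsequent fracture, 0 for censoring):

```python
# Product-limit (Kaplan-Meier) estimate of event-free probability.
def kaplan_meier(times, events):
    data = sorted(zip(times, events))            # order by follow-up time
    n = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < n:
        t = data[i][0]
        d = sum(1 for tt, ee in data if tt == t and ee == 1)  # events at t
        at_risk = n - i                           # still under observation at t
        if d:
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        i += sum(1 for tt, _ in data if tt == t)  # advance past all ties at t
    return curve

# Invented cohort: fractures at 3, 5, and 8 months; censoring at 5 and 12.
curve = kaplan_meier([3, 5, 5, 8, 12, 12], [1, 1, 0, 1, 0, 0])
print(curve)  # [(3, 0.8333...), (5, 0.6666...), (8, 0.4444...)]
```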
Subject(s)
Fractures, Bone; Osteoporosis; Osteoporotic Fractures; Radius Fractures; Wrist Fractures; United States/epidemiology; Male; Humans; Aged; Female; Middle Aged; Bone Density; Radius Fractures/diagnostic imaging; Radius Fractures/therapy; Medicare; Osteoporosis/complications; Osteoporosis/diagnosis; Osteoporotic Fractures/prevention & control
ABSTRACT
Since the first published report of experimental kidney transplantation in dogs in 1902, many experimental and clinical trials of organ transplantation have been performed, at the cost of many failures. After the establishment of the surgical technique and the discovery of immunosuppressive drugs, transplantation became the definitive treatment for patients with terminal organ failure. It nevertheless remains an uncommon therapy, because the fundamental issues behind organ transplantation are difficult to resolve, including the shortage of donor grafts, the inherent risks of transplant surgery, and economic constraints. The pre- and post-transplant management of recipients is another critical issue that may affect transplant outcome. Most liver transplant recipients experience post-transplant complications, including infection, acute or chronic rejection, metabolic syndrome, and recurrence of hepatocellular carcinoma. Therefore, early prediction and diagnosis of these complications may improve overall and disease-free survival. Furthermore, the induction of operational tolerance is the key to achieving the ultimate goal of transplantation. In this review, we focus on liver transplantation, which is known to achieve operational tolerance in some circumstances, and discuss the mechanistic similarities and differences between liver transplant immunology and fetomaternal tolerance, autoimmunity, and tumor immunity.
Subject(s)
Autoimmunity; Liver Transplantation; Liver Transplantation/adverse effects; Liver Transplantation/methods; Humans; Animals; Immune Tolerance; Graft Rejection/immunology; Liver Neoplasms/immunology; Liver Neoplasms/surgery; Transplantation Tolerance/immunology
ABSTRACT
Abnormal shifts in the global climate, leading to extreme weather, significantly threaten the safety of individuals involved in outdoor activities. Hypothermia-induced coma or death frequently occurs in clinical and forensic settings. Despite this, the precise mechanism of central nervous system injury due to hypothermia remains unclear, hindering the development of targeted clinical treatments and specific forensic diagnostic indicators. The Gene Expression Omnibus (GEO) database was searched to identify datasets related to hypothermia. After bioinformatics analyses, differentially expressed genes (DEGs) and ferroptosis-related DEGs (FerrDEGs) were intersected. Gene set enrichment analysis (GSEA) was then conducted to elucidate the functions of the ferroptosis-related genes. Animal experiments conducted in this study demonstrated that hypothermia, compared with the control treatment, induces significant alterations in ferroptosis-related genes such as PPARG, SCD, ADIPOQ, SAT1, EGR1, and HMOX1 in cerebral cortex nerve cells. These changes lead to iron ion accumulation, lipid peroxidation, and marked expression of ferroptosis-related proteins. Application of the ferroptosis inhibitor Ferrostatin-1 (Fer-1) effectively modulates the expression of these genes, reduces lipid peroxidation, and improves the expression of ferroptosis-related proteins. Severe hypothermia disrupts the metabolism of cerebral cortex nerve cells, causing significant alterations in ferroptosis-related genes, and these genetic changes promote ferroptosis through multiple pathways.
Subject(s)
Cerebral Cortex; Ferroptosis; Hypothermia; Neurons; Ferroptosis/genetics; Animals; Hypothermia/metabolism; Cerebral Cortex/metabolism; Cerebral Cortex/pathology; Neurons/metabolism; Iron/metabolism; Lipid Peroxidation; Male; Rats; Phenylenediamines/pharmacology; Cyclohexylamines
ABSTRACT
In this work, a novel formaldehyde sensor was constructed based on nanoporous, flower-like, Pb-containing Pd-Au nanoparticles deposited on the cathode in a double-cabin galvanic cell (DCGC), with a Cu plate as the anode, a multiwalled carbon nanotube-modified glassy carbon electrode as the cathode, a 0.1 M HClO4 aqueous solution as the anolyte, and a 3.0 mM PdCl2 + 1.0 mM HAuCl4 + 5.0 mM Pb(ClO4)2 + 0.1 M HClO4 aqueous solution as the catholyte. Electrochemical studies reveal that the stripping of bulk Cu can induce underpotential deposition (UPD) of Pb during the galvanic replacement reaction (GRR), which affects the composition and morphology of the Pb-containing Pd-Au nanoparticles. The electrocatalytic activity of the Pb-containing nanoparticles toward formaldehyde oxidation was examined in alkaline solution; the experimental results showed that formaldehyde mainly undergoes direct oxidation on the surface of the Pb-containing Pd-Au nanoparticles, while the formation of the CO poisoning intermediate is inhibited to a large degree. The proposed formaldehyde sensor exhibits a linear amperometric response to formaldehyde concentrations from 0.01 mM to 5.0 mM, with a sensitivity of 666 µA mM-1 cm-2, a limit of detection (LOD) of 0.89 µM at a signal-to-noise ratio of 3, rapid response, high anti-interference ability, and good repeatability.
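The reported LOD follows the usual 3σ convention: detection limit = 3 × baseline noise / calibration slope. A sketch of the arithmetic, where the slope is the reported sensitivity (assumed for a 1 cm² electrode) and the noise value is invented so the numbers land near the reported 0.89 µM:

```python
# LOD at a signal-to-noise ratio of 3: LOD = 3 * baseline noise / slope.
slope_uA_per_mM = 666.0   # reported sensitivity, 666 µA mM^-1 cm^-2, 1 cm^2 assumed
noise_uA = 0.1976         # hypothetical baseline noise level (µA)
lod_uM = 3 * noise_uA / slope_uA_per_mM * 1000  # convert mM -> µM
print(round(lod_uM, 2))  # 0.89
```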
ABSTRACT
Electrically conductive metal-organic frameworks (MOFs) have been extensively studied for their potential uses in energy-related technologies and sensors. However, achieving that goal requires MOFs to be highly stable and to maintain their conductivity under practical operating conditions with varying solution environments and temperatures. Herein, we have designed and synthesized a new series of {[Ln4(µ4-O)(µ3-OH)3(INA)3(GA)3](CF3SO3)(H2O)6}n single crystals (denoted Ln4-MOFs; Ln = Gd, Tm, and Lu; INA = isonicotinic acid; GA = glycolic acid), in which electrons are found to transport along the π-π stacked aromatic carbon rings. The Ln4-MOFs show remarkable stability, with minimal changes in conductivity under solution pH varying from 1 to 12, temperatures up to 373 K, and electric fields as high as 800,000 V/m. This stability is achieved through the formation of strong coordination bonds between high-valent Ln(III) ions and rigid carboxylic linkers, as well as hydrogen bonds that enhance the robustness of the electron transport path. The demonstrated lanthanide MOFs pave the way for the design of stable and conductive MOFs.
ABSTRACT
OBJECTIVE: To define benchmark values for adult-to-adult living donor liver transplantation (LDLT). BACKGROUND: LDLT utilizes living-donor hemiliver grafts to expand the donor pool and reduce waitlist mortality. Although references have been established for donor hepatectomy, no such information exists for recipients to enable conclusive quality and comparative assessments. METHODS: Patients undergoing LDLT were analyzed in 15 high-volume centers (≥10 cases/year) from 3 continents over 5 years (2016-2020), with a minimum follow-up of 1 year. Benchmark criteria included a model for end-stage liver disease score ≤20, no portal vein thrombosis, no previous major abdominal surgery, no renal replacement therapy, no acute liver failure, and no intensive care unit admission. Benchmark cutoffs were derived from the 75th percentile of all centers' medians. RESULTS: Of 3636 patients, 1864 (51%) qualified as benchmark cases. Benchmark cutoffs at 1 year, including posttransplant dialysis (≤4%), primary nonfunction (≤0.9%), nonanastomotic strictures (≤0.2%), graft loss (≤7.7%), and redo liver transplantation (LT) (≤3.6%), were below the deceased donor LT benchmarks. Bile leak (≤12.4%), hepatic artery thrombosis (≤5.1%), and Comprehensive Complication Index (CCI®) (≤56) were above the deceased donor LT benchmarks, whereas mortality (≤9.1%) was comparable. The right hemiliver graft, compared with the left, was associated with a lower CCI® score (34 vs 21, P < 0.001). Preservation of the middle hepatic vein with the right hemiliver graft had no impact on either recipient or donor outcomes. Asian centers outperformed other centers with respect to CCI® score (21 vs 47, P < 0.001), graft loss (3.0% vs 6.5%, P = 0.002), and redo-LT rates (1.0% vs 2.5%, P = 0.029). In contrast, non-benchmark low-volume centers displayed inferior outcomes, such as bile leak (15.2%), hepatic artery thrombosis (15.2%), or redo-LT (6.5%).
CONCLUSIONS: Benchmark LDLT offers a valuable alternative to reduce waitlist mortality. Exchange of expertise, public awareness, and centralization policy are, however, mandatory to achieve benchmark outcomes worldwide.
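The benchmark methodology above (cutoff = 75th percentile of per-center medians) can be sketched in a few lines; the per-center complication rates below are invented for illustration:

```python
# Benchmark cutoff as the 75th percentile of per-center medians.
from statistics import median

def benchmark_cutoff(center_values):
    """center_values: one list of patient-level values per center."""
    medians = sorted(median(v) for v in center_values)
    # 75th percentile with linear interpolation between order statistics
    k = 0.75 * (len(medians) - 1)
    lo, hi = int(k), min(int(k) + 1, len(medians) - 1)
    return medians[lo] + (k - lo) * (medians[hi] - medians[lo])

# Five hypothetical centers, each reporting a small series of CCI-like values.
rates = [[2, 4, 6], [1, 3, 9], [5, 7, 8], [2, 2, 10], [4, 6, 6]]
print(benchmark_cutoff(rates))  # 6.0
```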
Subject(s)
End Stage Liver Disease; Liver Diseases; Liver Transplantation; Thrombosis; Adult; Humans; Living Donors; Benchmarking; End Stage Liver Disease/surgery; Treatment Outcome; Retrospective Studies; Severity of Illness Index; Liver Diseases/complications; Graft Survival
ABSTRACT
INTRODUCTION: The global burden of burn injury disproportionately impacts low- and middle-income countries. Surgery is a mainstay of burn treatment, yet access to surgical care appears to be inequitably distributed for women. This study sought to identify gender disparities in mortality and access to surgery for burn patients in the World Health Organization (WHO) Global Burn Registry (GBR). METHODS: We queried the WHO GBR for a retrospective cohort (2016-2021). Patients were stratified by sex. Outcomes of interest were in-hospital mortality and surgical treatment. Patient demographics, injury characteristics, outcomes, and health facility resources were compared between sexes with the Wilcoxon rank sum test for nonparametric medians and the chi-squared or Fisher's exact test for nonparametric proportions. Multivariable logistic regressions were performed to assess the relationships between sex and mortality and between sex and surgery. RESULTS: Of 8445 patients in the GBR from 20 countries (10 low resource), 40% were female, and 51% of all patients received surgical treatment during their hospitalization. Female patients had a higher incidence of mortality (24% versus 15%, P < 0.001) and a higher median total body surface area burned (20% versus 15%, P < 0.001), yet a lower incidence of surgery (47% versus 53%, P < 0.001) following burn injury when compared with males. In multivariable analysis, female sex was independently associated with mortality after controlling for age, time to presentation, smoke injury, percent total body surface area, surgery, and country income status. Female sex was also independently associated with lower odds of receiving surgery (odds ratio 0.86, P = 0.001). CONCLUSIONS: Female burn patients suffer higher mortality compared with males and are less likely to receive surgery. Further study into this gender disparity in burns is warranted.
Subject(s)
Burns; Male; Humans; Female; Retrospective Studies; Burns/complications; Hospitalization; Registries; Hospital Mortality; Length of Stay
ABSTRACT
The selective fluorination of C-H bonds at room temperature using heterogeneous visible-light catalysts is both interesting and challenging. Herein, we present the heterogeneous sandwich-type uranyl-polyoxotungstate cluster Na17{Na@[(SbW9O33)2(UO2)6(PO3OH)6]}·46H2O (denoted U6P6) to regulate the selective fluorination of C-H bonds under visible light at room temperature. This is the first report in which uranyl participates in a fluorination reaction in the form of an insoluble substance. U6P6 achieves effective, selective fluorination of cycloalkanes, and the photocatalyst is recyclable, owing to the synergistic effect of the multiple uranyl (UO2)2+ centers and the insolubility of the polyoxotungstate in organic reagents. In situ electron paramagnetic resonance spectroscopy captured the generation of cycloalkane radicals during the photoreaction, confirming the mechanism of direct hydrogen atom transfer.
ABSTRACT
BACKGROUND: Cushing disease (CD) arises from a pituitary corticotroph adenoma, which is the most common cause of Cushing syndrome (CS). Bilateral inferior petrosal sinus sampling (BIPSS) is a safe method for differentiating CD from ectopic adrenocorticotropic hormone (ACTH)-dependent CS. Enhanced high-resolution magnetic resonance imaging (MRI) can localize tiny pituitary lesions. The aim of this study was to compare the preoperative diagnostic accuracy of BIPSS versus MRI for CD in CS patients. We performed a retrospective study of patients who underwent BIPSS and MRI between 2017 and 2021. Low- and high-dose dexamethasone suppression tests were performed. Blood samples were collected simultaneously from the right and left catheters and the femoral vein before and after desmopressin stimulation. MR images were obtained, and endoscopic endonasal transsphenoidal surgery (EETS) was performed in confirmed CD patients. The dominant side of ACTH secretion on BIPSS and the lesion side on MRI were compared with the surgical findings. RESULTS: Twenty-nine patients underwent BIPSS and MRI. CD was diagnosed in 28 patients, 27 of whom underwent EETS. Localization of microadenomas by MRI and BIPSS agreed with the EETS findings in 96% and 93% of cases, respectively. BIPSS and EETS were performed successfully in all patients. CONCLUSION: BIPSS was the most accurate method (gold standard) for establishing a preoperative diagnosis of pituitary-dependent CD and was more sensitive than MRI in diagnosing microadenoma. High-resolution enhanced MRI had an advantage over BIPSS in microadenoma lateralization. The combined use of MRI and BIPSS could improve preoperative diagnostic accuracy in ACTH-dependent CS patients.
Subject(s)
Adenoma; Cushing Syndrome; Pituitary ACTH Hypersecretion; Pituitary Neoplasms; Humans; Adenoma/diagnostic imaging; Adenoma/surgery; Adrenocorticotropic Hormone; Cushing Syndrome/diagnosis; Magnetic Resonance Imaging; Petrosal Sinus Sampling/methods; Pituitary ACTH Hypersecretion/diagnostic imaging; Pituitary ACTH Hypersecretion/surgery; Pituitary Neoplasms/diagnostic imaging; Pituitary Neoplasms/surgery; Retrospective Studies
ABSTRACT
BACKGROUND: Echocardiography (ECHO) and cardiac magnetic resonance imaging (MRI) were used to observe changes in left ventricular structure in patients with breast and gastric cancer after 6 cycles of chemotherapy. Based on the observed values, we aimed to evaluate the cardiotoxicity of anthracyclines in cancer patients and to analyze the consistency of the two examination methods in assessing left ventricular function after chemotherapy. METHODS: From January 2020 to January 2022, data from 80 patients with malignant tumors who received anthracycline chemotherapy (breast cancer, n = 40; gastric cancer, n = 40) and 40 healthy volunteers (control group) were retrospectively collected. Serum high-sensitivity cardiac troponin T (hs-cTnT) levels were measured with an automatic immunoassay analyzer. Left ventricular end-systolic volume (LVESV), left ventricular end-diastolic volume (LVEDV), and left ventricular ejection fraction (LVEF) were measured by cardiac MRI and by 2-dimensional ECHO using the biplane Simpson's method. RESULTS: Compared with baseline values, serum hs-cTnT levels were significantly increased in patients with breast cancer and gastric cancer after 6 cycles of chemotherapy (P < 0.05). In addition, LVEDV, LVESV, and LVEF measured with MRI were higher than those measured with ECHO in cancer patients after 6 cycles of chemotherapy (P < 0.05). Bland-Altman analysis showed that LVEDV, LVESV, and LVEF measured by the two examination methods were in good agreement. CONCLUSION: Breast and gastric cancer patients exhibited elevated hs-cTnT levels after 6 cycles of chemotherapy, indicating potential cardiotoxicity. Additionally, cardiac MRI and 2-dimensional ECHO showed good agreement in assessing left ventricular function, with ECHO tending to underestimate volume measurements compared with MRI.
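The agreement analysis above is a Bland-Altman comparison: the bias is the mean paired difference and the 95% limits of agreement are bias ± 1.96 × SD of the differences. A sketch on invented LVEF pairs (not study data):

```python
# Bland-Altman bias and 95% limits of agreement for paired measurements.
from statistics import mean, stdev

def bland_altman(a, b):
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)                 # systematic difference between methods
    sd = stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

mri  = [60, 55, 62, 58, 64]            # hypothetical LVEF (%) by MRI
echo = [58, 54, 59, 57, 61]            # hypothetical LVEF (%) by ECHO
bias, lower, upper = bland_altman(mri, echo)
print(round(bias, 2), round(lower, 2), round(upper, 2))  # 2.0 0.04 3.96
```

A positive bias here means ECHO reads lower than MRI, matching the direction reported in the conclusion.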
Subject(s)
Breast Neoplasms; Polyketides; Stomach Neoplasms; Humans; Female; Stomach Neoplasms/diagnostic imaging; Stomach Neoplasms/drug therapy; Ventricular Function, Left; Stroke Volume; Anthracyclines/adverse effects; Cardiotoxicity; Retrospective Studies; Troponin T; Magnetic Resonance Imaging; Breast Neoplasms/drug therapy; Echocardiography; Antibiotics, Antineoplastic; Magnetic Resonance Spectroscopy
ABSTRACT
Accumulating evidence suggests the involvement of tumor-derived exosomes in the development and recurrence of hepatocellular carcinoma (HCC). We previously identified miR-4669 as a highly expressed microRNA in circulating exosomes obtained from patients with post-transplant HCC recurrence. This study aimed to explore how overexpression of miR-4669 affects HCC development and recurrence. The impact of miR-4669 overexpression in Hep3B cells on tumor cell behavior and the tumor microenvironment was evaluated in vitro. In addition, the clinical value of exosomal miR-4669 for the prediction of treatment response to HCC downstaging therapies and following post-transplant HCC recurrence was explored. Overexpression of miR-4669 enhanced migration ability and led to acquired sorafenib resistance with an elevation of sirtuin 1 and long noncoding RNA associated with microvascular invasion. Active release of tumor-derived exosomes and glyceraldehyde 3-phosphate dehydrogenase (GAPDH) contributed to generating an immunosuppressive tumor microenvironment through the induction of M2 macrophage polarization. The retrospective analysis demonstrated the clinical value of exosomal miR-4669 for predicting treatment response to HCC downstaging therapies and for risk assessment of post-transplant HCC recurrence. In summary, the present data demonstrate the impact of exosomal miR-4669 on HCC recurrence through the enhancement of tumor aggressiveness and generation of an immunosuppressive tumor microenvironment.
Subject(s)
Biomarkers, Tumor , Carcinoma, Hepatocellular , Exosomes , Liver Neoplasms , MicroRNAs , Humans , Biomarkers, Tumor/genetics , Carcinoma, Hepatocellular/therapy , Carcinoma, Hepatocellular/drug therapy , Cell Line, Tumor , Cell Proliferation/genetics , Exosomes/genetics , Exosomes/pathology , Gene Expression Regulation, Neoplastic , Liver Neoplasms/pathology , MicroRNAs/genetics , Retrospective Studies , Tumor Microenvironment/genetics
ABSTRACT
Bisphenol A is one of the most widely used industrial compounds. Over the years, it has raised serious concern as a potential hazard to the human endocrine system and the environment. Developing robust and easy-to-use sensors for bisphenol A is important in various areas, such as controlling and monitoring water purification and sewage systems, food safety monitoring, etc. Here, we report an electrochemical method to fabricate a bisphenol A (BPA) sensor based on a modified Au nanoparticle/multiwalled carbon nanotube composite electrocatalyst electrode (AuCu-UPD/MWCNTs/GCE). First, the Au-Cu alloy was prepared via a convenient and controllable Cu underpotential/bulk Au co-electrodeposition on a multiwalled carbon nanotube-modified glassy carbon electrode (GCE). Then, the AuCu-UPD/MWCNTs/GCE was obtained via the electrochemical anodic stripping of the Cu underpotential deposition (UPD) layer. The as-prepared sensor enables highly electrocatalytic, high-performance sensing of BPA. Under optimal conditions, the modified electrode showed a two-segment linear response from 0.01 to 1 µM and 1 to 20 µM with a limit of detection (LOD) of 2.43 nM based on differential pulse voltammetry (DPV). Determination of BPA in real water samples using the AuCu-UPD/MWCNTs/GCE yielded satisfactory results. The proposed electrochemical sensor is promising for the development of a simple, low-cost water quality monitoring system for the detection of BPA in ambient water samples.
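As an illustration of how a DPV calibration segment relates to a reported LOD, the sketch below fits a least-squares line to hypothetical calibration points for the low-concentration segment (0.01-1 µM) and applies the common 3σ/slope convention; all current and noise values here are assumptions, not data from the study.

```python
import numpy as np

# Hypothetical calibration points for the 0.01-1 µM segment; real values
# would come from DPV measurements on the AuCu-UPD/MWCNTs/GCE.
conc = np.array([0.01, 0.05, 0.1, 0.5, 1.0])    # BPA concentration, µM
curr = np.array([0.12, 0.55, 1.05, 5.1, 10.2])  # peak current, µA (assumed)

# Least-squares line through the calibration points
slope, intercept = np.polyfit(conc, curr, 1)

# LOD by the common 3-sigma convention: 3 * (blank std. dev.) / slope
sigma_blank = 0.008                # assumed blank noise, µA
lod_um = 3 * sigma_blank / slope   # LOD in µM
print(f"slope = {slope:.2f} uA/uM, LOD = {lod_um * 1e3:.2f} nM")
```

With these assumed numbers the computed LOD lands in the low-nanomolar range, the same order as the 2.43 nM reported in the abstract.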
ABSTRACT
OBJECTIVES: Acute cellular rejection (ACR) is a major immune occurrence post-liver transplant that can cause abnormal liver function. Blood oxygen level-dependent (BOLD) magnetic resonance imaging (MRI) can be used to evaluate liver disease, but it has not been utilized in the diagnosis of ACR post-liver transplant. Therefore, the purpose of this study is to evaluate the diagnostic performance of BOLD MRI and to monitor treatment response in recipients with ACR. METHODS: This prospective study was approved by the local institutional review board. Fifty-five recipients with highly suspected ACR were enrolled in this study. Each patient underwent hepatic BOLD MRI, blood biochemistry, and biopsy before treatment. Of the 55 patients, 19 recipients with ACR received a follow-up MRI after treatment. After obtaining the R2* maps, five regions of interest were placed on liver parenchyma to estimate the mean R2* values for statistical analysis. Receiver operating characteristic (ROC) curve analysis was performed to assess the diagnostic performance of R2* values in detecting patients with ACR. RESULTS: The histopathologic results showed that 27 recipients had ACR (14 mild, 11 moderate, and 2 severe) and their hepatic R2* values were significantly lower than those of patients without ACR. ROC analysis revealed that the sensitivity and specificity of the R2* values for detection of ACR were 82.1% and 89.9%, respectively. Moreover, the R2* values and liver function in patients with ACR significantly increased after immunosuppressive treatment. CONCLUSION: The non-invasive BOLD MRI technique may be useful for assessment of hepatic ACR and monitoring of treatment response after immunosuppressive therapy. KEY POINTS: • Patients with acute cellular rejection post-liver transplant exhibited significantly decreased R2* values in liver parenchyma. • R2* values and liver function were significantly increased after immunosuppressive therapy. • R2* values were useful indicators in detecting acute cellular rejection due to their high sensitivity and specificity.
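The sensitivity and specificity figures for an R2* cutoff can be illustrated with a minimal sketch; the R2* values and the cutoff below are assumptions for illustration, not data from the study.

```python
import numpy as np

# Hypothetical hepatic R2* values (s^-1); in the study, ACR patients had
# significantly lower R2* than patients without rejection.
r2_acr    = np.array([28.0, 31.0, 36.0, 26.0, 29.0, 33.0, 27.0])  # biopsy-proven ACR
r2_no_acr = np.array([41.0, 38.0, 34.0, 36.0, 40.0, 43.0, 39.0])  # no rejection

cutoff = 35.0  # assumed threshold: call ACR when R2* falls below it
sensitivity = np.mean(r2_acr < cutoff)       # true positives / all ACR
specificity = np.mean(r2_no_acr >= cutoff)   # true negatives / all non-ACR
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```

In a full ROC analysis, this calculation is repeated across every candidate cutoff, and the cutoff is chosen to balance the two rates.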
Subject(s)
Liver Transplantation , Graft Rejection/diagnosis , Humans , Liver Transplantation/adverse effects , Magnetic Resonance Imaging/methods , Oxygen , Oxygen Saturation , Prospective Studies
ABSTRACT
BACKGROUND: Liver cirrhosis is a well-known risk factor for sepsis after emergent gastrointestinal (GI) endoscopy. Elective GI endoscopy before living donor liver transplantation (LDLT), however, may also carry a risk of sepsis in these patients. METHODS: This retrospective study reviewed the medical records of 642 cirrhotic recipients who underwent GI endoscopy from 2008 to 2016. We analyzed the incidence and risk factors of post-endoscopy sepsis during 2008-2012 (experience cohort). Our protocol changed after 2013 (validation cohort) to include antibiotic prophylaxis. RESULTS: In the experience cohort, 36 (10.5%) of the 342 LDLT candidates experienced sepsis within 48 h after endoscopy. The sepsis rate was significantly higher in patients with hepatic decompensation than in those without (22.2% vs. 9.6% vs. 2.6% in Child C/B/A groups, respectively; χ2 = 20.97, P < 0.001). In multivariate logistic regression analysis, the factors related to post-endoscopy sepsis were the Child score (OR 1.46; 95% CI 1.24-1.71), Child classes B and C (OR 3.80 and 14.13; 95% CI 1.04-13.95 and 3.97-50.23, respectively), hepatic hydrothorax (OR 4.85; 95% CI 1.37-17.20), and use of antibiotic prophylaxis (OR 0.08; 95% CI 0.01-0.64). In the validation cohort, antibiotics were given routinely, and all cases of hepatic hydrothorax (n = 10) were drained. Consequently, 4 (1.3%) episodes of sepsis occurred among 300 LDLT candidates, an incidence significantly lower than before (1.3% vs. 10.5%, P < 0.001). CONCLUSIONS: Patients with decompensated cirrhosis and hepatic hydrothorax have a higher risk of sepsis following endoscopy. In advanced cirrhotic patients, antibiotic prophylaxis and drainage of hydrothorax may be required to prevent sepsis before elective GI endoscopy.
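The odds ratios and confidence intervals quoted above come from a fitted logistic model. The conversion from a regression coefficient and its standard error to an OR with a Wald 95% CI can be sketched as follows; the coefficient and standard error here are illustrative values chosen to reproduce the reported Child score OR, not numbers taken from the study.

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Turn a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),   # lower CI bound
            math.exp(beta + z * se))   # upper CI bound

# Illustrative inputs: beta = ln(1.46) with an assumed SE of 0.082
or_, lo, hi = odds_ratio_ci(math.log(1.46), 0.082)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

With these assumed inputs the interval comes out close to the 1.24-1.71 reported for the Child score, showing how such tables are derived from the model output.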