ABSTRACT
BACKGROUND: Acute kidney injury (AKI) is one of the most common complications after liver transplantation (LT) and can significantly impact outcomes. Hepatitis C virus (HCV) infection is thought to increase the risk of AKI; however, its impact on AKI after LT has not been evaluated. The aim of this study was to assess the effect of HCV on AKI development in patients who underwent LT. METHODS: Between January 2008 and April 2023, 2183 patients who underwent living donor LT (LDLT) were included. Patients were divided into 2 groups based on the presence of chronic HCV infection, and the groups were compared using propensity score matching (PSM). Factors associated with AKI development were evaluated using multiple logistic regression analysis. In addition, 1-year mortality and graft failure were assessed using a Cox proportional hazards regression model. RESULTS: Among the 2183 patients, the incidence of AKI was 59.2%. After PSM, patients with HCV developed AKI more frequently than those without (71.9% vs 63.9%, P = .026). In multivariate analysis after PSM, HCV was associated with AKI development (odds ratio [OR], 1.53; 95% confidence interval [CI], 1.06-2.20; P = .022), 1-year mortality (hazard ratio [HR], 1.98; 95% CI, 1.12-3.52; P = .019), and graft failure (HR, 2.12; 95% CI, 1.22-3.69; P = .008). CONCLUSIONS: The presence of HCV was associated with an increased risk of AKI, 1-year mortality, and graft failure after LT.
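As a rough illustration of the matching step described above, the sketch below implements 1:1 nearest-neighbor propensity score matching on the logit of the propensity score with a caliper. It is a generic example under assumed column names (hcv, age, meld, and so on), not the study's actual code or covariate set.

```python
# Illustrative sketch of 1:1 nearest-neighbor propensity score matching.
# Column names (hcv, age, meld, ...) are hypothetical, not the study's dataset.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_one_to_one(df, treatment_col, covariate_cols, caliper=0.2):
    """Greedy 1:1 matching on the logit of the propensity score."""
    X = df[covariate_cols].to_numpy()
    t = df[treatment_col].to_numpy()
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
    logit = np.log(ps / (1 - ps))
    cal = caliper * logit.std()          # caliper as a fraction of the logit SD

    treated = np.where(t == 1)[0]
    controls = set(np.where(t == 0)[0])
    pairs = []
    for i in treated:
        if not controls:
            break
        j = min(controls, key=lambda k: abs(logit[i] - logit[k]))
        if abs(logit[i] - logit[j]) <= cal:
            pairs.append((i, j))
            controls.remove(j)
    matched_idx = [k for pair in pairs for k in pair]
    return df.iloc[matched_idx]

# Example usage with a hypothetical recipient table:
# matched = match_one_to_one(recipients, "hcv",
#                            ["age", "sex", "meld", "bmi", "diabetes"])
```

Covariate balance would then be checked (for example, with standardized mean differences) before fitting the logistic and Cox models on the matched sample.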
ABSTRACT
Background and Objective: Early discharge following robot-assisted kidney transplantation (RAKT) is a cost-effective strategy for reducing healthcare expenses while maintaining favorable short- and long-term prognoses. This study aims to identify predictors of delayed postoperative discharge in RAKT patients and to develop a predictive model to enhance clinical outcomes. Materials and Methods: This retrospective study included 146 patients aged 18 years and older who underwent RAKT at a single tertiary medical center from August 2020 to January 2024. Data were collected on demographics, comorbidities, social and medical histories, preoperative laboratory values, surgical information, intraoperative data, and postoperative outcomes. The primary outcome was delayed postoperative discharge (length of hospital stay > 7 days). Risk factors for delayed discharge were identified through univariate and multivariate regression analyses, leading to the development of a risk scoring system, the effectiveness of which was evaluated by receiver operating characteristic curve analysis. Results: A total of 110 patients (75.3%) were discharged within 7 days post-transplant, while 36 (24.7%) remained hospitalized for 8 days or longer. Univariate and multivariate logistic regression analyses identified ABO incompatibility, BUN level, anesthesia time, and vasodilator use as risk factors for delayed discharge. The RAKT score, incorporating these factors, demonstrated good predictive performance, with a C-statistic of 0.789. Conclusions: This study identified risk factors for delayed discharge after RAKT and developed a promising risk scoring system for real-world clinical application, potentially improving postoperative outcome stratification in RAKT recipients.
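To make the scoring step concrete, here is a hypothetical sketch of converting logistic-regression coefficients into an integer point score and checking its discrimination with the C-statistic (equivalent to the AUROC for a binary outcome). The column names and the scaling rule are illustrative assumptions; an actual clinical score would typically categorize continuous predictors such as BUN before assigning points.

```python
# Hypothetical sketch of turning logistic-regression coefficients into an
# integer risk score and checking its discrimination (C-statistic).
# Variable names (abo_incompatible, bun, anesthesia_hours, vasodilator) are
# illustrative stand-ins for the predictors named in the abstract.
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

def build_risk_score(df, predictors, outcome):
    X = sm.add_constant(df[predictors])
    fit = sm.Logit(df[outcome], X).fit(disp=0)
    betas = fit.params.drop("const")
    # Scale coefficients so the smallest effect maps to ~1 point, then round.
    points = (betas / betas.abs().min()).round().astype(int)
    score = df[predictors].mul(points, axis=1).sum(axis=1)
    return points, score

predictors = ["abo_incompatible", "bun", "anesthesia_hours", "vasodilator"]
# points, score = build_risk_score(rakt_df, predictors, "delayed_discharge")
# print(roc_auc_score(rakt_df["delayed_discharge"], score))  # C-statistic
```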
Subject(s)
Kidney Transplantation, Length of Stay, Robotic Surgical Procedures, Humans, Female, Male, Kidney Transplantation/adverse effects, Retrospective Studies, Middle Aged, Robotic Surgical Procedures/statistics & numerical data, Robotic Surgical Procedures/methods, Robotic Surgical Procedures/adverse effects, Risk Factors, Length of Stay/statistics & numerical data, Adult, ROC Curve, Postoperative Complications/epidemiology, Aged
ABSTRACT
High-Ni layered oxide cathodes are promising candidates for lithium-ion batteries due to their high energy density. However, their cycle stability is compromised by the poor mechanical durability of the particle microstructure. In this study, we investigate the impact of the calcination temperature on microstructural changes, including primary particle growth and pore evolution, using LiNi0.88Mn0.08Co0.04O2 (N884), with an emphasis on the critical calcination temperature for polycrystalline and single-crystal designs in high-Ni cathodes. As the calcination temperature increases, the primary particles undergo a rectangular growth pattern while the pore population decreases. Beyond a certain critical temperature (in this case, 850 °C), a sudden increase in primary particle size and a simultaneous rapid reduction in the pore population are observed. This sudden microstructure evolution leads to poor cycle retention in N884. In contrast, single-crystal particles, free of grain boundaries, synthesized at this critical temperature exhibit superior cycle retention, underscoring the significance of microstructural design over crystalline quality for achieving long-term cyclability. Our study sheds light on the interplay between calcination temperature and microstructural evolution, proposing the critical temperature as a key criterion for single-crystal synthesis.
ABSTRACT
BACKGROUND: With the rise of metabolic diseases and aging in liver transplant (LT) candidates, mitral annular calcification (MAC) is increasingly recognized. Although cardiovascular disease has become a leading cause of mortality in LT recipients, the influence of MAC remains unexamined. This study investigates the prevalence, related factors, and impact of MAC on LT outcomes. METHODS: We explored 4148 consecutive LT patients who underwent routine pretransplant echocardiography from 2008 to 2019. Multivariate logistic analysis and tree-based Shapley additive explanation (SHAP) scores in machine learning were used to identify significant and important related factors. The primary outcome was 30-d major adverse cardiac events (MACE), and the secondary outcome was cumulative all-cause mortality over a median follow-up of 5 y. RESULTS: MAC was found in 123 (3.0%) patients. Significant and important related factors included age, alcoholic liver disease, chronic kidney disease, hyperuricemia, hypertension, and coronary artery disease. The 30-d MACE rate was higher in patients with MAC than in those without MAC (P < 0.001; adjusted hazard ratio, 1.67; 95% confidence interval, 1.08-2.57). Patients with MAC had a poorer cumulative overall survival probability than those without MAC (P = 0.0016; adjusted hazard ratio, 1.47; 95% confidence interval, 1.01-2.15). Specifically, women with MAC had a poorer survival probability than men without MAC (65.0% versus 80.7%, P < 0.001) at >10 y post-LT. CONCLUSIONS: The presence of MAC before LT was linked to increased 30-d MACE and lower long-term survival rates, especially in women. Identification and management of MAC and potential risk factors are crucial for improving post-LT survival.
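As an illustration of the tree-based SHAP approach mentioned in the methods, the sketch below fits a gradient-boosted tree model to a binary MAC label and ranks features by mean absolute SHAP value. The feature names, model choice (XGBoost), and hyperparameters are assumptions for illustration, not the study's pipeline.

```python
# Hypothetical sketch: rank MAC-related factors with tree-based SHAP values.
# Feature names and model hyperparameters are illustrative assumptions.
import numpy as np
import pandas as pd
import shap
from xgboost import XGBClassifier

def shap_importance(df: pd.DataFrame, features: list[str], outcome: str) -> pd.Series:
    model = XGBClassifier(n_estimators=300, max_depth=3,
                          learning_rate=0.05).fit(df[features], df[outcome])
    shap_values = shap.TreeExplainer(model).shap_values(df[features])
    # Mean absolute SHAP value per feature = global importance.
    return pd.Series(np.abs(shap_values).mean(axis=0),
                     index=features).sort_values(ascending=False)

# Usage with a hypothetical pretransplant table:
# print(shap_importance(echo_df,
#                       ["age", "alcoholic_ld", "ckd", "hyperuricemia",
#                        "hypertension", "cad"], "mac"))
```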
Subject(s)
Calcinosis, Liver Transplantation, Mitral Valve, Humans, Male, Female, Liver Transplantation/adverse effects, Liver Transplantation/mortality, Middle Aged, Calcinosis/mortality, Calcinosis/diagnostic imaging, Calcinosis/epidemiology, Mitral Valve/surgery, Mitral Valve/diagnostic imaging, Risk Factors, Retrospective Studies, Prognosis, Heart Valve Diseases/surgery, Heart Valve Diseases/mortality, Heart Valve Diseases/diagnostic imaging, Aged, Risk Assessment, Treatment Outcome, Echocardiography, Prevalence, Adult
ABSTRACT
(1) Background: Liver transplantation (LT) is associated with significant hemorrhage and massive transfusion. Fibrinogen replacement has a key role in treating massive bleeding during LT, and hypofibrinogenemia is treated with fibrinogen concentrate or cryoprecipitate. However, these two products are known to be associated with major thromboembolic events (MTEs). We aimed to compare the effect of fibrinogen concentrate and cryoprecipitate on MTEs in living donor LT (LDLT) recipients. (2) Methods: We analyzed 206 patients who underwent LDLT between January 2021 and March 2022. The patients were divided into two groups according to fibrinogen concentrate or cryoprecipitate use, and the incidence of MTEs was compared between the two groups. In addition, we performed multiple logistic regression analyses to identify risk factors for MTEs. (3) Results: There was no significant difference in the incidence of MTEs between the cryoprecipitate and fibrinogen concentrate groups (16 [14.7%] vs. 14 [14.4%], p = 1.000). In the multivariate analysis, neither cryoprecipitate (OR 2.09, 95% CI 0.85-5.11, p = 0.107) nor fibrinogen concentrate (OR 2.05, 95% CI 0.82-5.12, p = 0.126) was significantly associated with MTEs. (4) Conclusions: There was no significant difference in the incidence of MTEs between cryoprecipitate and fibrinogen concentrate in LDLT recipients.
ABSTRACT
BACKGROUND: Acute-on-chronic liver failure (ACLF) is a life-threatening disease that requires urgent liver transplantation (LT). Accurate identification of high-risk patients is essential for predicting post-LT survival. The Chronic Liver Failure Consortium ACLF score is a widely accepted risk-stratification score that includes the total white blood cell (WBC) count as a component. This study aimed to evaluate the predictive value of total and differential WBC counts for short-term mortality following LT in patients with ACLF. METHODS: A total of 685 patients with ACLF who underwent LT between January 2008 and February 2019 were analyzed. Total and differential WBC counts were examined as a function of the Model for End-Stage Liver Disease-Sodium (MELD-Na) score. The association between total and differential WBC counts and 90-day post-LT mortality was assessed using multivariable Cox proportional hazards regression analysis. RESULTS: The total WBC count and neutrophil ratio were higher in patients with ACLF than in those without ACLF. The neutrophil ratio was significantly associated with 90-day post-LT mortality after adjustment (hazard ratio [HR], 1.04; P = 0.001), whereas the total WBC count was not significantly associated with 90-day post-LT mortality in either univariate or multivariate Cox analysis. The neutrophil ratio showed a relatively linear trend with increasing MELD-Na score and HR for 90-day post-LT mortality, whereas the total WBC count exhibited a plateaued pattern. CONCLUSIONS: Neutrophilia, rather than the total WBC count, is a better prognostic indicator of short-term post-LT mortality in patients with ACLF.
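A minimal sketch of the multivariable Cox model described above, using the lifelines package; the column names and the covariate set are assumptions for illustration only.

```python
# Minimal sketch of a Cox proportional-hazards analysis with lifelines;
# column names are hypothetical, not the study's dataset.
import pandas as pd
from lifelines import CoxPHFitter

def fit_cox(df: pd.DataFrame) -> CoxPHFitter:
    cph = CoxPHFitter()
    # duration_col: days from LT to death or censoring (capped at 90 d);
    # event_col: 1 if the patient died within 90 days, 0 otherwise.
    cph.fit(df[["time_90d", "death_90d", "neutrophil_ratio",
                "meld_na", "age", "sex"]],
            duration_col="time_90d", event_col="death_90d")
    return cph

# cph = fit_cox(aclf_df)
# cph.print_summary()   # hazard ratios with 95% CIs per covariate
```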
ABSTRACT
Background and Objectives: Preoperative echocardiography is widely performed in patients undergoing major surgery to evaluate cardiac function and detect structural abnormalities. However, studies on the clinical usefulness of preoperative echocardiography in patients undergoing cerebral aneurysm clipping are limited. Therefore, this study aimed to investigate the correlation between preoperative echocardiographic parameters and the incidence of postoperative complications in patients undergoing clipping of unruptured intracranial aneurysms. Materials and Methods: Electronic medical records of patients who underwent clipping of an unruptured intracranial aneurysm from September 2018 to April 2020 were retrospectively reviewed. Data on baseline characteristics, laboratory variables, echocardiographic parameters, postoperative complications, and hospital stay were obtained. Univariable and multivariable logistic regression analyses were performed to identify independent variables related to the occurrence of postoperative complications and prolonged hospital stay (≥8 d). Results: Among the 531 patients included in the final analysis, 27 (5.1%) had postoperative complications. In multivariable logistic regression, the total amount of crystalloid infused (odds ratio [OR] 1.002, 95% confidence interval [CI] 1.001-1.003; p = 0.001) and the E/e' ratio (OR 1.17, 95% CI 1.01-1.35; p = 0.031) were significant independent factors associated with the occurrence of a postoperative complication. Additionally, the maximal diameter of the cerebral aneurysm (OR 1.13, 95% CI 1.02-1.25; p = 0.024), total amount of crystalloid infused (OR 1.001, 95% CI 1.000-1.002; p = 0.031), E/A ratio (OR 0.22, 95% CI 0.05-0.95; p = 0.042), and E/e' ratio (OR 1.16, 95% CI 1.04-1.31; p = 0.011) were independent factors related to prolonged hospitalization. Conclusions: Echocardiographic parameters related to diastolic function might be associated with postoperative complications in patients undergoing clipping of unruptured intracranial aneurysms.
Subject(s)
Intracranial Aneurysm, Humans, Intracranial Aneurysm/surgery, Retrospective Studies, Incidence, Postoperative Complications/epidemiology, Postoperative Complications/etiology, Echocardiography, Treatment Outcome
ABSTRACT
BACKGROUND: Excessive visceral obesity in recipients of living donor liver transplantation (LDLT) is associated with mortality, and a recent study reported a correlation between visceral adiposity of male LDLT recipients and hepatocellular carcinoma (HCC) recurrence. However, no study has examined the relationship between donor visceral adiposity and surgical outcomes in LDLT recipients. We investigated the association of the visceral-to-subcutaneous fat area ratio (VSR) in donors and recipients with HCC recurrence and mortality after LDLT. METHODS: We analyzed 1386 donor-recipient pairs who underwent LDLT between January 2008 and January 2018. The maximal chi-square method was used to determine the optimal VSR cutoff values for predicting overall HCC recurrence and mortality. Cox regression analyses were performed to evaluate the association of donor VSR and recipient VSR with overall HCC recurrence and mortality in recipients. RESULTS: The VSR cutoff values were 0.73 in males and 0.31 in females. High donor VSR was significantly associated with overall HCC recurrence (adjusted hazard ratio [HR]: 1.43, 95% confidence interval [CI]: 1.06-1.93, p = 0.019) and mortality (HR: 1.35, 95% CI: 1.03-1.76, p = 0.030). High recipient VSR was significantly associated with overall HCC recurrence (HR: 1.40, 95% CI: 1.04-1.88, p = 0.027) and mortality (HR: 1.50, 95% CI: 1.14-1.96, p = 0.003). CONCLUSIONS: Both recipient VSR and donor VSR were significant risk factors for HCC recurrence and mortality in LDLT recipients. Preoperative donor VSR and recipient VSR may be strong predictors of the surgical outcomes of LDLT recipients with HCC.
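The maximal chi-square cutoff search can be sketched as follows: scan candidate cutoffs and keep the one that maximizes the chi-square statistic for the 2x2 association with the outcome. Column names are hypothetical, and a real analysis would also correct the resulting P value for the multiple cutoffs tested.

```python
# Hypothetical sketch of a "maximal chi-square" cutoff search for VSR.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

def maximal_chi2_cutoff(values: pd.Series, events: pd.Series) -> float:
    best_cut, best_stat = None, -np.inf
    for cut in np.unique(values)[1:-1]:          # skip the extremes
        table = pd.crosstab(values >= cut, events)
        if table.shape != (2, 2):
            continue
        stat, _, _, _ = chi2_contingency(table)
        if stat > best_stat:
            best_cut, best_stat = cut, stat
    return best_cut

# Example usage with hypothetical columns, stratified by sex:
# vsr_cutoff_male = maximal_chi2_cutoff(df.loc[df.sex == "M", "vsr"],
#                                       df.loc[df.sex == "M", "hcc_recurrence"])
```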
Subject(s)
Hepatocellular Carcinoma, Liver Neoplasms, Liver Transplantation, Female, Male, Humans, Hepatocellular Carcinoma/surgery, Hepatocellular Carcinoma/pathology, Liver Transplantation/adverse effects, Liver Transplantation/methods, Liver Neoplasms/surgery, Liver Neoplasms/pathology, Living Donors, Local Neoplasm Recurrence/epidemiology, Local Neoplasm Recurrence/etiology, Abdominal Obesity/etiology, Retrospective Studies, Treatment Outcome
ABSTRACT
Background: Heart failure with preserved ejection fraction (HFpEF) and its risk factors are increasingly recognized in patients with end-stage liver disease (ESLD). Objectives: The aim of this study was to characterize HFpEF and identify relevant risk factors in patients with ESLD. Additionally, the prognostic impact of high-probability HFpEF on post-liver transplantation (LT) mortality was investigated. Methods: Patients with ESLD prospectively enrolled from the Asan LT Registry between 2008 and 2019 were divided into groups with low (scores of 0 and 1), intermediate (scores of 2-4), and high (scores of 5 and 6) probability using the Heart Failure Association-PEFF diagnostic score for HFpEF. Gradient-boosted modeling was further used to appraise the relative importance of risk factors. Post-LT all-cause mortality was followed for up to 12.8 years (median 5.3 years); there were 498 deaths after LT. Results: Among the 3,244 patients, 215 belonged to the high-probability group, commonly those with advanced age, female sex, anemia, dyslipidemia, renal dysfunction, and hypertension. According to the gradient-boosted model, the most important risk factors for the high-probability group were female sex, anemia, hypertension, dyslipidemia, and age >65 years. Among patients with Model for End-Stage Liver Disease scores of >30, those with high, intermediate, and low probability had cumulative overall survival rates of 71.6%, 82.2%, and 88.9% at 1 year and 54.8%, 72.1%, and 88.9% at 12 years after LT (log-rank P = 0.026), respectively. Conclusions: High-probability HFpEF was found in 6.6% of patients with ESLD and was associated with poorer long-term post-LT survival, especially in those with advanced stages of liver disease. Therefore, identifying HFpEF using the Heart Failure Association-PEFF score and addressing modifiable risk factors may improve post-LT survival.
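The probability grouping used above follows directly from the stated score bins; a trivial sketch of that mapping:

```python
# Minimal sketch of the probability grouping described in the methods:
# patients are binned by their Heart Failure Association-PEFF diagnostic score.
def hfpef_probability_group(hfa_peff_score: int) -> str:
    if hfa_peff_score <= 1:        # scores 0-1
        return "low"
    if hfa_peff_score <= 4:        # scores 2-4
        return "intermediate"
    return "high"                  # scores 5-6

# Example: a patient scoring 5 falls in the high-probability group.
# assert hfpef_probability_group(5) == "high"
```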
ABSTRACT
BACKGROUND: Diastolic dysfunction is regarded as an important predictor of outcome after liver transplantation (LT). We investigated the influence of liver disease severity on left ventricular diastolic properties using end-diastolic pressure-volume relationship (EDPVR) analysis in patients with end-stage liver disease (ESLD), and evaluated the association between alterations of the EDPVR and mortality after LT. METHODS: In this observational retrospective cohort study, 3,211 patients who underwent LT for ESLD were included in the analysis. Variables derived from the single-beat EDPVR (the diastolic stiffness coefficient [β] and the end-diastolic volume at an end-diastolic pressure of 20 mmHg [EDVI20], indicating ventricular capacitance) were estimated using preoperative echocardiographic data. Alterations in the EDPVR with increased stiffness (β > 6.16) were categorized into 3 groups: leftward-shifted (EDVI20 < 51 mL/m2), rightward-shifted (EDVI20 > 69.7 mL/m2), and intermediate (EDVI20 51-69.7 mL/m2). RESULTS: As the Model for End-Stage Liver Disease score increased, both EDVI20 and β gradually increased, indicating ventricular remodeling with larger capacitance and higher diastolic stiffness. Among patients with increased stiffness (β > 6.16, n = 1,090), survival rates after LT were lower with leftward-shifted EDPVR than with rightward-shifted EDPVR (73.7% vs 82.9%; log-rank P = 0.002). In the adjusted Cox proportional hazards model, the risk of cumulative all-cause mortality at 11 years was highest with leftward-shifted EDPVR (hazard ratio [HR]: 1.93; 95% confidence interval [CI]: 1.27-2.92), followed by intermediate EDPVR (HR: 1.55; 95% CI: 1.12-2.26), compared with rightward-shifted EDPVR. The SHapley Additive exPlanations model revealed that the variables associated with leftward-shifted EDPVR were diabetes, female sex, old age, and hypertension. CONCLUSIONS: As ESLD advances, diastolic ventricular properties are characterized by increased EDVI20 and β on a rightward-shifted EDPVR, indicating larger capacitance and higher stiffness. However, a leftward-shifted EDPVR, reflecting failure of left ventricular remodeling, is associated with poor post-LT survival.
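For readers unfamiliar with the single-beat EDPVR variables, one commonly used single-beat formulation (a Klotz-type power-law fit, which this sketch assumes; the abstract does not specify the study's exact estimation procedure) relates end-diastolic pressure and volume as a power law, from which the capacitance variable at 20 mmHg follows directly:

```latex
% Assumed power-law form of the single-beat EDPVR (Klotz-type fit), with
% \alpha and \beta as curve-fit constants; the 20 mmHg evaluation point
% matches the EDVI20 definition in the abstract (volume indexed in the text).
\[
  \mathrm{EDP} = \alpha \,\mathrm{EDV}^{\beta},
  \qquad
  \mathrm{EDV}_{20} = \left(\frac{20\ \mathrm{mmHg}}{\alpha}\right)^{1/\beta}.
\]
```

Under this form, a higher β means a steeper (stiffer) curve, while a larger EDV at 20 mmHg corresponds to a rightward-shifted, more capacious ventricle, matching the grouping used above.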
Subject(s)
End-Stage Liver Disease, Ventricular Remodeling, Humans, Female, Retrospective Studies, Blood Pressure, End-Stage Liver Disease/surgery, Diastole, Stroke Volume, Left Ventricular Function
ABSTRACT
BACKGROUND: Considering the importance of the recipient's inflammatory status on outcomes following liver transplantation (LT), we investigated the association between the C-reactive protein-to-albumin ratio (CAR) and one-year mortality following LT and compared it with other parameters reflecting patients' underlying inflammatory status. METHODS: A total of 3,614 consecutive adult LT recipients were retrospectively evaluated. Prognostic parameters were analyzed using area under the receiver operating characteristic curve (AUROC) analysis, and cutoffs were subsequently derived. For survival analysis, Cox proportional hazards and Kaplan-Meier analyses were performed. RESULTS: The AUROC for CAR to predict one-year mortality after LT was 0.68 (0.65-0.72), the highest among the inflammatory parameters examined, with a best cutoff of 0.34. A CAR ≥ 0.34 was associated with significantly higher one-year mortality (13.3% vs. 5.8%, log-rank P < 0.001) and overall mortality (24.5% vs. 12.9%, log-rank P = 0.039). A CAR ≥ 0.34 was an independent predictor of one-year mortality (hazard ratio, 1.40 [1.03-1.90], P = 0.031) and overall mortality (hazard ratio, 1.39 [1.13-1.71], P = 0.002) after multivariable adjustment. CONCLUSIONS: A preoperative CAR ≥ 0.34 was independently associated with a higher risk of one-year and overall mortality after LT. This suggests that CAR, a simple and readily available biomarker, may be a practical index to assist in the risk stratification of LT outcomes.
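The abstract derives a best CAR cutoff from the AUROC analysis without stating the criterion; one common choice is the Youden index, sketched below under hypothetical column names.

```python
# Hypothetical sketch of deriving a biomarker cutoff from the ROC curve using
# the Youden index (sensitivity + specificity - 1); column names are illustrative.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def best_cutoff(y_event: np.ndarray, marker: np.ndarray) -> tuple[float, float]:
    fpr, tpr, thresholds = roc_curve(y_event, marker)
    youden = tpr - fpr
    return thresholds[np.argmax(youden)], roc_auc_score(y_event, marker)

# cutoff, auroc = best_cutoff(df["death_1y"].to_numpy(), df["car"].to_numpy())
# print(f"CAR cutoff {cutoff:.2f}, AUROC {auroc:.2f}")
```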
ABSTRACT
Recent studies have reported that sarcopenia influences morbidity and mortality in surgical patients. However, few studies have investigated the associations of sarcopenia with short-term and long-term graft failure in recipients after living donor liver transplantation (LDLT). In this study, we investigated the associations between sarcopenia and graft failure/mortality in patients undergoing LDLT. We retrospectively examined 2816 recipients who underwent LDLT between January 2008 and January 2018. Cox regression analysis was performed to evaluate the associations between sarcopenia and graft failure/mortality in recipients at 60 days, 180 days, and 1 year and overall. Sarcopenia in the recipient was significantly associated with 60-day graft failure (adjusted hazard ratio [HR], 1.98; 95% confidence interval [CI], 1.09-3.61; p = 0.03), 180-day graft failure (HR, 1.85; 95% CI, 1.19-2.88; p = 0.01), 1-year graft failure (HR, 1.45; 95% CI, 1.01-2.17; p = 0.05), and overall graft failure (HR, 1.42; 95% CI, 1.08-1.87; p = 0.01). In addition, recipient sarcopenia was associated with 180-day mortality (HR, 1.88; 95% CI, 1.17-3.01; p = 0.01), 1-year mortality (HR, 1.53; 95% CI, 1.01-2.29; p = 0.04), and overall mortality (HR, 1.43; 95% CI, 1.08-1.90; p = 0.01). Preoperative sarcopenia was associated with high rates of graft failure and mortality in LDLT recipients. Therefore, preoperative sarcopenia may be a strong predictor of the surgical prognosis in LDLT recipients.
Subject(s)
Liver Transplantation, Sarcopenia, Graft Survival, Humans, Liver Transplantation/adverse effects, Living Donors, Retrospective Studies, Sarcopenia/complications, Sarcopenia/epidemiology, Treatment Outcome
ABSTRACT
BACKGROUND: An excessive citrate load during therapeutic plasma exchange (TPE) can cause metabolic alkalosis with compensatory hypercarbia and electrolyte disturbances. If TPE is required immediately before ABO-incompatible (ABOi) liver transplant (LT) surgery, metabolic derangement and severe electrolyte disturbances may worsen during LT anesthesia. CASE: We report two ABOi LT cases in which TPE was performed on the day of surgery because isoagglutinin titers had not dropped below 1:8. One patient developed marked metabolic alkalosis, with a pH of 7.73 immediately after tracheal intubation, exacerbated by hyperventilation during mask ventilation. The other experienced sudden ventricular tachycardia and a drop in blood pressure after surgical incision, accompanied by severe hypokalemia of 1.8 mmol/L despite potassium supplementation. CONCLUSIONS: Special attention should be paid to patients who have completed TPE on the morning of surgery, as they are vulnerable to severe acid-base disturbances and life-threatening ventricular arrhythmias during ABOi LT.
ABSTRACT
BACKGROUND: We aimed to explore the distribution of intraoperative lactic acid (LA) levels during liver transplantation (LT) and determine the optimal cutoff values for predicting post-LT 30-day and 90-day mortality. METHODS: Intraoperative LA data from 3,338 patients collected between 2008 and 2019 were retrospectively reviewed, along with all-cause mortality within 30 and 90 days. Of the three LA levels measured during the preanhepatic, anhepatic, and neohepatic phases of LT, the peak LA level was used to explore the distribution and predict early post-LT mortality. To determine the best cutoff values of LA, we used a classification and regression tree algorithm and maximally selected rank statistics with the smallest P value. RESULTS: The median intraoperative LA level was 4.4 mmol/L (range: 0.5-34.7; interquartile range: 3.0-6.2 mmol/L). Of the 3,338 patients, 1,884 (56.4%) had LA levels > 4.0 mmol/L and 188 (5.6%) had LA levels > 10 mmol/L. Patients with LA levels > 16.7 mmol/L and 13.5-16.7 mmol/L showed significantly higher 30-day mortality rates of 58.3% and 21.2%, respectively. For prediction of 90-day mortality, an intraoperative LA level of 8.4 mmol/L was the best cutoff value. CONCLUSIONS: Approximately 6% of LT recipients showed intraoperative hyperlactatemia of > 10 mmol/L during LT, and an LA level > 8.4 mmol/L was associated with significantly higher early post-LT mortality.
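As an illustration of the classification-and-regression-tree step, the sketch below fits a depth-1 decision tree to peak lactate alone and reads off the split threshold it selects; column names and tree settings are assumptions, not the study's implementation.

```python
# Illustrative sketch of a CART-style cutoff search: fit a depth-1 decision
# tree on peak lactate alone and read the split threshold it chooses.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def cart_cutoff(lactate: np.ndarray, died: np.ndarray) -> float:
    tree = DecisionTreeClassifier(max_depth=1, min_samples_leaf=50)
    tree.fit(lactate.reshape(-1, 1), died)
    return float(tree.tree_.threshold[0])   # threshold at the root split

# cutoff_90d = cart_cutoff(df["peak_lactate"].to_numpy(),
#                          df["death_90d"].to_numpy())
```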
ABSTRACT
BACKGROUND: Although living donor liver transplantation (LDLT) is the standard treatment option for patients with end-stage liver disease, it inevitably entails ethical concerns about the risk to living donors. Recent studies have reported a correlation between sarcopenia and surgical prognosis in recipients; however, few studies have examined donor sarcopenia and the surgical prognosis of donors. This study investigated the association between sarcopenia and postoperative acute kidney injury (AKI) in liver donors. METHODS: This retrospective study analysed 2892 donors who underwent donor hepatectomy for LDLT between January 2008 and January 2018. Sarcopenia was classified into pre-sarcopenia and severe sarcopenia, defined as a skeletal muscle index more than 1 standard deviation (SD) and more than 2 SDs below the baseline mean, respectively. Multivariate regression analysis was performed to evaluate the association between donor sarcopenia and postoperative AKI. Additionally, we assessed the association between donor sarcopenia and delayed recovery of hepatic function (DRHF). RESULTS: In the multivariate analysis, donor sarcopenia was significantly associated with a higher incidence of postoperative AKI (adjusted odds ratio [OR]: 2.65, 95% confidence interval [CI]: 1.15-6.11, P = .022 for pre-sarcopenia; OR: 5.59, 95% CI: 1.11-28.15, P = .037 for severe sarcopenia). Hypertension and synthetic colloid use were also significantly associated with postoperative AKI. In the multivariate analysis, risk factors for DRHF were male sex, indocyanine green retention rate at 15 minutes, and graft type; donor sarcopenia, however, was not a risk factor. CONCLUSIONS: Donor sarcopenia is associated with postoperative AKI following donor hepatectomy.
Subject(s)
Acute Kidney Injury, Liver Transplantation, Acute Kidney Injury/epidemiology, Acute Kidney Injury/etiology, Cohort Studies, Hepatectomy/adverse effects, Humans, Liver/surgery, Liver Transplantation/adverse effects, Living Donors, Male, Skeletal Muscle, Retrospective Studies
ABSTRACT
We aimed to determine the association between the preoperative antithrombin III (ATIII) level and postoperative acute kidney injury (AKI) after liver transplantation (post-LT AKI). We retrospectively evaluated 2395 LT recipients treated between 2010 and 2018 for whom perioperative ATIII levels were available. Patients were divided into two groups based on the preoperative ATIII level (ATIII < 50% vs. ATIII ≥ 50%). Multivariable regression analysis was performed to assess risk factors for post-LT AKI. The mean preoperative ATIII level was 30.2 ± 11.8% in the ATIII < 50% group and 67.2 ± 13.2% in the ATIII ≥ 50% group. The incidence of post-LT AKI was significantly lower in the ATIII ≥ 50% group than in the ATIII < 50% group (54.7% vs. 75.5%, p < 0.001; odds ratio [OR] per 10% increase in ATIII level, 0.86; 95% confidence interval [CI], 0.81-0.92; p < 0.001). In a backward stepwise regression model, female sex, high body mass index, low albumin level, deceased donor LT, longer duration of surgery, and greater red blood cell transfusion remained significantly associated with post-LT AKI. A low preoperative ATIII level is associated with post-LT AKI, suggesting that preoperative ATIII might be a prognostic factor for predicting post-LT AKI.
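The per-10% odds ratio quoted above is a simple rescaling of the fitted logistic coefficient; assuming ATIII enters the model as a continuous covariate per 1% increment with coefficient beta-hat, the relationship is as follows (the numerical coefficient is back-calculated from the reported OR purely for illustration):

```latex
% Rescaling a logistic-regression coefficient to a per-10% odds ratio,
% assuming ATIII is modeled as a continuous covariate (per 1% increment);
% the example \hat{\beta} is back-calculated from the reported OR of 0.86.
\[
  \mathrm{OR}_{+10\%} = \exp\!\left(10\,\hat{\beta}\right),
  \qquad
  \text{e.g. } \hat{\beta} \approx -0.0151 \;\Rightarrow\;
  \exp(10 \times -0.0151) \approx 0.86 .
\]
```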
ABSTRACT
Liver transplantation (LT) is closely associated with decreased immune function, a contributor to herpes zoster (HZ). However, risk factors for HZ in living donor LT (LDLT) remain unknown. The neutrophil-to-lymphocyte ratio (NLR) is reportedly correlated with immune system function. This study investigated the association between NLR and HZ in 1688 patients who underwent LDLT between January 2010 and July 2020 and evaluated risk factors for HZ and postherpetic neuralgia (PHN). The predictive power of NLR was assessed using the concordance index and an integrated discrimination improvement (IDI) analysis. Of the total cohort, 138 patients (8.2%) developed HZ. The incidence of HZ after LT was 11.2 per 1000 person-years, with rates of 0.1%, 1.3%, 2.9%, and 13.5% at 1, 3, 5, and 10 years, respectively. In the Cox regression analysis, preoperative NLR was significantly associated with HZ (adjusted hazard ratio [HR], 1.05; 95% confidence interval [CI], 1.02-1.09; p = 0.005) and PHN (HR, 1.08; 95% CI, 1.03-1.13; p = 0.001). Age, sex, mycophenolate mofetil use, and hepatitis B virus infection were risk factors for HZ, whereas age and sex were risk factors for PHN. In the IDI analysis, NLR was discriminative for HZ and PHN (p = 0.020 and p = 0.047, respectively). Preoperative NLR might predict HZ and PHN in LDLT recipients.
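The IDI quantifies how much adding NLR improves the separation of predicted risks between patients who do and do not develop the outcome. The sketch below shows one common way to compute it, using logistic models for a binary outcome for simplicity; the study's analysis was time-to-event, so its exact computation may differ, and the column names are illustrative.

```python
# Hypothetical sketch of the integrated discrimination improvement (IDI):
# the gain in mean predicted-risk separation (events vs. non-events) when a
# new marker (NLR) is added to a baseline model. Column names are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression

def idi(df: pd.DataFrame, base_vars: list[str], new_var: str, outcome: str) -> float:
    y = df[outcome].to_numpy()
    p_base = LogisticRegression(max_iter=1000)\
        .fit(df[base_vars], y).predict_proba(df[base_vars])[:, 1]
    full_vars = base_vars + [new_var]
    p_full = LogisticRegression(max_iter=1000)\
        .fit(df[full_vars], y).predict_proba(df[full_vars])[:, 1]
    # Discrimination slope = mean predicted risk in events minus non-events.
    slope_full = p_full[y == 1].mean() - p_full[y == 0].mean()
    slope_base = p_base[y == 1].mean() - p_base[y == 0].mean()
    return slope_full - slope_base

# print(idi(ldlt_df, ["age", "sex", "mmf_use", "hbv"], "nlr", "herpes_zoster"))
```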
ABSTRACT
Li-rich layered oxide materials are considered promising candidates for high-capacity cathodes for battery applications, and improving the reversibility of the anionic redox reaction is the key to exploiting the full capacity of these materials. However, permanent structural change of the electrode occurring upon electrochemical cycling results in capacity and voltage decay. In view of these factors, Ti4+-substituted Li2IrO3 (Li2Ir0.75Ti0.25O3) is synthesized, which undergoes an oxygen redox reaction with suppressed voltage decay, yielding improved electrochemical performance and good capacity retention. It is shown that the increased bond covalency upon Ti4+ substitution results in structural stability, tuning the phase evolution from O3 to O1' upon delithiation during charging, compared with O3 to T3 and O1 for pristine Li2IrO3, thereby facilitating the oxidation of oxygen. This work unravels the role of Ti4+ in stabilizing the cathode framework, providing insight for a fundamental design approach for advanced Li-rich layered oxide battery materials.
ABSTRACT
BACKGROUND: Tachycardia-polyuria syndrome is characterized by polyuria occurring because of tachycardia with a heart rate of ≥ 120 beats/min lasting ≥ 30 min. We report such a case occurring after Swan-Ganz catheterization. CASE: A 41-year-old male was scheduled for living-donor liver transplantation. After induction of general anesthesia, atrial fibrillation occurred during Swan-Ganz catheterization, and polyuria developed 1 h later. During the anhepatic phase, the patient's heart rate increased further, and cardioversion was performed. After a normal sinus rhythm was achieved, the patient's urine output returned to normal. CONCLUSIONS: The patient's polyuria seemed related to the iatrogenic atrial fibrillation occurring during Swan-Ganz catheterization. Although we did not measure atrial natriuretic peptide, an increase in its concentration may have been the main mechanism of the polyuria, as natriuresis was observed.
ABSTRACT
Postoperative hemorrhagic stroke (HS) is a rare yet devastating complication after liver transplantation (LT). Unruptured intracranial aneurysm (UIA) may contribute to HS; however, related data are limited. We investigated UIA prevalence and the incidence of aneurysmal subarachnoid hemorrhage (SAH) and HS after LT, identified risk factors for 1-year HS, and constructed a prediction model. This study included 3544 patients who underwent LT from January 2008 to February 2019. Primary outcomes were the incidence of SAH, HS, and mortality within 1 year post-LT. Propensity score matching (PSM) analysis and Cox proportional hazards analysis were performed. The prevalence of UIAs was 4.63% (n = 164; 95% confidence interval [CI], 3.95%-5.39%). The 1-year SAH incidence was 0.68% (95% CI, 0.02%-3.79%) in patients with UIA. SAH and HS incidence and mortality did not differ between patients with and without UIA, either before or after PSM. Cirrhosis severity, thrombocytopenia, inflammation, and a history of SAH were identified as risk factors for 1-year HS. The presence of UIA was not a risk factor for SAH, HS, or mortality in cirrhotic patients post-LT. Given the potentially fatal impact of HS, a simple scoring system was constructed to predict 1-year HS risk. These results enable clinical risk stratification of LT recipients with UIA and help assess perioperative HS risk before LT.