ABSTRACT
Seizures are a frequent neurological complication following liver transplantation (LT); however, research on their clinical impact and risk factors is lacking. Using a nested case-control design, patients diagnosed with seizures (seizure group) within 1 year post-transplantation were matched at a 1:5 ratio to controls who had not experienced seizures up to the corresponding time points, for survival and risk factor analyses. Seizures developed in 61 of 1,243 patients (4.9%) at a median of 11 days after LT. Five-year graft survival was significantly lower in the seizure group than in the controls (50.6% vs. 78.2%, respectively, p < 0.001), and seizure was a significant risk factor for graft loss after covariate adjustment (HR 2.04, 95% CI 1.24-3.33). In multivariable logistic regression, body mass index <23 kg/m2, donor age ≥45 years, intraoperative continuous renal replacement therapy, and a delta sodium level ≥4 mmol/L emerged as independent risk factors for post-LT seizure. A delta sodium level ≥4 mmol/L was associated with seizures regardless of the severity of preoperative hyponatremia. Identifying and controlling these risk factors is required to prevent post-LT seizures, which could result in worse graft outcomes.
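The 1:5 matching "at corresponding time points" is risk-set (incidence-density) sampling: each case's controls are drawn from patients still event-free at the case's seizure time. Below is a minimal sketch of such matching, assuming a pandas DataFrame with illustrative columns `id`, `seizure_day` (NaN if seizure-free), and `followup_day`; it is not the authors' code.

```python
import numpy as np
import pandas as pd

def match_risk_set(df: pd.DataFrame, ratio: int = 5, seed: int = 0):
    """For each seizure case, sample `ratio` controls who were still
    seizure-free and under follow-up at the case's event time."""
    rng = np.random.default_rng(seed)
    matched = []
    for _, case in df[df["seizure_day"].notna()].iterrows():
        t = case["seizure_day"]
        # Risk set: still followed at t, no seizure on or before t.
        risk_set = df[
            (df["id"] != case["id"])
            & (df["followup_day"] >= t)
            & (df["seizure_day"].isna() | (df["seizure_day"] > t))
        ]
        n = min(ratio, len(risk_set))
        idx = rng.choice(len(risk_set), size=n, replace=False)
        matched.append((case["id"], risk_set.iloc[idx]["id"].tolist()))
    return matched
```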
Subject(s)
Liver Transplantation , Humans , Middle Aged , Liver Transplantation/adverse effects , Case-Control Studies , Retrospective Studies , Risk Factors , Seizures/etiology , Sodium , Treatment Outcome
ABSTRACT
The optimal target blood pressure for kidney transplant (KT) recipients remains unclear. We included 808 KT patients from the KNOW-KT cohort as a discovery set and 1,294 KT patients from the KOTRY registry as a validation set. The main exposures were baseline systolic blood pressure (SBP) at 1 year after KT and time-varying SBP. Patients were classified into five groups: SBP <110; 110-119; 120-129; 130-139; and ≥140 mmHg. SBP trajectories were classified into decreasing, stable, and increasing groups. The primary outcome was a composite kidney outcome of ≥50% decrease in eGFR or death-censored graft loss. Compared with the 110-119 mmHg group, both the lowest (adjusted hazard ratio [aHR], 2.43) and the highest SBP group (aHR, 2.25) were associated with a higher risk of the composite kidney outcome. In the time-varying model, the lowest (aHR, 3.02) and highest (aHR, 3.60) SBP groups were likewise associated with a higher risk. In the trajectory model, an increasing SBP trajectory was associated with a higher risk than a stable SBP trajectory (aHR, 2.26). These associations were consistent in the validation set. In conclusion, SBP ≥140 mmHg and an increasing SBP trajectory were associated with a higher risk of allograft dysfunction and failure in KT patients.
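A hedged sketch of the time-varying SBP analysis using lifelines' CoxTimeVaryingFitter, with 110-119 mmHg as the reference band per the abstract; the long-format table and its column names are assumptions for illustration, not the study's code.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

def fit_sbp_bands(long_df: pd.DataFrame) -> CoxTimeVaryingFitter:
    """long_df: one row per patient-interval with assumed columns id, start,
    stop, event (composite kidney outcome in the interval), and sbp (mmHg)."""
    bands = [0, 110, 120, 130, 140, float("inf")]
    labels = ["lt110", "110_119", "120_129", "130_139", "ge140"]
    long_df = long_df.assign(
        band=pd.cut(long_df["sbp"], bins=bands, right=False, labels=labels)
    )
    X = pd.get_dummies(long_df.drop(columns="sbp"), columns=["band"], dtype=float)
    X = X.drop(columns="band_110_119")  # reference band
    ctv = CoxTimeVaryingFitter()
    ctv.fit(X, id_col="id", event_col="event", start_col="start", stop_col="stop")
    return ctv  # ctv.print_summary() shows aHRs for each band vs. 110-119 mmHg
```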
Subject(s)
Blood Pressure , Glomerular Filtration Rate , Graft Survival , Kidney Transplantation , Humans , Female , Male , Middle Aged , Adult , Allografts , Aged , Proportional Hazards Models , Graft Rejection , Transplant Recipients , Hypertension
ABSTRACT
The effect of changes in immunosuppressive therapy during the acute phase post-heart transplantation (HTx) on clinical outcomes remains unclear. This study aimed to investigate the effects of changes in immunosuppressive therapy, namely corticosteroid (CS) weaning and everolimus (EVR) initiation, during the first year post-HTx on clinical outcomes. We analyzed 622 recipients registered in the Korean Organ Transplant Registry (KOTRY) between January 2014 and December 2021. The median age at HTx was 56 years (interquartile range [IQR], 45-62), and the median follow-up time was 3.9 years (IQR 2.0-5.1). Early EVR initiation within the first year post-HTx with maintenance during follow-up was associated with a reduced risk of the primary composite outcome (all-cause mortality or re-transplantation) (HR, 0.24; 95% CI 0.09-0.68; p < 0.001) and of cardiac allograft vasculopathy (CAV) (HR, 0.39; 95% CI 0.19-0.79; p = 0.009) compared with EVR-free or intermittent EVR regimens, regardless of CS weaning. However, early EVR initiation tended to increase the risk of acute allograft rejection compared with EVR-free or intermittent EVR treatment.
Subject(s)
Adrenal Cortex Hormones , Everolimus , Graft Rejection , Heart Transplantation , Immunosuppressive Agents , Registries , Humans , Everolimus/administration & dosage , Everolimus/therapeutic use , Heart Transplantation/adverse effects , Middle Aged , Male , Female , Immunosuppressive Agents/therapeutic use , Immunosuppressive Agents/administration & dosage , Republic of Korea/epidemiology , Graft Rejection/prevention & control , Adrenal Cortex Hormones/administration & dosage , Adrenal Cortex Hormones/therapeutic use , Treatment Outcome , Graft Survival , Retrospective Studies
ABSTRACT
PURPOSE: Patients undergoing transarterial chemoembolisation experience postembolisation symptoms and interferences that affect sleep quality and require intervention. This study aimed to identify predictors of sleep quality components in patients undergoing transarterial chemoembolisation. METHODS: This study included two groups of participants: 50 patients undergoing transarterial chemoembolisation and 45 nurses caring for them. Data were collected from September to November 2022 using a structured questionnaire and analysed using descriptive statistics, the t-test, analysis of variance, Spearman's rank correlation, and multiple regression analysis with the SPSS 27.0 program (IBM Corp., Armonk, NY, USA). RESULTS: The mean sleep quality score was 40.28±14.10. Heat sensation (t=-2.08, p=.043) and fatigue (t=-4.47, p<.001) together explained 38.6% of the variance in sleep fragmentation. Abdominal pain (t=-2.54, p=.014), vomiting (t=-2.21, p=.032), and nurse-expected fatigue (t=2.68, p=.014) explained 41.7% of the variance in sleep length. Abdominal pain (t=-2.05, p=.046) explained 42.9% of the variance in sleep depth. CONCLUSION: Based on the predictors of sleep quality components identified in this study, strategies to improve sleep quality tailored to patients undergoing transarterial chemoembolisation should be developed. This study highlighted the need to bridge the gap between patients' and nurses' expectations of fatigue and its contribution to sleep fragmentation and sleep length. It also highlighted the importance of noncontact temperature measurement, control of vomiting, and pain relief for improving sleep length in patients undergoing transarterial chemoembolisation.
Subject(s)
Sleep Deprivation , Sleep Quality , Humans , Cross-Sectional Studies , Abdominal Pain , Fatigue/etiology , Fatigue/therapy , Vomiting
ABSTRACT
Post-traumatic striatocapsular infarction (SCI) due to lenticulostriate artery (LSA) damage is rare, and most reported cases are in children. We discuss the pathogenesis and differential diagnosis of post-traumatic SCI in adult patients. The most common etiologies of non-traumatic SCI are embolism from a proximal artery, cardiogenic embolism, and atherosclerotic plaque in the proximal middle cerebral artery (MCA). In contrast, injury to the LSA after trauma may lead to hemorrhagic infarction in the basal ganglia (BG), and post-traumatic SCI due to LSA damage might therefore be accompanied by hemorrhage in the BG. The main locations of these lesions are in the distal perfusion territory of the LSA, similar to SCI due to intracranial atherosclerotic disease affecting the MCA. Vessel wall imaging, magnetic resonance angiography, and ultrahigh-resolution computed tomography can be used to differentiate the injury mechanism in SCI following a traumatic event.
Subject(s)
Embolism , Middle Cerebral Artery , Adult , Child , Humans , Cerebral Infarction/pathology , Basal Ganglia/diagnostic imaging , Infarction/complications , Infarction/pathology , Embolism/complications , Embolism/pathology
ABSTRACT
Although transarterial chemoembolisation has improved as an interventional method for hepatocellular carcinoma, the subsequent postembolisation syndrome threatens patients' quality of life. This study aimed to evaluate the effectiveness of a clinical decision support system for postembolisation syndrome management on nurse and patient outcomes. This randomized controlled trial included 40 registered nurses and 51 hospitalized patients. Nurses in the experimental group were provided with a clinical decision support system and a handbook for 6 weeks; nurses in the control group received only the handbook. Notably, the experimental group exhibited statistically significant improvements in patient-centered caring attitude, identification of pain management barriers, and comfort care competence after clinical decision support system implementation. Moreover, patients' symptom interference during the experimental period decreased significantly compared with before the intervention. This study offers insights into the potential of clinical decision support systems for refining nursing practice and nurturing patient well-being, with prospects for advancing patient-centered care and nursing competence. The system's contents, encompassing postembolisation syndrome risk prediction and care recommendations, underscore its role in fostering a patient-centered caring attitude and bolstering nurses' comfort care competence.
ABSTRACT
Muscle wasting in chronic kidney disease is associated with increased cardiovascular events, morbidity, and mortality. However, whether pretransplantation skeletal muscle mass affects kidney transplantation (KT) outcomes has not been established. We analyzed 623 patients who underwent KT between 2004 and 2019. We measured the cross-sectional area of total skeletal muscle at the third lumbar vertebra level on pretransplantation computed tomography scans. The patients were grouped into low and normal skeletal muscle mass groups based on the lowest sex-specific quartile of the skeletal muscle mass index. During the entire follow-up period, 45 patients (7.2%) died and 56 patients (9.0%) experienced death-censored graft loss. Pretransplantation low skeletal muscle mass was independently associated with all-cause mortality (adjusted hazard ratio, 2.269; 95% confidence interval, 1.232-4.182). Low muscle mass was also associated with an increased risk of hospital readmission within 1 year after transplantation. Death-censored graft survival rates were comparable between the 2 groups. The low muscle group showed higher creatinine-based estimated glomerular filtration rates (eGFRs) than the normal muscle group. Although cystatin C-based eGFRs were measured in only one-third of the patients, they were comparable between the 2 groups. A low pretransplantation skeletal muscle mass index is associated with an increased risk of mortality and hospital readmission after KT.
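A common convention (assumed here, since the abstract does not spell it out) normalizes the L3 muscle area by height squared to obtain the skeletal muscle index (SMI), then flags the lowest sex-specific quartile; column names are illustrative.

```python
import pandas as pd

def flag_low_muscle(df: pd.DataFrame) -> pd.DataFrame:
    """SMI (cm^2/m^2) = L3 muscle cross-sectional area / height^2; the
    lowest sex-specific quartile is flagged as low muscle mass."""
    out = df.copy()
    out["smi"] = out["l3_muscle_area_cm2"] / out["height_m"] ** 2
    q1 = out.groupby("sex")["smi"].transform(lambda s: s.quantile(0.25))
    out["low_muscle_mass"] = out["smi"] < q1
    return out
```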
Subject(s)
Kidney Transplantation , Male , Female , Humans , Kidney Transplantation/methods , Follow-Up Studies , Cystatin C , Glomerular Filtration Rate , Graft Survival , Muscle, Skeletal , Transplant Recipients , Risk Factors
ABSTRACT
OBJECTIVE: To compare graft survival after living donor liver transplantation (LDLT) in patients receiving grafts with a graft-to-recipient weight ratio (GRWR) <0.8 versus ≥0.8 and to identify risk factors for graft loss with GRWR<0.8 grafts. SUMMARY BACKGROUND DATA: Favorable outcomes after LDLT using GRWR<0.8 grafts were recently reported; however, these results have not been validated using multicenter data. METHODS: This multicenter cohort study included 3450 LDLT patients. Graft survival was compared between 1:3 propensity score-matched groups and evaluated using various Cox models in the entire population. Risk factors for graft loss with GRWR<0.8 versus GRWR≥0.8 grafts were explored within various subgroups using interaction analyses, and outcomes were stratified according to the number of risk factors. RESULTS: In total, 368 patients (10.7%) received GRWR<0.8 grafts (GRWR<0.8 group), whereas 3082 (89.3%) received GRWR≥0.8 grafts (GRWR≥0.8 group). The 5-y graft survival rate was significantly lower with GRWR<0.8 grafts than with GRWR≥0.8 grafts (85.2% vs. 90.1%, P=0.013). The adjusted hazard ratio (HR) for graft loss with GRWR<0.8 grafts in the entire population was 1.66 (95% confidence interval [CI] 1.17-2.35, P=0.004). Risk factors exhibiting significant interactions with GRWR<0.8 for graft survival were age ≥60 y, MELD score ≥15, and male donor. When ≥2 risk factors were present, GRWR<0.8 grafts showed a higher risk of graft loss than GRWR≥0.8 grafts (HR 2.98, 95% CI 1.79-4.88, P<0.001). CONCLUSIONS: GRWR<0.8 grafts showed inferior graft survival compared with GRWR≥0.8 grafts (85.2% vs. 90.1%), especially when ≥2 risk factors for graft loss (age ≥60 y, MELD score ≥15, or male donor) were present.
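GRWR is conventionally graft weight divided by recipient body weight, expressed as a percentage. A minimal sketch of the computation and the 0.8% threshold examined in the study (illustrative, not the study's code):

```python
def grwr(graft_weight_g: float, recipient_weight_kg: float) -> float:
    """Graft-to-recipient weight ratio, in percent:
    graft weight (g) / recipient body weight (g) x 100."""
    return graft_weight_g / (recipient_weight_kg * 1000.0) * 100.0

# A 600 g right-lobe graft in a 75 kg recipient sits exactly at the
# GRWR = 0.8 threshold.
assert abs(grwr(600, 75) - 0.8) < 1e-9
```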
ABSTRACT
The safety of elderly living liver donors and their recipients' outcomes are of ongoing concern. In the present study, the effects of donor age on living donor liver transplantation were compared between two donor groups, one aged 60 years or older and one aged 50-59 years (the 60s and 50s donor groups, respectively), with respect to donor safety and recipient outcomes. We retrospectively identified 209 donors aged 50 years or older at 9 centers in Korea from 2005 to 2017. The 60s donor group represented 10% (n=21) of these donors. One case in each group was a left liver graft; the others were right liver grafts. Postoperative complications were more common in the 60s donor group, but the proportion of Clavien-Dindo grade III complications did not differ between the groups. No in-hospital donor mortality occurred, and no donor deaths were reported during the observation period. Postoperative total bilirubin was higher and hospitalization longer in recipients of the 60s donor group than in recipients of the 50s donor group. Although the cumulative overall survival of recipients in the 60s donor group was significantly lower than that of the 50s donor group, no difference was observed in graft survival. Multivariate analysis showed that increased living liver donor age, coexisting hepatocellular carcinoma (HCC), and increased intraoperative blood loss during the recipient operation were important predisposing factors for patient death. The present study suggests that highly selected elderly living donors (≥60 y) can donate safely, with similar recipient graft survival rates, although recipient overall survival is inferior to that of the 50s donor group.
Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Liver Transplantation , Humans , Aged , Middle Aged , Child , Liver Transplantation/adverse effects , Living Donors , Retrospective Studies , Carcinoma, Hepatocellular/etiology , Liver Neoplasms/etiology , Republic of Korea/epidemiology , Graft Survival , Treatment Outcome
ABSTRACT
Considerable controversy exists regarding the superiority of tenofovir disoproxil fumarate (TDF) over entecavir (ETV) for reducing the risk of hepatocellular carcinoma (HCC). This study aimed to compare outcomes of ETV versus TDF after liver transplantation (LT) in patients with HBV-related HCC. We performed a multicenter observational study using data from the Korean Organ Transplantation Registry. A total of 845 patients who underwent LT for HBV-related HCC were divided into 2 groups according to the oral nucleos(t)ide analogue used for HBV prophylaxis post-LT: an ETV group (n = 393) and a TDF group (n = 452). HCC recurrence and overall death were compared in naïve and propensity score (PS)-weighted populations, and the likelihood of these outcomes according to the use of ETV or TDF was analyzed with various Cox models. At 1, 3, and 5 years, the ETV and TDF groups had similar HCC recurrence-free survival (90.7%, 85.6%, and 84.1% vs. 90.9%, 84.6%, and 84.2%, respectively, p = 0.98) and overall survival (98.4%, 94.7%, and 93.5% vs. 99.3%, 95.8%, and 94.9%, respectively, p = 0.48). The PS-weighted population showed similar results. In Cox models involving covariate adjustment, PS weighting, competing risk regression, and time-dependent covariate adjustment, both groups showed a similar risk of HCC recurrence and overall death. In subgroup analyses stratified according to HCC burden (Milan criteria, Up-to-7 criteria, French alpha-fetoprotein risk score), pretransplantation locoregional therapy, and salvage LT, neither ETV nor TDF was superior. In conclusion, ETV and TDF showed mutual noninferiority for HCC outcomes when used for HBV prophylaxis after LT.
Subject(s)
Carcinoma, Hepatocellular , Hepatitis B, Chronic , Hepatitis B , Liver Neoplasms , Liver Transplantation , Humans , Tenofovir/therapeutic use , Antiviral Agents/therapeutic use , Liver Transplantation/adverse effects , Carcinoma, Hepatocellular/epidemiology , Hepatitis B, Chronic/complications , Hepatitis B, Chronic/diagnosis , Hepatitis B, Chronic/drug therapy , Treatment Outcome , Liver Neoplasms/epidemiology , Hepatitis B/complications , Hepatitis B/diagnosis , Hepatitis B/drug therapy , Hepatitis B virus
ABSTRACT
BACKGROUND: Statins have been reported to reduce overall death and hepatocellular carcinoma (HCC) recurrence in liver transplantation (LT) recipients. However, previous retrospective studies are significantly flawed by immortal time bias. METHODS: Using data from 658 patients who received LT for HCC, we matched 140 statin users with statin nonusers in a 1:2 ratio at the time of first statin administration after LT using exposure density sampling (EDS). The propensity score, calculated using baseline variables (including explant pathology), was used for EDS to balance the two groups. HCC recurrence and overall death were compared after adjusting for information available at the time of sampling. RESULTS: Among statin users, the median time to statin initiation was 219 (IQR 98-570) days, and statin intensity was mainly moderate (87.1%). Statin users and nonusers sampled using EDS showed well-balanced baseline characteristics, including detailed tumour pathology, and similar HCC recurrence, with cumulative incidences of 11.3% and 11.8% at 5 years, respectively (p = .861). In multivariate Cox models (HR 1.04, p = .918) and subgroup analyses, statins did not affect HCC recurrence. Conversely, statin users showed a significantly lower risk of overall death than nonusers (HR 0.28, p < .001). The type and intensity of statin usage did not differ between statin users who experienced HCC recurrence and those who did not. CONCLUSION: After controlling for immortal time bias with EDS, statins did not affect HCC recurrence but reduced mortality after LT. Statin usage is encouraged for its survival benefit but not for preventing HCC recurrence in LT recipients.
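Exposure density sampling avoids immortal time bias by matching each statin initiator, at the moment of initiation, to patients still at risk and statin-free at that time. A minimal sketch under assumed column names (`ps` being the baseline propensity score used for nearest-neighbour selection); not the authors' code.

```python
import pandas as pd

def eds_match(df: pd.DataFrame, ratio: int = 2):
    """1:2 exposure density sampling: controls are drawn from patients
    still followed and statin-free at each user's start time, nearest in
    baseline propensity score."""
    matched = []
    users = df[df["statin_start_day"].notna()].sort_values("statin_start_day")
    for _, u in users.iterrows():
        t = u["statin_start_day"]
        risk = df[
            (df["id"] != u["id"])
            & (df["followup_day"] >= t)
            & (df["statin_start_day"].isna() | (df["statin_start_day"] > t))
        ].copy()
        risk["dist"] = (risk["ps"] - u["ps"]).abs()
        controls = risk.nsmallest(ratio, "dist")
        matched.append((u["id"], controls["id"].tolist()))
    return matched
```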
Subject(s)
Carcinoma, Hepatocellular , Hydroxymethylglutaryl-CoA Reductase Inhibitors , Illusions , Liver Neoplasms , Liver Transplantation , Humans , Hydroxymethylglutaryl-CoA Reductase Inhibitors/therapeutic use , Retrospective Studies , Neoplasm Recurrence, Local/epidemiology
ABSTRACT
BACKGROUND: Metabolic syndrome (MetS) is prevalent in patients with end-stage kidney disease, and kidney transplantation is expected to modify metabolic status. However, whether changes in metabolic status at the time of transplantation affect recipient outcomes remains unclear. METHODS: We analyzed 4187 recipients registered in a nationwide prospective cohort from 2014 to 2020. MetS was defined as the presence of three or more metabolic syndrome components. Patients were classified based on pre- and post-transplant MetS status: MetS-free, MetS-developed, MetS-recovered and MetS-persistent. Study outcomes were the occurrence of death-censored graft loss and a composite of cardiovascular events and death. RESULTS: Among recipients without pre-transplant MetS, 19.6% (419/2135) developed post-transplant MetS, and MetS disappeared in 38.7% (794/2052) of the recipients with pre-transplant MetS. Among the four groups, the MetS-developed group showed the worst graft survival rate, and the MetS-persistent group had the poorest composite event-free survival rate. Compared with the MetS-free group, the MetS-developed group was associated with an increased risk of graft loss [adjusted hazard ratio (aHR) 2.35; 95% confidence interval (CI) 1.17-4.98], and the risk of graft loss increased with the number of dysfunctional MetS components. Persistent MetS was associated with increased risks of cardiovascular events and death (aHR 2.46; 95% CI 1.12-5.63), but changes in the number of dysfunctional MetS components were not. CONCLUSION: Kidney transplantation significantly alters metabolic status. Newly developed MetS after transplantation was associated with an increased risk of graft loss, whereas persistent MetS before and after transplantation was associated with increased risks of cardiovascular events and death.
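A sketch of the ≥3-of-5 MetS rule and the four transition groups. The component thresholds follow the harmonized criteria, which is an assumption since the abstract does not list them; waist cutoffs are population-specific and left as a precomputed flag.

```python
def has_mets(waist_high: bool, tg: float, hdl: float, male: bool,
             sbp: float, dbp: float, glucose: float,
             bp_tx: bool = False, lipid_tx: bool = False,
             dm_tx: bool = False) -> bool:
    """MetS = >= 3 of 5 components present (assumed harmonized thresholds)."""
    components = [
        waist_high,                          # elevated waist circumference
        tg >= 150 or lipid_tx,               # triglycerides, mg/dL
        hdl < (40 if male else 50),          # HDL cholesterol, mg/dL
        sbp >= 130 or dbp >= 85 or bp_tx,    # blood pressure, mmHg
        glucose >= 100 or dm_tx,             # fasting glucose, mg/dL
    ]
    return sum(components) >= 3

def transition_group(pre_tx: bool, post_tx: bool) -> str:
    """Map pre/post-transplant MetS status to the four study groups."""
    return {(False, False): "MetS-free", (False, True): "MetS-developed",
            (True, False): "MetS-recovered",
            (True, True): "MetS-persistent"}[(pre_tx, post_tx)]
```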
Subject(s)
Cardiovascular Diseases , Kidney Transplantation , Metabolic Syndrome , Humans , Metabolic Syndrome/epidemiology , Metabolic Syndrome/etiology , Kidney Transplantation/adverse effects , Prospective Studies , Risk Factors , Graft Survival , Cardiovascular Diseases/epidemiology , Cardiovascular Diseases/etiology
ABSTRACT
OBJECTIVES: We aimed to investigate the optimal radiologic method for determining Milan criteria (MC) status for the prediction of recurrence in patients who underwent locoregional treatment (LRT) for hepatocellular carcinoma (HCC) and subsequent liver transplantation (LT). METHODS: This retrospective study included 121 HCC patients who underwent LRT and had both liver dynamic CT and MRI. They were classified against the MC using four cross combinations of two imaging modalities (CT and MRI) and two diagnostic criteria (modified Response Evaluation Criteria in Solid Tumors [mRECIST] and the Liver Imaging Reporting and Data System treatment response algorithm [LI-RADS TRA]). Competing risk regression was performed to analyze time to recurrence after LT. The predictive ability of the four methods for recurrence was evaluated using the time-dependent area under the curve (AUC). RESULTS: Competing risk regression analyses found that exceeding the MC, as determined by MRI with mRECIST, was independently associated with recurrence (hazard ratio, 6.926; p = 0.001). With mRECIST, MRI showed significantly higher AUCs than CT at 3 and 5 years after LT (CT 0.597 vs. MRI 0.756, p = 0.012 at 3 years; CT 0.588 vs. MRI 0.733, p = 0.024 at 5 years). Against the pathologic reference standard, MRI with LI-RADS TRA showed higher sensitivity (61.5%) than CT with LI-RADS TRA (30.8%, p < 0.001) or MRI with mRECIST (38.5%, p < 0.001). CONCLUSIONS: MRI with mRECIST was the optimal radiologic method for determining MC status for the prediction of post-LT recurrence in HCC patients with prior LRT. KEY POINTS: • MRI with modified RECIST (mRECIST) is the optimal preoperative method to determine Milan criteria for the prediction of post-transplant HCC recurrence in patients with prior locoregional treatment. • With mRECIST, MRI was better than CT for predicting post-transplant recurrence.
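Time-dependent AUCs at fixed horizons, like the 3- and 5-year values above, can be computed with inverse-probability-of-censoring weighting, e.g., via scikit-survival. The sketch below is illustrative: the array names and the binary beyond-MC risk score are assumptions, and it handles ordinary censoring only, not the competing-risk adjustment the authors also applied.

```python
import numpy as np
from sksurv.metrics import cumulative_dynamic_auc
from sksurv.util import Surv

def auc_at_3_and_5_years(recurred: np.ndarray, years: np.ndarray,
                         risk_score: np.ndarray) -> dict:
    """Time-dependent AUC at 3 and 5 years after LT; risk_score could be
    the beyond-MC indicator (0/1) from MRI with mRECIST."""
    y = Surv.from_arrays(event=recurred.astype(bool), time=years)
    times = np.array([3.0, 5.0])
    auc, _mean_auc = cumulative_dynamic_auc(y, y, risk_score, times)
    return dict(zip(times.tolist(), auc.tolist()))
```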
Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Liver Transplantation , Humans , Carcinoma, Hepatocellular/diagnostic imaging , Carcinoma, Hepatocellular/therapy , Liver Neoplasms/diagnostic imaging , Liver Neoplasms/therapy , Retrospective Studies , Response Evaluation Criteria in Solid Tumors
ABSTRACT
Carbapenem-resistant Acinetobacter baumannii bacteremia (CRAB-B) is a fatal infectious complication of liver transplantation (LT). This study investigated the incidence, effects, and risk factors associated with CRAB-B during the early post-LT period. Among 1051 eligible LT recipients, 29 experienced CRAB-B within 30 days of LT, for a cumulative incidence of 2.7%. In the patients with CRAB-B (n = 29) and controls (n = 145) matched in a nested case-control design, the cumulative incidence of death on days 5, 10, and 30 from the index date was 58.6%, 65.5%, and 65.5% versus 2.1%, 2.8%, and 4.2%, respectively (p < .001). Pre-transplant MELD (OR 1.11, 95% confidence interval [CI] 1.04-1.19, p = .002), severe encephalopathy (OR 4.62, 95% CI 1.24-18.61, p = .025), donor body mass index (OR 0.57, 95% CI 0.41-0.75, p < .001), and reoperation (OR 6.40, 95% CI 1.19-36.82, p = .032) were independent risk factors for 30-day CRAB-B. CRAB-B carried extremely high mortality within 30 days after LT, especially within 5 days of its occurrence. Therefore, assessment of risk factors and early detection of CRAB, followed by proper treatment, are necessary to control CRAB-B after LT.
Subject(s)
Acinetobacter Infections , Acinetobacter baumannii , Bacteremia , Liver Transplantation , Humans , Carbapenems/therapeutic use , Anti-Bacterial Agents/therapeutic use , Incidence , Liver Transplantation/adverse effects , Acinetobacter Infections/drug therapy , Acinetobacter Infections/epidemiology , Acinetobacter Infections/etiology , Bacteremia/drug therapy , Bacteremia/epidemiology , Bacteremia/etiology , Risk Factors
ABSTRACT
Patients with end-stage kidney disease (ESKD) and a previous acute myocardial infarction (AMI) have less access to kidney transplantation (KT). Data on ESKD patients with an AMI history who underwent first KT or dialysis between January 2007 and December 2018 were extracted from the Korean National Health Insurance Service. Patients who underwent KT (n = 423) were chronologically matched in a 1:3 ratio with those maintained on dialysis (n = 1,269) at the corresponding dates, based on time-conditional propensity scores. The 1-, 5-, and 10-year cumulative incidences of all-cause mortality were 12.6%, 39.1%, and 60.1% in the dialysis group and 3.1%, 7.2%, and 14.5% in the KT group. Adjusted hazard ratios (HRs) for KT versus dialysis were 0.17 (95% confidence interval [CI], 0.12-0.24; p < 0.001) for mortality and 0.38 (95% CI, 0.23-0.51; p < 0.001) for major adverse cardiovascular events (MACE). Of the MACE components, KT was most protective against cardiovascular death (HR, 0.23; 95% CI, 0.12-0.42; p < 0.001). The protective effects of KT on all-cause mortality and MACE were consistent across various subgroups, including patients at higher risk (e.g., age >65 years, recent AMI [<6 months], congestive heart failure). KT is associated with lower all-cause mortality and MACE than maintenance dialysis in patients with a prior AMI.
Subject(s)
Heart Failure , Kidney Failure, Chronic , Kidney Transplantation , Myocardial Infarction , Humans , Aged , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/surgery , Myocardial Infarction/surgery , Renal Dialysis
ABSTRACT
BACKGROUND: The model for end-stage liver disease 3.0 (MELD3.0) is expected to address the flaws of the current allocation system for deceased donor liver transplantation (DDLT). We aimed to validate MELD3.0 in the Korean population, where living donor liver transplantation is predominant due to organ shortages. METHODS: Large-volume single-center Korean waitlist data were merged with Korean Network for Organ Sharing (KONOS) data. Ninety-day mortality was compared between MELD and MELD3.0 using the C-index in 2,353 eligible patients registered for liver transplantation. Patient numbers and outcomes were compared based on changes in KONOS-MELD categorization under MELD3.0. Possible gains in MELD points and reductions in waitlist mortality were analyzed. RESULTS: MELD3.0 performed better than MELD (C-index 0.893 for MELD3.0 vs. 0.889 for MELD). When stratified according to the KONOS-MELD categories, 15.9% of all patients and 35.2% of the deceased patients were up-categorized using MELD3.0 versus MELD. The mean gain in MELD points was higher in women (2.6 ± 2.1) than men (2.1 ± 1.9, P < 0.001), and higher in patients with severe ascites (3.3 ± 1.8) than in controls (1.9 ± 1.8, P < 0.001); however, this trend was not significant when the MELD score was higher than 30. When the possible increase in DDLT chances from up-categorization under MELD3.0 was calculated, the reducible waitlist mortality was 2.7%. CONCLUSION: MELD3.0 predicted waitlist mortality better than MELD; however, its merit for women and patients with severe ascites is uncertain, and the reduction in waitlist mortality from implementing MELD3.0 is limited in regions suffering from organ shortage, such as Korea.
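For reference, MELD3.0 as published by Kim et al. adds a female-sex term and albumin terms (with interactions) to the MELD components; the 1.33-point female term and the albumin terms help explain the higher point gains observed in women and in patients with severe ascites, who tend to have low albumin. The sketch below is an illustration of the published formula, not the study's code; inputs are bilirubin and creatinine in mg/dL, sodium in mmol/L, and albumin in g/dL.

```python
import math

def meld_3_0(female: bool, bilirubin: float, sodium: float, inr: float,
             creatinine: float, albumin: float,
             on_dialysis: bool = False) -> int:
    """MELD3.0 with the published variable bounds; `on_dialysis` means
    >=2 dialysis sessions (or 24 h of CVVHD) in the prior week."""
    bili = max(bilirubin, 1.0)
    inr = max(inr, 1.0)
    na = min(max(sodium, 125.0), 137.0)
    alb = min(max(albumin, 1.5), 3.5)
    cr = 3.0 if on_dialysis else min(max(creatinine, 1.0), 3.0)
    score = (
        1.33 * female
        + 4.56 * math.log(bili)
        + 0.82 * (137 - na)
        - 0.24 * (137 - na) * math.log(bili)
        + 9.09 * math.log(inr)
        + 11.14 * math.log(cr)
        + 1.85 * (3.5 - alb)
        - 1.83 * (3.5 - alb) * math.log(cr)
        + 6.0
    )
    return min(round(score), 40)  # capped at 40, as with MELD
```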
Subject(s)
End Stage Liver Disease , Liver Transplantation , Tissue and Organ Procurement , Male , Humans , Female , End Stage Liver Disease/surgery , Ascites , Living Donors , Severity of Illness Index
ABSTRACT
OBJECTIVE: The purpose of this study was to evaluate the psychometric properties, including content validity, validity of the multiple-choice items, and reliability, of the Korean version of the Pressure Ulcer Knowledge Assessment Tool (K-PUKAT 2.0), using classical test theory (CTT) and item response theory (IRT). METHOD: A linguistic validation process and factor analysis were conducted among wound care nurses, staff nurses and nursing students. Items were analysed according to CTT and IRT using a two-parameter logistic model. Intraclass correlation coefficients were used to examine reliability. RESULTS: A total of 378 wound care nurses, staff nurses and nursing students participated in this study. While most items showed moderate difficulty based on CTT, difficulty index values based on IRT were more broadly distributed (low: 5 items; moderate: 9 items; high: 1 item). The intraclass correlation coefficient for the K-PUKAT 2.0 was 0.72. CONCLUSION: The K-PUKAT 2.0 is concise and demonstrated good psychometric properties. Based on the results of this study, repeated use of the K-PUKAT 2.0 will not only help distinguish whether an individual has sufficient clinical knowledge, but will also play a key role in supporting learning.
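In the two-parameter logistic (2PL) model, the probability that a respondent with ability θ answers an item correctly is P(θ) = 1 / (1 + exp(−a(θ − b))), where a is the item's discrimination and b its difficulty; the difficulty index values reported above are the b estimates. A minimal illustration with made-up parameter values:

```python
import math

def p_correct_2pl(theta: float, a: float, b: float) -> float:
    """2PL IRT: P(correct) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An average-ability respondent (theta = 0) facing an item of matching
# difficulty (b = 0) answers correctly half the time, regardless of a.
assert abs(p_correct_2pl(0.0, 1.2, 0.0) - 0.5) < 1e-12
```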
Subject(s)
Pressure Ulcer , Humans , Psychometrics , Pressure Ulcer/diagnosis , Reproducibility of Results , Surveys and Questionnaires , Republic of Korea
ABSTRACT
OBJECTIVE: The liver acts as a frontline barrier against diverse gut-derived pathogens, and the sinusoid is the primary site of liver immune surveillance. However, little is known about liver sinusoidal immune cells in the context of chronic liver disease (CLD). Here, we investigated the antibacterial capacity of liver sinusoidal γδ T cells in patients with various CLDs. DESIGN: We analysed the frequency, phenotype and functions of human liver sinusoidal γδ T cells from healthy donors and recipients with CLD, including HBV-related CLD (liver cirrhosis (LC) and/or hepatocellular carcinoma (HCC)), alcoholic LC and LC or HCC of other aetiologies, by flow cytometry and RNA-sequencing using liver perfusates obtained during living donor liver transplantation. We also measured the plasma levels of D-lactate and bacterial endotoxin to evaluate bacterial translocation. RESULTS: The frequency of liver sinusoidal Vγ9+Vδ2+ T cells was reduced in patients with CLD. Immunophenotypic and transcriptomic analyses revealed that liver sinusoidal Vγ9+Vδ2+ T cells from patients with CLD were persistently activated and pro-apoptotic. In addition, liver sinusoidal Vγ9+Vδ2+ T cells from patients with CLD showed significantly decreased interferon (IFN)-γ production following stimulation with bacterial metabolites and Escherichia coli. The antibacterial IFN-γ response of liver sinusoidal Vγ9+Vδ2+ T cells significantly correlated with liver function, and inversely correlated with the plasma level of D-lactate in patients with CLD. Repetitive in vitro stimulation with E. coli induced activation, apoptosis and functional impairment of liver sinusoidal Vγ9+Vδ2+ T cells. CONCLUSION: Liver sinusoidal Vγ9+Vδ2+ T cells are functionally impaired in patients with CLD. Bacterial translocation and decreasing liver functions are associated with functional impairment of liver sinusoidal Vγ9+Vδ2+ T cells.
Subject(s)
Liver Diseases/immunology , Liver Diseases/pathology , T-Lymphocytes/physiology , Case-Control Studies , Chronic Disease , Endotoxins/blood , Escherichia coli/physiology , Female , Humans , Lactic Acid/blood , Liver Diseases/blood , Liver Transplantation , Male
ABSTRACT
BACKGROUND & AIMS: The liver provides a unique niche of lymphocytes enriched with a large proportion of innate-like T cells. However, the heterogeneity and functional characteristics of the hepatic T-cell population remain to be fully elucidated. METHODS: We obtained liver sinusoidal mononuclear cells from the liver perfusate of healthy donors and recipients with HBV-associated chronic liver disease (CLD) during liver transplantation. We performed a CITE-seq analysis of liver sinusoidal CD45+ cells in combination with T cell receptor (TCR)-seq and flow cytometry to examine the phenotypes and functions of liver sinusoidal CD8+ T cells. RESULTS: We identified a distinct CD56hiCD161-CD8+ T-cell population characterized by natural killer (NK)-related gene expression and a uniquely restricted TCR repertoire. The frequency of these cells among the liver sinusoidal CD8+ T-cell population was significantly increased in patients with HBV-associated CLD. Although CD56hiCD161-CD8+ T cells exhibit weak responsiveness to TCR stimulation, CD56hiCD161-CD8+ T cells highly expressed various NK receptors, including CD94, killer immunoglobulin-like receptors, and NKG2C, and exerted NKG2C-mediated NK-like effector functions even in the absence of TCR stimulation. In addition, CD56hiCD161-CD8+ T cells highly respond to innate cytokines, such as IL-12/18 and IL-15, in the absence of TCR stimulation. We validated the results from liver sinusoidal CD8+ T cells using intrahepatic CD8+ T cells obtained from liver tissues. CONCLUSIONS: In summary, the current study found a distinct CD56hiCD161-CD8+ T-cell population characterized by NK-like activation via TCR-independent NKG2C ligation. Further studies are required to elucidate the roles of liver sinusoidal CD56hiCD161-CD8+ T cells in immune responses to microbial pathogens or liver immunopathology. LAY SUMMARY: The role of different immune cell populations in the liver is becoming an area of increasing interest. Herein, we identified a distinct T-cell population that had features similar to those of natural killer (NK) cells - a type of innate immune cell. This distinct population was expanded in the livers of patients with chronic liver disease and could thus have pathogenic relevance.
Subject(s)
CD8-Positive T-Lymphocytes , Interleukin-15 , Immunoglobulins , Interleukin-12 , Liver , Receptors, Antigen, T-Cell
ABSTRACT
OBJECTIVE: To investigate the feasibility and safety of robotic living-donor right hepatectomy (RLDRH). SUMMARY OF BACKGROUND DATA: Data on minimally invasive living-donor right hepatectomy, especially RLDRH, from a relatively large donor cohort have not yet been reported. METHODS: From March 2016 to March 2019, 52 liver donors underwent RLDRH. Their clinical and perioperative outcomes were compared with those of conventional open donor right hepatectomy (CODRH, n = 62) and laparoscopy-assisted donor right hepatectomy (LADRH, n = 118). Donor satisfaction with cosmetic results was compared between RLDRH and LADRH using a body image questionnaire. RESULTS: Although RLDRH was associated with a longer operative time (minutes) (RLDRH, 493.6; CODRH, 404.4; LADRH, 355.9; P < 0.001), mean estimated blood loss (mL) was significantly lower (RLDRH, 109.8; CODRH, 287.1; LADRH, 265.5; P = 0.001). Postoperative complication rates were similar among the 3 groups (RLDRH, 23.1%; CODRH, 35.5%; LADRH, 28.0%; P = 0.420). Regarding donor satisfaction, body image and cosmetic appearance scores were significantly higher for RLDRH than for LADRH. After propensity score matching, RLDRH showed less estimated blood loss than CODRH (RLDRH, 114.7 mL; CODRH, 318.4 mL; P < 0.001), but complication rates remained similar among the three groups (P = 0.748). CONCLUSIONS: RLDRH resulted in less blood loss than CODRH and postoperative complication rates similar to those of CODRH and LADRH. RLDRH provided better body image and cosmetic results than LADRH. RLDRH is feasible and safe when performed by surgeons experienced in both robotic and open hepatectomy.