ABSTRACT
Predicting disease progression at the initial stage, so that early intervention and treatment can be implemented, can effectively prevent further deterioration of the condition. Traditional methods for medical data analysis usually perform poorly because they cannot mine the correlation patterns among pathogenic factors. Therefore, many computational methods have been drawn from the field of deep learning. In this study, we propose a novel influence hypergraph convolutional generative adversarial network (IHGC-GAN) method for disease risk prediction. First, a hypergraph is constructed with genes and brain regions as nodes. Then, an influence transmission model is built to portray the associations between nodes and the transmission rule of disease information. Third, an IHGC-GAN method is constructed based on this model. The method innovatively combines a graph convolutional network (GCN) with a GAN: the GCN serves as the generator in the GAN to spread and update the lesion information of nodes in the brain region-gene hypergraph. Finally, the prediction accuracy of the method is improved by the mutual competition and repeated iteration between the generator and discriminator. The method can not only capture the evolutionary pattern from early mild cognitive impairment (EMCI) to late MCI (LMCI) but also extract the pathogenic factors and predict the deterioration risk from EMCI to LMCI. The results on two datasets indicate that IHGC-GAN achieves better prediction performance than state-of-the-art methods on a variety of indicators.
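The hypergraph convolution underlying a method like this has a standard closed form. The sketch below (PyTorch) shows one propagation step over a brain region-gene incidence matrix; all tensor shapes and names here are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of one hypergraph-convolution step: node features are
# aggregated through hyperedges and then transformed. Illustrative only.
import torch

def hypergraph_conv(X, H, W):
    """One step: X' = Dv^-1/2 H De^-1 H^T Dv^-1/2 X W.

    X: (n_nodes, in_dim) node features (brain regions and genes).
    H: (n_nodes, n_edges) binary incidence matrix of the hypergraph.
    W: (in_dim, out_dim) learnable weight matrix.
    """
    Dv = H.sum(dim=1).clamp(min=1)           # node degrees
    De = H.sum(dim=0).clamp(min=1)           # hyperedge degrees
    Dv_inv_sqrt = torch.diag(Dv.pow(-0.5))
    De_inv = torch.diag(De.pow(-1.0))
    A = Dv_inv_sqrt @ H @ De_inv @ H.t() @ Dv_inv_sqrt
    return torch.relu(A @ X @ W)

# Toy usage: 6 nodes (regions/genes), 3 hyperedges, 4-d input features.
X = torch.randn(6, 4)
H = torch.tensor([[1., 0., 0.], [1., 1., 0.], [0., 1., 0.],
                  [0., 1., 1.], [0., 0., 1.], [1., 0., 1.]])
W = torch.randn(4, 8, requires_grad=True)
out = hypergraph_conv(X, H, W)   # (6, 8) updated node representations
```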
Subject(s)
Cognitive Dysfunction, Brain, Cognitive Dysfunction/genetics, Diagnostic Imaging, Disease Progression, Humans
ABSTRACT
Imaging genetics provides unique insights into the pathological study of complex brain diseases by integrating the characteristics of multi-level medical data. However, most current imaging genetics research performs only incomplete data fusion, and effective deep learning methods for jointly analyzing neuroimaging and genetic data are lacking. Therefore, this paper first constructs brain region-gene networks to intuitively represent the association patterns of pathogenetic factors. Second, a novel feature information aggregation model is constructed to accurately describe the information aggregation process among brain region nodes and gene nodes. Finally, a deep learning method called feature information aggregation and diffusion generative adversarial network (FIAD-GAN) is proposed to efficiently classify samples and select features. We focus on improving the generator with the proposed convolution and deconvolution operations, which dramatically improve the interpretability of the deep learning framework. The experimental results indicate that FIAD-GAN not only achieves superior results on various disease classification tasks but also extracts brain regions and genes closely related to Alzheimer's disease (AD). This work provides a novel method for intelligent clinical decision-making, and the associated biomedical discoveries provide a reliable reference and technical basis for the clinical diagnosis, treatment and pathological analysis of disease.
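For readers unfamiliar with the adversarial setup that FIAD-GAN builds on, the following minimal PyTorch loop shows the generator-discriminator competition in its generic form; the tiny MLPs are stand-ins assumed for illustration and do not reproduce the paper's aggregation/diffusion generator.

```python
# Generic GAN training loop: the generator learns to produce features the
# discriminator cannot tell from real ones. Sizes and networks are toy values.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 32))   # generator
D = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))    # discriminator
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(200):
    real = torch.randn(32, 32)   # stand-in for real multi-modal features
    z = torch.randn(32, 16)      # latent input to the generator
    fake = G(z)

    # Discriminator step: separate real features from generated ones.
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: fool the discriminator.
    g_loss = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```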
Subject(s)
Brain Diseases, Neuroimaging, Humans, Neuroimaging/methods, Brain/diagnostic imaging, Brain Diseases/diagnostic imaging, Brain Diseases/genetics
ABSTRACT
Paradoxical embolism is a medical condition characterized by the migration of an embolus from a venous source into the systemic circulation. This occurs through a specific cardiac abnormality known as a right-to-left shunt and can ultimately result in arterial embolism. Patent foramen ovale (PFO) is the most common cause of intracardiac shunting. We report a rare case of a 56-year-old man on hemodialysis with PFO and arteriovenous fistula dysfunction who suffered a paradoxical embolic ischemic stroke after percutaneous transluminal angioplasty. This case emphasizes the potential risk of paradoxical embolism in hemodialysis patients with vascular access problems. We aim to highlight the importance of searching for a PFO, as it may serve as a possible source of embolism in these patients.
Subject(s)
Angioplasty, Paradoxical Embolism, Renal Dialysis, Humans, Male, Middle Aged, Renal Dialysis/adverse effects, Paradoxical Embolism/etiology, Paradoxical Embolism/diagnosis, Embolic Stroke/etiology, Kidney Failure, Chronic/therapy, Kidney Failure, Chronic/complications, Foramen Ovale, Patent/complications, Foramen Ovale, Patent/therapy, Arteriovenous Shunt, Surgical/adverse effects
ABSTRACT
Malnutrition is a highly prevalent complication in patients with traumatic brain injury (TBI) and is closely related to prognosis, so accurate identification of patients at high risk of malnutrition is essential. We therefore analyzed the risk factors for malnutrition in patients with TBI and developed a model to predict malnutrition risk. Data on 345 patients with TBI were collected retrospectively, and the patients were divided into malnutrition and comparison groups according to whether malnutrition occurred. Univariate and multivariate logistic regression (forward stepwise) analyses were used to identify significant predictors of malnutrition, from which a predictive model was developed. The model's discrimination, calibration, and clinical utility were evaluated using the receiver operating characteristic (ROC) curve, calibration plots, and decision curve analysis (DCA). A total of 216 patients (62.6%) developed malnutrition. Multivariate logistic regression analysis showed that pulmonary infection, urinary tract infection, dysphagia, use of a nasogastric tube (NGT), Glasgow Coma Scale (GCS) score ≤ 8, and a low activities of daily living (ADL) score were independent risk factors for malnutrition in patients with TBI (P < 0.05). The area under the curve of the model was 0.947, and calibration plots showed good calibration. DCA showed that the nomogram was clinically useful when nutritional interventions were performed over a considerable range of threshold probabilities (0-0.98). Malnutrition is widespread in patients with TBI, and the nomogram is a good predictor of whether patients will develop it.
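Decision curve analysis reduces to a simple net-benefit formula, NB(t) = TP/n − (FP/n) · t/(1−t). A minimal sketch, assuming synthetic outcome and risk vectors rather than the study's data:

```python
# Generic decision-curve net benefit; `y` (1 = malnutrition) and `p`
# (predicted risk) are synthetic placeholders, not the authors' data.
import numpy as np

def net_benefit(y, p, threshold):
    """Net benefit of treating patients with predicted risk >= threshold."""
    n = len(y)
    treat = p >= threshold
    tp = np.sum(treat & (y == 1))   # true positives among treated
    fp = np.sum(treat & (y == 0))   # false positives among treated
    return tp / n - fp / n * threshold / (1 - threshold)

y = np.array([1, 0, 1, 1, 0, 0, 1, 0])
p = np.array([0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.8, 0.3])
for t in (0.2, 0.5, 0.8):
    print(f"threshold {t:.1f}: net benefit {net_benefit(y, p, t):.3f}")
```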
ABSTRACT
INTRODUCTION: This study aimed to evaluate the characteristics of and prognostic factors for coronavirus disease 2019 (COVID-19) patients on maintenance hemodialysis (HD). METHODS: All admitted HD patients who were infected with SARS-CoV-2 from December 1, 2022, to January 31, 2023, were included. Patients with pneumonia were further classified as having mild, moderate, severe, or critical illness. Clinical symptoms, laboratory results, radiologic findings, treatment, and clinical outcomes were collected. Independent risk factors for progression to critical disease and for in-hospital mortality were determined by multivariate regression analysis. Receiver operating characteristic analysis with the area under the curve was used to evaluate the predictive performance for developing critical status and in-hospital mortality. RESULTS: A total of 182 COVID-19 patients on HD were included, with an average age of 61.55 years. Of these, 84 (46.1%) patients did not have pneumonia and 98 (53.8%) did. Among patients with pneumonia, 48 (49.0%) had moderate illness, 26 (26.5%) severe illness, and 24 (24.5%) critical illness. Older age [HR (95% CI): 1.07 (1.01-1.13), p < 0.01] and increased levels of lactate dehydrogenase (LDH) [1.01 (1.003-1.01), p < 0.01] and C-reactive protein (CRP) [1.01 (1.00-1.01), p = 0.04] were risk factors for developing critical illness. Older age [1.11 (1.03-1.19), p = 0.01] and increased procalcitonin (PCT) [1.07 (1.02-1.12), p = 0.01] and LDH levels [1.004 (1-1.01), p = 0.03] were associated with increased risk of in-hospital mortality. CONCLUSION: Age, CRP, PCT, and LDH can be used to predict negative clinical outcomes for HD patients with COVID-19 pneumonia.
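The ROC analysis used here is straightforward to reproduce in outline; the sketch below scores a single hypothetical marker (synthetic LDH values, assumed purely for illustration) against an outcome vector.

```python
# Evaluate how well one marker discriminates critical from non-critical cases.
# All values are synthetic and illustrative only.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

critical = np.array([0, 0, 1, 0, 1, 1, 0, 1])             # hypothetical outcomes
ldh = np.array([210, 250, 480, 230, 520, 390, 260, 610])  # hypothetical LDH (U/L)

auc = roc_auc_score(critical, ldh)
fpr, tpr, thresholds = roc_curve(critical, ldh)
print(f"AUC = {auc:.2f}")   # area under the ROC curve
```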
Subject(s)
COVID-19, Pneumonia, Humans, Aged, Middle Aged, SARS-CoV-2, COVID-19/complications, COVID-19/therapy, Prognosis, Critical Illness, Retrospective Studies, C-Reactive Protein/analysis, China/epidemiology
ABSTRACT
The potential ecotoxicity of nanoparticles, particularly the influence of environmental factors on nanoparticles that can be transferred through the food chain, remains poorly understood. Here, we assessed the transfer behavior and ecotoxicity of commercially manufactured graphene oxide nanomaterials (GO, <100 nm) along a food chain spanning from Escherichia coli (E. coli) to Caenorhabditis elegans (C. elegans) under simulated environmental conditions. Our findings revealed that E. coli took up GO and subsequently transferred it to C. elegans, with a discernible distribution of GO observed in the digestive and reproductive systems. Accumulated GO generated serious ecological consequences at the higher level of the food chain (C. elegans). More importantly, GO and the resulting injury to germ cells could be transferred to the next generation, indicating that GO exposure can cause genetic damage across generations. Previous research has demonstrated that GO can induce degradation of both the inner and outer cell membranes of E. coli, which is then transmitted to C. elegans through the food chain. Additionally, fulvic acid (FA) possesses various functional groups that enable interaction with nanomaterials. Our findings indicated that these interactions could mitigate the ecotoxicity caused by GO exposure via food delivery, and this approach could be extended to modify GO in a way that significantly reduces its toxic effects without compromising performance. These results highlight how environmental factors can attenuate the ecological risks associated with nanomaterial transmission through the food chain.
Subject(s)
Benzopyrans, Graphite, Nanoparticles, Animals, Caenorhabditis elegans, Escherichia coli/genetics, Escherichia coli/metabolism, Nanoparticles/toxicity, Graphite/metabolism
ABSTRACT
PURPOSE: To evaluate whether the modified suture-button Latarjet procedure with coracoacromial ligament (CAL) and pectoralis minor (PM) preservation could achieve excellent outcomes at the 2-year follow-up. METHODS: From January 2019 to January 2021, data were collected on patients who underwent the modified suture-button Latarjet procedure with CAL and PM preservation in our department. The glenoid bone loss of these patients was greater than 20%, or greater than 10% with high demands for exercise. Partial coracoid osteotomy was based on a preoperative 3-dimensional computed tomography evaluation of the glenoid defect area and the corresponding coracoid process morphology. Preoperative and postoperative clinical results were assessed, and the minimal clinically important difference (MCID) was used to compare improvement in clinical outcomes. Graft-glenoid union and remodeling were assessed using postoperative 3-dimensional computed tomography, and magnetic resonance imaging was performed to confirm the integrity of the CAL and PM postoperatively. RESULTS: In total, 35 patients were included in this study; the mean follow-up time was 26.9 ± 1.9 months. There was no case of recurrent dislocation or subluxation. Significant improvements were observed in the mean visual analog scale (VAS) score for pain during motion and in the American Shoulder and Elbow Surgeons (ASES), Rowe, and Walch-Duplay scores (P < .001). The percentage of patients achieving at least an MCID improvement in clinical outcomes was 85.71% for the VAS, 97.14% for the ASES, 100% for the Rowe, and 97.14% for the Walch-Duplay score. Thirty-three patients (94.3% of all cases) were able to return to their preoperative sport levels, 34 grafts (97.1%) achieved bone union (1 soft union) in 6.3 ± 2.2 months, and the coracoid grafts restored 97.1 ± 4.0% of the perfect-fitting circle at the last follow-up. Postoperative computed tomography showed that 31 grafts (88.6%) were placed ideally in the vertical view. In the axial view, 25 grafts (82.9%) were flush with the glenoid, whereas 1 and 5 grafts were fixed medially and laterally, respectively. The CAL and PM were visualized postoperatively. No arthropathy was observed in any patient at the last follow-up. CONCLUSIONS: The modified suture-button Latarjet procedure with CAL and PM preservation obtained good clinical and radiological results without recurrence or complications. A substantial number of patients (>85%) achieved the MCID for the VAS, ASES, Rowe, and Walch-Duplay scores. In addition, the malpositioned grafts (17.1%) did not cause arthropathy at the 2-year follow-up. LEVEL OF EVIDENCE: Level IV, retrospective case series.
ABSTRACT
Heparin-induced thrombocytopenia (HIT) is a severe, potentially life-threatening adverse drug reaction. It is an antibody-mediated process involving platelet activation. Heparin and low-molecular-weight heparin (LMWH) are routinely used in uremic patients undergoing hemodialysis. Here, we report a case of HIT that occurred in a hemodialysis patient after she switched from heparin to the LMWH nadroparin for anticoagulation during hemodialysis. The clinical features, incidence, mechanism, and treatment of HIT are discussed.
Subject(s)
Heparin, Thrombocytopenia, Female, Humans, Heparin/adverse effects, Heparin, Low-Molecular-Weight/adverse effects, Anticoagulants/adverse effects, Thrombocytopenia/chemically induced, Thrombocytopenia/diagnosis, Thrombocytopenia/drug therapy, Renal Dialysis/adverse effects
ABSTRACT
OBJECTIVE: This study examined the effects of focus training on heart rate variability (HRV) in patients with post-stroke fatigue (PoSF). METHODS: A self-generated physiological coherence system (SPCS) was used for the focus training of PoSF patients for 12 weeks. The Fatigue Severity Scale (FSS), Hamilton Depression Scale (HAMD), HRV, and a satisfaction scale (SASC-19) were assessed before and after the training. RESULTS: Compared with the control group, the FSS score, HAMD score, RMSSD, and pNN50 were significantly lower in the research group at the end of the intervention (P < 0.05), while SDNN, SDANN, LF, HF, LF/HF, and the intervention satisfaction rate were significantly higher (P < 0.05). CONCLUSION: Using SPCS software for focus training in PoSF patients reduced fatigue and depression and improved HRV, and the patients were highly satisfied with the intervention.
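The time-domain HRV indices reported above (SDNN, RMSSD, pNN50) have simple definitions over the series of RR intervals. A minimal sketch with a synthetic RR series follows; frequency-domain indices such as LF and HF would require spectral estimation and are omitted.

```python
# Standard time-domain HRV metrics from RR intervals (ms). Synthetic data.
import numpy as np

rr = np.array([812, 798, 830, 845, 790, 805, 860, 842], dtype=float)  # RR, ms

sdnn = rr.std(ddof=1)                     # SD of all RR intervals
diff = np.diff(rr)                        # successive differences
rmssd = np.sqrt(np.mean(diff ** 2))       # root mean square of successive diffs
pnn50 = np.mean(np.abs(diff) > 50) * 100  # % of successive diffs > 50 ms

print(f"SDNN {sdnn:.1f} ms, RMSSD {rmssd:.1f} ms, pNN50 {pnn50:.0f}%")
```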
Subject(s)
Autonomic Nervous System, Stroke, Fatigue/etiology, Fatigue/therapy, Heart Rate/physiology, Humans, Stroke/complications
ABSTRACT
BACKGROUND: Mixed Reality technology may provide many advantages over traditional teaching methods. Despite its potential, the technology has yet to be used for the formal assessment of clinical competency. This study sought to collect validity evidence and assess the feasibility of using the HoloLens 2 mixed reality headset for the conduct and augmentation of Objective Structured Clinical Examinations (OSCEs). METHODS: A prospective cohort study was conducted to compare the assessment of undergraduate medical students undertaking OSCEs via HoloLens 2 live (HLL) and recorded (HLR), and gold-standard in-person (IP) methods. An augmented mixed reality scenario was also assessed. RESULTS: Thirteen undergraduate participants completed a total of 65 OSCE stations. Overall inter-modality correlation was 0.81 (p = 0.01), 0.98 (p = 0.01) and 0.82 (p = 0.01) for IP vs. HLL, HLL vs. HLR and IP vs. HLR, respectively. Skill-based correlations for IP vs. HLR were assessed for history taking (0.82, p = 0.01), clinical examination (0.81, p = 0.01), procedural (0.88, p = 0.01) and clinical skills (0.92, p = 0.01), and assessment of a virtual mixed reality patient (0.74, p = 0.01). The HoloLens device was deemed to be usable and practical (System Usability Scale (SUS) score = 51.5), and the technology was thought to deliver greater flexibility and convenience, and have the potential to expand and enhance assessment opportunities. CONCLUSIONS: HoloLens 2 is comparable to traditional in-person examination of undergraduate medical students for both live and recorded assessments, and therefore is a valid and robust method for objectively assessing performance. The technology is in its infancy, and users need to develop confidence in its usability and reliability as an assessment tool. However, the potential to integrate additional functionality including holographic content, automated tracking and data analysis, and to facilitate remote assessment may allow the technology to enhance, expand and standardise examinations across a range of educational contexts.
Subject(s)
Augmented Reality, Students, Medical, Clinical Competence, Educational Measurement/methods, Humans, Prospective Studies, Reproducibility of Results, Technology
ABSTRACT
AIM: This study aimed to analyze the effects of the rhythm of music therapy on gait in patients with ischemic stroke and to explore the value of music therapy in walking training after stroke. METHODS: This was a prospective clinical study. Sixty patients with ischemic stroke admitted to our hospital from October 2017 to December 2018 were enrolled and divided into two groups of thirty each, using a random number table: a control group and a study group. Patients in the control group received conventional drug therapy, rehabilitation training and walking training, while patients in the study group additionally received music therapy for four weeks, with Sunday regarded as a rest day on which music therapy was suspended. The main outcome measures were indexes evaluating the walking ability of patients in the two groups. At each time point, the Fugl-Meyer Assessment (FMA), Berg Balance Scale (BBS) and a stroke rehabilitation treatment satisfaction questionnaire were used. RESULTS: Stride length, cadence and maximum velocity were significantly higher in the study group than in the control group at the second week and at the end of therapy, and the difference in step length between the affected and healthy sides was significantly lower in the study group than in the control group (P < 0.05). At the second week and the end of therapy, the FMA and BBS scores were significantly higher in the study group than in the control group (P < 0.05). The total satisfaction rate was also significantly higher in the study group (P < 0.05). CONCLUSION: Under the stimulation of musical rhythm, applying music therapy to patients with ischemic stroke can improve their gait, walking ability, lower limb motor function, balance ability and treatment satisfaction.
Subject(s)
Gait, Ischemic Stroke/therapy, Music Therapy, Music, Periodicity, Stroke Rehabilitation, Aged, Case-Control Studies, Female, Humans, Ischemic Stroke/diagnosis, Ischemic Stroke/physiopathology, Male, Middle Aged, Motor Activity, Patient Satisfaction, Postural Balance, Prospective Studies, Random Allocation, Recovery of Function, Time Factors, Treatment Outcome
ABSTRACT
OBJECTIVES: Inflammation is associated with the occurrence and prognosis of ischemic stroke. The aim of this study was to evaluate the association between inflammatory biomarkers and short-term clinical outcomes in acute ischemic stroke (AIS) patients after intravenous thrombolysis (IVT). MATERIALS AND METHODS: A total of 208 AIS patients treated with IVT were enrolled in this retrospective study. Blood tests of inflammatory biomarkers, including the leukocyte count, neutrophil count, lymphocyte count, neutrophil-to-lymphocyte ratio and high-sensitivity C-reactive protein level, were conducted within 24 h after IVT. The primary outcome was decent functional recovery (DFR) [modified Rankin Scale (mRS) score of 0-2] at 3 months. The secondary outcomes were symptomatic intracranial hemorrhage and 3-month mortality. A multivariate analysis was performed to evaluate the associations between inflammatory biomarkers and 3-month clinical outcomes. RESULTS: At the 3-month follow-up, 113 (62.2%) patients had achieved DFR. Compared with patients with DFR, patients without DFR had higher leukocyte counts (8.5 ± 2.4 × 10⁹/L versus 6.9 ± 1.7 × 10⁹/L, P < 0.001), neutrophil counts (6.1 ± 2.3 × 10⁹/L versus 4.6 ± 1.7 × 10⁹/L, P < 0.001) and neutrophil-to-lymphocyte ratios (4.6 ± 2.4 versus 3.3 ± 1.9, P < 0.001). After adjusting for stroke subtype, stroke severity, and medical history, the leukocyte count and neutrophil count remained significantly correlated with non-DFR (adjusted odds ratio [OR] 1.488; 95% confidence interval [CI], 1.247-1.776; P < 0.001 and adjusted OR 1.522; 95% CI, 1.269-1.826; P < 0.001, respectively). CONCLUSIONS: This study demonstrates that increased levels of inflammatory biomarkers are independently associated with poor outcomes at 3 months in AIS patients treated with IVT.
Subject(s)
C-Reactive Protein/analysis, Fibrinolytic Agents/administration & dosage, Inflammation Mediators/blood, Ischemic Stroke/drug therapy, Thrombolytic Therapy, Aged, Aged, 80 and over, Biomarkers/blood, Disability Evaluation, Female, Fibrinolytic Agents/adverse effects, Functional Status, Humans, Infusions, Intravenous, Ischemic Stroke/blood, Ischemic Stroke/diagnosis, Leukocyte Count, Male, Middle Aged, Recovery of Function, Retrospective Studies, Risk Factors, Thrombolytic Therapy/adverse effects, Time Factors, Treatment Outcome
ABSTRACT
RATIONALE & OBJECTIVE: Compared with recipients of blood group ABO-compatible (ABOc) living donor kidney transplants (LDKTs), recipients of ABO-incompatible (ABOi) LDKTs have a higher risk of graft loss, particularly in the first few weeks after transplantation. However, the decision to proceed with ABOi LDKT should be based on a comparison with the alternative: waiting for a future ABOc LDKT (eg, through kidney paired exchange) or for a deceased donor kidney transplant (DDKT). We sought to evaluate the difference in patient survival between receiving an ABOi LDKT and waiting for an ABOc LDKT or an ABOc DDKT. STUDY DESIGN: Retrospective cohort study of adults in the Scientific Registry of Transplant Recipients. SETTING & PARTICIPANTS: 808 ABOi LDKT recipients and 2,423 matched controls from among 245,158 adult first-time kidney-only waitlist registrants who did not receive an ABOi LDKT and who remained on the waitlist or received either an ABOc LDKT or an ABOc DDKT, 2002 to 2017. EXPOSURE: Receipt of ABOi LDKT. OUTCOME: Death. ANALYTICAL APPROACH: We compared mortality among ABOi LDKT recipients versus a weighted matched comparison population using Cox proportional hazards regression and Cox models that accommodated changing hazard ratios over time. RESULTS: Compared with matched controls, ABOi LDKT was associated with greater mortality risk in the first 30 days posttransplantation (cumulative survival of 99.0% vs 99.6%) but lower mortality risk beyond 180 days posttransplantation. Patients who received an ABOi LDKT had higher cumulative survival at 5 and 10 years (90.0% and 75.4%, respectively) than similar patients who remained on the waitlist or received an ABOc LDKT or ABOc DDKT (81.9% and 68.4%, respectively). LIMITATIONS: No measurement of ABO antibody titers in recipients; eligibility of participants for kidney paired donation is unknown. CONCLUSIONS: Transplant candidates who receive an ABOi LDKT and survive more than 180 days posttransplantation experience a long-term survival benefit compared with remaining on the waitlist to potentially receive an ABOc kidney transplant.
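As a rough outline of the analytical approach, the sketch below fits a Cox proportional hazards model with the lifelines package on a synthetic cohort; the study's weighted matching and time-varying hazard ratio models are not reproduced here.

```python
# Cox PH regression on a toy survival frame; exp(coef) is the hazard ratio.
# All rows and column names are synthetic, for illustration only.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time": [30, 180, 365, 1200, 2000, 90, 400, 3000],  # days of follow-up
    "death": [1, 0, 1, 0, 0, 1, 0, 1],                  # event indicator
    "aboi_ldkt": [1, 1, 0, 1, 0, 0, 1, 0],              # exposure: ABOi LDKT
    "age": [54, 47, 61, 39, 52, 66, 44, 58],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="death")
print(cph.summary[["coef", "exp(coef)", "p"]])
```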
Subject(s)
ABO Blood-Group System/immunology, Graft Rejection/mortality, Kidney Transplantation/mortality, Living Donors, Registries, Transplant Recipients, Adult, Female, Follow-Up Studies, Graft Rejection/immunology, Graft Survival, Humans, Male, Middle Aged, Retrospective Studies, Survival Rate/trends, United States/epidemiology
ABSTRACT
Optimization of maintenance immunosuppression (mIS) regimens in the transplant recipient requires a balance between sufficient potency to prevent rejection and avoidance of excessive immunosuppression to prevent toxicities and complications. The optimal regimen after simultaneous liver-kidney (SLK) transplantation remains unclear, but small single-center reports have shown success with steroid-sparing regimens. We studied 4184 adult SLK recipients using the Scientific Registry of Transplant Recipients, from March 1, 2002, to February 28, 2017, on tacrolimus-based regimens at 1 year post-transplant. We determined the association between mIS regimen and mortality and graft failure using Cox proportional hazard models. The use of steroid-sparing regimens increased post-transplant, from 16.1% at discharge to 88.0% at 5 years. Using multi-level logistic regression modeling, we found center-level variation to be the major contributor to choice of mIS regimen (ICC 44.5%; 95% CI: 36.2%-53.0%). In multivariate analysis, use of a steroid-sparing regimen at 1 year was associated with a 21% decreased risk of mortality compared to steroid-containing regimens (aHR 0.79, P = .01) and 20% decreased risk of liver graft failure (aHR 0.80, P = .01), without differences in kidney graft loss risk (aHR 0.92, P = .6). Among SLK recipients, the use of a steroid-sparing regimen appears to be safe and effective without adverse effects on patient or graft survival.
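The center-level ICC quoted above is conventionally computed on the latent scale of a random-intercept logistic model, ICC = σ²_center / (σ²_center + π²/3). A worked sketch follows, with the between-center variance back-solved to reproduce the reported point estimate (an assumption for illustration only).

```python
# Latent-scale ICC for a random-intercept logistic model: the level-1 residual
# variance is fixed at pi^2/3. The variance below is hypothetical, chosen to
# match the reported 44.5%.
import math

s2 = 2.64                      # hypothetical between-center intercept variance
icc = s2 / (s2 + math.pi ** 2 / 3)
print(f"ICC = {icc:.1%}")      # ≈ 44.5%
```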
Subject(s)
Kidney Transplantation, Adult, Graft Rejection/drug therapy, Graft Rejection/etiology, Graft Rejection/prevention & control, Graft Survival, Humans, Immunosuppression Therapy, Immunosuppressive Agents/therapeutic use, Kidney, Liver, Steroids/therapeutic use
ABSTRACT
BACKGROUND: We sought to identify factors that are associated with length of stay (LOS) following pediatric (<18 years) liver transplantation in order to provide personalized counseling and discharge planning for recipients and their families. METHODS: We identified 2726 infants (≤24 months) and 3210 children (>24 months) who underwent pediatric liver-only transplantation from 2002-2017 using the Scientific Registry of Transplant Recipients. We used multilevel multivariable negative binomial regression to analyze associations between LOS and recipient and donor characteristics and calculated the median LOS ratio (MLOSR) to quantify heterogeneity in LOS across centers. RESULTS: In infants, the median LOS (IQR) was 19 (13-32) days. Hospitalization prior to transplant (ICU ratio 1.59, 95% CI 1.46-1.70; non-ICU ratio 1.16, 95% CI 1.08-1.23), public insurance (ratio 1.09, 95% CI 1.03-1.15), and a segmental graft (ratio 1.15, 95% CI 1.08-1.22) were associated with a longer LOS; thus, we would expect a 1.59-fold longer LOS in an infant admitted to the ICU compared to a non-hospitalized infant with similar characteristics. In children, the median LOS (IQR) was 13 (9-21) days. Hospitalization prior to transplant (ICU ratio 1.62, 95% CI 1.49-1.77; non-ICU ratio 1.44, 95% CI 1.34-1.56), public insurance (ratio 1.07, 95% CI 1.02-1.13), a segmental graft (ratio 1.27, 95% CI 1.20-1.35), a living donor graft (ratio 1.38, 95% CI 1.27-1.51), and obesity (ratio 1.10, 95% CI 1.03-1.17) were associated with a longer LOS. The MLOSR was 1.25 in infants and 1.26 in children, meaning if an infant received a transplant at another center with a longer LOS, we would expect a 1.25-fold difference in LOS driven by center practices alone. CONCLUSIONS: While center-level practices account for substantial variation in LOS, consideration of donor and recipient factors can help clinicians provide more personalized counseling for families of pediatric liver transplant candidates.
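The MLOSR is defined analogously to the median odds ratio: MLOSR = exp(√(2σ²) · Φ⁻¹(0.75)), where σ² is the between-center variance of the random intercept. A worked sketch, back-solving σ² from the reported 1.25 (illustrative only):

```python
# Median ratio from a multilevel model's random-intercept variance. The
# variance here is hypothetical, back-solved to reproduce the reported 1.25.
import math
from statistics import NormalDist

z75 = NormalDist().inv_cdf(0.75)         # 75th percentile of N(0,1), ≈ 0.6745
s2 = (math.log(1.25) / z75) ** 2 / 2     # back-solved between-center variance
mlosr = math.exp(math.sqrt(2 * s2) * z75)
print(f"MLOSR = {mlosr:.2f}")            # 1.25
```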
Subject(s)
Length of Stay/statistics & numerical data, Liver Transplantation, Child, Child, Preschool, Female, Humans, Infant, Male
ABSTRACT
OBJECTIVES: To investigate the correlation of brain-derived neurotrophic factor (BDNF) with risk factors and functional outcome in poststroke depression (PSD) and poststroke anxiety (PSA). DESIGN: Cohort study. SETTING: Stroke patients admitted to an urban rehabilitation hospital. PARTICIPANTS: Stroke patients (N=162) without any previous history of depression or anxiety. INTERVENTIONS: Not applicable. MAIN OUTCOME MEASURES: Sociodemographic information and comorbidities were recorded at hospital admission. Functional outcomes were assessed using FIM scores at admission and discharge. The influence of various factors, such as BDNF and patient characteristics, on functional outcome was investigated; single-factor effects were examined using simple logistic regression, and multi-factor effects using multiple logistic regression. The goodness of fit of these regression models was evaluated by the integrated area under the ROC curve. RESULTS: PSD was diagnosed in 61 (37.7%) patients, and PSA in 40 (24.7%). Multiple logistic analysis showed that BDNF, divorce or separation, and history of smoking were significantly associated with the occurrence of PSD but not with the occurrence of PSA. A model combining low BDNF level with divorce or separation improved the prediction of PSD. Among the variables analyzed for prediction of functional outcome, serum BDNF had a minimal correlation with motor FIM scores in PSD and no significant correlation with motor FIM scores in PSA. CONCLUSIONS: BDNF is a valuable predictor of the occurrence of PSD but not of PSA. More strikingly, ischemic stroke patients who are divorced or separated and have low serum BDNF have a much higher risk of PSD. BDNF has a minimal correlation with motor function outcome in PSD but no significant correlation with motor outcome in PSA.
Subject(s)
Anxiety/blood, Brain-Derived Neurotrophic Factor/blood, Depression/blood, Stroke/blood, Stroke/psychology, Aged, Aged, 80 and over, Anxiety/etiology, Anxiety/physiopathology, Cohort Studies, Depression/etiology, Depression/physiopathology, Divorce, Female, Humans, Male, Middle Aged, Physical Functional Performance, Risk Factors, Stroke/physiopathology, Stroke Rehabilitation, Treatment Outcome
ABSTRACT
Microcystin-LR (MC-LR) is a widely known hepatotoxin that can induce the occurrence and metastasis of hepatocellular carcinoma. In recent years, with frequent outbreaks of cyanobacteria, the harm of MC-LR has gradually attracted more attention. Hence, this study focused on the effect of MC-LR on DNA damage in HepG2 cells and identified the types and sources of the free radicals that play an important role in this process. Our data suggested that MC-LR induced a concentration- and time-dependent increase in DNA double-strand breaks (DSBs). After exposure to 1 µM MC-LR for 3 days, the protein expression and immunofluorescence staining of γ-H2AX were significantly increased. Using a scavenger of mitochondrial O₂•⁻ (4-hydroxy-TEMPO), an inhibitor of mitochondrial NOS (7-nitroindazole), and a scavenger of ONOO⁻ (uric acid), it was revealed that mitochondria-derived ONOO⁻ made a significant contribution to the genotoxicity of MC-LR. Moreover, a significant decrease in mitochondrial membrane potential (MMP) was observed. These findings suggest that peroxynitrite targeting mitochondria plays a vital role in the MC-LR-induced genotoxic response in mammalian cells.
Subject(s)
DNA Breaks, Double-Stranded, Microcystins/toxicity, Mitochondria, Liver/drug effects, Peroxynitrous Acid/metabolism, Animals, Carcinoma, Hepatocellular/genetics, Cyanobacteria/growth & development, Hep G2 Cells, Histones/metabolism, Humans, Liver Neoplasms/genetics, Marine Toxins, Membrane Potential, Mitochondrial/drug effects, Mitochondria, Liver/metabolism
ABSTRACT
A recent study reported that kidney transplant recipients of offspring living donors had higher graft loss and mortality. This seemed counterintuitive, given the excellent HLA matching and younger age of offspring donors; we were concerned about residual confounding and other study design issues. We used Scientific Registry of Transplant Recipients data from 2001-2016 to evaluate death-censored graft failure (DCGF) and mortality for recipients of offspring versus nonoffspring living donor kidneys, using Cox regression models with interaction terms. Recipients of offspring kidneys had lower DCGF than recipients of nonoffspring kidneys (15-year cumulative incidence 21.2% vs 26.1%, P < .001). This association remained after adjustment for recipient and transplant factors (adjusted hazard ratio [aHR] 0.77, 95% CI 0.73-0.82, P < .001) and was attenuated among African American donors (aHR 0.85, 95% CI 0.77-0.95; interaction: P = .01) and female recipients (aHR 0.84, 95% CI 0.77-0.91, P < .001). Although offspring kidney recipients had higher mortality (15-year mortality 56.4% vs 37.2%, P < .001), this difference largely disappeared with adjustment for recipient age alone (aHR 1.06, 95% CI 1.02-1.10, P = .002) and was nonsignificant after further adjustment for other recipient characteristics (aHR 0.97, 95% CI 0.93-1.01, P = .1). Kidneys from offspring donors were thus associated with lower graft failure and comparable mortality. An otherwise eligible donor should not be dismissed because they are the offspring of the recipient, and we encourage continued individualized counseling for potential donors.
Subject(s)
Kidney Failure, Chronic/mortality, Kidney Failure, Chronic/surgery, Kidney Transplantation/methods, Kidney/surgery, Living Donors, Transplant Recipients, Adult, Black or African American, Aged, Female, Graft Rejection/etiology, Graft Survival, HLA Antigens, Humans, Incidence, Kidney Failure, Chronic/ethnology, Male, Middle Aged, Postoperative Complications, Proportional Hazards Models, Registries, Treatment Outcome, United States
ABSTRACT
Delayed graft function (DGF) complicates 20%-40% of deceased-donor kidney transplants and is associated with increased length of stay and subsequent allograft failure. Accurate prediction of DGF risk for a particular allograft could influence organ allocation, patient counseling, and postoperative planning. Mitochondrial dysfunction, a reported surrogate of tissue health in ischemia-reperfusion injury, might also be a surrogate for tissue health after organ transplantation. To understand the potential of mitochondrial membrane potential (MMP) in clinical decision-making, we analyzed whether lower MMP, a measure of mitochondrial dysfunction, was associated with DGF. In a prospective, single-center proof-of-concept study, we measured pretransplant MMP in 28 deceased donor kidneys and analyzed the association between MMP and DGF. We used hybrid registry-augmented regression to adjust for donor and recipient characteristics, minimizing overfitting by leveraging Scientific Registry of Transplant Recipients data. MMP levels ranged from 964 to 28,333 units. Low-MMP kidneys (MMP < 4000) were more likely to be from female donors (75% vs 10%, P = .002) and donation after cardiac death donors (75% vs 12%, P = .004). For every 10% decrease in MMP levels, there were 38% higher odds of DGF (adjusted odds ratio 1.38, 95% CI 1.08-1.78, P = .01). In summary, MMP might be a promising pretransplant surrogate for tissue health in kidney transplantation and, after further validation, could improve clinical decision-making through its independent association with DGF.
Subject(s)
Delayed Graft Function/etiology, Graft Rejection/etiology, Graft Survival, Kidney Failure, Chronic/surgery, Kidney Transplantation/adverse effects, Membrane Potential, Mitochondrial, Postoperative Complications, Adult, Delayed Graft Function/pathology, Female, Follow-Up Studies, Glomerular Filtration Rate, Graft Rejection/pathology, Humans, Kidney Function Tests, Male, Middle Aged, Perfusion, Pilot Projects, Prognosis, Prospective Studies, Risk Factors, Tissue Donors, Tissue and Organ Procurement, Transplant Recipients, Young Adult
ABSTRACT
Deceased donor kidney transplantation (DDKT) rates for highly sensitized (HS) candidates increased early after implementation of the Kidney Allocation System (KAS) in 2014. However, this may represent a bolus effect, and a granular investigation of the current state of DDKT for HS candidates remains lacking. We studied 270,722 DDKT candidates from the SRTR from 12/4/2011 to 12/3/2014 ("pre-KAS") and 12/4/2014 to 12/3/2017 ("post-KAS"), analyzing DDKT rates for HS candidates using adjusted negative binomial regression. Post-KAS, candidates with the highest levels of sensitization had an increased DDKT rate compared with pre-KAS (cPRA 98%: adjusted incidence rate ratio [aIRR] 1.77, 95% CI 1.27-2.46, P = .001; cPRA 99%: aIRR 4.36, 95% CI 3.18-5.98, P < .001; cPRA 99.5-99.9%: aIRR 24.29, 95% CI 16.91-34.89, P < .001; and cPRA 99.9%+: aIRR 11.58, 95% CI 8.79-15.26, P < .001). To determine whether these changes produced more equitable access to DDKT, we compared DDKT rates of HS to non-HS candidates (cPRA 0-79%). Post-KAS, cPRA 98% candidates had an equivalent DDKT rate (aIRR 0.94, 95% CI 0.65-1.36, P = .8) to non-HS candidates, whereas cPRA 99% candidates had a higher DDKT rate (aIRR 1.68, 95% CI 1.19-2.38, P = .02). Although cPRA 99.5-99.9% candidates had an increased DDKT rate (aIRR 3.50, 95% CI 2.46-4.98, P < .001) compared with non-HS candidates, cPRA 99.9%+ candidates had a significantly lower DDKT rate (aIRR 0.40, 95% CI 0.29-0.56, P < .001). KAS has improved access to DDKT for HS candidates, although substantial imbalance exists between cPRA 99.5-99.9% and 99.9%+ candidates.