Results 1 - 20 of 26
1.
Brain ; 2024 Jun 18.
Article in English | MEDLINE | ID: mdl-38889248

ABSTRACT

The default mode network (DMN) is a widely distributed, intrinsic brain network thought to play a crucial role in internally directed cognition. The present study employs stereo-electroencephalography in 13 human patients, obtaining high-resolution neural recordings across multiple canonical DMN regions during two processes that have been associated with creative thinking: spontaneous and divergent thought. We probe these two DMN-associated higher cognitive functions through mind wandering and alternate uses tasks, respectively. Our results reveal DMN recruitment during both tasks, as well as a task-specific dissociation in spatiotemporal response dynamics. When compared to the fronto-parietal network, DMN activity was characterized by a stronger increase in gamma band power (30-70 Hz) coupled with lower theta band power (4-8 Hz). The difference in activity between the two networks was especially strong during the mind wandering task. Within the DMN, we found that the tasks showed different dynamics, with the alternate uses task engaging the DMN more during the initial stage of the task, and mind wandering in the later stage. Gamma power changes were mainly driven by lateral DMN sites, while theta power displayed task-specific effects. During the alternate uses task, theta changes did not show spatial differences within the DMN, while mind wandering was associated with early lateral and late dorsomedial DMN engagement. Furthermore, causal manipulations of DMN regions using direct cortical stimulation preferentially decreased the originality of responses in the alternate uses task, without affecting fluency or mind wandering. Our results suggest that DMN activity is flexibly modulated as a function of specific cognitive processes and support a causal role for the DMN in divergent thinking. These findings shed light on the neural constructs supporting different forms of cognition and provide causal evidence for the role of the DMN in the generation of original connections among concepts.
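
For readers unfamiliar with the spectral measures referenced above, the sketch below shows one common way to estimate gamma (30-70 Hz) and theta (4-8 Hz) band power from intracranial recordings using Welch's method; the sampling rate and trial array are hypothetical placeholders, and this is not the authors' analysis pipeline.

```python
import numpy as np
from scipy.signal import welch

fs = 1000                               # assumed sampling rate (Hz)
seeg = np.random.randn(40, 2 * fs)      # placeholder: 40 trials x 2 s from one DMN contact

def band_power(trials, fs, lo, hi):
    """Mean Welch power within [lo, hi] Hz, one value per trial."""
    freqs, psd = welch(trials, fs=fs, nperseg=fs, axis=-1)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[:, band].mean(axis=-1)

gamma = band_power(seeg, fs, 30, 70)
theta = band_power(seeg, fs, 4, 8)
print(f"mean gamma/theta power ratio: {gamma.mean() / theta.mean():.2f}")
```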

2.
Article in English | MEDLINE | ID: mdl-36127157

ABSTRACT

Deep brain stimulation (DBS) is an established and growing intervention for treatment-resistant obsessive-compulsive disorder (TROCD). We assessed current evidence on the efficacy of DBS in alleviating OCD and comorbid depressive symptoms including newly available evidence from recent trials and a deeper risk of bias analysis than previously available. PubMed and EMBASE databases were systematically queried using Preferred Reporting Items for Systematic reviews and Meta-Analyses guidelines. We included studies reporting primary data on multiple patients who received DBS therapy with outcomes reported through the Yale-Brown Obsessive-Compulsive Scale (Y-BOCS). Primary effect measures included Y-BOCS mean difference and per cent reduction as well as responder rate (≥35% Y-BOCS reduction) at last follow-up. Secondary effect measures included standardised depression scale reduction. Risk of bias assessments were performed on randomised controlled (RCTs) and non-randomised trials. Thirty-four studies from 2005 to 2021, 9 RCTs (n=97) and 25 non-RCTs (n=255), were included in systematic review and meta-analysis based on available outcome data. A random-effects model indicated a meta-analytical average 14.3 point or 47% reduction (p<0.01) in Y-BOCS scores without significant difference between RCTs and non-RCTs. At last follow-up, 66% of patients were full responders to DBS therapy. Sensitivity analyses indicated a low likelihood of small study effect bias in reported outcomes. Secondary analysis revealed a 1 standardised effect size (Hedges' g) reduction in depressive scale symptoms. Both RCTs and non-RCTs were determined to have a predominantly low risk of bias. A strong evidence base supports DBS for TROCD in relieving both OCD and comorbid depression symptoms in appropriately selected patients.
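
As a rough illustration of the random-effects pooling used in meta-analyses like this one, here is a minimal DerSimonian-Laird sketch; the per-study Y-BOCS reductions and variances are invented placeholders, not data from the review.

```python
import numpy as np

y = np.array([12.0, 15.5, 13.8, 16.2])   # hypothetical per-study mean Y-BOCS reductions
v = np.array([4.0, 6.5, 3.2, 5.1])       # hypothetical within-study variances

w = 1.0 / v                              # fixed-effect (inverse-variance) weights
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)       # Cochran's Q heterogeneity statistic
tau2 = max(0.0, (Q - (len(y) - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))

w_re = 1.0 / (v + tau2)                  # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled reduction {pooled:.1f} (95% CI {pooled - 1.96 * se:.1f} to {pooled + 1.96 * se:.1f})")
```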

3.
Clin Transplant ; 36(3): e14544, 2022 03.
Article in English | MEDLINE | ID: mdl-34854503

ABSTRACT

The study of marginal liver transplant outcomes, including post-transplant length of stay (LOS), is necessary for determining the practicality of their use. 50,155 patients who received transplants from 2012 to 2020 were retrospectively analyzed with data from the Scientific Registry of Transplant Recipients database using Kaplan-Meier survival curves and multivariable Cox regression. Six different definitions were used to classify an allograft as being marginal: 90th percentile Donor Risk Index (DRI) allografts, donation after cardiac death (DCD) donors, national share donors, donors over 70, donors with > 30% macrovesicular steatosis, or 90th percentile Discard Risk Index donors. 24% (n = 12,124) of subjects received marginal allografts. Average LOS was 15.6 days among those who received standard allografts. Among those who received marginal allografts, LOS was found to be highest in those who received 90th percentile DRI allografts at 15.6 days, and lowest in those who received DCD allografts at 12.7 days. Apart from fatty livers (95% CI 0.86-0.98), marginal allografts were not associated with a prolonged LOS. We conclude that, accounting for experience and recipient matching, transplant centers may be more aggressive in their use of extended criteria donors with limited fear of increasing LOS and its associated costs.
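
A minimal sketch of the survival workflow named above (Kaplan-Meier curves plus multivariable Cox regression), here framing length of stay as time to discharge; the file name and column names are hypothetical stand-ins for SRTR variables, not the study's actual code.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("srtr_cohort.csv")      # hypothetical extract: one row per recipient

# Kaplan-Meier curve of post-transplant length of stay, treated as time to discharge
kmf = KaplanMeierFitter()
kmf.fit(durations=df["los_days"], event_observed=df["discharged"])
kmf.plot_survival_function()

# Multivariable Cox model: is a marginal (e.g., DCD) allograft associated with longer LOS
# after adjusting for recipient covariates?
cph = CoxPHFitter()
cph.fit(df[["los_days", "discharged", "marginal_dcd", "recipient_age", "meld"]],
        duration_col="los_days", event_col="discharged")
cph.print_summary()
```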


Subject(s)
Liver Transplantation , Allografts , Graft Survival , Humans , Length of Stay , Liver Transplantation/adverse effects , Retrospective Studies , Tissue Donors
4.
Clin Transplant ; 36(6): e14646, 2022 06.
Article in English | MEDLINE | ID: mdl-35304775

ABSTRACT

Despite improvements in survival across races in the past 20 years, African Americans have worse outcomes after orthotopic liver transplantation (OLT). This study aims to quantify the change in disparities between African Americans and other races in survival after OLT. We retrospectively analyzed the United Network for Organ Sharing (UNOS) database for patient data for candidates who received a liver transplant between January 1, 2007 and December 31, 2017. Multivariate Cox proportional hazards regression indicated similar decreases in mortality over time for each race, with a decrease in mortality for African Americans: 2010-2012 (HR = 0.930), 2012-2015 (HR = 0.882), and 2015-2017 (HR = 0.883) when compared to 2007-2010. Risk of mortality for African Americans compared to Caucasians varied across the 4 eras: 2007-2010 (HR = 1.083), 2010-2012 (HR = 1.090), 2012-2015 (HR = 1.070), and 2015-2017 (HR = 1.125). While African Americans have seen increases in survival in the past decade, a similar increase in survival for other races leaves a significant survival disparity in African Americans.


Subject(s)
Black or African American , Liver Transplantation , Databases, Factual , Humans , Proportional Hazards Models , Retrospective Studies , United States/epidemiology , White People
5.
Nephrology (Carlton) ; 27(5): 450-457, 2022 May.
Article in English | MEDLINE | ID: mdl-34984749

ABSTRACT

Despite advancements in diabetic care, diabetic kidney transplant recipients have significantly worse outcomes than non-diabetics. AIM: Our study aims to demonstrate the impact of diabetes, types I and II, on American young adults (18-40 years old) requiring kidney transplantation. METHODS: Using the United Network for Organ Sharing database, we conducted a population cohort study that included all first-time, kidney-only transplant recipients during 2002-2019, ages 18-40 years old. Patients were grouped according to indication for transplant. Primary outcomes were cumulative all-cause mortality and death-censored graft failure. Death-censored graft failure and patient survival at 1, 5, and 10 years were calculated via the Kaplan-Meier method. Multivariate Cox regression was used to assess for potential confounders. RESULTS: Of 42,466 transplant recipients, 3418 (8.1%) had end-stage kidney disease associated with diabetes. At each time-point, cumulative mortality was higher in diabetics compared to patients with non-diabetic causes of renal failure. Conversely, cumulative graft failure was similar between the groups. Adjusted hazard ratios for all-cause mortality and graft failure in diabetics were 2.99 (95% CI 2.67-3.35; p < 0.01) and 0.98 (95% CI 0.92-1.05, p < 0.01), respectively. CONCLUSION: Diabetes mellitus in young adult kidney transplant recipients is associated with a nearly three-fold increase in mortality, reflecting a relatively vulnerable patient population. Identifying the underlying causes of poor outcomes in this population should be a priority for future study.


Subject(s)
Diabetes Mellitus , Transplant Recipients , Adolescent , Adult , Cohort Studies , Diabetes Mellitus/diagnosis , Diabetes Mellitus/epidemiology , Graft Rejection/epidemiology , Graft Survival , Humans , Retrospective Studies , Risk Factors , Treatment Outcome , United States/epidemiology , Young Adult
6.
Neurosurg Focus ; 52(4): E8, 2022 04.
Article in English | MEDLINE | ID: mdl-35364585

ABSTRACT

OBJECTIVE: Vestibular schwannomas (VSs) are the most common neoplasm of the cerebellopontine angle in adults. Though these lesions are generally slow growing, their growth patterns and associated symptoms can be unpredictable, which may complicate the decision to pursue conservative management versus active intervention. Additionally, surgical decision-making can be controversial because of limited high-quality evidence and multiple quality-of-life considerations. Machine learning (ML) is a powerful tool that utilizes data sets to essentialize multidimensional clinical processes. In this study, the authors trained multiple tree-based ML algorithms to predict the decision for active treatment versus MRI surveillance of VS in a single institutional cohort. In doing so, they sought to assess which preoperative variables carried the most weight in driving the decision for intervention and could be used to guide future surgical decision-making through an evidence-based approach. METHODS: The authors reviewed the records of patients who had undergone evaluation by neurosurgery and otolaryngology with subsequent active treatment (resection or radiation) for unilateral VS in the period from 2009 to 2021, as well as those of patients who had been evaluated for VS and were managed conservatively throughout 2021. Clinical presentation, radiographic data, and management plans were abstracted from each patient record from the time of first evaluation until the last follow-up or surgery. Each encounter with the patient was treated as an instance involving a management decision that depended on demographics, symptoms, and tumor profile. Decision tree and random forest classifiers were trained and tested to predict the decision for treatment versus imaging surveillance on the basis of unseen data using an 80/20 pseudorandom split. Predictor variables were tuned to maximize performance based on lowest Gini impurity indices. Model performance was optimized using fivefold cross-validation. RESULTS: One hundred twenty-four patients with 198 rendered decisions concerning management were included in the study. In the decision tree analysis, only a maximum tumor dimension threshold of 1.6 cm and progressive symptoms were required to predict the decision for treatment with 85% accuracy. Optimizing maximum dimension thresholds and including age at presentation boosted accuracy to 88%. Random forest analysis (n = 500 trees) predicted the decision for treatment with 80% accuracy. Factors with the highest variable importance based on multiple measures of importance, including mean minimal conditional depth and largest Gini impurity reduction, were maximum tumor dimension, age at presentation, Koos grade, and progressive symptoms at presentation. CONCLUSIONS: Tree-based ML was used to predict which factors drive the decision for active treatment of VS with 80%-88% accuracy. The most important factors were maximum tumor dimension, age at presentation, Koos grade, and progressive symptoms. These results can assist in surgical decision-making and patient counseling. They also demonstrate the power of ML algorithms in extracting useful insights from limited data sets.
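
The tree-based approach described above can be sketched in a few lines of scikit-learn; feature and file names below are hypothetical, and the split, tree count, and cross-validation settings simply mirror the numbers quoted in the abstract.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("vs_decisions.csv")     # hypothetical: one row per management decision
X = df[["max_tumor_dim_cm", "age", "koos_grade", "progressive_symptoms"]]
y = df["active_treatment"]               # 1 = resection/radiation, 0 = MRI surveillance

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

tree = DecisionTreeClassifier(criterion="gini", max_depth=3).fit(X_train, y_train)
forest = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

print("decision tree accuracy:", tree.score(X_test, y_test))
print("random forest 5-fold CV accuracy:", cross_val_score(forest, X, y, cv=5).mean())
# Gini-based variable importance, analogous to the importance ranking reported above
print(dict(zip(X.columns, forest.feature_importances_)))
```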


Subject(s)
Neuroma, Acoustic , Adult , Algorithms , Decision Trees , Humans , Machine Learning , Magnetic Resonance Imaging , Neuroma, Acoustic/diagnostic imaging , Neuroma, Acoustic/surgery
7.
Neurosurg Focus ; 53(6): E16, 2022 12.
Article in English | MEDLINE | ID: mdl-36455273

ABSTRACT

Targeted therapies for driver gene fusions in cancers have yielded substantial improvements in care. Here, the authors outline a case series of 6 patients with FGFR3-TACC3 fusion in primary brain tumors ranging from polymorphous low-grade neuroepithelial tumor of the young to papillary glioneuronal tumors and glioblastoma (GBM). Previous studies indicated that the FGFR3-TACC3 fusion provides a survival benefit to GBM patients. Consistent with this, 2 patients with GBM had unexpectedly good outcomes and survived for 5 and 7 years, respectively. In contrast, 2 patients with initially lower-grade tumors survived only 3 years and 1 year, respectively. One patient received erdafitinib, a targeted FGFR inhibitor, for 3 months at late disease recurrence, and no response was seen. There were varied histomorphological features, including many cases that lacked the characteristic FGFR3-TACC3 pathology. The findings of this cohort suggest that molecular testing is justified, even for glioma cases lacking classic histopathological signatures. Currently, FGFR3-TACC3 fusion gliomas are often classified on the basis of histopathological features. However, further research is needed to examine whether IDH1/2-wild-type tumors with FGFR3-TACC3 fusion should be classified as a subtype on the basis of this molecular fusion. Because patients with IDH1/2-wild-type GBM with FGFR3-TACC3 fusion have improved survival, routine molecular testing for this mutation in patients enrolled in clinical trials and subsequent stratification may be warranted.


Subject(s)
Glioblastoma , Glioma , Humans , Glioma/genetics , Glioma/surgery , Mutation , Protein Kinase Inhibitors , Receptor, Fibroblast Growth Factor, Type 3/genetics , Microtubule-Associated Proteins
8.
Transpl Int ; 34(10): 1971-1983, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34218471

ABSTRACT

Dysnatremias are rare but significant events in liver transplantation. While recipient pre-transplant hypernatremia has been demonstrated to increase post-transplant mortality, the degree of hypernatremia and the impact of its resolution have been less well characterized. Here, we used multivariate Cox regression with a comprehensive list of donor and recipient factors to conduct a robust retrospective database study of 54,311 United Network for Organ Sharing (UNOS) liver transplant patients, analyzing the effect of pre-transplant serum sodium on post-transplant mortality, post-transplant length of hospitalization, and post-transplant graft survival. Mortality and graft failure increased in a stepwise fashion with increasing pre-transplant hypernatremia: 145-150 mEq/L (HR = 1.118 and HR = 1.113), 150-155 mEq/L (HR = 1.324 and HR = 1.306), and > 155 mEq/L (HR = 1.623 and HR = 1.661). Pre-transplant hypo- and hypernatremia also increased length of post-transplant hospitalization: < 125 mEq/L (HR = 1.098), 125-130 mEq/L (HR = 1.060), 145-150 mEq/L (HR = 1.140), and 150-155 mEq/L (HR = 1.358). Resolution of hypernatremia showed no significant difference in mortality compared with normonatremia, while unresolved hypernatremia significantly increased mortality (HR = 1.254), including a durable long-term increased mortality risk for patients with creatinine < 2 mg/dL and MELD < 25. Pre-transplant hypernatremia serves as a morbid prognostic indicator for post-transplant morbidity and mortality.
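
One way the stepwise sodium categories above could be encoded as Cox covariates is sketched here with lifelines; the cut points follow the abstract, but the file, column names, and reference category are illustrative assumptions.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("unos_liver.csv")                    # hypothetical UNOS extract

bins = [0, 125, 130, 145, 150, 155, float("inf")]
labels = ["<125", "125-130", "130-145", "145-150", "150-155", ">155"]
df["na_cat"] = pd.cut(df["pretx_sodium"], bins=bins, labels=labels)

# One-hot encode sodium categories, treating normonatremia (130-145 mEq/L) as reference
design = pd.get_dummies(df["na_cat"], prefix="na").astype(float)
design = design.drop(columns=["na_130-145"])
design[["years_post_tx", "died"]] = df[["years_post_tx", "died"]]

CoxPHFitter().fit(design, duration_col="years_post_tx", event_col="died").print_summary()
```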


Subject(s)
Hypernatremia , Hyponatremia , Liver Transplantation , Humans , Retrospective Studies , Risk Factors , Sodium
9.
Pediatr Transplant ; 25(4): e13999, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33704871

ABSTRACT

Pediatric kidney transplant recipients generally have good outcomes post-transplantation. However, the younger age and longer life span after transplantation in the pediatric population make understanding the multifactorial nature of long-term graft survival critical. This investigation analyzes factors associated with 10-year survival to identify areas for improvement in patient care. Kaplan-Meier with log-rank test and univariable and multivariable logistic regression methods were used to retrospectively analyze 7785 kidney transplant recipients under the age of 18 years from January 1, 1998, until March 9, 2008, using United Network for Organ Sharing (UNOS) data. Our end-point was death-censored 10-year graft survival after excluding recipients whose grafts failed within one year of transplant. Recipients aged 5-18 years had lower 10-year graft survival, which worsened as age increased: 5-9 years (OR: 0.66; CI: 0.52-0.83), 10-14 years (OR: 0.43; CI: 0.33-0.55), and 15-18 years (OR: 0.34; CI: 0.26-0.44). Recipient African American ethnicity (OR: 0.67; CI: 0.58-0.78) and Hispanic donor ethnicity (OR: 0.82; CI: 0.72-0.94) had worse outcomes than other donor and recipient ethnicities, as did patients on dialysis at the time of transplant (OR: 0.82; CI: 0.73-0.91). Recipient private insurance status (OR: 1.35; CI: 1.22-1.50) was protective for 10-year graft survival. By establishing the role of age, race, and insurance status on long-term graft survival, we hope to guide clinicians in identifying patients at high risk for graft failure. This study highlights the need for increased allocation of resources and medical care to reduce the disparity in outcomes for certain patient populations.
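
A hedged sketch of a multivariable logistic regression for death-censored 10-year graft survival, producing odds ratios and confidence intervals of the kind quoted above; the variable names are illustrative, not UNOS field names.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("peds_kidney.csv")      # hypothetical cohort, grafts surviving > 1 year

model = smf.logit(
    "graft_surviving_10yr ~ C(age_group, Treatment('0-4')) + C(recipient_race) "
    "+ dialysis_at_tx + private_insurance",
    data=df,
).fit()

# Exponentiate coefficients to get odds ratios with 95% confidence intervals
odds = np.exp(model.params).rename("OR")
ci = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds, ci], axis=1))
```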


Subject(s)
Graft Survival , Kidney Failure, Chronic/surgery , Kidney Transplantation , Adolescent , Age Factors , Child , Child, Preschool , Female , Follow-Up Studies , Health Status Disparities , Humans , Infant , Infant, Newborn , Kaplan-Meier Estimate , Logistic Models , Male , Protective Factors , Retrospective Studies , Risk Factors , Treatment Outcome
10.
Nat Commun ; 15(1): 5528, 2024 Jul 15.
Article in English | MEDLINE | ID: mdl-39009561

ABSTRACT

The rewards that we get from our choices and actions can have a major influence on our future behavior. Understanding how reward biasing of behavior is implemented in the brain is important for many reasons, including the fact that diminution in reward biasing is a hallmark of clinical depression. We hypothesized that reward biasing is mediated by the anterior cingulate cortex (ACC), a cortical hub region associated with the integration of reward and executive control and with the etiology of depression. To test this hypothesis, we recorded neural activity during a biased judgment task in patients undergoing intracranial monitoring for either epilepsy or major depressive disorder. We found that beta (12-30 Hz) oscillations in the ACC predicted both associated reward and the size of the choice bias, and also tracked reward receipt, thereby predicting bias on future trials. We found reduced magnitude of bias in depressed patients, in whom the beta-specific effects were correspondingly reduced. Our findings suggest that ACC beta oscillations may orchestrate the learning of reward information to guide adaptive choice, and, more broadly, suggest a potential biomarker for anhedonia and point to future development of interventions to enhance reward impact for therapeutic benefit.


Subject(s)
Depressive Disorder, Major , Gyrus Cinguli , Reward , Humans , Gyrus Cinguli/physiology , Gyrus Cinguli/diagnostic imaging , Gyrus Cinguli/physiopathology , Male , Adult , Female , Depressive Disorder, Major/physiopathology , Depressive Disorder, Major/psychology , Choice Behavior/physiology , Middle Aged , Beta Rhythm/physiology , Epilepsy/physiopathology , Young Adult
11.
Neurospine ; 20(4): 1112-1123, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38171281

ABSTRACT

Osteoporotic vertebral fractures (OVFs) are a significant health concern linked to increased morbidity, mortality, and diminished quality of life. Traditional OVF risk assessment tools like bone mineral density (BMD) only capture a fraction of the risk profile. Artificial intelligence, specifically computer vision, has revolutionized other fields of medicine through analysis of videos, histopathology slides and radiological scans. In this review, we provide an overview of computer vision algorithms and current computer vision models used in predicting OVF risk. We highlight the clinical applications, future directions and limitations of computer vision in OVF risk prediction.

12.
J Neurosurg Spine ; 38(4): 417-424, 2023 04 01.
Article in English | MEDLINE | ID: mdl-36681945

ABSTRACT

OBJECTIVE: Knowledge of the manufacturer of the previously implanted pedicle screw systems prior to revision spinal surgery may facilitate faster and safer surgery. Often, this information is unavailable because patients are referred by other centers or because of missing information in the patients' records. Recently, machine learning and computer vision have gained wider use in clinical applications. The authors propose a computer vision approach to classify posterior thoracolumbar instrumentation systems. METHODS: Lateral and anteroposterior (AP) radiographs were obtained in patients undergoing posterior thoracolumbar pedicle screw implantation for any indication at the authors' institution (2015-2021). DICOM images were cropped to include both the pedicle screws and rods. Images were labeled with the manufacturer according to the operative record. Multiple feature detection methods were tested (SURF, MSER, and Minimum Eigenvalues); however, the bag-of-visual-words technique with KAZE feature detection was ultimately used to construct a computer vision support vector machine (SVM) classifier for lateral, AP, and fused lateral and AP images. Accuracy was tested using an 80%/20% training/testing pseudorandom split over 100 iterations. Using a reader study, the authors compared the model performance with the current practice of surgeons and manufacturer representatives identifying spinal hardware by visual inspection. RESULTS: Among the three image types, 355 lateral, 379 AP, and 338 fused radiographs were obtained. The five pedicle screw implants included in this study were the Globus Medical Creo, Medtronic Solera, NuVasive Reline, Stryker Xia, and DePuy Expedium. When the two most common manufacturers used at the authors' institution were binarily classified (Globus Medical and Medtronic), the accuracy rates for lateral, AP, and fused images were 93.15% ± 4.06%, 88.98% ± 4.08%, and 91.08% ± 5.30%, respectively. Classification accuracy decreased by approximately 10% with each additional manufacturer added. The multilevel five-way classification accuracy rates for lateral, AP, and fused images were 64.27% ± 5.13%, 60.95% ± 5.52%, and 65.90% ± 5.14%, respectively. In the reader study, the model performed five-way classification on 100 test images with 79% accuracy in 14 seconds, compared with an average of 44% accuracy in 20 minutes for two surgeons and three manufacturer representatives. CONCLUSIONS: The authors developed a KAZE feature detector with an SVM classifier that successfully identified posterior thoracolumbar hardware at five-level classification. The model performed more accurately and efficiently than the method currently used in clinical practice. The relative computational simplicity of this model, from input to output, may facilitate future prospective studies in the clinical setting.
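
The bag-of-visual-words pipeline named in the abstract (KAZE features, a clustered visual vocabulary, and an SVM) could be prototyped roughly as below; the image paths, labels, and vocabulary size are placeholders, and this is not the published model.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

kaze = cv2.KAZE_create()

def kaze_descriptors(path):
    """Detect KAZE keypoints in one radiograph and return their 64-D descriptors."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = kaze.detectAndCompute(img, None)
    return desc if desc is not None else np.empty((0, 64), np.float32)

paths = ["ap_0001.png", "ap_0002.png"]                # placeholder cropped radiographs
labels = ["Globus Medical", "Medtronic"]              # placeholder operative-record labels
all_desc = [kaze_descriptors(p) for p in paths]

# Visual vocabulary: cluster all descriptors, then histogram each image over the words
n_words = 200
vocab = KMeans(n_clusters=n_words, random_state=0).fit(np.vstack(all_desc))

def bovw_histogram(desc):
    hist = np.bincount(vocab.predict(desc), minlength=n_words).astype(float)
    return hist / max(hist.sum(), 1.0)

X = np.vstack([bovw_histogram(d) for d in all_desc])
clf = SVC(kernel="rbf").fit(X, labels)                # binary manufacturer classifier
```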


Subject(s)
Pedicle Screws , Spinal Fusion , Surgery, Computer-Assisted , Humans , Lumbar Vertebrae/diagnostic imaging , Lumbar Vertebrae/surgery , Prospective Studies , Surgery, Computer-Assisted/methods , Spinal Fusion/methods , Thoracic Vertebrae/diagnostic imaging , Thoracic Vertebrae/surgery
13.
J Neurosurg ; 138(2): 347-357, 2023 02 01.
Article in English | MEDLINE | ID: mdl-35907186

ABSTRACT

OBJECTIVE: Stereotactic radiosurgical capsulotomy (SRS-C) is an effective neurosurgical option for patients with treatment-resistant obsessive-compulsive disorder (TROCD). Unlike other procedures such as deep brain stimulation and radiofrequency ablation, the cost-effectiveness of SRS-C for TROCD has not been investigated. The authors herein report the first cost-effectiveness analysis of SRS-C for TROCD. METHODS: Using a decision analytic model, the authors compared the cost-effectiveness of SRS-C to treatment as usual (TAU) for TROCD. Treatment response and complication rates were derived from a review of relevant clinical trials. Published algorithms were used to convert Yale-Brown Obsessive Compulsive Scale scores into utility scores reflecting improvements in quality of life. Costs were approached from the healthcare sector perspective and were drawn from Medicare reimbursement rates and available healthcare economics data. A Monte Carlo simulation and probabilistic sensitivity analysis were performed to estimate the incremental cost-effectiveness ratio. RESULTS: One hundred fifty-eight TROCD patients across 9 studies who had undergone SRS-C and had at least 36 months of follow-up were included in the model. Compared to TAU, SRS-C was more cost-effective, with an estimated incremental cost-effectiveness ratio of $28,960 per quality-adjusted life year (QALY) gained. Within the 3-year time horizon, net QALYs gained were greater in the SRS-C group than the TAU group by 0.27 (95% CI 0.2698-0.2702, p < 0.0001). At willingness-to-pay thresholds of $50,000 and $100,000 per QALY, the Monte Carlo simulation revealed that SRS-C was more cost-effective than TAU in 83% and 100% of iterations, respectively. CONCLUSIONS: Compared to TAU, SRS-C for TROCD is more cost-effective under a range of possible cost and effectiveness values.
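
The probabilistic sensitivity analysis described above can be illustrated with a small Monte Carlo sketch; every cost and utility distribution below is a made-up placeholder, not an input from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical 3-year costs (USD) and QALYs for SRS capsulotomy vs. treatment as usual
cost_srs = rng.normal(40_000, 5_000, n)
cost_tau = rng.normal(32_000, 4_000, n)
qaly_srs = rng.normal(2.10, 0.05, n)
qaly_tau = rng.normal(1.83, 0.05, n)

delta_cost = cost_srs - cost_tau
delta_qaly = qaly_srs - qaly_tau
icer = delta_cost.mean() / delta_qaly.mean()          # incremental cost-effectiveness ratio

for wtp in (50_000, 100_000):                         # willingness-to-pay thresholds per QALY
    prob_ce = np.mean(wtp * delta_qaly - delta_cost > 0)
    print(f"ICER ${icer:,.0f}/QALY; P(cost-effective at ${wtp:,}/QALY) = {prob_ce:.0%}")
```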


Subject(s)
Obsessive-Compulsive Disorder , Radiosurgery , United States , Humans , Aged , Cost-Effectiveness Analysis , Quality of Life , Radiosurgery/methods , Cost-Benefit Analysis , Medicare , Obsessive-Compulsive Disorder/surgery
14.
Transplant Direct ; 9(4): e1467, 2023 Apr.
Article in English | MEDLINE | ID: mdl-37009165

ABSTRACT

Donation after circulatory death (DCD) allografts might represent one of the largest untapped sources of liver allografts. Our aim was to identify independent recipient risk factors that predict mortality in DCD allograft recipients to preselect optimal candidates for successful transplantation. Furthermore, we compared the application of our newly constructed DCD Recipient Selector Index (RSI) score to previously developed models to determine superiority in predicting recipient survival. Methods: Using the Organ Procurement and Transplantation Network database, we performed univariate and multivariate retrospective analyses on 4228 DCD liver allograft recipients. Results: We identified 8 significant factors and incorporated them into the weighted RSI to predict 3-mo survival following DCD liver transplantation with a C-statistic of 0.6971. The most significant recipient risk factors were recipient serum sodium levels >150 mEq/L at transplant, recipient albumin <2.0 g/dL at transplant, and a history of portal vein thrombosis. Because Model for End-Stage Liver Disease (MELD) score components were included as individual predictors, the DCD RSI predicts survival independently of MELD. Upon comparison with 3 previous recipient risk scores-Balance of Risk, Renal Risk Index, Patient-Survival Outcomes Following Liver Transplantation-the DCD RSI was determined to be superior at selecting optimal candidates pre-DCD transplantation, yielding a C-statistic of 0.6971. Conclusions: After verifying the performance of predictive indices for selection of DCD recipients, the DCD RSI is best used to preselect patients for optimized outcomes after DCD transplantation. This can increase utilization of DCD donors by improving outcomes.
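
To make the idea of a weighted recipient index concrete, here is a minimal sketch of scoring recipients and checking discrimination with a C-statistic; the weights and columns are invented for illustration and are not the published DCD RSI coefficients.

```python
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("dcd_recipients.csv")   # hypothetical OPTN extract

weights = {                              # hypothetical point values, not the published ones
    "sodium_gt_150": 3,
    "albumin_lt_2": 2,
    "portal_vein_thrombosis": 2,
    "on_ventilator": 1,
}
df["rsi"] = sum(df[col] * w for col, w in weights.items())

# C-statistic: probability the index ranks a 3-month death above a 3-month survivor
c_stat = roc_auc_score(df["died_within_3mo"], df["rsi"])
print(f"C-statistic = {c_stat:.4f}")
```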

15.
J Neurosurg ; 138(4): 1016-1027, 2023 04 01.
Article in English | MEDLINE | ID: mdl-35932263

ABSTRACT

OBJECTIVE: Deep brain stimulation (DBS) for Parkinson disease (PD) is traditionally performed with awake intraoperative testing and/or microelectrode recording. Recently, however, the procedure has been increasingly performed under general anesthesia with image-based verification. The authors sought to compare structural and functional networks engaged by awake and asleep PD-DBS of the subthalamic nucleus (STN) and correlate them with clinical outcomes. METHODS: Levodopa equivalent daily dose (LEDD), pre- and postoperative motor scores on the Movement Disorders Society-Unified Parkinson's Disease Rating Scale part III (MDS-UPDRS III), and total electrical energy delivered (TEED) at 6 months were retrospectively assessed in patients with PD who received implants of bilateral DBS leads. In subset analysis, implanted electrodes were reconstructed using the Lead-DBS toolbox. Volumes of tissue activated (VTAs) were used as seed points in group volumetric and connectivity analysis. RESULTS: The clinical courses of 122 patients (52 asleep, 70 awake) were reviewed. Operating room and procedure times were significantly shorter in asleep cases. LEDD reduction, MDS-UPDRS III score improvement, and TEED at the 6-month follow-up did not differ between groups. In subset analysis (n = 40), proximity of active contact, VTA overlap, and desired network fiber counts with motor STN correlated with lower DBS energy requirement and improved motor scores. Discriminative structural fiber tracts involving supplementary motor area, thalamus, and brainstem were associated with optimal clinical improvement. Areas of highest structural and functional connectivity with VTAs did not significantly differ between the two groups. CONCLUSIONS: Compared to awake STN DBS, asleep procedures can achieve similarly optimal targeting (based on clinical outcomes, electrode placement, and connectivity estimates) with greater efficiency and shorter operating room times.
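
The abstract compares total electrical energy delivered (TEED) between groups; one commonly cited per-second formulation (voltage squared times frequency times pulse width, divided by impedance) is sketched here with made-up stimulation parameters.

```python
def teed_per_second(voltage_v, frequency_hz, pulse_width_s, impedance_ohm):
    """Estimated stimulation energy delivered per second, in joules."""
    return (voltage_v ** 2) * frequency_hz * pulse_width_s / impedance_ohm

# Hypothetical settings: 2.5 V, 130 Hz, 60 microsecond pulse width, 1000 ohm impedance
print(f"{teed_per_second(2.5, 130.0, 60e-6, 1000.0):.2e} J/s")
```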


Subject(s)
Deep Brain Stimulation , Parkinson Disease , Subthalamic Nucleus , Humans , Parkinson Disease/therapy , Deep Brain Stimulation/methods , Wakefulness , Subthalamic Nucleus/surgery , Levodopa/therapeutic use , Treatment Outcome
16.
Brain Stimul ; 16(6): 1792-1798, 2023.
Article in English | MEDLINE | ID: mdl-38135358

ABSTRACT

BACKGROUND: Deep brain stimulation (DBS) and other neuromodulatory techniques are being increasingly utilized to treat refractory neurologic and psychiatric disorders. OBJECTIVE/HYPOTHESIS: To better understand the circuit-level pathophysiology of treatment-resistant depression (TRD) and treat the network-level dysfunction inherent to this challenging disorder, we adopted an approach of inpatient intracranial monitoring borrowed from the epilepsy surgery field. METHODS: We implanted 3 patients with 4 DBS leads (bilateral pair in both the ventral capsule/ventral striatum and subcallosal cingulate) and 10 stereo-electroencephalography (sEEG) electrodes targeting depression-relevant network regions. For surgical planning, we used an interactive, holographic visualization platform to appreciate the 3D anatomy and connectivity. In the initial surgery, we placed the DBS leads and sEEG electrodes using robotic stereotaxy. Subjects were then admitted to an inpatient monitoring unit for depression-specific neurophysiological assessments. Following these investigations, subjects returned to the OR to remove the sEEG electrodes and internalize the DBS leads to implanted pulse generators. RESULTS: Intraoperative testing revealed positive valence responses in all 3 subjects that helped verify targeting. Given the importance of the network-based hypotheses we were testing, we required accurate adherence to the surgical plan (to engage DBS and sEEG targets) and stability of DBS lead rotational position (to ensure that stimulation field estimates of the directional leads used during inpatient monitoring were relevant chronically), both of which we confirmed (mean radial error 1.2±0.9 mm; mean rotation 3.6±2.6°). CONCLUSION: This novel hybrid sEEG-DBS approach allows detailed study of the neurophysiological substrates of complex neuropsychiatric disorders.


Subject(s)
Deep Brain Stimulation , Depressive Disorder, Treatment-Resistant , Epilepsy , Humans , Epilepsy/therapy , Electroencephalography/methods , Depressive Disorder, Treatment-Resistant/therapy , Electrodes , Deep Brain Stimulation/methods , Electrodes, Implanted
17.
Surg Neurol Int ; 13: 178, 2022.
Article in English | MEDLINE | ID: mdl-35509526

ABSTRACT

Background: Anxiety is a common symptom of mental health disorders. Surgical treatment of anxiety-related disorders is limited by our understanding of the neural circuitry responsible for emotional regulation. Limbic regions communicate with other cortical and subcortical regions to generate emotional responses and behaviors toward anxiogenic stimuli. Epilepsy involving corticolimbic regions may disrupt normal neural circuitry and present with mood disorders. Anxiety presenting in patients with mesial temporal lobe epilepsy is common; however, anxiety in patients with cingulate epilepsy is not well described. Neurosurgical cases with rare clinical presentations may provide insight into the basic functionality of the human mind and ultimately lead to improvements in surgical treatments. Case Description: We present the case of a 24-year-old male with a 20-year history of nonlesional and cingulate epilepsy with an aura of anxiety and baseline anxiety. Noninvasive work-up was discordant. Intracranial evaluation using stereoelectroencephalography established the epileptogenic zone in the left anterior and mid-cingulate gyrus. Stimulation of the cingulate reproduced a sense of anxiety typical of the habitual auras. We performed laser interstitial thermal therapy of the left anterior and mid-cingulate gyrus. At 8 months following ablation, the patient reported a substantial reduction in seizure frequency and complete elimination of his baseline anxiety and anxious auras. Conclusion: This case highlights the role of the cingulate cortex (CC) in regulating anxiety. Ablation of the epileptic focus resolved both epilepsy-related anxiety and baseline features. Future studies assessing the role of the CC in anxiety disorders may enable improvements in surgical treatments for anxiety disorders.

18.
Brain Circ ; 8(1): 38-44, 2022.
Article in English | MEDLINE | ID: mdl-35372723

ABSTRACT

OBJECTIVE: Proper blood flow is essential for the maintenance of homeostasis for the human cerebrum. The dural venous sinuses comprise the dominant cerebral venous outflow path. Understanding the spatial configuration of the dural venous sinuses can provide valuable insight into several pathological conditions. Previously, only two-dimensional or cadaveric data have been used to understand cerebral outflow. For the first time, we applied three-dimensional rotational venography (3D-RV) to study and provide detailed quantitative morphological measurements of the terminal cerebral venous sinus system in several pathological states. SUBJECTS AND METHODS: Patients who underwent a 3D-RV procedure were identified by reviewing our local institution's endovascular database. Patients with high-quality angiographic images were selected. Eighteen patients were included (37.1 ± 3.8 years). Sinuses were divided into four segments, starting at the torcula and ending at the internal jugular vein. Segment length, 3D displacement, and cross-sectional area were measured. RESULTS: The transverse sinus (60.2 mm) was the longest segment, followed by the sigmoid sinus (55.1 mm). Cross-sectional areas were smallest at the middle of the transverse sinus (21.3 mm²) but increased at the sigmoid sinus (33.5 mm²) and at the jugular bulb (49.7 mm²). The only variation in displacements of venous flow was at the sigmoid-jugular junction, where 55% of cases had lateral displacements versus 45% medial, and 78% superior versus 22% inferior. CONCLUSIONS: We describe the terminal venous sinus system of patients with a variety of diagnoses, detailing segment length, cross-sectional area, and 3D path.

19.
World Neurosurg ; 162: e561-e567, 2022 06.
Article in English | MEDLINE | ID: mdl-35331948

ABSTRACT

BACKGROUND: Adult spinal deformity (ASD) surgery is becoming increasingly prevalent. Soft tissue defects arising from revision closure and impaired healing can predispose to wound complications including dehiscence and infection. Soft tissue coverage with local muscle flaps has been shown to minimize wound complications in high-risk patients. In this study, we evaluate the role of complex wound closure in preventing wound complications in high-risk spinal deformity patients. METHODS: The authors retrospectively reviewed charts of patients who underwent ASD surgery. Patients were stratified into muscle flap advancement (by neurosurgery or plastic surgery) closure versus primary approximation by neurosurgery. Relevant patient and operative factors were collected and summarized using descriptive statistics. Outcomes of interest included wound complication and revision surgery. RESULTS: Ninety-four cases met inclusion criteria, including 56 wounds closed by neurosurgery and 38 wounds closed by plastic surgery (PRS). Of the neurosurgery wounds, 31 and 25 were closed by primary approximation and muscular flap advancement, respectively. Patients operated on by PRS were higher risk than all patients operated on only by neurosurgery (P = 0.0037) but were not significantly higher risk than the neurosurgery-performed flap cohort (P = 0.4914). In subgroup analysis, despite similar levels of risk, the PRS population experienced lower rates of any wound complication (P = 0.028) and specifically dehiscence (P = 0.029) compared with the neurosurgery-performed flap closure cohort. CONCLUSIONS: Prophylactic involvement of plastic surgery in ASD surgery wound closure may improve wound outcomes in higher risk patients. A multidisciplinary approach with plastic and spine surgeons may lessen the risk of wound complications in high-risk spine surgeries.
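
The group comparisons of wound complication rates reported above are the kind of 2x2 categorical test that can be sketched in a couple of lines; the counts below are placeholders, not study data.

```python
from scipy.stats import fisher_exact

# Rows: plastic surgery flap closure vs. neurosurgery flap closure (hypothetical counts)
# Columns: wound complication, no complication
table = [[4, 34],
         [9, 16]]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```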


Subject(s)
Plastic Surgery Procedures , Surgical Wound Infection , Adult , Humans , Neurosurgical Procedures/adverse effects , Plastic Surgery Procedures/adverse effects , Retrospective Studies , Surgical Flaps/adverse effects , Surgical Wound Infection/epidemiology , Surgical Wound Infection/etiology , Surgical Wound Infection/prevention & control
20.
J Pers Med ; 12(7)2022 Jul 17.
Article in English | MEDLINE | ID: mdl-35887656

ABSTRACT

Orthotopic liver transplantation (OLT) is a lifesaving therapy for patients with irreversible liver damage caused by autoimmune liver diseases (AutoD) including autoimmune hepatitis (AIH), primary biliary cholangitis (PBC), and primary sclerosing cholangitis (PSC). Currently, it is unclear how access to transplantation differs among patients with various etiologies of liver disease. Our aim is to evaluate the likelihood of transplant and the long-term patient and graft survival after OLT for each listing etiology from 2000 to 2021. We conducted a large retrospective study of United Network for Organ Sharing (UNOS) liver transplant patients in five 4-year eras with five cohorts: AutoD (PBC, PSC, AIH cirrhosis), alcohol-related liver disease (ALD), hepatocellular carcinoma (HCC), viral hepatitis, and nonalcoholic steatohepatitis (NASH). We conducted a multivariate analysis for probability of transplant. Intent-to-treat (ITT) analysis was performed to assess the 10-year survival differences for each listing diagnosis while accounting for both waitlist and post-transplant survival. Across all eras, autoimmune conditions had a lower adjusted probability of transplant of 0.92 (0.92, 0.93) compared to ALD 0.97 (0.97, 0.97), HCC 1.08 (1.07, 1.08), viral hepatitis 0.99 (0.99, 0.99), and NASH 0.99 (0.99, 1.00). Patients with AutoD had significantly better post-transplant patient and graft survival than ALD, HCC, viral hepatitis, and NASH in each and across all eras (p-values all < 0.001). Patients with AutoD had superior ITT survival (p-value < 0.001, log rank test). In addition, the waitlist survival for patients with AutoD compared to other listing diagnoses was improved with the exception of ALD, which showed no significant difference (p-value = 0.1056, log rank test). Despite a superior 10-year graft and patient survival in patients transplanted for AutoD, patients with AutoD have a significantly lower probability of receiving a liver transplant compared to those transplanted for HCC, ALD, viral hepatitis, and NASH. Patients with AutoD may benefit from improved liver allocation while maintaining superior waitlist and post-transplant survival. Decreased access despite appropriate outcomes poses a significant risk of increased morbidity for patients with AutoD.
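
A minimal sketch of the intent-to-treat survival comparison described above (survival from listing across diagnosis cohorts, compared with a log-rank test); the file and column names are hypothetical stand-ins for UNOS fields.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("unos_listings.csv")    # hypothetical: one row per listed candidate

kmf = KaplanMeierFitter()
for dx, grp in df.groupby("listing_dx"):             # AutoD, ALD, HCC, viral hepatitis, NASH
    kmf.fit(grp["years_from_listing"], grp["died"], label=dx)
    kmf.plot_survival_function()

result = multivariate_logrank_test(df["years_from_listing"], df["listing_dx"], df["died"])
print("log-rank p-value:", result.p_value)
```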
