ABSTRACT
BACKGROUND: Several hypotheses may explain the association between substance use, posttraumatic stress disorder (PTSD), and depression. However, few studies have utilized a large multisite dataset to understand this complex relationship. Our study assessed the relationship between alcohol and cannabis use trajectories and PTSD and depression symptoms across 3 months in recently trauma-exposed civilians. METHODS: In total, 1618 (1037 female) participants provided self-report data on past 30-day alcohol and cannabis use and PTSD and depression symptoms during their emergency department (baseline) visit. We reassessed participants' substance use and clinical symptoms 2, 8, and 12 weeks posttrauma. Latent class mixture modeling determined alcohol and cannabis use trajectories in the sample. Changes in PTSD and depression symptoms were assessed across alcohol and cannabis use trajectories via a mixed-model repeated-measures analysis of variance. RESULTS: Three trajectory classes (low, high, and increasing use) provided the best model fit for both alcohol and cannabis use. The low alcohol use class exhibited lower PTSD symptoms at baseline than the high use class; the low cannabis use class exhibited lower PTSD and depression symptoms at baseline than the high and increasing use classes; these symptoms increased markedly at week 8 and declined at week 12. Participants already using alcohol and cannabis exhibited greater PTSD and depression symptoms at baseline, which increased at week 8 and decreased at week 12. CONCLUSIONS: Our findings suggest that alcohol and cannabis use trajectories are associated with the intensity of posttrauma psychopathology. These findings could potentially inform the timing of therapeutic strategies.
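Latent class mixture models are typically fit with dedicated packages (e.g., lcmm in R), which the abstract does not name; as a rough, hedged illustration of the trajectory-clustering step only, the sketch below clusters simulated four-visit use trajectories with scikit-learn's GaussianMixture and selects the number of classes by BIC. All data and variable names are hypothetical.

```python
# Minimal sketch of trajectory clustering: a Gaussian mixture over the
# repeated use measures stands in for latent class mixture modeling.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical trajectories: rows = participants, columns = past-30-day
# use reported at baseline and weeks 2, 8, and 12.
use = rng.poisson(lam=3.0, size=(1618, 4)).astype(float)

best = None
for k in (1, 2, 3, 4):  # compare class solutions by BIC
    gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(use)
    if best is None or gm.bic(use) < best.bic(use):
        best = gm

classes = best.predict(use)  # e.g., low / high / increasing use
print("chosen k:", best.n_components, "class sizes:", np.bincount(classes))
```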
Subject(s)
Cannabis; Stress Disorders, Post-Traumatic; Substance-Related Disorders; Humans; Female; Stress Disorders, Post-Traumatic/epidemiology; Stress Disorders, Post-Traumatic/diagnosis; Depression/diagnosis; Substance-Related Disorders/complications; Psychopathology
ABSTRACT
BACKGROUND: Knowledge of sex differences in risk factors for posttraumatic stress disorder (PTSD) can contribute to the development of refined preventive interventions. Therefore, the aim of this study was to examine whether women and men differ in their vulnerability to risk factors for PTSD. METHODS: As part of the longitudinal AURORA study, 2924 patients seeking emergency department (ED) treatment in the acute aftermath of trauma provided self-report assessments of pre-, peri-, and post-traumatic risk factors, as well as 3-month PTSD severity. We systematically examined sex-dependent effects of 16 risk factors that have previously been hypothesized to show different associations with PTSD severity in women and men. RESULTS: Women reported higher PTSD severity at 3 months post-trauma. Z-score comparisons indicated that for five of the 16 examined risk factors, the association with 3-month PTSD severity was stronger in men than in women. In multivariable models, interaction effects with sex were observed for pre-traumatic anxiety symptoms and acute dissociative symptoms; both showed stronger associations with PTSD in men than in women. Subgroup analyses suggested trauma type-conditional effects. CONCLUSIONS: Our findings indicate mechanisms to which men might be particularly vulnerable, demonstrating that known PTSD risk factors might behave differently in women and men. Analyses did not identify any risk factors to which women were more vulnerable than men, pointing toward further mechanisms needed to explain women's higher PTSD risk. Our study illustrates the need for a more systematic examination of sex differences in contributors to PTSD severity after trauma, which may inform refined preventive interventions.
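A common way to test whether a risk factor's association with an outcome differs by sex, consistent with the z-score comparisons mentioned above, is a Fisher r-to-z test on group-specific correlations. The sketch below is a minimal version with hypothetical correlation values and group sizes, not the study's actual procedure.

```python
# Fisher r-to-z comparison of a risk factor's correlation with 3-month
# PTSD severity in women vs. men (all inputs hypothetical).
import numpy as np
from scipy.stats import norm

def compare_correlations(r_women, n_women, r_men, n_men):
    z_w, z_m = np.arctanh(r_women), np.arctanh(r_men)  # Fisher z-transform
    se = np.sqrt(1.0 / (n_women - 3) + 1.0 / (n_men - 3))
    z = (z_w - z_m) / se
    return z, 2 * norm.sf(abs(z))  # z statistic, two-sided p-value

# Hypothetical example: a stronger association in men than in women.
z, p = compare_correlations(r_women=0.25, n_women=1800, r_men=0.40, n_men=1100)
print(f"z = {z:.2f}, p = {p:.4f}")
```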
ABSTRACT
Considerable racial/ethnic disparities persist in exposure to life stressors and socioeconomic resources that can directly affect threat neurocircuitry, particularly the amygdala, which partially mediates susceptibility to adverse posttraumatic outcomes. Limited work to date, however, has investigated potential racial/ethnic variability in amygdala reactivity or connectivity that may in turn be related to outcomes such as post-traumatic stress disorder (PTSD). Participants from the AURORA study (n = 283), a multisite longitudinal study of trauma outcomes, completed functional magnetic resonance imaging (fMRI) and psychophysiological assessments within approximately two weeks of trauma exposure. Seed-based amygdala connectivity and amygdala reactivity during passive viewing of fearful and neutral faces were assessed during fMRI. Physiological activity was assessed during Pavlovian threat conditioning. Participants also reported the severity of posttraumatic symptoms 3 and 6 months after trauma. Black individuals showed lower baseline skin conductance levels and startle compared to White individuals, but no differences were observed in physiological reactions to threat. Further, Hispanic and Black participants showed greater amygdala connectivity to regions including the dorsolateral prefrontal cortex (PFC), dorsal anterior cingulate cortex, insula, and cerebellum compared to White participants. No differences were observed in amygdala reactivity to threat. Amygdala connectivity was associated with 3-month PTSD symptoms, but the associations differed by racial/ethnic group and were partly driven by group differences in structural inequities. The present findings suggest that variability in tonic neurophysiological arousal between racial/ethnic groups in the early aftermath of trauma, driven by structural inequality, impacts neural processes that mediate susceptibility to later PTSD symptoms.
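Seed-based connectivity, as named above, reduces to correlating a seed region's mean time series with every other region's time series. Here is a minimal numpy sketch under hypothetical array shapes; real pipelines add preprocessing, confound regression, and voxelwise statistical maps.

```python
# Seed-based functional connectivity: Fisher z-transformed correlation
# of an amygdala seed time series with each target region time series.
import numpy as np

def seed_connectivity(seed_ts, brain_ts):
    """seed_ts: (T,) mean amygdala time series.
    brain_ts: (T, V) time series for V voxels or parcels."""
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    brain = (brain_ts - brain_ts.mean(0)) / brain_ts.std(0)
    r = seed @ brain / len(seed)  # Pearson r per column
    return np.arctanh(np.clip(r, -0.999999, 0.999999))

rng = np.random.default_rng(1)  # simulated data, hypothetical shapes
z_map = seed_connectivity(rng.standard_normal(300),
                          rng.standard_normal((300, 400)))
print(z_map.shape)  # one connectivity value per target region
```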
Subject(s)
Fear; Stress Disorders, Post-Traumatic; Humans; Longitudinal Studies; Fear/physiology; Amygdala; Gyrus Cinguli/pathology; Magnetic Resonance Imaging; Prefrontal Cortex/pathology
ABSTRACT
Childhood trauma is a known risk factor for trauma- and stress-related disorders in adulthood. However, limited research has investigated the impact of childhood trauma on brain structure linked to later posttraumatic dysfunction. We investigated the effect of childhood trauma on white matter microstructure after recent trauma and its relationship with future posttraumatic dysfunction among trauma-exposed adult participants (n = 202) recruited from emergency departments as part of the AURORA Study. Participants completed self-report scales assessing prior childhood maltreatment within 2 weeks, in addition to assessments of PTSD, depression, anxiety, and dissociation symptoms within 6 months of their traumatic event. Fractional anisotropy (FA) obtained from diffusion tensor imaging (DTI) collected at 2 weeks and 6 months was used to index white matter microstructure. Childhood maltreatment load predicted 6-month PTSD symptoms (b = 1.75, SE = 0.78, 95% CI = [0.20, 3.29]) and inversely varied with FA in the bilateral internal capsule (IC) at 2 weeks (p = 0.0294, FDR corrected) and 6 months (p = 0.0238, FDR corrected). We observed a significant indirect effect of childhood maltreatment load on 6-month PTSD symptoms through 2-week IC microstructure (b = 0.37, Boot SE = 0.18, 95% CI = [0.05, 0.76]) that fully mediated the effect of childhood maltreatment load on PCL-5 scores (b = 1.37, SE = 0.79, 95% CI = [-0.18, 2.93]). IC microstructure did not mediate relationships between childhood maltreatment and depressive, anxiety, or dissociative symptomatology. Our findings suggest a unique role for IC microstructure as a stable neural pathway between childhood trauma and future PTSD symptoms following recent trauma. Notably, our work did not support roles of white matter tracts previously found to vary with PTSD symptoms and childhood trauma exposure, including the cingulum bundle, uncinate fasciculus, and corpus callosum. Given that the IC contains sensory fibers linked to perception and motor control, childhood maltreatment might impact the neural circuits that relay and process threat-related inputs and responses to trauma.
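The bootstrapped indirect effect reported above can be illustrated with a simple percentile bootstrap of the a*b path product. The sketch below uses simulated stand-ins for maltreatment load (X), 2-week IC FA (M), and 6-month PCL-5 scores (Y), not the study data.

```python
# Percentile-bootstrap mediation: indirect effect of X on Y through M.
import numpy as np

rng = np.random.default_rng(2)
n = 202
X = rng.normal(size=n)                        # maltreatment load (stand-in)
M = -0.3 * X + rng.normal(size=n)             # 2-week IC FA (stand-in)
Y = -1.2 * M + 0.5 * X + rng.normal(size=n)   # 6-month PCL-5 (stand-in)

def ab_path(X, M, Y):
    a = np.polyfit(X, M, 1)[0]                   # a path: X -> M slope
    Z = np.column_stack([np.ones(len(X)), X, M])
    b = np.linalg.lstsq(Z, Y, rcond=None)[0][2]  # b path: M -> Y given X
    return a * b                                 # indirect effect

boot = []
for _ in range(2000):                         # resample participants
    idx = rng.integers(0, n, n)
    boot.append(ab_path(X[idx], M[idx], Y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect = {ab_path(X, M, Y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```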
ABSTRACT
Post-traumatic stress disorder (PTSD) is an independent risk factor for developing heart failure; however, the underlying cardiac mechanisms are still elusive. This study aims to evaluate the real-time effects of experimentally induced PTSD symptom activation on various cardiac contractility and autonomic measures. We recorded synchronized electrocardiograms and impedance cardiograms from 137 male veterans (17 PTSD, 120 non-PTSD; 48 twin pairs, 41 unpaired singles) during a laboratory-based traumatic reminder stressor. To identify the parameters describing the cardiac mechanisms by which trauma reminders can stress the heart, we used a feature selection mechanism along with a random forest classifier to distinguish PTSD from non-PTSD. We extracted 99 parameters, including 76 biosignal-based features and 23 sociodemographic, medical history, and psychiatric diagnosis features. A subject/twin-wise stratified nested cross-validation procedure was used for parameter tuning and model assessment to identify the important parameters. The identified parameters included biomarkers such as pre-ejection period, acceleration index, velocity index, Heather index, and several physiology-agnostic features. These parameters, identified during trauma recall, suggest a combination of increased sympathetic nervous system (SNS) activity and deteriorated cardiac contractility that may increase heart failure risk in PTSD. This indicates that PTSD symptom activation is associated with real-time reductions in several cardiac contractility measures despite SNS activation. This finding may be useful in future cardiac prevention efforts.
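Subject/twin-wise stratified nested cross-validation keeps co-twins in the same fold, so the inner tuning loop never sees a test subject's twin. A minimal sketch with GroupKFold and a random forest follows; the features, labels, and twin-pair grouping are simulated, and the parameter grid is illustrative.

```python
# Nested, group-wise cross-validation around a random forest classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, GridSearchCV

rng = np.random.default_rng(3)
X = rng.normal(size=(137, 99))      # 99 extracted parameters (simulated)
y = rng.integers(0, 2, size=137)    # PTSD vs. non-PTSD label (simulated)
groups = np.arange(137) // 2        # hypothetical twin-pair identifiers

outer, inner = GroupKFold(n_splits=5), GroupKFold(n_splits=3)
scores = []
for train, test in outer.split(X, y, groups):
    search = GridSearchCV(
        RandomForestClassifier(random_state=0),
        param_grid={"n_estimators": [100, 300], "max_depth": [None, 5]},
        cv=inner.split(X[train], y[train], groups[train]),
    )
    search.fit(X[train], y[train])  # inner loop tunes hyperparameters
    scores.append(search.score(X[test], y[test]))
print("nested-CV accuracy: %.3f" % np.mean(scores))
```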
Subject(s)
Heart Failure; Stress Disorders, Post-Traumatic; Veterans; Humans; Male; Stress Disorders, Post-Traumatic/diagnosis; Electric Impedance; Mental Recall/physiology; Twins; Veterans/psychology
ABSTRACT
BACKGROUND: Atrial fibrillation (AF) is often asymptomatic and thus under-observed. Given the high risks of stroke and heart failure among patients with AF, early prediction and effective management are crucial. Given the prevalence of obstructive sleep apnea among AF patients, electrocardiogram (ECG) analysis from polysomnography (PSG) offers a unique opportunity for early AF prediction. Our aim is to identify individuals at high risk of AF development from single-lead ECGs recorded during standard PSG. METHODS: We analyzed 18,782 single-lead ECG recordings from 13,609 subjects undergoing PSG at the Massachusetts General Hospital sleep laboratory. AF presence was identified using ICD-9/10 codes. The dataset included 15,913 recordings without AF history and 2054 recordings from patients diagnosed with AF between one month and fifteen years post-PSG. Data were partitioned into training, validation, and test cohorts, ensuring that individual patients remained exclusive to each cohort. The test set was held out during the training process. We employed two different methods for feature extraction to build a final model for AF prediction: extraction of hand-crafted ECG features and a deep learning method. For extraction of hand-crafted ECG features, recordings were split into 30-s windows, and those with a signal quality index (SQI) below 0.95 were discarded. From each remaining window, 150 features were extracted from the time, frequency, and time-frequency domains, and from phase-space reconstructions of the ECG. A compilation of 12 statistical features summarized these window-specific features per recording, resulting in 1800 features (12 × 150). A pre-trained deep neural network from the PhysioNet Challenge 2021 was updated using transfer learning to discriminate recordings with and without AF. The model processed PSG ECGs in 16-s windows to generate AF probabilities, from which 13 statistical features were extracted. Combining the 1800 hand-crafted features with the 13 from the deep learning model, we performed feature selection and subsequently trained a shallow neural network to predict future AF, evaluating its performance on the test cohort. RESULTS: On the test set, our model exhibited sensitivity, specificity, and precision of 0.67, 0.81, and 0.3, respectively, for AF prediction. Survival analysis revealed a hazard ratio of 8.36 (p = 1.93 × 10⁻⁵², log-rank test) for AF outcomes. CONCLUSIONS: Our proposed ECG analysis method, utilizing overnight PSG data, shows promise in AF prediction despite modest precision, which implies a substantial false-positive rate. This approach could enable low-cost screening and proactive treatment for high-risk patients. Refinements, including additional physiological parameters, may reduce false positives, enhancing clinical utility and accuracy.
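The windowing-and-summarization scheme in METHODS (30-s windows, an SQI gate, per-window features, then per-recording summary statistics) can be sketched as below. The sampling rate, the SQI heuristic, and the feature set are simplified placeholders, and only a small subset of the paper's 150 features and 12 statistics is shown.

```python
# Window an ECG, drop low-quality windows, extract per-window features,
# then summarize each feature across windows with summary statistics.
import numpy as np
from scipy import stats

FS = 256  # hypothetical sampling rate (Hz)

def window_features(ecg, fs=FS, win_s=30, sqi_min=0.95):
    n = win_s * fs
    feats = []
    for i in range(0, len(ecg) - n + 1, n):
        w = ecg[i:i + n]
        # Placeholder SQI: penalize large sample-to-sample jumps.
        sqi = 1.0 - np.mean(np.abs(np.diff(w)) > 5 * w.std())
        if sqi < sqi_min:
            continue  # discard poor-quality windows
        feats.append([w.mean(), w.std(), stats.skew(w), stats.kurtosis(w)])
    return np.asarray(feats)  # shape: (kept windows, features)

def summarize(feats):
    # A subset of per-feature summary statistics (the paper uses 12).
    funcs = [np.mean, np.std, np.median, np.min, np.max,
             np.ptp, stats.skew, stats.kurtosis, stats.iqr]
    return np.concatenate([[f(feats[:, j]) for f in funcs]
                           for j in range(feats.shape[1])])

ecg = np.random.default_rng(4).standard_normal(FS * 600)  # 10-min record
print(summarize(window_features(ecg)).shape)  # features x statistics
```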
Subject(s)
Atrial Fibrillation; Electrocardiography; Polysomnography; Humans; Atrial Fibrillation/diagnosis; Atrial Fibrillation/physiopathology; Electrocardiography/methods; Male; Female; Middle Aged; Aged; Predictive Value of Tests; Deep Learning; Heart Rate/physiology; Sleep
ABSTRACT
Regular blood pressure (BP) monitoring in clinical and ambulatory settings plays a crucial role in the prevention, diagnosis, treatment, and management of cardiovascular diseases. Recently, the widespread adoption of ambulatory BP measurement devices has been predominantly driven by the increased prevalence of hypertension and its associated risks and clinical conditions. Recent guidelines advocate for BP monitoring as part of routine clinical visits or even at home. This increased utilization of BP measurement technologies has raised significant concerns regarding the accuracy of reported BP values across settings. In this survey, which focuses mainly on cuff-based BP monitoring technologies, we highlight how BP measurements can exhibit substantial biases and variances due to factors such as measurement and device errors, demographics, and body habitus. Given these inherent biases, the development of a new generation of cuff-based BP devices that use artificial intelligence (AI) has significant potential. We present future avenues in which AI-assisted technologies can leverage the extensive clinical literature on BP-related studies together with the large collections of BP records available in electronic health records. These resources can be combined with machine learning approaches, including deep learning and Bayesian inference, to remove BP measurement biases and provide individualized BP-related cardiovascular risk indexes.
Subject(s)
Artificial Intelligence; Hypertension; Humans; Blood Pressure/physiology; Bayes Theorem; Blood Pressure Determination; Hypertension/diagnosis
ABSTRACT
Background: Atrial fibrillation (AFib) detection via mobile ECG devices is promising, but algorithms often struggle to generalize across diverse datasets and platforms, limiting their real-world applicability. Objective: This study aims to develop a robust, generalizable AFib detection approach for mobile ECG devices using crowdsourced algorithms. Methods: We developed a voting algorithm using a random forest, integrating six open-source AFib detection algorithms from the PhysioNet Challenge. The algorithm was trained on an AliveCor dataset and tested on two disjoint AliveCor datasets and one Apple Watch dataset. Results: The voting algorithm outperformed the base algorithms across all metrics, with average sensitivity of 0.884, specificity of 0.988, PPV of 0.917, NPV of 0.985, and F1-score of 0.943 across all datasets. It also demonstrated the least variability among datasets, indicating the greatest robustness and effectiveness in diverse data environments. Moreover, it surpassed Apple's algorithm on all metrics and showed higher specificity but lower sensitivity than AliveCor's Kardia algorithm. Conclusions: This study demonstrates the potential of crowdsourced, multi-algorithmic strategies for enhancing AFib detection. Our approach shows robust cross-platform performance, addressing key generalization challenges in AI-enabled cardiac monitoring and underlining the potential for collaborative algorithms in wearable monitoring devices.
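The voting idea is essentially stacking: each base detector's output becomes a feature for a random forest meta-classifier. A minimal sketch follows, with the six PhysioNet detectors stubbed out and all data simulated.

```python
# Stacked "voting" over base AFib detectors via a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def base_algorithm_scores(ecg_recordings):
    """Placeholder: in practice, run the six open-source PhysioNet
    Challenge detectors and return each one's AF score per recording."""
    rng = np.random.default_rng(5)
    return rng.uniform(0, 1, size=(len(ecg_recordings), 6))

train_recs = list(range(800))                       # simulated recordings
train_labels = np.random.default_rng(6).integers(0, 2, 800)
meta_X = base_algorithm_scores(train_recs)          # (n_recordings, 6)
voter = RandomForestClassifier(n_estimators=300, random_state=0)
voter.fit(meta_X, train_labels)                     # learns to weight detectors

af_prob = voter.predict_proba(base_algorithm_scores(list(range(200))))[:, 1]
print("predicted AF prevalence:", (af_prob > 0.5).mean().round(3))
```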
Subject(s)
Algorithms; Atrial Fibrillation; Crowdsourcing; Electrocardiography; Atrial Fibrillation/diagnosis; Atrial Fibrillation/physiopathology; Humans; Crowdsourcing/methods; Electrocardiography/methods; Wearable Electronic Devices
ABSTRACT
Hippocampal impairments are reliably associated with post-traumatic stress disorder (PTSD); however, little research has characterized how increased threat sensitivity may interact with arousal responses to alter hippocampal reactivity, and further how these interactions relate to the sequelae of trauma-related symptoms. In a sample of individuals recently exposed to trauma (N=116, 76 female), we found that PTSD symptoms at 2 weeks were associated with decreased hippocampal responses to threat as assessed with functional magnetic resonance imaging (fMRI). Further, the relationship between hippocampal threat sensitivity and PTSD symptomology emerged only in individuals who showed transient, high threat-related arousal, as assayed by an independently collected measure of fear-potentiated startle. Collectively, our findings suggest that development of PTSD is associated with threat-related decreases in hippocampal function, due to increases in fear-potentiated arousal.
Significance Statement: Alterations in hippocampal function linked to threat-related arousal are reliably associated with post-traumatic stress disorder (PTSD); however, how these alterations relate to the sequelae of trauma-related symptoms is unknown. Prior models based on non-trauma samples suggest that arousal may impact hippocampal neurophysiology, leading to maladaptive behavior. Here we show that decreased hippocampal threat sensitivity interacts with fear-potentiated startle to predict PTSD symptoms. Specifically, individuals with high fear-potentiated startle and low, transient hippocampal threat sensitivity showed the greatest PTSD symptomology. These findings bridge the literatures on threat-related arousal and hippocampal function to better understand PTSD risk.
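Statistically, the arousal-by-reactivity effect described above is a moderation term in a regression. A minimal statsmodels sketch on simulated data shows the form of the test; all variable names and values are hypothetical.

```python
# Interaction (moderation) test: hippocampal threat reactivity x
# fear-potentiated startle (FPS) predicting PTSD symptoms.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 116
df = pd.DataFrame({
    "hipp": rng.normal(size=n),   # hippocampal threat reactivity (stand-in)
    "fps": rng.normal(size=n),    # fear-potentiated startle (stand-in)
})
df["ptsd"] = 20 - 2 * df.hipp * df.fps + rng.normal(scale=5, size=n)

model = smf.ols("ptsd ~ hipp * fps", data=df).fit()
print(model.summary().tables[1])  # inspect the hipp:fps interaction term
```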
ABSTRACT
OBJECTIVES: To develop the International Cardiac Arrest Research (I-CARE) database, a harmonized multicenter clinical and electroencephalography resource for acute hypoxic-ischemic brain injury research involving patients with cardiac arrest. DESIGN: Multicenter cohort, partly prospective and partly retrospective. SETTING: Seven academic or teaching hospitals from the United States and Europe. PATIENTS: Individuals 16 years old or older who were comatose after return of spontaneous circulation following a cardiac arrest and who had continuous electroencephalography monitoring were included. INTERVENTIONS: Not applicable. MEASUREMENTS AND MAIN RESULTS: Clinical and electroencephalography data were harmonized and stored in a common Waveform Database-compatible format. Automated spike frequency, background continuity, and artifact detection on electroencephalography were calculated with 10-second resolution and summarized hourly. Neurologic outcome was determined at 3-6 months using the best Cerebral Performance Category (CPC) score. This database includes clinical data and 56,676 hours (3.9 terabytes) of continuous electroencephalography data for 1,020 patients. Most patients died (n = 603, 59%), 48 (5%) had severe neurologic disability (CPC 3 or 4), and 369 (36%) had good functional recovery (CPC 1-2). There is significant variability in mean electroencephalography recording duration depending on the neurologic outcome (range, 53-102 hr for CPC 1 and CPC 4, respectively). Epileptiform activity averaging 1 Hz or more in frequency for at least 1 hour was seen in 258 patients (25%) (19% for CPC 1-2 and 29% for CPC 3-5). Burst suppression was observed for at least 1 hour in 207 (56%) and 635 (97%) patients with CPC 1-2 and CPC 3-5, respectively. CONCLUSIONS: The I-CARE consortium electroencephalography database provides a comprehensive real-world clinical and electroencephalography dataset for neurophysiology research on comatose patients after cardiac arrest. This dataset covers the spectrum of abnormal electroencephalography patterns after cardiac arrest, including epileptiform patterns and those on the ictal-interictal continuum.
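The 10-second metrics summarized hourly can be reproduced in spirit with a simple pandas resample; the sketch below uses random stand-ins for the spike-frequency and continuity series, not I-CARE data.

```python
# Hourly summarization of 10-second EEG metrics via pandas resampling.
import numpy as np
import pandas as pd

rng = np.random.default_rng(11)
idx = pd.date_range("2020-01-01", periods=6 * 60 * 24, freq="10s")  # 1 day
metrics = pd.DataFrame({
    "spike_freq": rng.poisson(0.2, len(idx)),   # stand-in metric
    "continuity": rng.uniform(0, 1, len(idx)),  # stand-in metric
}, index=idx)

hourly = metrics.resample("1h").mean()  # hourly summaries
print(hourly.head())
```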
Subject(s)
Coma; Heart Arrest; Humans; Adolescent; Coma/diagnosis; Retrospective Studies; Prospective Studies; Heart Arrest/diagnosis; Electroencephalography
ABSTRACT
Post-traumatic stress disorder (PTSD) is an independent risk factor for incident heart failure, but the underlying cardiac mechanisms remain elusive. Impedance cardiography (ICG), especially when measured during stress, can help elucidate the psychophysiological pathways linking PTSD with heart failure. We investigated the association between PTSD and ICG-based contractility metrics (pre-ejection period (PEP) and Heather index (HI)) using a controlled twin study design with a laboratory-based traumatic reminder stressor. PTSD status was assessed using structured clinical interviews. We acquired synchronized electrocardiogram and ICG data while playing personalized trauma scripts. Using linear mixed-effects models, we examined twins as individuals and within PTSD-discordant pairs. We studied 137 male veterans (48 pairs, 41 unpaired singles) from the Vietnam War era with a mean (standard deviation) age of 68.5 (2.5) years. HI during trauma stress was lower in PTSD vs. non-PTSD individuals (7.2 vs. 9.3 [Ω/s²], p = .003). PEP reactivity (trauma minus neutral) was also more negative in PTSD vs. non-PTSD individuals (-7.4 vs. -2.0 [ms], p = .009). The HI and PEP associations with PTSD persisted in adjusted models during trauma and reactivity, respectively. In the within-pair analysis of eight PTSD-discordant twin pairs (out of 48 pairs), PTSD was associated with lower HI in the neutral, trauma, and reactivity conditions, whereas no association was found between PTSD and PEP. PTSD was associated with reduced HI and PEP, especially under trauma recall stress. This combination of increased sympathetic activation and decreased cardiac contractility may be concerning for increased heart failure risk after recurrent trauma re-experiencing in PTSD.
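Linear mixed-effects models with a random intercept per twin pair, as used here, can be written compactly with statsmodels. The sketch below fits the Heather index during trauma against PTSD status on simulated stand-in data.

```python
# Mixed-effects model: PTSD effect on HI, random intercept per twin pair.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
pairs = np.repeat(np.arange(48), 2)  # 48 hypothetical twin pairs
df = pd.DataFrame({
    "pair": pairs,
    "ptsd": rng.integers(0, 2, size=len(pairs)),
})
df["hi_trauma"] = 9.3 - 2.1 * df.ptsd + rng.normal(scale=2, size=len(df))

mixed = smf.mixedlm("hi_trauma ~ ptsd", data=df, groups=df["pair"]).fit()
print(mixed.params["ptsd"])  # estimated PTSD effect on contractility
```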
Subject(s)
Heart Failure; Stress Disorders, Post-Traumatic; Veterans; Humans; Male; Aged; Stress Disorders, Post-Traumatic/complications; Electric Impedance; Twins; Heart Failure/complications
ABSTRACT
STUDY OBJECTIVE: To derive and initially validate a brief bedside clinical decision support tool that identifies emergency department (ED) patients at high risk of substantial, persistent posttraumatic stress symptoms after a motor vehicle collision. METHODS: Derivation (n=1,282, 19 ED sites) and validation (n=282, 11 separate ED sites) data were obtained from adults prospectively enrolled in the Advancing Understanding of RecOvery afteR traumA (AURORA) study who were discharged from the ED after motor vehicle collision-related trauma. The primary outcome was substantial posttraumatic stress symptoms at 3 months (Posttraumatic Stress Disorder Checklist for Diagnostic and Statistical Manual of Mental Disorders-5 ≥38). Logistic regression derivation models were evaluated for discriminative ability using the area under the curve and for the accuracy of predicted risk probabilities using the Brier score. Candidate posttraumatic stress predictors assessed in these models (n=265) spanned a range of sociodemographic, baseline health, peritraumatic, and mechanistic domains. The final model selection was based on performance and ease of administration. RESULTS: Substantial 3-month posttraumatic stress symptoms were common in the derivation (27%) and validation (26%) cohorts. The area under the curve and Brier score of the final 8-question tool were 0.82 and 0.14 in the derivation cohort and 0.76 and 0.17 in the validation cohort. CONCLUSION: This simple 8-question tool shows promise for risk-stratifying individuals discharged home after a motor vehicle collision according to their likelihood of substantial posttraumatic stress symptoms. Both external validation of this instrument and further work to develop more accurate tools are needed. Such tools might benefit public health by enabling the conduct of preventive intervention trials and assisting the growing number of EDs that provide services to trauma survivors aimed at promoting psychological recovery.
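Evaluating such a tool comes down to discrimination (area under the curve) and calibration of predicted probabilities (Brier score). A minimal scikit-learn sketch on simulated data with the same cohort sizes and eight predictors follows.

```python
# Derive a logistic model, then report AUC and Brier score on validation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(9)
X_dev, y_dev = rng.normal(size=(1282, 8)), rng.integers(0, 2, 1282)
X_val, y_val = rng.normal(size=(282, 8)), rng.integers(0, 2, 282)

tool = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
p_val = tool.predict_proba(X_val)[:, 1]   # predicted risk probabilities
print("AUC:", roc_auc_score(y_val, p_val).round(2))     # discrimination
print("Brier:", brier_score_loss(y_val, p_val).round(2))  # calibration
```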
Subject(s)
Stress Disorders, Post-Traumatic; Adult; Humans; Stress Disorders, Post-Traumatic/psychology; Emergency Service, Hospital; Accidents, Traffic; Motor Vehicles
ABSTRACT
BACKGROUND: It has been hypothesized that low access to healthy and nutritious food increases health disparities. Low-accessibility areas, called food deserts, are particularly commonplace in lower-income neighborhoods. The metrics for measuring the health of the food environment, called food desert indices, are primarily based on decadal census data, limiting their frequency and geographical resolution to that of the census. We aimed to create a food desert index with finer geographic resolution than census data and better responsiveness to environmental changes. MATERIALS AND METHODS: We augmented decadal census data with real-time data from platforms such as Yelp and Google Maps and with crowdsourced answers to questionnaires from Amazon Mechanical Turk workers to create a real-time, context-aware, and geographically refined food desert index. Finally, we used this refined index in a proof-of-concept application that suggests alternative routes with similar estimated times of arrival (ETAs) between a source and destination in the Atlanta metropolitan area, as an intervention to expose a traveler to better food environments. RESULTS: We made 139,000 API requests to Yelp, analyzing 15,000 unique food retailers in the metro Atlanta area. In addition, we performed 248,000 walking and driving route analyses on these retailers using Google Maps' API. We found that the metro Atlanta food environment creates a strong bias toward eating out rather than preparing a meal at home when access to vehicles is limited. Contrary to the food desert index that we started with, which changed values only at neighborhood boundaries, the food desert index that we built on top of it captured the changing exposure of a subject as they walked or drove through the city. This model was also sensitive to changes in the environment that occurred after the census data were collected. CONCLUSIONS: Research on the environmental components of health disparities is flourishing. New machine learning models have the potential to augment various information sources and create fine-tuned models of the environment. This opens the way to better understanding the environment and its effects on health and to suggesting better interventions.
Subject(s)
Censuses; Crowdsourcing; Humans; Food Deserts; Information Sources; Machine Learning
ABSTRACT
Characterizing motor subtypes of Parkinson's disease (PD) is an important aspect of clinical care that is useful for prognosis and medical management. Although all PD cases involve the loss of dopaminergic neurons in the brain, individual cases may present with different combinations of motor signs, which may indicate differences in underlying pathology and potential response to treatment. However, the conventional method for distinguishing PD motor subtypes involves resource-intensive physical examination by a movement disorders specialist. Moreover, the standardized rating scales for PD rely on subjective observation, which requires specialized training and entails unavoidable inter-rater variability. In this work, we propose a system that uses machine learning models to automatically and objectively identify some PD motor subtypes, specifically Tremor-Dominant (TD) and Postural Instability and Gait Difficulty (PIGD), from 3D kinematic data recorded during walking tasks in patients with PD (MDS-UPDRS-III score 34.7 ± 10.5, average disease duration 7.5 ± 4.5 years). This study demonstrates a machine learning model utilizing kinematic data that identifies PD motor subtypes with a 79.6% F1 score (N = 55 patients with parkinsonism). This significantly outperformed a comparison model using classification based on gait features (19.8% F1 score). Variants of our model trained on individual patients achieved a 95.4% F1 score. This analysis revealed that temporal, spectral, and statistical features from lower-body movements are all helpful in distinguishing motor subtypes. Automatically assessing PD motor subtypes from walking alone may reduce the time and resources required from specialists, thereby improving patient care for PD treatments. Furthermore, this system can provide objective assessments to track changes in PD motor subtypes over time, allowing appropriate treatment plans to be implemented and modified for individual patients as needed.
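Temporal and spectral gait features of the kind credited above can be computed from a single kinematic channel with a Welch power spectrum. The sketch below is illustrative only; the sampling rate and frequency bands are hypothetical choices, not the paper's.

```python
# Simple temporal/spectral features from one kinematic gait channel.
import numpy as np
from scipy.signal import welch

def gait_features(signal, fs=100):
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 4)
    tremor_band = psd[(freqs >= 4) & (freqs <= 7)].sum()  # tremor power
    gait_band = psd[(freqs >= 0.5) & (freqs <= 3)].sum()  # stride power
    return {
        "rms": np.sqrt(np.mean(signal ** 2)),
        "dominant_freq": freqs[np.argmax(psd)],
        "tremor_to_gait_power": tremor_band / gait_band,
    }

sig = np.random.default_rng(10).standard_normal(100 * 60)  # 1 min at 100 Hz
print(gait_features(sig))
```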
Subject(s)
Gait Disorders, Neurologic; Parkinson Disease; Humans; Parkinson Disease/pathology; Tremor/diagnosis; Biomechanical Phenomena; Gait; Brain/pathology; Gait Disorders, Neurologic/diagnosis; Postural Balance/physiology
ABSTRACT
Freezing of gait (FOG) is a poorly understood, heterogeneous gait disorder seen in patients with parkinsonism that contributes to significant morbidity and social isolation. FOG is currently measured with scales that are typically administered by movement disorders specialists (i.e., the MDS-UPDRS) or through patient-completed questionnaires (N-FOG-Q), both of which are inadequate for addressing the heterogeneous nature of the disorder and are unsuitable for use in clinical trials. The purpose of this study was to devise a method to measure FOG objectively, hence improving our ability to identify it and accurately evaluate new therapies. A major innovation of this study is that it is the first of its kind to use a large sample (>30 h of data, N = 57) to apply explainable, multi-task deep learning models for quantifying FOG over the course of the medication cycle and at varying levels of parkinsonism severity. We trained interpretable deep learning models with multi-task learning to simultaneously score FOG (cross-validated F1 score 97.6%), identify medication state (OFF vs. ON levodopa; cross-validated F1 score 96.8%), and measure total PD severity (MDS-UPDRS-III score prediction error ≤ 2.7 points) using kinematic data from a well-characterized sample of N = 57 patients during levodopa challenge tests. The proposed model explained which kinematic movements are associated with each FOG severity level, and these were highly consistent with the features that movement disorders specialists are trained to identify as characteristic of freezing. Overall, we demonstrate that the capability of deep learning models to capture complex movement patterns in kinematic data can be used to automatically and objectively score FOG with high accuracy. These models have the potential to discover novel kinematic biomarkers for FOG that can be used for hypothesis generation and potentially as clinical trial outcome measures.
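Multi-task learning here means one shared encoder with separate output heads whose losses are summed. A minimal PyTorch sketch with three heads matching the targets above follows; the layer sizes, feature dimension, number of FOG classes, and loss weighting are illustrative assumptions, not the paper's architecture.

```python
# Shared encoder with three task heads: FOG score, medication state,
# and MDS-UPDRS-III severity regression.
import torch
import torch.nn as nn

class MultiTaskFOG(nn.Module):
    def __init__(self, n_features=64, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.fog_head = nn.Linear(hidden, 4)    # FOG severity classes
        self.med_head = nn.Linear(hidden, 2)    # OFF vs. ON levodopa
        self.updrs_head = nn.Linear(hidden, 1)  # severity regression

    def forward(self, x):
        h = self.encoder(x)
        return self.fog_head(h), self.med_head(h), self.updrs_head(h)

model = MultiTaskFOG()
x = torch.randn(8, 64)  # batch of kinematic feature vectors (simulated)
fog, med, updrs = model(x)
# Total loss sums the task losses (relative weights are a tuning choice):
loss = (nn.functional.cross_entropy(fog, torch.randint(0, 4, (8,)))
        + nn.functional.cross_entropy(med, torch.randint(0, 2, (8,)))
        + nn.functional.mse_loss(updrs.squeeze(1), torch.randn(8)))
loss.backward()
```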
Subject(s)
Gait Disorders, Neurologic; Parkinson Disease; Humans; Gait Disorders, Neurologic/diagnosis; Levodopa/therapeutic use; Parkinson Disease/diagnosis; Gait; Movement
ABSTRACT
Spatial navigation patterns in indoor space usage can reveal important cues about the cognitive health of participants. In this work, we present a low-cost, scalable, open-source edge computing system that uses Bluetooth low energy (BLE) beacons for tracking indoor movements in a large, 1700 m² facility used to carry out therapeutic activities for participants with mild cognitive impairment (MCI). The facility is instrumented with 39 edge computing systems, along with an on-premise fog server. Each participant carries a BLE beacon whose signals are received and analyzed by the edge computing systems. The edge computing systems are sparsely distributed in the wide, complex indoor space, challenging the standard trilateration technique for localizing subjects, which assumes a dense installation of BLE beacons. We propose a graph trilateration approach that considers the temporal density of hits from the BLE beacon to surrounding edge devices to handle the inconsistent coverage of edge devices. This method helps us tackle varying signal strength, which otherwise leads to intermittent detection of beacons. The proposed method can pinpoint the positions of multiple participants with an average error of 4.4 m and over 85% accuracy in region-level localization across the entire study area. Our experimental results, evaluated in a clinical environment, suggest that an ordinary medical facility can be transformed into a smart space that enables automatic assessment of individuals' movements, which may reflect health status or response to treatment.
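One simple reading of hit-density-weighted localization is a weighted centroid: each edge device that hears the beacon contributes its position, weighted by its hit count in a time window. The sketch below illustrates that idea with hypothetical coordinates and counts; the paper's graph trilateration is more elaborate.

```python
# Weighted-centroid localization from BLE hit densities (illustrative).
import numpy as np

def locate(device_xy, hit_counts):
    """device_xy: (k, 2) positions of edge devices that heard the beacon.
    hit_counts: (k,) number of BLE advertisements each one received."""
    w = np.asarray(hit_counts, dtype=float)
    w /= w.sum()                           # normalize hit densities
    return w @ np.asarray(device_xy)       # weighted centroid estimate

devices = [(0, 0), (10, 0), (0, 12)]       # meters, hypothetical layout
hits = [14, 3, 6]                          # hits in the last time window
print(locate(devices, hits))               # estimated (x, y)
```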
Subject(s)
Cloud Computing; Spatial Navigation; Humans; Wireless Technology; Health Status; Movement; Spatial Navigation/physiology
ABSTRACT
BACKGROUND: A better understanding of the extent to which prior occurrences of posttraumatic stress disorder (PTSD) and major depressive episode (MDE) predict psychopathological reactions to subsequent traumas might be useful in targeting posttraumatic preventive interventions. METHODS: Data come from 1306 patients presenting to 29 U.S. emergency departments (EDs) after a motor vehicle collision (MVC) in the Advancing Understanding of RecOvery afteR traumA (AURORA) study. Patients completed self-reports in the ED and at 2 weeks, 8 weeks, and 3 months post-MVC. Associations of pre-MVC probable PTSD and probable MDE histories with 3-month post-MVC probable PTSD and probable MDE were examined, along with mediation through intervening peritraumatic, 2-week, and 8-week disorders. RESULTS: 27.6% of patients had 3-month post-MVC probable PTSD and/or MDE. Pre-MVC lifetime histories of these disorders were not only significant (relative risk = 2.6-7.4) but dominant (63.1% population attributable risk proportion [PARP]) predictors of this 3-month outcome, with 46.6% prevalence of the outcome among patients with pre-MVC disorder histories versus 9.9% among those without such histories. The associations of pre-MVC lifetime disorders with the 3-month outcome were mediated largely by 2- and 8-week probable PTSD and MDE (PARP decreasing to 22.8% with controls for these intervening disorders). Decomposition showed that pre-MVC lifetime histories predicted both onset and persistence of these intervening disorders as well as the higher conditional prevalence of the 3-month outcome in the presence of these intervening disorders. CONCLUSIONS: Assessments of pre-MVC PTSD and MDE histories and follow-ups at 2 and 8 weeks could help target early interventions for psychopathological reactions to MVCs.
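The attributable-risk figure can be sanity-checked from the abstract's own numbers: with 46.6% outcome prevalence among exposed, 9.9% among unexposed, and 27.6% overall, the crude PARP is (overall - unexposed) / overall, about 64%, close to the reported model-based 63.1%. A short worked check (the exposure prevalence is derived, not reported):

```python
# Back-of-the-envelope PARP check using only quoted prevalences.
risk_exposed, risk_unexposed, overall = 0.466, 0.099, 0.276

# Implied share of patients with a pre-MVC disorder history:
p_exposed = (overall - risk_unexposed) / (risk_exposed - risk_unexposed)
parp = (overall - risk_unexposed) / overall
print(f"implied exposure prevalence = {p_exposed:.1%}")  # ~48%
print(f"crude PARP = {parp:.1%}")  # ~64%, near the model-based 63.1%
```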
Subject(s)
Depressive Disorder, Major; Stress Disorders, Post-Traumatic; Accidents, Traffic; Depression; Depressive Disorder, Major/epidemiology; Humans; Motor Vehicles; Stress Disorders, Post-Traumatic/epidemiology
ABSTRACT
We aim to provide a critical appraisal of the basic concepts underlying signal recording and processing technologies applied to (i) atrial fibrillation (AF) mapping, to unravel AF mechanisms and/or identify target sites for AF therapy, and (ii) AF detection, in order to optimize usage of these technologies, stimulate research aimed at closing knowledge gaps, and develop ideal AF recording and processing technologies. Recording and processing techniques for assessment of electrical activity during AF essential for diagnosis and guiding ablative therapy, including body surface electrocardiograms (ECG) and endo- or epicardial electrograms (EGM), are evaluated. We discuss (i) differences between uni-, bi-, and multi-polar (omnipolar/Laplacian) recording modes; (ii) the impact of recording technologies on EGM morphology; (iii) global and local mapping using various types of EGM and signal processing techniques, including isochronal, voltage, fractionation, dipole density, and rotor mapping, which enable derivation of parameters such as atrial rate, entropy, and conduction velocity/direction; (iv) the value of epicardial and optical mapping; (v) AF detection by cardiac implantable electronic devices containing various detection algorithms applicable to stored EGMs; and (vi) the contribution of machine learning (ML) to further improvement of signal processing technologies. Recording and processing of EGMs (or ECGs) are the cornerstones of (body surface) mapping of AF. Currently available AF recording and processing technologies are mainly restricted to specific applications or have technological limitations. Improving AF mapping by obtaining the highest-fidelity source signals (e.g., catheter-electrode combinations) for signal processing (e.g., filtering, digitization, and noise elimination) is of utmost importance. Novel acquisition instruments (multi-polar catheters combined with improved physical modelling and ML techniques) will enable enhanced and automated interpretation of EGM recordings in the near future.
Subject(s)
Atrial Fibrillation; Cardiology; Atrial Fibrillation/diagnosis; Atrial Fibrillation/therapy; Body Surface Potential Mapping; Heart Atria; Humans; Latin America
ABSTRACT
BACKGROUND: Obtaining medical data using wearable sensors is a potential replacement for in-hospital monitoring, but the lack of data for such sensors poses a challenge for development. One solution is using in-hospital recordings to boost performance via transfer learning. While there are many possible transfer learning algorithms, few have been tested in the domain of EEG-based sleep staging. Furthermore, there are few ways of determining which transfer learning method will work best besides exhaustive testing. Measures of transferability do exist, but they are typically used for selection of pre-trained models rather than algorithms, and few have been tested on medical signals. We tested several supervised transfer learning algorithms on a sleep staging task using a single channel of EEG (AF7-Fpz) captured with an in-home commercial system. RESULTS: Two neural networks (one bespoke and one a state-of-the-art open-source architecture) were pre-trained on one of six source datasets comprising 11,561 subjects undergoing clinical polysomnograms (PSGs), then re-trained on a target dataset of 75 full-night recordings from 24 subjects. Several transferability measures were then tested to determine which is most effective for assessing performance on unseen target data. Performance on the target dataset was improved using transfer learning, with re-training of the head layers being the most effective approach (in up to 63.9% of cases). Transferability measures generally provided significant correlations with accuracy (up to [Formula: see text]). CONCLUSION: Re-training the head layers provided the largest performance boost. Transferability measures are useful indicators of transfer learning effectiveness.
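Re-training the head layers means freezing the pre-trained feature extractor and fine-tuning only the final classifier on target-domain data. A minimal PyTorch sketch follows; the two-layer stand-in network, input size, and training loop are illustrative, not the paper's architectures.

```python
# Freeze a pre-trained "body", fine-tune only the "head" on target data.
import torch
import torch.nn as nn

pretrained = nn.Sequential(                  # stand-in for a PSG-trained net
    nn.Sequential(nn.Linear(3000, 256), nn.ReLU()),  # body / encoder
    nn.Linear(256, 5),                               # head: 5 sleep stages
)

body, head = pretrained[0], pretrained[1]
for p in body.parameters():
    p.requires_grad = False                  # freeze source-domain features

opt = torch.optim.Adam(head.parameters(), lr=1e-3)
x = torch.randn(32, 3000)                    # target-domain EEG epochs
y = torch.randint(0, 5, (32,))               # sleep-stage labels (simulated)
for _ in range(5):                           # brief head re-training loop
    opt.zero_grad()
    loss = nn.functional.cross_entropy(head(body(x)), y)
    loss.backward()
    opt.step()
print(float(loss))
```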
Subject(s)
Sleep Stages; Wearable Electronic Devices; Algorithms; Humans; Machine Learning; Neural Networks, Computer
ABSTRACT
BACKGROUND: The expanding use of complex machine learning methods such as deep learning has led to an explosion in human activity recognition, particularly as applied to health. However, complex models, which handle private and sometimes protected data, raise concerns about the potential leakage of identifiable data. In this work, we focus on the case of a deep network model trained on images of individual faces. MATERIALS AND METHODS: A previously published deep learning model, trained to estimate gaze from full-face image sequences, was stress-tested for personal information leakage by a white-box inference attack. Full-face video recordings taken from 493 individuals undergoing an eye-tracking-based evaluation of neurological function were used. Outputs, gradients, intermediate layer outputs, loss, and labels were used as inputs for a deep network with an added support vector machine emission layer to recognize membership in the training data. RESULTS: The inference attack method and associated mathematical analysis indicate that there is a low likelihood of unintended memorization of facial features in the deep learning model. CONCLUSIONS: This study shows that the model preserves the integrity of its training data with reasonable confidence. The same process can be implemented under similar conditions for other models.
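A white-box membership inference attack of the kind described feeds per-sample signals (loss, gradients, intermediate outputs) into a secondary classifier. The sketch below collects loss and gradient-norm features from a toy model and fits an SVM. Everything here is a stand-in: with random data, members and non-members are indistinguishable, so accuracy hovers near chance; a real attack draws them from the actual training set and a holdout set.

```python
# Toy white-box membership inference: per-sample loss and gradient norm
# as features for an SVM membership classifier.
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import SVC

model = nn.Sequential(nn.Linear(20, 16), nn.ReLU(), nn.Linear(16, 1))
loss_fn = nn.MSELoss()

def attack_features(x, y):
    model.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    grad_norm = torch.cat(
        [p.grad.flatten() for p in model.parameters()]).norm()
    return [loss.item(), grad_norm.item()]  # white-box signals

X_att, member = [], []
for is_member in (1, 0):  # stand-ins for training vs. holdout samples
    for _ in range(100):
        x, y = torch.randn(1, 20), torch.randn(1, 1)
        X_att.append(attack_features(x, y))
        member.append(is_member)

attacker = SVC().fit(np.array(X_att), member)
print("attack train accuracy:", attacker.score(np.array(X_att), member))
```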