Results 1 - 3 of 3
1.
medRxiv; 2024 May 16.
Article in English | MEDLINE | ID: mdl-38343835

ABSTRACT

Poor sleep quality in individuals with Autism Spectrum Disorder (ASD) is linked to severe daytime behaviors. This study explores the relationship between the structure of the prior night's sleep and its power to predict next-day behavior in ASD individuals. Nighttime motion was extracted with a low-cost near-infrared camera in a privacy-preserving way. Over two years, we recorded overnight data from 14 individuals, spanning more than 2,000 nights, and tracked challenging daytime behaviors, including aggression, self-injury, and disruption. We developed an ensemble machine learning algorithm to predict next-day behavior in the morning and in the afternoon. Our findings indicate that sleep quality is a more reliable predictor of next-day morning behavior than of afternoon behavior. The proposed model attained 74% accuracy and an F1 score of 0.74 on target-sensitive tasks, and 67% accuracy and an F1 score of 0.69 on target-insensitive tasks. For 7 of the 14 individuals, better-than-chance balanced accuracy was obtained (p-value < 0.05), with 3 showing significant trends (p-value < 0.1). These results suggest off-body, privacy-preserving sleep monitoring as a viable method for predicting next-day adverse behavior in ASD individuals, with potential for behavioral intervention and enhanced care in social and learning settings.
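The abstract does not name the ensemble's constituent models, so the following is only a minimal sketch of the per-individual evaluation it reports (cross-validated balanced accuracy with a permutation-test p-value), assuming scikit-learn, a generic soft-voting ensemble, and hypothetical placeholder arrays for per-night sleep features and next-day behavior labels:

    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier, VotingClassifier
    from sklearn.model_selection import permutation_test_score

    # Hypothetical inputs: one row of sleep-structure features per night, and a
    # binary label for whether challenging behavior occurred the next morning.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(150, 12))        # placeholder: ~150 nights x 12 sleep features
    y = rng.integers(0, 2, size=150)      # placeholder behavior labels

    # Generic soft-voting ensemble standing in for the paper's unspecified ensemble.
    ensemble = VotingClassifier(
        estimators=[("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
                    ("gb", GradientBoostingClassifier(random_state=0))],
        voting="soft",
    )

    # Cross-validated balanced accuracy plus a permutation test, mirroring the
    # "better than chance (p-value < 0.05)" criterion quoted in the abstract.
    score, _, p_value = permutation_test_score(
        ensemble, X, y, scoring="balanced_accuracy", cv=5,
        n_permutations=200, random_state=0)
    print(f"balanced accuracy = {score:.2f}, permutation p-value = {p_value:.3f}")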

2.
Anesth Analg; 134(2): 380-388, 2022 Feb 01.
Article in English | MEDLINE | ID: mdl-34673658

ABSTRACT

BACKGROUND: Retrospective analysis of electroencephalogram (EEG) signals acquired from patients under general anesthesia is crucial to understanding the state of the unconscious brain. However, creating such a database is often tedious and cumbersome and involves human labor. Hence, we developed a Raspberry Pi-based system for archiving EEG signals recorded from patients under anesthesia in operating rooms (ORs) with minimal human involvement. METHODS: Using this system, we archived patient EEG signals from over 500 unique surgeries at the Emory University Orthopaedics and Spine Hospital, Atlanta, over approximately 18 months. For this, we developed a software package that runs on a Raspberry Pi and archives patient EEG signals from a SedLine Root EEG Monitor (Masimo) to secure, Health Insurance Portability and Accountability Act (HIPAA)-compliant cloud storage. The OR number corresponding to each surgery was archived along with the EEG signal to facilitate retrospective EEG analysis. We retrospectively processed the archived EEG signals and performed signal quality checks. We also proposed a formula to compute the proportion of true EEG signal and calculated the corresponding statistics. Further, we curated and interleaved patient medical record information with the corresponding EEG signals. RESULTS: We retrospectively processed the EEG signals to demonstrate a statistically significant negative correlation between the relative alpha power (8-12 Hz) of EEG captured under anesthesia and the patient's age. CONCLUSIONS: Our system is a standalone EEG archiver built from low-cost, readily available hardware. We demonstrated that one can create a large-scale EEG database with minimal human involvement. Moreover, we showed that the captured EEG signal is of good quality for retrospective analysis, and we combined the EEG signals with the patients' medical records. The project's software has been released under an open-source license so that others can use it and contribute.
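The relative alpha power measurement behind the reported age correlation can be illustrated with a short sketch; this is an assumption-laden illustration (Welch spectral estimate via SciPy, a made-up sampling rate, single-channel placeholder segments, Pearson correlation), not the authors' pipeline:

    import numpy as np
    from scipy.signal import welch
    from scipy.stats import pearsonr

    def relative_alpha_power(eeg, fs, band=(8.0, 12.0)):
        """Fraction of total spectral power in the alpha band (8-12 Hz)."""
        freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))  # 4-second windows (assumed)
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        return psd[in_band].sum() / psd.sum()

    # Hypothetical cohort: one intraoperative EEG segment and one age per patient.
    fs = 250                                                  # assumed sampling rate in Hz
    rng = np.random.default_rng(1)
    segments = [rng.normal(size=60 * fs) for _ in range(20)]  # placeholder 60-second segments
    ages = rng.uniform(20, 85, size=20)                       # placeholder ages

    alpha = [relative_alpha_power(s, fs) for s in segments]
    r, p = pearsonr(ages, alpha)   # the abstract reports a significant negative correlation
    print(f"Pearson r = {r:.2f}, p = {p:.3f}")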


Subjects
Data Curation/methods; Electroencephalography/instrumentation; Electroencephalography/methods; Monitoring, Intraoperative/instrumentation; Monitoring, Intraoperative/methods; Adult; Aged; Aged, 80 and over; Data Management/instrumentation; Data Management/methods; Female; Humans; Male; Middle Aged; Retrospective Studies; Young Adult
3.
Physiol Meas; 38(8): 1701-1713, 2017 Jul 31.
Article in English | MEDLINE | ID: mdl-28562369

ABSTRACT

OBJECTIVE: This paper builds upon work submitted as part of the 2016 PhysioNet/CinC Challenge, which used sparse coding as a feature extraction tool on phonocardiogram (PCG) audio data for heart sound classification. APPROACH: In sparse coding, preprocessed data is decomposed into a dictionary matrix and a sparse coefficient matrix. The dictionary matrix represents statistically important features of the audio segments. The sparse coefficient matrix is a mapping that represents which features are used by each segment. Working in the sparse domain, we train support vector machines (SVMs) for each audio segment (S1, systole, S2, diastole) and for the full cardiac cycle. We train a sixth SVM to combine the results of the preliminary SVMs into a single binary label for the entire PCG recording. In addition to classifying heart sounds using sparse coding, this paper presents two novel modifications. The first uses a matrix norm in the dictionary update step of sparse coding to encourage the dictionary to learn discriminating features from the abnormal heart recordings. The second combines the sparse coding features with time-domain features in the final SVM stage. MAIN RESULTS: The original algorithm submitted to the challenge achieved a cross-validated mean accuracy (MAcc) score of 0.8652 (Se = 0.8669, Sp = 0.8634). After incorporating the modifications new to this paper, we report an improved cross-validated MAcc of 0.8926 (Se = 0.9007, Sp = 0.8845). SIGNIFICANCE: Our results show that sparse coding is an effective way to define spectral features of the cardiac cycle and its sub-cycles for the purpose of classification. In addition, we demonstrate that sparse coding can be combined with other feature extraction methods to improve classification accuracy.
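As a rough illustration of the first stage only: scikit-learn's DictionaryLearning (an assumed stand-in for the authors' own sparse-coding solver) factors placeholder segment vectors into a dictionary and a sparse coefficient matrix, and MAcc is the mean of sensitivity and specificity, which reproduces the improved figure quoted above:

    import numpy as np
    from sklearn.decomposition import DictionaryLearning

    # Placeholder data: rows are preprocessed spectral vectors of PCG segments.
    rng = np.random.default_rng(2)
    segments = rng.normal(size=(100, 64))

    # Sparse coding: segments ~= codes @ dictionary, with sparse codes as SVM features.
    dl = DictionaryLearning(n_components=20, alpha=1.0,
                            transform_algorithm="lasso_lars", random_state=0)
    codes = dl.fit_transform(segments)      # sparse coefficient matrix
    dictionary = dl.components_             # learned dictionary atoms

    # MAcc, as scored in the 2016 PhysioNet/CinC Challenge, is the mean of Se and Sp.
    se, sp = 0.9007, 0.8845
    print(f"MAcc = {(se + sp) / 2:.4f}")    # 0.8926, matching the reported result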


Subjects
Heart Sounds; Signal Processing, Computer-Assisted; Support Vector Machine; Databases, Factual; Humans; Phonocardiography