Results 1 - 3 of 3
1.
Sensors (Basel); 18(12), 2018 Nov 22.
Article in English | MEDLINE | ID: mdl-30467276

ABSTRACT

Social sensors perceive the real world through social media and online web services, offering lower cost and larger coverage than traditional physical sensors. In intelligent transportation research, sensing and analyzing such social signals provides a new path to monitor, control and optimize transportation systems. However, current research largely focuses on using single-channel online social signals to extract and sense traffic information. Clearly, sensing and exploiting multi-channel social signals could provide a deeper understanding of traffic incidents. In this paper, we utilize cross-platform online data, i.e., Sina Weibo and news, as multi-channel social signals, and propose a word2vec-based event fusion (WBEF) model for sensing, detecting, representing, linking and fusing urban traffic incidents. Thus, each traffic incident can be comprehensively described from multiple aspects, and finally the whole picture of urban traffic events can be obtained and visualized. The proposed WBEF architecture was trained on about 1.15 million multi-channel online data records from Qingdao (a coastal city in China), and the experiments show that our method surpasses the baseline model, achieving an 88.1% F1 score in urban traffic incident detection. The model also demonstrates its effectiveness in the open scenario test.
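As a rough illustration of the event-linking step described above, the sketch below trains a tiny word2vec model, represents each event as the mean of its word vectors, and links events across channels by cosine similarity. The toy data, hyperparameters, and linking rule are illustrative assumptions, not the paper's WBEF pipeline (requires gensim and numpy).

```python
# A minimal sketch of word2vec-based event linking across channels;
# all posts and settings below are toy assumptions for illustration.
from gensim.models import Word2Vec
import numpy as np

# Hypothetical tokenized posts from two channels (Weibo, news).
weibo_posts = [["accident", "on", "highway", "g20", "heavy", "traffic"],
               ["congestion", "near", "airport", "after", "crash"]]
news_posts = [["collision", "reported", "on", "highway", "g20"],
              ["city", "marathon", "closes", "downtown", "roads"]]

# Train word2vec on the combined corpus (tiny settings for illustration).
model = Word2Vec(weibo_posts + news_posts, vector_size=32, window=3,
                 min_count=1, epochs=50, seed=1)

def event_vector(tokens):
    """Represent an event as the mean of its word vectors."""
    return np.mean([model.wv[t] for t in tokens if t in model.wv], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Link a Weibo event to its most similar news event.
wv = event_vector(weibo_posts[0])
scores = [cosine(wv, event_vector(p)) for p in news_posts]
best = int(np.argmax(scores))
print(f"Linked to news event {best} (similarity {scores[best]:.2f})")
# Event pairs above a similarity threshold could then be fused into
# a single multi-aspect incident record.
```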


Subject(s)
Accidents, Traffic; Social Media; Transportation; China; Cities; Humans
2.
Article in English | MEDLINE | ID: mdl-38683706

ABSTRACT

Due to its nonstationary nature, the distribution of a real-world multivariate time series (MTS) changes over time, which is known as distribution drift. Most existing MTS forecasting models suffer greatly from distribution drift and degrade in forecasting performance over time. Existing methods address distribution drift by adapting to the latest arrived data or by self-correcting according to meta-knowledge derived from future data. Despite their great success in MTS forecasting, these methods hardly capture the intrinsic distribution changes, especially from a distributional perspective. Accordingly, we propose a novel framework, the temporal conditional variational autoencoder (TCVAE), to model the dynamic distributional dependencies over time between historical observations and future data in MTSs, and to infer these dependencies as a temporal conditional distribution over latent variables. Specifically, a novel temporal Hawkes attention (THA) mechanism represents temporal factors, which are subsequently fed into feedforward networks to estimate the prior Gaussian distribution of the latent variables. The representation of temporal factors further dynamically adjusts the structures of the Transformer-based encoder and decoder to distribution changes by leveraging a gated attention mechanism (GAM). Moreover, we introduce a conditional continuous normalizing flow (CCNF) to transform the prior Gaussian into a complex, form-free distribution that facilitates flexible inference of the temporal conditional distribution. Extensive experiments conducted on six real-world MTS datasets demonstrate the TCVAE's superior robustness and effectiveness over state-of-the-art MTS forecasting baselines. We further illustrate the TCVAE's applicability through multifaceted case studies and visualization in real-world scenarios.
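The sketch below illustrates the core conditional-VAE idea behind this framework: a posterior network encodes the history window, a separate network estimates a conditioned Gaussian prior, and the decoder forecasts from the latent sample, with a closed-form KL term between the two diagonal Gaussians. The Hawkes attention, gated attention, and normalizing-flow components are omitted, and all names and sizes are assumptions (requires PyTorch).

```python
# A minimal sketch of a conditional VAE forecaster with a learned
# Gaussian prior; not the authors' TCVAE implementation.
import torch
import torch.nn as nn

class CondVAEForecaster(nn.Module):
    def __init__(self, n_vars=4, hist_len=24, horizon=6, latent=8, hidden=64):
        super().__init__()
        d_in = n_vars * hist_len
        self.enc = nn.Sequential(nn.Linear(d_in, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * latent))   # posterior q(z|x)
        self.prior = nn.Sequential(nn.Linear(d_in, hidden), nn.ReLU(),
                                   nn.Linear(hidden, 2 * latent)) # learned prior p(z|c)
        self.dec = nn.Sequential(nn.Linear(d_in + latent, hidden), nn.ReLU(),
                                 nn.Linear(hidden, n_vars * horizon))
        self.n_vars, self.horizon = n_vars, horizon

    def forward(self, hist):
        x = hist.flatten(1)                       # (B, n_vars * hist_len)
        q_mu, q_lv = self.enc(x).chunk(2, dim=-1)
        p_mu, p_lv = self.prior(x).chunk(2, dim=-1)
        z = q_mu + torch.randn_like(q_mu) * (0.5 * q_lv).exp()  # reparameterize
        y = self.dec(torch.cat([x, z], dim=-1))
        # KL( q(z|x) || p(z|c) ) between two diagonal Gaussians.
        kl = 0.5 * ((q_lv - p_lv).exp() + (q_mu - p_mu).pow(2) / p_lv.exp()
                    - 1 + p_lv - q_lv).sum(-1).mean()
        return y.view(-1, self.horizon, self.n_vars), kl

model = CondVAEForecaster()
hist = torch.randn(16, 24, 4)      # batch of history windows
target = torch.randn(16, 6, 4)     # future values to forecast
pred, kl = model(hist)
loss = nn.functional.mse_loss(pred, target) + 0.1 * kl
loss.backward()
print(f"loss={loss.item():.3f}, kl={kl.item():.3f}")
```

Replacing the fixed standard-normal prior of a vanilla VAE with a prior conditioned on the history is what lets the latent distribution itself track drift over time; the flow component in the paper then relaxes the Gaussian form of that prior.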

3.
J Neural Eng; 21(3), 2024 May 15.
Article in English | MEDLINE | ID: mdl-38701773

ABSTRACT

Objective. Electroencephalogram (EEG) analysis has always been an important tool in neural engineering, and the recognition and classification of human emotions is one of its important tasks. EEG data, obtained from electrodes placed on the scalp, are a valuable source of information for brain activity analysis and emotion recognition. Feature extraction methods have shown promising results, but recent trends have shifted toward end-to-end methods based on deep learning. However, these approaches often overlook channel representations, and their complex structures pose certain challenges to model fitting. Approach. To address these challenges, this paper proposes a hybrid approach named FetchEEG that combines feature extraction with temporal-channel joint attention. Leveraging the advantages of both traditional feature extraction and deep learning, FetchEEG adopts a multi-head self-attention mechanism to extract representations across different time points and channels simultaneously. The joint representations are then concatenated and classified using fully connected layers for emotion recognition. The performance of FetchEEG is verified through comparison experiments on a self-developed dataset and two public datasets. Main results. In both subject-dependent and subject-independent experiments, FetchEEG demonstrates better performance and stronger generalization ability than state-of-the-art methods on all datasets. Moreover, the performance of FetchEEG is analyzed for different sliding window sizes and overlap rates in the feature extraction module, and the sensitivity of emotion recognition is investigated for three- and five-frequency-band scenarios. Significance. FetchEEG is a novel hybrid EEG-based method for emotion classification that combines EEG feature extraction with Transformer neural networks. It achieves state-of-the-art performance on both the self-developed dataset and multiple public datasets, with significantly higher training efficiency than end-to-end methods, demonstrating its effectiveness and feasibility.
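To make the temporal-channel joint attention concrete, the sketch below treats each (channel, time-window) feature vector as a token and applies multi-head self-attention over all tokens before a fully connected classifier. This follows the spirit of the description above rather than the authors' implementation; the feature layout and all sizes are assumptions (requires PyTorch).

```python
# A minimal sketch of joint attention over EEG (channel, window) tokens;
# band-power features and dimensions below are illustrative assumptions.
import torch
import torch.nn as nn

class TemporalChannelAttention(nn.Module):
    def __init__(self, n_channels=32, n_windows=10, n_feats=5,
                 d_model=64, n_heads=4, n_classes=3):
        super().__init__()
        # One token per (channel, time-window) pair, each carrying
        # hand-crafted features (e.g., per-band power).
        self.proj = nn.Linear(n_feats, d_model)
        self.pos = nn.Parameter(torch.zeros(n_channels * n_windows, d_model))
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.cls = nn.Sequential(nn.Linear(d_model, 64), nn.ReLU(),
                                 nn.Linear(64, n_classes))

    def forward(self, feats):
        # feats: (B, n_channels, n_windows, n_feats)
        tokens = self.proj(feats.flatten(1, 2)) + self.pos  # (B, C*W, d_model)
        out, _ = self.attn(tokens, tokens, tokens)          # joint attention
        return self.cls(out.mean(dim=1))                    # pooled logits

model = TemporalChannelAttention()
x = torch.randn(8, 32, 10, 5)   # batch of extracted feature tensors
logits = model(x)
print(logits.shape)             # torch.Size([8, 3])
```

Because channels and windows share one token sequence, each attention head can relate any electrode at any moment to any other, which is the "temporal-channel joint" aspect the abstract emphasizes.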


Subject(s)
Electroencephalography; Emotions; Humans; Electroencephalography/methods; Emotions/physiology; Deep Learning; Attention/physiology; Neural Networks, Computer; Male; Female; Adult