TimeTector: A Twin-Branch Approach for Unsupervised Anomaly Detection in Livestock Sensor Noisy Data (TT-TBAD).
Sensors (Basel)
; 24(8). 2024 Apr 11.
Article in English
| MEDLINE
| ID: mdl-38676070
ABSTRACT
Unsupervised anomaly detection in multivariate time series sensor data is a complex task with diverse applications in domains such as livestock farming and agriculture (LF&A), the Internet of Things (IoT), and human activity recognition (HAR). Detecting anomalies in multi-sensor time series data requires advanced machine learning techniques, and the primary focus of this research is to develop state-of-the-art methods for this task. Time series sensors frequently produce data containing anomalies, which makes it difficult to establish standard patterns that capture spatial and temporal correlations. Our approach enables the accurate identification of normal, abnormal, and noisy patterns, thereby minimizing the risk that a model trained on mixed noisy data draws incorrect conclusions. To address these challenges, we propose a novel approach called the "TimeTector-Twin-Branch Shared LSTM Autoencoder", which incorporates several Multi-Head Attention mechanisms. The Twin-Branch design allows the model to perform multiple tasks simultaneously, namely data reconstruction and prediction, enabling efficient multi-task learning. We compare the proposed model to several benchmark anomaly detection models on our dataset; the results show lower reconstruction error (MSE, MAE, and RMSE) and higher accuracy scores (precision, recall, and F1) than the baselines, demonstrating that our approach outperforms these existing models.
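The abstract does not include implementation details; the sketch below is only an illustration of how a twin-branch shared LSTM autoencoder with multi-head attention and a joint reconstruction/prediction objective might be wired up in PyTorch. The class and function names (TwinBranchLSTMAE, twin_branch_loss), layer sizes, prediction horizon, and the loss weighting alpha are assumptions for illustration, not the authors' code.

# Minimal sketch, assuming a shared LSTM encoder with multi-head
# self-attention feeding two branches: window reconstruction and
# next-step prediction. All hyperparameters are illustrative guesses.
import torch
import torch.nn as nn

class TwinBranchLSTMAE(nn.Module):
    def __init__(self, n_features, hidden=64, n_heads=4, horizon=1):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)   # shared encoder
        self.attn = nn.MultiheadAttention(hidden, n_heads, batch_first=True)
        # Branch 1: reconstruct the input window
        self.recon_decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.recon_head = nn.Linear(hidden, n_features)
        # Branch 2: predict the next step(s)
        self.pred_head = nn.Linear(hidden, horizon * n_features)
        self.horizon = horizon
        self.n_features = n_features

    def forward(self, x):                      # x: (batch, window, n_features)
        h, _ = self.encoder(x)                 # shared latent sequence
        h, _ = self.attn(h, h, h)              # multi-head self-attention
        rec, _ = self.recon_decoder(h)
        rec = self.recon_head(rec)             # reconstruction of the window
        pred = self.pred_head(h[:, -1])        # forecast from the last latent state
        pred = pred.view(-1, self.horizon, self.n_features)
        return rec, pred

# Joint multi-task loss: weighted sum of reconstruction and prediction MSE.
def twin_branch_loss(model, x, y_next, alpha=0.5):
    rec, pred = model(x)
    mse = nn.functional.mse_loss
    return alpha * mse(rec, x) + (1 - alpha) * mse(pred, y_next)

# Example usage with dummy data (shapes are assumptions):
model = TwinBranchLSTMAE(n_features=6)
x = torch.randn(32, 50, 6)          # 32 windows of 50 timesteps, 6 sensors
y_next = torch.randn(32, 1, 6)      # next-step targets
loss = twin_branch_loss(model, x, y_next)
loss.backward()

At inference time, a natural scoring rule under these assumptions is to combine the reconstruction and prediction errors of a window into a single anomaly score and flag windows whose score exceeds a threshold estimated on normal training data.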
Collections: 01-internacional
Database: MEDLINE
Main subject: Livestock
Limits: Animals / Humans
Language: English
Publication year: 2024
Document type: Article