1.
Ergonomics: 1-13, 2024 Jul 02.
Article in English | MEDLINE | ID: mdl-38954600

ABSTRACT

This article examines in-car user interface icons from the perspective of structural features, exploring how different icon structural features affect visual search efficiency. We first categorised the icons into four groups by structural feature: individual structure icons (ISI), enclosed structure icons (ESI), horizontal structure icons (HSI) and vertical structure icons (VSI). We then conducted a visual search experiment with structure as the sole variable, recording participants' behavioural and eye-tracking data, and analysed the data using analysis of variance and logistic regression. The results indicate that icon structural features significantly affect visual search efficiency, with clear intergroup differences: HSI exhibit the highest visual search efficiency, while ESI show the lowest; ISI yield shorter response times but the lowest matching accuracy; and VSI outperform only ESI. These findings have practical implications for optimising icon design and improving visual search efficiency.


Visual search efficiency of icons is crucial for human-computer interaction. We investigated how the structural features of icons influence visual search efficiency. Horizontal icons are the most efficient and enclosed icons the least; individual icons are fast but less accurate, and vertical icons outperform only enclosed ones. Structural features should be considered in icon design.
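The analysis the abstract describes (one-way ANOVA on response times across the four structure groups, logistic regression on matching accuracy) is standard enough to sketch. The Python code below uses entirely synthetic data; the column names, group means, and accuracy rates are assumptions for illustration only, not the authors' dataset or pipeline.

import numpy as np
import pandas as pd
from scipy.stats import f_oneway
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
groups = ["ISI", "ESI", "HSI", "VSI"]

# Simulated trials: icon structure, response time (s), and match correctness.
# All group-level parameters below are invented for the sketch.
df = pd.DataFrame({"structure": rng.choice(groups, size=400)})
assumed_rt = {"ISI": 0.9, "ESI": 1.4, "HSI": 0.8, "VSI": 1.2}
assumed_acc = {"ISI": 0.75, "ESI": 0.80, "HSI": 0.95, "VSI": 0.85}
df["rt"] = df["structure"].map(assumed_rt) + rng.normal(0, 0.2, len(df))
df["correct"] = rng.binomial(1, df["structure"].map(assumed_acc).to_numpy())

# One-way ANOVA: does mean response time differ across the four groups?
samples = [df.loc[df.structure == g, "rt"] for g in groups]
f_stat, p_val = f_oneway(*samples)
print(f"ANOVA: F={f_stat:.2f}, p={p_val:.4f}")

# Logistic regression: how does structure shift the odds of a correct match?
model = smf.logit("correct ~ C(structure)", data=df).fit(disp=0)
print(model.summary())

The ANOVA tests for any difference in mean response time across groups; the logistic regression then estimates how each structure changes the odds of a correct match relative to the baseline category.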

2.
PeerJ Comput Sci ; 10: e1977, 2024.
Article in English | MEDLINE | ID: mdl-38660191

ABSTRACT

Emotion recognition is a pivotal research domain in computer and cognitive science. Recent advances have produced a variety of emotion recognition methods that draw on data from diverse sources such as speech, facial expressions, electroencephalogram (EEG), electrocardiogram, and eye tracking (ET). This article introduces a novel emotion recognition framework that targets the analysis of both users' psychological responses and the stimuli that elicit them, on the premise that the stimulus is as informative as the response itself. Our approach therefore combines stimulus data with physical and physiological signals, pioneering a multimodal method for emotional cognition. The proposed framework unites stimulus source data with physiological signals, aiming to improve the accuracy and robustness of emotion recognition through data integration. We conducted an emotional cognition experiment to gather EEG and ET data while recording emotional responses, and on this basis developed the Emotion-Multimodal Fusion Neural Network (E-MFNN), optimized for fusing stimulus and physiological data. Extensive comparisons between our framework's results and those of existing models, along with evaluations of alternative algorithmic choices within the framework, underscore its efficacy in multimodal emotion recognition. The source code is publicly available at https://figshare.com/s/8833d837871c78542b29.
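The abstract describes a fusion design with separate modality streams (stimulus, EEG, ET) combined for emotion classification, but leaves the architecture to the linked source code. The PyTorch sketch below is a minimal illustration of that late-fusion pattern, assuming pre-extracted fixed-length feature vectors per modality and invented layer sizes; it is not the authors' E-MFNN.

import torch
import torch.nn as nn

class FusionEmotionNet(nn.Module):
    """Minimal late-fusion classifier over three modality feature vectors."""

    def __init__(self, stim_dim=128, eeg_dim=64, et_dim=16,
                 hidden=64, n_emotions=4):
        super().__init__()
        # One small encoder per modality (dimensions are assumptions).
        self.stim_enc = nn.Sequential(nn.Linear(stim_dim, hidden), nn.ReLU())
        self.eeg_enc = nn.Sequential(nn.Linear(eeg_dim, hidden), nn.ReLU())
        self.et_enc = nn.Sequential(nn.Linear(et_dim, hidden), nn.ReLU())
        # Late fusion: concatenate modality embeddings, then classify.
        self.classifier = nn.Sequential(
            nn.Linear(3 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_emotions),
        )

    def forward(self, stim, eeg, et):
        z = torch.cat([self.stim_enc(stim),
                       self.eeg_enc(eeg),
                       self.et_enc(et)], dim=-1)
        return self.classifier(z)  # logits over emotion classes

# Example forward pass on a random batch of 8 samples.
net = FusionEmotionNet()
logits = net(torch.randn(8, 128), torch.randn(8, 64), torch.randn(8, 16))
print(logits.shape)  # torch.Size([8, 4])

Concatenating per-modality embeddings before a shared classifier is the simplest fusion choice; attention-based or learned-weight fusion are common alternatives in this literature.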
