Results 1 - 7 of 7
1.
Sensors (Basel) ; 23(9)2023 Apr 28.
Article in English | MEDLINE | ID: mdl-37177574

ABSTRACT

Multimodal emotion recognition has gained much traction in the fields of affective computing, human-computer interaction (HCI), artificial intelligence (AI), and user experience (UX). There is growing demand to automate the analysis of user emotion for HCI, AI, and UX evaluation applications that provide affective services. Emotions are increasingly obtained from videos, audio, text, or physiological signals, which has led to processing emotions from multiple modalities, usually combined through ensemble-based systems with static weights. Because of limitations such as missing modality data, inter-class variations, and intra-class similarities, an effective weighting scheme is required to improve discrimination between modalities. This article takes into account how the modalities differ in importance and assigns dynamic weights to them through a more efficient combination process based on generalized mixture (GM) functions. We therefore present a hybrid multimodal emotion recognition (H-MMER) framework that uses a multi-view learning approach for unimodal emotion recognition and introduces feature-level and decision-level multimodal fusion using GM functions. In an experimental study, we evaluated the ability of the proposed framework to model a set of four emotional states (Happiness, Neutral, Sadness, and Anger) and found that most of them can be modeled with significantly high accuracy using GM functions. The experiments show that the proposed framework models emotional states with an average accuracy of 98.19% and yields a significant performance gain over traditional approaches. The overall evaluation indicates that we can identify emotional states with high accuracy and increase the robustness of the emotion classification required for UX measurement.
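
As an illustrative sketch of the decision-level fusion idea above, the snippet below weights each modality's class posteriors by its own confidence instead of using static weights. It is a minimal stand-in for a generalized mixture function, with hypothetical audio, video, and text posteriors, and is not the authors' exact GM formulation.

import numpy as np

def gm_fuse(prob_vectors):
    # Decision-level fusion sketch: weight each modality dynamically by its
    # own confidence (max posterior) rather than by a fixed, static weight.
    # Illustrative stand-in for a generalized mixture (GM) function.
    probs = np.asarray(prob_vectors)          # shape: (n_modalities, n_classes)
    confidence = probs.max(axis=1)            # per-modality confidence
    weights = confidence / confidence.sum()   # dynamic, input-dependent weights
    fused = weights @ probs                   # weighted combination of posteriors
    return fused / fused.sum()

# Hypothetical posteriors over four emotions (Happiness, Neutral, Sadness, Anger)
audio = [0.70, 0.10, 0.10, 0.10]
video = [0.40, 0.30, 0.20, 0.10]
text  = [0.25, 0.25, 0.25, 0.25]              # uninformative modality receives less weight
print(gm_fuse([audio, video, text]))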


Subject(s)
Algorithms, Artificial Intelligence, Humans, Emotions/physiology, Learning, Recognition (Psychology), Electroencephalography/methods
2.
Int J Med Inform ; 141: 104181, 2020 09.
Article in English | MEDLINE | ID: mdl-32559726

ABSTRACT

OBJECTIVE: Ubiquitous computing has supported personalized health through a wide variety of wellness and healthcare self-quantification applications over the last decade. These applications provide insights into daily life activities but are unable to portray the comprehensive impact of personal habits on human health. To support individuals, we therefore correlate lifestyle habits in appropriate proportions to determine the overall impact of their behavior on well-being. MATERIALS AND METHODS: To study the combined impact of personal behaviors, we propose a methodology for deriving a comprehensive Healthy Behavior Index (HBI) consisting of two major processes: (1) Behaviors' Weightage Identification (BWI) and (2) Healthy Behavior Quantification and Index (HBQI) modeling. The BWI process identifies the highest-ranked contributing behaviors through life-expectancy-based weightage, whereas HBQI derives a mathematical model based on quantifying and indexing behaviors against wellness guidelines. RESULTS: The contributing behaviors were identified through text mining and verified by seven experts with a Kappa agreement level of 0.379. A real-world, user-centric statistical evaluation using the User Experience Questionnaire (UEQ) was applied to assess the impact of the HBI service, which was developed for Mining Minds, a wellness management application. The study involved 103 registered participants (concerned about chronic disease) of a Korean wellness management organization. They used the HBI service over 12 weeks, and the results were evaluated through the UEQ and user feedback. The service achieved a Cronbach's alpha reliability coefficient greater than 0.7, a stimulation coefficient of 0.86 indicating a significant effect, and an overall novelty score of 0.88 showing the potential interest of participants. CONCLUSIONS: The comprehensive HBI demonstrated a positive user experience with respect to stimulating the adoption of healthy behaviors. The HBI service is designed to work as an independent service, so any other wellness-management-enabled platform can consume it to evaluate a person's healthy behavior index for recommendation generation, behavior indication, and behavior adaptation.
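
The sketch below shows, with invented guideline targets and weights, how per-behavior scores could be aggregated into one weighted index in the spirit of the HBI; the behaviors, weights, and targets are hypothetical, not those derived in the paper.

# Hypothetical guideline targets and life-expectancy-based weights.
GUIDELINE = {"steps": 10000, "sleep_hours": 8.0, "sedentary_hours": 8.0}
WEIGHTS   = {"steps": 0.40, "sleep_hours": 0.35, "sedentary_hours": 0.25}

def behavior_score(name, observed):
    # Score a single behavior in [0, 1] relative to its guideline target.
    target = GUIDELINE[name]
    if name == "sedentary_hours":                      # less is better
        return max(0.0, min(1.0, target / max(observed, 1e-9)))
    return max(0.0, min(1.0, observed / target))       # more is better, capped at 1

def healthy_behavior_index(day):
    # Weighted aggregation of per-behavior scores into a single index.
    return sum(WEIGHTS[b] * behavior_score(b, v) for b, v in day.items())

print(healthy_behavior_index({"steps": 7500, "sleep_hours": 6.5, "sedentary_hours": 10}))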


Subject(s)
Health Behavior, Health Promotion, Health Status, Humans, Life Style, Reproducibility of Results
3.
Sensors (Basel) ; 20(10)2020 May 13.
Article in English | MEDLINE | ID: mdl-32414064

ABSTRACT

The recognition of activities of daily living (ADL) in smart environments is a well-known and important research area that captures the real-time state of humans in pervasive computing. Recognizing human activities generally involves deploying a set of obtrusive and unobtrusive sensors, pre-processing the raw data, and building classification models using machine learning (ML) algorithms. Integrating data from multiple sensors is a challenging task due to the dynamic nature of the data sources, and it is further complicated by semantic and syntactic differences among them. These differences become even more problematic when the generated data is imperfect, which directly affects its usefulness for training an accurate classifier. In this study, we propose a semantic imputation framework that improves the quality of sensor data using ontology-based semantic similarity learning. This is achieved by identifying semantic correlations among sensor events through SPARQL queries and by performing time-series longitudinal imputation. Furthermore, we applied a deep learning (DL) based artificial neural network (ANN) to public datasets to demonstrate the applicability and validity of the proposed approach. The results showed higher accuracy with the semantically imputed datasets using the ANN. We also present a detailed comparative analysis against the state of the art from the literature and find that the semantically imputed datasets improve classification accuracy, reaching up to 95.78%, demonstrating the effectiveness and robustness of the learned models.
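
A minimal sketch of the imputation idea, assuming a hypothetical mapping between semantically correlated sensors (which the paper obtains through SPARQL queries over its ontology) and falling back to longitudinal forward fill along the time axis:

import pandas as pd

# Hypothetical semantic correlations between sensors; in the paper these are
# discovered with SPARQL queries over an ontology, not hard-coded.
SEMANTIC_NEIGHBOURS = {"kitchen_motion": "kitchen_door", "bed_pressure": "bedroom_motion"}

def semantic_impute(df):
    # Fill gaps in each sensor column from its semantic neighbour first,
    # then apply longitudinal (time-ordered) forward fill as a fallback.
    df = df.copy()
    for sensor, neighbour in SEMANTIC_NEIGHBOURS.items():
        if sensor in df and neighbour in df:
            df[sensor] = df[sensor].fillna(df[neighbour])
    return df.ffill()

events = pd.DataFrame(
    {"kitchen_motion": [1, None, None, 0], "kitchen_door": [1, 1, 0, 0]},
    index=pd.date_range("2020-05-13", periods=4, freq="min"),
)
print(semantic_impute(events))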


Subject(s)
Activities of Daily Living/classification, Deep Learning, Neural Networks (Computer), Semantics, Algorithms, Humans
4.
Int J Med Inform ; 109: 55-69, 2018 01.
Article in English | MEDLINE | ID: mdl-29195707

ABSTRACT

Medical students should be able to actively apply clinical reasoning skills to further their interpretative, diagnostic, and treatment skills in a non-obtrusive and scalable way. The Case-Based Learning (CBL) approach has been receiving attention in medical education as a student-centered teaching methodology that exposes students to real-world scenarios to be solved using their reasoning skills and existing theoretical knowledge. In this paper, we propose an interactive CBL system, called iCBLS, which supports the development of collaborative clinical reasoning skills for medical students in an online environment. The iCBLS consists of three modules: (i) system administration (SA), (ii) clinical case creation (CCC) with an innovative semi-automatic approach, and (iii) case formulation (CF) through the intervention of medical students' and teachers' knowledge. Two evaluations under the umbrella of the context/input/process/product (CIPP) model were performed using a glycemia study. The first focused on system satisfaction, evaluated by 54 students; the second aimed to evaluate system effectiveness, simulated by 155 students. The results show a high success rate of 70% for students' interaction, 76.4% for group learning, 72.8% for solo learning, and 74.6% for improved clinical skills.


Asunto(s)
Educación Médica/organización & administración , Aprendizaje Basado en Problemas , Entrenamiento Simulado , Estudiantes de Medicina/psicología , Enseñanza/organización & administración , Competencia Clínica , Humanos , Aprendizaje
5.
Sensors (Basel) ; 17(10)2017 Oct 24.
Article in English | MEDLINE | ID: mdl-29064459

ABSTRACT

Emerging research on the automatic identification of users' contexts across cross-domain environments in ubiquitous and pervasive computing systems has proved successful. Monitoring diverse user contexts and behaviors can help control lifestyles associated with chronic diseases using context-aware applications. However, the availability of heterogeneous cross-domain contexts presents a challenging opportunity: fusing them to obtain abstract information for further analysis. This work extends our previous work from a single domain (physical activity) to multiple domains (physical activity, nutrition, and clinical) for context awareness. We propose the multi-level Context-aware Framework (mlCAF), which fuses multi-level cross-domain contexts in order to arbitrate richer behavioral contexts. The work focuses explicitly on the key challenges of multi-level context modeling, reasoning, and fusion based on the open-source mlCAF ontology; more specifically, it addresses the interpretation of contexts from three different domains and their fusion into richer contextual information. The paper contributes in terms of ontology evolution with additional domains, context definitions, rules, and the inclusion of semantic queries. For the framework evaluation, multi-level cross-domain contexts collected from 20 users were used to ascertain abstract contexts, which served as the basis for behavior modeling and lifestyle identification. The experimental results indicate an average context recognition accuracy of around 92.65% for the collected cross-domain contexts.
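
To illustrate how a semantic query can derive a richer context from low-level cross-domain contexts, the sketch below builds a toy RDF graph with rdflib and runs a SPARQL query over it; every class and property name is hypothetical and not part of the published mlCAF vocabulary.

from rdflib import Graph, Namespace, RDF, Literal

# Toy ontology: low-level contexts from the physical-activity and nutrition
# domains are attached to a user, and a SPARQL query derives a high-level one.
EX = Namespace("http://example.org/context#")
g = Graph()
g.add((EX.user1, RDF.type, EX.User))
g.add((EX.user1, EX.hasActivity, Literal("sedentary")))
g.add((EX.user1, EX.hasNutrition, Literal("high_calorie_meal")))

q = """
PREFIX ex: <http://example.org/context#>
SELECT ?user WHERE {
    ?user ex:hasActivity "sedentary" ;
          ex:hasNutrition "high_calorie_meal" .
}
"""
for row in g.query(q):
    # Fuse the two matched low-level contexts into a richer behavioral context.
    g.add((row.user, EX.hasHighLevelContext, Literal("unhealthy_eating_while_inactive")))

print(list(g.triples((None, EX.hasHighLevelContext, None))))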


Subject(s)
Behavior/classification, Physiological Monitoring/methods, Semantics, Computer-Assisted Signal Processing, Awareness, Humans, User-Computer Interface
6.
Sensors (Basel) ; 16(10)2016 Sep 29.
Article in English | MEDLINE | ID: mdl-27690050

ABSTRACT

Recent years have witnessed huge progress in the automatic identification of individual primitives of human behavior, such as activities or locations. However, the complex nature of human behavior demands more abstract contextual information for its analysis. This work presents an ontology-based method that combines low-level primitives of behavior, namely activities, locations, and emotions, a combination unprecedented to date, to intelligently derive more meaningful high-level context information. The paper contributes a new open ontology describing both low-level and high-level context information, as well as their relationships. Furthermore, a framework building on the developed ontology and reasoning models is presented and evaluated. The proposed method proves robust in identifying high-level contexts even in the event of erroneously detected low-level contexts. Although reasonable inference times were obtained for a relevant set of users and instances, additional work is required to scale to long-term scenarios with a large number of users.
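
The sketch below illustrates, with invented context definitions, how a high-level context can still be identified when one low-level primitive is misdetected, by selecting the definition with the best partial match; it is only an analogy to the robustness claim above, not the paper's ontological reasoning.

# Hypothetical high-level context definitions over low-level primitives.
HIGH_LEVEL = {
    "exercising": {"activity": "running", "location": "park", "emotion": "neutral"},
    "relaxing":   {"activity": "lying",   "location": "home", "emotion": "calm"},
}

def infer_high_level(observed):
    # Return the high-level context whose definition overlaps most with the
    # observed low-level contexts, tolerating a single erroneous primitive.
    score = lambda definition: sum(observed.get(k) == v for k, v in definition.items())
    return max(HIGH_LEVEL, key=lambda name: score(HIGH_LEVEL[name]))

# 'emotion' is misdetected as 'angry', yet 'exercising' still matches best.
print(infer_high_level({"activity": "running", "location": "park", "emotion": "angry"}))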

7.
Sensors (Basel) ; 16(8)2016 Aug 10.
Article in English | MEDLINE | ID: mdl-27517928

ABSTRACT

There is ample evidence of the impact that negative lifestyle choices have on people's health and wellness. Changing unhealthy behaviours requires raising people's self-awareness and also providing healthcare experts with a thorough and continuous description of the user's conduct. Several monitoring techniques have been proposed in the past to track users' behaviour; however, these approaches are either subjective and prone to misreporting, such as questionnaires, or focus only on a specific component of context, such as activity counters. This work presents an innovative multimodal context mining framework to inspect and infer human behaviour in a more holistic fashion. The proposed approach extends beyond the state of the art, since it not only explores a single type of context but also combines diverse levels of context in an integral manner. Namely, low-level contexts, including activities, emotions and locations, are identified from heterogeneous sensory data through machine learning techniques. Low-level contexts are then combined using ontological mechanisms to derive a more abstract representation of the user's context, referred to here as high-level context. An initial implementation of the proposed framework supporting real-time context identification is also presented. The developed system is evaluated in various realistic scenarios making use of a novel multimodal open context dataset and data on the go, demonstrating prominent context-aware capabilities at both the low and high levels.
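
As a minimal, self-contained illustration of the low-level stage described above, the sketch below classifies an activity context from simple accelerometer features with a standard scikit-learn classifier; the features, labels, and values are invented for illustration only.

from sklearn.ensemble import RandomForestClassifier

# Features per window: [mean acceleration magnitude, std of magnitude] (hypothetical).
X_train = [[1.0, 0.05], [1.1, 0.08], [2.5, 0.90], [2.8, 1.10]]
y_train = ["sitting", "sitting", "running", "running"]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
low_level_context = clf.predict([[2.6, 1.0]])[0]   # likely "running"

# The high-level stage would then combine this label with location and emotion
# through ontological rules, as sketched for the earlier entries.
print(low_level_context)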


Subject(s)
Choice Behavior/physiology, Data Mining/methods, Life Style, Physiological Monitoring/methods, Algorithms, Awareness/physiology, Humans