Results 1 - 7 of 7
1.
Sensors (Basel) ; 22(4)2022 Feb 11.
Article in English | MEDLINE | ID: mdl-35214275

ABSTRACT

Human activity recognition (HAR) using wearable sensors is an increasingly active research topic in machine learning, aided in part by the ready availability of detailed motion capture data from smartphones, fitness trackers, and smartwatches. The goal of HAR is to use such devices to assist users in their daily lives in application areas such as healthcare, physical therapy, and fitness. One of the main challenges for HAR, particularly when using supervised learning methods, is obtaining balanced data for algorithm optimisation and testing. Because people perform some activities more than others (e.g., walk more than run), HAR datasets are typically imbalanced. The lack of representation of minority classes hinders the ability of HAR classifiers to sufficiently capture new instances of those activities. We introduce three novel hybrid sampling strategies that generate more diverse synthetic samples to overcome the class imbalance problem. The first strategy, which we call the distance-based method (DBM), combines the Synthetic Minority Oversampling Technique (SMOTE) with Random_SMOTE, both of which are built around the k-nearest neighbours (KNN) algorithm. The second, referred to as the noise detection-based method (NDBM), combines SMOTE Tomek links (SMOTE_Tomeklinks) and the modified synthetic minority oversampling technique (MSMOTE). The third, which we call the cluster-based method (CBM), combines Cluster-Based Synthetic Oversampling (CBSO) and the Proximity Weighted Synthetic Oversampling Technique (ProWSyn). We compare the performance of the proposed hybrid methods to that of their individual constituent methods and a baseline, using accelerometer data from three commonly used benchmark datasets. We show that DBM, NDBM, and CBM reduce the impact of class imbalance and improve F1 scores by 9-20 percentage points compared to their constituent sampling methods. CBM performs significantly better than the others under a Friedman test; however, DBM has lower computational requirements.


Subject(s)
Algorithms, Machine Learning, Cluster Analysis, Human Activities, Humans
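
As a rough illustration of the hybrid-sampling idea described in the abstract above (not the authors' implementation), the sketch below pools the synthetic minority samples produced by two off-the-shelf oversamplers from the imbalanced-learn library. SMOTE and BorderlineSMOTE stand in for the paper's constituents, since Random_SMOTE, MSMOTE, CBSO, and ProWSyn are not available there; the function name and defaults are assumptions for the example.

```python
# Minimal sketch of a hybrid oversampling strategy: pool the synthetic
# samples generated by two different oversamplers, in the spirit of the
# DBM/NDBM/CBM combinations above. SMOTE and BorderlineSMOTE stand in for
# the paper's constituent methods.
import numpy as np
from imblearn.over_sampling import SMOTE, BorderlineSMOTE

def hybrid_resample(X, y, random_state=0):
    """Return X, y extended with synthetic samples from both oversamplers."""
    n = len(X)
    parts_X, parts_y = [np.asarray(X)], [np.asarray(y)]
    for sampler in (SMOTE(random_state=random_state),
                    BorderlineSMOTE(random_state=random_state)):
        X_res, y_res = sampler.fit_resample(X, y)
        # imbalanced-learn appends the newly generated rows after the
        # original samples, so rows n: are the synthetic ones.
        parts_X.append(X_res[n:])
        parts_y.append(y_res[n:])
    return np.vstack(parts_X), np.concatenate(parts_y)
```

The resampled data would then be passed to the classifier exactly as the original training set would be.
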
2.
PLoS One ; 16(10): e0258247, 2021.
Article in English | MEDLINE | ID: mdl-34610018

ABSTRACT

When people interact, they fall into synchrony. This synchrony has been demonstrated in a range of contexts, from walking or playing music together to holding a conversation, and has been linked to prosocial outcomes such as the development of rapport and the efficiency of cooperation. While the basis of synchrony remains unclear, several studies have found synchrony to increase when an interaction is made challenging, potentially providing a means of facilitating interaction. Here we focus on head movement during free conversation. As verbal information is obscured when conversing over background noise, we investigate whether synchrony is greater at high versus low noise levels, as well as addressing the effect of background noise complexity. Participants held a series of conversations with unfamiliar interlocutors while seated in a lab, and the background noise level changed every 15-30 s between 54, 60, 66, 72, and 78 dB. We report measures of head movement synchrony recorded via high-resolution motion tracking at the extreme noise levels (i.e., 54 vs. 78 dB) in dyads (n = 15) and triads (n = 11). In both the dyads and the triads, we report increased movement coherence in high compared to low levels of speech-shaped noise. Furthermore, in triads we compare behaviour in speech-shaped noise versus multi-talker babble, and find greater movement coherence in the more complex babble condition. Key synchrony differences fall in the 0.2-0.5 Hz frequency bands and are discussed in terms of their correspondence to talkers' average utterance durations. Additional synchrony differences occur at higher frequencies in the triads only (i.e., >5 Hz), which may relate to synchrony of backchannel cues (as multiple individuals were listening and responding to the same talker). Not only do these studies replicate prior work indicating interlocutors' increased reliance on behavioural synchrony as task difficulty increases, but they demonstrate these effects using multiple difficulty manipulations and across different-sized interaction groups.


Subject(s)
Communication, Movement, Noise, Female, Humans, Male, Middle Aged, Speech Perception/physiology
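
The movement synchrony measure in the abstract above is wavelet coherence computed from motion-tracking data. As a simplified stand-in (an assumption, not the authors' pipeline), the sketch below uses SciPy's Welch-based magnitude-squared coherence and averages it over the 0.2-0.5 Hz band mentioned in the abstract; the sampling rate and window length are illustrative.

```python
# Simplified sketch: band-limited coherence between two head-movement signals.
# The study uses wavelet coherence; Welch-based magnitude-squared coherence is
# used here only to illustrate the idea of band-averaged synchrony.
import numpy as np
from scipy.signal import coherence

def band_coherence(head_a, head_b, fs=100.0, band=(0.2, 0.5)):
    """Mean magnitude-squared coherence of two 1-D signals within a band."""
    # 20 s windows give enough frequency resolution to resolve 0.2 Hz.
    f, cxy = coherence(head_a, head_b, fs=fs, nperseg=int(fs * 20))
    mask = (f >= band[0]) & (f <= band[1])
    return float(cxy[mask].mean())
```
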
3.
Autism ; 25(1): 210-226, 2021 01.
Article in English | MEDLINE | ID: mdl-32854524

ABSTRACT

LAY ABSTRACT: When we communicate with other people, we exchange a variety of social signals through eye gaze and facial expressions. However, coordinated exchanges of these social signals can only happen when the people involved in the interaction are able to see each other. Although previous studies report that autistic individuals have difficulties using eye gaze and facial expressions during social interactions, evidence from tasks that involve real face-to-face conversations is scarce and mixed. Here, we investigate how the eye gaze and facial expressions of typical and high-functioning autistic individuals are modulated by the belief in being seen by another person, and by being in a face-to-face interaction. Participants were recorded with an eye-tracking and video-camera system while they completed a structured Q&A task with a confederate under three social contexts: pre-recorded video (no belief in being seen, no face-to-face interaction), video call (belief in being seen, no face-to-face interaction), and face-to-face (belief in being seen, face-to-face interaction). Typical participants gazed less at the confederate and made more facial expressions when they were being watched and when they were speaking. Contrary to our hypotheses, the eye gaze and facial expression patterns of autistic participants were overall similar to those of the typical group. This suggests that high-functioning autistic participants are able to use eye gaze and facial expressions as social signals. Future studies will need to investigate to what extent this reflects spontaneous behaviour or the use of compensation strategies.


Subject(s)
Autism Spectrum Disorder, Autistic Disorder, Communication, Facial Expression, Ocular Fixation, Humans
4.
J Nonverbal Behav ; 44(1): 63-83, 2020.
Article in English | MEDLINE | ID: mdl-32189820

ABSTRACT

Conversation between two people involves subtle nonverbal coordination in addition to speech. However, the precise parameters and timing of this coordination remain unclear, which limits our ability to theorize about the neural and cognitive mechanisms of social coordination. In particular, it is unclear whether conversation is dominated by synchronization (with no time lag), rapid and reactive mimicry (with lags under 1 s), or traditionally observed mimicry (with lags of several seconds), each of which demands a different neural mechanism. Here we describe data from high-resolution motion capture of the head movements of pairs of participants (n = 31 dyads) engaged in structured conversations. In a pre-registered analysis pathway, we calculated the wavelet coherence of head motion within dyads as a measure of their nonverbal coordination and report two novel results. First, low-frequency coherence (0.2-1.1 Hz) is consistent with traditional observations of mimicry, and modeling shows this behavior is generated by a mechanism with a constant 600 ms lag between leader and follower. This is in line with rapid reactive (rather than predictive or memory-driven) models of mimicry behavior, and could be implemented in mirror neuron systems. Second, we find an unexpected pattern of lower-than-chance coherence between participants, or hypo-coherence, at high frequencies (2.6-6.5 Hz). Exploratory analyses show that this systematic decoupling is driven by fast nodding from the listening member of the dyad, and may be a newly identified social signal. These results provide a step towards the quantification of real-world human behavior at high resolution and provide new insights into the mechanisms of social coordination.
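
The 600 ms leader-follower lag reported above comes from the authors' modelling of wavelet coherence. Purely as an illustration of how such a lag could be estimated, the hypothetical sketch below uses normalised cross-correlation between two head-motion series; the sampling rate and maximum lag are assumptions.

```python
# Illustrative sketch: find the lag (in seconds) at which one head-motion
# signal best correlates with the other. This is not the paper's
# wavelet-coherence model; it only demonstrates lag estimation.
import numpy as np

def estimate_lag(leader, follower, fs=100.0, max_lag_s=2.0):
    leader = (leader - leader.mean()) / leader.std()
    follower = (follower - follower.mean()) / follower.std()
    max_lag = int(max_lag_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    corrs = [
        np.corrcoef(leader[max(0, -k):len(leader) - max(0, k)],
                    follower[max(0, k):len(follower) - max(0, -k)])[0, 1]
        for k in lags
    ]
    # A positive result means the follower trails the leader.
    return lags[int(np.argmax(corrs))] / fs
```
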

5.
IEEE Trans Pattern Anal Mach Intell ; 33(4): 741-53, 2011 Apr.
Article in English | MEDLINE | ID: mdl-20421675

ABSTRACT

In this work, we investigate eye movement analysis as a new sensing modality for activity recognition. Eye movement data were recorded using an electrooculography (EOG) system. We first describe and evaluate algorithms for detecting three eye movement characteristics from EOG signals (saccades, fixations, and blinks) and propose a method for assessing repetitive patterns of eye movements. We then devise 90 different features based on these characteristics and select a subset of them using minimum redundancy maximum relevance (mRMR) feature selection. We validate the method in an eight-participant study in an office environment using an example set of five activity classes: copying a text, reading a printed paper, taking handwritten notes, watching a video, and browsing the Web. We also include periods with no specific activity (the NULL class). Using a support vector machine (SVM) classifier and person-independent (leave-one-person-out) training, we obtain an average precision of 76.1 percent and recall of 70.5 percent over all classes and participants. The work demonstrates the promise of eye-based activity recognition (EAR) and opens up discussion on the wider applicability of EAR to other activities that are difficult, or even impossible, to detect using common sensing modalities.


Subject(s)
Algorithms, Eye Movements/physiology, Adult, Electrooculography/methods, Female, Humans, Male, Middle Aged, Visual Perception/physiology
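
A minimal sketch of the person-independent (leave-one-person-out) evaluation scheme described above, assuming the EOG features have already been extracted into a feature matrix with one participant ID per row; mRMR selection and the paper's exact SVM settings are omitted, and the macro-averaged F1 scoring is an assumption for the example.

```python
# Sketch of person-independent (leave-one-person-out) evaluation of an SVM,
# given a feature matrix X, activity labels y, and a participant id per row.
# Feature extraction and mRMR selection from the paper are omitted.
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def leave_one_person_out_scores(X, y, participant_ids):
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    # Each fold holds out every sample from one participant for testing.
    return cross_val_score(clf, X, y, groups=participant_ids,
                           cv=LeaveOneGroupOut(), scoring="f1_macro")
```
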
6.
IEEE Trans Pattern Anal Mach Intell ; 28(10): 1553-67, 2006 Oct.
Article in English | MEDLINE | ID: mdl-16986539

ABSTRACT

In order to provide relevant information to mobile users, such as workers engaging in the manual tasks of maintenance and assembly, a wearable computer requires information about the user's specific activities. This work focuses on the recognition of activities that are characterized by a hand motion and an accompanying sound; suitable activities can be found in assembly and maintenance work. Here, we provide an initial exploration of the problem domain of continuous activity recognition using on-body sensing, grounded in a mock "wood workshop" assembly task. We describe a method for the continuous recognition of activities (sawing, hammering, filing, drilling, grinding, sanding, opening a drawer, tightening a vise, and turning a screwdriver) using microphones and three-axis accelerometers mounted at two positions on the user's arms. Potentially "interesting" activities are segmented from continuous streams of data using an analysis of the sound intensity detected at the two different locations. Activity classification is then performed on these detected segments using linear discriminant analysis (LDA) on the sound channel and hidden Markov models (HMMs) on the acceleration data. Four different methods of classifier fusion are compared for improving these classifications. Using user-dependent training, we obtain continuous average recall and precision rates (for positive activities) of 78 percent and 74 percent, respectively. Using user-independent training (leave-one-out across five users), we obtain recall rates of 66 percent and precision rates of 63 percent. In isolation, these activities were recognized with accuracies of 98 percent, 87 percent, and 95 percent for the user-dependent, user-independent, and user-adapted cases, respectively.


Subject(s)
Ergonomics/methods, Industry/methods, Ambulatory Monitoring/instrumentation, Ambulatory Monitoring/methods, Motor Activity/physiology, Pattern Recognition, Automated/methods, Task Performance and Analysis, Acceleration, Activities of Daily Living, Artificial Intelligence, Clothing, Humans, Industry/instrumentation, Occupations, Sound Spectrography/methods, Transducers, Workplace
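
The abstract above combines an LDA sound classifier with HMMs on acceleration and compares several fusion methods. As a hypothetical illustration of one simple score-level fusion scheme (not necessarily one of the four compared in the paper), the sketch below averages per-class probabilities from two classifiers; scikit-learn's LDA stands in for both channels, since an HMM classifier would require a separate library.

```python
# Hypothetical score-level fusion: average per-class probabilities from a
# sound-channel classifier and an acceleration-channel classifier, then take
# the highest-scoring class. LDA stands in for both channels here; the paper
# uses LDA on sound and HMMs on acceleration.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def fused_predictions(X_sound_tr, X_acc_tr, y_tr, X_sound_te, X_acc_te):
    sound_clf = LinearDiscriminantAnalysis().fit(X_sound_tr, y_tr)
    acc_clf = LinearDiscriminantAnalysis().fit(X_acc_tr, y_tr)
    proba = 0.5 * sound_clf.predict_proba(X_sound_te) \
          + 0.5 * acc_clf.predict_proba(X_acc_te)
    return sound_clf.classes_[np.argmax(proba, axis=1)]
```
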
7.
IEEE Trans Inf Technol Biomed ; 8(4): 415-27, 2004 Dec.
Article in English | MEDLINE | ID: mdl-15615032

ABSTRACT

This paper describes an advanced care and alert portable telemedical monitor (AMON), a wearable medical monitoring and alert system targeting high-risk cardiac/respiratory patients. The system includes continuous collection and evaluation of multiple vital signs, intelligent multiparameter medical emergency detection, and a cellular connection to a medical center. By integrating the whole system in an unobtrusive, wrist-worn enclosure and applying aggressive low-power design techniques, continuous long-term monitoring can be performed without interfering with patients' everyday activities and without restricting their mobility. In the first two and a half years of this EU IST-sponsored project, the AMON consortium has designed, implemented, and tested the described wrist-worn device, a communication link, and a comprehensive medical center software package. The performance of the system has been validated in a medical study with 33 subjects. The paper describes the main concepts behind the AMON system and presents details of the individual subsystems and solutions, as well as the results of the medical validation.


Subject(s)
Computer-Assisted Diagnosis/instrumentation, Information Storage and Retrieval/methods, Internet, Ambulatory Monitoring/instrumentation, Computer-Assisted Signal Processing/instrumentation, Telemedicine/instrumentation, Telemetry/instrumentation, Activities of Daily Living, Adolescent, Adult, Aged, Algorithms, Blood Pressure, Computer-Assisted Diagnosis/methods, Equipment Design, Equipment Failure, Equipment Failure Analysis/methods, Feasibility Studies, Female, Heart Rate/physiology, Humans, Male, Computerized Medical Records Systems, Middle Aged, Miniaturization/methods, Ambulatory Monitoring/methods, Reproducibility of Results, Sensitivity and Specificity, Skin Temperature, Systems Integration, Telemedicine/methods, Telemetry/methods, Transducers
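
The abstract does not describe AMON's emergency-detection logic. Purely as a hypothetical illustration of multiparameter rule-based alerting on wrist-worn vital signs, the sketch below flags abnormal readings; every threshold and field name is invented for the example.

```python
# Hypothetical multiparameter alert rule for a wearable vital-sign monitor.
# All thresholds and field names are invented for illustration; they do not
# reflect the AMON system's actual detection logic.
from dataclasses import dataclass
from typing import List

@dataclass
class VitalSample:
    heart_rate_bpm: float
    spo2_percent: float
    systolic_mmhg: float
    skin_temp_c: float

def emergency_flags(sample: VitalSample) -> List[str]:
    flags = []
    if sample.heart_rate_bpm < 40 or sample.heart_rate_bpm > 150:
        flags.append("heart rate out of range")
    if sample.spo2_percent < 90:
        flags.append("low oxygen saturation")
    if sample.systolic_mmhg < 90 or sample.systolic_mmhg > 180:
        flags.append("systolic blood pressure out of range")
    if sample.skin_temp_c > 39.0:
        flags.append("elevated skin temperature")
    # The caller could escalate to the medical center when, say, two or more
    # vitals are flagged at once.
    return flags
```
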