ABSTRACT
This paper presents the Sensory Interactive Table (SIT): an instrumented, interactive dining table. Using load cells and LEDs embedded in the table surface, SIT allows us to study: (1) people's eating behaviors in a social setting, (2) the social interactions around those eating behaviors, and (3) the continuous feedback cycle in which LEDs respond to people's eating behavior and people respond to this feedback in real time, with the ultimate aim of creating an effective dietary support system. This paper presents the hardware and software specifications of the system and shows its potential to capture mass-related dimensions in real time with high accuracy and spatial resolution.
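The abstract does not give implementation details for the load-cell sensing. As a minimal sketch only (the cell layout, function name, and table dimensions below are assumptions, not the paper's method), mass and its position on a rectangular surface can in principle be estimated from four corner load cells by weighting each corner's coordinates by that cell's share of the total load:

```python
def locate_mass(readings, width, height):
    """Estimate total load and its (x, y) position on a rectangular
    surface from four corner load cells.

    Hypothetical layout: readings[0..3] correspond to the corners
    (0, 0), (width, 0), (0, height), (width, height).
    """
    total = sum(readings)
    if total == 0:
        return 0.0, None  # nothing on the surface
    corners = [(0.0, 0.0), (width, 0.0), (0.0, height), (width, height)]
    # Centroid: average corner coordinates weighted by each cell's load share.
    x = sum(r * cx for r, (cx, _) in zip(readings, corners)) / total
    y = sum(r * cy for r, (_, cy) in zip(readings, corners)) / total
    return total, (x, y)
```

For example, equal readings on all four cells place the estimated load at the center of the surface; this is only an illustration of the sensing principle, not the system's actual calibration or resolution.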
Subject(s)
Diet, Healthy , Feeding Behavior , Data Collection , Diet , Humans , Interior Design and Furnishings , Social Support
ABSTRACT
Robots are promising tools for promoting engagement of autistic children in interventions and thereby increasing the number of learning opportunities. However, designing deliberate robot behavior aimed at engaging autistic children remains challenging. Our current understanding of which interactions with a robot, or facilitated by a robot, are particularly motivating to autistic children is limited to qualitative reports with small sample sizes. Translating insights from these reports to design is difficult due to the large individual differences among autistic children in their needs, interests, and abilities. To address these issues, we conducted a descriptive study and report on an analysis of how 31 autistic children spontaneously interacted with a humanoid robot and an adult within the context of a robot-assisted intervention, as well as which individual characteristics were associated with the observed interactions. For this analysis, we used video recordings of autistic children engaged in a robot-assisted intervention, recorded as part of the DE-ENIGMA database. The results showed that the autistic children frequently engaged spontaneously in exploratory and functional interactions with the robot, as well as in interactions with the adult that were elicited by the robot. In particular, we observed autistic children frequently initiating interactions aimed at making the robot perform a certain action. Autistic children with stronger language ability, stronger social functioning, and fewer autism spectrum-related symptoms initiated more functional interactions with the robot and more robot-elicited interactions with the adult. We conclude that the children's individual characteristics, in particular language ability, can be indicative of which types of interaction they are more likely to find interesting.
Taking these characteristics into account in the design of deliberate robot behavior, coupled with giving autistic children more autonomy over the robot's behavior, appears promising for promoting engagement and facilitating more learning opportunities.
ABSTRACT
Technologies that measure human nonverbal behavior have existed for some time, and their use in the analysis of social behavior has become more popular following the development of sensor technologies that record full-body movement. However, a standardized methodology to efficiently represent and analyze full-body motion is absent. In this article, we present automated measurement and analysis of body motion (AMAB), a methodology for examining individual and interpersonal nonverbal behavior from the output of full-body motion tracking systems. We address the recording, screening, and normalization of the data, providing methods for standardizing the data across recording conditions and across subject body sizes. We then propose a series of dependent measures to operationalize common research questions in psychological research. We present practical examples from several application areas to demonstrate the efficacy of our proposed method for full-body measurements and comparisons across time, space, body parts, and subjects.
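The abstract mentions normalizing tracking data across subject body sizes but does not specify the procedure. A minimal sketch of one common approach (the joint indices, reference segment, and function name here are assumptions for illustration, not AMAB's actual definitions): translate each pose so a root joint sits at the origin, then divide by the length of a reference body segment so poses from differently sized subjects become comparable.

```python
import numpy as np

def normalize_pose(joints, root=0, ref_a=0, ref_b=1):
    """Normalize a (J, 3) array of 3D joint positions.

    Hypothetical convention: joint `root` is the pelvis, and the
    segment from `ref_a` to `ref_b` (e.g. pelvis to neck) is the
    reference length used to factor out body size.
    """
    joints = np.asarray(joints, dtype=float)
    centered = joints - joints[root]          # root joint at the origin
    scale = np.linalg.norm(joints[ref_a] - joints[ref_b])
    return centered / scale if scale > 0 else centered
```

Applying this per frame before computing dependent measures (distances, velocities, and so on) keeps those measures in body-relative units rather than raw sensor units; again, this is only an illustrative sketch of size normalization in general.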