Results 1 - 20 of 22
1.
Proc Natl Acad Sci U S A ; 117(12): 6370-6375, 2020 03 24.
Article in English | MEDLINE | ID: mdl-32152118

ABSTRACT

Social robots are becoming increasingly influential in shaping the behavior of humans with whom they interact. Here, we examine how the actions of a social robot can influence human-to-human communication, and not just robot-human communication, using groups of three humans and one robot playing 30 rounds of a collaborative game (n = 51 groups). We find that people in groups with a robot making vulnerable statements converse substantially more with each other, distribute their conversation somewhat more equally, and perceive their groups more positively compared to control groups with a robot that either makes neutral statements or no statements at the end of each round. Shifts in robot speech have the power not only to affect how people interact with robots, but also how people interact with each other, offering the prospect for modifying social interactions via the introduction of artificial agents into hybrid systems of humans and machines.


Subject(s)
Communication, Group Processes, Robotics, Social Behavior, Cooperative Behavior, Humans, Interpersonal Relations
2.
Cognition ; 249: 105814, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38763071

ABSTRACT

We expect children to learn new words, skills, and ideas from various technologies. When learning from humans, children prefer people who are reliable and trustworthy, yet children also forgive people's occasional mistakes. Are the dynamics of children learning from technologies, which can also be unreliable, similar to learning from humans? We tackle this question by focusing on early childhood, an age at which children are expected to master foundational academic skills. In this project, 168 4-7-year-old children (Study 1) and 168 adults (Study 2) played a word-guessing game with either a human or robot. The partner first gave a sequence of correct answers, but then followed this with a sequence of wrong answers, with a reaction following each one. Reactions varied by condition, either expressing an accident, an accident marked with an apology, or an unhelpful intention. We found that older children were less trusting than both younger children and adults and were even more skeptical after errors. Trust decreased most rapidly when errors were intentional, but only children (and especially older children) outright rejected help from intentionally unhelpful partners. As an exception to this general trend, older children maintained their trust for longer when a robot (but not a human) apologized for its mistake. Our work suggests that educational technology design cannot be one size fits all but rather must account for developmental changes in children's learning goals.


Subject(s)
Robotics, Trust, Humans, Child, Male, Female, Adult, Child Preschool, Young Adult, Learning/physiology, Child Development/physiology, Age Factors
3.
Annu Rev Biomed Eng ; 14: 275-94, 2012.
Article in English | MEDLINE | ID: mdl-22577778

ABSTRACT

Autism spectrum disorders are a group of lifelong disabilities that affect people's ability to communicate and to understand social cues. Research into applying robots as therapy tools has shown that robots seem to improve engagement and elicit novel social behaviors from people (particularly children and teenagers) with autism. Robot therapy for autism has been explored as one of the first application domains in the field of socially assistive robotics (SAR), which aims to develop robots that assist people with special needs through social interactions. In this review, we discuss the past decade's work in SAR systems designed for autism therapy by analyzing robot design decisions, human-robot interactions, and system evaluations. We conclude by discussing challenges and future trends for this young but rapidly developing research area.


Subject(s)
Autistic Disorder/diagnosis, Autistic Disorder/physiopathology, Child Development Disorders Pervasive/diagnosis, Child Development Disorders Pervasive/physiopathology, Robotics, Adolescent, Biomedical Engineering/methods, Biomedical Research/methods, Child, Communication, Equipment Design, Humans, Social Behavior, User-Computer Interface
4.
Front Robot AI ; 10: 1249241, 2023.
Article in English | MEDLINE | ID: mdl-38469397

ABSTRACT

Creating an accurate model of a user's skills is an essential task for Intelligent Tutoring Systems (ITS) and robotic tutoring systems, as it allows the system to provide personalized help based on the user's knowledge state. Most user skill modeling systems have focused on simpler tasks, such as arithmetic or multiple-choice questions, where the user's model is only updated upon task completion. These tasks have a single correct answer and generate an unambiguous observation of the user's answer. This is not the case for more complex tasks such as programming or engineering, where the user completing the task produces a succession of noisy observations as they work on different parts of the task. We create an algorithm called Time-Dependent Bayesian Knowledge Tracing (TD-BKT) that tracks users' skills throughout these more complex tasks. We show in simulation that it maintains a more accurate model of the user's skills and can therefore select better teaching actions than previous algorithms. Lastly, we show that a robot can use TD-BKT to model a user and teach electronic circuit tasks to participants in a user study. Our results show that participants significantly improved their skills when modeled using TD-BKT.
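TD-BKT extends classic Bayesian Knowledge Tracing (BKT). As a rough illustration of the base model it builds on, the sketch below shows a standard BKT posterior update; the parameter values (slip, guess, and learning-transition probabilities) are arbitrary placeholders rather than values from the paper, and the time-dependent extension itself is not shown.

```python
def bkt_update(p_known, correct, p_slip=0.1, p_guess=0.2, p_transit=0.15):
    """One standard Bayesian Knowledge Tracing step.

    p_known: prior probability that the skill is mastered.
    correct: whether the observed answer was correct.
    Returns the posterior mastery estimate, including the chance
    that the user learned the skill during this step.
    """
    if correct:
        num = p_known * (1 - p_slip)               # mastered and didn't slip
        den = num + (1 - p_known) * p_guess        # or guessed without mastery
    else:
        num = p_known * p_slip                     # mastered but slipped
        den = num + (1 - p_known) * (1 - p_guess)  # or simply didn't know
    posterior = num / den
    # Learning transition: the user may acquire the skill on any step.
    return posterior + (1 - posterior) * p_transit

# A run of correct answers drives the mastery estimate upward.
p = 0.3
for obs in [True, True, True]:
    p = bkt_update(p, obs)
```

In full BKT one such estimate is kept per skill; TD-BKT's contribution, per the abstract, is updating these estimates from noisy observations throughout a task rather than only at completion.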

5.
medRxiv ; 2023 Feb 01.
Article in English | MEDLINE | ID: mdl-36778502

ABSTRACT

Atypical eye gaze in joint attention is a clinical characteristic of autism spectrum disorder (ASD). Despite this documented symptom, neural processing of joint attention tasks in real-life social interactions is not understood. To address this knowledge gap, functional near-infrared spectroscopy (fNIRS) and eye-tracking data were acquired simultaneously as ASD and typically developed (TD) individuals engaged in a gaze-directed joint attention task with a live human and robot partner. We test the hypothesis that face-processing deficits in ASD are greater for interactive faces than for simulated (robot) faces. Consistent with prior findings, neural responses during human gaze cueing modulated by face visual dwell time resulted in increased activity of ventral frontal regions in ASD and dorsal parietal systems in TD participants. Hypoactivity of the right dorsal parietal area during live human gaze cueing was correlated with autism spectrum symptom severity as measured by Brief Observation of Symptoms of Autism (BOSA) scores (r = −0.86). In contrast, neural activity in response to robot gaze cueing modulated by visual acquisition factors activated dorsal parietal systems in ASD, and this neural activity was not related to autism symptom severity (r = 0.06). These results are consistent with the hypothesis that altered encoding of incoming facial information to the dorsal parietal cortex is specific to live human faces in ASD. These findings open new directions for understanding joint attention difficulties in ASD by providing a connection between superior parietal lobule activity and live interaction with human faces. Lay Summary: Little is known about why it is so difficult for autistic individuals to make eye contact with other people. We find that in a live face-to-face viewing task with a robot, the brains of autistic participants were similar to those of typical participants, but this was not the case when the partner was a live human. Findings suggest that difficulties in real-life social situations for autistic individuals may be specific to live social interaction rather than face gaze in general.

6.
Front Robot AI ; 9: 1009488, 2022.
Article in English | MEDLINE | ID: mdl-36726401

ABSTRACT

Using human tools can significantly benefit robots in many application domains, allowing them to solve problems they could not solve without tools. However, robot tool use is a challenging task; indeed, tool use was long considered the ability that distinguishes human beings from other animals. We identify three skills required for robot tool use: perception, manipulation, and high-level cognition. While both general manipulation tasks and tool-use tasks require the same level of perception accuracy, there are unique manipulation and cognition challenges in robot tool use. In this survey, we first define robot tool use. The definition highlights the required skills, which coincide with an affordance model that defines a three-way relation between actions, objects, and effects. We also compile a taxonomy of robot tool use with insights from the animal tool use literature. Our definition and taxonomy lay a theoretical foundation for future robot tool use studies and also serve as practical guidelines for robot tool use applications. We first categorize tool use based on the context of the task: the contexts are highly similar for the same task (e.g., cutting) in non-causal tool use, while the contexts for causal tool use are diverse. We further categorize causal tool use, based on the task complexity suggested in animal tool use studies, into single-manipulation and multiple-manipulation tool use. Single-manipulation tool use is sub-categorized based on tool features and prior experiences of tool use; these sub-types may be considered the building blocks of causal tool use. Multiple-manipulation tool use combines these building blocks in different ways, and the different combinations define its sub-categories. Moreover, we identify the skills required for each sub-type in the taxonomy. We then review previous studies on robot tool use based on the taxonomy and describe how the relevant relations are learned in these studies. We conclude with a discussion of current applications of robot tool use and open questions for future work.

7.
Front Robot AI ; 8: 726463, 2021.
Article in English | MEDLINE | ID: mdl-34970599

ABSTRACT

Many real-world applications require robots to use tools. However, robots lack the skills necessary to learn and perform many essential tool-use tasks. To this end, we present the TRansferrIng Skilled Tool Use Acquired Rapidly (TRI-STAR) framework for task-general robot tool use. TRI-STAR has three primary components: 1) the ability to learn and apply tool-use skills to a wide variety of tasks from a minimal number of training demonstrations, 2) the ability to generalize learned skills to other tools and manipulated objects, and 3) the ability to transfer learned skills to other robots. These capabilities are enabled by TRI-STAR's task-oriented approach, which identifies and leverages structural task knowledge through the use of our goal-based task taxonomy. We demonstrate this framework with seven tasks that impose distinct requirements on the usages of the tools, six of which were each performed on three physical robots with varying kinematic configurations. Our results demonstrate that TRI-STAR can learn effective tool-use skills from only 20 training demonstrations. In addition, our framework generalizes tool-use skills to morphologically distinct objects and transfers them to new platforms, with minor performance degradation.

8.
Front Robot AI ; 8: 725780, 2021.
Article in English | MEDLINE | ID: mdl-35237667

ABSTRACT

The field of Human-Robot Collaboration (HRC) has seen considerable progress in recent years. Thanks in part to advances in control and perception algorithms, robots have started to work in increasingly unstructured environments, where they operate side by side with humans to achieve shared tasks. However, little progress has been made toward systems that are truly effective in supporting the human, proactive in their collaboration, and able to autonomously take care of part of the task. In this work, we present a collaborative system capable of assisting a human worker despite its limited manipulation capabilities, an incomplete model of the task, and partial observability of the environment. Our framework leverages information from a high-level, hierarchical task model that is shared between the human and robot and that enables transparent synchronization between the peers and mutual understanding of each other's plans. More precisely, we first derive a partially observable Markov model from the high-level task representation; we then use an online Monte-Carlo solver to compute a short-horizon, robot-executable plan. The resulting policy is capable of interactive replanning on the fly, dynamic error recovery, and identification of hidden user preferences. We demonstrate that the system robustly supports the human in a realistic furniture construction task.

9.
Front Robot AI ; 8: 772141, 2021.
Article in English | MEDLINE | ID: mdl-35155588

ABSTRACT

The field of human-robot interaction (HRI) research is multidisciplinary and requires researchers to understand diverse fields, including computer science, engineering, informatics, philosophy, psychology, and more. However, it is hard to be an expert in everything. To help HRI researchers develop methodological skills, especially in areas that are relatively new to them, we conducted a virtual workshop, Workshop Your Study Design (WYSD), at the 2021 International Conference on HRI. In this workshop, we grouped participants with mentors, who are experts in areas like real-world studies, empirical lab studies, questionnaire design, interviews, participatory design, and statistics. During and after the workshop, participants discussed their proposed study methods, obtained feedback, and improved their work accordingly. In this paper, we present 1) workshop attendees' feedback about the workshop and 2) lessons that the participants learned during their discussions with mentors. Participants' responses about the workshop were positive, and future scholars who wish to run such a workshop can consider implementing their suggestions. The main contribution of this paper is the lessons learned section, which the workshop participants helped form based on what they discovered during the workshop. We organize the lessons into themes of 1) improving study design for HRI, 2) how to work with participants (especially children), 3) making the most of the study and the robot's limitations, and 4) how to collaborate well across fields, as these were the areas of the papers submitted to the workshop. These themes include practical tips and guidelines to help researchers learn about fields of HRI research with which they have limited experience. We include specific examples, and researchers can adapt the tips and guidelines to their own areas to avoid common mistakes and pitfalls in their research.

10.
Sci Robot ; 5(44)2020 Jul 15.
Article in English | MEDLINE | ID: mdl-33022606

ABSTRACT

Robots have a role in addressing the secondary impacts of infectious disease outbreaks by helping us sustain social distancing, monitoring and improving mental health, supporting education, and aiding in economic recovery.


Subject(s)
COVID-19/psychology, Disease Outbreaks, Pandemics, Robotics/instrumentation, Adult, COVID-19/economics, COVID-19/epidemiology, Child, Economic Recession, Education Distance/methods, Humans, Mental Health, Mental Health Services, Physical Distancing, Robotics/methods, SARS-CoV-2, Social Adjustment, Social Isolation/psychology, Vocational Education/methods
11.
Front Psychol ; 11: 590181, 2020.
Article in English | MEDLINE | ID: mdl-33424708

ABSTRACT

As teams of people increasingly incorporate robot members, it is essential to consider how a robot's actions may influence the team's social dynamics and interactions. In this work, we investigated the effects of verbal support from a robot (e.g., "good idea Salim," "yeah") on human team members' interactions related to psychological safety and inclusion. We conducted a between-subjects experiment (N = 39 groups, 117 participants) where the robot team member either (A) gave verbal support or (B) did not give verbal support to the human team members of a human-robot team comprised of 2 human ingroup members, 1 human outgroup member, and 1 robot. We found that targeted support from the robot (e.g., "good idea George") had a positive effect on outgroup members, who increased their verbal participation after receiving targeted support from the robot. When comparing groups that did and did not have verbal support from the robot, we found that outgroup members received fewer verbal backchannels from ingroup members if their group had robot verbal support. These results suggest that verbal support from a robot may have some direct benefits to outgroup members but may also reduce the obligation ingroup members feel to support the verbal contributions of outgroup members.

12.
Front Robot AI ; 7: 599581, 2020.
Article in English | MEDLINE | ID: mdl-33585574

ABSTRACT

Robot design to simulate interpersonal social interaction is an active area of research with applications in therapy and companionship. Neural responses to eye-to-eye contact in humans have recently been employed to determine the neural systems that are active during social interactions. Whether eye contact with a social robot engages the same neural system remains to be seen. Here, we employ a similar approach to compare human-human and human-robot social interactions. We assume that if human-human and human-robot eye contact elicit similar neural activity in the human, then the perceptual and cognitive processing is also the same for human and robot; that is, the robot is processed similarly to the human. However, if the neural effects are different, then perceptual and cognitive processing is assumed to be different. In this study, neural activity was compared for human-to-human and human-to-robot conditions using near-infrared spectroscopy for neural imaging and a robot (Maki) with eyes that blink and move right and left. Eye contact was confirmed by eye tracking for both conditions. Increased neural activity was observed in human social systems, including the right temporoparietal junction and the dorsolateral prefrontal cortex, during human-human eye contact but not human-robot eye contact. This suggests that the type of human-robot eye contact used here is not sufficient to engage the right temporoparietal junction in the human. This study establishes a foundation for future research into human-robot eye contact to determine how elements of robot design and behavior impact human social processing within this type of interaction, and it may offer a method for capturing difficult-to-quantify components of human-robot interaction, such as social engagement.

13.
Sci Robot ; 3(21)2018 08 15.
Article in English | MEDLINE | ID: mdl-33141719

ABSTRACT

Social robots can be used in education as tutors or peer learners. They have been shown to be effective at increasing cognitive and affective outcomes and have achieved outcomes similar to those of human tutoring on restricted tasks. This is largely because of their physical presence, which traditional learning technologies lack. We review the potential of social robots in education, discuss the technical challenges, and consider how the robot's appearance and behavior affect learning outcomes.

14.
Sci Robot ; 3(21)2018 08 22.
Article in English | MEDLINE | ID: mdl-33141724

ABSTRACT

Social robots can offer tremendous possibilities for autism spectrum disorder (ASD) interventions. To date, most studies with this population have used short, isolated encounters in controlled laboratory settings. Our study focused on a 1-month, home-based intervention for increasing social communication skills of 12 children with ASD between 6 and 12 years old using an autonomous social robot. The children engaged in a triadic interaction with a caregiver and the robot for 30 min every day to complete activities on emotional storytelling, perspective-taking, and sequencing. The robot encouraged engagement, adapted the difficulty of the activities to the child's past performance, and modeled positive social skills. The system maintained engagement over the 1-month deployment, and children showed improvement on joint attention skills with adults when not in the presence of the robot. These results were also consistent with caregiver questionnaires. Caregivers reported less prompting over time and overall increased communication.

15.
Sci Robot ; 3(14)2018 01 31.
Article in English | MEDLINE | ID: mdl-33141701

ABSTRACT

One of the ambitions of Science Robotics is to deeply root robotics research in science while developing novel robotic platforms that will enable new scientific discoveries. Of our 10 grand challenges, the first 7 represent underpinning technologies that have a wider impact on all application areas of robotics. For the next two challenges, we have included social robotics and medical robotics as application-specific areas of development to highlight the substantial societal and health impacts that they will bring. Finally, the last challenge is related to responsible innovation and how ethics and security should be carefully considered as we develop the technology further.

16.
Clin Psychol Rev ; 35: 35-46, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25462112

ABSTRACT

As a field, mental healthcare is faced with major challenges as it attempts to close the huge gap between those who need services and those who receive services. In recent decades, technological advances have provided exciting new resources in this battle. Socially assistive robotics (SAR) is a particularly promising area that has expanded into several exciting mental healthcare applications. Indeed, a growing literature highlights the variety of clinically relevant functions that these robots can serve, from companion to therapeutic play partner. This paper reviews the ways that SAR have already been used in mental health service and research and discusses ways that these applications can be expanded. We also outline the challenges and limitations associated with further integrating SAR into mental healthcare. SAR is not proposed as a replacement for specially trained and knowledgeable professionals nor is it seen as a panacea for all mental healthcare needs. Instead, robots can serve as clinical tools and assistants in a wide range of settings. Given the dramatic growth in this area, now is a critical moment for individuals in the mental healthcare community to become engaged in this research and steer it toward our field's most pressing clinical needs.


Subject(s)
Mental Disorders/therapy, Mental Health Services, Robotics/methods, Friends/psychology, Humans, Mental Disorders/psychology, Play and Playthings/psychology
17.
Proc Eye Track Res Appl Symp ; 2014: 67-74, 2014 Mar.
Article in English | MEDLINE | ID: mdl-26504903

ABSTRACT

Fixation identification algorithms facilitate data comprehension and provide analytical convenience in eye-tracking analysis. However, current fixation algorithms are heavily dependent on parameter choices, leading to instabilities in results and incompleteness in reporting. This work examines the nature of human scanning patterns during complex scene viewing. We show that standard implementations of the commonly used distance-dispersion algorithm for fixation identification are functionally equivalent to greedy spatiotemporal tiling, and that modeling the number of fixations as a function of tiling size leads to a measure of fractal dimensionality through box counting. We apply this technique to examine scale-free gaze behaviors in toddlers and adults looking at images of faces and blocks, as well as in a large number of adults looking at movies or static images. The distributional aspects of the number of fixations may suggest a fractal structure to gaze patterns in free scanning and imply that the incompleteness of standard algorithms may be due to the scale-free behaviors of the underlying scanning distributions. We discuss the nature of this hypothesis, its limitations, and offer directions for future work.
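A dispersion-threshold fixation identifier of the kind this abstract analyzes can be sketched as follows. This is a generic I-DT-style implementation, not the paper's code; the dispersion metric (bounding-box width plus height) and both thresholds are illustrative assumptions, and it is exactly this kind of parameter choice that the paper argues destabilizes results.

```python
def idt_fixations(points, max_dispersion=1.0, min_length=3):
    """Dispersion-threshold (I-DT style) fixation identification.

    points: list of (x, y) gaze samples in temporal order.
    A window grows greedily while its bounding-box dispersion
    (width + height) stays under max_dispersion; windows of at
    least min_length samples are reported as fixations.
    """
    def dispersion(win):
        xs = [p[0] for p in win]
        ys = [p[1] for p in win]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations = []
    i = 0
    while i < len(points):
        j = i + min_length
        if j > len(points):
            break
        if dispersion(points[i:j]) > max_dispersion:
            i += 1          # no fixation starts here; slide the window
            continue
        while j < len(points) and dispersion(points[i:j + 1]) <= max_dispersion:
            j += 1          # grow the window while samples stay clustered
        win = points[i:j]
        cx = sum(p[0] for p in win) / len(win)
        cy = sum(p[1] for p in win) / len(win)
        fixations.append((cx, cy, len(win)))  # centroid and sample count
        i = j
    return fixations

# Two tight clusters separated by a saccade-like jump.
gaze = [(0.0, 0.0), (0.1, 0.1), (0.05, 0.0), (0.1, 0.0),
        (5.0, 5.0), (5.1, 5.0), (5.0, 5.1), (5.05, 5.05)]
fixes = idt_fixations(gaze)
```

The greedy window growth is what makes this equivalent to spatiotemporal tiling: counting the resulting fixations while sweeping `max_dispersion` yields the box-counting curve the paper uses to estimate fractal dimensionality.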

18.
J Autism Dev Disord ; 43(5): 1038-49, 2013 May.
Article in English | MEDLINE | ID: mdl-23111617

ABSTRACT

In this study, we examined the social behaviors of 4- to 12-year-old children with autism spectrum disorders (ASD; N = 24) during three triadic interactions with an adult confederate and an interaction partner, where the interaction partner varied randomly among (1) another adult human, (2) a touchscreen computer game, and (3) a social dinosaur robot. Children spoke more in general, and directed more speech to the adult confederate, when the interaction partner was a robot than when it was a human or a computer game. Children spoke as much to the robot as to the adult interaction partner. This study provides the largest demonstration of social human-robot interaction in children with autism to date. Our findings suggest that social robots may be developed into useful tools for social skills and communication therapies, specifically by embedding social interaction into intrinsic reinforcers and motivators.


Subject(s)
Child Behavior/psychology, Child Development Disorders Pervasive/psychology, Communication, Interpersonal Relations, Robotics, Social Behavior, Child, Child Preschool, Cross-Over Studies, Female, Humans, Male
19.
Brain Res ; 1380: 246-54, 2011 Mar 22.
Article in English | MEDLINE | ID: mdl-21129365

ABSTRACT

This study used eye tracking to examine how 20-month-old toddlers with autism spectrum disorder (ASD) (n = 28), typical development (TD) (n = 34), and non-autistic developmental delays (DD) (n = 16) monitored the activities occurring in the context of an adult-child play interaction. Toddlers with ASD, in comparison to the control groups, showed less attention to the activities of others and focused more on background objects (e.g., toys). In addition, while all groups spent the same overall time looking at people, toddlers with ASD looked less at people's heads and more at their bodies. In ASD, these patterns were associated with cognitive deficits and greater autism severity. These results suggest that the monitoring of the social activities of others is disrupted early in the developmental progression of autism, limiting future avenues for observational learning.


Subject(s)
Attention/physiology, Child Development Disorders Pervasive/physiopathology, Child Development Disorders Pervasive/psychology, Social Behavior Disorders/physiopathology, Social Behavior Disorders/psychology, Social Behavior, Child Development Disorders Pervasive/diagnosis, Female, Humans, Infant, Male, Social Behavior Disorders/diagnosis
20.
Top Cogn Sci ; 2(1): 114-26, 2010 Jan.
Article in English | MEDLINE | ID: mdl-25163625

ABSTRACT

We present a novel, sophisticated intention-based control system for a mobile robot built from an extremely inexpensive webcam and radio-controlled toy vehicle. The system visually observes humans participating in various playground games and infers their goals and intentions through analyzing their spatiotemporal activity in relation to itself and each other, and then builds a coherent narrative out of the succession of these intentional states. Starting from zero information about the room, the rules of the games, or even which vehicle it controls, it learns rich relationships between players, their goals and intentions, probing uncertain situations with its own behavior. The robot is able to watch people playing various playground games, learn the roles and rules that apply to specific games, and participate in the play. The narratives it constructs capture essential information about the observed social roles and types of activity. After watching play for a short while, the system is able to participate appropriately in the games. We demonstrate how the system acts appropriately in scenarios such as chasing, follow-the-leader, and variants of tag.


Subject(s)
Intention, Learning, Robotics/methods, Social Perception, Games Experimental, Humans, Motion, Robotics/instrumentation