ABSTRACT
Path planning computes the shortest path from a source to a destination based on sensory information obtained from the environment. Within path planning, obstacle avoidance is a crucial task in robotics, as autonomously operating robots must reach their destinations without collisions. Obstacle avoidance algorithms therefore play a key role in robotics and autonomous vehicles: they enable robots to navigate their environment efficiently while minimizing the risk of collisions. This article provides an overview of key obstacle avoidance algorithms, from classic techniques such as the Bug algorithm and Dijkstra's algorithm to newer developments such as genetic algorithms and neural-network-based approaches, including predictive methods and deep learning strategies. It analyzes in detail the advantages, limitations, and application areas of these algorithms, and highlights current research directions in obstacle avoidance for robotics. The article aims to provide comprehensive insight into the current state and prospects of obstacle avoidance algorithms in robotics applications.
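Among the classic techniques surveyed, Dijkstra's algorithm is compact enough to sketch directly. The following is a minimal illustrative implementation; the graph, node names, and edge weights are invented for the example and are not taken from the reviewed article.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source in a weighted graph.

    graph: dict mapping node -> list of (neighbour, edge_weight) pairs.
    Returns a dict of minimal distances; unreachable nodes are absent.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry, already improved
        for nbr, w in graph[node]:
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

# Toy waypoint map: edge weights stand in for traversal costs
# around obstacles (hypothetical values).
graph = {
    "start": [("a", 1), ("b", 4)],
    "a": [("b", 2), ("goal", 6)],
    "b": [("goal", 1)],
    "goal": [],
}
print(dijkstra(graph, "start"))  # {'start': 0, 'a': 1, 'b': 3, 'goal': 4}
```

In a path-planning context the shortest-distance map would then be traced back from the goal to recover the collision-free route itself.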
ABSTRACT
This paper deals with sensor fusion of magnetic, angular rate and gravity (MARG) sensors. Its main contribution is sensor fusion performed by supervised learning, i.e., parallel processing of the different kinds of measured data and estimation of position in both periodic and non-periodic cases. During the learning phase, the position estimated by sensor fusion is compared with position data from a motion capture system. The main challenge is avoiding the error caused by the implicit integral calculation of MARG. Several filter-based signal processing methods exist for disturbance and noise estimation, calculated for each sensor separately; these classical methods can be used both for disturbance and noise reduction and for extracting hidden information from the signals. This paper examines the different types of noise and proposes a machine learning-based method that computes position and orientation directly from the nine separate sensor channels, incorporating disturbance and noise reduction in addition to sensor fusion. The proposed method was validated by experiments, which provided promising results for both periodic and translational motion.
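The core idea of fitting one supervised model jointly on all nine MARG channels, rather than filtering each sensor separately, can be sketched with a toy example. The linear least-squares model and the synthetic data below are illustrative assumptions standing in for the paper's learned model and motion-capture reference; none of it reproduces the paper's actual method or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for MARG data: 9 channels (3-axis magnetometer,
# gyroscope, accelerometer) over time. The "position" target plays the
# role of the motion-capture reference used in the learning phase.
n_samples = 500
sensors = rng.normal(size=(n_samples, 9))      # raw sensor channels
hidden_map = rng.normal(size=(9, 3))           # unknown sensors->position map
position = sensors @ hidden_map + 0.01 * rng.normal(size=(n_samples, 3))

# Supervised "fusion": fit one model on all nine channels jointly.
weights, *_ = np.linalg.lstsq(sensors, position, rcond=None)
pred = sensors @ weights

rmse = float(np.sqrt(np.mean((pred - position) ** 2)))
print(f"RMSE against reference positions: {rmse:.4f}")
```

A real implementation would replace the linear map with a trained network and handle the integration-drift problem the paper discusses; the sketch only shows the joint-regression structure of the fusion step.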
ABSTRACT
Emotionally expressive vocalizations can elicit approach-avoidance responses in humans and non-human animals. We investigated whether artificially generated sounds have similar effects on humans. We assessed whether subjects' reactions were linked to acoustic properties, and associated valence and intensity. We generated 343 artificial sounds with differing call lengths, fundamental frequencies and added acoustic features across 7 categories and 3 levels of biological complexity. We assessed the hypothetical behavioural response using an online questionnaire with a manikin task, in which 172 participants indicated whether they would approach or withdraw from an object emitting the sound. (1) Quieter sounds elicited approach, while loud sounds were associated with avoidance. (2) The effect of pitch was modulated by category, call length and loudness. (2a) Low-pitched sounds in complex sound categories prompted avoidance, while in other categories they elicited approach. (2b) Higher pitch in loud sounds had a distancing effect, while higher pitch in quieter sounds prompted approach. (2c) Longer sounds promoted avoidance, especially at high frequencies. (3) Sounds with higher intensity and negative valence elicited avoidance. We conclude that biologically based acoustic signals can be used to regulate the distance between social robots and humans, which can provide an advantage in interactive scenarios.
Subject(s)
Acoustic Stimulation, Motivation, Sound, Humans, Male, Female, Adult, Motivation/physiology, Young Adult, Surveys and Questionnaires, Emotions/physiology
ABSTRACT
The detailed description of the behaviour of the interacting parties is becoming more and more important in human-robot interaction (HRI), especially in social robotics (SR). With the rise in the number of publications, there is a substantial need for an objective and comprehensive description of implemented robot behaviours to ensure comparability and reproducibility of the studies. Ethograms and the meticulous analysis of behaviour were introduced long ago in animal behaviour research (cf. ethology). The adoption of this method in SR and HRI can ensure the desired clarity over robot behaviours, while also providing added benefits during robot development, behaviour modelling and the analysis of HRI experiments. We provide an overview of the possible uses and advantages of ethograms in HRI, and propose a general framework for describing behaviour which can be adapted to the requirements of specific studies.
Subject(s)
Robotics, Animals, Humans, Robotics/methods, Ethology, Reproducibility of Results, Animal Behavior
ABSTRACT
Emotionally expressive non-verbal vocalizations can play a major role in human-robot interactions. Humans can assess the intensity and emotional valence of animal vocalizations based on simple acoustic features such as call length and fundamental frequency. These simple encoding rules are suggested to be general across terrestrial vertebrates. To test the degree of this generalizability, our aim was to synthesize a set of artificial sounds by systematically changing the call length and fundamental frequency, and to examine how emotional valence and intensity are attributed to them by humans. Based on sine wave sounds, we generated sound samples in seven categories of increasing complexity by incorporating different characteristics of animal vocalizations. We used an online questionnaire to measure the perceived emotional valence and intensity of the sounds in a two-dimensional model of emotions. The results show that sounds with low fundamental frequency and shorter call lengths were considered to have a more positive valence, and samples with high fundamental frequency were rated as more intense across all categories, regardless of sound complexity. We conclude that applying the basic rules of vocal emotion encoding can be a good starting point for the development of novel non-verbal vocalizations for artificial agents.
Subject(s)
Auditory Perception/physiology, Communication, Emotions/physiology, Sound, Adult, Female, Humans, Male, Middle Aged
ABSTRACT
AIM: To evaluate the effectiveness and user satisfaction with the sit-to-stand (STS) assistance system of a smart walker (SW), and to identify factors associated with them in potential users. METHODS: A total of 33 older adults (29 women, aged ≥65 years) with motor impairments (habitual rollator use) and no severe cognitive impairment (Mini-Mental State Examination ≥17 points) carried out a Five-Chair Stand Test without assistance and five STS transfers with the STS assistance system. Based on the number of successfully completed STS transfers, success rates were calculated for the Five-Chair Stand Test and the SW-assisted STS transfers, and compared using the Wilcoxon signed-rank test. User satisfaction was assessed using the Tele-healthcare Satisfaction Questionnaire-Wearable Technology (0-80 points, higher score = higher satisfaction). Bivariate correlations and multiple linear regression analyses were used to identify participant characteristics associated with the success rate and user satisfaction with the STS assistance system. RESULTS: The success rate for the SW-assisted STS transfers was significantly higher than for the Five-Chair Stand Test (93.3 ± 12.9% vs 54.5 ± 50.6%, P < 0.001). User satisfaction was high (Tele-healthcare Satisfaction Questionnaire-Wearable Technology 62.5 ± 11.2 points). The success rate with the STS assistance system was not significantly associated with any participant characteristics. Higher body mass index was a significant independent predictor of higher user satisfaction. CONCLUSIONS: The SW-integrated STS assistance system can provide effective STS support with high user satisfaction for a wide range of potential users. Our findings suggest the high potential of the STS assistance system for promoting mobility, independence and quality of life for older adults with motor impairments. Geriatr Gerontol Int 2020; 20: 312-316.
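The Wilcoxon signed-rank comparison of paired success rates reported above can be illustrated with a small self-contained sketch. The paired values below are invented for illustration and are not the study's data; the exact-test implementation assumes no zero differences and no tied absolute differences, which holds for this toy input.

```python
from itertools import product

def wilcoxon_signed_rank(x, y):
    """Exact two-sided Wilcoxon signed-rank test for paired samples.

    Assumes no zero differences and no tied absolute differences.
    Returns (W statistic, exact two-sided p-value).
    """
    diffs = [a - b for a, b in zip(x, y)]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0] * len(diffs)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    w = min(sum(r for r, d in zip(ranks, diffs) if d > 0),
            sum(r for r, d in zip(ranks, diffs) if d < 0))
    # Exact null distribution: each of the 2**n sign patterns is equally likely.
    n = len(diffs)
    total = n * (n + 1) // 2
    count = 0
    for signs in product((0, 1), repeat=n):
        w_plus = sum(r for r, s in zip(range(1, n + 1), signs) if s)
        if min(w_plus, total - w_plus) <= w:
            count += 1
    return w, count / 2 ** n

# Hypothetical paired success rates (%): unassisted Five-Chair Stand Test
# vs. SW-assisted STS transfers for ten participants (made-up values).
unassisted = [0, 0, 0, 20, 20, 40, 40, 60, 80, 80]
assisted = [100, 95, 85, 90, 80, 90, 80, 90, 100, 95]

w_stat, p_value = wilcoxon_signed_rank(unassisted, assisted)
print(f"W = {w_stat}, exact two-sided p = {p_value:.4f}")
```

Because every assisted rate exceeds its unassisted pair here, the statistic is 0 and the exact p-value is 2/2^10, mirroring the direction (though not the numbers) of the study's result.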
Subject(s)
Motor Disorders/rehabilitation, Robotics/standards, Walkers/standards, Activities of Daily Living, Aged, Aged, 80 and over, Female, Humans, Male, Patient Satisfaction, Quality of Life, Surveys and Questionnaires
ABSTRACT
The expression of emotions, a special area of human-machine interaction, is gaining importance with the continuous development of artificial agents such as social robots and interactive mobile applications. We developed a prototype version of an abstract emotion visualization agent to express five basic emotions and a neutral state. In contrast to well-known symbolic characters (e.g., smileys), these displays follow general biological and ethological rules. We conducted a multiple-questionnaire study on the assessment of the displays with Hungarian and Japanese subjects. In most cases participants were successful in recognizing the displayed emotions. Fear and sadness were most easily confused with each other, while both the Hungarian and Japanese participants recognized the anger display most accurately. We suggest that the implemented biological approach can be a viable complement to the emotion expressions of some artificial agents, for example mobile devices.
ABSTRACT
Here we aim to lay the theoretical foundations of human-robot relationships, drawing upon insights from disciplines that govern relevant human behaviors: ecology and ethology. We show how the paradox of the so-called "uncanny valley hypothesis" can be resolved by applying the "niche" concept to social robots and relying on the natural behavior of humans. Instead of striving to build human-like social robots, engineers should construct robots that maximize their performance in their niche (being optimal for some specific functions); if such robots are endowed with an appropriate form of social competence, humans will eventually interact with them regardless of their embodiment. This new discipline, which we call ethorobotics, could change social robotics, giving a boost to new technical approaches and applications.
ABSTRACT
In the last few years there has been increasing interest in building companion robots that interact with humans in a socially acceptable way. To interact meaningfully, a robot has to convey intentionality and emotions of some sort in order to increase believability. We suggest that human-robot interaction should be considered a specific form of inter-specific interaction, and that human-animal interaction can provide a useful biological model for designing social robots. Dogs are a promising biological model, since during domestication they adapted to the human environment and learned to participate in complex social interactions. In this observational study we propose to design emotionally expressive robot behaviour using the behaviour of dogs as inspiration, and to test these dog-inspired robots with humans in an inter-specific context. In two experiments (wizard-of-oz scenarios) we examined humans' ability to recognize two basic emotions and one secondary emotion expressed by a robot. In Experiment 1 we provided our companion robot with two kinds of emotional behaviour ("happiness" and "fear") and studied whether people attribute the appropriate emotion to the robot and interact with it accordingly. In Experiment 2 we investigated whether participants attribute guilty behaviour to a robot in a relevant context, examining whether, relying on the robot's greeting behaviour, human participants can detect that the robot has transgressed a predetermined rule. Results of Experiment 1 showed that people readily attribute emotions to a social robot and interact with it in accordance with the expressed emotional behaviour. Results of Experiment 2 showed that people can recognize from its greeting behaviour whether the robot transgressed. In summary, our findings show that dog-inspired behaviour is a suitable medium for making people attribute emotional states to a non-humanoid robot.