Results 1 - 11 of 11
1.
Chaos ; 31(8): 083131, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34470232

ABSTRACT

When nonlinear measures are estimated from finite-length sampled temporal signals, a radius parameter must be carefully selected to avoid poor estimation. These measures are generally derived from the correlation integral, which quantifies the probability of finding neighbors, i.e., pairs of points spaced by less than the radius parameter. While each nonlinear measure comes with several specific empirical rules for selecting a radius value, we provide a systematic selection method. We show that the optimal radius for nonlinear measures can be approximated by the optimal bandwidth of a kernel density estimator (KDE) related to the correlation sum. The KDE framework provides non-parametric tools to approximate a density function from finite samples (e.g., histograms) and optimal methods to select a smoothing parameter, the bandwidth (e.g., the bin width in histograms). We use results from KDE to derive a closed-form expression for the optimal radius, which is then used to compute the correlation dimension and to construct recurrence plots yielding an estimate of the Kolmogorov-Sinai entropy. We assess our method through numerical experiments on signals generated by nonlinear systems and on experimental electroencephalographic time series.
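The radius-bandwidth connection described in this abstract can be sketched numerically. The snippet below is an illustrative reconstruction, not the authors' code: it delay-embeds a signal, computes the correlation sum, and uses Silverman's rule-of-thumb bandwidth as a stand-in for the optimal radius (the paper derives its own closed-form expression; the embedding parameters and test signal here are hypothetical).

```python
import numpy as np

def delay_embed(x, dim=3, tau=1):
    """Time-delay embedding of a 1-D signal into dim-dimensional points."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def correlation_sum(points, r):
    """Fraction of point pairs closer than radius r (maximum norm)."""
    d = np.max(np.abs(points[:, None, :] - points[None, :, :]), axis=-1)
    n = len(points)
    return (np.sum(d < r) - n) / (n * (n - 1))  # exclude self-pairs

def silverman_radius(points):
    """Silverman's rule-of-thumb bandwidth, used here as a radius proxy."""
    n, dim = points.shape
    sigma = np.mean(np.std(points, axis=0))
    return sigma * (4.0 / ((dim + 2) * n)) ** (1.0 / (dim + 4))

rng = np.random.default_rng(0)
x = np.sin(0.3 * np.arange(500)) + 0.05 * rng.standard_normal(500)
pts = delay_embed(x)
r = silverman_radius(pts)
print(r, correlation_sum(pts, r))
```

The correlation dimension would then be estimated from the slope of log C(r) versus log r around such a radius.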

2.
Conscious Cogn ; 46: 99-109, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27689514

ABSTRACT

Whole-body embodiment studies have shown that synchronized multi-sensory cues can trick a healthy human mind into perceiving self-location outside the bodily borders, producing an illusion that resembles an out-of-body experience (OBE). But can a healthy mind also perceive the sense of self in more than one body at the same time? To answer this question, we created a novel artificial reduplication of one's body using a humanoid robot embodiment system. We first enabled individuals to embody the humanoid robot by providing them with audio-visual feedback and control of the robot's head movements and walking, and then explored the self-location and self-identification they perceived when observing themselves through the embodied robot. Our results reveal that, when individuals are exposed to the humanoid body reduplication, they experience an illusion that strongly resembles heautoscopy, suggesting that a healthy human mind is able to bi-locate in two different bodies simultaneously.


Subjects
Ego; Illusions/physiology; Robotics; Visual Perception/physiology; Adult; Female; Humans; Male; Young Adult
3.
Sci Rep ; 11(1): 20070, 2021 Oct 8.
Article in English | MEDLINE | ID: mdl-34625575

ABSTRACT

Almost all robotic systems in use have hard shells, which in many ways limits their potential for physical interaction with humans or their surrounding environment. Robots with soft-shell covers offer an alternative morphology that is more pleasant both in appearance and for haptic human interaction. A persistent challenge for such soft-shell robotic covers is the simultaneous realization of softness and heat-conducting properties. Heat conduction is important for controlling the temperature of robotic covers within the range that is comfortable for human touch. The presented soft-shell robotic cover is composed of a linked two-layer structure: (1) the inner layer, with built-in pipes for water circulation, is soft and acts as a thermal-insulation layer between the cover and the robot structure, whereas (2) the outer layer, which can be patterned with a desired texture and color, allows heat transfer from the circulating water of the inner part to the surface. Moreover, we demonstrate the ability to integrate our prototype cover with a humanoid robot equipped with capacitance sensors. This fabrication technique enables robotic cover possibilities, including tunable color, surface texture, and size, that are likely to have applications in a variety of robotic systems.

4.
PLoS One ; 13(11): e0206698, 2018.
Article in English | MEDLINE | ID: mdl-30408062

ABSTRACT

Does the presence of a robot co-worker influence the performance of humans around it? Studies of motor contagions during human-robot interactions have examined either how the observation of a robot affects a human's movement velocity, or how it affects the human's movement variance, but never both together. Performance, however, has to be measured considering both task speed (or frequency) and task accuracy. Here we examine a repetitive industrial task in which a human participant and a humanoid robot work near each other. We systematically varied the robot's behavior and observed whether and how the performance of a human participant is affected by the robot's presence. To investigate the effect of physical form, we added conditions in which the robot co-worker's torso and head were covered, so that only the moving arm was visible to the human participants. Finally, we compared these behaviors with those of a human co-worker and examined how the observed behavioral effects scale with experience of robots. Our results show that human task frequency, but not task accuracy, is affected by the observation of a humanoid robot co-worker, provided the robot's head and torso are visible.


Subjects
Robotics; Work Performance; Adult; Arm; Behavior Observation Techniques/statistics & numerical data; Female; Humans; Male; Movement/physiology; Robotics/instrumentation; Robotics/statistics & numerical data; Work Performance/statistics & numerical data; Young Adult
5.
IEEE Trans Pattern Anal Mach Intell ; 40(12): 2883-2896, 2018 Dec.
Article in English | MEDLINE | ID: mdl-29989962

ABSTRACT

We consider the problem of estimating realistic contact forces during manipulation, backed with ground-truth measurements, using vision alone. Interaction forces are usually measured by mounting force transducers onto the manipulated objects or the hands. These are costly, cumbersome, and alter the objects' physical properties and their perception by the human sense of touch. Our work establishes that interaction forces can be estimated in a cost-effective, reliable, non-intrusive way using vision. This is a complex and challenging problem. Indeed, in multi-contact, a given motion can generally be caused by infinitely many possible force distributions. To alleviate the limitations of traditional models based on inverse optimization, we collect and release the first large-scale dataset on manipulation kinodynamics, comprising 3.2 hours of synchronized force and motion measurements under 193 object-grasp configurations. We learn a mapping between high-level kinematic features based on the equations of motion and the underlying manipulation forces using recurrent neural networks (RNN). The RNN predictions are consistently refined using physics-based optimization through second-order cone programming (SOCP). We show that our method can successfully capture interaction forces compatible with both the observations and the way humans intuitively manipulate objects, using a single RGB-D camera.
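The indeterminacy this abstract mentions — one motion, infinitely many force distributions — can be illustrated with a toy computation (a sketch with hypothetical numbers, not the paper's method, which resolves the ambiguity with learned RNN priors and SOCP refinement). The net contact force follows from the equations of motion, but splitting it among several contacts is underdetermined; the pseudoinverse picks just one (minimum-norm) solution out of infinitely many.

```python
import numpy as np

m = 0.5                                   # hypothetical object mass (kg)
g = np.array([0.0, 0.0, -9.81])           # gravity
acc = np.array([0.0, 0.0, 1.2])           # measured object acceleration
net = m * acc - m * g                     # total contact force required

# Three contacts, each applying a 3-D force. G sums them into the net
# force (torques are ignored in this sketch).
G = np.hstack([np.eye(3)] * 3)            # 3 x 9 summation matrix
f = np.linalg.pinv(G) @ net               # minimum-norm per-contact forces
print(f.reshape(3, 3))                    # each row: one contact's force
print(np.allclose(G @ f, net))
```

Any f with G @ f == net is equally consistent with the observed motion, which is why purely physics-based inversion cannot recover the forces humans actually apply.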

6.
IEEE Trans Vis Comput Graph ; 23(6): 1650-1662, 2017 Jun.
Article in English | MEDLINE | ID: mdl-26992101

ABSTRACT

We extend the quadratic program (QP)-based task-space character control approach, initially intended for individual character animation, to multiple characters interacting with each other or with mobile/articulated elements of the environment. The interactions between the characters can be either physical interactions, such as contacts that can be established or broken at will and whose forces are subject to Newton's third law, or behavioral interactions, such as collision avoidance and cooperation, which naturally emerge from high-level specifications of collaborative tasks. We take a systematic approach, integrating all the equations of motion of the characters, objects, and articulated environment parts into a single QP formulation, in order to embrace and solve the most general instance of the problem, where independent individual character controllers would fail to account for the inherent coupling of their respective motions through those physical and behavioral interactions. Various types of motions and behaviors are controlled with the single formulation we propose, and some examples of the original motions the framework allows are presented in the accompanying video.
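The core of such a QP-based task-space controller can be sketched in a few lines. The snippet below is an illustrative single-character, unconstrained reduction (all matrices and gains are hypothetical; the paper's formulation couples many characters, contacts, and objects with inequality constraints): it finds joint velocities minimizing the task-space tracking error plus a small regularization, which has a closed-form solution via the normal equations.

```python
import numpy as np

def task_space_velocity(J, v_task, reg=1e-6):
    """Solve min_qdot ||J qdot - v_task||^2 + reg * ||qdot||^2,
    the unconstrained core of a task-space QP controller."""
    H = J.T @ J + reg * np.eye(J.shape[1])   # QP Hessian
    g = J.T @ v_task                          # linear term
    return np.linalg.solve(H, g)

# Hypothetical planar 3-joint arm Jacobian at some configuration.
J = np.array([[1.0, 0.5, 0.2],
              [0.0, 0.8, 0.4]])
v_task = np.array([0.1, -0.05])   # desired end-effector velocity
qdot = task_space_velocity(J, v_task)
print(np.allclose(J @ qdot, v_task, atol=1e-3))
```

Adding a second character amounts to stacking its variables and tasks into the same Hessian, plus coupling constraints (e.g., equal and opposite contact forces), which is precisely what makes the single-QP formulation preferable to independent per-character solves.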

7.
IEEE Trans Neural Syst Rehabil Eng ; 25(6): 772-781, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28113631

ABSTRACT

The efficient control of our body and successful interaction with the environment are possible through the integration of multisensory information. Brain-computer interfaces (BCI) may allow people with sensorimotor disorders to actively interact with the world. In this study, visual information was paired with auditory feedback to improve the BCI control of a humanoid surrogate. Healthy and spinal cord injured (SCI) people were asked to embody a humanoid robot and complete a pick-and-place task by means of a visual-evoked-potential BCI system. Participants observed the remote environment from the robot's perspective through a head-mounted display. Human-footstep and computer-beep sounds were used as synchronous/asynchronous auditory feedback. Healthy participants achieved better placing accuracy when listening to human footstep sounds than to a computer-generated sound. SCI people demonstrated more difficulty in steering the robot during asynchronous auditory feedback conditions. Importantly, subjective reports highlighted that the BCI mask overlaying the display did not limit the observation of the scenario or the feeling of being in control of the robot. Overall, the data suggest that sensorimotor-related information may improve the control of external devices. Further studies are required to understand how the contribution of residual sensory channels could improve the reliability of BCI systems.


Subjects
Brain-Computer Interfaces; Feedback, Sensory; Imagination; Movement; Robotics/instrumentation; Spinal Cord Injuries/physiopathology; Spinal Cord Injuries/rehabilitation; Adult; Biomimetics/instrumentation; Brain/physiology; Disabled Persons/rehabilitation; Female; Humans; Male; Man-Machine Systems; Reproducibility of Results; Sensitivity and Specificity; Spinal Cord Injuries/diagnosis; Task Performance and Analysis; Treatment Outcome; Young Adult
8.
IEEE Trans Vis Comput Graph ; 12(1): 36-47, 2006.
Article in English | MEDLINE | ID: mdl-16382606

ABSTRACT

A new computer-haptics algorithm for general interactive manipulation of deformable virtual objects is presented. In multimodal interactive simulations, haptic feedback computation often derives from contact forces. Consequently, the fidelity of haptic rendering depends significantly on contact-space modeling. Contact and friction laws between deformable models are often simplified in current methods, which does not allow a "realistic" rendering of the subtleties of contact-space physical phenomena (such as slip and stick effects due to friction, or mechanical coupling between contacts). In this paper, we use Signorini's contact law and Coulomb's friction law as the basis for computer haptics. Real-time performance is made possible by a linearization of the behavior in contact space, formulated as the so-called Delassus operator and solved iteratively by a Gauss-Seidel-type algorithm. Dynamic deformation uses a corotational global formulation to obtain a Delassus operator in which the mass and stiffness ratios are dissociated from the simulation time step. This last point is crucial for keeping haptic feedback stable. This global approach has been packaged, implemented, and tested. Stable and realistic 6-D haptic feedback is demonstrated through a clipping-task experiment.
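The Gauss-Seidel-type iteration on the Delassus operator mentioned in this abstract can be sketched for the frictionless case (a simplification: Signorini's condition only, no Coulomb friction, and hypothetical numbers; the paper solves the coupled friction problem). Each contact force is updated in turn and projected onto the non-negative half-line, which is the projected Gauss-Seidel method for the contact complementarity problem.

```python
import numpy as np

def projected_gauss_seidel(W, b, iters=200):
    """Solve the frictionless contact problem:
    f >= 0,  W f + b >= 0,  f . (W f + b) = 0,
    where W is the Delassus operator (contact-space effective inverse mass)
    and b the free relative normal velocity at the contacts."""
    f = np.zeros(len(b))
    for _ in range(iters):
        for i in range(len(b)):
            # residual velocity at contact i excluding its own force
            r = b[i] + W[i] @ f - W[i, i] * f[i]
            f[i] = max(0.0, -r / W[i, i])   # project onto f_i >= 0
    return f

# Toy 2-contact system (hypothetical SPD Delassus operator).
W = np.array([[2.0, 0.5],
              [0.5, 1.5]])
b = np.array([-1.0, 0.3])   # contact 1 approaching, contact 2 separating
f = projected_gauss_seidel(W, b)
print(f, W @ f + b)
```

In the converged solution, the approaching contact carries a positive force that cancels its penetrating velocity, while the separating contact carries none — exactly Signorini's complementarity.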


Subjects
Computer Graphics; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Imaging, Three-Dimensional/methods; Models, Biological; Touch/physiology; User-Computer Interface; Algorithms; Computer Peripherals; Computer Simulation; Computer Systems; Elasticity; Environment, Controlled; Signal Processing, Computer-Assisted; Stress, Mechanical
9.
R Soc Open Sci ; 3(8): 160407, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27853620

ABSTRACT

The question of how we attribute observed body parts as our own, and the consequences of this attribution for our sensory-motor processes, is fundamental to understanding how our brain distinguishes between self and other. Previous studies have identified interactions between the illusion of ownership and the brain's multi-sensory integration and cross-sensory predictions. Here we show that illusory ownership additionally modifies the brain's motor-sensory predictions. In our preliminary experiments, we observed a new numbness illusion following the classical rubber-hand illusion (RHI): brushing only the rubber hand after induction of the RHI results in illusory numbness in one's real hand. Previous studies have shown that self-generated actions (like tickling) are attenuated by motor-sensory predictions from the so-called forward model. Motivated by this finding, here we examined whether the numbness illusion after the RHI differs when the rubber hand is brushed by oneself compared with when the brushing is performed by another. We observed that, all other conditions remaining the same, haptic perception in the real hand was lower (numbness higher) during self-generated brushing. Our result suggests that the RHI reorganizes the forward model, such that we predict the haptic consequences of self-generated motor actions on the rubber hand.

10.
Front Neurorobot ; 8: 20, 2014.
Article in English | MEDLINE | ID: mdl-24987350

ABSTRACT

Advances in brain-computer interface (BCI) technology allow people to actively interact with the world through surrogates. Controlling real humanoid robots via BCI as intuitively as we control our own bodies is a challenge for current research in robotics and neuroscience. To interact successfully with the environment, the brain integrates multiple sensory cues to form a coherent representation of the world. Cognitive neuroscience studies demonstrate that multisensory integration may yield a gain with respect to a single modality and ultimately improve overall sensorimotor performance. For example, reactivity to simultaneous visual and auditory stimuli may be higher than to the sum of the same stimuli delivered in isolation or in temporal sequence. Yet, knowledge about whether audio-visual integration can improve the control of a surrogate is meager. To explore this issue, we provided human footstep sounds as audio feedback to BCI users controlling a humanoid robot. Participants were asked to steer their robot surrogate and perform a pick-and-place task through BCI-SSVEPs. We found that audio-visual synchrony between the footstep sounds and the humanoid's actual walk reduces the time required to steer the robot. Thus, auditory feedback congruent with the humanoid's actions may improve the motor decisions of the BCI user and strengthen the feeling of control over the robot. Our results shed light on the possibility of improving control of a robot by combining multisensory feedback to a BCI user.

11.
Exp Brain Res ; 162(2): 172-80, 2005 Apr.
Article in English | MEDLINE | ID: mdl-15791465

ABSTRACT

We tested whether auditory sequences of beeps can modulate the tactile perception of sequences of taps (two to four taps per sequence) delivered to the index fingertip. In the first experiment, the auditory and tactile sequences were presented simultaneously. The number of beeps delivered in the auditory sequence was either the same as, fewer than, or more than the number of taps in the simultaneously presented tactile sequence. Though task-irrelevant (subjects were instructed to focus on the tactile stimuli), the auditory stimuli systematically modulated subjects' tactile perception; in other words, subjects' responses depended significantly on the number of delivered beeps. Such modulation occurred only when the auditory and tactile stimuli were sufficiently similar. In the second experiment, we tested whether the automatic auditory-tactile integration depends on simultaneity, or whether a bias can be evoked when the auditory and tactile sequences are presented with a temporal asynchrony. Audition significantly modulated tactile perception when the stimuli were presented simultaneously, but this effect gradually disappeared when a temporal asynchrony was introduced between the auditory and tactile stimuli. These results show that when provided with auditory and tactile sensory signals that are likely to be generated by the same stimulus, the central nervous system (CNS) tends to integrate these signals automatically.


Subjects
Acoustic Stimulation/methods; Auditory Perception/physiology; Psychomotor Performance/physiology; Touch/physiology; Adolescent; Adult; Humans