Results 1 - 20 of 45
1.
Med Biol Eng Comput ; 2024 May 18.
Article in English | MEDLINE | ID: mdl-38760597

ABSTRACT

In the field of sensory neuroprostheses, one ultimate goal is for individuals to perceive artificial somatosensory information and use the prosthesis with a level of complexity that resembles an intact system. To this end, research has shown that stimulation-elicited somatosensory information improves prosthesis perception and task performance. While studies strive to achieve sensory integration, a crucial phenomenon that entails naturalistic interaction with the environment, this topic has not been commensurately reviewed. Therefore, here we present a perspective for understanding sensory integration in neuroprostheses. First, we review the engineering aspects and functional outcomes in sensory neuroprosthesis studies. In this context, we summarize studies that have suggested sensory integration. We focus on how they have used stimulation-elicited percepts to maximize and improve the reliability of somatosensory information. Next, we review studies that have suggested multisensory integration. These works have demonstrated that congruent and simultaneous multisensory inputs provide cognitive benefits such that an individual experiences a greater sense of authority over prosthesis movements (i.e., agency) and perceives the prosthesis as part of their own body (i.e., ownership). Thereafter, we present the theoretical and neuroscience framework of sensory integration. We investigate how behavioral models and neural recordings have been applied in the context of sensory integration. Sensory integration models developed in intact-limb individuals have paved the way for sensory neuroprosthesis studies that demonstrate multisensory integration. Neural recordings have been used to show how multisensory inputs are processed across cortical areas. Lastly, we discuss ongoing research and challenges in achieving and understanding sensory integration in sensory neuroprostheses. Resolving these challenges would help to develop future strategies to improve the sensory feedback of a neuroprosthetic system.

2.
J Neuroeng Rehabil ; 21(1): 8, 2024 01 13.
Article in English | MEDLINE | ID: mdl-38218890

ABSTRACT

BACKGROUND: Tremors are involuntary rhythmic movements commonly present in neurological diseases such as Parkinson's disease, essential tremor, and multiple sclerosis. Intention tremor is a subtype associated with lesions in the cerebellum and its connected pathways, and it is a common symptom in diseases associated with cerebellar pathology. While clinicians traditionally use clinical tests to identify tremor type and severity, recent advancements in wearable technology have provided quantifiable ways to measure movement and tremor using motion capture systems, app-based tasks and tools, and physiology-based measurements. However, quantifying intention tremor remains challenging due to its changing nature. METHODOLOGY & RESULTS: This review examines the current state of upper limb tremor assessment technology and discusses potential directions to further develop new and existing algorithms and sensors to better quantify tremor, specifically intention tremor. A comprehensive search of PubMed and Scopus was performed using keywords related to technologies for tremor assessment. Screened results were then filtered for relevance and eligibility and classified by technology type. A total of 243 publications were selected for this review and classified by type: body-function level (movement-based), activity level (task- and tool-based), and physiology-based. Furthermore, each publication's methods, purpose, and technology are summarized in the appendix table. CONCLUSIONS: Our survey suggests a need for more targeted tasks to evaluate intention tremor, including digitized tasks related to intentional movements, neurological and physiological measurements targeting the cerebellum and its pathways, and signal processing techniques that differentiate voluntary from involuntary movement in motion capture systems.
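To make the movement-based approach concrete, here is a minimal sketch (not taken from the review) of how tremor intensity could be quantified from a wrist gyroscope signal as power in an assumed 4-12 Hz tremor band; the sampling rate and variable names are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

def tremor_band_power(gyro, fs=100.0, band=(4.0, 12.0)):
    """Estimate tremor intensity as gyroscope signal power in a tremor band.

    gyro : 1-D array of angular velocity (e.g., one wrist axis, in deg/s)
    fs   : sampling rate in Hz (assumed)
    band : frequency band commonly associated with pathological tremor (assumed)
    """
    freqs, psd = welch(gyro, fs=fs, nperseg=int(2 * fs))   # 2 s analysis windows
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])                # integrated band power

# Synthetic example: slow voluntary motion plus a 6 Hz tremor component
fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
gyro = 5.0 * np.sin(2 * np.pi * 0.5 * t) + 2.0 * np.sin(2 * np.pi * 6.0 * t)
print(tremor_band_power(gyro, fs))
```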


Subject(s)
Tremor , Wearable Electronic Devices , Humans , Essential Tremor/diagnosis , Movement/physiology , Parkinson Disease/complications , Parkinson Disease/diagnosis , Tremor/diagnosis , Upper Extremity
3.
IEEE Int Conf Rehabil Robot ; 2023: 1-6, 2023 09.
Article in English | MEDLINE | ID: mdl-37941167

ABSTRACT

The grip force dynamics during grasping and lifting of diversely weighted objects are highly informative about an individual's level of sensorimotor control and potential neurological condition. Therefore, grip force profiles might be used for assessment and biofeedback training during neurorehabilitation therapy. Modern neurorehabilitation methods, such as exoskeleton-assisted grasping and virtual-reality-based hand function training, differ strongly from classical grasp-and-lift experiments, which might influence the sensorimotor control of grasping and thus the characteristics of grip force profiles. In this feasibility study with six healthy participants, we investigated the changes in grip force profiles during exoskeleton-assisted grasping and grasping of virtual objects. Our results show that a lightweight and highly compliant hand exoskeleton is able to assist users during grasping without removing the core characteristics of their grip force dynamics. Furthermore, we show that when participants grasp objects with virtual weights, they adapt quickly to unknown virtual weights and choose efficient grip forces. Moreover, participants produce predictive overshoot forces that match the inertial forces a physical object of the same weight would generate. In summary, these results suggest that users of advanced neurorehabilitation methods employ and adapt their prior internal forward models for sensorimotor control of grasping. Incorporating such insights about the grip force dynamics of human grasping into the design of neurorehabilitation methods, such as hand exoskeletons, might improve their usability and rehabilitative function.
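As an illustration of the kind of grip force characteristics discussed above, the following sketch extracts a peak force, a static hold force, and a predictive overshoot percentage from a recorded force trace. It is not code from the study; the sampling rate, the hold-phase window, and the metric definitions are assumptions for illustration.

```python
import numpy as np

def grip_force_metrics(force, fs=500.0, hold_window=1.0):
    """Extract simple grip force characteristics from one grasp-and-lift trial.

    force       : 1-D array of grip force in newtons
    fs          : sampling rate in Hz (assumed)
    hold_window : duration in seconds at the end of the trial treated as static hold
    """
    peak = float(np.max(force))
    hold = float(np.mean(force[-int(hold_window * fs):]))   # steady-state holding force
    overshoot = 100.0 * (peak - hold) / hold                 # predictive overshoot in percent
    peak_rate = float(np.max(np.gradient(force) * fs))       # peak force rate in N/s
    return {"peak_N": peak, "hold_N": hold,
            "overshoot_pct": overshoot, "peak_rate_N_per_s": peak_rate}
```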


Subject(s)
Exoskeleton Device , Psychomotor Performance , Humans , Fingers , Hand Strength , Upper Extremity
4.
IEEE Int Conf Rehabil Robot ; 2023: 1-6, 2023 09.
Article in English | MEDLINE | ID: mdl-37941195

ABSTRACT

Essential tremor (ET) is the most frequent movement disorder in adults. Upper-limb exoskeletons are a promising solution to alleviate ET symptoms. We propose a novel wrist exoskeleton for tremor damping. The TuMove exoskeleton is lightweight, portable, easy to use, and designed for activities of daily living (ADLs) and activities requiring hand dexterity. We validated the effectiveness of our exoskeleton by inducing forearm tremors with transcutaneous electrical nerve stimulation (TENS) in five healthy subjects. Our results show that the exoskeleton preserves most of the wrist range of motion (ROM) needed for ADLs. The damping system reduced the tremor's angular velocity by more than 30% during drinking and pouring tasks. Furthermore, the completion time of the Archimedes spiral decreased by 2.76 seconds (13.0%) and that of the 9-Hole Peg Test by 2.77 seconds (11.8%), indicating a performance improvement in dexterity tasks.


Subject(s)
Essential Tremor , Exoskeleton Device , Transcutaneous Electric Nerve Stimulation , Adult , Humans , Wrist , Tremor , Activities of Daily Living , Upper Extremity
5.
PLoS One ; 18(7): e0287958, 2023.
Article in English | MEDLINE | ID: mdl-37432954

ABSTRACT

Human-robot interaction (HRI) describes scenarios in which both human and robot work as partners, sharing the same environment or complementing each other on a joint task. HRI is characterized by the need for high adaptability and flexibility of robotic systems toward their human interaction partners. One of the major challenges in HRI is task planning with dynamic subtask assignment, which is particularly difficult when the human's subtask choices are not readily accessible to the robot. In the present work, we explore the feasibility of using electroencephalogram (EEG) based neuro-cognitive measures for online robot learning of dynamic subtask assignment. To this end, we demonstrate in an experimental human subject study, featuring a joint HRI task with a UR10 robotic manipulator, the presence of EEG measures indicative of a human partner anticipating a takeover situation from human to robot or vice versa. The present work further proposes a reinforcement-learning-based algorithm employing these measures as a neuronal feedback signal from the human to the robot for dynamic learning of subtask assignment. The efficacy of this algorithm is validated in a simulation-based study. The simulation results reveal that even with relatively low decoding accuracies, successful robot learning of subtask assignment is feasible, with around 80% choice accuracy among four subtasks within 17 minutes of collaboration. The simulation results further reveal that scaling to more subtasks is feasible and is mainly accompanied by longer robot learning times. These findings demonstrate the usability of EEG-based neuro-cognitive measures to mediate the complex and largely unsolved problem of human-robot collaborative task planning.
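The core idea of learning subtask assignment from an imperfectly decoded neural feedback signal can be illustrated with a simple bandit-style sketch. This is not the algorithm from the paper; the number of subtasks, the decoding accuracy, and the update rule are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subtasks = 4
decoding_accuracy = 0.7          # probability the EEG decoder reports the true feedback
preferred = 2                    # the subtask the human actually wants the robot to take over
q = np.zeros(n_subtasks)         # robot's learned preference values
alpha, eps = 0.2, 0.1            # learning rate and exploration rate

for trial in range(300):
    # epsilon-greedy choice of which subtask the robot takes over
    a = rng.integers(n_subtasks) if rng.random() < eps else int(np.argmax(q))
    true_feedback = 1.0 if a == preferred else -1.0
    # the EEG-based decoder flips the feedback with probability (1 - accuracy)
    observed = true_feedback if rng.random() < decoding_accuracy else -true_feedback
    q[a] += alpha * (observed - q[a])

print("learned preference:", int(np.argmax(q)), "values:", np.round(q, 2))
```

Even with a noisy feedback channel, the averaged updates still favor the subtask the human prefers, which is the intuition behind the reported robustness to low decoding accuracies.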


Subject(s)
Robotics , Humans , Brain , Learning , Algorithms , Computer Simulation
6.
J Neurophysiol ; 129(6): 1344-1358, 2023 06 01.
Article in English | MEDLINE | ID: mdl-37141051

ABSTRACT

How the brain responds temporally and spectrally when we listen to familiar versus unfamiliar musical sequences remains unclear. This study uses EEG techniques to investigate the continuous electrophysiological changes in the human brain during passive listening to familiar and unfamiliar musical excerpts. EEG activity was recorded in 20 participants while they passively listened to 10 s of classical music, and they were then asked to indicate their self-assessment of familiarity. We analyzed the EEG data in two ways: familiarity based on the within-subject design, i.e., averaging trials for each condition and participant, and familiarity based on the same music excerpt, i.e., averaging trials for each condition and music excerpt. By comparing the familiar condition with the unfamiliar condition and the local baseline, sustained low-beta power (12-16 Hz) suppression was observed in both analyses in fronto-central and left frontal electrodes after 800 ms. However, sustained alpha power (8-12 Hz) decreased in fronto-central and posterior electrodes after 850 ms only in the first type of analysis. Our study indicates that listening to familiar music elicits a late sustained spectral response (suppression of alpha/low-beta power from 800 ms to 10 s). Moreover, the results showed that alpha suppression reflects increased attention or arousal/engagement due to listening to familiar music, whereas low-beta suppression reflects the effect of familiarity itself. NEW & NOTEWORTHY This study differentiates the dynamic temporal-spectral effects during listening to 10 s of familiar music compared with unfamiliar music. This study highlights that listening to familiar music leads to continuous suppression in the alpha and low-beta bands. This suppression starts ∼800 ms after the stimulus onset.
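A minimal sketch of the kind of analysis described above: band power over time for one channel, expressed relative to a pre-stimulus baseline so that negative values indicate suppression. It is not the authors' pipeline; the windowing, baseline interval, and sampling rate handling are assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

def relative_band_power(eeg, fs, band, baseline=(-0.5, 0.0), t0=1.0):
    """Band power over time, relative to a pre-stimulus baseline, for one channel.

    eeg      : 1-D single-trial signal with the stimulus onset t0 seconds into the array
    fs       : sampling rate in Hz
    band     : (low, high) in Hz, e.g. (8, 12) for alpha or (12, 16) for low beta
    baseline : reference window in seconds relative to stimulus onset
    Returns the time axis (relative to onset) and the power change in dB,
    where negative values indicate suppression relative to baseline.
    """
    f, t, Sxx = spectrogram(eeg, fs=fs, nperseg=int(fs), noverlap=int(0.9 * fs))
    t = t - t0                                          # re-reference time to stimulus onset
    bp = Sxx[(f >= band[0]) & (f <= band[1])].mean(axis=0)
    ref = bp[(t >= baseline[0]) & (t < baseline[1])].mean()
    return t, 10 * np.log10(bp / ref)
```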


Subject(s)
Music , Humans , Electroencephalography/methods , Brain/physiology , Auditory Perception/physiology , Recognition, Psychology/physiology
7.
Adv Healthc Mater ; 12(17): e2202869, 2023 07.
Article in English | MEDLINE | ID: mdl-36827235

ABSTRACT

The use of soft and flexible bioelectronic interfaces can enhance the quality of recordings of cells' electrical activity by ensuring continuous and intimate contact with the smooth, curved surfaces found in the physiological environment. This work develops soft microelectrode arrays (MEAs) made of silk fibroin (SF) films for recording interfaces that can also serve as a drug delivery system. Inkjet printing is used as a tool to deposit the substrate, conductive electrode, and insulator, as well as a drug-delivery nanocomposite film. This approach is highly versatile, as shown in the fabrication of carbon microelectrodes sandwiched between a silk substrate and a silk insulator. The technique permits the development of thin-film devices that can be employed for in vitro extracellular recordings of HL-1 cell action potentials. The tuning of SF by applying an electrical stimulus to produce a permeable layer that can be used in on-demand drug delivery systems is also demonstrated. The multifunctional MEA developed here can pave the way for in vitro drug screening by applying time-resolved and localized chemical stimuli.


Subject(s)
Fibroins , Silk , Microelectrodes , Drug Delivery Systems , Electric Conductivity
8.
Brain Res ; 1800: 148198, 2023 02 01.
Article in English | MEDLINE | ID: mdl-36493897

ABSTRACT

Repeated listening to unknown music leads to gradual familiarization with musical sequences. Passive listening to musical sequences may involve an array of dynamic neural responses on the way to familiarization with the excerpts. This study elucidates the dynamic brain response and its variation over time by investigating the electrophysiological changes during familiarization with initially unknown music. Twenty subjects were asked to familiarize themselves with previously unknown 10 s classical music excerpts over three repetitions while their electroencephalogram was recorded. Dynamic spectral changes in neural oscillations were monitored by time-frequency analyses for all frequency bands (theta: 5-9 Hz, alpha: 9-13 Hz, low beta: 13-21 Hz, high beta: 21-32 Hz, and gamma: 32-50 Hz). Time-frequency analyses reveal sustained theta event-related desynchronization (ERD) in frontal-midline and left prefrontal electrodes, which decreased gradually from the first to the third repetition of the same excerpts (frontal-midline: 57.90%, left prefrontal: 75.93%). Similarly, sustained gamma ERD decreased in the frontal-midline and bilateral frontal/temporal areas (frontal-midline: 61.47%, left frontal: 90.88%, right frontal: 87.74%). During familiarization, the decrease in theta ERD is more pronounced in the first part (1-5 s), whereas the decrease in gamma ERD is more pronounced in the second part (5-9 s) of the music excerpts. The results suggest that decreased theta ERD is associated with successfully identifying familiar sequences, whereas decreased gamma ERD is related to forming unfamiliar sequences.
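For reference, event-related desynchronization is conventionally expressed as a percentage change of band power relative to a baseline interval; the small sketch below shows that formula (the example numbers are illustrative, not values from the study).

```python
import numpy as np

def erd_percent(power, baseline_power):
    """Event-related (de)synchronization as a percentage change from baseline.

    power          : band power during the analysed interval (e.g., theta 5-9 Hz)
    baseline_power : band power in a pre-stimulus reference interval
    Negative values indicate desynchronization (ERD), positive values synchronization (ERS).
    """
    return 100.0 * (power - baseline_power) / baseline_power

# e.g., a theta ERD that weakens across repetitions of the same excerpt
print(erd_percent(np.array([0.6, 0.8, 0.95]), 1.0))   # -> [-40. -20.  -5.]
```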


Subject(s)
Music , Humans , Electroencephalography/methods , Brain , Auditory Perception/physiology , Brain Mapping
9.
Sci Rep ; 12(1): 20764, 2022 12 01.
Article in English | MEDLINE | ID: mdl-36456595

ABSTRACT

When a human and a machine collaborate on a shared task, ambiguous events might occur that could be perceived as an error by the human partner. In such events, spontaneous error-related potentials (ErrPs) are evoked in the human brain. Knowing whom the human perceived as responsible for the error would help a machine in co-adaptation and shared control paradigms to better adapt to human preferences. Therefore, we ask whether self- and agent-related errors evoke different ErrPs. Eleven subjects participated in an electroencephalography human-agent collaboration experiment with a collaborative trajectory-following task on two collaboration levels, where movement errors occurred as trajectory deviations. Independently of the collaboration level, we observed a higher response amplitude at the midline central Cz electrode for self-related errors compared with observed errors made by the agent. On average, support vector machines classified self- and agent-related errors with 72.64% accuracy using subject-specific features. These results demonstrate that ErrPs can indicate whether a person attributes an error to themselves or to an external autonomous agent during collaboration. Thus, a collaborative machine can receive more informed feedback on error attribution, allowing appropriate error identification, correction, and avoidance in future actions.
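A minimal sketch of subject-specific classification of self- versus agent-related error epochs with a support vector machine and cross-validation, in the spirit of the analysis described above; the feature matrix here is a random placeholder, and the kernel and feature choices are assumptions rather than details from the paper.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: one row per error epoch (e.g., Cz amplitudes in a post-error window); y: 0 = self, 1 = agent
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 50))        # placeholder feature matrix for illustration
y = rng.integers(0, 2, size=120)      # placeholder labels

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2%}")
```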


Subject(s)
Brain-Computer Interfaces , Humans , Electroencephalography , Support Vector Machine , Movement , Acclimatization
10.
J Neuroeng Rehabil ; 19(1): 124, 2022 11 11.
Article in English | MEDLINE | ID: mdl-36369025

ABSTRACT

Soft exosuits promise to assist users in everyday workload tasks. However, acceptance of such systems remains low due to the difficulty of control compared with rigid mechatronic systems. Recently, there has been progress in developing control schemes for soft exosuits that move in line with user intentions. While initial results have demonstrated sufficient device performance, user experience has yet to be assessed via the cognitive response. To address this, we propose a soft pneumatic elbow exosuit, designed based on our previous work, to provide assistance in line with user expectations using two existing state-of-the-art control methods: gravity compensation and a myoprocessor based on muscle activation. A user experience study was conducted to assess whether the device moves naturally with user expectations and to gauge the potential for device acceptance by determining, through the neuro-cognitive and motor response, when the exosuit violated user expectations. Brain activity from electroencephalography (EEG) data revealed that subjects elicited error-related potentials (ErrPs) in response to unexpected exosuit actions, which were decodable across both control schemes with an average accuracy of 76.63 ± 1.73% across subjects. Additionally, unexpected exosuit actions were further decoded via the motor response from electromyography (EMG) and kinematic data with grand average accuracies of 68.73 ± 6.83% and 77.52 ± 3.79%, respectively. This work validates existing state-of-the-art control schemes for soft wearable exosuits through the proposed soft pneumatic elbow exosuit. We demonstrate the feasibility of assessing device performance with respect to the cognitive response by decoding when the device violates user expectations, in order to help understand and promote device acceptance.


Subject(s)
Exoskeleton Device , Robotics , Humans , Elbow , Biomechanical Phenomena , Cognition
11.
Sci Rep ; 12(1): 919, 2022 01 18.
Article in English | MEDLINE | ID: mdl-35042875

ABSTRACT

Understanding the human brain's perception of different thermal sensations has sparked the interest of many neuroscientists. The identification of distinct brain patterns when processing thermal stimuli has several clinical applications, such as phantom-limb pain prediction, as well as increasing the sense of embodiment when interacting with neurorehabilitation devices. Notwithstanding the remarkable number of studies that have touched upon this research topic, understanding how the human brain processes different thermal stimuli has remained elusive. More importantly, the perception dynamics of very intense thermal stimuli, their related cortical activations, and their decoding using effective features are still not fully understood. In this study, using electroencephalography (EEG) recorded from three healthy human subjects, we identified spatial, temporal, and spectral patterns of brain responses to different thermal stimulations ranging from extremely cold and hot stimuli (very intense) and moderately cold and hot stimuli (intense) to a warm stimulus (innocuous). Our results show that very intense thermal stimuli elicit a decrease in alpha power compared with intense and innocuous stimulations. Spatio-temporal analysis reveals that in the first 400 ms post-stimulus, brain activity increases in the prefrontal and central brain areas for very intense stimulations, whereas for intense stimulation, high activity of the parietal area was observed after 500 ms. Based on these identified EEG patterns, we successfully classified the different thermal stimulations with an average test accuracy of 84% across all subjects. En route to understanding the underlying cortical activity, we source-localized the EEG signal for each of the five thermal stimulus conditions. Our findings reveal that very intense stimuli were anticipated and induced early activation (before 400 ms) of the anterior cingulate cortex (ACC). Moreover, activation of the prefrontal cortex and of somatosensory, central, and parietal areas was observed in the first 400 ms post-stimulation for very intense conditions and starting 500 ms post-stimulus for intense conditions. Overall, despite the small sample size, this work presents novel findings and a first comprehensive approach to explore, analyze, and classify EEG brain-activity changes evoked by five different thermal stimuli, which could lead to a better understanding of thermal stimulus processing in the brain and could, therefore, pave the way for developing a real-time withdrawal reaction system when interacting with prosthetic limbs. We underpin this last point by benchmarking our EEG results with a demonstration of a real-time withdrawal reaction of a robotic prosthesis using a human-like artificial skin.


Subject(s)
Brain
12.
Annu Int Conf IEEE Eng Med Biol Soc ; 2021: 4713-4716, 2021 11.
Article in English | MEDLINE | ID: mdl-34892264

ABSTRACT

This paper presents visually guided grip selection combining object recognition and tactile feedback for a soft hand exoskeleton intended for hand rehabilitation. A pre-trained neural network is used to recognize the object in front of the hand exoskeleton, and the recognized object is then mapped to a suitable grip type. With this object cue, the exoskeleton actively assists users in performing different grip movements without calibration. In a pilot experiment, one healthy user completed four different grasp-and-move tasks repeatedly. All trials were completed within 25 seconds, and only one out of 20 trials failed. This shows that automated movement training can be achieved by visual guidance even without biomedical sensors. In particular, in private settings at home without clinical supervision, it is a powerful tool for repetitive training of daily-living activities.
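The mapping from a recognized object to a grip type can be illustrated with a small sketch; the label set, grip names, and fallback behavior are assumptions, and the recognition network itself is abstracted away.

```python
# Illustrative mapping from a recognized object label to a grip primitive of the
# hand exoskeleton; the labels and grip names are assumed, and the object
# recognition step (the pre-trained network in the abstract) is not shown here.
GRIP_MAP = {
    "cup": "cylindrical",
    "bottle": "cylindrical",
    "pen": "pinch",
    "ball": "spherical",
    "key": "lateral",
}

def select_grip(recognized_label: str, default: str = "open") -> str:
    """Return the grip type the exoskeleton should assist for a recognized object."""
    return GRIP_MAP.get(recognized_label.lower(), default)

print(select_grip("Cup"))    # -> cylindrical
print(select_grip("phone"))  # -> open (unknown objects keep the hand open)
```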


Subject(s)
Exoskeleton Device , Hand , Hand Strength , Humans , Movement , Touch
13.
Annu Int Conf IEEE Eng Med Biol Soc ; 2021: 6102-6105, 2021 11.
Article in English | MEDLINE | ID: mdl-34892509

ABSTRACT

Accurate and low-power decoding of brain signals such as electroencephalography (EEG) is key to constructing wearable devices based on brain-computer interfaces (BCIs). While deep learning approaches have progressed substantially in terms of decoding accuracy, their power consumption is relatively high for mobile applications. Neuromorphic hardware is a promising solution to this problem, since it can run massive spiking neural networks with energy consumption orders of magnitude lower than traditional hardware. Here, we show the viability of directly mapping a continuous-valued convolutional neural network for motor imagery EEG classification to a spiking neural network. The converted network, able to run on the SpiNNaker neuromorphic chip, shows only a 1.91% decrease in accuracy after conversion. Thus, we take full advantage of both the accuracy of deep learning and the low power consumption of neuro-inspired hardware, properties that are key for the development of wearable BCI devices.
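The rate-based conversion idea behind such mappings can be sketched for a single dense layer: the pre-activation is injected as a constant current into integrate-and-fire neurons, whose firing rate then approximates the ReLU activation. This is a generic illustration under assumed sizes and threshold, not the network or toolchain from the paper.

```python
import numpy as np

def if_dense_layer(x, W, b, T=200, v_thresh=1.0):
    """Rate-coded integrate-and-fire version of a ReLU dense layer.

    The pre-activation W @ x + b is applied as a constant current for T timesteps;
    with per-layer weight normalization (so the current stays below v_thresh),
    firing rate times v_thresh approximates relu(W @ x + b).
    """
    current = W @ x + b
    v = np.zeros_like(current)
    spike_count = np.zeros_like(current)
    for _ in range(T):
        v += current
        fired = v >= v_thresh
        spike_count += fired
        v = np.where(fired, v - v_thresh, v)   # subtract-and-reset keeps the residual charge
    return spike_count / T * v_thresh          # approximates the ReLU activation

rng = np.random.default_rng(0)
W, b, x = rng.normal(scale=0.1, size=(8, 16)), np.zeros(8), rng.normal(size=16)
print(np.round(np.maximum(W @ x + b, 0.0), 3))   # ReLU reference
print(np.round(if_dense_layer(x, W, b), 3))      # spiking approximation
```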


Subject(s)
Brain-Computer Interfaces , Deep Learning , Algorithms , Electroencephalography , Neural Networks, Computer
14.
Annu Int Conf IEEE Eng Med Biol Soc ; 2021: 6203-6206, 2021 11.
Article in English | MEDLINE | ID: mdl-34892532

ABSTRACT

Exoskeletons and prosthetic devices controlled using brain-computer interfaces (BCIs) can be prone to errors due to inconsistent decoding. In recent years, it has been demonstrated that error-related potentials (ErrPs) can be used as a feedback signal in electroencephalography (EEG) based BCIs. However, modern BCIs often require long setup times and are physically restrictive, making them impractical for everyday use. In this paper, we use a mobile and easy-to-set-up EEG device to investigate whether an erroneously functioning 1-DOF exoskeleton elicits a classifiable brain response under two conditions, namely, visually observing and wearing the exoskeleton. We develop a pipeline that can be applied to both conditions and observe in our experiments evidence for neural responses from electrodes near regions associated with ErrPs in an environment that resembles the real world. We found that these error-related responses can be classified as ErrPs with accuracies ranging from 60% to 71%, depending on the condition and the subject. Our pipeline could be further extended to detect and correct erroneous exoskeleton behavior in real-world settings.


Subject(s)
Brain-Computer Interfaces , Exoskeleton Device , Brain , Electroencephalography , Pilot Projects
15.
Psychol Res ; 85(2): 491-502, 2021 Mar.
Article in English | MEDLINE | ID: mdl-32705336

ABSTRACT

Attentional orienting towards others' gaze direction or pointing has been well investigated in laboratory conditions. However, less is known about the operation of attentional mechanisms in online naturalistic social interaction scenarios. It is equally plausible that following social directional cues (gaze, pointing) occurs reflexively, and/or that it is influenced by top-down cognitive factors. In a mobile eye-tracking experiment, we show that under natural interaction conditions, overt attentional orienting is not necessarily reflexively triggered by pointing gestures or a combination of gaze shifts and pointing gestures. We found that participants conversing with an experimenter, who, during the interaction, would play out pointing gestures as well as directional gaze movements, continued to mostly focus their gaze on the face of the experimenter, demonstrating the significance of attending to the face of the interaction partner, in line with effective top-down control over reflexive orienting of attention in the direction of social cues.


Subject(s)
Attention/physiology , Cues , Face , Gestures , Orientation, Spatial/physiology , Adult , Female , Fixation, Ocular/physiology , Humans , Male , Photic Stimulation/methods , Young Adult
16.
Sci Robot ; 5(49), 2020 12 09.
Article in English | MEDLINE | ID: mdl-33298517

ABSTRACT

Advances in neuroscience are inspiring developments in robotics and vice versa.


Subject(s)
Brain-Computer Interfaces , Neurosciences/instrumentation , Robotics/instrumentation , Bioengineering , Biomimetics , Humans , Models, Neurological
17.
Science ; 370(6519): 966-970, 2020 11 20.
Article in English | MEDLINE | ID: mdl-33214278

ABSTRACT

Monitoring of finger manipulation without disturbing the inherent functionalities is critical to understanding the sense of natural touch. However, worn or attached sensors affect the natural feeling of the skin. We developed nanomesh pressure sensors that can monitor finger pressure without detectable effects on human sensation. The effect of the sensor on human sensation was quantitatively investigated, and the finger with the sensor applied exhibits grip forces comparable to those of the bare finger, even though the attachment of a 2-micrometer-thick polymeric film results in a 14% increase in grip force after adjusting for friction. At the same time, the sensor exhibits extreme mechanical durability against cyclic shearing and friction greater than hundreds of kilopascals.


Subject(s)
Fingers/physiology , Monitoring, Physiologic/instrumentation , Nanostructures , Touch , Friction , Humans , Pressure , Shear Strength
18.
J Neural Eng ; 17(5): 056006, 2020 10 20.
Article in English | MEDLINE | ID: mdl-33078717

ABSTRACT

OBJECTIVE: A major challenge for controlling a prosthetic arm is communication between the device and the user's phantom limb. We show the ability to enhance phantom limb perception and improve movement decoding through targeted transcutaneous electrical nerve stimulation in individuals with an arm amputation. APPROACH: Transcutaneous nerve stimulation experiments were performed with four participants with arm amputation to map phantom limb perception. We measured myoelectric signals during phantom hand movements before and after participants received sensory stimulation. Using electroencephalogram (EEG) monitoring, we measured the neural activity in sensorimotor regions during phantom movements and stimulation. In one participant, we also tracked sensory mapping over 2 years and movement decoding performance over 1 year. MAIN RESULTS: Results show improvements in the participants' ability to perceive and move the phantom hand as a result of sensory stimulation, which leads to improved movement decoding. In the extended study with one participant, we found that sensory mapping remains stable over 2 years. Sensory stimulation improves within-day movement decoding while performance remains stable over 1 year. From the EEG, we observed cortical correlates of sensorimotor integration and increased motor-related neural activity as a result of enhanced phantom limb perception. SIGNIFICANCE: This work demonstrates that phantom limb perception influences prosthesis control and can benefit from targeted nerve stimulation. These findings have implications for improving prosthesis usability and function due to a heightened sense of the phantom hand.


Subject(s)
Artificial Limbs , Movement , Perception , Phantom Limb , Hand , Humans
19.
Sensors (Basel) ; 20(7), 2020 Mar 31.
Article in English | MEDLINE | ID: mdl-32244511

ABSTRACT

The sense of touch enables us to safely interact with our surroundings and to control our contacts with them. Many technical systems and applications could profit from a similar type of sense. Yet, despite the emergence of e-skin systems covering more extensive areas, large-area realizations of e-skin that effectively boost applications are still rare. Recent advancements have improved the deployability and robustness of e-skin systems, laying the basis for their scalability. However, the upscaling of e-skin systems introduces yet another challenge: handling a large amount of heterogeneous tactile information with complex spatial relations between sensing points. We targeted this challenge and proposed an event-driven approach for large-area skin systems. While our previous works focused on the implementation and experimental validation of the approach, this work now provides the consolidated foundations for realizing, designing, and understanding large-area event-driven e-skin systems for effective applications. This work homogenizes the different perspectives on event-driven systems and assesses the applicability of existing event-driven implementations in large-area skin systems. Additionally, we provide novel guidelines for tuning the novelty threshold of event generators. Overall, this work develops a systematic approach towards realizing a flexible event-driven information handling system on standard computer systems for large-scale e-skin, with detailed descriptions of the effective design of event generators and decoders. All designs and guidelines are validated by outlining their impact on our implementations and by consolidating various experimental results. The resulting system design for e-skin systems is scalable, efficient, flexible, and capable of handling large amounts of information without customized hardware. The system makes complex large-area tactile applications feasible, for instance in robotics.
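A minimal sketch of a send-on-delta style event generator with a novelty threshold, the kind of component discussed above; the threshold value, units, and data layout are assumptions for illustration, not the guidelines from the paper.

```python
from dataclasses import dataclass

@dataclass
class EventGenerator:
    """Send-on-delta event generator for one tactile sensing point.

    An event is emitted only when the new reading differs from the last
    transmitted value by more than the novelty threshold, so static contact
    produces no traffic and the system scales to large sensor counts.
    """
    threshold: float = 0.05   # novelty threshold (assumed units, e.g. normalized force)
    last_sent: float = 0.0

    def update(self, value: float):
        if abs(value - self.last_sent) > self.threshold:
            self.last_sent = value
            return value      # event: transmit the new reading
        return None           # no event: reading is not novel enough

# Example: a slowly drifting contact produces only a handful of events
gen = EventGenerator(threshold=0.05)
readings = [0.0, 0.01, 0.02, 0.10, 0.11, 0.30, 0.31, 0.31]
events = [(i, v) for i, r in enumerate(readings) if (v := gen.update(r)) is not None]
print(events)   # [(3, 0.1), (5, 0.3)]
```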


Subject(s)
Robotics/standards , Touch/physiology , Wearable Electronic Devices/standards , Computers , Humans , Robotics/trends , Wearable Electronic Devices/trends
20.
Biol Cybern ; 114(3): 363-387, 2020 06.
Article in English | MEDLINE | ID: mdl-32185485

ABSTRACT

For spatiotemporal learning with neural networks, hyperparameters are often set manually by a human expert. This is especially the case with multiple timescale networks, which require a careful setting of the timescale values in order to learn spatiotemporal data. However, this implies a cumbersome trial-and-error process until suitable parameters are found, and it reduces the long-term autonomy of artificial agents, such as robots that are controlled by multiple timescale networks. To solve this problem, we propose the evolutionary optimized multiple timescale recurrent neural network (EO-MTRNN), which is inspired by the neural plasticity of the human cortex. Our proposed network uses evolutionary optimization to adjust its timescales and to rewire itself in terms of the number of neurons and synapses. Moreover, it does not require additional neural networks for pre- and postprocessing of input-output data. We validate our EO-MTRNN by applying it to a proposed benchmark training dataset with single and multiple sequence training cases, as well as to sensory-motor data from a robot. We compare different configuration modes of the network, and we compare the learning performance between a network configuration with manually set hyperparameters and a configuration with automatically estimated hyperparameters. The results show that automatically estimated hyperparameters yield approximately 43% better performance than manually set ones, without overfitting the given teaching data. We also validate the generalization ability by successfully learning data that were not included in the hyperparameter estimation process.
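The general idea of evolutionary hyperparameter optimization over timescales can be sketched with a simple keep-the-best-and-mutate loop; the fitness function below is a toy placeholder standing in for training an MTRNN, and the population size, mutation scheme, and parameter ranges are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(timescales):
    """Placeholder: would return the negative training error of an MTRNN configured
    with these timescales; here a toy quadratic in log-space stands in for it."""
    target = np.array([2.0, 30.0, 100.0])   # assumed 'good' fast/slow timescales
    return -np.sum((np.log(timescales) - np.log(target)) ** 2)

def evolve(pop_size=20, n_gen=50, n_params=3, bounds=(1.0, 200.0), sigma=0.3):
    pop = rng.uniform(*bounds, size=(pop_size, n_params))
    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]                 # keep the best half
        children = parents * np.exp(rng.normal(0, sigma, parents.shape))   # log-normal mutation
        pop = np.clip(np.vstack([parents, children]), *bounds)
    return pop[np.argmax([fitness(ind) for ind in pop])]

print(np.round(evolve(), 1))   # should drift towards the assumed target timescales
```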


Subject(s)
Models, Neurological , Neural Networks, Computer , Psychomotor Performance/physiology , Robotics/methods , Spatial Learning/physiology , Time Perception/physiology , Databases, Factual , Humans , Time Factors