1.
Article in English | MEDLINE | ID: mdl-38082616

ABSTRACT

Exoskeletons are widely used in the field of rehabilitation robotics. Upper limb exoskeletons (ULEs) can be very useful for patients with a diminished ability to control their limbs, aiding activities of daily living (ADLs). The design of ULEs must account for a human's limitations and ability to work with an exoskeleton, which is typically achieved by involving vulnerable end users in each design cycle. Simulation-based design methods on a model with a human in the loop, on the other hand, can limit the number of design cycles, thereby reducing research time and the dependency on end users. This study demonstrates the approach with a case in which the design of an exoskeleton wrist is optimized by adding a torsional spring at the joint that compensates part of the required motor torque. Considering the human-in-the-loop system, the multibody modeling results show that a torsional spring in the joint supports the design of a lightweight and compact exoskeleton joint by allowing the motor to be downsized. Clinical Relevance- The proposed methodology for designing an upper-limb exoskeleton helps limit design cycles and makes the device both convenient and useful for assisting users with severe impairment in ADLs.
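As a hedged illustration of the spring-compensation idea described above, a minimal sketch follows; the load, lever arm, joint range, and the sinusoidal gravity-torque model are assumed values for this example, not parameters from the study.

```python
import numpy as np

# Illustrative sketch only: all parameters below are assumptions, not study values.
m = 0.5               # kg, assumed mass carried by the wrist joint
g = 9.81              # m/s^2
r = 0.07              # m, assumed distance from joint axis to the load's centre of mass
k = m * g * r         # Nm/rad, spring stiffness chosen to cancel the linearised load

theta = np.deg2rad(np.linspace(-40.0, 40.0, 201))  # assumed wrist joint range
tau_load = m * g * r * np.sin(theta)     # assumed gravity torque about the joint
tau_spring = -k * theta                  # torsional spring torque (neutral at 0 rad)
tau_motor = tau_load + tau_spring        # residual torque the motor must still provide

print(f"Peak motor torque without spring: {np.max(np.abs(tau_load)):.3f} Nm")
print(f"Peak motor torque with spring:    {np.max(np.abs(tau_motor)):.3f} Nm")
```

Under these assumptions the residual motor torque is an order of magnitude smaller than the uncompensated gravity torque, which is what permits a smaller motor.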


Subject(s)
Exoskeleton Device, Humans, Wrist, Activities of Daily Living, Equipment Design, Upper Extremity
2.
Article in English | MEDLINE | ID: mdl-38082858

ABSTRACT

The inductive tongue-computer interface allows individuals with tetraplegia to control assistive devices. However, controlling assistive robotic arms often requires more than 14 different commands, which cannot always fit into a single control layout. Previous studies have separated the commands into modes, but few have investigated strategies for switching between them. In this feasibility study, we compare the efficiency of switching modes using buttons, swipe gestures, and double taps with a preliminary version of a new non-invasive mouthpiece unit (nMPU), which includes an integrated activation unit and a single sensor board. Three participants controlled a JACO assistive robot to pick up a bottle using the different mode-switching strategies. Compared with switching modes with buttons, switching modes with swipes and double taps increased the task completion time by 21% and 58%, respectively. Therefore, we recommend that configurations with multiple modes for the non-invasive tongue-computer interface include buttons for mode switching. Clinical relevance- Cumbersome mode-switching strategies can lower a control interface's responsiveness and contribute to end-user abandonment of assistive technologies. This study showed that using buttons to switch modes is more reliable. Moreover, this study will inform the development of future control layouts with improved usability.


Subject(s)
Robotic Surgical Procedures, Robotics, Humans, User-Computer Interface, Equipment Design, Computers, Tongue/physiology
3.
Article in English | MEDLINE | ID: mdl-38082906

ABSTRACT

Individuals with severe disabilities can benefit from assistive robotic systems (ARS) for performing activities of daily living. However, few control interfaces are available for individuals who cannot use their hands for control, and most of these interfaces require high effort to perform simple tasks. Autonomous and intelligent control strategies have therefore been proposed to assist with the control of complex tasks. In this paper, we present an autonomous and adaptive method for adjusting an assistive robot's velocity in different regions of its workspace, reducing the robot's velocity where fine control is required. Two participants controlled a JACO assistive robot to grasp and lift a bottle with and without the velocity adjustment method. The task was performed 9.1% faster with velocity adjustment. Furthermore, analysis of the robot trajectory showed that the method recognized highly restrictive regions and reduced the end-effector velocity accordingly. Clinical relevance- The autonomous velocity adjustment method can ease the control of ARSs and improve their usability, leading to a higher quality of life for individuals with severe disabilities who can benefit from ARSs.
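The abstract does not give the implementation details of the velocity-adjustment method; below is a minimal sketch of one way such a scheme could work, scaling the commanded end-effector velocity down as the gripper approaches a region that requires fine control. All names, thresholds, and gains are assumptions for illustration.

```python
import numpy as np

def adjusted_velocity(v_cmd, p_effector, restrictive_centres,
                      slow_radius=0.15, min_scale=0.2):
    """Scale a commanded end-effector velocity based on proximity to
    restrictive regions (e.g. around an object to be grasped).

    v_cmd: commanded Cartesian velocity (3,), from the user interface
    p_effector: current end-effector position (3,)
    restrictive_centres: positions (3,) where fine control is needed
    slow_radius: distance (m) at which slowing begins (assumed value)
    min_scale: velocity scale applied at zero distance (assumed value)
    """
    if not restrictive_centres:
        return v_cmd
    d = min(np.linalg.norm(p_effector - np.asarray(c)) for c in restrictive_centres)
    # Linear ramp from min_scale (at the region) to 1.0 (at slow_radius and beyond)
    scale = min_scale + (1.0 - min_scale) * np.clip(d / slow_radius, 0.0, 1.0)
    return scale * v_cmd

# Example: the user commands 0.1 m/s towards a bottle located 5 cm away
v = adjusted_velocity(np.array([0.1, 0.0, 0.0]),
                      p_effector=np.array([0.40, 0.0, 0.30]),
                      restrictive_centres=[np.array([0.45, 0.0, 0.30])])
print(v)  # reduced velocity near the restrictive region
```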


Subject(s)
Exoskeleton Device, Robotics, Self-Help Devices, Humans, Activities of Daily Living, Quality of Life, Upper Extremity
4.
Article in English | MEDLINE | ID: mdl-38083736

ABSTRACT

Tongue computer interfaces have shown promise both for computer control and for control of assistive technologies and robotics. Still, evidence on their usability is lacking, which leads to speculation about their effectiveness for general computer use and their impact on other activities such as speaking, drinking, and eating. This paper presents the results of such a usability study performed with two individuals with tetraplegia. The results show a high acceptance of the Inductive Tongue Computer Interface, with an average rating of 2.6 on a scale from 1 (normal) to 10 (unacceptable), and a low impact on speech after only 3 days of use. Clinical Relevance- This study emphasizes the applicability and adoptability of the Inductive Tongue Interface as a useful assistive technology for individuals with severe disabilities.


Subject(s)
Disabled Persons, Internet Use, Humans, User-Computer Interface, Equipment Design, Computers, Tongue
5.
Sensors (Basel); 23(4), 2023 Feb 17.
Article in English | MEDLINE | ID: mdl-36850871

ABSTRACT

This study proposes a bioinspired exotendon routing configuration for a tendon-based mechanism to provide finger flexion and extension that utilizes a single motor to reduce the complexity of the system. The configuration was primarily inspired by the extrinsic muscle-tendon units of the human musculoskeletal system. The function of the intrinsic muscle-tendon units was partially compensated by adding a minor modification to the configuration of the extrinsic units. The finger kinematics produced by this solution during flexion and extension were experimentally evaluated on an artificial finger and compared to that obtained using the traditional mechanism, where one exotendon was inserted at the distal phalanx. The experiments were conducted on nine healthy subjects who wore a soft exoskeleton glove equipped with the novel tendon mechanism. Contrary to the traditional approach, the proposed mechanism successfully prevented the hyperextension of the distal interphalangeal (DIP) and the metacarpophalangeal (MCP) joints. During flexion, the DIP joint angles produced by the novel mechanism were smaller than the angles generated by the traditional approach for the same proximal interphalangeal (PIP) joint angles. This provided a flexion trajectory closer to the voluntary flexion motion and avoided straining the interphalangeal coupling between the DIP and PIP joints. Finally, the proposed solution generated similar trajectories when applied to a stiff artificial finger (simulating spasticity). The results, therefore, demonstrate that the proposed approach is indeed an effective solution for the envisioned soft hand exoskeleton system.


Subject(s)
Biomimetics, Exoskeleton Device, Humans, Upper Extremity, Hand, Tendons
6.
IEEE Int Conf Rehabil Robot; 2022: 1-6, 2022 Jul.
Article in English | MEDLINE | ID: mdl-36176082

ABSTRACT

Tongue-based robotic interfaces have shown the potential to control assistive robotic devices developed for individuals with severe disabilities due to spinal cord injury. However, current tongue-robotic interfaces require invasive methods such as piercing to attach an activation unit (AU) to the tongue. A noninvasive tongue interface concept, which used a frame-integrated AU instead of a tongue-attached AU, was previously proposed. However, compact one-piece sensor printed circuit boards (PCBs) still need to be developed to enable activation of all inductive sensors. In this study, we developed and tested four designs of compact one-piece sensor PCBs incorporating inductive sensors for the design of a noninvasive tongue-robotic interface. We measured electrical parameters of the developed sensors to detect activation and compared them with a sensor of the current version of the inductive tongue-computer interface (ITCI) by moving AUs with different contact surfaces over the surface of the sensors. Results showed that the newly developed inductive sensors had higher and wider activation than the ITCI sensor, and that the AU with a flat contact surface had 3.5-4 times higher activation than the AU with a spherical contact surface. Higher sensor activation can result in a higher signal-to-noise ratio and thus a higher AU tracking resolution.


Subject(s)
Robotic Surgical Procedures, Self-Help Devices, Equipment Design, Humans, Tongue/physiology, User-Computer Interface
7.
IEEE Int Conf Rehabil Robot; 2022: 1-3, 2022 Jul.
Article in English | MEDLINE | ID: mdl-36176115

ABSTRACT

Previous studies have described inductive tongue computer interfaces (ITCI) as a way to manipulate and control assistive robotics, and at least one commercial company manufactures ITCIs today. This case report investigates the influence of an ITCI on the speed and quality of speech. An individual with tetraplegia read aloud a short part of "The Ugly Duckling", a well-known story by Hans Christian Andersen, in her native language, Danish. The reading was done twice, first with her own Removable Full Upper Denture (RFUD) and then with a copy of this RFUD with an ITCI integrated in the palatal area. Speed was assessed as a word count over 5 minutes of reading aloud, and quality as the confidence of an automated speech-to-text transcription. This study found no difference in the speed or quality of speech between the two settings, with and without the ITCI.


Subject(s)
Robotics, Speech, Female, Humans, Quadriplegia, Tongue, User-Computer Interface
8.
IEEE Int Conf Rehabil Robot; 2022: 1-6, 2022 Jul.
Article in English | MEDLINE | ID: mdl-36176154

ABSTRACT

Despite having the potential to improve the lives of severely paralyzed users, non-invasive Brain-Computer Interfaces (BCI) have yet to be integrated into their daily lives. The widespread adoption of BCI-driven assistive technology is hindered by its limited usability, as both end users and researchers find fault with traditional EEG caps. In this paper, we compare the usability of four EEG recording devices for Steady-State Visually Evoked Potential (SSVEP)-BCI applications: an EEG cap (active gel electrodes), two headbands (passive gel or active dry electrodes), and two adhesive electrodes placed on each mastoid. Ten able-bodied participants tested each device by completing an 8-target SSVEP paradigm. Setup times were recorded, and participants rated their satisfaction with each device. The EEG cap obtained the best classification accuracies (Median = 98.96%), followed by the gel electrode headband (Median = 93.75%) and the dry electrode headband (Median = 91.14%). The mastoid electrodes obtained classification accuracies close to chance level (Median = 29.69%). Unaware of the classification accuracy, participants found the mastoid electrodes to be the most comfortable and discreet. The dry electrode headband obtained the lowest user satisfaction score and was criticized for being too uncomfortable. Participants also noted that the EEG cap was too conspicuous. The gel-based headband provided a good trade-off between BCI performance and user satisfaction.
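The abstract does not state which classifier produced these accuracies; a minimal sketch of the canonical correlation analysis (CCA) approach commonly used for SSVEP target identification follows. The sampling rate, stimulation frequencies, harmonic count, and synthetic data are assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def ssvep_classify(eeg, freqs, fs=250, n_harmonics=2):
    """Pick the stimulation frequency whose sine/cosine reference signals
    are most correlated with the multichannel EEG segment.

    eeg: array of shape (n_samples, n_channels)
    freqs: candidate stimulation frequencies in Hz
    fs: sampling rate in Hz (assumed value)
    """
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in freqs:
        ref = np.column_stack(
            [fn(2 * np.pi * (h + 1) * f * t)
             for h in range(n_harmonics) for fn in (np.sin, np.cos)])
        cca = CCA(n_components=1)
        x_c, y_c = cca.fit_transform(eeg, ref)
        scores.append(np.corrcoef(x_c[:, 0], y_c[:, 0])[0, 1])
    return freqs[int(np.argmax(scores))], scores

# Example with synthetic data: 2 s of 8-channel noise plus a 10 Hz component
rng = np.random.default_rng(0)
t = np.arange(500) / 250
eeg = rng.normal(size=(500, 8)) + np.outer(np.sin(2 * np.pi * 10 * t), np.ones(8))
print(ssvep_classify(eeg, freqs=[8.0, 10.0, 12.0, 15.0]))
```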


Subject(s)
Brain-Computer Interfaces, Robotics, Electrodes, Electroencephalography, Evoked Potentials, Visual Evoked Potentials, Humans
9.
Sensors (Basel); 22(18), 2022 Sep 13.
Article in English | MEDLINE | ID: mdl-36146260

ABSTRACT

This paper presents the EXOTIC, a novel assistive upper limb exoskeleton for individuals with complete functional tetraplegia that provides an unprecedented level of versatility and control. The current literature on exoskeletons mainly focuses on the basic technical aspects of exoskeleton design and control, while the context in which these exoskeletons should function is given little or no priority even though it poses important technical requirements. We considered all sources of design requirements, from the basic technical functions to the real-world practical application. The EXOTIC features: (1) a compact, safe, wheelchair-mountable exoskeleton that is easy to don and doff and capable of facilitating multiple highly desired activities of daily living for individuals with tetraplegia; (2) a semi-automated computer vision guidance system that can be enabled by the user when relevant; (3) a tongue control interface allowing full, volitional, and continuous control over all possible motions of the exoskeleton. The EXOTIC was tested on ten able-bodied individuals and three users with tetraplegia caused by spinal cord injury. During the tests, the EXOTIC succeeded in fully assisting tasks such as drinking and picking up snacks, even for users with complete functional tetraplegia and the need for a ventilator. The users confirmed the usability of the EXOTIC.


Subject(s)
Exoskeleton Device, Activities of Daily Living, Humans, Psychological Power, Quadriplegia, Tongue, Upper Extremity
10.
J Biomech; 139: 111137, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35594818

ABSTRACT

This study addresses the feasibility of underactuated arm exoskeletons as an alternative to the often bulky and heavy exoskeletons that actuate the shoulder with 3 DoF. Specifically, the study investigates how the wrist and elbow joints adapt their kinematics when shoulder abduction is constrained. Ten healthy participants conducted three different grasping activities of daily living during natural motion and during constrained shoulder abduction at two fixed angles: the resting position and 10° of abduction from the resting position. Motion capture data were collected and used as input for a musculoskeletal computer model adapted to this study. Statistical parametric mapping tools were employed to analyze the joint angles estimated by the model. The results show significant differences in the joint angles when shoulder abduction is constrained. During restricted shoulder abduction, the wrist flexion angle deviated by up to 13.6° and the elbow pronation angle decreased by 8.7° on average throughout the movement compared with natural motion. Thus, the shoulder could be underactuated and the participants could still accomplish the activities of daily living, with changes in the wrist and elbow joint kinematics.


Subject(s)
Elbow Joint, Exoskeleton Device, Shoulder Joint, Activities of Daily Living, Arm, Biomechanical Phenomena, Elbow, Humans, Movement, Articular Range of Motion, Wrist
11.
Appl Ergon; 96: 103479, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34126571

ABSTRACT

The aim of this study was to explore the ergonomic challenges, needs, and reservations related to robot-assisted ultrasound for obstetric sonographers and thereby provide information for the design of robotic solutions. A mixed-methods design was used, in which data were collected from obstetric sonographers and their immediate managers at 18 of Denmark's 20 obstetric departments. The data were collected through a survey and interviews. 98.1% of the obstetric sonographers experienced ache, pain, or discomfort related to scans. The most frequent causes of the sonographers' ergonomic challenges were the patients' physique (93.52%) and the need to obtain good image quality (83.33%). These are parameters the obstetric sonographers cannot control, and they require a solution that ergonomically supports the sonographers in these situations. All of the interviewed obstetric sonographers (n = 8) and immediate managers (n = 3) stated that they were interested in testing a solution based on robot-assisted ultrasound.


Subject(s)
Robotics, Allied Health Personnel, Ergonomics, Female, Humans, Pregnancy, Surveys and Questionnaires, Ultrasonography
12.
Front Neurosci; 15: 739279, 2021.
Article in English | MEDLINE | ID: mdl-34975367

ABSTRACT

Spinal cord injury can leave the affected individual severely disabled with a low level of independence and quality of life. Assistive upper-limb exoskeletons are one of the solutions that can enable an individual with tetraplegia (paralysis in both arms and legs) to perform simple activities of daily living by mobilizing the arm. Providing an efficient user interface that can provide full continuous control of such a device-safely and intuitively-with multiple degrees of freedom (DOFs) still remains a challenge. In this study, a control interface for an assistive upper-limb exoskeleton with five DOFs based on an intraoral tongue-computer interface (ITCI) for individuals with tetraplegia was proposed. Furthermore, we evaluated eyes-free use of the ITCI for the first time and compared two tongue-operated control methods, one based on tongue gestures and the other based on dynamic virtual buttons and a joystick-like control. Ten able-bodied participants tongue controlled the exoskeleton for a drinking task with and without visual feedback on a screen in three experimental sessions. As a baseline, the participants performed the drinking task with a standard gamepad. The results showed that it was possible to control the exoskeleton with the tongue even without visual feedback and to perform the drinking task at 65.1% of the speed of the gamepad. In a clinical case study, an individual with tetraplegia further succeeded to fully control the exoskeleton and perform the drinking task only 5.6% slower than the able-bodied group. This study demonstrated the first single-modal control interface that can enable individuals with complete tetraplegia to fully and continuously control a five-DOF upper limb exoskeleton and perform a drinking task after only 2 h of training. The interface was used both with and without visual feedback.
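The exact control layouts are not given in the abstract; below is a minimal sketch of the joystick-like idea, mapping a 2D tongue contact position on the ITCI pad to a proportional velocity command for the currently selected exoskeleton motion. The pad geometry, dead zone, and gains are assumptions for illustration.

```python
import numpy as np

def tongue_joystick(contact_xy, centre_xy=(0.0, 0.0), max_offset=10.0,
                    dead_zone=0.2, max_velocity=0.05):
    """Convert a tongue contact point on the sensor pad (mm) into a
    proportional 2D velocity command (m/s), like an analog joystick."""
    offset = (np.asarray(contact_xy) - np.asarray(centre_xy)) / max_offset
    magnitude = np.linalg.norm(offset)
    if magnitude < dead_zone:          # ignore small, unintentional contacts
        return np.zeros(2)
    direction = offset / magnitude
    speed = max_velocity * min((magnitude - dead_zone) / (1.0 - dead_zone), 1.0)
    return speed * direction

print(tongue_joystick((6.0, -3.0)))   # command scaled by the contact offset
```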

13.
Disabil Rehabil Assist Technol; 15(7): 731-745, 2020 Oct.
Article in English | MEDLINE | ID: mdl-31268368

ABSTRACT

Purpose: The advances in artificial intelligence have started to reach a level where autonomous systems are becoming increasingly popular as a way to aid people in their everyday life. Such intelligent systems may be especially beneficial for people struggling to complete common everyday tasks, such as individuals with movement-related disabilities. The focus of this paper is hence to review recent work on using computer vision for semi-autonomous control of assistive robotic manipulators (ARMs). Methods: Four databases were searched using a block search, yielding 257 papers, which were reduced to 14 papers after applying various filtering criteria. Each paper was reviewed with a focus on the hardware used, the autonomous behaviour achieved using computer vision, and the scheme for semi-autonomous control of the system. Each of the reviewed systems was also characterized by grading its level of autonomy on a pre-defined scale. Conclusions: A recurring issue in the reviewed systems was the inability to handle arbitrary objects. This makes the systems unlikely to perform well outside a controlled environment, such as a lab. This issue could be addressed by having the systems recognize good grasping points or primitive shapes instead of specific pre-defined objects. Most of the reviewed systems also used a rather simple strategy for semi-autonomous control, switching between full manual control and full automatic control. An alternative could be a control scheme relying on adaptive blending, which could provide a more seamless experience for the user. Implications for rehabilitation: Assistive robotic manipulators (ARMs) have the potential to empower individuals with disabilities by enabling them to complete common everyday tasks. This potential can be further enhanced by making the ARM semi-autonomous in order to actively aid the user. The scheme used for the semi-autonomous control of the ARM is crucial, as it may be a hindrance if done incorrectly. In particular, the ability to customize the semi-autonomous behaviour of the ARM is found to be important. Further research is needed to make the final move from the lab to the homes of the users. Most of the reviewed systems suffer from a rather fixed scheme for semi-autonomous control and an inability to handle arbitrary objects.
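The review suggests adaptive blending as an alternative to hard switching but does not specify a formulation; a minimal sketch of one common formulation follows, where the executed command is a confidence-weighted mix of the user's input and the autonomous controller's output. The confidence model and example values are assumptions.

```python
import numpy as np

def blended_command(u_user, u_auto, confidence):
    """Adaptive blending: weight the autonomous command by how confident the
    system is in its prediction of the user's intent (0 = pure manual,
    1 = pure autonomous)."""
    alpha = float(np.clip(confidence, 0.0, 1.0))
    return alpha * np.asarray(u_auto) + (1.0 - alpha) * np.asarray(u_user)

# Example: as the predicted grasp target becomes more certain, the arm follows
# the autonomous approach trajectory more closely.
u_user = np.array([0.05, 0.00, 0.00])   # user's joystick-style velocity command
u_auto = np.array([0.03, 0.02, -0.01])  # planner's velocity towards the target
for conf in (0.1, 0.5, 0.9):
    print(conf, blended_command(u_user, u_auto, conf))
```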


Subject(s)
Artificial Intelligence, Automation, Disabled Persons/rehabilitation, Exoskeleton Device, Robotics, Self-Help Devices, Activities of Daily Living, Humans
14.
IEEE Int Conf Rehabil Robot; 2019: 1043-1048, 2019 Jun.
Article in English | MEDLINE | ID: mdl-31374767

ABSTRACT

Assistive robotic arms have shown the potential to improve the quality of life of people with severe disabilities. However, a high-performance and intuitive control interface for robots with 6-7 DOFs is still missing for these individuals. An inductive tongue computer interface (ITCI) was recently tested for control of robots, and that study illustrated the potential of the approach. This paper investigates the possibility of developing a high-performance, joystick-like tongue controller for robots through two studies. The first compared different methods for mapping the 18 sensor signals to a 2D coordinate, as on a touchpad. The second evaluated the performance of a novel approach for emulating an analog joystick with the ITCI, based on the ISO 9241-411 standard. Two subjects performed a multi-directional tapping test using a standard analog joystick, the ITCI system held in one hand and operated by the other, and finally the ITCI mounted inside the mouth and operated by the tongue. Throughput was measured as the evaluation parameter. The results show that contact on the touchpads can be localized with approximately 1 mm accuracy. The effective throughput of the ITCI system in the multi-directional tapping test was 2.03 bps when held in the hand and 1.31 bps when used inside the mouth.
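A hedged sketch of how throughput is typically computed for the ISO 9241-411 multi-directional tapping test follows; the effective-width formulation below is the commonly used one, but the paper's exact analysis is not given, and the sample numbers are made up.

```python
import numpy as np

def fitts_throughput(distances, movement_times, endpoints_sd):
    """Throughput (bits/s) using effective target width, as commonly done
    for the ISO 9241-411 multi-directional tapping task.

    distances: nominal target distance per condition
    movement_times: mean movement time per condition (s)
    endpoints_sd: standard deviation of selection endpoints per condition
    """
    we = 4.133 * np.asarray(endpoints_sd)              # effective target width
    ide = np.log2(np.asarray(distances) / we + 1.0)    # effective index of difficulty (bits)
    return float(np.mean(ide / np.asarray(movement_times)))

# Made-up example values for one device condition
print(fitts_throughput(distances=[80, 80, 120],
                       movement_times=[0.9, 0.8, 1.1],
                       endpoints_sd=[3.0, 4.0, 3.5]))
```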


Subject(s)
Disabled Persons/rehabilitation, Robotics, Tongue, Algorithms, Equipment Design, Female, Humans, Male, Quality of Life, Self-Help Devices, User-Computer Interface
15.
J Neuroeng Rehabil; 14(1): 110, 2017 Nov 6.
Article in English | MEDLINE | ID: mdl-29110736

ABSTRACT

BACKGROUND: For an individual with tetraplegia, assistive robotic arms provide a potentially invaluable opportunity for rehabilitation. However, there is a lack of available control methods that allow these individuals to fully control the assistive arms. METHODS: Here we show that it is possible for an individual with tetraplegia to use the tongue to fully control all 14 movements of an assistive robotic arm in three-dimensional space using a wireless intraoral control system, thus allowing for numerous activities of daily living. We developed a tongue-based robotic control method incorporating a multi-sensor inductive tongue interface. One able-bodied individual and one individual with tetraplegia performed a proof-of-concept study by controlling the robot with their tongue using direct actuator control and endpoint control, respectively. RESULTS: After 30 min of training, the able-bodied participant tongue-controlled the assistive robot to pick up a roll of tape in 80% of the attempts. Further, the individual with tetraplegia succeeded in fully tongue-controlling the assistive robot to reach for and touch a roll of tape in 100% of the attempts and to pick up the roll in 50% of the attempts. Furthermore, she controlled the robot to grasp a bottle of water and pour its contents into a cup; her first functional action in 19 years. CONCLUSION: To our knowledge, this is the first time that an individual with tetraplegia has been able to fully control an assistive robotic arm using a wireless intraoral tongue interface. The tongue interface used to control the robot is currently available for control of computers and of powered wheelchairs, and the robot employed in this study is also commercially available. Therefore, the presented results may translate into available solutions within a reasonable time.


Subject(s)
Quadriplegia/rehabilitation, Robotics, Self-Help Devices, Tongue/physiology, Wireless Technology, Activities of Daily Living, Adult, Arm, Equipment Design, Female, Fingers, Hand Strength, Humans, Middle Aged, User-Computer Interface
16.
IEEE Trans Neural Syst Rehabil Eng; 25(11): 2094-2104, 2017 Nov.
Article in English | MEDLINE | ID: mdl-28541213

ABSTRACT

For severely paralyzed individuals, alternative computer interfaces are becoming increasingly essential for everyday life, as social and vocational activities are facilitated by information technology and as the environment becomes more automated and remotely controllable. Tongue computer interfaces have proven desirable to users, partly due to their high degree of aesthetic acceptability, but so far the mature systems have shown a relatively low error-free text typing efficiency. This paper evaluated the intra-oral inductive tongue computer interface (ITCI) in its intended use: error-free text typing in a generally available text editing system, Word. Individuals with tetraplegia and able-bodied individuals used the ITCI for typing through a MATLAB interface and for Word typing over 4 to 5 experimental days. The results showed an average error-free text typing rate in Word of 11.6 correct characters/min across all participants and of 15.5 correct characters/min for participants familiar with tongue piercings. Improvements in typing rates between the sessions suggest that typing rates can be improved further through long-term use of the ITCI.
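As a small illustration of the error-free typing metric (correct characters per minute), the sketch below scores a typed string against a target string; the scoring rule and example data are assumptions, not the authors' analysis script.

```python
def error_free_rate(target_text, typed_text, duration_s):
    """Correct characters per minute: count positions where the typed text
    matches the target, ignoring anything typed beyond the target length."""
    correct = sum(1 for a, b in zip(target_text, typed_text) if a == b)
    return 60.0 * correct / duration_s

# Assumed example: 35 correct characters typed in 3 minutes -> ~11.7 characters/min
print(error_free_rate("the quick brown fox jumps over the lazy dog"[:35],
                      "the quick brown fox jumps over the l",
                      duration_s=180))
```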


Subject(s)
Communication Aids for Disabled, Quadriplegia/rehabilitation, Tongue, User-Computer Interface, Adult, Algorithms, Disabled Persons, Equipment Design, Female, Healthy Volunteers, Humans, Male, Middle Aged, Psychomotor Performance, Tongue/surgery
17.
Assist Technol; 28(1): 22-29, 2016.
Article in English | MEDLINE | ID: mdl-26479838

ABSTRACT

This study compares the time required to activate a grasp or function of a hand prosthesis when using an electromyogram (EMG) based control scheme and when using a control scheme combining EMG with control signals from an inductive tongue control system (ITCS). Using a cross-over study design, 10 able-bodied subjects used a computer model of a hand and completed simulated grasping exercises. The time required to activate grasps was recorded and analyzed for both control schemes. End-session mean activation times (ATs; seconds) for the EMG control scheme, grasps 1-5, were 0.80, 1.51, 1.95, 2.93, and 3.42; for the ITCS control scheme, grasps 1-5, they were 1.19, 1.89, 1.75, 2.26, and 1.80. The difference in mean AT for grasps 1 and 2 was statistically significant in favor of the EMG control scheme (p = 0.030; p = 0.004). For grasp 3 there was no statistically significant difference, and for grasps 4 and 5 the difference was statistically significant in favor of the ITCS control scheme (p = 0.048; p = 0.004). Based on the amount of training and the achieved level of performance, it is concluded that the proposed ITCS control scheme can be used as a means of enhancing prosthesis control.
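The abstract reports p-values for paired comparisons but not the statistical test used; purely as an illustration (the data below are made up, and the choice of a paired t-test is an assumption, not the authors' analysis), a per-grasp comparison in a cross-over design could look like:

```python
import numpy as np
from scipy import stats

# Assumed, made-up end-session activation times (s) for one grasp,
# one value per subject under each control scheme (cross-over design).
at_emg  = np.array([0.70, 0.90, 0.80, 0.75, 0.85, 0.80, 0.78, 0.90, 0.72, 0.80])
at_itcs = np.array([1.10, 1.30, 1.20, 1.15, 1.25, 1.10, 1.20, 1.30, 1.05, 1.25])

t, p = stats.ttest_rel(at_emg, at_itcs)   # paired test across the 10 subjects
print(f"t = {t:.2f}, p = {p:.4f}")
```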


Subject(s)
Artificial Limbs, Electromyography/methods, Hand/physiology, Self-Help Devices, Tongue/physiology, Adult, Cross-Over Studies, Electromyography/instrumentation, Female, Hand Strength, Humans, Male, Prosthesis Design, Software
18.
Article in English | MEDLINE | ID: mdl-24111084

ABSTRACT

Two tetraplegic subjects performed typing tasks on a computer in an experiment using a tongue-controlled oral interface. This paper reports the mapping of sensor activation times for full-alphabet text input using 10 inductive sensors. A small cylindrical piece of soft ferromagnetic material activated the sensors when placed at, or glided along, the surface of a sensor. The activation unit was attached to the tongue as the upper ball of a piercing. The tasks consisted of typing characters according to ordered (rows and columns) or random test strings within 30 seconds, with and without deleting characters typed by mistake. Visual feedback assisted the subjects in performing the typing tasks. Average activation times were 0.82 ± 0.38 and 1.06 ± 0.27 seconds for the two subjects, respectively. Analysis of activation times may be useful for characterizing the tongue's ability to activate the interface as well as for optimizing the layout of the sensors.


Subject(s)
Communication Aids for Disabled, Computers, Task Performance and Analysis, Tongue/physiology, User-Computer Interface, Humans, Time Factors
19.
Disabil Rehabil Assist Technol; 8(4): 330-339, 2013 Jul.
Article in English | MEDLINE | ID: mdl-22779705

ABSTRACT

PURPOSE: To investigate the effects of visual and tactile intra-oral sensor-position feedback on target selection tasks performed with the tip of the tongue. METHOD: Target selection tasks were performed using an inductive tongue-computer interface (ITCI). Visual feedback was established by highlighting the area on a visual display corresponding to the activated intra-oral target. Tactile feedback was established using a sensor-border matrix over the sensor plates of the ITCI, which provided sensor-position tactile cues via the user's tongue. Target selection tasks using an on-screen keyboard, controlling the mouse pointer with the ITCI, were also evaluated. RESULTS: Mean target selection rates of 23, 5, and 15 activations per minute were obtained using the visual, tactile, and no-feedback techniques, respectively, in the 3rd training session. On-screen keyboard target selection tasks averaged 10 activations per minute in the 3rd training session. Involuntary activations while speaking or drinking were significantly reduced either by the sensor-border matrix or by a dwell time for sensor activation. CONCLUSIONS: These results provide key design considerations for further increasing the typing efficiency of tongue-computer interfaces for individuals with upper-limb mobility impairments.
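The abstract mentions a dwell time for sensor activation as one way to suppress involuntary activations while speaking or drinking; a minimal sketch of such a filter follows, where the dwell duration, sample period, and stream format are assumed values for illustration.

```python
def dwell_filter(samples, dwell_time=0.3, sample_period=0.03):
    """Yield a sensor activation only after the same sensor has been touched
    continuously for at least `dwell_time` seconds; brief contacts, e.g.
    while speaking or drinking, are ignored."""
    required = int(dwell_time / sample_period)
    current, count = None, 0
    for sensor in samples:              # sensor id touched in each sample, or None
        if sensor is not None and sensor == current:
            count += 1
            if count == required:
                yield sensor            # fire once per sustained contact
        else:
            current, count = sensor, (1 if sensor is not None else 0)

# Brief taps on sensor 3 are filtered out; the sustained contact on 7 activates.
stream = [3, 3, None, 3, None] + [7] * 12 + [None]
print(list(dwell_filter(stream)))       # -> [7]
```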


Subject(s)
Communication Aids for Disabled, Tongue, User-Computer Interface, Adult, Disabled Persons, Sensory Feedback, Humans
20.
Article in English | MEDLINE | ID: mdl-23366638

ABSTRACT

An inductive pointing device was designed and implemented successfully in a tongue-controlled oral interface. The sensors were manufactured as an assembly of multilayer coils, using printed circuit board (PCB) technology, on two pads. The sensor pads were encapsulated together with the electronics and battery in a mouthpiece placed in the upper palate of the oral cavity. The PCB technology allowed surface activation of one or more sensors by gliding a small cylindrical unit attached to the tongue over the surface of the coil assembly. The model consisted of 8 sensors and allowed real-time proportional control of both speed and direction, similar to a joystick. However, the size of the oral cavity, the number and geometry of the coil loops, and the characteristics of the activation unit impose limits on the sensor design and call for an alternative layout. Two alternative sensor designs are proposed in this paper, aiming to reduce the size of the sensor pad by one third, extend the target group to include children, and improve the ease of wearing the oral interface.
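A hedged sketch of how 8 sensor activations could be fused into a joystick-like speed and direction command follows; the ring layout of the sensors, the normalisation, and the gain are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def pointing_command(activations, gain=1.0):
    """Fuse 8 inductive-sensor activation levels, assumed to be laid out in a
    ring, into a 2D pointing vector: direction from the activation-weighted sum
    of the sensor positions, speed from the total activation strength."""
    activations = np.asarray(activations, dtype=float)
    angles = np.arange(8) * 2.0 * np.pi / 8.0          # assumed ring layout
    directions = np.column_stack([np.cos(angles), np.sin(angles)])
    total = activations.sum()
    if total <= 0:
        return np.zeros(2)
    direction = activations @ directions / total       # activation-weighted direction
    speed = gain * min(total, 1.0)                      # saturate the speed command
    return speed * direction

# Strong contact between sensors 1 and 2 -> pointer moves up and to the right
print(pointing_command([0.0, 0.6, 0.4, 0, 0, 0, 0, 0]))
```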


Subject(s)
Equipment Design, Tongue, User-Computer Interface, Humans