Results 1 - 9 of 9
1.
Front Robot AI; 10: 1122914, 2023.
Article in English | MEDLINE | ID: mdl-37771605

ABSTRACT

Abdominal palpation is a basic but important physical examination method used by physicians. Visual, auditory, and haptic feedback from the patient are known to be the main sources of information physicians use in diagnosis. However, learning to interpret this feedback and make accurate diagnoses requires several years of training. Many abdominal palpation training simulators have been proposed to date, but very few attempts have been reported to integrate vocal pain expressions into physical abdominal palpation simulators. Here, we present a vocal pain expression augmentation for a robopatient. The proposed robopatient provides real-time facial and vocal pain expressions based on the force and position of palpation exerted on its abdominal phantom. A pilot study was conducted to test the proposed system, and we show the potential of integrating vocal pain expressions into the robopatient. The platform was also tested by two clinical experts with prior experience in abdominal palpation; their evaluations of its functionality and suggestions for improvement are presented. We highlight the advantages of the proposed robopatient, with real-time vocal and facial pain expressions, as a controllable simulator platform for abdominal palpation training studies. Finally, we discuss the limitations of the proposed approach and suggest several future directions for improvement.
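
For illustration, a minimal sketch of the kind of force/position-to-expression mapping such a robopatient might use is given below, assuming a single tender region on the phantom and a small set of discrete pain levels. The constants and names (PAIN_SITE, trigger_expression) are invented for this sketch and are not the authors' implementation.

```python
# Minimal sketch: map a palpation sample (force, position) to a discrete pain
# level and a facial/vocal response. All thresholds and names are assumptions.
import math

PAIN_SITE = (0.20, 0.35)            # hypothetical (x, y) of the tender region, metres
SITE_RADIUS = 0.05                  # palpation closer than this is "on" the tender spot
FORCE_THRESHOLDS = [2.0, 6.0, 12.0] # Newtons -> none / mild / moderate / severe

def pain_level(force_n: float, pos_xy: tuple[float, float]) -> int:
    """Return a discrete pain level 0-3 for one palpation sample."""
    dist = math.dist(pos_xy, PAIN_SITE)
    if dist > SITE_RADIUS:
        return 0  # pressing away from the tender region: no pain response
    # Attenuate perceived force towards the edge of the tender region.
    effective = force_n * (1.0 - dist / SITE_RADIUS)
    return sum(effective > t for t in FORCE_THRESHOLDS)

def trigger_expression(level: int) -> None:
    """Placeholder for driving the robopatient's face and voice in real time."""
    vocalisations = ["", "mild groan", "sharp 'ouch'", "loud cry"]
    print(f"face: pain intensity {level}/3, voice: {vocalisations[level] or 'silent'}")

# Example: a firm press directly on the tender spot.
trigger_expression(pain_level(8.0, (0.21, 0.34)))
```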

2.
IEEE Rev Biomed Eng; 16: 514-529, 2023.
Article in English | MEDLINE | ID: mdl-35439140

ABSTRACT

Tissue examination by hand remains an essential technique in clinical practice. Its effective application depends on skills in sensorimotor coordination, mainly involving haptic, visual, and auditory feedback. The skills clinicians have to learn can be as subtle as regulating finger pressure with breathing, choosing a palpation action, monitoring involuntary facial and vocal expressions in response to palpation, and using pain expressions both as a source of information and as a constraint on the physical examination. Patient simulators can provide a safe learning platform for novice physicians before they examine real patients. This paper is the first to review state-of-the-art medical training simulators with a focus on providing multimodal feedback for learning as many manual examination techniques as possible. The study summarizes current advances in tissue examination training devices that simulate different medical conditions and provide different types of feedback modalities. Opportunities in the development of pain expression, tissue modeling, actuation, and sensing are also analyzed to support the future design of effective tissue examination simulators.


Subjects
Robotic Surgical Procedures, Robotics, Humans, Sensory Feedback, Feedback, Palpation/methods, Computer Simulation
3.
Sci Rep; 12(1): 4200, 2022 Mar 10.
Article in English | MEDLINE | ID: mdl-35273296

ABSTRACT

Medical training simulators can provide a safe and controlled environment for medical students to practice their physical examination skills. An important source of information for physicians is the visual feedback of involuntary pain facial expressions in response to physical palpation of an affected area of a patient. However, most existing robotic medical training simulators that can capture physical examination behaviours in real time cannot display facial expressions, and they comprise only a limited range of patient identities in terms of ethnicity and gender. Together, these limitations restrict the utility of medical training simulators: they do not provide medical students with a representative sample of pain facial expressions and face identities, which could result in biased practices, and they cannot be used to detect and correct early signs of bias in medical training. Here, for the first time, we present a robotic system that can simulate facial expressions of pain in response to palpations, displayed on a range of patient face identities. We take the unique approach of modelling dynamic pain facial expressions using a data-driven, perception-based psychophysical method combined with the visuo-haptic inputs of users performing palpations on a robotic medical simulator. Specifically, participants performed palpation actions on the abdominal phantom of a simulated patient, which triggered the real-time display of six pain-related facial Action Units (AUs) on a robotic face (MorphFace), each controlled by two pseudo-randomly generated transient parameters: rate of change and activation delay. Participants then rated the appropriateness of the facial expression displayed in response to their palpations on a 4-point scale from "strongly disagree" to "strongly agree". Each of the 16 participants (4 Asian females, 4 Asian males, 4 White females, and 4 White males) performed 200 palpation trials on 4 patient identities (Black female, Black male, White female, and White male) simulated using MorphFace. Results showed that the facial expressions rated most appropriate by all participants comprised a higher rate of change and a shorter delay from upper-face AUs (around the eyes) to those in the lower face (around the mouth). In contrast, the transient parameter values of the most appropriately rated pain facial expressions, the palpation forces, and the delays between palpation actions varied across participant-simulated patient pairs according to gender and ethnicity. These findings suggest that gender and ethnicity biases affect palpation strategies and the perception of pain facial expressions displayed on MorphFace. We anticipate that our approach will be used to generate physical examination models with diverse patient demographics to reduce erroneous judgments in medical students, and to provide focused training to address these errors.
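
A rough sketch of how the two transient parameters named above (rate of change and activation delay) could drive an AU activation curve is given below. The exponential onset, the parameter ranges, and the particular AU labels are assumptions for illustration, not the published MorphFace model.

```python
# Sketch: one Action Unit's activation as a function of time, shaped by a rate
# of change and an activation delay, with parameters drawn pseudo-randomly per
# AU as described in the abstract. Ranges and AU labels are assumptions.
import math
import random

def au_activation(t: float, rate: float, delay: float) -> float:
    """AU intensity in [0, 1] at time t (seconds) after palpation onset."""
    if t <= delay:
        return 0.0
    # Smooth rise whose steepness is set by `rate` (1/s).
    return 1.0 - math.exp(-rate * (t - delay))

# Six pain-related AUs (labels assumed), each with its own transient parameters.
pain_aus = ["AU4", "AU6", "AU7", "AU9", "AU10", "AU43"]
params = {au: (random.uniform(2.0, 8.0),   # rate of change (1/s)
               random.uniform(0.0, 0.5))   # activation delay (s)
          for au in pain_aus}

# Render a few frames of the resulting expression.
for t in (0.1, 0.3, 0.6):
    frame = {au: round(au_activation(t, *p), 2) for au, p in params.items()}
    print(f"t={t:.1f}s  {frame}")
```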


Subjects
Robotic Surgical Procedures, Robotics, Facial Expression, Female, Humans, Male, Pain, Palpation
4.
Front Robot AI; 8: 730946, 2021.
Article in English | MEDLINE | ID: mdl-34738017

ABSTRACT

Communication delay represents a fundamental challenge in telerobotics: on the one hand, it compromises the stability of teleoperated robots; on the other, it decreases the user's awareness of the designated task. In the scientific literature, this problem has been addressed with both statistical models and neural networks (NNs) that perform sensor prediction while keeping the user in full control of the robot's motion. We propose shared control as a tool to compensate for and mitigate the effects of communication delay. Shared control has been shown to enhance precision and speed in reaching and manipulation tasks, especially in the medical and surgical fields. We analyse the effects of added delay and propose a unilateral teleoperated leader-follower architecture that implements both a predictive system and shared control, in a one-dimensional reaching and recognition task with haptic sensing. We propose four control modalities of increasing autonomy: non-predictive human control (HC), predictive human control (PHC), (shared) predictive human-robot control (PHRC), and predictive robot control (PRC). When analysing how the added delay affects the subjects' performance, the results show that HC is very sensitive to the delay: users are unable to stop at the desired position, and trajectories exhibit wide oscillations. The degree of autonomy introduced is shown to be effective in decreasing the total time required to accomplish the task. Furthermore, we provide an in-depth analysis of environmental interaction forces and performed trajectories. Overall, the shared control modality, PHRC, represents a good trade-off, with peak performance in accuracy and task time, good reaching speed, and moderate contact with the object of interest.
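
The sketch below illustrates, under strong simplifications, the two ingredients combined in such an architecture: a predictor that extrapolates the delayed follower state, and a shared-control law that blends the human and autonomous commands. The constant-velocity predictor and the fixed blending weight are assumptions made for this sketch, not the paper's controllers.

```python
# Sketch: delay compensation by state prediction plus shared control in a 1-D task.
def predict_position(last_pos: float, last_vel: float, delay_s: float) -> float:
    """Constant-velocity extrapolation of the delayed 1-D follower position."""
    return last_pos + last_vel * delay_s

def shared_command(u_human: float, u_robot: float, autonomy: float) -> float:
    """Blend operator and autonomous velocity commands; autonomy in [0, 1].
    autonomy=0 -> pure human control (HC), autonomy=1 -> pure robot control (PRC)."""
    return autonomy * u_robot + (1.0 - autonomy) * u_human

# Example: 300 ms delay; the operator commands 0.05 m/s while an autonomous
# controller (informed by haptic sensing) commands a slower approach of 0.02 m/s.
pos_est = predict_position(last_pos=0.40, last_vel=0.05, delay_s=0.3)
u = shared_command(u_human=0.05, u_robot=0.02, autonomy=0.5)
print(f"predicted position: {pos_est:.3f} m, blended command: {u:.3f} m/s")
```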

7.
Annu Int Conf IEEE Eng Med Biol Soc; 2017: 909-912, 2017 Jul.
Article in English | MEDLINE | ID: mdl-29060020

ABSTRACT

In this paper, an anthropometric, active prosthetic hand named UOMPro (University of Moratuwa Prosthetic) is proposed. The UOMPro hand was developed during research on affordable hand prostheses, intended mainly for people in developing countries for whom high-cost, state-of-the-art commercial hand prostheses may be unaffordable. The proposed hand is built at an affordable cost (< 850 USD) and has 6 degrees of freedom (DOF), covering flexion/extension of the five fingers and abduction/adduction of the thumb. The underactuated fingers are fabricated from a combination of 3D-printed parts and CNC-machined aluminum, which addresses the drawbacks of fully 3D-printed hands. All components of the electronic control circuit responsible for low-level control are housed inside the hand, and a simple serial communication interface is provided to link with high-level control methods. The implemented low-level controller can communicate with a high-level controller that sends either individual finger position commands or hand grip pattern commands. A set of experiments is conducted to validate the performance of the overall system, and the results are presented together with potential future directions.
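
The abstract does not specify the serial protocol, but a hypothetical frame format for the two command types it mentions (individual finger positions and grip patterns) might look like the sketch below. The opcodes, value ranges, and checksum are invented for illustration.

```python
# Sketch of a hypothetical serial command interface for a 6-DOF prosthetic hand.
def finger_position_frame(finger_id: int, position: int) -> bytes:
    """Command one DOF (0-4 finger flexion/extension, 5 = thumb abduction) to 0-180."""
    assert 0 <= finger_id <= 5 and 0 <= position <= 180
    payload = bytes([ord('P'), finger_id, position])
    checksum = sum(payload) & 0xFF
    return payload + bytes([checksum])

def grip_pattern_frame(pattern_id: int) -> bytes:
    """Command a whole-hand grip pattern (e.g. 0 = open, 1 = power, 2 = pinch)."""
    payload = bytes([ord('G'), pattern_id])
    return payload + bytes([sum(payload) & 0xFF])

# A high-level controller would write such frames to the hand's serial port,
# e.g. with pyserial: serial.Serial("/dev/ttyUSB0", 115200).write(frame)
print(finger_position_frame(finger_id=1, position=90).hex(" "))
print(grip_pattern_frame(pattern_id=1).hex(" "))
```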


Subjects
Hand, Artificial Limbs, Fingers, Hand Strength, Humans, Movement, Prostheses and Implants, Prosthesis Design
9.
Article in English | MEDLINE | ID: mdl-24111343

ABSTRACT

Estimating the correct motion intention of the user is very important for most electromyography (EMG)-based control applications, such as prosthetics, power-assist exoskeletons, rehabilitation, and teleoperation robots. At the same time, safety and long-term reliability are vital for these applications because they interact with human users. With these requirements in mind, many EMG-based control applications have been proposed and developed. However, many challenges remain for EMG-based control systems. One challenge that is commonly overlooked in such systems is muscle fatigue. Muscle fatigue in the user can deteriorate the effectiveness of EMG-based control in the long run, causing it to produce less accurate results. Therefore, in this study we developed a fuzzy rule-based scheme to compensate for the effects of muscle fatigue on EMG-based control. Fuzzy rule-based weights are estimated from time-domain and frequency-domain features of the EMG signals. These weights are then used to modify the controller output according to the fatigue condition of the muscles. The effectiveness of the proposed method is evaluated experimentally.
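
As a rough illustration of the compensation scheme described above, the sketch below computes standard time- and frequency-domain EMG features (RMS amplitude and mean frequency) and derives a correction gain for the controller output. A crisp, hand-tuned blend stands in for the paper's fuzzy rules, and the sampling rate and baseline values are assumptions.

```python
# Sketch: fatigue-aware weighting of an EMG controller output. Fatigue typically
# shifts the EMG spectrum to lower frequencies; the gain tries to restore the
# intended output. Baselines, blend weights, and the crisp rule are placeholders.
import numpy as np

FS = 1000.0            # sampling rate (Hz), assumed
BASELINE_MNF = 95.0    # mean frequency of the rested muscle (Hz), assumed
BASELINE_RMS = 0.10    # RMS amplitude of the rested muscle (mV), assumed

def emg_features(window: np.ndarray) -> tuple[float, float]:
    """Return (RMS amplitude, mean frequency) of one EMG window."""
    rms = float(np.sqrt(np.mean(window ** 2)))
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)
    mnf = float(np.sum(freqs * spectrum) / np.sum(spectrum))
    return rms, mnf

def fatigue_gain(rms: float, mnf: float) -> float:
    """Crisp stand-in for the fuzzy inference: more spectral shift and amplitude
    rise -> more fatigue -> larger corrective gain applied to the controller output."""
    freq_drop = max(0.0, 1.0 - mnf / BASELINE_MNF)        # 0 = fresh, 1 = fully shifted
    amp_rise = max(0.0, rms / BASELINE_RMS - 1.0)
    fatigue = min(1.0, 0.7 * freq_drop + 0.3 * amp_rise)  # hand-tuned blend
    return 1.0 + fatigue

# Example on synthetic data: a 77 Hz tone standing in for a fatigued spectrum.
t = np.arange(0, 0.25, 1.0 / FS)
window = 0.15 * np.sin(2 * np.pi * 77.0 * t)
rms, mnf = emg_features(window)
print(f"RMS={rms:.3f} mV, MNF={mnf:.1f} Hz, gain={fatigue_gain(rms, mnf):.2f}")
```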


Subjects
Electromyography/methods, Fuzzy Logic, Muscle Fatigue/physiology, Adult, Biomechanical Phenomena, Electrodes, Humans, Male, Motion, Muscles/physiology, Young Adult