Results 1 - 5 of 5
1.
3D Print Addit Manuf ; 10(5): 917-929, 2023 Oct 01.
Article in English | MEDLINE | ID: mdl-37886417

ABSTRACT

Single-step 3D printing, which can manufacture complicated designs without assembly, has the potential to change how we design and produce 3D-printed products: rather than printing static components, ready-to-use movable mechanisms become a reality. Existing 3D printing solutions are limited in precision and cannot directly produce tightly mated moving surfaces, so joints must be designed with a sufficient gap between components, resulting in joints and other mechanisms with imprecise motion. In this study, we propose a bio-inspired printable joint and apply it to a Single sTep 3D-printed Prosthetic hand (ST3P hand). We mimic the anatomical structure of the human finger joint and implement a cam effect that changes the distance between the contact surfaces through elastic bending of the ligaments as the joint flexes. This bio-inspired design allows the joint to be single-step 3D printed while providing precise motion. The joint makes it possible for the ST3P hand to be designed as a lightweight (∼255 g), low-cost (∼$500) monolithic structure with nine finger joints and manufactured via single-step 3D printing. The ST3P hand takes ∼6 min to assemble, approximately one-tenth the assembly time of open-source 3D-printed prostheses. The hand can perform basic tasks of activities of daily living, providing a pulling force of 48 N and a grasp strength of 20 N. The simple manufacturing of the ST3P hand could take us one step closer to fully customized robotic prosthetic hands produced with low cost and effort.

2.
PLoS One ; 17(11): e0276669, 2022.
Article in English | MEDLINE | ID: mdl-36441716

ABSTRACT

Input-shaping control has received considerable research attention for suppressing residual vibrations. Although numerous studies have been conducted on designing input shapers with arbitrary robustness to modeling errors, no studies have focused on the design of input shapers with arbitrarily specified shaping times. In this study, a specified-duration (SD) shaper, which is an input shaper with an arbitrarily specified shaping time, and a systematic method to design an SD shaper using impulse vectors are proposed. As the specified shaping time increases, the SD shaper increases the number of impulses one by one according to the number of added derivative constraints, thereby improving robustness to modeling errors. The performance of the SD shaper was evaluated for a second-order system through computer simulations. The simulation results revealed that the SD shaper suppresses residual vibrations of the vibratory system at the specified shaping time. The validity of the SD shaper was experimentally verified using a horizontal beam vibration apparatus. The results of this study provide insight into the development of vibration suppression strategies with input shaping control.


Subject(s)
Records, Vibration, Humans, Disease Progression, Computer Simulation
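
The abstract above does not give the SD shaper's design equations; as a baseline illustration of input-shaping control on a second-order system, the sketch below implements the classical zero-vibration (ZV) two-impulse shaper and convolves it with a step command. The natural frequency, damping ratio, and time step are assumed values for the illustration, not parameters from the study.

```python
# Minimal zero-vibration (ZV) input shaper for a second-order system.
# Illustrative only: this is not the SD shaper from the paper, and
# wn/zeta are assumed values, not quantities from the study.
import numpy as np
from scipy import signal

wn, zeta = 2.0 * np.pi, 0.05         # assumed natural frequency (rad/s) and damping ratio
wd = wn * np.sqrt(1.0 - zeta**2)     # damped natural frequency
K = np.exp(-zeta * np.pi / np.sqrt(1.0 - zeta**2))

# ZV shaper: two impulses that cancel residual vibration at the modeled frequency.
amps = np.array([1.0, K]) / (1.0 + K)
times = np.array([0.0, np.pi / wd])  # second impulse at half the damped period

dt = 1e-3
t = np.arange(0.0, 5.0, dt)
step = np.ones_like(t)               # unshaped step command

# Convolve the command with the impulse sequence to obtain the shaped command.
shaper = np.zeros(int(round(times[-1] / dt)) + 1)
shaper[(times / dt).round().astype(int)] = amps
shaped = np.convolve(step, shaper)[: t.size]

# Simulate the plant with both commands and compare the residual vibration.
plant = signal.lti([wn**2], [1.0, 2.0 * zeta * wn, wn**2])
_, y_unshaped, _ = signal.lsim(plant, step, t)
_, y_shaped, _ = signal.lsim(plant, shaped, t)
print("peak overshoot, unshaped: %.3f" % (y_unshaped.max() - 1.0))
print("peak overshoot, shaped:   %.3f" % (y_shaped.max() - 1.0))
```

Because the impulse amplitudes sum to one, the shaped command still reaches the setpoint; more robust shapers add further impulses to satisfy additional derivative constraints, much as the abstract describes the SD shaper doing as its specified shaping time grows.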
3.
PLoS One ; 16(2): e0246102, 2021.
Article in English | MEDLINE | ID: mdl-33600496

ABSTRACT

Soft robots have been extensively researched due to their flexible, deformable, and adaptive characteristics. However, compared to rigid robots, soft robots are harder to model, calibrate, and control because the innate characteristics of soft materials produce complex, non-linear, and hysteretic behaviors. To overcome these limitations, recent studies have applied various machine learning approaches. This paper reviews existing machine learning techniques in the soft robotics field and categorizes their implementation in different soft robotic applications, including soft sensors, soft actuators, and systems such as soft wearable robots. An analysis of the trends of different machine learning approaches with respect to different types of soft robot applications is presented, along with the current limitations in the research field and a summary of the existing machine learning methods for soft robots.


Subject(s)
Robotics/instrumentation, Equipment Design, Humans, Supervised Machine Learning, Wearable Electronic Devices
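
As a minimal, concrete instance of one pattern this review covers (supervised calibration of a soft sensor whose raw signal is nonlinear and history dependent), the sketch below regresses a short window of past sensor readings onto the ground-truth deformation. The synthetic data, window length, and regressor choice are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: supervised calibration of a hysteretic soft sensor.
# Synthetic data and model choice are illustrative assumptions only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Ground-truth deformation (e.g., a bending angle) driven by a slow random walk.
t = np.arange(0.0, 60.0, 0.01)
deform = np.clip(np.cumsum(rng.normal(0.0, 0.02, t.size)), 0.0, 5.0)

# Toy sensor model: nonlinear gain plus a delayed (hysteresis-like) component and noise.
lag = 50
delayed = np.concatenate([np.full(lag, deform[0]), deform[:-lag]])
sensor = 0.8 * np.tanh(deform) + 0.3 * np.tanh(delayed) + rng.normal(0.0, 0.01, t.size)

# Feature: a window of recent sensor readings, so the model can resolve history dependence.
W = 100
X = np.stack([sensor[i - W:i] for i in range(W, t.size)])
y = deform[W:]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, shuffle=False)
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(X_tr, y_tr)
print("test RMSE: %.4f" % np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2)))
```

Recurrent networks are a common alternative for capturing hysteresis; the windowed regressor is used here only to keep the sketch self-contained.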
4.
Soft Robot ; 6(2): 214-227, 2019 04.
Article in English | MEDLINE | ID: mdl-30566026

ABSTRACT

This article presents Exo-Glove Poly (EGP) II, a soft wearable robot for the hand: a tendon-driven glove constructed entirely of polymer materials for use by people with spinal cord injury (SCI). EGP II can restore the ability to pinch and grasp objects in daily life for people with SCI. The design of the glove allows it to be compact and extends the range of hand sizes it can fit. A passive thumb structure was developed to oppose the thumb for improved grasping. To increase robustness, EGP II was designed with a minimal number of components made from a single material. A kinematic model of the system was used to optimize the design parameters of an antagonistic tendon routing system driven by a single actuator for various hand sizes and repeated actuations. Experiments were conducted on two subjects with SCI to verify the grasping performance of EGP II. EGP II consists of a compact glove and an actuation system that can be placed on a desk or wheelchair, weighing 104 g and 1.14 kg, respectively.


Subject(s)
Hand/physiology, Polymers/chemistry, Robotics/instrumentation, Tendons/physiology, Activities of Daily Living, Equipment Design/instrumentation, Exoskeleton Device, Protective Gloves, Hand Strength/physiology, Humans, Spinal Cord Injuries/physiopathology, Wearable Electronic Devices
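
The abstract does not reproduce the kinematic model itself, so the sketch below only illustrates the general workflow of tuning one design parameter against a kinematic cost with an off-the-shelf optimizer. The toy tendon-excursion model, the moment-arm bounds, and the stroke and flexion targets are all hypothetical, not values from the EGP II paper.

```python
# Illustrative only: toy optimization of a tendon moment arm so that a desired
# total flexion is reached within a limited actuator stroke. All numbers and
# the kinematic model are hypothetical.
import numpy as np
from scipy.optimize import minimize_scalar

STROKE = 0.040                        # assumed available actuator stroke (m)
TARGET_FLEXION = np.deg2rad(160.0)    # assumed total joint flexion to reach (rad)

def flexion_from_moment_arm(r):
    """Toy model: tendon excursion = r * joint angle, so a full stroke yields STROKE / r rad."""
    return STROKE / r

def cost(r):
    # Penalize missing the target flexion and, mildly, large moment arms (bulkier joint).
    return (flexion_from_moment_arm(r) - TARGET_FLEXION) ** 2 + 0.01 * r

res = minimize_scalar(cost, bounds=(0.005, 0.030), method="bounded")
print("chosen moment arm: %.1f mm" % (1e3 * res.x))
print("achievable flexion: %.1f deg" % np.rad2deg(flexion_from_moment_arm(res.x)))
```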
5.
Sci Robot ; 4(26)2019 01 30.
Article in English | MEDLINE | ID: mdl-33137763

ABSTRACT

To perceive user intentions for wearable robots, we present a learning-based intention detection methodology using a first-person-view camera.
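
The one-sentence abstract gives no architectural detail; the sketch below shows only the generic pattern of frame-level intention classification from an egocentric camera with a pretrained backbone. The class labels, the ResNet-18 backbone, and the preprocessing are assumptions, not the paper's method (a recent torchvision is assumed for the weights API).

```python
# Generic sketch: frame-level intention classification from a first-person camera.
# Backbone, classes, and preprocessing are assumptions; this is not the paper's model.
import torch
import torch.nn as nn
from torchvision import models, transforms

CLASSES = ["rest", "grasp"]           # hypothetical intention labels

# Pretrained backbone with a small classification head.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Linear(backbone.fc.in_features, len(CLASSES))

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def predict_intention(frame_pil):
    """Classify a single egocentric frame (PIL image) into an intention label."""
    backbone.eval()
    x = preprocess(frame_pil).unsqueeze(0)   # shape (1, 3, 224, 224)
    logits = backbone(x)
    return CLASSES[int(logits.argmax(dim=1))]
```

In a wearable-robot setting one would typically smooth predictions over consecutive frames before triggering actuation; that step is omitted here.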
