1.
Soft Robot; 10(6): 1224-1240, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37590485

ABSTRACT

Data-driven methods based on deep neural networks demonstrate promising results for accurate modeling of soft robots. However, deep neural network models rely on voluminous data to discover the complex, nonlinear representations inherent in soft robots. Consequently, substantial effort is required for data acquisition, labeling, and annotation, and collecting enough data is not always possible. This article introduces a data-driven learning framework based on synthetic data to circumvent the exhaustive data collection process. More specifically, we propose a novel time series generative adversarial network with a self-attention mechanism, Transformer TimeGAN (TTGAN), to precisely learn the complex dynamics of a soft robot. In addition, TTGAN is combined with a conditioning network that enables it to produce synthetic data for specific soft robot behaviors. The proposed framework is verified on a widely used pneumatic soft gripper as an exemplary experimental setup. Experimental results demonstrate that TTGAN generates synthetic time series data with realistic soft robot dynamics. Critically, combining the synthetic data with only partially available original data produces a data-driven model whose estimation accuracy is comparable to models obtained from the complete original data.
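The abstract above describes a conditional, attention-augmented time series GAN. As a rough illustration of that idea (not the authors' TTGAN; the layer sizes, the behavior-condition encoding, and the omitted discriminator and training loop are all assumptions), a generator could combine a recurrent backbone with self-attention and a behavior condition like this:

import torch
import torch.nn as nn

class CondAttnGenerator(nn.Module):
    # Sketch of a conditional time-series generator with self-attention.
    def __init__(self, noise_dim=16, cond_dim=4, hidden_dim=64, feat_dim=6):
        super().__init__()
        self.rnn = nn.GRU(noise_dim + cond_dim, hidden_dim, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads=4, batch_first=True)
        self.out = nn.Linear(hidden_dim, feat_dim)

    def forward(self, z, cond):
        # z: (batch, seq_len, noise_dim) latent noise sequence
        # cond: (batch, cond_dim) code for the desired soft robot behavior
        cond_seq = cond.unsqueeze(1).expand(-1, z.size(1), -1)
        h, _ = self.rnn(torch.cat([z, cond_seq], dim=-1))
        h, _ = self.attn(h, h, h)   # self-attention over time steps
        return self.out(h)          # synthetic sensor/actuation trajectory

gen = CondAttnGenerator()
fake = gen(torch.randn(8, 100, 16), torch.rand(8, 4))  # 8 sequences, 100 steps each
print(fake.shape)  # torch.Size([8, 100, 6])

In a TimeGAN-style setup this generator would be trained adversarially against a sequence discriminator, with the condition vector selecting which gripper behavior the synthetic trajectories should imitate.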

2.
Soft Robot; 9(3): 591-612, 2022 Jun.
Article in English | MEDLINE | ID: mdl-34171965

ABSTRACT

Sensory data are critical for soft robot perception. However, integrating sensors into soft robots remains challenging due to their inherent softness. An alternative approach is indirect sensing through an estimation scheme, which uses the robot dynamics and the available measurements to estimate variables that would otherwise be measured by dedicated sensors. Nevertheless, developing an adequately effective estimation scheme for soft robots is not straightforward. First, it requires a mathematical model, and modeling soft robots is analytically demanding due to their complex dynamics. Second, it should perform multimodal sensing of both internal and external variables with minimal sensors, and finally, it must be robust against sensor faults. In this article, we propose a recurrent neural network-based adaptive unscented Kalman filter (RNN-AUKF) architecture to estimate the proprioceptive state and the exteroceptive unknown input of a pneumatic soft finger. To address the challenge of modeling soft robots, we adopt a data-driven approach using RNNs. Then, we interconnect the AUKF with an unknown input estimator to perform multimodal sensing using a single embedded flex sensor. We also prove mathematically that the estimation error is bounded with respect to sensor degradation (noise and drift). Experimental results show that the RNN-AUKF achieves better overall accuracy and robustness than the benchmark method. The proposed scheme also extends to a multifinger soft gripper and is robust against out-of-distribution sensor dynamics. The outcomes of this research have immense potential for realizing robust multimodal indirect sensing in soft robots.
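The abstract above combines a learned dynamics model with an adaptive unscented Kalman filter. A minimal sketch of that pattern follows, using a toy stand-in for the trained RNN and the filterpy library for the UKF; the state definition, noise levels, and dynamics are placeholders, not the paper's RNN-AUKF:

import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

def learned_dynamics(x, dt):
    # Stand-in for a trained RNN one-step predictor of the finger state
    # [bending angle, angular rate]; here a damped linear model as a placeholder.
    A = np.array([[1.0, dt], [-0.5 * dt, 1.0 - 0.1 * dt]])
    return A @ x

def flex_sensor(x):
    # A single embedded flex sensor observes only the bending angle.
    return x[:1]

points = MerweScaledSigmaPoints(n=2, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=1, dt=0.01,
                            fx=learned_dynamics, hx=flex_sensor, points=points)
ukf.x = np.array([0.2, 0.0])   # initial proprioceptive state estimate
ukf.P *= 0.1

for z in np.random.normal(0.2, 0.02, size=50):  # noisy flex-sensor readings
    ukf.predict()
    ukf.update(np.array([z]))
print(ukf.x)  # estimated [bending angle, angular rate]

The paper's scheme additionally includes an adaptation mechanism and an interconnected unknown input estimator for the exteroceptive input, both of which this sketch omits.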


Subject(s)
Robotics; Models, Theoretical; Neural Networks, Computer; Proprioception; Robotics/methods