Results 1 - 2 of 2
1.
Heliyon; 10(4): e26042, 2024 Feb 29.
Article in English | MEDLINE | ID: mdl-38390062

ABSTRACT

In this paper, we present a new generation of omnidirectional automated guided vehicles (omniagv) used for transporting materials within a manufacturing factory, with the ability to navigate autonomously and intelligently by interacting with the environment, including people and other entities. The robot has to be integrated into the operating environment without significant changes to the existing facilities or heavy redefinition of the logistics processes already in place. For this purpose, different vision-based systems and advanced methods in mobile and cognitive robotics are developed and integrated. In this context, vision and perception are key factors. Several dedicated modules support the robot during its navigation in the environment. Specifically, the localization module estimates the robot pose by combining visual odometry and wheel odometry; the obstacle avoidance module detects obstacles and recognizes several object classes for adaptive navigation; and the tag detection module assists the robot during the cart-picking phase and provides information for global localization. The smart integration of vision and perception is paramount for effectively using the robot in an industrial context. Extensive qualitative and quantitative results prove the capability and effectiveness of the proposed AGV to navigate in the considered industrial environment.
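As a rough illustration of the kind of pose fusion the localization module performs when combining wheel odometry and visual odometry, the following Python sketch blends two 2D pose estimates with a fixed complementary weight. The Pose2D class, the alpha weight, and the fusion scheme are assumptions for illustration only and do not reflect the paper's actual implementation, which would more plausibly use a filter with per-sensor covariances.

# Minimal sketch of pose fusion between wheel odometry and visual odometry.
# Illustrative assumption only, not the paper's localization module: class
# names, the alpha weight, and the complementary-filter scheme are hypothetical.
import math
from dataclasses import dataclass

@dataclass
class Pose2D:
    x: float = 0.0
    y: float = 0.0
    theta: float = 0.0  # heading in radians

def wrap_angle(a: float) -> float:
    """Wrap an angle to the interval (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def fuse_odometry(wheel: Pose2D, visual: Pose2D, alpha: float = 0.7) -> Pose2D:
    """Blend two pose estimates with a fixed complementary weight.

    alpha weighs the visual estimate; (1 - alpha) weighs the wheel estimate.
    A real system would use a Kalman filter with per-sensor covariances.
    """
    x = alpha * visual.x + (1.0 - alpha) * wheel.x
    y = alpha * visual.y + (1.0 - alpha) * wheel.y
    # Angles are blended on the unit circle to avoid wrap-around errors.
    sin_t = alpha * math.sin(visual.theta) + (1.0 - alpha) * math.sin(wheel.theta)
    cos_t = alpha * math.cos(visual.theta) + (1.0 - alpha) * math.cos(wheel.theta)
    return Pose2D(x, y, wrap_angle(math.atan2(sin_t, cos_t)))

if __name__ == "__main__":
    wheel_pose = Pose2D(1.02, 0.48, 0.10)   # drifts with wheel slip
    visual_pose = Pose2D(0.98, 0.51, 0.12)  # noisier, but correctable via tags
    print(fuse_odometry(wheel_pose, visual_pose))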

2.
Small Methods; 7(11): e2300447, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37670547

ABSTRACT

In-flow phase-contrast tomography provides the 3D refractive index of label-free cells in cytometry systems. Its major limitation, as with any quantitative phase imaging approach, is the lack of specificity compared to fluorescence microscopy, which restrains its great potential in single-cell analysis and diagnostics. Remarkable results in introducing specificity have been obtained through artificial intelligence (AI), but only for adherent cells. However, accessing the 3D fluorescence ground truth and obtaining accurate voxel-level co-registration of image pairs for AI training is not viable for high-throughput cytometry. The recent statistical inference approach is a significant step forward for label-free specificity but remains limited to cell nuclei. Here, a generalized computational strategy based on self-consistent statistical inference is shown to achieve intracellular multi-specificity. Various subcellular compartments (i.e., nuclei, cytoplasmic vacuoles, the peri-vacuolar membrane area, cytoplasm, and the vacuole-nucleus contact site) can be identified and characterized quantitatively at different phases of the cell life cycle, using yeast cells as a biological model. Moreover, for the first time, virtual reality is introduced for handling the information content of multi-specificity in single cells. Its full potential is demonstrated by exploring and interacting with 3D quantitative biophysical parameters of the identified compartments on demand, thus opening the route to a metaverse for 3D microscopy.


Subject(s)
Artificial Intelligence, Saccharomyces cerevisiae, Flow Cytometry/methods, Cytoplasm, Fluorescence Microscopy
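To make the idea of compartment-level quantification in the second record concrete, the following Python sketch segments a synthetic 3D refractive-index (RI) volume into coarse compartments by fixed RI thresholds and reports per-compartment statistics (voxel count, mean RI, and a dry-mass proxy). The thresholds, the synthetic volume, the voxel size, and the refractive increment value are illustrative assumptions; the paper's method relies on a self-consistent statistical inference, not simple thresholding.

# Illustrative sketch only: threshold-based segmentation of a 3D refractive-index
# volume into coarse compartments, with simple per-compartment statistics.
# All numbers below are assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 64^3 RI volume: medium ~1.335, cytoplasm ~1.37, nucleus ~1.39.
vol = np.full((64, 64, 64), 1.335) + rng.normal(0, 0.002, (64, 64, 64))
z, y, x = np.ogrid[:64, :64, :64]
cell = (x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 20 ** 2
nucleus = (x - 36) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 8 ** 2
vol[cell] += 0.035
vol[nucleus] += 0.020

# Hypothetical RI ranges per compartment: label -> (low, high).
ranges = {
    "medium": (-np.inf, 1.35),
    "cytoplasm": (1.35, 1.38),
    "nucleus": (1.38, np.inf),
}

voxel_um3 = 0.1 ** 3  # assumed voxel volume in cubic micrometres
alpha = 0.19          # specific refractive increment in um^3/pg (common literature value)

for name, (lo, hi) in ranges.items():
    mask = (vol >= lo) & (vol < hi)
    mean_ri = vol[mask].mean()
    # Dry-mass proxy: RI excess over the medium, integrated over the compartment.
    dry_mass_pg = ((vol[mask] - 1.335).clip(min=0).sum() * voxel_um3) / alpha
    print(f"{name:9s} voxels={mask.sum():7d} mean RI={mean_ri:.4f} dry mass~{dry_mass_pg:.1f} pg")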