Results: 1 - 4 of 4
1.
Nat Commun ; 11(1): 3426, 2020 Jul 09.
Article in English | MEDLINE | ID: mdl-32647265

ABSTRACT

The unequal distribution of volcanic products between the Earth-facing lunar side and the farside is the result of a complex thermal history. To help unravel this dichotomy, a lunar landing mission (Chang'e-4, CE-4) has for the first time targeted the Moon's farside, landing on the floor of Von Kármán crater (VK) inside the South Pole-Aitken (SPA) basin. We present the first deep subsurface stratigraphic structure based on data collected by the ground-penetrating radar (GPR) onboard the Yutu-2 rover during the initial nine-month exploration phase. The radargram reveals several strata interfaces beneath the surveying path: buried ejecta is overlain by at least four layers of distinct lava flows that probably occurred during the Imbrium Epoch, with thicknesses ranging from 12 m up to about 100 m, providing direct evidence of multiple lava-infilling events within the VK crater. The average loss tangent of the mare basalts is estimated at 0.0040-0.0061.
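The layer thicknesses reported above follow from radar two-way travel times within each layer and an assumed dielectric permittivity of the basalt. The sketch below shows that standard conversion only; the permittivity value and the travel times are illustrative assumptions, not figures taken from the paper.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def layer_thickness(two_way_time_s: float, rel_permittivity: float) -> float:
    """Convert a GPR two-way travel time within a layer to a thickness.

    In a low-loss medium the radar wave travels at roughly c / sqrt(eps_r)
    and crosses the layer twice (down and back).
    """
    velocity = C / math.sqrt(rel_permittivity)
    return velocity * two_way_time_s / 2.0

# Illustrative values only: eps_r ~ 6.5 is a commonly assumed permittivity
# for mare basalt; the travel times below are invented for demonstration.
for t_ns in (200, 900, 1700):
    d = layer_thickness(t_ns * 1e-9, rel_permittivity=6.5)
    print(f"two-way time {t_ns:5d} ns -> thickness ~ {d:6.1f} m")
```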

2.
Sensors (Basel) ; 18(10)2018 Oct 20.
Article in English | MEDLINE | ID: mdl-30347836

ABSTRACT

In the study of indoor simultaneous localization and mapping (SLAM) problems using a stereo camera, two types of primary features, points and line segments, have been widely used to calculate the pose of the camera. However, many feature-based SLAM systems are not robust when the camera moves sharply or turns too quickly. In this paper, we propose an improved indoor visual SLAM method that better exploits the advantages of point and line segment features and achieves robust results in difficult environments. First, point and line segment features are automatically extracted and matched to build two kinds of projection models. Subsequently, for the optimization of line segment features, we minimize an angle-observation term in addition to the traditional re-projection error of the endpoints. Finally, our motion-estimation model, which adapts to the motion state of the camera, is applied to build a combined Hessian matrix and gradient vector for iterated pose estimation. The proposal has been tested on the EuRoC MAV datasets and on sequence images captured with our stereo camera. The experimental results demonstrate the effectiveness of the improved point-line-feature-based visual SLAM method in improving localization accuracy when the camera undergoes rapid rotation or abrupt motion.
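As a rough illustration of the kind of combined normal equations described above, the sketch below stacks point re-projection residuals together with line-endpoint and line-angle residuals into a single Hessian and gradient for one Gauss-Newton pose step. The residual blocks, the weights, and the "adaptive" weighting rule are simplified placeholders of my own, not the authors' formulation.

```python
import numpy as np

def accumulate_normal_equations(jacobians, residuals, weights):
    """Stack weighted residual blocks into a Gauss-Newton Hessian and gradient.

    H = sum_i w_i * J_i^T J_i,   g = sum_i w_i * J_i^T r_i
    Each J_i is (m_i x 6) with respect to a 6-DoF pose update, r_i is (m_i,).
    """
    H = np.zeros((6, 6))
    g = np.zeros(6)
    for J, r, w in zip(jacobians, residuals, weights):
        H += w * J.T @ J
        g += w * J.T @ r
    return H, g

# Toy residual blocks (random numbers, just to show the bookkeeping):
# - a point feature contributes a 2-vector re-projection error,
# - a line segment contributes endpoint re-projection errors (4-vector)
#   plus a 1-D angle error between observed and projected directions.
rng = np.random.default_rng(0)
J_point = rng.normal(size=(2, 6)); r_point = rng.normal(size=2)
J_line_ep = rng.normal(size=(4, 6)); r_line_ep = rng.normal(size=4)
J_line_ang = rng.normal(size=(1, 6)); r_line_ang = rng.normal(size=1)

# A simple stand-in for adaptive weighting: when inter-frame rotation is
# large, down-weight point terms and up-weight line terms.
fast_rotation = True
w_point, w_line = (0.5, 2.0) if fast_rotation else (1.0, 1.0)

H, g = accumulate_normal_equations(
    [J_point, J_line_ep, J_line_ang],
    [r_point, r_line_ep, r_line_ang],
    [w_point, w_line, w_line],
)
delta_pose = np.linalg.solve(H + 1e-6 * np.eye(6), -g)  # one damped GN step
print("pose update (se(3) increment):", delta_pose)
```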

3.
Sensors (Basel) ; 16(8)2016 Aug 13.
Article in English | MEDLINE | ID: mdl-27529256

ABSTRACT

In the study of the SLAM problem using an RGB-D camera, depth information and visual information, the two types of primary measurement data, are rarely tightly coupled during the refinement of the camera pose estimate. In this paper, a new RGB-D camera SLAM method is proposed, based on extended bundle adjustment that integrates 2D and 3D information through a new projection model. First, the geometric relationship between the image plane coordinates and the depth values is established through RGB-D camera calibration. Then, 2D and 3D feature points are automatically extracted and matched between consecutive frames to build a continuous image network. Finally, extended bundle adjustment based on the new projection model, which takes both image and depth measurements into account, is applied to the image network for high-precision pose estimation. Field experiments show that the proposed method performs notably better than the traditional method, and the results demonstrate its effectiveness in improving localization accuracy.
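The core idea, as described, is a joint residual in which each feature contributes both an image re-projection error and a depth error. The sketch below shows one possible form of such a combined residual under a standard pinhole model; the variable names, intrinsics, and relative weighting are assumptions for illustration, not the paper's exact projection model.

```python
import numpy as np

def combined_residual(point_w, R, t, K, obs_uv, obs_depth,
                      w_img=1.0, w_depth=1.0):
    """Residual coupling 2D re-projection and measured depth for one feature.

    point_w   : 3D point in world coordinates
    R, t      : camera rotation (3x3) and translation (3,)
    K         : pinhole intrinsics (3x3)
    obs_uv    : observed pixel coordinates (2,)
    obs_depth : depth measured by the RGB-D sensor at that pixel
    Returns a 3-vector [w_img*du, w_img*dv, w_depth*dz].
    """
    p_cam = R @ point_w + t          # transform the point into the camera frame
    uvw = K @ p_cam
    proj = uvw[:2] / uvw[2]          # pinhole projection to pixel coordinates
    r_img = proj - obs_uv            # 2D re-projection error
    r_depth = p_cam[2] - obs_depth   # depth (z) error
    return np.concatenate([w_img * r_img, [w_depth * r_depth]])

# Toy usage with made-up numbers.
K = np.array([[525.0, 0.0, 320.0],
              [0.0, 525.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.05, 0.0, 0.0])
point_w = np.array([0.4, -0.1, 2.0])
print(combined_residual(point_w, R, t, K,
                        obs_uv=np.array([415.0, 213.0]),
                        obs_depth=1.98))
```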

4.
Sensors (Basel) ; 14(3): 4981-5003, 2014 Mar 11.
Article in English | MEDLINE | ID: mdl-24618780

ABSTRACT

Visual odometry provides astronauts with accurate knowledge of their position and orientation. A wearable astronaut navigation system should be simple and compact; therefore, monocular vision methods are preferred over the stereo vision systems commonly used on mobile robots. However, the projective nature of monocular visual odometry causes a scale-ambiguity problem. In this paper, we focus on integrating a monocular camera with a laser distance meter to solve this problem. The most remarkable advantage of the system is its ability to recover a global trajectory for monocular image sequences by incorporating direct distance measurements. First, we propose a robust and easy-to-use extrinsic calibration method between the camera and the laser distance meter. Second, we present a navigation scheme that fuses distance measurements with monocular sequences to correct the scale drift. In particular, we explain in detail how to match the projection of the invisible laser pointer across other frames. The proposed integration architecture is examined using a live dataset collected in a simulated lunar surface environment. The experimental results demonstrate the feasibility and effectiveness of the proposed method.
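To illustrate the scale-ambiguity problem and how a single absolute distance can anchor it, the sketch below rescales an up-to-scale monocular trajectory so that the estimated distance to a measured point matches the laser reading. This is a minimal sketch of the general idea under that assumption, not the paper's fusion scheme or its calibration procedure.

```python
import numpy as np

def rescale_trajectory(positions, est_dist_to_target, laser_dist):
    """Fix the global scale of a monocular trajectory with one laser reading.

    Monocular visual odometry recovers camera positions only up to an
    unknown scale s. If triangulation places a target at distance
    est_dist_to_target (in arbitrary units) while the laser distance meter
    reports laser_dist in metres, then s = laser_dist / est_dist_to_target,
    and multiplying every position by s yields metric units.
    """
    s = laser_dist / est_dist_to_target
    return s * np.asarray(positions), s

# Toy example: a trajectory in arbitrary units, with a target that visual
# triangulation puts 3.2 "units" away but the laser measures at 4.8 m.
positions = [[0.0, 0.0, 0.0], [0.5, 0.0, 0.1], [1.1, 0.0, 0.2]]
metric_positions, scale = rescale_trajectory(positions, 3.2, 4.8)
print("scale factor:", scale)
print("metric trajectory (m):\n", metric_positions)
```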


Subject(s)
Algorithms , Astronauts , Lasers , Optical Devices , Orientation , Monocular Vision , Calibration , Image Processing, Computer-Assisted