Results 1 - 3 of 3
1.
Sensors (Basel) ; 23(20)2023 Oct 18.
Article in English | MEDLINE | ID: mdl-37896662

ABSTRACT

Estimating depth from images is a common technique in 3D perception, but handling non-Lambertian materials, e.g., transparent or specular surfaces, remains an open challenge. To overcome this challenge with deep stereo matching networks or monocular depth estimation, data sets containing non-Lambertian objects are mandatory. Currently, only a few real-world data sets are available, because generating them with ground truth is an elaborate and time-consuming process: transparent objects must either be prepared, e.g., painted or powdered, or an opaque twin of the non-Lambertian object is needed. We present a new measurement principle for generating a real data set of transparent and specular surfaces without object preparation techniques, which greatly reduces the effort and time required for data collection. For this purpose, we use a thermal 3D sensor as a reference system, which allows the 3D detection of transparent and reflective surfaces without object preparation. In addition, we publish the first-ever real stereo data set, called TranSpec3D, in which ground-truth disparities were generated without object preparation using this measurement principle. The data set contains 110 objects and consists of 148 scenes, each captured under different lighting environments, which increases the size of the data set and creates different reflections on the surfaces. We also show the advantages and disadvantages of our measurement principle and data set compared to the Booster data set (generated with object preparation), as well as the current limitations of our novel method.
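For context, the ground-truth disparities in a stereo data set relate to metric depth through the standard rectified-stereo relation Z = f·B/d. A minimal sketch of that conversion, where the focal length and baseline below are purely hypothetical values for illustration, not parameters of the TranSpec3D setup:

```python
import numpy as np

# Hypothetical calibration values, for illustration only; a real data set
# ships its own camera parameters.
FOCAL_LENGTH_PX = 1400.0  # focal length in pixels (assumed)
BASELINE_M = 0.10         # stereo baseline in metres (assumed)

def disparity_to_depth(disparity_px):
    """Convert a disparity map (in pixels) to metric depth: Z = f * B / d."""
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)   # zero disparity -> point at infinity
    valid = d > 0
    depth[valid] = FOCAL_LENGTH_PX * BASELINE_M / d[valid]
    return depth

depth = disparity_to_depth([[70.0, 140.0]])  # ≈ [[2.0, 1.0]] metres
```

Halving the disparity doubles the depth, which is why disparity errors matter most for distant surfaces.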

2.
Appl Opt ; 60(8): 2362-2371, 2021 Mar 10.
Article in English | MEDLINE | ID: mdl-33690336

ABSTRACT

Three-dimensional (3D) shape measurement systems based on diffuse reflection of projected structured light do not deliver reliable data when measuring glossy, transparent, absorbent, or translucent objects. In recent years, we have developed a method based on stereo recording with infrared cameras and the projection of areal aperiodic sinusoidal thermal patterns to capture such objects. However, the measurements took longer than 10 s, sometimes up to minutes, and the measurement accuracy left room for improvement. We have now succeeded in both drastically reducing the measurement time and significantly increasing the measurement quality. This finally provides a technique for reliably measuring transparent objects, e.g., in series production. We demonstrate measurements achieved within 1 s and with 3D standard deviations of less than 10 µm.
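A common principle behind pattern-projection stereo of this kind is that each pixel accumulates a temporal intensity signature over the projected pattern sequence, and corresponding pixels in the two cameras are found by maximising the normalised cross-correlation of these signatures. The toy sketch below illustrates that idea on synthetic 1-D data; it is not the authors' implementation, and the scene, pattern count, and disparity range are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy rectified stereo setup: both cameras observe the same 1-D "scene"
# while n_patterns different patterns are projected, so every pixel gets
# a (nearly) unique temporal intensity signature.
n_patterns, width, true_disp = 12, 64, 5
scene = rng.random((n_patterns, width + true_disp))
left = scene[:, true_disp:]   # left view, shifted by the true disparity
right = scene[:, :width]      # right view (reference)

def temporal_ncc_disparity(left, right, max_disp):
    """Per-pixel disparity via normalised cross-correlation over time."""
    zl = (left - left.mean(axis=0)) / (left.std(axis=0) + 1e-12)
    zr = (right - right.mean(axis=0)) / (right.std(axis=0) + 1e-12)
    w = right.shape[1]
    disp = np.zeros(w, dtype=int)
    for x in range(w):
        scores = [np.mean(zl[:, x - d] * zr[:, x])
                  for d in range(min(max_disp, x) + 1)]
        disp[x] = int(np.argmax(scores))  # best temporal match wins
    return disp

disp = temporal_ncc_disparity(left, right, max_disp=10)
```

A real system works on full 2-D images and refines the integer match to subpixel precision; shortening the pattern sequence is one route to the reduced measurement times reported above, at the cost of less distinctive temporal signatures.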

3.
Light Sci Appl ; 7: 71, 2018.
Article in English | MEDLINE | ID: mdl-30302242

ABSTRACT

Aperiodic sinusoidal patterns cast by a GOBO (GOes Before Optics) projector are a powerful tool for optically measuring the surface topography of moving or deforming objects with very high speed and accuracy. With a first experimental setup, we were able to measure inflating car airbags at frame rates of more than 50 kHz while achieving a 3D point standard deviation of ~500 µm. Here, we theoretically investigate the method of GOBO projection of aperiodic sinusoidal fringes. In a simulation-based performance analysis, we examine the parameters that influence the accuracy of the measurement result and identify an optimal pattern design that yields the highest measurement accuracy. We compare the results with those obtained via GOBO projection of phase-shifted sinusoidal fringes. Finally, we experimentally verify the theoretical findings. We show that the proposed technique has several advantages over conventional fringe projection techniques: the easy-to-build and cost-effective GOBO projector can provide a high radiant flux, allows high frame rates, and can be used over a wide spectral range.
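One simple way to picture an "aperiodic sinusoidal" fringe is a pattern that is locally sinusoidal but whose spatial period varies along the slide, so no two regions look alike. The sketch below generates such a 1-D profile; the segment count and period range are illustrative assumptions, not the optimal design identified in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def aperiodic_sinusoidal_pattern(width, n_segments=8, period_range=(8.0, 24.0)):
    """1-D pattern that is locally sinusoidal but globally aperiodic:
    each segment gets a randomly drawn spatial period, and the phase is
    accumulated so the fringes stay continuous across segment borders."""
    periods = rng.uniform(*period_range, size=n_segments)
    seg_len = int(np.ceil(width / n_segments))
    freq = np.repeat(2.0 * np.pi / periods, seg_len)[:width]  # rad per pixel
    phase = np.cumsum(freq)
    return 0.5 * (1.0 + np.sin(phase))  # intensities in [0, 1]

pattern = aperiodic_sinusoidal_pattern(256)
```

Because the period never repeats globally, a pixel's intensity sequence under a stack of such patterns is unambiguous, which is what makes the correlation-based matching of the previous abstracts feasible.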
