Underwater Structured Light Stripe Center Extraction with Normalized Grayscale Gravity Method.
Li, Shuaishuai; Gao, Xiang; Xie, Zexiao.
Affiliation
  • Li S; College of Engineering, Ocean University of China, Qingdao 266100, China.
  • Gao X; Key Laboratory of Ocean Engineering of Shandong Province, Qingdao 266100, China.
  • Xie Z; Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China.
Sensors (Basel); 23(24), 2023 Dec 15.
Article in En | MEDLINE | ID: mdl-38139687
ABSTRACT
The non-uniform reflectance of object surfaces and disturbances from the underwater environment can strongly affect laser stripe center extraction during underwater laser measurement. We therefore propose a normalized grayscale gravity method to address this problem. First, we build an underwater structured-light dataset covering different illuminations, turbidity levels, and reflective surfaces, and compare several state-of-the-art semantic segmentation models, including DeepLabv3, DeepLabv3+, MobileNetV3, PSPNet, and FCN. Based on this comparison, we recommend PSPNet for the specific task of underwater structured-light stripe segmentation. Second, to accurately extract the centerline of the segmented light stripe, the gray-level values are normalized to suppress the influence of noise and stripe-edge information on the centroids, and the weights of the cross-sectional extremes are increased to speed convergence and improve robustness. Finally, subpixel structured-light center points are obtained by bilinear interpolation, improving image resolution and extraction accuracy. Experimental results show that the proposed method effectively suppresses noise interference while exhibiting good robustness and adaptability.
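The centerline step described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the `power` exponent used to boost the cross-sectional extremes, and the per-column scan direction are all assumptions made for illustration; only the normalize-then-weighted-centroid idea comes from the abstract.

```python
import numpy as np

def stripe_centers_normalized_gravity(img, mask, power=2.0):
    """Estimate a subpixel stripe-center row for each image column.

    img   : 2-D float array of gray levels.
    mask  : boolean array marking the segmented stripe region
            (e.g. the output of a segmentation network).
    power : assumed exponent that increases the weight of the
            cross-sectional extremes (not specified in the paper).
    Returns a dict {column index: subpixel center row}.
    """
    centers = {}
    for col in range(img.shape[1]):
        rows = np.nonzero(mask[:, col])[0]
        if rows.size == 0:
            continue  # no stripe pixels in this column
        g = img[rows, col].astype(float)
        # Normalize gray levels to [0, 1] so that noise and
        # stripe-edge pixels contribute less to the centroid.
        g_min, g_max = g.min(), g.max()
        if g_max > g_min:
            g = (g - g_min) / (g_max - g_min)
        # Emphasize the cross-section peak before taking the
        # grayscale gravity (weighted centroid).
        w = g ** power
        if w.sum() == 0:
            continue
        centers[col] = float((rows * w).sum() / w.sum())
    return centers
```

On a synthetic stripe with a symmetric Gaussian cross-section, the weighted centroid recovers the true (subpixel) center row; in practice the subsequent bilinear interpolation step would refine these points on the upsampled image.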
Keywords

Full text: 1 Database: MEDLINE Language: En Publication year: 2023 Document type: Article