Revolutionizing Robotic Depalletizing: AI-Enhanced Parcel Detecting with Adaptive 3D Machine Vision and RGB-D Imaging for Automated Unloading.
Sensors (Basel); 24(5). 2024 Feb 24.
Article in En | MEDLINE | ID: mdl-38475009
ABSTRACT
Detecting parcels accurately and efficiently when unloading them from trucks onto conveyor belts has always been challenging because of the diverse and complex ways in which parcels are stacked. Conventional methods struggle to quickly and accurately classify the various shapes and surface patterns of unordered parcels. In this paper, we propose a parcel-picking surface detection method based on deep learning and image processing for the efficient unloading of diverse and unordered parcels. Our goal is to develop a systematic image processing algorithm that emphasises the boundaries of parcels regardless of their shape, pattern, or layout. The core of the algorithm is the utilisation of RGB-D technology to detect the primary boundary lines despite obstacles such as adhesive labels, tape, or parcel surface patterns. For cases where the boundary lines are difficult to detect owing to narrow gaps between parcels, we propose deep learning-based boundary line detection with the You Only Look at Coefficients (YOLACT) model. Using image segmentation techniques, the algorithm efficiently predicts boundary lines, enabling the accurate detection of irregularly sized parcels with complex surface patterns. Furthermore, even for rotated parcels, we extract their edges through mathematical operations on the depth values at specified positions, enabling detection of the wider surfaces of rotated parcels. Finally, we validate the accuracy and real-time performance of the proposed method through various case studies, achieving mAP@50 values of 93.8% and 90.8% for randomly sized and rotationally covered boxes with diverse colours and patterns, respectively.
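The abstract does not include implementation details, but the RGB-D boundary emphasis it describes can be illustrated with a minimal sketch: depth discontinuities (which printed labels and tape do not produce) gate the RGB edges so that only likely parcel boundaries survive. The function name parcel_boundary_mask and the depth_jump_mm threshold below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of depth-guided boundary emphasis on RGB-D data.
# Names and thresholds are illustrative, not from the paper.
import cv2
import numpy as np

def parcel_boundary_mask(rgb: np.ndarray, depth: np.ndarray,
                         depth_jump_mm: float = 8.0) -> np.ndarray:
    """Return a binary mask of candidate parcel-boundary pixels.

    rgb   : HxWx3 uint8 colour image
    depth : HxW float32 depth map in millimetres (0 = invalid)
    """
    # Depth discontinuities: large local jumps usually mark gaps between parcels,
    # while printed patterns on a single parcel surface produce none.
    grad_x = cv2.Sobel(depth, cv2.CV_32F, 1, 0, ksize=3)
    grad_y = cv2.Sobel(depth, cv2.CV_32F, 0, 1, ksize=3)
    depth_edges = (np.hypot(grad_x, grad_y) > depth_jump_mm).astype(np.uint8) * 255

    # RGB edges give precise localisation but also respond to tape and labels.
    gray = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY)
    rgb_edges = cv2.Canny(gray, 50, 150)

    # Keep RGB edges only near a depth discontinuity, suppressing surface clutter.
    support = cv2.dilate(depth_edges, np.ones((5, 5), np.uint8))
    return cv2.bitwise_and(rgb_edges, support)
```

In a pipeline along the lines the abstract describes, such a mask would feed the boundary-line detection step, with the YOLACT-based segmentation taking over where gaps between parcels are too narrow to produce a measurable depth discontinuity.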
Full text: 1
Collections: 01-internacional
Database: MEDLINE
Language: En
Journal: Sensors (Basel)
Publication year: 2024
Document type: Article