Deep learning for real-time multi-class segmentation of artefacts in lung ultrasound.
Howell, Lewis; Ingram, Nicola; Lapham, Roger; Morrell, Adam; McLaughlan, James R.
Affiliation
  • Howell L; School of Computing, University of Leeds, Leeds, LS2 9JT, UK; School of Electronic and Electrical Engineering, University of Leeds, Leeds, LS2 9JT, UK.
  • Ingram N; Leeds Institute of Medical Research, University of Leeds, St James' University Hospital, Leeds, LS9 7TF, UK.
  • Lapham R; Radiology Department, Leeds Teaching Hospital Trust, Leeds General Infirmary, Leeds, LS1 3EX, UK.
  • Morrell A; Radiology Department, Leeds Teaching Hospital Trust, Leeds General Infirmary, Leeds, LS1 3EX, UK.
  • McLaughlan JR; School of Electronic and Electrical Engineering, University of Leeds, Leeds, LS2 9JT, UK; Leeds Institute of Medical Research, University of Leeds, St James' University Hospital, Leeds, LS9 7TF, UK. Electronic address: j.r.mclaughlan@leeds.ac.uk.
Ultrasonics ; 140: 107251, 2024 May.
Article in En | MEDLINE | ID: mdl-38520819
ABSTRACT
Lung ultrasound (LUS) has emerged as a safe and cost-effective modality for assessing lung health, particularly during the COVID-19 pandemic. However, interpreting LUS images remains challenging because diagnosis relies on artefacts, leading to operator variability and limiting practical uptake. To address this, we propose a deep learning pipeline for multi-class segmentation of objects (ribs, pleural line) and artefacts (A-lines, B-lines, B-line confluence) in ultrasound images of a lung training phantom. Lightweight models achieved a mean Dice Similarity Coefficient (DSC) of 0.74 while requiring fewer than 500 training images. Applied in real time, with inference at up to 33.4 frames per second, the method enables enhanced visualisation of these features in LUS images, which could support LUS training and help to address the skill gap. The segmentation masks obtained from this model also enable the development of explainable measures of disease severity, with the potential to assist in the triage and management of patients. We propose one such semi-quantitative measure, the B-line Artefact Score, which is related to the percentage of an intercostal space occupied by B-lines and may in turn be associated with the severity of a number of lung conditions. We further show how transfer learning can be used to train models on small datasets of clinical LUS images, identifying pathologies such as simple pleural effusion and lung consolidation with DSC values of 0.48 and 0.32, respectively. Finally, we demonstrate how such deep learning models could be translated into clinical practice by implementing the phantom model alongside a portable point-of-care ultrasound system, facilitating bedside assessment and improving the accessibility of LUS.
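The two quantities named in the abstract, the Dice Similarity Coefficient (DSC) and the proposed B-line Artefact Score, can both be illustrated as simple operations on binary segmentation masks. The Python sketch below is not the authors' implementation; it assumes NumPy arrays for a predicted mask, a reference mask, and a hypothetical intercostal-space region mask, and it defines the score as the percentage of that region covered by B-line pixels, following the description given in the abstract.

    import numpy as np

    def dice_coefficient(pred, target, eps=1e-7):
        # Dice Similarity Coefficient between two binary masks:
        # 2 * |pred AND target| / (|pred| + |target|)
        pred = pred.astype(bool)
        target = target.astype(bool)
        intersection = np.logical_and(pred, target).sum()
        return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

    def b_line_artefact_score(b_line_mask, intercostal_mask):
        # Illustrative definition only: percentage of the intercostal-space
        # region occupied by B-line pixels.
        region = intercostal_mask.astype(bool)
        b_lines = np.logical_and(b_line_mask.astype(bool), region)
        if region.sum() == 0:
            return 0.0
        return 100.0 * b_lines.sum() / region.sum()

Under this reading, a score of 0% would correspond to an intercostal space free of B-lines, while larger values would indicate increasing B-line coverage; the exact formulation, normalisation and any thresholds used in the paper may differ.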

Full text: 1 Database: MEDLINE Main subject: Ultrasonography / Artifacts / Phantoms, Imaging / Deep Learning / COVID-19 / Lung Limit: Humans Language: En Year of publication: 2024 Document type: Article