1.
Int J Comput Assist Radiol Surg; 14(9): 1611-1617, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31363983

ABSTRACT

PURPOSE: Manual feedback from senior surgeons observing less experienced trainees is a laborious task that is very expensive, time-consuming and prone to subjectivity. With the number of surgical procedures increasing annually, there is an unprecedented need to provide an accurate, objective and automatic evaluation of trainees' surgical skills in order to improve surgical practice. METHODS: In this paper, we designed a convolutional neural network (CNN) to classify surgical skills by extracting latent patterns in the trainees' motions performed during robotic surgery. The method is validated on the JIGSAWS dataset for two surgical skills evaluation tasks: classification and regression. RESULTS: Our results show that deep neural networks constitute robust machine learning models that are able to reach new competitive state-of-the-art performance on the JIGSAWS dataset. While leveraging CNNs' efficiency, we were able to mitigate their black-box effect using the class activation map technique. CONCLUSIONS: This characteristic allowed our method to automatically pinpoint which parts of the surgery influenced the skill evaluation the most, thus allowing us to explain a surgical skill classification and provide surgeons with a novel personalized feedback technique. We believe this type of interpretable machine learning model could integrate within "Operation Room 2.0" and support novice surgeons in improving their skills to eventually become experts.
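The class activation map (CAM) idea described above can be illustrated with a minimal NumPy sketch: a 1D convolution over a kinematic time series, global average pooling, and a linear classifier, after which each time step's features are projected onto the predicted class's weights to show which moments drove the decision. All shapes, weights and layer sizes here are toy assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy kinematic recording: T time steps, D sensor channels (hypothetical sizes).
T, D, F, K = 200, 6, 8, 3        # time steps, channels, conv filters, kernel width
n_classes = 3                    # e.g. novice / intermediate / expert

x = rng.standard_normal((T, D))                  # one trial of motion data
W_conv = rng.standard_normal((F, K, D)) * 0.1    # F filters over K-step windows
W_fc = rng.standard_normal((F, n_classes)) * 0.1 # classifier weights


def conv1d_relu(x, W):
    """1D convolution over time ('valid' padding) followed by ReLU."""
    T, D = x.shape
    F, K, _ = W.shape
    out = np.empty((T - K + 1, F))
    for t in range(T - K + 1):
        window = x[t:t + K]                               # (K, D)
        out[t] = np.tensordot(W, window, axes=([1, 2], [0, 1]))
    return np.maximum(out, 0.0)


feat = conv1d_relu(x, W_conv)    # (T-K+1, F) feature maps over time
pooled = feat.mean(axis=0)       # global average pooling -> (F,)
logits = pooled @ W_fc           # class scores
pred = int(np.argmax(logits))

# Class activation map: per-time-step importance for the predicted class.
cam = feat @ W_fc[:, pred]       # (T-K+1,)
print(pred, cam.shape)
```

Peaks in `cam` mark the segments of the trial that contributed most to the predicted skill level, which is the mechanism behind the localized feedback the abstract describes.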


Subjects
Clinical Competence, Feedback, General Surgery/education, General Surgery/instrumentation, Machine Learning, Neural Networks (Computer), Biomechanical Phenomena, Cluster Analysis, Humans, Markov Chains, Models (Statistical), Motion, Regression Analysis, Robotic Surgical Procedures, Surgeons
2.
Artif Intell Med; 91: 3-11, 2018 Sep.
Article in English | MEDLINE | ID: mdl-30172445

ABSTRACT

OBJECTIVE: The analysis of surgical motion has received growing interest with the development of devices allowing its automatic capture. In this context, the use of advanced surgical training systems makes an automated assessment of surgical trainees possible. Automatic and quantitative evaluation of surgical skills is a very important step in improving surgical patient care. MATERIALS AND METHODS: In this paper, we present an approach for the discovery and ranking of discriminative and interpretable patterns of surgical practice from recordings of surgical motions. A pattern is defined as a series of actions or events in the kinematic data that together are distinctive of a specific gesture or skill level. Our approach is based on the decomposition of continuous kinematic data into a set of overlapping gestures represented by strings (bag of words), for which we compute a comparative numerical statistic (tf-idf) enabling the discovery of discriminative gestures via their relative occurrence frequency. RESULTS: We carried out experiments on three surgical motion datasets. The results show that the patterns identified by the proposed method can be used to accurately classify individual gestures, skill levels and surgical interfaces. We also present how the patterns provide detailed feedback on the trainee's skill assessment. CONCLUSIONS: The proposed approach is an interesting addition to existing learning tools for surgery, as it provides a way to obtain feedback on which parts of an exercise have been used to classify the attempt as correct or incorrect.
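The bag-of-words plus tf-idf ranking described above can be sketched in a few lines: each trial is reduced to a string of gesture symbols, n-grams of symbols act as "words", each skill level's trials form one "document", and tf-idf scores rank the n-grams most distinctive of a level. The trial strings below are toy placeholders, not real decomposed kinematic data, and the symbolization step itself is outside this sketch.

```python
import math
from collections import Counter

# Hypothetical discretized trials: each trial is a string of gesture symbols.
trials = {
    "novice": ["aabba", "ababb", "aabab"],
    "expert": ["ccdcc", "cdccd", "ccddc"],
}


def ngrams(s, n=2):
    """Overlapping n-grams of a symbol string, used as bag-of-words terms."""
    return [s[i:i + n] for i in range(len(s) - n + 1)]


# Treat each skill level's pooled trials as one "document" of n-gram counts.
docs = {lvl: Counter(g for t in ts for g in ngrams(t)) for lvl, ts in trials.items()}


def tf_idf(word, doc, docs):
    """Term frequency within one level, weighted by rarity across levels."""
    tf = doc[word] / sum(doc.values())
    df = sum(1 for d in docs.values() if word in d)
    return tf * math.log(len(docs) / df)


# Rank the patterns most distinctive of the expert level.
expert = docs["expert"]
ranked = sorted(expert, key=lambda w: tf_idf(w, expert, docs), reverse=True)
print(ranked)
```

Patterns that are frequent in one level and absent from the others score highest, which is how such a ranking can point a trainee to the specific gestures that separated their attempt from expert practice.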


Subjects
Gestures, Pattern Recognition (Automated)/methods, Surgical Procedures (Operative)/education, Algorithms, Biomechanical Phenomena, Clinical Competence, Formative Feedback, Humans, Task Performance and Analysis, Time and Motion Studies