Can Natural Language Processing and Artificial Intelligence Automate The Generation of Billing Codes From Operative Note Dictations?
Global Spine J; 13(7): 1946-1955, 2023 Sep.
Article in English | MEDLINE | ID: mdl-35225694
STUDY DESIGN: Retrospective cohort study. OBJECTIVES: Using natural language processing (NLP) in combination with machine learning on standard operative notes may allow for efficient billing, maximization of collections, and minimization of coder error. This pilot study was conducted to determine whether a machine learning algorithm can accurately identify billing Current Procedural Terminology (CPT) codes from patient operative notes. METHODS: This was a retrospective analysis of operative notes from patients who underwent elective spine surgery performed by a single senior surgeon from 9/2015 to 1/2020. A deep learning NLP algorithm based on a long short-term memory (LSTM) network and a random forest algorithm were both trained and tested on the operative notes to predict CPT codes. Algorithm performance was measured by receiver operating characteristic (ROC) analysis, calculating the area under the ROC curve (AUC) and the area under the precision-recall curve (AUPRC). CPT codes generated by the billing department were compared to those generated by the models. RESULTS: The random forest model had an AUC of .94 and an AUPRC of .85. The deep learning model had a final AUC of .72 and an AUPRC of .44. The random forest model had a weighted-average, class-by-class accuracy of 87%; the LSTM deep learning model had a weighted-average, class-by-class accuracy of 59%. CONCLUSIONS: Combining natural language processing with machine learning is a valid approach for the automatic generation of CPT billing codes. The random forest model outperformed the LSTM deep learning model in this case. These models can be used by orthopedic or neurosurgery departments to allow for efficient billing.
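To make the described pipeline concrete, the sketch below shows one common way to set up this kind of multi-label CPT prediction with scikit-learn: TF-IDF features over operative note text, a one-vs-rest random forest per CPT code, and micro-averaged ROC AUC and AUPRC on a held-out split. The note texts, CPT codes, split, averaging scheme, and hyperparameters are all illustrative assumptions; the abstract does not specify the authors' feature representation or model settings.

    # Minimal sketch (not the authors' implementation): TF-IDF features from
    # operative note text, a one-vs-rest random forest per CPT code, and
    # micro-averaged ROC AUC / AUPRC on a held-out split.
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics import average_precision_score, roc_auc_score
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.preprocessing import MultiLabelBinarizer

    # Hypothetical, illustrative data standing in for de-identified operative
    # notes and their billed CPT codes; the study used ~5 years of real notes.
    notes = [
        "posterior lumbar decompression and fusion L4-L5 with instrumentation",
        "anterior cervical discectomy and fusion C5-C6 with allograft",
        "lumbar laminectomy L3-L4 for stenosis, no fusion",
        "posterior cervical laminoplasty C3-C6",
        "revision lumbar fusion L4-S1 with pedicle screw instrumentation",
        "anterior cervical discectomy and fusion C6-C7 with plate",
        "lumbar laminectomy L4-L5 with foraminotomy",
        "posterior lumbar interbody fusion L5-S1 with instrumentation",
    ]
    codes = [  # CPT labels are illustrative examples only
        ["22630", "22840"], ["22551"], ["63047"], ["63050"],
        ["22630", "22840"], ["22551"], ["63047"], ["22630", "22840"],
    ]

    mlb = MultiLabelBinarizer()
    Y = mlb.fit_transform(codes)                  # multi-hot CPT code matrix

    # Simple fixed split for the sketch; the study would use a proper hold-out.
    train_notes, test_notes = notes[:6], notes[6:]
    Y_train, Y_test = Y[:6], Y[6:]

    vectorizer = TfidfVectorizer(ngram_range=(1, 2))
    X_train = vectorizer.fit_transform(train_notes)   # fit vocabulary on train only
    X_test = vectorizer.transform(test_notes)

    clf = OneVsRestClassifier(RandomForestClassifier(n_estimators=200, random_state=0))
    clf.fit(X_train, Y_train)

    scores = clf.predict_proba(X_test)            # per-code positive-class probabilities

    # Micro-averaging pools all code-level decisions; the paper's exact
    # averaging scheme is not specified in the abstract.
    print("ROC AUC:", roc_auc_score(Y_test, scores, average="micro"))
    print("AUPRC:  ", average_precision_score(Y_test, scores, average="micro"))

The LSTM comparison in the study would replace the random forest stage with a recurrent network over tokenized notes; per the abstract, the random forest pipeline performed better on this task.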
Full text: 1
Database: MEDLINE
Study type: Observational_studies / Prognostic_studies / Risk_factors_studies
Language: En
Publication year: 2023
Document type: Article