ABSTRACT
BACKGROUND: The occurrence of acute complications arising from hypoglycemia and hyperglycemia peaks as young adults with type 1 diabetes (T1D) take control of their own care. Continuous glucose monitoring (CGM) devices provide real-time glucose readings, enabling users to manage their control proactively. Machine learning algorithms can use CGM data to make ahead-of-time risk predictions and provide insight into an individual's longer-term control. METHODS: We introduce explainable machine learning to make predictions of hypoglycemia (<70 mg/dL) and hyperglycemia (>270 mg/dL) up to 60 minutes ahead of time. We train our models using CGM data from 153 people living with T1D in the CITY (CGM Intervention in Teens and Young Adults With Type 1 Diabetes) study, totaling more than 28,000 days of usage, which we summarize into short-term, medium-term, and long-term glucose control features along with demographic information. We use machine learning explanations (SHAP [SHapley Additive exPlanations]) to identify which features have been most important in predicting risk per user. RESULTS: Machine learning models (XGBoost) show excellent performance at predicting hypoglycemia (area under the receiver operating characteristic curve [AUROC]: 0.998, average precision: 0.953) and hyperglycemia (AUROC: 0.989, average precision: 0.931) in comparison with a baseline heuristic and a logistic regression model. CONCLUSIONS: Maximizing model performance for glucose risk prediction and management is crucial to reduce the burden of alarm fatigue on CGM users. Machine learning enables more precise and timely predictions than the baseline models. SHAP helps identify what aspects of a CGM user's glucose control have led to predictions of risk, which can be used to reduce their long-term risk of complications.
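As a rough illustration of the modelling approach described above (and not the authors' published pipeline), the sketch below trains a gradient-boosted classifier on hypothetical CGM-derived features and inspects per-prediction feature attributions with SHAP. The feature names, thresholds, window lengths, and the synthetic label rule are all invented for illustration.

```python
# Illustrative sketch only: hypothetical CGM features, not the CITY study pipeline.
import numpy as np
import pandas as pd
import shap
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, average_precision_score

rng = np.random.default_rng(0)
n = 5000
# Hypothetical short-, medium- and long-term control features plus demographics.
X = pd.DataFrame({
    "mean_glucose_30min": rng.normal(150, 40, n),   # mg/dL, short-term
    "glucose_slope_30min": rng.normal(0, 1.5, n),   # mg/dL per minute
    "time_below_70_7d": rng.uniform(0, 0.1, n),     # fraction of time, medium-term
    "cv_glucose_90d": rng.uniform(0.2, 0.5, n),     # long-term variability
    "age": rng.integers(14, 25, n),
})
# Toy label: hypoglycemia (<70 mg/dL) within the next 60 minutes.
y = ((X["mean_glucose_30min"] - 20 * X["glucose_slope_30min"]) < 80).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1,
                      eval_metric="logloss")
model.fit(X_tr, y_tr)

p = model.predict_proba(X_te)[:, 1]
print("AUROC:", roc_auc_score(y_te, p))
print("Average precision:", average_precision_score(y_te, p))

# Per-user explanation: which features drove this particular risk prediction?
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te.iloc[:1])
print(dict(zip(X.columns, shap_values[0].round(3))))
```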
Subject(s)
Radiotherapy/trends, Surveys and Questionnaires, Adult, Child, Clinical Trials as Topic, Humans, Radiometry, United Kingdom
ABSTRACT
OBJECTIVES: The aim of this study was to evaluate and benchmark the performance characteristics of the General Electric (GE) Discovery Molecular Imaging (MI) Digital Ready (DR) PET/CT. MATERIALS AND METHODS: Performance evaluation against the National Electrical Manufacturers Association (NEMA) 2012 standard was performed on three GE Discovery MI DR PET/CT systems installed across different UK centres. The Discovery MI DR performance was compared with the Siemens Biograph mCT Flow, Philips Ingenuity TF and GE Discovery 690 fully analogue PET/CT systems. In addition, as the Discovery MI DR is upgradable to the Digital MI with silicon photomultipliers, performance characteristics between analogue and digital were compared to assess the potential benefits of a system upgrade. RESULTS: The average NEMA results across the three Discovery MI DR scanners were: sensitivity 7.3 cps/kBq; spatial resolution full-width at half-maximum radial 5.5 mm, tangential 4.5 mm and axial 6 mm at 10 cm from the centre of the field-of-view; peak noise equivalent count rate 142 kcps; scatter fraction 37.1%; contrast recovery coefficients from the International Electrotechnical Commission phantom ranging from 52 to 87% for 10-37-mm diameter spheres. CONCLUSION: All three Discovery MI DR systems tested in this study exceeded the manufacturer's NEMA specification, yet variability between scanners was noted. The Discovery MI DR showed similar performance to the Discovery 690 and Ingenuity TF, but lower sensitivity and spatial resolution than the Biograph mCT Flow. The Discovery MI DR showed lower spatial resolution and contrast recovery than the 20-cm field-of-view Digital MI.
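The contrast recovery coefficients quoted above come from the standard NEMA IEC body phantom analysis. As a rough illustration (not the NEMA NU 2-2012 analysis software used in the study), the percentage contrast for a hot sphere can be computed from mean sphere and background ROI values, assuming a known sphere-to-background activity ratio:

```python
# Illustrative NEMA-style hot-sphere percentage contrast, assuming a known
# sphere-to-background activity ratio (e.g. 4:1). Not the study's analysis code.
def percent_contrast_hot(mean_sphere_roi, mean_background_roi, activity_ratio=4.0):
    """NEMA NU 2 style hot-sphere contrast recovery, expressed as a percentage."""
    return 100.0 * ((mean_sphere_roi / mean_background_roi) - 1.0) / (activity_ratio - 1.0)

# Example: a sphere ROI reading 3.1x the background with a 4:1 fill ratio.
print(f"{percent_contrast_hot(3.1, 1.0):.1f}%")  # ~70%
```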
Subject(s)
Positron Emission Tomography Computed Tomography/standards, Societies, Reference Standards
ABSTRACT
OBJECTIVES: Both qualitative and quantitative analysis in nuclear medicine can be undermined by Poisson noise in low-count clinical images. Whilst conventional smoothing filters are typically used to reduce noise, they also degrade the image structure. Fourier block noise reduction (FBNR) is an adaptive filtering approach which attempts to reduce image noise while maintaining image resolution and structure. METHODS: Although a degree of automated flexibility is possible using conventional stationary pre-filtering, e.g. a total image count-dependent Metz filter, resolution and contrast are degraded across the image. Adaptive non-stationary filtering has been applied by others in an attempt to maintain structure whilst reducing noise: instead of analysing the whole image, only a subset is used to determine each pixel's correction. Whilst the new software algorithm FBNR shares some common components with other adaptive non-stationary filters, it expressly includes the Poisson noise model within a simple and robust algorithm that can be applied to a diverse range of clinical studies. RESULTS AND CONCLUSIONS: No additional artefacts were seen post-application of FBNR during evaluation using simulated and clinical images. Mean normalised error values indicate that FBNR processing is equivalent to obtaining an unprocessed image with at least 2.5 times the number of counts.
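The published FBNR implementation is not reproduced here; the sketch below is only a minimal block-wise Fourier filter in the same spirit: each image block is transformed, frequency components whose magnitude falls below an estimated Poisson noise floor (proportional to the square root of the block's total counts) are suppressed, and the block is transformed back. The block size and scaling constant are arbitrary assumptions.

```python
# Minimal block-wise Fourier noise suppression in the spirit of FBNR.
# Illustrative sketch only, not the published FBNR algorithm.
import numpy as np

def blockwise_fourier_denoise(image, block=16, k=1.0):
    out = np.zeros_like(image, dtype=float)
    ny, nx = image.shape
    for y0 in range(0, ny, block):
        for x0 in range(0, nx, block):
            patch = image[y0:y0 + block, x0:x0 + block].astype(float)
            F = np.fft.fft2(patch)
            # For Poisson data, each FFT coefficient has an expected noise power
            # roughly equal to the total counts in the block.
            noise_amp = k * np.sqrt(max(patch.sum(), 1.0))
            keep = np.abs(F) > noise_amp
            keep[0, 0] = True  # always keep the DC term (total counts preserved)
            out[y0:y0 + block, x0:x0 + block] = np.fft.ifft2(F * keep).real
    return out

# Example on a simulated low-count image.
rng = np.random.default_rng(1)
noisy = rng.poisson(lam=5.0, size=(128, 128))
smoothed = blockwise_fourier_denoise(noisy)
```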
Subject(s)
Algorithms, Image Enhancement/methods, Computer-Assisted Image Interpretation/methods, Computer-Assisted Signal Processing, Single-Photon Emission Computed Tomography/methods, Fourier Analysis, Humans, Imaging Phantoms, Poisson Distribution, Reproducibility of Results, Sensitivity and Specificity, Single-Photon Emission Computed Tomography/instrumentation
ABSTRACT
OBJECTIVE: To acquire data from a 123I-filled Alderson phantom on different gamma camera types and compare the relative uptake results from processing using the QuantiSPECT program (GE Healthcare). METHODS: A DaTSCAN phantom was filled using the standard protocol and imaged on seven different gamma camera types and on two identical cameras of the same type. The standard GE Healthcare protocols for the given cameras were used. Aliquots of the striatum and brain background were counted in a gamma counter to determine variations in filling concentration. All the raw DaTSCAN SPECT data were imported into QuantiSPECT and processed by the three different algorithms (two box, three box and crescent) to determine the relative uptake in the striatum. Inter-operator and intra-operator variation was also determined. RESULTS: The 10% variation in filling concentration found across the sites was compensated for in the final results. There was a 5-15% variation between cameras, depending on the processing algorithm used. There was an intra-operator variation of between 5 and 12%, which reflected the degree of operator intervention within the processing method. There was no statistically significant variation between operators. CONCLUSIONS: The transfer of a DaTSCAN database between camera types is feasible, but ideally all data would be acquired on a single camera type and phantom data used to normalize the database accordingly.
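QuantiSPECT's two-box, three-box and crescent methods differ in how the regions of interest are drawn; the underlying relative uptake measure is the familiar striatal binding ratio, sketched below under the assumption that mean counts per pixel are available for a striatal region and a background region.

```python
# Relative striatal uptake (specific binding ratio) from ROI mean counts.
# Illustrative only; the two-box/three-box/crescent methods differ in region
# definition, not in this basic ratio.
def relative_uptake(mean_striatum, mean_background):
    return (mean_striatum - mean_background) / mean_background

# Example: striatal ROI averaging 42 counts/pixel over a 14 counts/pixel background.
print(f"{relative_uptake(42.0, 14.0):.2f}")  # 2.00
```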
Subject(s)
Brain/diagnostic imaging, Gamma Cameras, Computer-Assisted Image Processing/methods, Algorithms, Brain/pathology, Calibration, Computers, Humans, Iodine Radioisotopes, Statistical Models, Observer Variation, Imaging Phantoms, Quality Control, Reproducibility of Results, Single-Photon Emission Computed Tomography/methods
ABSTRACT
BACKGROUND: The quantification of DaTSCAN images can be used as an adjunct to visual assessment to differentiate between Parkinson's syndrome and essential tremor. Many programs have been written to assess the relative uptake in the striatum. AIM: To compare two of the commercially available programs: QuantiSPECT, which analyses isolated data in two dimensions, and BRASS, which performs three-dimensional processing referencing a normal image template. METHOD: Twenty-two patients (11 with Parkinson's syndrome and 11 with essential tremor) were visually assessed by two nuclear medicine consultants. The patient data were then processed using the two commercial programs to determine the relative uptake in the striatum. The results from the programs were compared with each other and with the visual assessment. The inter-operator and intra-operator variabilities were also ascertained. RESULTS: Both programs, with all processing methods, could distinguish between Parkinson's syndrome and essential tremor. There was also a good correlation between the results from the three- and two-dimensional methods. The intra-operator and inter-operator variabilities were dependent on the amount of operator intervention. CONCLUSION: Both programs allowed statistical differentiation between Parkinson's syndrome and essential tremor. Strict operator protocols are needed with QuantiSPECT to reduce inter- and intra-operator variation. The three-dimensional method (BRASS) gave greater concordance with the visual assessment than the two-dimensional method (QuantiSPECT), but at the cost of increased operator time.
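As a hedged illustration of this kind of analysis (not the statistical methods actually used in the paper), group separation and operator variability can be summarized with a nonparametric two-group test and a coefficient of variation over repeated processing; all values below are invented.

```python
# Illustrative statistics only; the uptake ratios are invented, not study data.
import numpy as np
from scipy.stats import mannwhitneyu

ps_ratios = np.array([0.6, 0.8, 0.7, 0.9, 0.5])   # hypothetical Parkinson's syndrome uptake ratios
et_ratios = np.array([1.8, 2.1, 1.9, 2.3, 2.0])   # hypothetical essential tremor uptake ratios
print(mannwhitneyu(ps_ratios, et_ratios, alternative="two-sided"))

# Intra-operator variability: same study reprocessed by one operator.
repeats = np.array([1.95, 2.02, 1.88, 2.10])
intra_cv = 100 * repeats.std(ddof=1) / repeats.mean()
print(f"intra-operator CV: {intra_cv:.1f}%")
```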
Subject(s)
Essential Tremor/diagnostic imaging, Essential Tremor/diagnosis, Gamma Cameras, Computer-Assisted Image Processing/methods, Parkinson Disease/diagnostic imaging, Parkinson Disease/diagnosis, Algorithms, Humans, Statistical Models, Observer Variation, Imaging Phantoms, Reproducibility of Results, Sensitivity and Specificity, Single-Photon Emission Computed Tomography
ABSTRACT
OBJECTIVES: Conventional extremity dose monitoring in nuclear medicine, using thermoluminescent dosimeters, provides a convenient method of determining integral doses from a series of procedures. Although semiconductor extremity probes are able to add time information and allow doses from individual procedures to be determined, it can be difficult to relate individual operations to the dose-time curve. Solutions to this problem have been identified and developed. METHODS: A novel software tool (Extremity Dose Information Package, EDIP) has been developed that uniquely combines and synchronizes audiovisual and extremity-probe data-streams. The value of this extra information was assessed by acquiring audiovisual and extremity dose information in nuclear medicine and radiopharmacy settings. RESULTS: The ability of the software tool to synchronize the audiovisual and dose data-streams was verified. Preliminary studies of handling techniques in radiopharmacy and of radioiodine administrations using this tool showed areas in which techniques could be adapted to reduce extremity doses, which would have been difficult or impossible to identify using the dose-time information alone. CONCLUSIONS: This low-cost multimedia extremity dose monitoring package can be used, for example, to aid staff training and pinpoint issues with current operating procedures within a clinical nuclear medicine department. Its unique ability to combine and synchronize audiovisual and dosimetry data is also likely to be of benefit to other industries handling unsealed radioactive materials.
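EDIP itself is not described in code in this abstract; the sketch below only illustrates the core idea of aligning a timestamped extremity-dose stream with video time, assuming both streams share (or can be offset to) a common clock. The column names and timestamps are invented.

```python
# Illustrative alignment of a dose-rate stream with video frame times, assuming
# a common clock (or known offset). Not the EDIP implementation.
import pandas as pd

dose = pd.DataFrame({
    "t": pd.to_datetime(["2024-01-01 09:00:01", "2024-01-01 09:00:03",
                         "2024-01-01 09:00:05"]),
    "dose_rate_uSv_h": [12.0, 85.0, 20.0],
})
video = pd.DataFrame({
    "t": pd.date_range("2024-01-01 09:00:00", periods=6, freq="1s"),
    "frame": range(6),
})

# For each video frame, take the most recent dose reading within 2 seconds.
synced = pd.merge_asof(video, dose, on="t", direction="backward",
                       tolerance=pd.Timedelta("2s"))
print(synced)
```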
Subject(s)
Radiation Monitoring/methods, Radiometry/methods, Software, Thermoluminescent Dosimetry/methods, Radioisotope Diagnostic Techniques, Humans, Nuclear Medicine/methods, Radiation Dosage, Radiation Protection, Semiconductors, Software/economics, Time Factors
ABSTRACT
A novel method for registering sequential SPECT scans (4DRRT) is described, whereby all sequential scans acquired in the course of a therapy or a pre-therapy tracer study may be registered in one pass. The method assumes that a mono-exponential decay function can be fitted to the series of sequential SPECT scans. Multiple volumes, presenting with different decay rates, are fitted with different mono-exponential functions. The MSSE (mean sum of squared errors of the least-squares fit), computed over the volume used for registration, is the cost function minimized during registration. Simulated data were used to assess the effect of thresholding, smoothing, noise and the multi-exponential nature of the four-dimensional (4D) SPECT studies on the performance of 4DRRT, resulting in three-dimensional (3D) residual registration errors of <3.5 mm. The 4DRRT method was then compared with the following 3D registration methods: the correlation coefficient, the sum of absolute differences, the variance of image ratios and mutual information. The comparisons, using both simulated and clinical data, were based on the standard deviation of the effective decay time distribution generated from the registered 4D dataset, and showed that image registration using 4DRRT is simpler and more robust than the 3D techniques, especially when multiple tumour sites with different decay rates are present.
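The sketch below illustrates the cost function at the heart of this kind of method: for a candidate registration of the sequential scans, a mono-exponential A0·exp(−λt) is fitted voxel-wise (here by a simple log-linear least-squares fit, which may differ from the fit used in the paper) and the mean sum of squared errors over the registration volume is returned. A full implementation would wrap this in an optimiser over spatial transforms.

```python
# Sketch of a 4DRRT-style cost: fit A0*exp(-lambda*t) per voxel across the
# registered time series and return the mean sum of squared fit errors (MSSE).
# Illustrative only; a real implementation minimises this over a transform.
import numpy as np

def msse_monoexponential(series_4d, times):
    """series_4d: array (T, Z, Y, X) of registered scans; times: shape (T,)."""
    t = np.asarray(times, dtype=float)
    eps = 1e-6
    logA = np.log(np.clip(series_4d, eps, None))               # (T, Z, Y, X)
    # Log-linear least squares per voxel: log A = log A0 - lambda * t
    tm, lm = t.mean(), logA.mean(axis=0)
    slope = ((t - tm)[:, None, None, None] * (logA - lm)).sum(axis=0) / ((t - tm) ** 2).sum()
    logA0 = lm - slope * tm
    fitted = np.exp(logA0[None] + slope[None] * t[:, None, None, None])
    return np.mean(np.sum((series_4d - fitted) ** 2, axis=0))

# Example: a well-registered, noise-free decaying series gives an MSSE near zero.
times = np.array([1.0, 24.0, 48.0, 96.0])                       # hours post-administration
true = 100 * np.exp(-0.03 * times)[:, None, None, None] * np.ones((1, 8, 8, 8))
print(msse_monoexponential(true, times))
```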
Subject(s)
Computer-Assisted Image Processing/methods, Radiometry/methods, Computer-Assisted Radiotherapy Planning/methods, Algorithms, Diagnostic Imaging, Radiation Dose-Response Relationship, Radiation Dosage, Time Factors
ABSTRACT
The limitations of traditional targeted radionuclide therapy (TRT) dosimetry can be overcome by using voxel-based techniques. All dosimetry techniques rely on a sequence of quantitative emission and transmission data. The use of (131)I, for example as NaI or mIBG, presents additional quantification challenges beyond those encountered in low-energy nuclear medicine diagnostic imaging, including dead-time correction and additional photon scatter and penetration in the camera head. The Royal Marsden Dosimetry Package (RMDP) offers a complete package for the accurate processing and analysis of raw emission and transmission patient data. Quantitative SPECT reconstruction is possible using either FBP or OS-EM algorithms. Manual, marker-based or voxel-based registration can be used to register images from different modalities and the sequence of SPECT studies required for 3-D dosimetry calculations. The 3-D patient-specific dosimetry routines, using either a beta-kernel or voxel S-factor, are included. Phase-fitting each voxel's activity series enables more robust maps to be generated in the presence of imaging noise, such as is encountered during late, low-count scans or when there is significant redistribution within the VOI between scans. Error analysis can be applied to each generated dose-map. Patients receiving (131)I-mIBG, (131)I-NaI, and (186)Re-HEDP therapies have been analyzed using RMDP. A Monte Carlo package, developed specifically to address the problems of (131)I quantification by including full photon interactions in a hexagonal-hole collimator and the gamma camera crystal, has been included in the dosimetry package. It is hoped that the addition of this code will lead to improved (131)I image quantification and will contribute towards more accurate 3-D dosimetry.
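As a simplified illustration of the voxel-based dosimetry step (not the RMDP code), a 3-D absorbed-dose map can be obtained by convolving a cumulated-activity map with a voxel dose kernel (a beta dose-point kernel or voxel S-values); the kernel values below are toy placeholders, not Monte Carlo data.

```python
# Toy voxel dosimetry step: convolve a cumulated-activity map (MBq*s per voxel)
# with a voxel dose kernel (Gy per MBq*s). Illustrative only; real voxel
# S-value kernels come from Monte Carlo data, not this placeholder.
import numpy as np
from scipy.ndimage import convolve

cumulated_activity = np.zeros((32, 32, 32))
cumulated_activity[16, 16, 16] = 1.0e6           # a single hot voxel, MBq*s

kernel = np.zeros((5, 5, 5))                     # placeholder voxel S-values, Gy/(MBq*s)
kernel[2, 2, 2] = 8e-5                           # self-dose term
kernel[1:4, 1:4, 1:4] += 2e-6                    # crude short-range spill to neighbours

dose_map = convolve(cumulated_activity, kernel, mode="constant")
print(dose_map.max())                            # dose at the hot voxel, Gy
```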
Subject(s)
Radiotherapy Dosage, Single-Photon Emission Computed Tomography/methods, Humans, Computer-Assisted Image Processing, Three-Dimensional Imaging, Iodine Radioisotopes, Monte Carlo Method, X-Ray Computed Tomography
ABSTRACT
For targeted radionuclide therapy, the level of activity to be administered is often determined from whole-body dosimetry performed on a pre-therapy tracer study. The largest potential source of error in this method arises from inconsistent or inaccurate activity retention measurements. The main aim of this study was to develop a simple method to quantify the uncertainty in the absorbed dose due to these inaccuracies. A secondary aim was to assess the effect of error propagation from the results of the tracer study to predictive absorbed dose estimates for the therapy as a result of using different radionuclides for each. Standard error analysis was applied to the MIRD schema for absorbed dose calculations. An equation was derived to describe the uncertainty in the absorbed dose estimate due solely to random errors in activity-time data, requiring only these data as input. Two illustrative examples are given. It is also shown that any errors present in the dosimetry calculations following the tracer study will propagate to errors in predictions made for the therapy study according to the ratio of the respective effective half-lives. If the therapy isotope has a much longer physical half-life than the tracer isotope (as is the case, for example, when using 123I as a tracer for 131I therapy), the propagation of errors can be significant. The equations derived provide a simple means to estimate two potentially large sources of error in whole-body absorbed dose calculations.
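As a hedged sketch of the kind of propagation described above (under a mono-exponential whole-body retention assumption, and not necessarily in the exact form derived in the paper): the MIRD absorbed dose scales with the cumulated activity, the relative uncertainties in the fitted retention parameters add in quadrature if uncorrelated, and, if the dominant uncertainty is the biological clearance rate measured in the tracer study (shared by tracer and therapy isotopes), the relative error in the therapy effective half-life exceeds that of the tracer by roughly the ratio of the effective half-lives.

```latex
% Hedged sketch under a mono-exponential whole-body retention model
% A(t) = A_0 e^{-\lambda_{\mathrm{eff}} t}; not necessarily the exact
% equations derived in the paper.
\begin{align}
  D &= \tilde{A}\,S,
  \qquad
  \tilde{A} = \int_0^\infty A_0\, e^{-\lambda_{\mathrm{eff}} t}\,\mathrm{d}t
            = \frac{A_0\, T_{\mathrm{eff}}}{\ln 2}, \\
  \left(\frac{\sigma_D}{D}\right)^{2}
    &\approx \left(\frac{\sigma_{A_0}}{A_0}\right)^{2}
           + \left(\frac{\sigma_{T_{\mathrm{eff}}}}{T_{\mathrm{eff}}}\right)^{2}, \\
  \frac{\sigma_{T_{\mathrm{eff}}^{\mathrm{ther}}}\big/\,T_{\mathrm{eff}}^{\mathrm{ther}}}
       {\sigma_{T_{\mathrm{eff}}^{\mathrm{trac}}}\big/\,T_{\mathrm{eff}}^{\mathrm{trac}}}
    &\approx \frac{T_{\mathrm{eff}}^{\mathrm{ther}}}{T_{\mathrm{eff}}^{\mathrm{trac}}}.
\end{align}
```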