Results 1 - 6 of 6
1.
Br J Ophthalmol ; 2024 Jun 05.
Article in English | MEDLINE | ID: mdl-38839251

ABSTRACT

BACKGROUND/AIMS: The aim of this study was to develop and evaluate digital ray, based on preoperative and postoperative image pairs using style transfer generative adversarial networks (GANs), to enhance cataractous fundus images for improved retinopathy detection. METHODS: For eligible cataract patients, preoperative and postoperative colour fundus photographs (CFP) and ultra-wide field (UWF) images were captured. Then, both the original CycleGAN and a modified CycleGAN (C2ycleGAN) framework were adopted for image generation and quantitatively compared using Fréchet Inception Distance (FID) and Kernel Inception Distance (KID). Additionally, CFP and UWF images from another cataract cohort were used to test model performances. Different panels of ophthalmologists evaluated the quality, authenticity and diagnostic efficacy of the generated images. RESULTS: A total of 959 CFP and 1009 UWF image pairs were included in model development. FID and KID indicated that images generated by C2ycleGAN presented significantly improved quality. Based on ophthalmologists' average ratings, the percentages of inadequate-quality images decreased from 32% to 18.8% for CFP, and from 18.7% to 14.7% for UWF. Only 24.8% and 13.8% of generated CFP and UWF images could be recognised as synthetic. The accuracy of retinopathy detection significantly increased from 78% to 91% for CFP and from 91% to 93% for UWF. For retinopathy subtype diagnosis, the accuracies also increased from 87%-94% to 91%-100% for CFP and from 87%-95% to 93%-97% for UWF. CONCLUSION: Digital ray could generate realistic postoperative CFP and UWF images with enhanced quality and accuracy for overall detection and subtype diagnosis of retinopathies, especially for CFP. TRIAL REGISTRATION NUMBER: This study was registered with ClinicalTrials.gov (NCT05491798).
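The FID metric used in this entry models each set of Inception-network activations as a multivariate Gaussian and measures the Fréchet distance between the two fits. A minimal sketch of that final computation, assuming the feature vectors have already been extracted; the `feats_real`/`feats_gen` arrays below are random placeholders, not study data:

```python
import numpy as np
from scipy import linalg

def frechet_distance(feats_a: np.ndarray, feats_b: np.ndarray) -> float:
    """FID between two feature sets:
    ||mu_a - mu_b||^2 + Tr(C_a + C_b - 2 (C_a C_b)^(1/2))."""
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    cov_a = np.cov(feats_a, rowvar=False)
    cov_b = np.cov(feats_b, rowvar=False)
    covmean = linalg.sqrtm(cov_a @ cov_b)   # matrix square root
    if np.iscomplexobj(covmean):            # discard tiny imaginary parts
        covmean = covmean.real
    diff = mu_a - mu_b
    return float(diff @ diff + np.trace(cov_a + cov_b - 2.0 * covmean))

rng = np.random.default_rng(0)
feats_real = rng.normal(0.0, 1.0, size=(500, 8))  # placeholder "real" activations
feats_gen = rng.normal(0.5, 1.0, size=(500, 8))   # placeholder "generated" activations
print(frechet_distance(feats_real, feats_real))   # identical sets -> ~0
print(frechet_distance(feats_real, feats_gen))    # shifted mean -> clearly positive
```

A lower FID means the generated-image feature distribution sits closer to the real one, which is how the study ranks C2ycleGAN against the original CycleGAN.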

2.
JAMA Ophthalmol ; 141(11): 1045-1051, 2023 Nov 01.
Article in English | MEDLINE | ID: mdl-37856107

ABSTRACT

Importance: Retinal diseases are the leading cause of irreversible blindness worldwide, and timely detection contributes to prevention of permanent vision loss, especially for patients in rural areas with limited medical resources. Deep learning systems (DLSs) based on fundus images with a 45° field of view have been extensively applied in population screening, while the feasibility of using ultra-widefield (UWF) fundus image-based DLSs to detect retinal lesions in patients in rural areas warrants exploration. Objective: To explore the performance of a DLS for multiple retinal lesion screening using UWF fundus images from patients in rural areas. Design, Setting, and Participants: In this diagnostic study, a previously developed DLS based on UWF fundus images was used to screen for 5 retinal lesions (retinal exudates or drusen, glaucomatous optic neuropathy, retinal hemorrhage, lattice degeneration or retinal breaks, and retinal detachment) in 24 villages of Yangxi County, China, between November 17, 2020, and March 30, 2021. Interventions: The captured images were analyzed by the DLS and ophthalmologists. Main Outcomes and Measures: The performance of the DLS in rural screening was compared with that of the internal validation in the previous model development stage. The image quality, lesion proportion, and complexity of lesion composition were compared between the model development stage and the rural screening stage. Results: A total of 6222 eyes in 3149 participants (1685 women [53.5%]; mean [SD] age, 70.9 [9.1] years) were screened. The DLS achieved a mean (SD) area under the receiver operating characteristic curve (AUC) of 0.918 (0.021) (95% CI, 0.892-0.944) for detecting 5 retinal lesions in the entire data set when applied for patients in rural areas, which was lower than that reported at the model development stage (AUC, 0.998 [0.002] [95% CI, 0.995-1.000]; P < .001). 
Compared with the fundus images in the model development stage, the fundus images in this rural screening study had an increased frequency of poor quality (13.8% [860 of 6222] vs 0%), increased variation in lesion proportions (0.1% [6 of 6222]-36.5% [2271 of 6222] vs 14.0% [2793 of 19 891]-21.3% [3433 of 16 138]), and an increased complexity of lesion composition. Conclusions and Relevance: This diagnostic study suggests that the DLS exhibited excellent performance using UWF fundus images as a screening tool for 5 retinal lesions in patients in a rural setting. However, poor image quality, diverse lesion proportions, and a complex set of lesions may have reduced the performance of the DLS; these factors in targeted screening scenarios should be taken into consideration in the model development stage to ensure good performance.


Subject(s)
Deep Learning, Retinal Diseases, Humans, Female, Aged, Sensitivity and Specificity, Fundus Oculi, Retina/diagnostic imaging, Retina/pathology, Retinal Diseases/diagnostic imaging, Retinal Diseases/pathology
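The AUC figures in this entry come from ROC analysis of the DLS's scores against ophthalmologist labels. A minimal rank-based (Mann-Whitney) AUC sketch; the labels and scores below are invented for illustration, not study data:

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: the probability that a randomly
    chosen positive case scores higher than a randomly chosen negative one
    (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.35, 0.4, 0.3, 0.1]
print(roc_auc(labels, scores))  # 8 of the 9 positive/negative pairs are ordered correctly
```

An AUC of 0.5 is chance and 1.0 is perfect separation, so the drop from 0.998 (internal validation) to 0.918 (rural screening) reported above is a meaningful loss of discrimination.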
3.
Nat Med ; 28(9): 1883-1892, 2022 09.
Article in English | MEDLINE | ID: mdl-36109638

ABSTRACT

The storage of facial images in medical records poses privacy risks due to the sensitive nature of the personal biometric information that can be extracted from such images. To minimize these risks, we developed a new technology, called the digital mask (DM), which is based on three-dimensional reconstruction and deep-learning algorithms to irreversibly erase identifiable features, while retaining disease-relevant features needed for diagnosis. In a prospective clinical study to evaluate the technology for diagnosis of ocular conditions, we found very high diagnostic consistency between the use of original and reconstructed facial videos (κ ≥ 0.845 for strabismus, ptosis and nystagmus, and κ = 0.801 for thyroid-associated orbitopathy) and comparable diagnostic accuracy (P ≥ 0.131 for all ocular conditions tested) was observed. Identity removal validation using multiple-choice questions showed that compared to image cropping, the DM could much more effectively remove identity attributes from facial images. We further confirmed the ability of the DM to evade recognition systems using artificial intelligence-powered re-identification algorithms. Moreover, use of the DM increased the willingness of patients with ocular conditions to provide their facial images as health information during medical treatment. These results indicate the potential of the DM algorithm to protect the privacy of patients' facial images in an era of rapid adoption of digital health technologies.


Subject(s)
Artificial Intelligence, Privacy, Algorithms, Confidentiality, Face, Humans, Prospective Studies
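The κ values in this entry are Cohen's kappa, a chance-corrected measure of agreement between diagnoses made from original versus reconstructed videos. A minimal sketch for two raters over the same cases; the example ratings are fabricated, not the study's:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    chance = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - chance) / (1 - chance)

a = ["ptosis", "ptosis", "normal", "normal", "ptosis", "normal"]
b = ["ptosis", "ptosis", "normal", "ptosis", "ptosis", "normal"]
print(cohens_kappa(a, b))  # agreement above chance, below perfect
```

Kappa is 1.0 for perfect agreement and 0 for chance-level agreement, so the reported κ ≥ 0.801 indicates the masked videos supported nearly the same diagnoses as the originals.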
4.
Eye (Lond) ; 36(8): 1681-1686, 2022 08.
Article in English | MEDLINE | ID: mdl-34345030

ABSTRACT

BACKGROUND: Retinal exudates and/or drusen (RED) can be signs of many fundus diseases that can lead to irreversible vision loss. Early detection and treatment of these diseases are critical for improving vision prognosis. However, manual RED screening on a large scale is time-consuming and labour-intensive. Here, we aim to develop and assess a deep learning system for automated detection of RED using ultra-widefield fundus (UWF) images. METHODS: A total of 26,409 UWF images from 14,994 subjects were used to develop and evaluate the deep learning system. The Zhongshan Ophthalmic Center (ZOC) dataset was selected to compare the performance of the system to that of retina specialists in RED detection. The saliency map visualization technique was used to understand which areas in the UWF image had the most influence on our deep learning system when detecting RED. RESULTS: The system for RED detection achieved areas under the receiver operating characteristic curve of 0.994 (95% confidence interval [CI]: 0.991-0.996), 0.972 (95% CI: 0.957-0.984), and 0.988 (95% CI: 0.983-0.992) in three independent datasets. The performance of the system in the ZOC dataset was comparable to that of an experienced retina specialist. Regions of RED were highlighted by saliency maps in UWF images. CONCLUSIONS: Our deep learning system is reliable in the automated detection of RED in UWF images. As a screening tool, our system may promote the early diagnosis and management of RED-related fundus diseases.


Subject(s)
Deep Learning, Retinal Drusen, Exudates and Transudates, Fundus Oculi, Humans, Retina/diagnostic imaging, Retinal Drusen/diagnosis
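Saliency maps like those in this entry highlight the image regions that most influence a model's output. One model-agnostic way to build them is occlusion: zero out each patch and record the score drop. A toy sketch with an invented stand-in scoring function (the real study used a deep network, not this):

```python
import numpy as np

def occlusion_saliency(image, score_fn, patch=2):
    """Score drop when each patch is zeroed; larger drop = more influential region."""
    base = score_fn(image)
    h, w = image.shape
    sal = np.zeros((h, w))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = 0.0
            sal[i:i + patch, j:j + patch] = base - score_fn(occluded)
    return sal

# Toy "lesion detector": total brightness of the lower-right quadrant.
def score_fn(img):
    return float(img[4:, 4:].sum())

image = np.ones((8, 8))
sal = occlusion_saliency(image, score_fn)
# Only lower-right patches change the score when occluded, so only they light up.
```

Overlaying such a map on the UWF image is what lets the authors check that the system attends to actual RED regions rather than artefacts.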
5.
Asia Pac J Ophthalmol (Phila) ; 10(3): 234-243, 2021 Jul 01.
Article in English | MEDLINE | ID: mdl-34224468

ABSTRACT

ABSTRACT: Teleophthalmology, a subfield of telemedicine, has recently been widely applied in ophthalmic disease management, accelerated by ubiquitous connectivity via mobile computing and communication applications. Teleophthalmology has strengths in overcoming geographic barriers and broadening access to medical resources, as a supplement to face-to-face clinical settings. The eye, especially the anterior segment, is one of the most extensively studied superficial parts of the human body. Therefore, ophthalmic images, easily captured by portable devices, have been widely applied in teleophthalmology, boosted by advancements in software and hardware in recent years. This review aims to survey current teleophthalmology applications in the anterior segment and other diseases from a temporal and spatial perspective, and to summarize common scenarios in teleophthalmology, including screening, diagnosis, treatment, monitoring, postoperative follow-up, and tele-education of patients and clinical practitioners. Further, challenges in the current application of teleophthalmology and its future development are discussed.


Subject(s)
Eye Diseases, Ophthalmology, Telemedicine, Eye, Eye Diseases/diagnosis, Eye Diseases/therapy, Humans, Mass Screening
6.
Invest Ophthalmol Vis Sci ; 62(2): 35, 2021 02 01.
Article in English | MEDLINE | ID: mdl-33620373

ABSTRACT

Purpose: To investigate environmental factors associated with corneal morphologic changes. Methods: A cross-sectional study was conducted, which enrolled adults of the Han ethnicity aged 18 to 44 years from 20 cities. The cornea-related morphology was measured using an ocular anterior segment analysis system. The geographic indexes of each city and meteorological indexes of daily city-level data from the past 40 years (1980-2019) were obtained. Correlation analyses at the city level and multilevel model analyses at the eye level were performed. Results: In total, 114,067 eyes were used for analysis. In the correlation analyses at the city level, the corneal thickness was positively correlated with the mean values of precipitation (highest r [correlation coefficient]: >0.700), temperature, and relative humidity (RH), as well as the amount of annual variation in precipitation (r: 0.548 to 0.721), and negatively correlated with the mean daily difference in the temperature (DIF T), duration of sunshine, and variance in RH (r: -0.694 to -0.495). In contrast, the anterior chamber (AC) volume was negatively correlated with the mean values of precipitation, temperature, RH, and the amount of annual variation in precipitation (r: -0.672 to -0.448), and positively associated with the mean DIF T (r = 0.570) and variance in temperature (r = 0.507). In total, 19,988 eyes were analyzed at the eye level. After adjusting for age, precipitation was the major explanatory factor among the environmental factors for the variability in corneal thickness and AC volume. Conclusions: Individuals who were raised in warm and wet environments had thicker corneas and smaller AC volumes than those from cold and dry ambient environments. Our findings demonstrate the role of local environmental factors in cornea-related morphology.


Subject(s)
Cornea/anatomy & histology, Corneal Diseases/diagnosis, Environmental Exposure, Adolescent, Adult, China/epidemiology, Corneal Diseases/epidemiology, Cross-Sectional Studies, Female, Humans, Incidence, Male, Young Adult
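The city-level analyses in this entry reduce to Pearson correlations between each city's long-run climate means and its mean ocular measurements. A minimal sketch on fabricated per-city values (not the study's data; the variable names are illustrative):

```python
import numpy as np

# Fabricated per-city means: annual precipitation (mm) and central corneal thickness (um).
precip = np.array([400.0, 800.0, 1200.0, 1600.0, 2000.0])
cct = np.array([530.0, 538.0, 545.0, 551.0, 560.0])

# Pearson correlation coefficient between the two city-level series.
r = np.corrcoef(precip, cct)[0, 1]
print(round(r, 3))  # strongly positive for this made-up, near-linear relationship
```

City-level correlations like this describe aggregate associations only; the study's eye-level multilevel models are what adjust for individual covariates such as age.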