Evaluation of the Content, Quality, and Readability of Patient Accessible Online Resources Regarding Cataracts.
Patel, Annika J; Kloosterboer, Amy; Yannuzzi, Nicolas A; Venkateswaran, Nandini; Sridhar, Jayanth.
Affiliation
  • Patel AJ; Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami Miller School of Medicine, Miami, Florida.
  • Kloosterboer A; Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami Miller School of Medicine, Miami, Florida.
  • Yannuzzi NA; Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami Miller School of Medicine, Miami, Florida.
  • Venkateswaran N; Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami Miller School of Medicine, Miami, Florida.
  • Sridhar J; Department of Ophthalmology, Bascom Palmer Eye Institute, University of Miami Miller School of Medicine, Miami, Florida.
Semin Ophthalmol ; 36(5-6): 384-391, 2021 Aug 18.
Article in En | MEDLINE | ID: mdl-33634726
ABSTRACT

PURPOSE:

To evaluate the content quality, accuracy, and readability of websites commonly visited by patients contemplating cataract surgery.

SETTING:

Freely available online information.

DESIGN:

Cross-sectional study.

METHODS:

Ten websites were evaluated for content and accuracy using a 40-question grading sheet, scored individually by three ophthalmologists. Quality was assessed against the JAMA benchmarks, and readability was assessed with an online readability tool, Readable.
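
The abstract does not describe the Readable tool's internal formulas, but the Flesch Reading Ease score it reports is a standard, published metric. Below is a minimal Python sketch of how such scores could be computed; the syllable-counting heuristic and the sample sentence are illustrative assumptions, not part of the study's methodology.

import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels.
    # Real readability tools use dictionaries or more careful rules.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    # Flesch Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) - 84.6 * (syllables / len(words))

def flesch_kincaid_grade(text):
    # Flesch-Kincaid Grade Level = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

sample = "A cataract is a clouding of the lens in the eye that affects vision."
print(round(flesch_reading_ease(sample), 1))
print(round(flesch_kincaid_grade(sample), 1))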

RESULTS:

Content and accuracy scores differed significantly among websites on a Kruskal-Wallis test (H = 22.623, P = .007). The average score across all websites on the grading sheet was 90.85 of 160 points, or 57% (SD 29.93, 95% CI ±17.69). There was no significant correlation between a website's rank on Google.com and its content quality (r = 0.049, P = .894). No website met all 4 JAMA benchmarks. There was no significant correlation between a website's content quality and the number of JAMA requirements it met (r = -0.563, P = .09). The average Flesch Reading Ease score across all websites was 52.64 (SD 11.94, 95% CI ±7.40), and the average Mean Reading Grade was 10.72 (SD 1.58, 95% CI ±0.98). Mean Reading Grade differed significantly among websites (H = 23.703, P = .005). There was no significant correlation between a website's content quality and its Mean Reading Grade (r = -0.552, P = .098).
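
The comparisons above rest on a Kruskal-Wallis H-test across websites and rank correlations; the abstract reports correlation coefficients without naming the test, so Spearman's rank correlation is assumed here as a plausible choice. The sketch below shows how such tests are run in Python with SciPy; all score and rank values are invented placeholders, not the study's data.

from scipy import stats

# Hypothetical per-reviewer content scores for three websites (placeholder values only).
site_a = [95, 102, 88]
site_b = [60, 71, 65]
site_c = [120, 115, 131]

# Kruskal-Wallis H-test: do the score distributions differ among websites?
h_stat, p_value = stats.kruskal(site_a, site_b, site_c)
print(f"H = {h_stat:.3f}, P = {p_value:.3f}")

# Spearman rank correlation: search-result rank vs. content score (placeholder values only).
google_rank = [1, 2, 3, 4, 5]
content_score = [110, 62, 95, 88, 70]
rho, p = stats.spearmanr(google_rank, content_score)
print(f"r = {rho:.3f}, P = {p:.3f}")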

CONCLUSION:

Commonly accessed online resources on cataracts and cataract surgery are insufficient to provide patients with a clear and complete understanding of their condition as well as available medical and surgical treatment options.

Full text: 1 Database: MEDLINE Main subject: Cataract / Comprehension Study type: Observational_studies / Prevalence_studies / Risk_factors_studies Limit: Humans Language: En Publication year: 2021 Document type: Article
