Results 1 - 8 of 8
1.
J Orthop Sci; 2023 Aug 18.
Article in English | MEDLINE | ID: mdl-37599135

ABSTRACT

BACKGROUND: This study aimed to quantify the readability and quality of online patient resources on knee osteoarthritis and lumbar spinal stenosis in Japan. METHODS: Three search engines (Google, Yahoo, and Bing) were searched for the terms "knee osteoarthritis" and "lumbar spinal stenosis". The first 30 websites of each search were screened. Duplicate websites and those unrelated to the searched diseases were excluded. The remaining 125 websites (62 on knee osteoarthritis, 63 on lumbar spinal stenosis) were analyzed. Text readability was assessed using two web-based programs (Obi-3 and Readability Research Lab) and lexical density. Website quality was evaluated using the DISCERN score, the Clear Communication Index, and the Journal of the American Medical Association benchmark criteria. RESULTS: Readability scores were high, indicating that the texts were difficult to understand. Only 24 (19%) and six (5%) websites were classified as having average readability according to Obi-3 and Readability Research Lab, respectively. The overall quality of information was low, with only four (3%) rated as having sufficient quality based on the Clear Communication Index and the Journal of the American Medical Association benchmark criteria. None of the websites satisfied the DISCERN quality criteria. CONCLUSIONS: Patient information on Japanese websites regarding knee osteoarthritis and lumbar spinal stenosis was difficult to understand. Moreover, the quality of the websites was insufficient. Orthopaedic surgeons should contribute to the creation of high-quality, easy-to-read websites to facilitate patient-physician communication.
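The lexical-density measure used above is straightforward to reproduce: it is the proportion of content words (nouns, verbs, adjectives, adverbs) among all words. The following is a minimal Python sketch under that definition; the stop-word list and example sentence are illustrative only, and the study's Japanese-specific tools (Obi-3, Readability Research Lab) are not reproduced here.

# Lexical density = content words / total words, expressed as a percentage.
# Illustrative sketch only: a real analysis would use a part-of-speech tagger
# (and Japanese-specific tooling for the websites in this study).

STOP_WORDS = {  # hypothetical function-word list; extend as needed
    "the", "a", "an", "and", "or", "but", "of", "to", "in", "on",
    "is", "are", "was", "were", "it", "this", "that", "with", "for",
}

def lexical_density(text: str) -> float:
    """Return the percentage of tokens that are content words."""
    tokens = [t.strip(".,;:!?()").lower() for t in text.split()]
    tokens = [t for t in tokens if t]
    if not tokens:
        return 0.0
    content = [t for t in tokens if t not in STOP_WORDS]
    return 100.0 * len(content) / len(tokens)

print(round(lexical_density("The knee joint is worn and the cartilage is thin."), 1))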

2.
J Neurol Surg B Skull Base; 83(Suppl 2): e54-e59, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35832957

ABSTRACT

Objectives This article evaluates the completeness and accuracy of YouTube videos related to endoscopic transsphenoidal surgery (ETS) as a source for patient information. Design YouTube was searched using relevant terms pertaining to ETS. Videos were evaluated independently by two physician reviewers experienced in ETS. Video demographics, including uploader source, were captured along with validity scores based on predetermined checklists. Setting Internet. Participants Not applicable. Main Outcome Measures A novel ETS scoring checklist, the modified DISCERN criteria, and the Journal of the American Medical Association (JAMA) benchmark score were used to measure completeness and accuracy of the videos. The video power index (VPI) was calculated to reflect popularity. The intraclass correlation coefficient was calculated to assess rater agreement. Results Seventy-nine videos were included in final scoring and analysis. The mean ETS, DISCERN, JAMA, and VPI scores across all included videos were 5.0 ± 2.7, 2.4 ± 0.83, 2.19 ± 0.62, and 8.92 ± 18.1, respectively. Based on the ETS score checklist, 31 (39%) of the videos were rated as poor, 30 (38%) were moderately useful, 17 (22%) were useful, and 1 (1%) was exceptional. There was a significant positive correlation between the ETS, DISCERN, and JAMA scores (p < 0.001), but no correlation between VPI and the validity scores. There were no significant differences in validity scores by uploader source. Conclusion YouTube videos related to ETS have limited usefulness and poor overall validity for patient information. Clinicians should direct patients to other validated sources of information and aim to improve the comprehensiveness of ETS-related videos.
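The video power index reported above is a popularity measure that the abstract does not define. One definition commonly used in studies of YouTube health content multiplies the like ratio by the view ratio; the hypothetical Python sketch below assumes that definition and should not be read as the formula this study used.

# Hypothetical video power index (VPI) sketch. Assumed definition:
#   VPI = like ratio * view ratio / 100, where
#   like ratio = likes / (likes + dislikes) * 100
#   view ratio = views / days since upload

def video_power_index(views: int, likes: int, dislikes: int, days_online: int) -> float:
    like_ratio = 100.0 * likes / (likes + dislikes) if (likes + dislikes) else 0.0
    view_ratio = views / days_online if days_online else 0.0
    return like_ratio * view_ratio / 100.0

# Example: 12,000 views, 150 likes, 10 dislikes, online for 400 days
print(round(video_power_index(12_000, 150, 10, 400), 2))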

3.
J Child Orthop; 15(3): 291-297, 2021 Jun 01.
Article in English | MEDLINE | ID: mdl-34211606

ABSTRACT

PURPOSE: To determine the availability and readability of online patient information (OPI) provided by paediatric hospitals in the United States using clubfoot as a model condition. METHODS: The websites of the top 95 paediatric hospitals identified using US News & World Report were included. The names of paediatric hospitals and the terms "clubfoot", "clubfeet" and "talipes equinovarus" were entered into the Google search engine. Readability was assessed using five validated metrics and the composite grade level (CGL). The number of unpaid monthly visits was calculated with the Ahrefs Organic Traffic Score (OTS) tool. Data for paediatric hospitals were compared with the same metrics for the top ten Google search results. RESULTS: Of 95 paediatric hospitals, 29 (30.5%) did not have at least one web page dedicated to clubfoot. The 128 web pages representing 66 paediatric hospitals had an average CGL of 9.4, representing a readability level requiring some high school education. The mean OTS for all paediatric hospitals was 116 estimated visits per month, which was significantly less than that for the top ten Google clubfoot search results (3035.1; p < 0.0001). CONCLUSION: Paediatric hospital web pages on clubfoot were visited much less frequently than those from the top ten Google search results. Only two web pages (1.6%) from paediatric hospitals offered OPI on clubfoot that met the American Medical Association recommended reading level (sixth-grade level). Paediatric hospitals should create OPI on clubfoot with appropriate readability and accessibility for patient families. LEVEL OF EVIDENCE: N/A.
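The composite grade level (CGL) is typically the mean of several grade-level readability formulas. The abstract does not name the five metrics it used, so the set in the Python sketch below is an assumption; in practice, the word, sentence, syllable, and character counts would come from a text-analysis library rather than being supplied by hand.

import math

# Sketch of a composite grade level (CGL) as the mean of standard grade-level
# formulas. "complex_words" means words of three or more syllables; "letters"
# stands in for the character count in ARI and Coleman-Liau.

def flesch_kincaid(words, sentences, syllables):
    return 0.39 * words / sentences + 11.8 * syllables / words - 15.59

def gunning_fog(words, sentences, complex_words):
    return 0.4 * (words / sentences + 100.0 * complex_words / words)

def smog(sentences, polysyllables):
    return 1.043 * math.sqrt(polysyllables * 30.0 / sentences) + 3.1291

def coleman_liau(words, sentences, letters):
    L = 100.0 * letters / words       # letters per 100 words
    S = 100.0 * sentences / words     # sentences per 100 words
    return 0.0588 * L - 0.296 * S - 15.8

def ari(words, sentences, characters):
    return 4.71 * characters / words + 0.5 * words / sentences - 21.43

def composite_grade_level(words, sentences, syllables, complex_words, letters):
    grades = [
        flesch_kincaid(words, sentences, syllables),
        gunning_fog(words, sentences, complex_words),
        smog(sentences, complex_words),
        coleman_liau(words, sentences, letters),
        ari(words, sentences, letters),
    ]
    return sum(grades) / len(grades)

# Example counts for a ~300-word patient-information page (illustrative only)
print(round(composite_grade_level(words=300, sentences=18, syllables=510,
                                  complex_words=45, letters=1500), 1))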

4.
Eur J Pediatr; 179(12): 1881-1891, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32894353

ABSTRACT

An increasing number of individuals use the Internet to obtain health information. However, online health information is unregulated and highly variable. We aimed to assess the readability, understandability, and quality of online information available for "chest pain in children." This analysis was performed in January 2020 by entering the search term "chest pain in children" into Google. The 180 search results were evaluated and categorized. Readability was assessed using the Flesch reading ease score, the Gunning FOG readability score, the Flesch-Kincaid grade level, the Coleman-Liau score, the Simple Measure of Gobbledygook readability score, the Fry readability score, and the automated readability index (ARI). Quality was assessed through the Journal of the American Medical Association (JAMA) benchmark criteria. Understandability was evaluated with the Patient Education Materials Assessment Tool (PEMAT). Sixty-five websites were analyzed: academic and hospital websites (n = 30) and physician and health information websites (n = 35). Among all websites, the average reading grade level was 9.99. There was no statistically significant difference between the two groups in average readability level (p = 0.645). The mean PEMAT score for all websites was 65.09%. There was no statistically significant difference between the two groups in average PEMAT score (p = 0.945). For both groups, the understandability score was below 70%. The average JAMA benchmark score was 2.43 ± 1.06, with a statistically significant difference between the academic and hospital websites (2.07 ± 0.91) and the physician and health information websites (2.74 ± 1.09; p = 0.009). Conclusion: The readability of online materials available for patients regarding "chest pain in children" was significantly higher than the sixth-grade level recommended by the National Institutes of Health. Current online health information related to pediatric chest pain may be too difficult for the average reader. Quality and understandability were poor in both groups. Improving the readability, understandability, and quality of pediatric health-related online materials has the potential to reduce parental anxiety, improve baseline medical knowledge, and even enhance the physician-parent alliance.
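For reference, the PEMAT yields the percentage of applicable items rated "agree" (with 70% often used as the threshold for adequate understandability), and the JAMA benchmark counts how many of four criteria (authorship, attribution, disclosure, and currency) a page meets. A brief Python sketch under those definitions follows; the example ratings are invented.

# PEMAT percentage and JAMA benchmark count. Items rated "not applicable"
# are excluded from the PEMAT denominator.

def pemat_score(ratings: list[str]) -> float:
    """ratings: 'agree', 'disagree', or 'na' for each PEMAT item."""
    applicable = [r for r in ratings if r != "na"]
    if not applicable:
        return 0.0
    return 100.0 * applicable.count("agree") / len(applicable)

def jama_benchmark(authorship: bool, attribution: bool,
                   disclosure: bool, currency: bool) -> int:
    """Count of the four JAMA criteria met (0-4)."""
    return sum([authorship, attribution, disclosure, currency])

ratings = ["agree", "agree", "disagree", "na", "agree", "disagree"]
print(pemat_score(ratings))                      # 60.0 -> below the 70% threshold
print(jama_benchmark(True, True, False, False))  # 2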


Subjects
Chest Pain; Comprehension; Parents; Child; Humans; Internet; Patient Education as Topic; United States
5.
Cardiol Young; 30(3): 328-336, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31875800

ABSTRACT

OBJECTIVE: Murmurs are abnormal audible heart sounds produced by turbulent blood flow, and a murmur in a child may be a source of anxiety for family members. Families often use online materials to explore possible reasons for these murmurs, given the accessibility of information on the Internet. In this study, we evaluated the quality, understandability, readability, and popularity of online materials about heart murmurs. METHODS: An Internet search was performed for "heart murmur" using the Google search engine. The global quality score (on a scale of 1 to 5, corresponding to poor to excellent quality) and the Health on the Net code were used to measure the quality of information presented. The understandability of the web pages identified was measured using the Patient Education Materials Assessment Tool (score range from 0 to 100%; scores below 70% reflect poor performance). The readability of each web page was assessed using four validated indices: the Flesch Reading Ease Score, the Flesch-Kincaid Grade Level, the Gunning Frequency of Gobbledygook, and the Simple Measure of Gobbledygook. The Alexa traffic tool was used to assess domain popularity and visibility. RESULTS: We identified 230 English-language patient educational materials that discussed heart murmurs. After exclusion, a total of 86 web pages were evaluated for this study. The average global quality score was 4.34 (SD = 0.71; range from 3 to 5), indicating that the quality of information on most websites was good. Only 14 (16.3%) websites had Health on the Net certification. The mean understandability score for all Internet-based patient educational materials was 74.6% (SD = 12.8%; range from 31.2 to 93.7%), a score suggesting that these materials were "easy to understand". The mean readability levels of all patient educational materials were higher than the recommended sixth-grade reading level according to all indices applied, indicating that the texts are difficult to read. The average grade level for all web pages was 10.4 ± 1.65 (range from 7.53 to 14.13). The Flesch-Kincaid Grade Level was 10 ± 1.81, the Gunning Frequency of Gobbledygook level was 12.1 ± 1.85, and the Simple Measure of Gobbledygook level was 9.1 ± 1.38. The average Flesch Reading Ease Score was 55 ± 9.1 (range from 32.4 to 72.9). CONCLUSION: We demonstrated that web pages describing heart murmurs were understandable and of high quality. However, the readability level of the websites was above the recommended sixth-grade reading level. The readability of written materials from online sources needs to be improved. At the same time, care must be taken to ensure that the information on web pages remains high quality and understandable.
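The Flesch Reading Ease Score cited above is computed from sentence length and syllable density; higher scores indicate easier text, with roughly 60 to 70 corresponding to plain English. A short Python sketch follows; the example counts are illustrative and not taken from the study.

# Flesch Reading Ease Score (FRES). Counts would normally come from a
# text-analysis library rather than being supplied by hand.

def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

# Example counts for a ~250-word page on heart murmurs (illustrative only)
print(round(flesch_reading_ease(words=250, sentences=14, syllables=420), 1))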


Subjects
Health Literacy/standards; Heart Murmurs; Internet/standards; Patient Education as Topic/standards; Reading; Health Literacy/methods; Humans; Patient Education as Topic/methods; Teaching Materials/standards
6.
J Voice; 34(2): 302.e15-302.e20, 2020 Mar.
Article in English | MEDLINE | ID: mdl-30241922

ABSTRACT

OBJECTIVE: Vocal fold nodules are benign vocal fold lesions that can adversely affect quality of life. Differential diagnosis and treatment modalities for this condition vary, and patients often turn to online materials to learn more. Access to information via the Internet is easy; however, sources must be chosen carefully because false or biased information may lead patients to inappropriate decisions. In this study, we evaluated the quality, readability, and understandability of online materials for vocal fold nodules. METHODS: An Internet search was performed for "vocal fold nodule," "vocal fold nodule treatment," and "voice therapy for vocal fold nodule" using the Google search engine. The readability of each website was evaluated using www.readable.io. Understandability and actionability were measured using the Patient Education Materials Assessment Tool (PEMAT). Finally, the DISCERN instrument was used to measure the quality of information presented. RESULTS: After exclusion, a total of 26 web pages were evaluated. Four web pages were graded as level A, 5 as level B, 11 as level C, and 5 as level D for language use. The average grade level across all web pages was 11.14 ± 1.75. The overall understandability score was 59.0 ± 12.1 (range 26.7 to 77.1), and the overall quality score was 34.95 ± 6.58 (range 26.5 to 53.75). CONCLUSION: The quality, readability, and understandability of these written materials are very low; for patients to read and learn from online sources, their content should be revised.
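The DISCERN instrument consists of 16 questions, each rated from 1 (criterion not met) to 5 (fully met), so summed totals range from 16 to 80; the abstract does not state how scores were aggregated across raters. A minimal Python sketch with invented ratings:

# DISCERN scoring sketch: 16 questions, each rated 1 to 5, summed to a total
# between 16 and 80. The example ratings below are invented.

def discern_total(ratings: list[int]) -> int:
    if len(ratings) != 16 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("DISCERN expects 16 ratings between 1 and 5")
    return sum(ratings)

ratings = [3, 2, 4, 2, 1, 2, 3, 2, 2, 3, 2, 1, 2, 2, 2, 2]
print(discern_total(ratings))  # 35 -> in line with the low scores reported above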


Subjects
Comprehension; Health Knowledge, Attitudes, Practice; Health Literacy; Internet; Laryngeal Diseases; Patient Education as Topic; Vocal Cords; Humans; Laryngeal Diseases/diagnosis; Laryngeal Diseases/physiopathology; Laryngeal Diseases/therapy; Teaching Materials; Vocal Cords/physiopathology
7.
Appl Radiat Oncol; 7(2): 26-30, 2018.
Article in English | MEDLINE | ID: mdl-34169120

ABSTRACT

OBJECTIVE: Patients are turning to the Internet more often for cancer-related information. Oncology organizations need to ensure that appropriately written information is available for patients online. The aim of this study was to determine whether the readability of radiation oncology online patient information (OPI) provided by RTAnswers (RTAnswers.org, created by the American Society for Radiation Oncology) is written at a sixth-grade level as recommended by the National Institutes of Health (NIH), the U.S. Department of Health and Human Services (HHS), and the American Medical Association (AMA). METHODS: RTAnswers.org was accessed and online patient-oriented brochures for 13 specific disease sites were analyzed. Readability of OPI from RTAnswers was assessed using 10 common readability tests: New Dale-Chall Test, Flesch Reading Ease Score, Coleman-Liau Index, Flesch-Kincaid Grade Level, FORCAST test, Fry Score, Simple Measure of Gobbledygook, Gunning Frequency of Gobbledygook, New Fog Count, and Raygor Readability Estimate. RESULTS: A composite grade level of readability was constructed using the 8 readability measures that provide a single grade-level output. The grade levels computed by each of these 8 tests were highly correlated (α = 0.98). The composite grade level for these disease site-specific brochures was 11.6 ± 0.83, corresponding to a senior in high school, significantly higher than the target sixth-grade level (p < 0.05) recommended by the NIH, HHS, and AMA. CONCLUSION: Patient educational material provided by RTAnswers.org is written significantly above the target reading level. Simplifying and rewording this information could improve patients' understanding of radiation therapy and improve treatment adherence and outcomes.

8.
Int J Surg; 12(3): 205-8, 2014.
Article in English | MEDLINE | ID: mdl-24380751

ABSTRACT

INTRODUCTION: Controversy exists regarding carotid endarterectomy (CEA) versus carotid artery stenting (CAS). We aimed to assess the quality of online patient information relating to both procedures. METHODS: The Google search engine was searched for "carotid endarterectomy" and "carotid stenting". The first 50 webpages returned were assessed. The Gunning Fog Index (GFI) and Flesch Reading Ease Score (FRES) were calculated to assess readability. The LIDA tool (Minervation Ltd., Oxford, U.K.) was used to assess accessibility, usability, and reliability. RESULTS: 20% (n = 10) of the webpages returned for CEA were from peer-reviewed sources, with 34% (n = 17) posted by hospitals or health services. Comparatively, for CAS, 40% (n = 20) were peer reviewed, with 16% (n = 8) posted by hospitals or health services. GFI and FRES scores indicated that webpages for both CEA and CAS had poor general readability. Webpages for CEA were easier to read than those for CAS (mean FRES difference 6.7; 95% CI 0.51 to 12.93; p = 0.03). Median LIDA scores demonstrated acceptable reliability, accessibility, and usability of information for both CEA and CAS webpages. The more readable webpages were not associated with higher LIDA scores for either CEA or CAS. CONCLUSION: Webpages providing information on carotid disease management must be made more readable. Online information currently available to patients regarding CAS is more difficult to read and comprehend than that for CEA.


Subjects
Carotid Stenosis; Consumer Health Information/standards; Carotid Endarterectomy; Internet; Search Engine; Comprehension; Consumer Health Information/methods; Humans; Stents