1.
BMC Med Inform Decis Mak; 19(1): 4, 2019 Jan 9.
Article in English | MEDLINE | ID: mdl-30626390

ABSTRACT

BACKGROUND: New Specific Application Domain (SAD) heuristics or design principles are being developed to guide the design and evaluation of mobile applications in a bid to improve their usability. This is because the existing heuristics are rather generic and are often unable to reveal a large number of mobile usability issues related to mobile-specific interfaces and characteristics. Mobile Electronic Data Capturing Forms (MEDCFs) are one such application, used to collect health data particularly in hard-to-reach areas, but with a number of usability challenges especially when used in rural areas by semiliterate users. Existing SAD design principles are often not used to evaluate mobile forms because their focus on features specific to data capture is minimal. In addition, some of these lists are extremely long, rendering them difficult to use during the design and development of mobile forms. The main aim of this study was therefore to generate a usability evaluation checklist that can be used to design and evaluate Mobile Electronic Data Capturing Forms in a bid to improve their usability. We also sought to compare the novice and expert developers' views regarding usability criteria. METHODS: We conducted a literature review in August 2016 using keywords on articles and gray literature; articles focusing on heuristics for mobile applications, user interface designs of mobile devices, and web forms were eligible for review. The databases included the ACM Digital Library, IEEE Xplore and Google Scholar. We had a total of 242 papers after removing duplicates, and a total of 10 articles which met the criteria were finally reviewed. This review resulted in an initial usability evaluation checklist consisting of 125 questions that could be adopted for designing MEDCFs.
The questions that handled the five main categories in data capture, namely form content, form layout, input type, error handling and form submission, were considered. A validation study was conducted with both novice and expert developers using a validation tool in a bid to refine the checklist, based on 5 criteria. The criteria for the validation included utility, clarity, question naming, categorization and measurability, with utility and measurability carrying the greatest weight. We then determined, for each of the expert and novice developers, the proportion of participants who agreed (scored 4 or 5), disagreed (scored 1 or 2) or were neutral (scored 3) on a given criterion regarding a particular question. Finally, we selected questions that had an average of 85% agreement (scored 4 or 5) across all the 5 criteria by both novice and expert developers. 'Agreement' stands for capturing the same views or sentiments about the perceived likeness of an evaluation question. RESULTS: The validation study reduced the initial 125 usability evaluation questions to 30 evaluation questions, with the form layout category having the majority of the questions. Results from the validation showed higher levels of affirmativeness from the expert developers compared to the novice developers across the different criteria; however, the general trend of agreement on the relevance of usability questions was similar across all the criteria for both sets of developers. The evaluation questions being validated were found to be useful, clear, and properly named and categorized; however, the measurability of the questions was found unsatisfactory by both sets of developers.
The developers attached great importance to the use of appropriate language and to the visibility of the help function; in addition, expert developers felt that the indication of mandatory and optional fields, coupled with the use of device information such as the Global Positioning System (GPS), was equally important. For both sets of developers, utility had the highest scores while measurability scored lowest. CONCLUSION: The generated checklist indicated the design features the software developers found necessary to improve the usability of mobile electronic data collection tools. In the future, we therefore propose to test the effectiveness of the measure for suitability and performance based on this generated checklist, and to test it with the end users (data collectors) with the purpose of eliciting their design requirements. Continuous testing with the end users will help refine the checklist to include only what is most important in improving the data collectors' experience.
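The question-selection rule described in the abstract (retain a question only if, averaged across the 5 validation criteria, at least 85% of both novice and expert developers scored it 4 or 5) can be sketched as follows. This is an illustrative reconstruction: the function names, data shapes, and sample ratings are assumptions, not material from the study.

```python
# Hypothetical sketch of the 85%-agreement selection rule.
# A rating of 4 or 5 counts as agreement; a question is retained only if
# every developer group's mean agreement across criteria meets the threshold.

def agreement(scores):
    """Proportion of ratings that signal agreement (a score of 4 or 5)."""
    return sum(1 for s in scores if s >= 4) / len(scores)

def retain_question(ratings_by_group, threshold=0.85):
    """ratings_by_group maps group name -> {criterion: [scores 1..5]}.
    Returns True if each group's mean agreement across the criteria
    is at least the threshold."""
    for criteria in ratings_by_group.values():
        mean_agreement = sum(agreement(v) for v in criteria.values()) / len(criteria)
        if mean_agreement < threshold:
            return False
    return True

# Illustrative ratings for one evaluation question (4 raters per group).
ratings = {
    "novice": {"utility": [5, 4, 5, 4], "clarity": [4, 5, 4, 5],
               "naming": [5, 5, 4, 4], "categorization": [4, 4, 5, 5],
               "measurability": [5, 4, 4, 5]},
    "expert": {"utility": [5, 5, 4, 4], "clarity": [4, 4, 5, 5],
               "naming": [5, 4, 5, 4], "categorization": [4, 5, 4, 5],
               "measurability": [4, 5, 5, 4]},
}
print(retain_question(ratings))  # every rating is 4 or 5 -> True
```

Applying such a rule per question to the ratings of both groups would reduce the candidate list, as the study's cut from 125 to 30 questions illustrates.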


Subject(s)
Checklist/standards , Geographic Information Systems , Medical Informatics Applications , Mobile Applications , Software Validation , Heuristics , Humans , Software Design
2.
JMIR Hum Factors ; 6(1): e11852, 2019 Mar 22.
Article in English | MEDLINE | ID: mdl-30900995

ABSTRACT

BACKGROUND: Mobile data collection systems are often difficult to use for nontechnical or novice users. This can be attributed to the fact that developers of such tools do not adequately involve end users in the design and development of product features and functions, which often creates interaction challenges. OBJECTIVE: The main objective of this study was to assess the guidelines for form design using high-fidelity prototypes developed based on end-user preferences. We also sought to investigate the association between the results from the System Usability Scale (SUS) and those from the Study Tailored Evaluation Questionnaire (STEQ) after the evaluation. In addition, we sought to recommend some practical guidelines for implementing the group testing approach, particularly in low-resource settings, during mobile form design. METHODS: We developed a Web-based high-fidelity prototype using Axure RP 8. A total of 30 research assistants (RAs) evaluated this prototype in March 2018 by completing the given tasks during 1 common session. The STEQ, comprising 13 affirmative statements, and the commonly used and validated SUS were administered to evaluate the usability and user experience after interaction with the prototype. The STEQ responses were summarized using frequencies in an Excel sheet, while the SUS scores were calculated based on whether each statement was positive (user selection minus 1) or negative (5 minus user selection). These contributions were summed and multiplied by 2.5 to give the overall form usability score from each participant. RESULTS: Of the RAs, 80% (24/30) appreciated the form progress indication, found the form navigation easy, and were satisfied with the error messages. The results gave an average SUS score of 70.4 (SD 11.7), which is above the recommended average SUS score of 68, meaning that the usability of the prototype was above average.
The scores from the STEQ, on the other hand, indicated a 70% (21/30) level of agreement with the affirmative evaluation statements. The results from the 2 instruments indicated a fair level of user satisfaction and a strong positive association, as shown by a Pearson correlation of .623 (P<.01). CONCLUSIONS: A high-fidelity prototype was used to give the users experience with a product they would likely use in their work. Group testing was done because of the scarcity of resources, such as the costs and time involved, especially in low-income countries. If embraced, this approach could help assess the needs of diverse user groups. With proper preparation and the right infrastructure at an affordable cost, usability testing could lead to the development of highly usable forms. The study thus makes recommendations on practical guidelines for implementing the group testing approach, particularly in low-resource settings, during mobile form design.
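The SUS scoring rule summarized in the methods (positive items contribute the response minus 1, negative items contribute 5 minus the response, and the sum is multiplied by 2.5) can be sketched as below. The standard SUS alternates positive and negative items; the function name and sample responses are illustrative assumptions, not study data.

```python
# Minimal sketch of standard SUS scoring, assuming the usual 10-item scale
# where odd-numbered items are positively worded and even-numbered items
# are negatively worded. Yields a score between 0 and 100.

def sus_score(responses):
    """responses: ten Likert ratings (1-5), item 1 first."""
    total = 0
    for i, r in enumerate(responses, start=1):
        # Positive item: (response - 1); negative item: (5 - response).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Most-favourable answers (5 on positive items, 1 on negative) score 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

Averaging such per-participant scores across all 30 RAs would give an overall figure like the study's mean of 70.4, which is then compared against the commonly cited benchmark of 68.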

3.
Stud Health Technol Inform ; 251: 93-96, 2018.
Article in English | MEDLINE | ID: mdl-29968610

ABSTRACT

Data collectors collect health data using Mobile Electronic Data Collection Forms (MEDCFs), particularly in hard-to-reach areas. However, the usability and user acceptance of these forms by the data collectors are seldom considered, yet these have an implication on the quality of the data collected and on the health decisions based thereon. In this study we aimed to collect the design preferences the data collectors felt would improve their data collection experience. For that purpose, a mid-fidelity prototype was used to accomplish six tasks, and a semi-structured usability questionnaire was administered. Forty-eight data collectors from Uganda participated in the study between December 2017 and January 2018, indicating their preferences after interacting with the prototype. The results included detailed feedback regarding the presentation of the forms' content, form navigation, error handling, data input, and visualization of progress status. Involving users and other stakeholders in the design of MEDCFs through User Centred Design (UCD) will presumably enhance the usability of the data collection forms.


Subject(s)
Electronic Data Processing , Surveys and Questionnaires , User-Computer Interface , Feedback , Uganda
4.
Stud Health Technol Inform ; 238: 72-75, 2017.
Article in English | MEDLINE | ID: mdl-28679890

ABSTRACT

Mobile Electronic Data Collection Tools (MEDCTs) are created by form developers to collect data. Usability, being one of the top quality attributes, is of great concern to developers of any interactive application. However, little is known about form developers' understanding of usability, how they measure usability, and their limitations in designing for usability. We conducted an empirical study aimed at getting developers' views on usability by interviewing 8 form developers, i.e., creators of forms used for data collection. We found that the developers knew about usability, but it was not their main focus during form development. Challenges included tight deadlines, software limitations, and insufficient communication with field users to establish usability needs. Furthermore, the methods used to evaluate the usability of created forms varied among developers and included in-house evaluations and feedback from piloting sessions with end users.


Subject(s)
Communication , Data Collection , Software , Mobile Applications