Results 1 - 2 of 2
1.
Hum Factors; 55(1): 157-82, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23516800

ABSTRACT

OBJECTIVE: The goal of this project is to evaluate a new auditory cue, which the authors call spearcons, in comparison to other auditory cues, with the aim of improving auditory menu navigation. BACKGROUND: With the shrinking displays of mobile devices and increasing technology use by visually impaired users, it becomes important to improve the usability of non-GUI (graphical user interface) interfaces such as auditory menus. Using nonspeech sounds called auditory icons (i.e., representative real sounds of objects or events) or earcons (i.e., brief musical melody patterns) has been proposed to enhance menu navigation. To compensate for the weaknesses of traditional nonspeech auditory cues, the authors developed spearcons by speeding up a spoken phrase, even to the point where it is no longer recognized as speech. METHOD: The authors conducted five empirical experiments. In Experiments 1 and 2, they measured menu navigation efficiency and accuracy across cues. In Experiments 3 and 4, they evaluated the learning rate of the cues and of speech itself. In Experiment 5, they assessed spearcon enhancements compared to plain TTS (text-to-speech rendering of written menu items) in a two-dimensional auditory menu. RESULTS: Spearcons outperformed traditional and newer hybrid auditory cues in navigation efficiency, accuracy, and learning rate. Moreover, spearcons showed learnability comparable to that of normal speech and led to better performance than speech-only auditory cues in two-dimensional menu navigation. CONCLUSION: These results show that spearcons can be more effective than previous auditory cues in menu-based interfaces. APPLICATION: Spearcons have broadened the taxonomy of nonspeech auditory cues. Users can benefit from the application of spearcons in real devices.
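The core idea described above, speeding up a spoken phrase until it is barely recognizable as speech, can be sketched with a simple time-compression by resampling. Everything here is an illustrative placeholder: a real spearcon would start from TTS output rather than a sine tone, the 2.5x factor is arbitrary, and plain resampling also raises pitch (pitch-preserving time-scale modification, e.g. phase vocoding, is a common alternative).

```python
import numpy as np

def compress_audio(samples: np.ndarray, factor: float) -> np.ndarray:
    """Time-compress a waveform by a given factor via linear-interpolation
    resampling. Note: naive resampling also raises pitch proportionally."""
    n_out = max(1, int(len(samples) / factor))
    # Positions in the original signal that the output samples map onto.
    old_idx = np.linspace(0, len(samples) - 1, num=n_out)
    return np.interp(old_idx, np.arange(len(samples)), samples)

# Hypothetical stand-in for a TTS waveform: a 1-second, 440 Hz tone.
sr = 16000
t = np.arange(sr) / sr
speech = np.sin(2 * np.pi * 440 * t)

spearcon = compress_audio(speech, factor=2.5)  # ~2.5x shorter than the input
```

The compressed signal keeps the overall envelope of the original, which is why heavily sped-up speech can remain identifiable as a cue even when it is no longer intelligible as words.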


Subjects
Acoustic Stimulation/methods, Auditory Perception, Cell Phone/trends, Computers, Handheld/trends, User-Computer Interface, Adolescent, Analysis of Variance, Cell Phone/instrumentation, Cues, Data Display, Female, Humans, Male, Sound, Speech, Young Adult
2.
Acta Psychol (Amst); 137(3): 309-17, 2011 Jul.
Article in English | MEDLINE | ID: mdl-21504835

ABSTRACT

A mental scanning paradigm was used to examine the representation of nonspeech sounds in working memory. Participants encoded sonifications (nonspeech auditory representations of quantitative data) as verbal lists, visuospatial images, or auditory images. The number of tones and the overall frequency change in the sonifications were also manipulated to allow for different hypothesized patterns of reaction times across encoding strategies. Mental scanning times revealed different patterns of reaction times across encoding strategies, despite the fact that all internal representations were constructed from the same nonspeech sound stimuli. Scanning times for the verbal encoding strategy increased linearly with the number of items in the verbal representation. Scanning times for the visuospatial encoding strategy were generally slower and increased as the metric distance (derived metaphorically from frequency change) in the mental image increased. Scanning times for the auditory imagery strategy were faster and closest to the veridical durations of the original stimuli. Interestingly, the number of items traversed in scanning a representation significantly affected scanning times under all encoding strategies. The results suggest that nonspeech sounds can be flexibly represented, and that a universal per-item scanning cost persisted across encoding strategies. Implications for cognitive theory, the mental scanning paradigm, and practical applications are discussed.
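As a rough illustration of what a sonification of quantitative data looks like (not the stimuli used in this study): a minimal sketch that maps each data value linearly onto a tone frequency and concatenates the resulting tones. The frequency range, tone duration, and sample rate below are arbitrary choices for the example.

```python
import numpy as np

def sonify(values, f_min=220.0, f_max=880.0, tone_dur=0.25, sr=16000):
    """Map each data value linearly onto [f_min, f_max] Hz and render
    one sine tone per value, concatenated into a single waveform."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    tones = []
    for v in values:
        freq = f_min + (v - lo) / span * (f_max - f_min)
        t = np.arange(int(tone_dur * sr)) / sr
        tones.append(np.sin(2 * np.pi * freq * t))
    return np.concatenate(tones)

# Six data points become six tones; larger values sound higher in pitch.
audio = sonify([3, 1, 4, 1, 5, 9])
```

In a setup like the one described above, the number of tones and the overall frequency change are exactly the properties one would vary (length of `values` and the spread between its minimum and maximum).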


Subjects
Auditory Perception/physiology, Imagination/physiology, Memory, Short-Term/physiology, Acoustic Stimulation, Adolescent, Female, Humans, Male, Reaction Time/physiology, Young Adult