ABSTRACT
The association of color and sound aids human cognition through a synergistic effect known as intersensory facilitation. Although soft human-machine interfaces (HMIs) providing unisensory expression have been widely developed, achieving synchronized optic and acoustic expression in a single device system has been explored far less. This is because the two outputs rely on different operating principles and materials, so implementation has mainly been attempted through structural approaches. Here, a deformable sound display is developed that generates light in multiple colors together with loud sound at a low input voltage. The device is based on alternating-current electroluminescence (ACEL) covered with perovskite composite films. A sound wave is created by the polymer matrix of the ACEL, while simultaneously, various colors are produced by the perovskite films and the blue electroluminescence (EL) emitted from the phosphors in the ACEL. By patterning perovskite films of different colors onto the ACELs, the association of color and sound is successfully demonstrated in a piano keyboard and a wearable interactive device.
ABSTRACT
Typical handheld controllers for interaction in virtual reality (VR) have fixed shapes and sizes, regardless of what visual objects they represent. Resolving this crossmodal incongruence with a shape-changing interface is our long-term goal. In this paper, we seek a length perception model that considers the moment of inertia (MOI) and diameter of a handheld object, based on the concept of dynamic touch. Such a model serves as a basis for computational shape-changing algorithms. We carried out two perceptual experiments. In Experiment 1, we measured the perceived lengths of 24 physical objects with different MOIs and diameters and, from these data, obtained a length perception model for reproducing a desired perceived length with a handheld controller. In Experiment 2, we validated our model in a crossmodal matching scenario, where a visual rod was matched to a haptic rod in terms of perceived length. Our results contribute to understanding the relationship between the perceived length and the physical properties of a handheld object, and to designing shape-changing algorithms that render equivalent visual and haptic sensory cues for length perception in VR.
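The sketch below illustrates, in schematic form, how a length perception model of this kind could be fitted and then inverted to drive a shape-changing controller. It assumes a power-law relationship between perceived length, MOI, and diameter, a form commonly used in the dynamic-touch literature; the function names, coefficients, and data in the example are placeholders for illustration, not the model or values estimated in Experiment 1.

```python
# Minimal sketch (not the authors' fitted model): assume a power-law
# perception model  L_perceived = a * I^b * d^c,  where I is the moment of
# inertia about the grip (kg*m^2) and d is the grip diameter (m).
import numpy as np
from scipy.optimize import curve_fit, brentq


def perceived_length(I, d, a, b, c):
    """Power-law model: perceived length from MOI and diameter."""
    return a * np.power(I, b) * np.power(d, c)


def fit_model(I, d, L_measured):
    """Fit (a, b, c) to measured perceived lengths, e.g., from a set of rods."""
    def f(X, a, b, c):
        I_, d_ = X
        return perceived_length(I_, d_, a, b, c)
    (a, b, c), _ = curve_fit(f, (I, d), L_measured, p0=(2.0, 0.33, 0.1))
    return a, b, c


def moi_for_target_length(L_target, d, a, b, c, I_range=(1e-4, 1e-1)):
    """Invert the model: find the MOI a controller of fixed diameter d
    should present so that it is perceived as L_target meters long."""
    g = lambda I: perceived_length(I, d, a, b, c) - L_target
    return brentq(g, *I_range)


if __name__ == "__main__":
    # Synthetic stand-in data for a set of 24 physical objects.
    rng = np.random.default_rng(0)
    I = rng.uniform(1e-3, 5e-2, 24)            # moments of inertia
    d = rng.choice([0.02, 0.03, 0.04], 24)     # grip diameters
    L = 2.0 * I**0.33 * d**0.1 * rng.normal(1.0, 0.05, 24)  # noisy "responses"

    a, b, c = fit_model(I, d, L)
    print("fitted coefficients:", a, b, c)
    print("MOI for a 0.5 m percept at d = 0.03 m:",
          moi_for_target_length(0.5, 0.03, a, b, c))
```

In a shape-changing interface, the inversion step would run online: given the length of the visual object to be rendered, the controller adjusts its mass distribution until its MOI matches the value returned by the model.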