ABSTRACT
Quasiperiodicity is a form of spatial order that has been observed in quasicrystalline matter but not in light. We construct a quasicrystalline surface on a light-emitting diode. Using a nanoscale waveguide as a near-field scanning optical microscope (NSOM), we directly image the light field at the surface of the diode. Here we show, using reciprocal-space representations of the images, that the light field is quasiperiodic, and we explain its structure through wave superposition. Periodic ordering is limited to at most six-fold symmetry; the light field instead exhibits 12-fold quasisymmetry, demonstrating order while ruling out periodicity. This shows that a new class, consisting of projections from hyperspace, exists in the taxonomy of light ordering.
ABSTRACT
Always-on, upper-body input from sensors such as accelerometers, infrared cameras, and electromyography holds promise for enabling accessible gesture input for people with upper-body motor impairments. When these sensors are distributed across the person's body, they can enable the use of varied body parts and gestures for device interaction. Personalized upper-body gestures that enable input from diverse body parts (including the head, neck, shoulders, arms, hands, and fingers) and match the abilities of each user could help ensure that gesture systems are accessible. In this work, we characterize the personalized gesture sets designed by 25 participants with upper-body motor impairments and develop design recommendations for upper-body personalized gesture interfaces. We found that the gesture sets participants designed were highly ability-specific. Even within a specific type of disability, there were significant differences in which muscles participants used to perform upper-body gestures, with some predominantly using shoulder and upper-arm muscles and others solely using their finger muscles. Eight percent of the gestures participants designed were performed with the head, neck, and shoulders rather than the hands and fingers, demonstrating the importance of tracking the whole upper body. To combat fatigue, participants performed 51% of gestures with their hands resting on, or barely lifting off of, their armrest, highlighting the importance of sensing mechanisms that are agnostic to the location and orientation of the body. Lastly, participants activated their muscles without visibly moving during 10% of the gestures, demonstrating the need for sensors that can detect muscle activation without movement. Both inertial measurement unit (IMU) and electromyography (EMG) wearable sensors proved promising for differentiating between personalized gestures.
Personalized gesture interfaces that take advantage of each person's abilities are critical for enabling accessible upper-body input for people with upper-body motor impairments.