ABSTRACT
BACKGROUND: Virtual reality (VR) has been used as a technological medium to deliver mirror therapy interventions to people after stroke in numerous applications, with promising results. The recent emergence of affordable, off-the-shelf head-mounted displays (such as the Oculus Rift or HTC Vive) has opened up the possibility of novel, cost-effective approaches to immersive mirror therapy interventions. We have developed one such system, ART-VR, which allows people after stroke to carry out a clinically validated mirror therapy protocol in an immersive virtual environment and within a clinical setting. METHODS: A case cohort of 11 people with upper limb paresis following first-time stroke at an in-patient rehabilitation facility received three interventions over a one-week period. Participants carried out the BeST mirror therapy protocol using our immersive VR system as an adjunct therapy to their standard rehabilitation program. Our clinical feasibility study investigated intervention outcomes, virtual reality acceptance, and user experience. RESULTS: The results show that the combination of an immersive VR system and a mirror therapy protocol is feasible for clinical use. Nine of the 11 participants showed some improvement of their affected hand after the intervention. The vast majority of participants (9/11) reported experiencing some psycho-physical effects, such as tingling or paraesthesia, in the affected limb during the intervention. CONCLUSIONS: Our findings show that immersive VR-based mirror therapy is feasible and produces effects comparable to those of conventional mirror therapy. TRIAL REGISTRATION: The trial was registered retrospectively with the ISRCTN Registry (ISRCTN34011164) on December 3, 2021.
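At the core of a VR mirror-therapy setup of this kind, the tracked movement of the unaffected hand is reflected across the body's midline to drive a virtual representation of the affected hand. The abstract does not describe ART-VR's internals, so the following is only a minimal illustrative sketch, assuming a coordinate frame whose x-axis crosses the midsagittal plane; the function name and frame convention are assumptions.

```python
import numpy as np

def mirror_hand_pose(position, rotation_quat):
    """Reflect a tracked hand pose across the midsagittal (x = 0) plane.

    position      -- (x, y, z) of the unaffected hand in body space
    rotation_quat -- orientation as a quaternion (w, x, y, z)

    Returns the mirrored pose used to animate the virtual affected hand.
    Illustrative sketch only, not the ART-VR implementation.
    """
    x, y, z = position
    mirrored_position = np.array([-x, y, z])

    # Reflecting a rotation across the x = 0 plane negates the quaternion
    # components associated with rotations about the y and z axes.
    w, qx, qy, qz = rotation_quat
    mirrored_rotation = np.array([w, qx, -qy, -qz])

    return mirrored_position, mirrored_rotation

# Example: the unaffected hand 30 cm to the right of the midline maps to
# a virtual hand 30 cm to the left, with its orientation mirrored.
pos, rot = mirror_hand_pose((0.30, 1.10, 0.45), (1.0, 0.0, 0.0, 0.0))
print(pos, rot)
```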
Subjects
Stroke Rehabilitation, Stroke, Virtual Reality Exposure Therapy, Virtual Reality, Feasibility Studies, Humans, Mirror Movement Therapy, Retrospective Studies, Stroke/therapy, Stroke Rehabilitation/methods, Virtual Reality Exposure Therapy/methods
ABSTRACT
INTRODUCTION: Electronic nicotine delivery systems (ENDS) are used to aid smoking cessation attempts; however, many smokers continue to smoke while using an ENDS (dual use). Although uncertainty remains regarding whether specific patterns of ENDS use hinder or support successful smoking cessation, recent advances in "smart" technology allow passive and active recording of behaviors in real time, enabling more detailed insights into how smoking and vaping patterns may coevolve. We describe patterns of ENDS initiation and subsequent use, including any changes in cigarette consumption, among daily smokers using a "smart" ENDS (S-ENDS) to quit smoking. METHOD: An 8-week mixed-methods feasibility study used a Bluetooth-enabled S-ENDS that passively recorded real-time device use by participants (n = 11). Daily surveys administered via smartphones collected data on self-reported cigarette consumption. RESULTS: All 11 participants were dual users, at least initially, during their quit attempt. We observed three provisional vaping and smoking patterns: immediate and intensive ENDS initiation coupled with immediate, dramatic, and sustained smoking reduction, leading to smoking abstinence; gradual ENDS uptake with gradual smoking reductions, leading to daily dual use throughout the study period; and ENDS experimentation followed by a return to exclusive smoking. For six participants, the patterns observed in week 1 were similar to the vaping and smoking patterns observed throughout the rest of the study period. CONCLUSION: Technological advances now allow fine-grained description of ENDS use and smoking patterns. Larger and longer studies describing smoking-to-vaping patterns, and estimating associations with smoking outcomes, could inform ENDS-specific cessation advice promoting a full transition from smoking to exclusive ENDS use. IMPLICATIONS: The use of an S-ENDS that recorded real-time device use among daily smokers engaged in a quit attempt provides insight into patterns and trajectories of dual use (continuing to smoke while using ENDS) and the possible associations between ENDS initiation, subsequent use, and smoking cessation outcomes. Such work could support more targeted cessation counseling and technical advice for smokers using ENDS to quit smoking, reduce the risk of users developing long-term dual use patterns, and enhance the contribution ENDS may make to reducing smoking prevalence.
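The three provisional patterns above could, in principle, be operationalized as simple rules over the self-reported daily cigarette counts and the passively recorded puff counts. The sketch below is purely illustrative: the function name, thresholds, and labels are assumptions, not the criteria used in the study.

```python
def classify_dual_use_pattern(daily_cigarettes, daily_puffs):
    """Roughly label a participant's trajectory of smoking and vaping.

    daily_cigarettes -- list of self-reported cigarettes per day
    daily_puffs      -- list of passively recorded ENDS puffs per day
    Thresholds and labels are illustrative assumptions only.
    """
    week1_cigs = sum(daily_cigarettes[:7]) / 7.0
    final_cigs = sum(daily_cigarettes[-7:]) / 7.0
    final_puffs = sum(daily_puffs[-7:]) / 7.0

    if final_cigs == 0 and final_puffs > 0:
        return "sustained smoking abstinence with continued ENDS use"
    if final_puffs == 0:
        return "ENDS experimentation with return to exclusive smoking"
    if final_cigs < week1_cigs:
        return "gradual uptake with continued dual use"
    return "stable dual use"
```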
Subjects
Electronic Nicotine Delivery Systems, Smoking Cessation/methods, Tobacco Smoking/therapy, Vaping/therapy, Feasibility Studies, Humans, Self Report, Smokers/statistics & numerical data
ABSTRACT
This work introduces off-axis layered displays, the first approach to stereoscopic direct-view displays with support for focus cues. Off-axis layered displays combine a head-mounted display with a traditional direct-view display to encode a focal stack and thus provide focus cues. To explore this novel display architecture, we present a complete processing pipeline for the real-time computation and post-render warping of off-axis display patterns. We also build two prototypes, one combining a head-mounted display with a stereoscopic direct-view display and one with a more widely available monoscopic direct-view display, and we show how extending off-axis layered displays with an attenuation layer and with eye tracking can improve image quality. We thoroughly analyze each component in a technical evaluation and present examples captured through our prototypes.
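The abstract does not detail how content is distributed between the head-mounted and direct-view layers. As a point of reference only, a naive decomposition used by many multifocal displays blends each pixel between two focal planes with weights that are linear in diopters; the sketch below illustrates that generic idea and is not the paper's off-axis pattern computation.

```python
import numpy as np

def split_between_focal_planes(rgb, depth_m, plane_near_m=0.5, plane_far_m=2.0):
    """Naive depth-weighted decomposition of an RGBD frame onto two focal planes.

    rgb     -- (H, W, 3) image with values in [0, 1]
    depth_m -- (H, W) per-pixel scene depth in metres
    Blending weights are linear in diopters (1/distance), as commonly done
    for multifocal displays. Illustrative only.
    """
    d = 1.0 / np.maximum(depth_m, 1e-3)              # scene depth in diopters
    d_near, d_far = 1.0 / plane_near_m, 1.0 / plane_far_m
    w_near = np.clip((d - d_far) / (d_near - d_far), 0.0, 1.0)
    near_layer = rgb * w_near[..., None]             # e.g. shown on the HMD
    far_layer = rgb * (1.0 - w_near)[..., None]      # e.g. shown on the monitor
    return near_layer, far_layer
```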
ABSTRACT
A common goal of outdoor augmented reality (AR) is the presentation of annotations that are registered to anchor points in the real world. We present an enhanced approach for registering and tracking such anchor points that is suitable for current-generation mobile phones and can also successfully deal with the wide variety of viewing conditions encountered in real-life outdoor use. The approach is based on the on-the-fly generation of panoramic images by sweeping the camera over the scene. The panoramas are then used for stable orientation tracking while the user performs only rotational movements. This basic approach is improved by several new techniques for the re-detection and tracking of anchor points. For re-detection, specifically after temporal variations, we first compute a panoramic image with extended dynamic range, which can better represent varying illumination conditions. The panorama is then searched for known anchor points, while orientation tracking continues uninterrupted. We then use information from an internal orientation sensor to prime an active search scheme for the anchor points, which improves matching results. Finally, global consistency is enhanced by statistical estimation of a global rotation that minimizes the overall position error of anchor points when transforming them from the source panorama in which they were created to the current view represented by a new panorama. Once the anchor points are re-detected, we track the user's movement using a novel 3-degree-of-freedom orientation tracking approach that combines vision tracking with the absolute orientation from inertial and magnetic sensors. We tested our system using an AR campus guide as an example application and provide detailed results for our approach using an off-the-shelf smartphone. The results show that the re-detection rate is improved by a factor of 2 compared with previous work and reaches almost 90% for a wide variety of test cases, while still running at interactive frame rates.
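A common way to combine the smooth but drift-prone orientation from vision tracking with the drift-free but noisy absolute orientation from inertial and magnetic sensors is a complementary filter. The paper's exact fusion scheme is not given in the abstract, so the sketch below is only a generic illustration of that idea; all names are assumptions.

```python
import math

def wrap_angle(a):
    """Wrap an angle in radians to (-pi, pi]."""
    return (a + math.pi) % (2.0 * math.pi) - math.pi

def fuse_orientation(vision_angles, sensor_angles, alpha=0.98):
    """Complementary filter fusing vision-based and sensor-based orientation.

    vision_angles -- (yaw, pitch, roll) from panorama-based vision tracking
    sensor_angles -- (yaw, pitch, roll) from the inertial/magnetic sensors
    alpha         -- trust placed in the vision estimate (0..1)
    Generic illustration only; the paper's fusion scheme may differ.
    """
    fused = []
    for v, s in zip(vision_angles, sensor_angles):
        # Pull the smooth vision estimate slowly toward the absolute sensor
        # estimate, correcting drift while keeping low-jitter tracking.
        fused.append(wrap_angle(v + (1.0 - alpha) * wrap_angle(s - v)))
    return tuple(fused)
```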
ABSTRACT
We present computational phase-modulated eyeglasses, a see-through optical system that modulates the user's view using phase-only spatial light modulators (PSLM). A PSLM is a programmable reflective device that can selectively retard, or delay, incoming light rays; as a result, it works as a computational, dynamic lens. We demonstrate our computational phase-modulated eyeglasses with either a single PSLM or dual PSLMs and show that the concept can realize various optical operations including focus correction, bi-focus, image shift, and field-of-view manipulation, namely optical zoom. Compared to other programmable optics, computational phase-modulated eyeglasses have the advantage of versatility. We also present prototypical focus-loop applications in which the lens is dynamically optimized based on the distances of objects observed by a scene camera. We further discuss the implementation and applications, as well as the limitations of the current prototypes and the remaining issues that need to be addressed in future research.
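A phase-only SLM can emulate a thin lens by displaying the wrapped quadratic phase profile of that lens, and changing the focal length in software then yields dynamic focus correction or optical zoom. The sketch below computes such a wrapped Fresnel-lens phase pattern under idealized assumptions (paraxial optics, a single wavelength, no per-pixel calibration); it is not the authors' rendering or calibration code.

```python
import numpy as np

def fresnel_lens_phase(width_px, height_px, pixel_pitch_m,
                       focal_length_m, wavelength_m=532e-9):
    """Wrapped quadratic phase profile emulating a thin lens on a PSLM.

    Paraxial thin-lens phase: phi(x, y) = -pi * (x^2 + y^2) / (lambda * f),
    wrapped into [0, 2*pi) for a phase-only modulator. Idealized sketch.
    """
    y, x = np.indices((height_px, width_px)).astype(np.float64)
    x = (x - width_px / 2.0) * pixel_pitch_m
    y = (y - height_px / 2.0) * pixel_pitch_m
    phi = -np.pi * (x**2 + y**2) / (wavelength_m * focal_length_m)
    return np.mod(phi, 2.0 * np.pi)

# Example: emulate a +0.5 m focal-length lens on a 1080p PSLM
# with an assumed 8-micrometre pixel pitch.
pattern = fresnel_lens_phase(1920, 1080, 8e-6, 0.5)
```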
ABSTRACT
In recent years, the development of Augmented Reality (AR) frameworks has made AR application development widely accessible to developers without an AR expert background. With this development, new application fields for AR are on the rise, and with them an increased need for visualization techniques that are suitable for a wide range of application areas. It is therefore becoming more important for a wider audience to gain a better understanding of existing AR visualization techniques. In this article, we provide a taxonomy of existing work on visualization techniques in AR. The taxonomy aims to give researchers and developers without an in-depth background in Augmented Reality the information needed to successfully apply visualization techniques in Augmented Reality environments. We also describe the required components and methods and analyze common patterns.
ABSTRACT
BACKGROUND: Providing timely follow-up care for patients with inflammatory bowel disease in remission is important but often difficult because of resource limitations. Using smartphones to communicate symptoms and biomarkers is a potential alternative. We aimed to compare outpatient management using 2 smartphone apps (IBDsmart for symptoms and IBDoc for fecal calprotectin monitoring) vs standard face-to-face care. We hypothesized noninferiority of quality of life and symptoms at 12 months plus a reduction in face-to-face appointments in the smartphone app group. METHODS: Inflammatory bowel disease outpatients (previously seen more often than annually) were randomized to smartphone app care or standard face-to-face care over 12 months. Quality of life and symptoms were measured quarterly for 12 months. Acceptability was measured for gastroenterologists and patients at 12 months. RESULTS: One hundred people (73 Crohn's disease, 49 male, average age 35 years) consented and completed baseline questionnaires (50 in each group). Intention-to-treat and per-protocol analyses revealed noninferiority of quality of life and symptom scores at 12 months. Outpatient appointment numbers were reduced in the smartphone app group (P < 0.001). There was no difference between groups in the number of surgical outpatient appointments or the number of disease-related hospitalizations. Adherence to IBDsmart (50% perfect adherence) was slightly better than adherence to IBDoc (30% perfect adherence). Good acceptability was reported by most gastroenterologists and patients. CONCLUSIONS: Remote symptom and fecal calprotectin monitoring is effective and acceptable, and it reduces the need for face-to-face outpatient appointments. Patients with mild-to-moderate disease who are not newly diagnosed are ideal candidates for this system. CLINICAL TRIAL REGISTRATION NUMBER: ACTRN12615000342516.
Subjects
Aftercare/methods, Inflammatory Bowel Diseases/therapy, Mobile Applications, Symptom Assessment/methods, Telemedicine/methods, Adult, Ambulatory Care/statistics & numerical data, Feces/chemistry, Female, Gastroenterologists/statistics & numerical data, Humans, Intention to Treat Analysis, Leukocyte L1 Antigen Complex/analysis, Male, Patient Acceptance of Health Care/statistics & numerical data, Quality of Life, Remission Induction, Smartphone, Surveys and Questionnaires
ABSTRACT
AIM: Best-practice management of rheumatoid arthritis (RA) involves regular clinical assessment of RA disease activity. This is not achievable with current rheumatology systems of care. We aimed to use the opinions of people with RA and their specialist rheumatology healthcare professionals to inform the development of a mobile app with which people with RA can record their disease activity data, for potential integration into the clinical service, and to assess the usability of the app. METHOD: In phase 1 we interviewed nine people with RA and seven healthcare professionals. In phase 2 we developed an app with professional software developers. In phase 3 we evaluated the usability of the app for people with RA using the System Usability Scale (SUS). RESULTS: The interview data showed four themes regarding the functionality and implementation of a patient-held app in RA care: (a) variable app acceptance and readiness; (b) app use to reduce barriers; (c) pros and cons of patient-reported outcomes; and (d) allocation of clinics by need. The app we developed showed high usability both for people with RA using it on their own device for a month (SUS 79.5, n = 16) and for those using it on a study device for 10 minutes (SUS 83, n = 100). CONCLUSION: People with RA and healthcare professionals clearly identified features, benefits, and risks of an app for self-assessment of RA and its incorporation into clinical care. An app developed on the basis of these opinions has high usability. The next steps are the development and validation of a method for patient-performed joint counts, and implementation and evaluation in the clinical setting.
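For reference, the System Usability Scale scores reported above (79.5 and 83) follow the standard SUS scoring rule: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the summed contributions are multiplied by 2.5 to give a 0–100 score. A minimal sketch:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten item responses (1-5).

    Odd items (1, 3, 5, 7, 9) are positively worded: contribution = r - 1.
    Even items (2, 4, 6, 8, 10) are negatively worded: contribution = 5 - r.
    The summed contributions (0-40) are scaled to 0-100 by multiplying by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Example: a fairly positive response pattern.
print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))  # -> 85.0
```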
Subjects
Arthritis, Rheumatoid/diagnosis, Delivery of Health Care/methods, Health Personnel, Monitoring, Physiologic/methods, Software, Adult, Aged, Female, Humans, Male, Middle Aged, Surveys and Questionnaires
ABSTRACT
The mobility and ubiquity of mobile head-mounted displays make them a promising platform for telepresence research, as they allow for spontaneous and remote use cases not possible with stationary hardware. In this work, we present a system that provides immersive telepresence and remote collaboration on mobile and wearable devices by building a live spherical panoramic representation of a user's environment, which a remote user can view in real time while independently choosing the viewing direction. Through intuitive gesture-based interaction, the remote user can then interact with this environment as if they were actually there. Each user can obtain an independent view within this environment by rotating their device, and their current field of view is shared to allow for simple coordination of viewpoints. We present several different approaches to creating this shared live environment and discuss their implementation details, individual challenges, and performance on modern mobile hardware; in doing so, we provide key insights into the design and implementation of next-generation mobile telepresence systems, guiding future research in this domain. The results of a preliminary user study confirm the ability of our system to induce the desired sense of presence in its users.
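A central element of such a system is letting each user obtain an independent view by rotating their device: the device's yaw and pitch select which portion of the shared spherical panorama is displayed. The sketch below shows that mapping in its simplest equirectangular form; it is an illustration only, and the coordinate conventions are assumptions rather than the paper's implementation.

```python
def view_center_pixel(yaw_deg, pitch_deg, pano_width, pano_height):
    """Map a viewing direction to the center pixel of an equirectangular panorama.

    yaw_deg   -- heading in degrees; yaw 0 maps to the panorama's left edge
    pitch_deg -- elevation in degrees, positive looking up
    Returns (u, v) pixel coordinates of the view center. Illustrative sketch.
    """
    u = (yaw_deg % 360.0) / 360.0 * pano_width
    v = (0.5 - pitch_deg / 180.0) * pano_height
    return u, min(max(v, 0.0), pano_height - 1)

# Example: looking 90 degrees to the right and slightly upward
# in a shared 8192 x 4096 panorama.
print(view_center_pixel(90.0, 10.0, 8192, 4096))
```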
Subjects
Virtual Reality, Wearable Electronic Devices, Computer Graphics, Computer Systems, Gestures, Humans, Spatial Orientation, Social Behavior, User-Computer Interface, Videoconferencing
ABSTRACT
We present a display for optical see-through near-eye displays based on light attenuation, a new paradigm that forms images by spatially subtracting colors of light. Existing optical see-through head-mounted displays (OST-HMDs) form virtual images in an additive manner: they optically combine the light from an embedded light source, such as a microdisplay, with the user's field of view (FoV). Instead, our light attenuation display filters the color of the real background light pixel-wise in the user's see-through view, forming an image that acts as a spatial color filter. Our image formation is thus complementary to that of existing light-additive OST-HMDs. The core optical component in our system is a phase-only spatial light modulator (PSLM), a liquid crystal module that can control the phase of the light at each pixel. By combining the PSLM with polarization optics, our system realizes a spatially programmable color filter. In this paper, we introduce our optics design, evaluate the spatial color filter, consider applications including image rendering and FoV color control, and discuss the limitations of the current prototype.
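In this subtractive paradigm, the perceived color at a pixel is the background light multiplied by a programmable per-pixel transmittance, so the pattern to display is essentially the ratio of the desired color to the observed background, clamped to what a passive filter can achieve. The following is a conceptual sketch assuming a linear-RGB background estimate from a scene camera; it is not the paper's polarization-optics pipeline.

```python
import numpy as np

def attenuation_pattern(background_rgb, target_rgb, eps=1e-4):
    """Per-pixel transmittance for a subtractive (light-attenuating) display.

    background_rgb -- (H, W, 3) linear-RGB estimate of the see-through view
    target_rgb     -- (H, W, 3) linear-RGB appearance we want the user to see
    A filter can only remove light, so the transmittance is clamped to
    [0, 1]; colors brighter than the background cannot be produced.
    Conceptual sketch only.
    """
    t = target_rgb / np.maximum(background_rgb, eps)
    return np.clip(t, 0.0, 1.0)

def perceived(background_rgb, transmittance):
    """What the user sees: the background light filtered pixel-wise."""
    return background_rgb * transmittance
```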
Subjects
Virtual Reality, Birefringence, Color, Computer Graphics, Equipment Design, Eye Movements, Humans, Optical Devices, Optical Phenomena, User-Computer Interface
ABSTRACT
Augmented Reality is a technique that enables users to interact with their physical environment through the overlay of digital information. Although it has been researched for decades, Augmented Reality has only recently moved out of the research labs and into the field. While most applications are used sporadically and for one particular task only, current and future scenarios will provide a continuous and multi-purpose user experience. In this paper, we therefore present the concept of Pervasive Augmented Reality, which aims to provide such an experience by sensing the user's current context and adapting the AR system to the changing requirements and constraints. We present a taxonomy for Pervasive Augmented Reality and context-aware Augmented Reality that classifies the context sources and context targets relevant for implementing such a context-aware, continuous Augmented Reality experience. We further summarize existing approaches that contribute towards Pervasive Augmented Reality. Based on our taxonomy and survey, we identify challenges and future research directions in Pervasive Augmented Reality.
ABSTRACT
People with chronic conditions like rheumatoid arthritis (RA) self-manage on a day-to-day basis. They may be able to assess their disease activity and communicate it via an app to their healthcare team, enabling clinical review and medical management at the most appropriate times. This work describes the successful co-design of a patient-held app for monitoring and communicating RA disease activity.
Subjects
Arthritis, Rheumatoid/therapy, Patient Participation, Telemedicine, Chronic Disease, Humans, Patient Care Team
ABSTRACT
BACKGROUND: Rheumatoid arthritis (RA) is a chronic inflammatory arthritis requiring long-term treatment with regular monitoring by a rheumatologist to achieve good health outcomes. Since people with RA may wish to monitor their own disease activity with a smartphone app, it is important to understand the functions and quality of apps available for this purpose. OBJECTIVE: The aim of our study was to assess the features and quality of apps that assist people to monitor their RA disease activity by (1) summarizing the available apps, particularly the instruments used for measurement of RA disease activity; (2) comparing the app features with American College of Rheumatology and European League Against Rheumatism (ACR and EULAR) guidelines for monitoring of RA disease activity; and (3) rating app quality with the Mobile App Rating Scale (MARS). METHODS: Systematic searches of the New Zealand iTunes and Google Play app stores were used to identify all apps for monitoring of RA disease activity that could be used by people with RA. The apps were described by both key metadata and app functionality. App adherence with recommendations for monitoring of RA disease activity in clinical practice was evaluated by identifying whether apps included calculation of a validated composite disease activity measure and recorded results for future retrieval. App quality was assessed by 2 independent reviewers using the MARS. RESULTS: The search identified 721 apps in the Google Play store and 216 in the iTunes store, of which 19 unique apps met the criteria for inclusion (8 from both app stores, 8 iTunes only, and 3 Google Play only). In total, 14 apps included at least one validated instrument measuring RA disease activity; 7 of the 11 apps that allowed users to enter a joint count used the standard 28 tender and swollen joint count; 8 apps included at least one ACR- and EULAR-recommended RA composite disease activity (CDA) measure; and 10 apps included data storage and retrieval. Only 1 app, Arthritis Power, included both an RA CDA measure and data tracking, but it did not include the standard 28 tender and swollen joint count. The median overall MARS score for the apps was 3.41/5. Of the 6 apps that scored ≥4/5 on the overall MARS rating, only 1 included a CDA score endorsed by ACR and EULAR; however, this app did not have a data tracking function. CONCLUSIONS: This review found a lack of high-quality apps for longitudinal assessment of RA disease activity. Current apps fall into two categories: simple calculators primarily for rheumatologists, and data tracking tools for people with RA. The latter do not uniformly collect data using validated instruments or composite disease activity measures. There is a need for appropriate, high-quality apps for use by rheumatologists and patients together in the co-management of RA.
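One widely used ACR- and EULAR-endorsed composite disease activity measure built on the 28 tender and swollen joint counts is the DAS28-ESR; its published formula is sketched below for reference. The abstract does not state which composite measures the individual apps implement, so this is illustrative only.

```python
import math

def das28_esr(tender_28, swollen_28, esr_mm_h, patient_global_0_100):
    """DAS28-ESR composite disease activity score for rheumatoid arthritis.

    tender_28            -- tender joint count out of 28
    swollen_28           -- swollen joint count out of 28
    esr_mm_h             -- erythrocyte sedimentation rate (mm/h), must be > 0
    patient_global_0_100 -- patient global assessment on a 0-100 mm VAS
    """
    return (0.56 * math.sqrt(tender_28)
            + 0.28 * math.sqrt(swollen_28)
            + 0.70 * math.log(esr_mm_h)
            + 0.014 * patient_global_0_100)

# Example: 4 tender joints, 2 swollen joints, ESR 20 mm/h, VAS 30 mm.
score = das28_esr(4, 2, 20, 30)
print(round(score, 2))  # roughly 4.0; > 5.1 is high activity, < 2.6 remission
```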
ABSTRACT
Optical see-through head-mounted displays are currently seeing a transition out of research labs towards the consumer-oriented market. However, whilst availability has improved and prices have decreased, the technology has not matured much. Most commercially available optical see-through head-mounted displays follow a similar principle and use an optical combiner to blend the physical environment with digital information. This approach is problematic because the colors of the overlaid digital information cannot be reproduced correctly: the perceived pixel color is always a combination of the displayed pixel color and the color of the physical environment currently seen through the head-mounted display. In this paper, we present an initial approach to mitigating the effect of color blending in optical see-through head-mounted displays by introducing real-time radiometric compensation. Our approach is based on a novel prototype for an optical see-through head-mounted display that allows capturing the current environment as seen by the user's eye. We present three different algorithms that use this prototype to compensate for color blending in real time and with pixel accuracy. We demonstrate the benefits and performance of our approach as well as the results of a user study. We see applications in all common Augmented Reality scenarios, but also in other areas such as Diminished Reality or support for color-blind people.
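Because the perceived pixel is approximately the sum of the displayed color and the background light transmitted by the optical combiner, a first-order compensation subtracts the estimated background contribution from the desired color before display. The sketch below shows only that basic idea, assuming linear RGB and a known combiner transmittance; the three algorithms in the paper are more involved.

```python
import numpy as np

def compensate_color_blending(desired_rgb, background_rgb, transmittance=0.7):
    """First-order radiometric compensation for an OST-HMD pixel.

    Perceived ~ displayed + transmittance * background, so we display the
    desired color minus the estimated background contribution, clamped to
    the displayable range. Simplified sketch, not the paper's algorithms.

    desired_rgb    -- (H, W, 3) target appearance in linear RGB [0, 1]
    background_rgb -- (H, W, 3) background as seen through the combiner
    transmittance  -- assumed see-through transmittance of the combiner
    """
    displayed = desired_rgb - transmittance * background_rgb
    return np.clip(displayed, 0.0, 1.0)
```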
ABSTRACT
Micro aerial vehicles equipped with high-resolution cameras can be used to create aerial reconstructions of an area of interest. In this context, automatic flight-path planning and autonomous flying are often applied, but they cannot yet fully replace the human in the loop who supervises the flight on-site to ensure that there are no collisions with obstacles. Unfortunately, this workflow yields several issues, such as the need to mentally transfer the aerial vehicle's position between 2D map positions and the physical environment, and the difficulty of perceiving the depth of objects flying in the distance. Augmented Reality can address these issues by bringing the flight planning process on-site and visualizing the spatial relationship between the planned or current positions of the vehicle and the physical environment. In this paper, we present Augmented Reality-supported navigation and flight planning of micro aerial vehicles, augmenting the user's view with information relevant to flight planning and live feedback for flight supervision. Furthermore, we introduce additional depth hints that support the user in understanding the spatial relationship of virtual waypoints in the physical world, and we investigate the effect of these visualization techniques on spatial understanding.
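Rendering a virtual waypoint in the user's view and attaching a depth hint amounts to projecting its world position through the tracked camera and annotating it with its distance. The sketch below uses a standard pinhole projection; it illustrates the general idea and is not the visualization techniques evaluated in the paper.

```python
import numpy as np

def project_waypoint(waypoint_w, cam_R, cam_t, fx, fy, cx, cy):
    """Project a 3D waypoint into the camera image and report its distance.

    waypoint_w -- (3,) waypoint position in world coordinates (metres)
    cam_R      -- (3, 3) rotation mapping world to camera coordinates
    cam_t      -- (3,) translation mapping world to camera coordinates
    fx, fy, cx, cy -- pinhole intrinsics of the tracked device camera
    Returns ((u, v), distance_m), or (None, distance_m) if the waypoint
    lies behind the camera. Illustrative sketch only.
    """
    p_cam = cam_R @ np.asarray(waypoint_w, dtype=float) + np.asarray(cam_t, dtype=float)
    distance = float(np.linalg.norm(p_cam))
    if p_cam[2] <= 0:            # waypoint is behind the camera
        return None, distance
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    # A simple depth hint: scale the waypoint marker, or label it,
    # inversely with the returned distance.
    return (u, v), distance
```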