
Noncontact Thermal and Vibrotactile Display Using Focused Airborne Ultrasound

Published by: Dr. Takaaki Kamigaki
Publication date: 2020
Research field: Informatics Engineering
Paper language: English





In a typical mid-air haptics system, focused airborne ultrasound provides vibrotactile sensations to localized areas on bare skin. Herein, a method is proposed for displaying thermal sensations to hands wearing mesh fabric gloves. The gloves employed in this study are commercially available mesh fabric gloves with sound-absorbing characteristics, such as cotton work gloves, without any additional devices such as Peltier elements. The proposed method can also provide vibrotactile sensations by changing the ultrasonic irradiation pattern. In this paper, we report basic experimental investigations of the proposed method. Through thermal measurements, we evaluated the local heat generation on the surfaces of both the glove and the skin under focused airborne ultrasound irradiation. In addition, we performed perceptual experiments, confirming that the proposed method produced both thermal and vibrotactile sensations. Furthermore, these sensations could be selectively provided, to a certain extent, by changing the ultrasonic irradiation pattern. These results validate the effectiveness of our method and its feasibility for mid-air haptics applications.
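As background for how focused airborne ultrasound is formed: a phased array delays each transducer so that all wavefronts arrive at the focal point simultaneously. The sketch below is an illustrative reconstruction of this standard focusing principle, not the authors' code; the array geometry, carrier frequency, and speed of sound are assumptions.

```python
import math

SPEED_OF_SOUND = 346.0  # m/s in air at roughly 25 °C (assumed)
FREQ = 40_000.0         # Hz, a typical airborne ultrasound carrier

def focus_delays(elements, focus):
    """Per-element firing delays (s) so all wavefronts reach `focus` together.

    elements: list of (x, y, z) transducer positions in metres.
    focus:    (x, y, z) focal point in metres.
    """
    dists = [math.dist(e, focus) for e in elements]
    farthest = max(dists)
    # Elements closer to the focus fire later, so all arrivals coincide.
    return [(farthest - d) / SPEED_OF_SOUND for d in dists]

# Example: a 4-element line array (1 cm pitch) focusing 0.2 m above its centre.
elements = [(x * 0.01, 0.0, 0.0) for x in (-1.5, -0.5, 0.5, 1.5)]
delays = focus_delays(elements, (0.0, 0.0, 0.2))
```

By symmetry, the two outer elements (farthest from the focus) fire first with zero delay, and the inner pair fires microseconds later.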




Read also

This study seeks to understand the conditions under which virtual gratings produced via vibrotaction and friction modulation are perceived as similar, and to find physical origins for the results. To accomplish this, we developed two single-axis devices, one based on electroadhesion and one based on out-of-plane vibration. The two devices had identical touch surfaces, and the vibrotactile device used a novel closed-loop controller to achieve precise control of out-of-plane plate displacement under varying load conditions across a wide range of frequencies. A first study measured the perceptual intensity equivalence curve of gratings generated under electroadhesion and vibrotaction across the 20-400 Hz frequency range. A second study assessed the perceptual similarity between the two forms of skin excitation given the same driving frequency and the same perceived intensity. Our results indicate that it is largely the out-of-plane velocity that predicts vibrotactile intensity relative to the shear forces generated by friction modulation. A high degree of perceptual similarity between gratings generated through friction modulation and through vibrotaction is apparent and tends to scale with actuation frequency, suggesting perceptual indifference to the manner of fingerpad actuation in the upper frequency range.
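The velocity finding above has a simple kinematic consequence: for a sinusoidal plate displacement of amplitude A at frequency f, the peak out-of-plane velocity is 2πfA, so matching perceived intensity across frequency implies amplitude scaling as 1/f. A minimal sketch of this relationship (the specific amplitudes are illustrative, not values from the study):

```python
import math

def peak_velocity(amplitude_m, freq_hz):
    """Peak out-of-plane velocity (m/s) of a sinusoidal displacement."""
    return 2.0 * math.pi * freq_hz * amplitude_m

def amplitude_for_velocity(v_peak, freq_hz):
    """Displacement amplitude (m) needed to reach v_peak at freq_hz."""
    return v_peak / (2.0 * math.pi * freq_hz)

# If peak velocity sets intensity, a 10 um stroke at 100 Hz is matched
# by only a 2.5 um stroke at 400 Hz.
v = peak_velocity(10e-6, 100.0)
a_400 = amplitude_for_velocity(v, 400.0)
```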
Objective: Evaluate the feasibility and potential impacts on hand function of a wearable stimulation device (the VTS Glove), which provides mechanical, vibratory input to the affected limb of chronic stroke survivors. Methods: A double-blind, randomized, controlled feasibility study including sixteen chronic stroke survivors (mean age: 54; 1-13 years post-stroke) with diminished movement and tactile perception in their affected hand. Participants were given a wearable device to take home and asked to wear it for three hours daily over eight weeks. The device intervention was either (1) the VTS Glove, which provided vibrotactile stimulation to the hand, or (2) an identical glove with vibration disabled. Participants were randomly assigned to the two conditions in equal numbers. Hand and arm function were measured weekly at home and in local physical therapy clinics. Results: Participants using the VTS Glove showed significantly improved Semmes-Weinstein monofilament exam results, a reduction in Modified Ashworth measures in the fingers, and some increased voluntary finger flexion and elbow and shoulder range of motion. Conclusions: Vibrotactile stimulation applied to the disabled limb may impact tactile perception, tone and spasticity, and voluntary range of motion. Wearable devices allow extended application and study of stimulation methods outside of a clinical setting.
Recent research has proposed teleoperation of robotic and aerial vehicles using head motion tracked by a head-mounted display (HMD). First-person views of the vehicles are usually captured by onboard cameras and presented to users through the display panels of HMDs. This provides users with a direct, immersive and intuitive interface for viewing and control. However, a typically overlooked factor in such designs is the latency introduced by the vehicle dynamics. As head motion is coupled with visual updates in such applications, visual and control latency always exists between the issue of control commands by head movements and the visual feedback received at the completion of the attitude adjustment. This causes a discrepancy between the intended motion, the vestibular cue and the visual cue, and may potentially result in simulator sickness. No research has been conducted on how various levels of visual and control latency introduced by the dynamics of robots or aerial vehicles affect users' performance and the degree of simulator sickness elicited. Thus, it is uncertain how much performance is degraded by latency and whether such designs are comfortable from the perspective of users. To address these issues, we studied a prototyped scenario of a head-motion-controlled quadcopter using an HMD. We present a virtual reality (VR) paradigm to systematically assess the effects of visual and control latency in simulated drone control scenarios.
Gregoire Cattan, 2020
A brain-computer interface (BCI) based on electroencephalography (EEG) is a promising technology for enhancing virtual reality (VR) applications, in particular for gaming. We focus on the so-called P300-BCI, a stable and accurate BCI paradigm relying on the recognition of a positive event-related potential (ERP) occurring in the EEG about 300 ms post-stimulation. We implemented a basic version of such a BCI displayed on an ordinary and affordable smartphone-based head-mounted VR device: that is, a mobile and passive VR system (with no electronic components beyond the smartphone). The mobile phone performed the stimulus presentation, EEG synchronization (tagging) and feedback display. We compared the ERPs and the accuracy of the BCI on the VR device with a traditional BCI running on a personal computer (PC). We also evaluated the impact of subjective factors on the accuracy. The study was within-subjects, with 21 participants and one session in each modality. No significant difference in BCI accuracy was found between the PC and VR systems, although the P200 ERP was significantly wider and larger in the VR system as compared to the PC system.
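The P300 paradigm described above rests on averaging stimulus-locked EEG epochs and looking for an amplitude peak around 300 ms. The sketch below illustrates that core idea only; it is not the authors' pipeline, and the sampling rate, window bounds, and toy data are assumptions.

```python
import statistics

def average_epochs(epochs):
    """Average time-locked EEG epochs sample-by-sample (single channel)."""
    return [statistics.fmean(samples) for samples in zip(*epochs)]

def p300_amplitude(erp, fs_hz, window=(0.25, 0.40)):
    """Mean ERP amplitude in a window bracketing 300 ms post-stimulus."""
    lo, hi = (int(t * fs_hz) for t in window)
    return statistics.fmean(erp[lo:hi])

# Toy epochs sampled at 100 Hz (0.5 s each): target stimuli carry a
# positive deflection near 300 ms, non-targets stay flat.
fs = 100.0
target = [[1.0 if 25 <= i < 40 else 0.0 for i in range(50)]] * 5
nontarget = [[0.0] * 50] * 5
score_t = p300_amplitude(average_epochs(target), fs)
score_n = p300_amplitude(average_epochs(nontarget), fs)
```

Classifying a stimulus as attended or ignored then reduces to comparing such window scores (real systems use calibrated classifiers rather than a raw threshold).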
Virtual Reality (VR) headsets can open opportunities for users to accomplish complex tasks on large virtual displays, using compact setups. However, interacting with large virtual displays using existing interaction techniques might cause fatigue, especially for precise manipulations, due to the lack of physical surfaces. We designed VXSlate, an interaction technique that uses a large virtual display as an expansion of a tablet. VXSlate combines a user's head movement, as tracked by the VR headset, with touch interaction on the tablet. The user's head movement positions both a virtual representation of the tablet and of the user's hand on the large virtual display. The user's multi-touch interactions perform finely-tuned content manipulations.