Jingbo Zhao, Ruize An, Ruolin Xu (2021)
Hand gesture is a new and promising interface for locomotion in virtual environments. While several previous studies have proposed different hand gestures for virtual locomotion, little is known about their differences in performance and user preference in virtual locomotion tasks. In the present paper, we present three hand gesture interfaces and their algorithms for locomotion: the Finger Distance gesture, the Finger Number gesture and the Finger Tapping gesture. These gestures were inspired by previous studies of gesture-based locomotion interfaces and are typical gestures that people are familiar with from daily life. Implementing these hand gesture interfaces in the present study enabled us to systematically compare their differences. In addition, to compare the usability of these gestures against locomotion interfaces using gamepads, we also designed and implemented a gamepad interface based on the Xbox One controller. We compared these four interfaces through two virtual locomotion tasks that assessed performance and user preference on speed control and waypoint navigation. Results showed that the user preference and performance of the Finger Distance gesture were comparable to those of the gamepad interface. The Finger Number gesture also showed performance and user preference close to those of the Finger Distance gesture. Our study demonstrates that the Finger Distance gesture and the Finger Number gesture are very promising interfaces for virtual locomotion. We also discuss why the Finger Tapping gesture needs further improvement before it can be used for virtual walking.
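The abstract does not specify how the Finger Distance gesture maps hand pose to movement, but the core idea (finger separation controls speed) can be illustrated with a minimal sketch. The function name, distance range and maximum speed below are illustrative assumptions, not values from the paper:

```python
def distance_to_speed(finger_distance, d_min=0.02, d_max=0.10, v_max=3.0):
    """Map thumb-index finger distance (meters) to forward speed (m/s).

    Distances at or below d_min yield zero speed; at or above d_max,
    the maximum speed. Values in between are interpolated linearly.
    All thresholds here are hypothetical, chosen only for illustration.
    """
    t = (finger_distance - d_min) / (d_max - d_min)
    t = max(0.0, min(1.0, t))  # clamp interpolation factor to [0, 1]
    return t * v_max
```

A continuous mapping of this kind gives the user proportional speed control, which is one plausible reason such a gesture could compare well against an analog gamepad trigger.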
Recent research has proposed teleoperation of robotic and aerial vehicles using head motion tracked by a head-mounted display (HMD). First-person views of the vehicles are usually captured by onboard cameras and presented to users through the display panels of HMDs. This provides users with a direct, immersive and intuitive interface for viewing and control. However, a typically overlooked factor in such designs is the latency introduced by the vehicle dynamics. As head motion is coupled with visual updates in such applications, visual and control latency always exists between the issuing of control commands by head movements and the visual feedback received at the completion of the attitude adjustment. This causes a discrepancy between the intended motion, the vestibular cue and the visual cue, and may potentially result in simulator sickness. No research has been conducted on how various levels of visual and control latency introduced by the dynamics of robots or aerial vehicles affect users' performance and the degree of simulator sickness elicited. Thus, it is uncertain how much performance is degraded by latency and whether such designs are comfortable from the perspective of users. To address these issues, we studied a prototype scenario of a head-motion-controlled quadcopter using an HMD. We present a virtual reality (VR) paradigm to systematically assess the effects of visual and control latency in simulated drone control scenarios.
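A VR paradigm that varies visual and control latency needs a way to inject a controlled delay between head-pose input and the rendered response. One common approach, sketched below as an assumption rather than the paper's actual implementation, is a FIFO buffer that releases each pose a fixed number of frames after it arrives:

```python
from collections import deque

class LatencyBuffer:
    """Delay pose samples by a fixed number of frames to simulate
    visual/control latency. For a 90 Hz display, delay_frames would be
    roughly latency_ms * 90 / 1000 (illustrative conversion)."""

    def __init__(self, delay_frames):
        self.buf = deque()
        self.delay = delay_frames

    def push(self, pose):
        """Enqueue the newest pose; return the delayed pose once enough
        history exists, otherwise None (caller holds the last output)."""
        self.buf.append(pose)
        if len(self.buf) > self.delay:
            return self.buf.popleft()
        return None
```

Sweeping `delay_frames` across conditions lets an experiment measure task performance and simulator sickness at each latency level while keeping all other factors fixed.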
Head gesture is a natural means of face-to-face communication between people, but the recognition of head gestures in the context of virtual reality, and the use of head gestures as an interface for interacting with virtual avatars and virtual environments, have rarely been investigated. In the current study, we present an approach for real-time head gesture recognition on head-mounted displays using Cascaded Hidden Markov Models. We conducted two experiments to evaluate our proposed approach. In experiment 1, we trained the Cascaded Hidden Markov Models and assessed the offline classification performance using collected head motion data. In experiment 2, we characterized the real-time performance of the approach by estimating the latency to recognize a head gesture with recorded real-time classification data. Our results show that the proposed approach is effective in recognizing head gestures. The method can be integrated into a virtual reality system as a head gesture interface for interacting with virtual worlds.
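The classification step underlying HMM-based gesture recognition can be sketched in a few lines: score an observation sequence under one trained HMM per gesture and pick the most likely one. This is a generic, toy illustration of the standard scaled forward algorithm, not the authors' cascaded implementation (a cascade would first run a gesture/non-gesture detector before this per-gesture scoring), and the model parameters below are made up:

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete-emission
    HMM with initial distribution pi, transition matrix A and emission
    matrix B (rows = states, columns = symbols)."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    log_p = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # predict, then weight by emission
        c = alpha.sum()
        log_p += np.log(c)             # accumulate scaling factors
        alpha = alpha / c
    return log_p

def classify(obs, models):
    """Return the name of the gesture model that best explains obs."""
    return max(models, key=lambda name: forward_log_likelihood(obs, *models[name]))
```

With one single-state model per gesture and two observation symbols, `classify` picks whichever model assigns the sequence higher likelihood; real systems would instead train multi-state HMMs on quantized head motion features.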