
Evaluating Performance and Gameplay of Virtual Reality Sickness Techniques in a First-Person Shooter Game

 Added by Diego Monteiro
 Publication date 2021
 Language: English





In virtual reality (VR) games, playability and immersion levels are important because they affect gameplay, enjoyment, and performance. However, they can be adversely affected by VR sickness (VRS) symptoms. VRS can be minimized by manipulating users' perception of the virtual environment via the head-mounted display (HMD). One extreme example is the Teleport mitigation technique, which lets users navigate discretely, skipping sections of the virtual space. Other techniques are less extreme but still rely on controlling what and how much users see via the HMD. This research examines the effect of these mitigation techniques on players' performance and gameplay in fast-paced VR games. Our focus is on two types of visual reduction techniques. This study aims to identify the specific trade-offs these techniques entail in a first-person shooter game regarding immersion, performance, and VRS. The main contributions of this paper are (1) a deeper understanding of one of the most popular techniques (Teleport) when it comes to gameplay; (2) the replication and validation of a novel VRS mitigation technique based on visual reduction; and (3) a comparison of their effects on players' performance and gameplay.
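A common form of the visual reduction the abstract refers to is dynamic field-of-view (FOV) restriction, where a vignette narrows the visible area as the player moves or turns quickly. The sketch below is a minimal illustration of that idea; all function names, thresholds, and FOV values are illustrative assumptions, not parameters taken from the paper.

```python
# Hypothetical sketch of dynamic FOV reduction, a visual-reduction
# VRS mitigation: the visible FOV shrinks as angular velocity rises.
# All thresholds and values below are illustrative assumptions.

def vignette_fov(angular_velocity_deg_s: float,
                 full_fov: float = 110.0,   # unrestricted HMD FOV (deg)
                 min_fov: float = 60.0,     # tightest vignette (deg)
                 onset: float = 30.0,       # speed where narrowing begins
                 max_speed: float = 180.0   # speed of maximum narrowing
                 ) -> float:
    """Return the visible FOV (degrees) for a given rotation speed."""
    if angular_velocity_deg_s <= onset:
        return full_fov                     # no restriction when nearly still
    # Linearly interpolate between the full and minimum FOV above onset.
    t = min((angular_velocity_deg_s - onset) / (max_speed - onset), 1.0)
    return full_fov - t * (full_fov - min_fov)
```

Teleport avoids this trade-off entirely by skipping continuous motion, which is why the paper contrasts the two approaches.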




Related research

In this article we describe Hack.VR, an object-oriented programming game in virtual reality. Hack.VR uses a VR programming language in which nodes represent functions and node connections represent data flow. Using this programming framework, players reprogram VR objects such as elevators, robots, and switches. Hack.VR has been designed to be highly interactive, both physically and semantically.
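The node/connection model described above can be sketched as a tiny dataflow graph: nodes wrap functions, and connections carry each node's output to the next. This is an illustrative assumption of how such a framework might be structured; the names and the door/switch example are not from Hack.VR itself.

```python
# Minimal dataflow-node sketch (illustrative, not the Hack.VR API):
# a Node wraps a function, and connections feed upstream outputs
# into it as arguments.

class Node:
    def __init__(self, fn):
        self.fn = fn
        self.inputs = []          # upstream nodes feeding this one

    def connect(self, upstream: "Node") -> "Node":
        self.inputs.append(upstream)
        return self               # allow chaining

    def evaluate(self):
        # Data flows along connections: evaluate upstream nodes first,
        # then apply this node's function to their outputs.
        return self.fn(*(n.evaluate() for n in self.inputs))

# Hypothetical example: a door node reprogrammed to open on a switch.
switch = Node(lambda: True)
door = Node(lambda pressed: "open" if pressed else "closed").connect(switch)
```

Reprogramming an object in this model amounts to rewiring its connections or swapping the function a node wraps.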
Luis Valente, 2016
This paper proposes the concept of live-action virtual reality games as a new genre of digital games based on an innovative combination of live-action, mixed-reality, context-awareness, and interaction paradigms that comprise tangible objects, context-aware input devices, and embedded/embodied interactions. Live-action virtual reality games are live-action games because a player physically acts out (using his/her real body and senses) his/her avatar (his/her virtual representation) in the game stage, which is the mixed-reality environment where the game happens. The game stage is a kind of augmented virtuality: a mixed reality in which the virtual world is augmented with real-world information. In live-action virtual reality games, players wear HMD devices and see a virtual world that is constructed using the physical world's architecture as the basic geometry and context information. Physical objects that reside in the physical world are also mapped to virtual elements. Live-action virtual reality games keep the virtual and real worlds superimposed, requiring players to physically move in the environment and to use different interaction paradigms (such as tangible and embodied interaction) to complete game activities. This setup enables players to touch physical architectural elements (such as walls) and other objects, feeling the game stage. Players have free movement and may interact with physical objects placed in the game stage, implicitly and explicitly. Live-action virtual reality games differ from similar game concepts because they sense and use contextual information to create unpredictable game experiences, giving rise to emergent gameplay.
We present PhyShare, a new haptic user interface based on actuated robots. Virtual reality has recently been gaining wide adoption, and effective haptic feedback in these scenarios can strongly support users' senses in bridging the virtual and physical worlds. Since participants do not directly observe these robotic proxies, we investigate the multiple mappings between physical robots and virtual proxies that can best utilize the available resources to provide a well-rounded VR experience. PhyShare bots can act either as directly touchable objects or as invisible carriers of physical objects, depending on the scenario. They also support distributed collaboration, allowing remotely located VR collaborators to share the same physical feedback.
Despite the technological advancements in Virtual Reality (VR), users are constantly combating feelings of nausea and disorientation, so-called cybersickness. Triggered by a sensory conflict between the visual and vestibular systems, cybersickness symptoms cause discomfort and hinder the immersive VR experience. Here we investigated cybersickness in 360-degree VR. In 360-degree VR experiences, movement in the real world is not reflected in the virtual world, and therefore self-motion information is not corroborated by matching visual and vestibular cues, which may induce cybersickness. We evaluated whether an Artificial Intelligence (AI) software designed to supplement the VR experience with artificial 6-degree-of-freedom motion may reduce sensory conflict, and therefore cybersickness. Explicit (questionnaires) and implicit (physiological responses) measurements were used to assess cybersickness symptoms during and after VR exposure. Our results confirmed a reduction in feelings of nausea during the AI-supplemented 6-degree-of-freedom motion VR. By improving the congruency between visual and vestibular cues, users can experience more engaging, immersive, and safe virtual reality, which is critical for the application of VR in educational, medical, cultural, and entertainment settings.
With the popularity of online access in virtual reality (VR) devices, it will become important to investigate exclusive and interactive CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) designs for VR devices. In this paper, we first present four traditional two-dimensional (2D) CAPTCHAs (i.e., text-based, image-rotated, image-puzzled, and image-selected CAPTCHAs) in VR. Then, based on the three-dimensional (3D) interaction characteristics of VR devices, we propose two vrCAPTCHA design prototypes (i.e., task-driven and bodily motion-based CAPTCHAs). We conducted a user study with six participants for exploring the feasibility of our two vrCAPTCHAs and traditional CAPTCHAs in VR. We believe that our two vrCAPTCHAs can be an inspiration for the further design of CAPTCHAs in VR.
