
Some Lessons Learned Running Virtual Reality Experiments Out of the Laboratory

Posted by: Anthony Steed
Publication date: 2021
Research field: Informatics Engineering
Language of the paper: English





In the past twelve months, our team has had to move rapidly from conducting most of our user experiments in a laboratory setting, to running experiments in the wild away from the laboratory and without direct synchronous oversight from an experimenter. This has challenged us to think about what types of experiment we can run, and to improve our tools and methods to allow us to reliably capture the necessary data. It has also offered us an opportunity to engage with a more diverse population than we would normally engage with in the laboratory. In this position paper we elaborate on the challenges and opportunities, and give some lessons learned from our own experience.
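The abstract mentions improving tools and methods to reliably capture data when no experimenter is present. As a purely illustrative sketch (not the authors' actual tooling; all names are invented), a hypothetical `TrialLogger` below buffers timestamped events and flushes them to local JSON chunks, so that an unsupervised session still yields usable data even if the app or network fails mid-trial:

```python
import json
import time
import uuid
from pathlib import Path

class TrialLogger:
    """Buffer timestamped events and flush them to local JSON chunks,
    so an unsupervised session still yields data if the app dies."""

    def __init__(self, out_dir="logs", flush_every=100):
        self.session_id = uuid.uuid4().hex  # anonymous per-session ID
        self.out_dir = Path(out_dir)
        self.out_dir.mkdir(parents=True, exist_ok=True)
        self.flush_every = flush_every
        self.buffer = []
        self.chunk = 0

    def log(self, event, **payload):
        self.buffer.append({"t": time.time(), "event": event, **payload})
        if len(self.buffer) >= self.flush_every:
            self.flush()

    def flush(self):
        # Write whatever is buffered to its own chunk file, then reset.
        if not self.buffer:
            return
        path = self.out_dir / f"{self.session_id}_{self.chunk:04d}.json"
        path.write_text(json.dumps(self.buffer))
        self.buffer = []
        self.chunk += 1

# Example: log a head-pose sample each frame, flush at session end.
logger = TrialLogger()
logger.log("head_pose", pos=[0.0, 1.6, 0.0], rot=[0.0, 0.0, 0.0, 1.0])
logger.flush()
```

Writing small append-only chunks rather than one growing file is a common defensive choice for in-the-wild deployments: a crash loses at most one unflushed buffer, and completed chunks can be uploaded whenever connectivity happens to be available.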


Read also

We report on the initial phase of an ongoing, multi-stage investigation of how to incorporate Virtual Reality (VR) technology in teaching introductory astronomy concepts. Our goal was to compare the efficacy of VR vs. conventional teaching methods using one specific topic, Moon phases and eclipses. After teaching this topic to an ASTRO 101 lecture class, students were placed into three groups to experience one of three additional activities: supplemental lecture, hands-on activity, or VR experience. All students were tested before and after their learning activity. Although preliminary, our results can serve as a useful guide to expanding the role of VR in the astronomy classroom.
Luis Valente, 2016
This paper proposes the concept of live-action virtual reality games as a new genre of digital games based on an innovative combination of live-action, mixed-reality, context-awareness, and interaction paradigms that comprise tangible objects, context-aware input devices, and embedded/embodied interactions. Live-action virtual reality games are live-action games because a player physically acts out (using his/her real body and senses) his/her avatar (his/her virtual representation) in the game stage, which is the mixed-reality environment where the game happens. The game stage is a kind of augmented virtuality: a mixed reality where the virtual world is augmented with real-world information. In live-action virtual reality games, players wear HMDs and see a virtual world that is constructed using the physical world's architecture as its basic geometry and context information. Physical objects that reside in the physical world are also mapped to virtual elements. Live-action virtual reality games keep the virtual and real worlds superimposed, requiring players to physically move through the environment and to use different interaction paradigms (such as tangible and embodied interaction) to complete game activities. This setup enables players to touch physical architectural elements (such as walls) and other objects, feeling the game stage. Players have free movement and may interact with physical objects placed in the game stage, both implicitly and explicitly. Live-action virtual reality games differ from similar game concepts because they sense and use contextual information to create unpredictable game experiences, giving rise to emergent gameplay.
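To make the physical-to-virtual mapping concrete, here is a minimal hypothetical sketch (not from the paper; `PhysicalAnchor`, `VIRTUAL_SKINS`, and `build_game_stage` are invented names): tracked physical features such as walls and props are "skinned" with virtual meshes, so the game stage superimposes the virtual world on touchable real geometry.

```python
from dataclasses import dataclass

@dataclass
class PhysicalAnchor:
    """A tracked real-world feature (wall, table, prop) in stage coordinates."""
    name: str
    position: tuple  # (x, y, z) in metres
    extent: tuple    # bounding-box size in metres

# Hypothetical mapping: each class of physical anchor gets a virtual "skin",
# so players touch real geometry while seeing the game's version of it.
VIRTUAL_SKINS = {
    "wall": "dungeon_wall_mesh",
    "table": "altar_mesh",
    "prop_box": "treasure_chest_mesh",
}

def build_game_stage(anchors):
    """Return virtual elements superimposed on their physical counterparts."""
    stage = []
    for a in anchors:
        mesh = VIRTUAL_SKINS.get(a.name, "generic_mesh")
        stage.append({"mesh": mesh, "position": a.position, "extent": a.extent})
    return stage

anchors = [PhysicalAnchor("wall", (0, 0, 2), (4, 3, 0.1)),
           PhysicalAnchor("prop_box", (1, 0, 1), (0.4, 0.4, 0.4))]
print(build_game_stage(anchors))
```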
Bionic vision is a rapidly advancing field aimed at developing visual neuroprostheses (bionic eyes) to restore useful vision to people who are blind. However, a major outstanding challenge is predicting what people see when they use their devices. The limited field of view of current devices necessitates head movements to scan the scene, which is difficult to simulate on a computer screen. In addition, many computational models of bionic vision lack biological realism. To address these challenges, we propose to embed biologically realistic models of simulated prosthetic vision (SPV) in immersive virtual reality (VR) so that sighted subjects can act as virtual patients in real-world tasks.
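As an illustration of the core idea (sighted users see a degraded, narrow-field rendering of the scene), the crude sketch below crops a frame to a small central field of view and average-pools it onto a coarse "phosphene" grid. This is an assumption-laden toy, not the biologically realistic models the authors propose; libraries such as pulse2percept implement SPV in far more detail.

```python
import numpy as np

def simulate_prosthetic_view(frame, fov_frac=0.3, grid=(16, 16)):
    """Toy simulated prosthetic vision: crop a grayscale frame to a narrow
    central field of view, then average-pool onto a coarse phosphene grid."""
    h, w = frame.shape
    ch, cw = int(h * fov_frac), int(w * fov_frac)
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = frame[top:top + ch, left:left + cw]
    gy, gx = grid
    by, bx = ch // gy, cw // gx
    # Trim so the crop divides evenly into the phosphene grid.
    crop = crop[: by * gy, : bx * gx]
    return crop.reshape(gy, by, gx, bx).mean(axis=(1, 3))

# Example: a synthetic 480x640 frame stands in for a headset camera feed.
frame = np.random.rand(480, 640)
phosphenes = simulate_prosthetic_view(frame)
print(phosphenes.shape)  # (16, 16) coarse percept
```

The small `fov_frac` is what forces the head movements the abstract describes: with most of the scene outside the simulated field of view, the user must scan to build up an understanding of the environment.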
We present PhyShare, a new haptic user interface based on actuated robots. Virtual reality has recently been gaining wide adoption, and effective haptic feedback in these scenarios can strongly support users' senses in bridging the virtual and physical worlds. Since participants do not directly observe these robotic proxies, we investigate the multiple mappings between physical robots and virtual proxies that can utilize the resources needed to provide a well-rounded VR experience. PhyShare bots can act either as directly touchable objects or as invisible carriers of physical objects, depending on the scenario. They also support distributed collaboration, allowing remotely located VR collaborators to share the same physical feedback.
We propose a new approach for interaction in Virtual Reality (VR) using mobile robots as proxies for haptic feedback. This approach allows VR users to have the experience of sharing and manipulating tangible physical objects with remote collaborators. Because participants do not directly observe the robotic proxies, the mapping between them and the virtual objects is not required to be direct. In this paper, we describe our implementation, various scenarios for interaction, and a preliminary user study.
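Because users never see the robot proxies, the physical-to-virtual mapping in both of these papers can be many-to-one: a single robot can serve several virtual objects by repositioning itself under whichever one the user's hand approaches. The sketch below (hypothetical names, Python 3.8+ for `math.dist`) illustrates that retargeting logic under those assumptions:

```python
import math

def nearest_target(hand_pos, virtual_objects):
    """Pick the virtual object the user's hand is approaching; a single
    physical robot repositions itself under it to provide the haptics."""
    return min(virtual_objects,
               key=lambda o: math.dist(hand_pos, o["position"]))

# One robot proxy serves several virtual objects, since it stays invisible.
virtual_objects = [
    {"name": "virtual_cup", "position": (0.2, 0.0, 0.5)},
    {"name": "virtual_book", "position": (-0.3, 0.0, 0.4)},
]
hand = (0.15, 0.1, 0.45)
target = nearest_target(hand, virtual_objects)
print("robot moves under:", target["name"])  # -> virtual_cup
```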