Effective data visualization is a key part of the discovery process in the era of big data. It is the bridge between the quantitative content of the data and human intuition, and thus an essential component of the scientific path from data to knowledge and understanding. Visualization is also essential in the data mining process, directing the choice of applicable algorithms and helping to identify and remove bad data from the analysis. However, the high complexity or high dimensionality of modern data sets represents a critical obstacle. How do we visualize interesting structures and patterns that may exist in hyper-dimensional data spaces? A better understanding of how we can perceive and interact with multi-dimensional information poses some deep questions in the fields of cognition technology and human-computer interaction. To this effect, we are exploring the use of immersive virtual reality platforms for scientific data visualization, both as software and as inexpensive commodity hardware. These potentially powerful and innovative tools for multi-dimensional data visualization can also provide an easy and natural path to collaborative data visualization and exploration, where scientists can interact with their data and their colleagues in the same visual space. Immersion provides benefits beyond traditional desktop visualization tools: it leads to a demonstrably better perception of datascape geometry, more intuitive data understanding, and better retention of the perceived relationships in the data.
Bionic vision is a rapidly advancing field aimed at developing visual neuroprostheses (bionic eyes) to restore useful vision to people who are blind. However, a major outstanding challenge is predicting what people see when they use their devices. The limited field of view of current devices necessitates head movements to scan the scene, which is difficult to simulate on a computer screen. In addition, many computational models of bionic vision lack biological realism. To address these challenges, we propose to embed biologically realistic models of simulated prosthetic vision (SPV) in immersive virtual reality (VR) so that sighted subjects can act as virtual patients in real-world tasks.
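A minimal sketch of the kind of phosphene model that simulated prosthetic vision builds on may help fix ideas. The grid size, Gaussian spread, and function name below are illustrative assumptions, not the biologically realistic model the abstract proposes:

    import numpy as np

    def phosphene_frame(image, grid=(10, 10), sigma_px=6.0):
        # Sample a grayscale image at the sites of a regular
        # "electrode" grid and draw one Gaussian phosphene per
        # electrode, with brightness set by local image intensity.
        h, w = image.shape
        out = np.zeros((h, w), dtype=float)
        yy, xx = np.mgrid[0:h, 0:w]
        for cy in np.linspace(0, h - 1, grid[0]):
            for cx in np.linspace(0, w - 1, grid[1]):
                amp = float(image[int(cy), int(cx)])
                out += amp * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2)
                                    / (2 * sigma_px ** 2))
        return out / max(out.max(), 1e-9)  # normalize to [0, 1]

Biologically realistic models replace this uniform grid with retinotopic maps and current spread along axon pathways; embedding such models in VR lets the subject's head movements determine which part of the scene is sampled, which a desktop screen cannot capture.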
Spherical coordinate systems, which are ubiquitous in astronomy, cannot be shown without distortion on flat, two-dimensional surfaces. This poses challenges for the two complementary phases of visualisation: visual exploration -- making discoveries in data by looking for relationships, patterns or anomalies -- and publication -- where the results of an exploration are made available for scientific scrutiny or communication. This is a long-standing problem, and many practical solutions have been developed. Our allskyVR approach provides a workflow for experimentation with commodity virtual reality head-mounted displays. Using the free, open-source S2PLOT programming library and the A-Frame WebVR browser-based framework, we provide a straightforward way to visualise all-sky catalogues on a user-centred, virtual celestial sphere. The allskyVR distribution contains both a quickstart option, complete with a gaze-based menu system, and a fully customisable mode for those who need more control over the immersive experience. The software is available for download from: https://github.com/cfluke/allskyVR
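The core geometric step in placing an all-sky catalogue on a user-centred celestial sphere is mapping each source's spherical coordinates to a point on a sphere around the viewer. The sketch below shows the standard transform; the axis conventions and sphere radius actually used by allskyVR and S2PLOT may differ:

    import numpy as np

    def radec_to_sphere(ra_deg, dec_deg, radius=10.0):
        # Convert right ascension/declination (degrees) to Cartesian
        # points on a viewer-centred sphere, e.g. for positioning
        # entities in an A-Frame scene.
        ra = np.radians(np.asarray(ra_deg))
        dec = np.radians(np.asarray(dec_deg))
        x = radius * np.cos(dec) * np.cos(ra)
        y = radius * np.cos(dec) * np.sin(ra)
        z = radius * np.sin(dec)
        return np.column_stack([x, y, z])

Because every source sits at the same radius from the viewer, angular relationships on the sky are preserved, and the projection distortion inherent to flat, two-dimensional all-sky maps does not arise.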
We propose a new approach for interaction in Virtual Reality (VR) using mobile robots as proxies for haptic feedback. This approach allows VR users to have the experience of sharing and manipulating tangible physical objects with remote collaborators. Because participants do not directly observe the robotic proxies, the mapping between them and the virtual objects is not required to be direct. In this paper, we describe our implementation, various scenarios for interaction, and a preliminary user study.
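To illustrate why the mapping need not be direct: since users never see the physical robots, any free robot can be dispatched to stand in for whichever virtual object the user is about to touch. The planar coordinates and function name in this sketch are assumptions for illustration, not the paper's implementation:

    import numpy as np

    def assign_proxy(virtual_target_xy, robot_positions_xy):
        # Choose the free robot closest to the physical point that
        # corresponds to the virtual object's position; that robot
        # then drives there to provide the haptic surface.
        d = np.linalg.norm(robot_positions_xy - virtual_target_xy, axis=1)
        return int(np.argmin(d))

One consequence of this indirection is that a small fleet of robots can back an arbitrarily large set of virtual objects, since proxies are recruited on demand rather than bound one-to-one.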
We report on an exploratory project aimed at performing immersive 3D visualization of astronomical data, starting with spectral-line radio data cubes from galaxies. This work is a collaboration between the Department of Physics and Astronomy and the Department of Computer Science at the University of Manitoba. We are building our prototype using the 3D engine Unity, because of its ease of integration with advanced displays such as a CAVE environment, a zSpace tabletop, or virtual reality headsets. We address general issues regarding 3D visualization, such as loading and converting astronomy data, performing volume rendering on the GPU, and producing physically meaningful visualizations using principles of visual literacy. We discuss some challenges to be met when designing a user interface that allows us to take advantage of this new way of exploring data. We hope to lay the foundations for an innovative framework useful for all astronomers who use spectral-line data cubes, and we encourage interested parties to join our efforts. This pilot project addresses the challenges presented by frontier astronomy experiments such as the Square Kilometre Array and its precursors.
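As a concrete example of the loading-and-converting step, a spectral-line FITS cube can be read and rescaled to the [0, 1] range expected of a 3D texture before GPU volume rendering. This is a minimal sketch under assumed conventions; the clipping percentiles are illustrative choices, and the project's actual Unity pipeline is not shown:

    import numpy as np
    from astropy.io import fits

    def load_cube_for_volume_rendering(path):
        # Read the cube, drop degenerate axes, replace blanked
        # voxels with zeros, and rescale robustly to [0, 1] for
        # upload as a 3D texture.
        cube = np.squeeze(fits.getdata(path)).astype(np.float32)
        cube = np.nan_to_num(cube)
        lo, hi = np.percentile(cube, [1.0, 99.5])
        return np.clip((cube - lo) / (hi - lo), 0.0, 1.0)

    # e.g. voxels = load_cube_for_volume_rendering("galaxy_hi_cube.fits")
    # (hypothetical file name)

Robust percentile clipping, rather than a simple min-max rescale, keeps a few bright voxels or noise spikes from washing out the faint emission that dominates spectral-line cubes.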
We introduce Blocks, a mobile application that enables people to co-create AR structures that persist in the physical environment. Using Blocks, end users can collaborate synchronously or asynchronously, whether they are colocated or remote. Additionally, the AR structures can be tied to a physical location or can be accessed from anywhere. We evaluated how people used Blocks through a series of lab and field deployment studies with over 160 participants, and explored the interplay between two collaborative dimensions: space and time. We found that participants preferred creating structures synchronously with colocated collaborators. Additionally, they were most active when they created structures that were not restricted by time or place. Unlike most of today's AR experiences, which focus on content consumption, this work outlines new design opportunities for persistent and collaborative AR experiences that empower anyone to collaborate and create AR content.