Virtual Transportation for Immersive Systems
- By Paulo Menezes
- Category: Augmented & Virtual Reality Module
I. INTRODUCTION
Computer games have long explored ways of inducing the notion of traveling or moving through game scenarios, because it makes them much more attractive than static ones. Even in a relatively small scenario, it is more engaging for the player to become an active observer than to watch, possibly several, animated characters from a fixed pose. This is the main reason why the Virtual Reality (VR) concept has received so much attention: being based on the creation of first-person views, it lets the user actively explore the virtual environment.
In games viewed on a computer screen, viewpoint motion has been explored to create first-person viewing experiences, but as these tend to induce motion sickness there are always alternative views, like bird's-eye or follower views. First-person views are most common in piloting or driving situations, where a cockpit or car structure is used to create views from inside the plane or car, through a windshield. In the driving case, the observer may execute small moves with respect to the car, but it is the car that moves with respect to the world, and the latter movements become far more important than the former. On the other hand, when using a head-mounted display (HMD), the observation point is much more egocentric, and either we create the virtual cockpit sensation [1] or the perceived movements must exactly match the ones that the observer intentionally executes, in order to avoid cybersickness effects.
From previous studies we found that in VR people start to feel nauseous in certain situations, namely when the motion of a scene is presented while the user is standing still. This occurs because of a mismatch between our visual and vestibular systems: the former perceives movement in the scene, while the latter reports that our body is not physically moving [2]. With this in mind, we have been searching for solutions that support navigation in VR scenarios in order to take full advantage of their capabilities, while trying to avoid any kind of induced discomfort.
II. NATURAL MOTION IN VR
Walking is the most natural way for humans to move, and as such it is natural to use it to support locomotion in VR. Although until recently it was hard (or very expensive) to capture the user's walking motion and use it to create the corresponding sensation inside a VR system, recently introduced tracking systems have brought us very interesting (and low-cost) solutions. Using them, it is possible to have users walking through virtual environments. This possibility, although very interesting, has serious limitations to its use. These arise from the length of the HMD cables used by most systems, which limit the extent of the motion, and from the possible presence of physical obstacles, like walls, furniture, or people, which, unless they have a corresponding representation inside the VR world, may lead to unexpected collisions and consequent injuries.
The ideal would therefore be to enable the user to walk and have that motion fully captured and mapped into the VR environment, but without physically moving, and thus without any problem with cable lengths or obstacles. The Omni platform was recently introduced with the purpose of enabling the user to literally walk in a VR world. Here, the user, fixed on top of the platform by a harness and wearing special shoes, can walk on the platform, with each step translated into the corresponding displacement in the VR environment.
But humans have developed transportation systems to increase traveling velocities and/or reduce effort, especially for long-distance or long-duration motions. Consequently, and for the sake of realism, it is natural that similar experiences be brought into immersive systems. It is thus quite natural to use devices that aim at replicating the sensations of driving a car, piloting a plane, or similar, but also others that create entirely new experiences. As examples, some recent devices have been proposed for VR interaction, like Icaros [3] and Birdly [4], which were designed to create bird-like flying experiences.
This work presents another solution for enabling realistic user control of displacements in a VR system, aiming to provide the user with tools to navigate through endless virtual environments without feeling motion sickness.
III. APPARATUS
Observing the world with our focus on traveling mechanisms, we can identify situations where the information captured by the visual and vestibular systems does not match and yet people in general do not feel nauseous, such as driving a vehicle or riding a bicycle. Our hypothesis is that if a device enables us to sense some of the movement effects, in a way that we can anticipate or control the movement, coupled with visual cues, we will not experience motion sickness.
A. Developed Platform
To achieve our goal we developed a system based on the control mechanics of a SegwayTM. It is composed of a rotating platform with a handlebar that serves as the steering bar. Tilting the handlebar left or right rotates the platform and the virtual view accordingly, while tilting it forward or backward produces a small vibration on the platform and moves the user forward or backward in the virtual environment. For the visual system we use the Oculus Rift DK2 Head-Mounted Display (HMD) (Fig. 1).
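The control mapping described above can be sketched as follows. This is a minimal illustration, not the authors' actual implementation: the gains, deadzone, and function names are assumptions chosen for clarity.

```python
import math

# Hypothetical sketch of the Segway-like control mapping: lateral handlebar
# tilt steers (yaw), while forward/backward tilt drives the virtual motion.
YAW_GAIN = 0.8      # rad/s of view rotation per radian of lateral tilt (assumed)
SPEED_GAIN = 2.0    # m/s of virtual speed per radian of pitch tilt (assumed)
DEADZONE = 0.05     # rad; ignore tiny tilts to keep the view stable (assumed)

def apply_deadzone(tilt, dz=DEADZONE):
    """Suppress small unintentional tilts."""
    return 0.0 if abs(tilt) < dz else tilt

def update_pose(pose, lateral_tilt, pitch_tilt, dt):
    """Advance the virtual pose (x, y, yaw) by one control step of dt seconds."""
    x, y, yaw = pose
    yaw += YAW_GAIN * apply_deadzone(lateral_tilt) * dt
    speed = SPEED_GAIN * apply_deadzone(pitch_tilt)
    x += speed * math.cos(yaw) * dt
    y += speed * math.sin(yaw) * dt
    return (x, y, yaw)
```

With a forward tilt of 0.2 rad for a 0.1 s step, the pose advances along the current heading; tilts below the deadzone leave the pose unchanged.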
B. Virtual Environment
To demonstrate and test the system we built a virtual environment with some obstacles where the user can experience our proposed locomotion mechanism (Fig. 2). Using the developed platform and the tracking system of the Oculus Rift DK2 (including its camera), the user is able to freely move and look around the virtual scene.
C. Physiological Data
In order to better understand what users are feeling while using the system, we keep track of their physiological activity, such as Electrocardiography (ECG), Electrodermal Activity (EDA), Body Acceleration (BA), and Body Temperature (BT), for later processing and analysis. The bio-signal collecting device used was the BITalino, a low-cost toolkit especially designed for this purpose. This opens the possibility of automatically detecting user discomfort through the analysis of variations in some of these parameters.
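One simple way such variation analysis could work is to flag samples that deviate strongly from their recent baseline. The sketch below is an illustrative assumption, not the authors' pipeline; the window size and z-score threshold are arbitrary choices for demonstration.

```python
import statistics

def discomfort_flags(samples, window=30, z_threshold=3.0):
    """Flag possible discomfort: True where a physiological sample (e.g. EDA)
    is anomalous relative to the preceding `window` samples."""
    flags = []
    for i, value in enumerate(samples):
        baseline = samples[max(0, i - window):i]
        if len(baseline) < 2:
            # Not enough history to estimate a baseline yet.
            flags.append(False)
            continue
        mean = statistics.fmean(baseline)
        stdev = statistics.stdev(baseline)
        z = 0.0 if stdev == 0 else (value - mean) / stdev
        flags.append(abs(z) > z_threshold)
    return flags
```

For example, a steady signal followed by a sudden spike would be flagged only at the spike, which could then be correlated with the user's position in the virtual scene.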
IV. CONCLUSION
This demonstration aims at showing that cybersickness can be reduced by using a system capable of providing the right motion feedback to the user. It maps the movements in the virtual environment onto physical cues (e.g. the Segway-like mechanics), providing a synchronous relation between the visual and vestibular systems. The instructions for building the necessary setup, as well as the software, can be obtained from [5].
REFERENCES
[1] J. C. G. Sanchez, B. Patrão, L. Almeida, J. Perez, P. Menezes, J. Dias, and P. Sanz, "Design and evaluation of a natural interface for remote operation of underwater robots," IEEE Computer Graphics and Applications, vol. PP, no. 99, 2015.
[2] B. Patrão, S. N. Pedro, and P. Menezes, "How to deal with virtual reality sickness," in EPCGI'2015: The 22nd Portuguese Conf. on Computer Graphics and Interaction, Coimbra, Portugal, 2015.
[3] Icaros Team, "Icaros." [Online]. Available: http://www.icaros.net/
[4] Somniacs, "Birdly." [Online]. Available: http://www.somniacs.co/
[5] P. Menezes et al., "Learn AR/VR related subjects." [Online]. Available: http://orion.isr.uc.pt/index.php/arvr