Locomotion in VR – The Science behind it


Overview

In VR, the concept of standard locomotion controls is turned on its head. Most of the time, traditional game controls don’t work well in virtual reality games. Lateral movement with a thumbstick or a keyboard can trigger motion sickness in a lot of people. Your inner ear (the vestibular system) governs your sense of balance and spatial awareness; if what your inner ear perceives differs from what your eyes perceive (a vestibular mismatch), you can lose your balance or get dizzy. In extreme cases, a vestibular mismatch can even trigger nausea or vomiting.

Vestibular mismatch problems became apparent in the early days of Oculus VR development. VR developers quickly discovered what triggers motion sickness, and several devs started working on solutions to the problem. So far, we’ve encountered no fewer than half a dozen different locomotion mechanics. None of them are perfect, but each one has its merits. If nothing else, the variety of options illustrates the ingenuity of the VR developer community.

How your body knows it’s moving

The human body is an interesting contraption: most of what you know about where you are in the world comes from a combination of sensory inputs that your brain processes and integrates into a picture of your location and orientation. In no particular order…

Your eyes tell you a lot about the world. They give you information about form and color, and about depth and distance. The latter is important for VR: the parallax effect, where nearby objects appear to shift more than distant ones as you move, gives your brain information about your position relative to objects in your environment.

[Figure: example of the parallax effect]

For VR developers this isn’t a big issue, since most objects within VR are 3D models placed at fixed positions in space. The real challenge has to do with motion: how the user moves around in 3D space and what that feels like. To understand why this is hard, we have to look inside your ears.

Your ears contain the vestibular system, a group of components (bones, tubes, hairs, etc.) within the inner ear that contributes to your sense of balance and locomotion. In a nutshell, your inner ear contains fluid-filled chambers lined with small hairs called cilia. When your body moves, the fluid sloshes around and stimulates the cilia, and their motion tells your brain what is happening to your body (for a more detailed overview, feel free to read more on Wikipedia).

These cilia detect translations and rotations (movement and turning), which give your brain information about:

  1. Orientation — Are you upside down or sideways?
  2. Movement — You’re moving, but in which direction?
  3. Acceleration — What is the rate of change of your movement or rotation?

A process called proprioception lets you know where your body parts are without needing to look at them. If you shut your eyes, it’s pretty easy to tell where your hands and feet are, right?

Combining visual inputs, vestibular inputs, and processes like proprioception tells us everything we need to know about where we are in the world.

The challenge for VR is that humans evolved to actually move through physical space; when one of our sensory systems tells us we’re moving and another does not, sensory misalignment occurs and our brains get confused.

[Animated figure: https://cdn-images-1.medium.com/max/900/1*4Ayo6lCADyPxEHAIIbiMqg.gif]

Passive vs. Active Movement

To deal with this sensory misalignment, a few approaches have emerged in the VR space so far, and the resulting user experiences break down into two categories: Passive and Active Movement.

Passive Movement refers to movement that your own body does not control. A common example is moving forward in a first-person shooter (FPS) by using a thumbstick on a controller. In this instance, your eyes tell you you’re moving forward at a certain velocity and acceleration, but your vestibular system does not, so people tend to feel a bit ‘off’ after a while.
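
To make the idea concrete, here is a minimal, engine-agnostic sketch of this kind of thumbstick ("smooth") locomotion in TypeScript. The names and values (MOVE_SPEED, headYaw, the rotation convention) are illustrative assumptions rather than any particular engine’s API; the point is simply that the camera is displaced every frame while the body stays still.

```typescript
// Minimal sketch of passive thumbstick ("smooth") locomotion: the camera is
// displaced every frame based on controller input, so the eyes report motion
// while the vestibular system reports none.

interface Vec3 { x: number; y: number; z: number; }

const MOVE_SPEED = 3.0; // metres per second (illustrative value)

// stickX/stickY are the thumbstick axes in [-1, 1]; headYaw is the headset's
// yaw in radians, so "forward" follows where the user is looking.
function smoothLocomotionStep(
  position: Vec3,
  stickX: number,
  stickY: number,
  headYaw: number,
  deltaTime: number
): Vec3 {
  // Rotate the 2D stick vector by the head yaw to get a world-space direction.
  const dirX = stickX * Math.cos(headYaw) - stickY * Math.sin(headYaw);
  const dirZ = stickX * Math.sin(headYaw) + stickY * Math.cos(headYaw);

  return {
    x: position.x + dirX * MOVE_SPEED * deltaTime,
    y: position.y, // no vertical movement; gravity/terrain handled elsewhere
    z: position.z + dirZ * MOVE_SPEED * deltaTime,
  };
}
```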

Active Movement is exactly what it sounds like: your body is in control of the movement. The best example is using room-scale tracking with the Vive or Rift and physically walking around a room, crouching down and standing up again. These experiences rarely lead to sickness, unless the VR environment deliberately moves objects in a disorienting way.

There is a spectrum of movement, of course. Even when engaging in Passive Movement while playing an FPS, you can still look around actively, and your brain senses that head movement through both vision and the vestibular system; but the moment passive movement of the body combines with active movement of the head, sensory misalignment can occur.

Strategies for VR Movement

To deal with these types of movement, and to alleviate motion sickness in VR, there are currently three main strategies:

Physically Walking Around: One of the best strategies, if you have access to the technology. This strategy reduces vection and motion sickness, giving the experience a more realistic feel.

Hand-based Controllers: As outlined above, not the best strategy. Your ears won’t know you’re moving, so if your users are sitting down while in VR, they need to be prepared for some vection.

Teleporting: This strategy is present in many FPS-style games. You can physically walk around the environment (with room sensors), but to move beyond the confines of your physical space, you use the controllers to pick a point within your visual range and teleport to it, then walk around again from there. If you don’t have room sensors, the strategy is limited to standing in place, teleporting to the next spot, and repeating.
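
As a rough illustration of the targeting step, the sketch below intersects the controller’s pointing ray with a flat floor and rejects targets beyond a maximum range. The names (rayOrigin, rayDirection) and the 10-metre limit are assumptions for the example; a real app would raycast against actual scene geometry and validate the destination.

```typescript
// Sketch of picking a teleport target along the controller's pointing ray.

interface Vec3 { x: number; y: number; z: number; }

const MAX_TELEPORT_DISTANCE = 10; // metres (illustrative limit)

// Intersect the pointer ray with the floor plane (y = 0) and return the hit
// point, or null if the ray points away from the floor or the hit is too far.
function pickTeleportTarget(rayOrigin: Vec3, rayDirection: Vec3): Vec3 | null {
  if (rayDirection.y >= 0) return null;      // pointing at or above the horizon
  const t = -rayOrigin.y / rayDirection.y;   // distance along the ray to y = 0
  const hit: Vec3 = {
    x: rayOrigin.x + rayDirection.x * t,
    y: 0,
    z: rayOrigin.z + rayDirection.z * t,
  };
  const dx = hit.x - rayOrigin.x;
  const dz = hit.z - rayOrigin.z;
  if (Math.hypot(dx, dz) > MAX_TELEPORT_DISTANCE) return null;
  return hit; // the app would then move the play-space origin to this point
}
```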


The Google Daydream Labs team has researched and documented some key findings that should be kept in mind when looking at locomotion in VR:

1. Constant velocity. Locomotion in VR can cause motion sickness when there’s a conflict between a person’s vision and their sense of balance. For example, if you see images showing you accelerating through space, like on a roller coaster, but you’re actually sitting stationary in a room, then your vision and vestibular system will disagree. A way to mitigate this is to use constant velocity during locomotion. Although acceleration can be used to produce more realistic transitions, constant velocity is far more comfortable than acceleration in VR.
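
The difference is easy to see in code. The sketch below, using illustrative speed and acceleration constants, advances a position along one axis each frame: the first function keeps the speed fixed for the whole transition, while the second ramps it up and is therefore more likely to cause discomfort.

```typescript
// Constant velocity vs. accelerated motion, reduced to a 1D per-frame update.

const CRUISE_SPEED = 3.0;  // metres per second (illustrative)
const ACCELERATION = 1.5;  // metres per second squared (illustrative)

// Comfortable: the camera moves at a fixed speed for the whole transition.
function constantVelocityStep(position: number, deltaTime: number): number {
  return position + CRUISE_SPEED * deltaTime;
}

// More "realistic" but more likely to trigger discomfort: speed ramps up over
// time, so the visual acceleration has no matching vestibular signal.
function acceleratedStep(
  position: number,
  velocity: number,
  deltaTime: number
): { position: number; velocity: number } {
  const newVelocity = Math.min(velocity + ACCELERATION * deltaTime, CRUISE_SPEED);
  return { position: position + newVelocity * deltaTime, velocity: newVelocity };
}
```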

2. Tunneling. Tunneling is a technique used with first-person locomotion (such as walking) where, during movement, the camera is cropped and a stable grid is displayed in your peripheral vision. This is analogous to watching first-person locomotion on a television set.

Even though TV shows and movies contain moving images with acceleration, most people don’t experience motion sickness while watching TV. This is perhaps because the TV only takes up a small part of your field of view and your peripheral vision is grounded by a stationary room. VR developers can simulate this by showing people a visual “tunnel” while they’re moving in a 3D environment. We also found it helps to fade the tunnel effect in and out to avoid making it a distraction. We used this approach in Google Earth VR in a feature called Comfort Mode.
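
One common way to implement this is to drive a vignette ("tunnel") strength from the artificial movement speed and ease it in and out over time. The sketch below shows only that logic; the constants and the way the strength is applied (a shader parameter, an overlay mesh, etc.) are assumptions, since that part is engine-specific.

```typescript
// Speed-driven tunnel vignette with a smooth fade in/out.

const MAX_VIGNETTE = 0.6;     // fraction of the view covered at full strength (illustrative)
const SPEED_THRESHOLD = 0.1;  // metres per second below which no tunnel is shown
const FADE_RATE = 4.0;        // per second; higher = faster fade

function updateTunnelVignette(
  currentStrength: number,
  movementSpeed: number,
  deltaTime: number
): number {
  // Tunnel only while the camera is moving artificially.
  const target = movementSpeed > SPEED_THRESHOLD ? MAX_VIGNETTE : 0;

  // Exponential smoothing toward the target gives a gentle fade in and out.
  const blend = 1 - Math.exp(-FADE_RATE * deltaTime);
  return currentStrength + (target - currentStrength) * blend;
}
```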

3. Teleportation. Teleportation is a locomotion technique for apps using first-person perspective that allows you to near-instantaneously move to a target location. This technique reduces the simulator sickness that many people feel when the virtual camera moves. However, it also makes it harder for people to maintain spatial context—“where am I, and how did I get here?” We found there are subtle things that can ease the transition and improve context. For example, Google Street View on Daydream fades before and after teleportation. Also, when you teleport to a new location, the app quickly moves the entire scene toward you to convey directional motion. This effect is called “implied motion.”
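
The ordering of that fade–move–fade sequence can be sketched as follows. The setFadeOpacity and setPlayerPosition callbacks are hypothetical stand-ins for whatever the rendering layer provides, and the fade duration is an illustrative value; this is not how Street View or any other specific app necessarily implements it.

```typescript
// Sketch of a fade-out / move / fade-in teleport. setFadeOpacity is assumed to
// animate a full-screen fade toward the given opacity (0 = clear, 1 = black)
// over roughly FADE_DURATION_MS; setPlayerPosition moves the play-space origin.

interface Vec3 { x: number; y: number; z: number; }

const FADE_DURATION_MS = 150; // short enough that the pause is barely noticeable (illustrative)

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function teleportWithFade(
  setFadeOpacity: (opacity: number) => void,
  setPlayerPosition: (position: Vec3) => void,
  target: Vec3
): Promise<void> {
  setFadeOpacity(1);              // start fading to black
  await sleep(FADE_DURATION_MS);  // wait until the view is hidden

  setPlayerPosition(target);      // relocate while nothing is visible

  setFadeOpacity(0);              // fade back in at the new location
  await sleep(FADE_DURATION_MS);
}
```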

4. Rotation. It’s often tempting to design a VR experience where we assume that people will be either standing or sitting in a swivel chair. Unfortunately, hardware limitations or physical constraints may not allow for full 360-degree rotation. To make sure people can get where they want to go in a VR environment, consider giving them the ability to rotate themselves within the virtual space. Continuous and animated rotations tend to induce motion sickness. Instead, we’ve found that discrete, instantaneous rotations of about 10-20 degrees feel comfortable and provide sufficient visual context to keep people oriented.
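
A discrete rotation like this is usually implemented as a "snap turn" on the thumbstick, with a latch so that one flick produces exactly one turn. The angle, deadzone, and function names below are illustrative assumptions, not a specific engine’s API.

```typescript
// Discrete ("snap") rotation: the view yaw jumps by a fixed increment per flick.

const SNAP_ANGLE_DEGREES = 15; // within the comfortable 10-20 degree range
const STICK_DEADZONE = 0.7;    // how far the stick must be pushed to trigger a snap

// Returns the new yaw (degrees) plus a latch flag: the stick must return to
// centre before another snap is allowed, so holding it doesn't spin the view.
function snapRotationStep(
  currentYaw: number,
  stickX: number,
  readyForSnap: boolean
): { yaw: number; readyForSnap: boolean } {
  if (readyForSnap && Math.abs(stickX) > STICK_DEADZONE) {
    const direction = Math.sign(stickX);
    return { yaw: currentYaw + direction * SNAP_ANGLE_DEGREES, readyForSnap: false };
  }
  if (Math.abs(stickX) < STICK_DEADZONE) {
    return { yaw: currentYaw, readyForSnap: true };
  }
  return { yaw: currentYaw, readyForSnap };
}
```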

References:

Rob Jagnow. 2017. Daydream Labs: Locomotion in VR. [ONLINE] Available at: https://www.blog.google/products/google-vr/daydream-labs-locomotion-vr/. [Accessed 2 December 2017].

Really Virtual. (2016). Locomotion in Virtual Reality games – Teleport vs Controller. [Online Video]. 15 June 2016. Available from: https://www.youtube.com/watch?v=3XSCtOt2ieY. [Accessed: 1 November 2017].

Bumble. (2016). Locomotion in VR: Overview of different locomotion methods on HTC Vive. [Online Video]. 17 June 2016. Available from: https://www.youtube.com/watch?v=p0YxzgQG2-E. [Accessed: 1 November 2017].

By Arjun
