Roomscale virtual reality (VR) technology is a wonderful thing. It allows you to explore a virtual world on your own feet, wandering round a room and interacting with objects as if you were really there. But while the HTC Vive system, for example, can cover an area of 15ft x 15ft, not everyone has that amount of space to work with, meaning walls and furniture can quickly be bumped into if you're not careful. So during the GPU Technology Conference (GTC) 2018, NVIDIA Research demonstrated a new technique it's been working on in collaboration with Adobe and Stony Brook University to make physical areas seem much bigger in VR.
Called Saccadic Redirected Walking, the technique utilises a quirk of human vision: involuntary eye movements effectively blind you for a moment several times per second. These movements, known as saccades, are imperceptible because they last only tens of milliseconds.
So during those fractions of a second the technique rotates the scene ever so slightly. Without the user noticing, this guides them along a physical path that's subtly different to the one they're viewing in the virtual world. As shown in the image above, this means the player's physical path can stay small while the virtual one seems far larger: imagine walking round a grand hall just in your living room. It also helps with avoiding objects like walls, or other players if systems are set up nearby.
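To make the idea concrete, here is a minimal sketch of saccade-gated redirection. It is not NVIDIA's implementation; the velocity threshold, the per-saccade rotation cap, and the function names are all assumptions for illustration. The core logic matches the description above: when the eye tracker reports gaze moving fast enough to count as a saccade, nudge the virtual scene's yaw a tiny step toward the offset the redirection planner wants; during fixation, leave the scene alone.

```python
# Assumed values, not from the paper: a gaze speed above this
# threshold is treated as a saccade, and each saccade may hide
# at most this much scene rotation.
SACCADE_VELOCITY_DEG_S = 180.0
MAX_ROTATION_PER_SACCADE_DEG = 0.5

def redirect_yaw(gaze_velocity_deg_s: float,
                 desired_offset_deg: float,
                 applied_offset_deg: float) -> float:
    """Return the updated scene yaw offset for one frame.

    During a saccade, step the applied offset toward the desired
    offset by at most MAX_ROTATION_PER_SACCADE_DEG; otherwise the
    eyes are fixating and the scene must not move.
    """
    if gaze_velocity_deg_s < SACCADE_VELOCITY_DEG_S:
        return applied_offset_deg  # fixation: no redirection this frame
    remaining = desired_offset_deg - applied_offset_deg
    step = max(-MAX_ROTATION_PER_SACCADE_DEG,
               min(MAX_ROTATION_PER_SACCADE_DEG, remaining))
    return applied_offset_deg + step
```

Called every frame with the eye tracker's gaze velocity, this accumulates many imperceptible rotations into a large overall divergence between the physical and virtual paths.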
NVIDIA is demoing the technique this week at the VR Village using Quadro GPUs, HTC Vive and SMI eye tracking, with guests able to walk around a huge virtual Alice in Wonderland-like chess board with pieces the size of people, all within a 15×15 foot booth.
The teams will be taking the research to SIGGRAPH later this year to present a paper on the technique. As development progresses and further details are released, VRFocus will keep you updated.