
Working with Virtual Worlds: nDreams Dev Diary #3

We take a lot of common features in videogames for granted. It’s tempting to think that transitioning an experience to virtual reality (VR) is as simple as converting the title to 3D and slapping on a headset, but that just isn’t the case. Instead, developers need to rethink seemingly simple concepts such as height and movement, and even staples such as heads-up displays (HUDs).

That’s what nDreams’ Jackie Tetley is here to talk about in this week’s nDreams developer diary, documenting the creation of the studio’s first virtual reality (VR) project for the Oculus Rift and Project Morpheus headsets. The team is currently working hard to get its fast-approaching E3 2014 demo up to scratch, which has meant some out-of-office hours for some of the team this week.


Jackie Tetley, nDreams: HUDs (aka heads-up displays – things like reticules, health counters, ammo data, etc. that overlay your view of the game) for VR require considerable thought.

Barring cyborgs from the future, humans don’t have data overlaying their vision. Littering the view with HUD elements can therefore break immersion in VR. It works better to place elements in the world, where possible. For example, if your character was up against a time constraint, she could glance down at her wristwatch or look at a wall display.
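To illustrate the idea, here is a minimal sketch (in plain Python, independent of any particular engine) of a world-anchored UI element: instead of drawing in a fixed screen overlay, the element follows an anchor in the scene, such as a wrist bone, each frame. The class and offset values are hypothetical, purely for illustration.

```python
def add_vec(a, b):
    """Component-wise vector addition for simple (x, y, z) tuples."""
    return tuple(x + y for x, y in zip(a, b))

class WorldAnchoredLabel:
    """A UI element that lives in the scene, not in a screen-space overlay."""
    def __init__(self, offset):
        self.offset = offset          # local offset from the anchor (e.g. wrist)
        self.world_pos = (0.0, 0.0, 0.0)

    def update(self, anchor_pos):
        # Re-anchor every frame so the label moves with the player's hand.
        self.world_pos = add_vec(anchor_pos, self.offset)

# A watch face sitting 2cm above the wrist position (assumed units: metres).
watch = WorldAnchoredLabel(offset=(0.0, 0.02, 0.0))
watch.update(anchor_pos=(0.4, 1.1, 0.3))
```

Because the watch is part of the world, the player focuses on it exactly as they would a real object, rather than on text floating in front of their eyes.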

Another thing to consider is point of focus. While reading this, think for a second about how much of what you see is actually in your peripheral vision. In order to see it clearly you have to switch your focus, by swivelling your eyes or turning your head.

VR reflects this closely. The centre of your view is clear, blurring out to the edges so you have to turn your head to focus on elements in the periphery. No eye tracking yet, but people are experimenting.

What this means for us is that the familiar approach of carefully tucking HUD elements into the edges of the screen just isn’t viable. However, neither is placing everything in the centre of the screen, unless you want the player to see little else.
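A rough way to reason about this trade-off is in terms of angular distance from the gaze direction. The sketch below (a simplification, with assumed threshold angles rather than measured ones) classifies where a HUD element falls: the crisp central zone, a comfortable near-centre band, or the blurred periphery that demands a head turn.

```python
import math

def angle_from_gaze(forward, target):
    """Angle in degrees between the view direction and a target direction."""
    dot = sum(f * t for f, t in zip(forward, target))
    norm = (math.sqrt(sum(f * f for f in forward))
            * math.sqrt(sum(t * t for t in target)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Hypothetical comfort thresholds -- assumed values, not measured figures.
CLEAR_FOV_DEG = 15.0   # readable without moving the eyes much
MAX_FOV_DEG = 45.0     # beyond this a head turn is needed to focus

def hud_placement(forward, element_dir):
    a = angle_from_gaze(forward, element_dir)
    if a <= CLEAR_FOV_DEG:
        return "center"       # always readable, but occludes the action
    if a <= MAX_FOV_DEG:
        return "comfortable"  # visible with a small eye movement
    return "head-turn"        # effectively invisible until the player looks
```

An element straight ahead lands in "center"; one slightly off-axis is "comfortable"; one at ninety degrees needs a "head-turn" — which is why screen-corner HUDs stop working in VR.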

So what has this to do with our E3 demo progress this week? Naturally it impacts the entire game, but as the E3 demo will be completed soon, we’ve been assessing our implementation and are looking forward to getting feedback from demo users next month.

We had a good voice recording session earlier in the week, and the lines have already been dropped into the demo. I mostly work in-office, so it was a pleasant change to be in a studio watching actors at work. It also gave me a chance to work on my explanation of VR to non-gamers, and to try (and fail) at actor-appropriate small talk.

Next week will see polishing continue in earnest, and no doubt there will be a few “lively discussions” about how many mm to the left something should be moved, or whether a line of dialogue would be better triggered 0.1 seconds earlier. Try to contain your excitement, folks!