Mixed reality (MR) capture can often be a pain to set up, despite being one of the most effective ways of showing what is really happening inside virtual reality (VR). Owlchemy Labs, creators of Job Simulator and Rick and Morty Simulator: Virtual Rick-ality, have taken it upon themselves to rethink how MR can be achieved, and they have come up with a solution.
The solution's full name is Depth-based Realtime In-app Mixed Reality Compositing. As the name suggests, all of the hard work happens in-engine, using a custom shader and a custom plugin to green-screen the user and bring them straight into the engine.
There are seven listed advantages: per-pixel depth, no extra software to stream, no compositing complications, a static or tracked dolly camera mode, everything running on one machine, no requirement to wear a head-mounted display (HMD), and automatic lighting.
As explained by Owlchemy Labs itself: “Using a stereo depth camera (in this case a ZED Stereo Camera), we record both video and depth data of the user on a green screen at 1080p at 30fps. The stereo depth camera is essential since infrared based cameras (Kinect, etc) can and will interfere with many VR tracking solutions. We then pump the stereo data in real-time into Unity using a custom plugin and a custom shader to cutout and depth sort the user directly in the engine renderer, which yields this amazing result. Not only is there no external application required to set this up, but you do not even need to be wearing a HMD for this to work.”
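The two core steps Owlchemy Labs describes, cutting the user out of the green screen and depth-sorting them against the rendered scene, can be sketched in a few lines. The code below is an illustrative approximation, not Owlchemy's actual shader: NumPy arrays stand in for the camera frame, the stereo depth map, and the engine's colour and depth buffers, and the `composite` function and its tolerance parameter are assumptions for the sake of the example.

```python
# Hypothetical sketch of per-pixel chroma-key cutout plus depth sorting,
# the core idea behind depth-based in-app MR compositing.
import numpy as np

def composite(cam_rgb, cam_depth, scene_rgb, scene_depth,
              key=(0, 255, 0), tol=60):
    """Cut the user out of a green screen and depth-sort them into the scene.

    cam_rgb     : (H, W, 3) uint8 camera frame of the user on green
    cam_depth   : (H, W) float  per-pixel depth from the stereo camera
    scene_rgb   : (H, W, 3) uint8 rendered virtual scene
    scene_depth : (H, W) float  engine depth buffer
    """
    # Chroma key: pixels close to the key colour are background, not the user.
    diff = np.abs(cam_rgb.astype(int) - np.array(key)).sum(axis=-1)
    user_mask = diff > tol

    # Depth sort: the user only appears where they are closer than the scene,
    # so virtual objects can correctly pass in front of them.
    user_in_front = user_mask & (cam_depth < scene_depth)

    out = scene_rgb.copy()
    out[user_in_front] = cam_rgb[user_in_front]
    return out

# Tiny 1x2 example: the left pixel is the user in front of the scene,
# the right pixel is green screen, so the virtual scene shows through.
cam_rgb = np.array([[[200, 50, 50], [0, 255, 0]]], dtype=np.uint8)
cam_depth = np.array([[1.0, 1.0]])
scene_rgb = np.array([[[10, 10, 200], [10, 10, 200]]], dtype=np.uint8)
scene_depth = np.array([[2.0, 2.0]])
print(composite(cam_rgb, cam_depth, scene_rgb, scene_depth))
```

In the real system this logic runs per pixel on the GPU inside Unity's renderer, which is why no external compositing application is needed.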
There are plans to flesh this idea out and bring it to users. Until then, check back with VRFocus for the latest in VR.