NVIDIA Launches VRWorks 360 Video SDK 2.0
More support, faster stitching, better recording and streaming
When they are not preparing new lines of virtual reality (VR) ready GPUs with more cores, more processing speed and more raw power than ever before, or filing a whole range of VR-related trademarks connected to said graphics cards, NVIDIA can still often be found involved in immersive technology on the software side, one way or another.
Primarily that is through NVIDIA VRWorks, the company's development kit for VR developers, which provides a suite of APIs, libraries and engines to enable high-end graphics performance when creating things under the VR umbrella. Their latest update once again brings improvements to the 360 Video SDK, which has already got companies like STRIVR and Pixvana excited.
“When you experience a situation as if you are actually there, learning retention rates can soar,” commented Brian Meek, Chief Technology Officer of STRIVR, on NVIDIA’s official blog. “The new Warp 360 will help ensure our customers stay fully immersed, and the functionality and performance that Turing brings to VRWorks can’t be beat.”
The new version 2.0 update accelerates the stitching of 360-degree videos and adds a host of other features that make recording and streaming in 360 degrees a lot easier. It also adds support for NVIDIA CUDA 10 and, of course, the most recent additions to NVIDIA's GPU line-up. The new features include:
Ambisonic Audio – increases the immersiveness of 360-degree videos by enabling 3D, omnidirectional audio such that the perceived direction of sound sources changes when viewers modify their orientation.
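The core of that head-tracking behaviour is a rotation of the ambisonic sound field. As a rough illustration only (the channel naming and FuMa-style WXYZ ordering here are assumptions, not the SDK's actual API), a first-order B-format sample can be rotated about the vertical axis like this:

```python
import math

def rotate_bformat_yaw(w, x, y, z, yaw):
    """Rotate a first-order ambisonic (B-format) sample about the
    vertical axis by `yaw` radians. The omnidirectional W channel and
    the vertical Z channel are unaffected; the horizontal dipole
    channels X and Y rotate like a 2-D vector. Channel ordering and
    convention (FuMa WXYZ) are illustrative assumptions."""
    c, s = math.cos(yaw), math.sin(yaw)
    return w, c * x - s * y, s * x + c * y, z
```

Applying the inverse of the viewer's head yaw per audio frame keeps sound sources anchored to the scene rather than to the viewer's head.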
Custom Region of Interest Stitch – enables adaptive stitching by defining the desired field of view rather than stitching a complete panorama. This enables new use cases such as 180-degree video while reducing execution time and improving performance.
Improved Mono Stitch – increases robustness and improves image quality for equatorial camera rigs. Multi-GPU setups are now supported for up to 2x scaling.
Moveable Seams – allows the seam location in the region of overlap between two cameras to be adjusted manually to preserve visual fidelity, particularly when objects are close to the camera.
New Depth-Based Mono Stitch – uses depth-based alignment to improve the stitching quality in scenes with objects close to the camera rig and improves the quality across the region of overlap between two cameras. This option is more computationally intensive than moving the seams, but provides a more robust result with complex content.
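The seam-based approach described above amounts to choosing, within the overlap region, the column at which the output switches from one camera's pixels to the other's. A minimal one-scanline sketch (function and parameter names are hypothetical, not SDK identifiers):

```python
def composite_with_seam(left, right, overlap, seam):
    """Composite two horizontally overlapping scanlines (lists of
    pixel values). `overlap` is the number of columns the two images
    share; `seam` picks the column inside that overlap where the
    output switches from `left` to `right`. Illustrative only."""
    assert 0 <= seam <= overlap
    # Take the left image's exclusive columns plus its share of the
    # overlap up to the seam, then the rest from the right image.
    left_part = left[: len(left) - overlap + seam]
    right_part = right[seam:]
    return left_part + right_part
```

Moving `seam` so it avoids cutting through a nearby object is exactly the kind of manual adjustment the feature exposes; the depth-based stitch instead aligns the overlap automatically at a higher computational cost.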
Warp 360 – provides highly optimized image warping and distortion removal by converting images between a number of 360 projection formats, including perspective, fisheye and equirectangular. It can transform equirectangular stitched output into a projection format such as cubemap to reduce streaming bandwidth, leading to increased performance.
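The projection conversions Warp 360 performs come down to per-pixel direction mapping. VRWorks does this on the GPU; as a CPU-side sketch of the underlying math only (the face layout and function names here are assumptions, not the SDK's convention), mapping a cubemap face coordinate back to an equirectangular source pixel looks like this:

```python
import math

def cubeface_to_equirect(u, v, face, eq_w, eq_h):
    """Map a point (u, v) in [-1, 1]^2 on a cube face to pixel
    coordinates in an equirectangular image of size eq_w x eq_h.
    Face names and orientations are illustrative, not the SDK's
    actual layout."""
    # Direction vector for the requested face (+z treated as front).
    if face == "front":
        x, y, z = u, -v, 1.0
    elif face == "back":
        x, y, z = -u, -v, -1.0
    elif face == "right":
        x, y, z = 1.0, -v, -u
    elif face == "left":
        x, y, z = -1.0, -v, u
    elif face == "top":
        x, y, z = u, 1.0, v
    else:  # bottom
        x, y, z = u, -1.0, -v
    r = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(x, z)   # longitude in (-pi, pi]
    lat = math.asin(y / r)   # latitude in [-pi/2, pi/2]
    # Longitude/latitude -> equirectangular pixel position.
    px = (lon / math.pi + 1.0) * 0.5 * (eq_w - 1)
    py = (0.5 - lat / math.pi) * (eq_h - 1)
    return px, py
```

Because each cube face covers its 90-degree field of view with far less oversampling near the poles than an equirectangular frame, streaming the cubemap form can cut bandwidth for the same perceived quality.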