Back in June ManoMotion released a software development kit (SDK) allowing developers to add hand gestures to any virtual reality (VR), augmented reality (AR) or mixed reality (MR) application. One of the biggest AR launches this year was Apple’s ARKit framework, and now the computer vision specialist has added support for it to its SDK.
ManoMotion’s gesture technology uses a standard 2D camera to recognise and track many of the 27 degrees of freedom (DOF) of a hand’s motion, all in real-time. ARKit developers will now be able to bring users’ hands into their projects, letting them pick up AR objects rather than just tap on a screen.
The current version features a set of predefined gestures, such as point, push, pinch, swipe and grab, offering developers a range of interactive possibilities depending on what they want to achieve, or allow users to do.
“Up until now, there has been a very painful limitation to the current state of AR technology – the inability to interact intuitively in depth with augmented objects in 3D space,” said Daniel Carlman, co-founder and CEO of ManoMotion in a statement. “Introducing gesture control to the ARKit, and being the first in the market to show proof of this, for that matter, is a tremendous milestone for us. We’re eager to see how developers create and potentially redefine interaction in Augmented Reality!”
ManoMotion’s SDK will initially be made available for Unity iOS, with Native iOS following in subsequent updates. Developers interested in using ManoMotion’s SDK with ARKit should visit: https://www.manomotion.com/get-started/.
In addition to ARKit, Google’s recently announced ARCore will also see ManoMotion integration, with a release date expected in the near future.
VRFocus will continue its coverage of ManoMotion, reporting back with the latest updates.