The technical challenges of mobile VR have long frustrated developers. Chief among them is how to track users' motion without bulky hardware add-ons such as external sensors.
Clay VR has launched its solution to this key problem: an SDK for VR/AR game developers. Users of apps built with the SDK will be able to see their hands and use them to interact with virtual worlds, using only the phone's built-in camera.
The phone’s camera tracks the user’s hand position in 3D, and the software recognises hand gestures in real time.
This means that developers will be able to add further degrees of interactivity and complexity to their mobile VR and AR experiences.
AI gesture recognition
To make gesture recognition and 3D motion tracking possible without bulky hardware, Clay uses a combination of its patented Z Buffering software and advanced AI.
The AI-driven gesture recognition is also lightweight: the company says it adds a CPU overhead of just 9% on an iPhone 7.
The SDK ships with a library of hand gestures that developers can use out of the box, with the ability to add more to suit their requirements.
The company claims a latency of only 4ms between a gesture being made and the software recognising it.
“As new-borns, it takes months before we can see properly, but we’re grasping and waving from the moment we’re born,” said Clay CEO Thomas Amilien.
“Our hands are fundamental to how we experience reality. Adding this element to mobile VR, making it feel natural and most of all, accessible for everyone, changes everything about what’s possible in the industry.”