ManoMotion Releases SDK for Developers to Incorporate Hand Gestures Into VR

The SDK supports both native iOS and Android.

Computer vision specialist ManoMotion has announced the launch of its software development kit (SDK) for developers, enabling them to add hand gestures into any virtual reality (VR), augmented reality (AR) or mixed reality (MR), applications they create.

Up until this point, ManoMotion has been working with customers on a purely one-on-one basis, but with the SDK’s release, far more developers can get hands-on with the company’s technology.

ManoMotion will offer the SDK under a freemium model, tiered to fit different customer needs. The SDK allows users to see their hands and move objects in a VR/AR/MR space, using either the left or right hand. Dynamic gestures, such as swipes and clicks, can be added for the manipulation of menus and displays, while predefined gestures, such as point, push, pinch, swipe and grab, can also be included.

“The launch of our SDK is a significant milestone in our company’s history,” said Daniel Carlman, co-founder and CEO of ManoMotion, in a statement. “It marks the start of a new community and knowledge base around gesture technology, to which ManoMotion will show undying commitment and contribution. We can’t wait to see what developers create!”

ManoMotion’s real-time 3D gesture recognition technology uses a standard 2D camera to recognise and track many of the 27 degrees of freedom (DOF) of motion in a hand. It also tracks depth and handles dynamic gestures (such as swipes, clicks, taps, and grab-and-release), whilst maintaining a small footprint in terms of CPU, memory and battery consumption.
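As a rough illustration of the dynamic-gesture detection described above, here is a minimal sketch of how a swipe might be classified from a stream of 2D hand positions. This is a hypothetical example, not ManoMotion’s actual API; the function name, coordinate convention and thresholds are all illustrative assumptions.

```python
# Hypothetical sketch: classify a horizontal swipe from a sequence of
# 2D palm-centre positions, such as a single-camera hand tracker might emit.
# All names and thresholds here are illustrative, not ManoMotion's API.

def classify_swipe(positions, min_distance=0.3, max_drift=0.1):
    """positions: list of (x, y) tuples in normalised [0, 1] screen coordinates.

    Returns "swipe_left" / "swipe_right" when the hand travels far enough
    horizontally without drifting too much vertically, otherwise None.
    """
    if len(positions) < 2:
        return None
    dx = positions[-1][0] - positions[0][0]  # net horizontal travel
    dy = positions[-1][1] - positions[0][1]  # net vertical drift
    if abs(dx) >= min_distance and abs(dy) <= max_drift:
        return "swipe_right" if dx > 0 else "swipe_left"
    return None

# Example: the hand moves from x=0.2 to x=0.7 with little vertical drift,
# so this reads as a rightward swipe.
print(classify_swipe([(0.2, 0.5), (0.4, 0.5), (0.7, 0.52)]))  # swipe_right
```

In a real pipeline, detections like this would feed the menu and display manipulation the article mentions, with the SDK handling the underlying tracking.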

The SDK supports both native iOS and Android development, and comes with a Unity game engine plugin for both platforms. Head to the ManoMotion website to apply for access.

For any further updates on ManoMotion, keep reading VRFocus.