Patented solution uses natural body movement to reduce sensory imbalances and offer higher-quality VR, AR and FPV drone piloting experiences
MONKEYmedia, Inc., an independent R&D lab with a 25-year track record of developing award-winning, user-friendly technology experiences, today announced the launch of its patented body-based navigation solution (BodyNav™) for hands-free virtual reality interactions. BodyNav leverages the existing on-board sensors of smartphones and advanced 3D headsets in novel and unanticipated ways to engage the body’s innate center of gravity. This human-centered interaction approach reduces motion sickness artifacts and enhances navigation abilities in virtual and augmented realities (VR/AR), as well as first-person view (FPV) drone aviation contexts.
Rob Bamforth, Principal Analyst at research and analysis company Quocirca, explains that “making the physical experience align with human expectations, as I experienced with MONKEYmedia’s BodyNav technology, is critical not only for an effective VR experience, but also for avoiding digitally induced motion sickness.”
The Motion Sickness Challenge
Motion sickness has long been a complaint amongst virtual reality gamers and drone pilots. Traditional stereoscopic headset interfaces use multiple sensor axes (e.g. rotate left/right, pivot up/down, tip left/right) to establish viewer orientation, while requiring handheld controllers (e.g., joysticks, gamepads, keyboards, etc.) for locomotion. Visually “moving” through space while in a sedentary posture creates sensory imbalances that can cause dizziness and nausea in the viewer. Oculus’ former Chief Scientist goes so far as to call hand controllers “sickness generators.” Addressing this problem, MONKEYmedia’s patented, hands-free BodyNav technology creates more intuitive virtual interactions by remapping control axes to accomplish both orientation and locomotion with natural body movement. This provides the organic equilibrium needed to circumvent sensory imbalances.
Without any custom hardware, BodyNav uses distinct sensor axes for independent functions to maintain equilibrium in the body’s proprioceptive system. Viewers simply lean, using either their head or torso, to move themselves through virtual spaces, or to move their drones through remote physical spaces. This lets the proprioceptive receptors, the internal sensors that track the body’s position and movement, engage properly with virtual or remote content, synchronizing the visual and vestibular senses and reducing the factors that induce motion sickness.
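To illustrate the kind of axis remapping described above, the sketch below maps lean angles (pitch and roll from a headset or phone IMU) to locomotion velocity, while yaw is reserved for view orientation only. All names, thresholds, and the linear mapping itself are illustrative assumptions for this example, not MONKEYmedia’s actual implementation.

```python
import math

# Hypothetical sketch of body-based navigation: lean drives locomotion,
# yaw drives heading only. Constants below are assumed, not from BodyNav.
DEAD_ZONE_DEG = 3.0   # ignore small postural sway
MAX_LEAN_DEG = 20.0   # lean angle that produces full speed
MAX_SPEED = 1.5       # metres per second at full lean


def lean_to_velocity(pitch_deg: float, roll_deg: float) -> tuple[float, float]:
    """Map forward/back lean (pitch) and side lean (roll) to a 2-D velocity."""
    def axis(angle: float) -> float:
        magnitude = abs(angle)
        if magnitude < DEAD_ZONE_DEG:
            return 0.0  # inside the dead zone: stand still
        # scale linearly between the dead zone and the maximum lean
        scaled = min(magnitude - DEAD_ZONE_DEG, MAX_LEAN_DEG - DEAD_ZONE_DEG)
        scaled /= (MAX_LEAN_DEG - DEAD_ZONE_DEG)
        return math.copysign(scaled * MAX_SPEED, angle)

    return axis(pitch_deg), axis(roll_deg)  # (forward, strafe) in m/s


def update_pose(x, y, heading_deg, pitch_deg, roll_deg, dt):
    """Advance position: lean controls locomotion, yaw controls heading only."""
    forward, strafe = lean_to_velocity(pitch_deg, roll_deg)
    h = math.radians(heading_deg)
    x += (forward * math.cos(h) - strafe * math.sin(h)) * dt
    y += (forward * math.sin(h) + strafe * math.cos(h)) * dt
    return x, y
```

The dead zone is the key comfort detail in this sketch: small, involuntary sway produces no motion, so the viewer only travels when they deliberately lean, keeping visual motion aligned with felt body movement.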
“MONKEYmedia has been on the forefront of interface design and invention since the first wave of VR innovation in the early ’90s, and we’ve learned that motion sickness in VR has more to do with human interaction than raw hardware capabilities,” said Eric Bear, co-founder of MONKEYmedia and co-inventor of BodyNav. “The launch of BodyNav comes after more than 20 years of experimental research and development. The resulting technology creates a sense of agency in viewers that fosters deeper connections with content, characters and 3D data. We’re excited to open doors for developers to provide more enjoyable and inspiring experiences to consumers in a variety of contexts.”
BodyNav can be readily adapted to modernize a user experience with just a few lines of code. For developers like Garriott, it enhances first-person gaming by freeing the hands from managing avatar movement to focus on other tasks. It will also amplify multi-camera performances and sporting events, control of remote vehicles, video conferencing and telepresence applications, street-view maps, augmented reality, architectural simulations, and 3D user interfaces for browsing data models, documents and images – all while keeping users comfortable, engaged and entertained. For more information about how to incorporate BodyNav technology into a VR or drone piloting user experience, please contact email@example.com.