Mayo Clinic May Have Just Solved One Of Virtual Reality’s Biggest Problems
There’s no denying the power virtual reality headsets like the Oculus Rift and HTC Vive possess to make you feel physically present in a non-physical world, but there’s one drawback that degrades VR’s appeal and presents a real obstacle to VR software developers: motion sickness. Apparently the Mayo Clinic has been hunkered down for 11 years trying to solve this very problem, and today they’re announcing the commercial availability of the technology they’ve been developing. It’s a bit difficult to wrap your head around, but it’s incredibly exciting.
First, let’s outline why so many of us can suffer motion sickness in VR. While it’s not as common as it was a few years ago with early Oculus Rift dev kits, it definitely restricts the kind of games and experiences being developed right now. VR sickness (sometimes called simulator sickness) results from a visual and vestibular mismatch. I’ve made the mistake of associating it with a conflict between what your eye is seeing and what your brain believes, but that’s not entirely accurate according to Mayo Clinic scientists.
A simple illustration of how Mayo Clinic’s GVS technology works | Image Credit: Mayo Clinic
You hear a lot about VR needing to deliver a solid 90fps framerate. Some neat advances have been made on the software side of VR, such as Oculus’ Asynchronous Timewarp, which is a fancy way of saying the runtime can fill in a frame by re-projecting the last rendered image to the latest head position when a game can’t maintain that desired 90fps sweet spot. This helps reduce judder, which contributes to motion sickness.
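To make that idea concrete, here’s a minimal sketch of a timewarp-style compositor decision. The frame representation, function names, and single-axis pose math are toy placeholders of my own, not Oculus’ actual implementation; it only illustrates the "warp the old frame if the new one is late" logic.

```python
# Toy illustration of a timewarp-style compositor decision (not Oculus' code):
# if the game delivers a fresh frame in time, show it; if it misses the ~11 ms
# budget at 90 Hz, re-project the previous frame to the newest head orientation
# so the view still tracks your head, just with slightly stale scene content.

REFRESH_HZ = 90
FRAME_BUDGET_MS = 1000.0 / REFRESH_HZ  # about 11.1 ms per displayed frame


def render_scene(head_yaw_deg):
    """Stand-in for the game's expensive render pass."""
    return {"image": "freshly rendered scene", "rendered_at_yaw": head_yaw_deg}


def reproject(last_frame, new_head_yaw_deg):
    """Cheaply shift the old image to match where the head is pointing now."""
    shift = new_head_yaw_deg - last_frame["rendered_at_yaw"]
    return {"image": f"{last_frame['image']} warped {shift:+.1f} deg",
            "rendered_at_yaw": new_head_yaw_deg}


def compositor_tick(game_render_ms, last_frame, head_yaw_deg):
    if game_render_ms <= FRAME_BUDGET_MS:
        return render_scene(head_yaw_deg)       # frame arrived on time
    return reproject(last_frame, head_yaw_deg)  # late: warp the old frame


# Example: the game takes 14 ms on one frame, so the compositor warps instead.
frame = render_scene(head_yaw_deg=0.0)
frame = compositor_tick(game_render_ms=14.0, last_frame=frame, head_yaw_deg=2.5)
print(frame["image"])  # -> "freshly rendered scene warped +2.5 deg"
```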
But the motion sickness is actually happening because our vestibular system, the complicated sensory system in our inner ear that provides balance and spatial orientation, is out of whack. When we walk a character through a room in a VR game without walking ourselves, a mismatch happens because we don’t feel that motion represented in 3D space. Our brain instantly notices the discrepancy between what we’re seeing and what we’re feeling. It’s why you see a lot of early VR games using “blinking” to teleport a player from one spot to the next instead of physically walking them there. It’s why, right now, we won’t see a traditional shooter like Call of Duty or Destiny ported to VR.
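That’s all “blinking” really amounts to in code. The sketch below is a generic toy, assuming only a made-up camera object rather than any particular engine or SDK, and it shows why teleporting sidesteps the mismatch: the camera never glides, so the eyes never report motion the inner ear can’t confirm.

```python
# Toy comparison of smooth locomotion vs. "blink" teleportation. The Camera
# class and update style are invented for illustration; no real engine or SDK
# is being quoted here.

class Camera:
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.fade = 0.0  # 0 = fully visible, 1 = screen faded to black


def smooth_walk(camera, target, dt, speed=2.0):
    """Traditional locomotion: the camera glides toward the target every
    frame. The sustained optical flow signals motion the vestibular system
    never feels, and that conflict is what makes people queasy."""
    x, y, z = camera.position
    tx, ty, tz = target
    t = min(1.0, speed * dt)
    camera.position = (x + (tx - x) * t, y + (ty - y) * t, z + (tz - z) * t)


def blink_teleport(camera, target):
    """Blink locomotion: fade out, jump instantly, fade back in. There is no
    sustained visual motion, so there is nothing for the inner ear to
    contradict."""
    camera.fade = 1.0         # hide the world for a frame or two
    camera.position = target  # move the player while nothing is visible
    camera.fade = 0.0         # reveal the new viewpoint
```

Blinking buys comfort by throwing away the feeling of actually traveling, and that’s exactly the trade-off Mayo Clinic’s approach is trying to eliminate.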
The Mayo Clinic has patented technology it calls Galvanic Vestibular Stimulation (GVS), which synchronizes a person’s inner ear to what they’re viewing. Using that Call of Duty scenario, imagine feeling the sensation of sprinting toward cover, of jumping up to a rooftop, or even something as extreme as jumping off that rooftop. What about a movie like Avatar, where the viewer can actually feel the sensation of flight? Imagine what this could do for a VR roller coaster simulator like NoLimits 2, just as one existing example.
Stepanek goes on to mention that GVS can alleviate balance disorders like vertigo and improve balance overall.
Los Angeles-based startup vMocion has secured the exclusive global license to use Mayo Clinic’s GVS technology in commercial products, and that’s where things get real. The platform they’ve developed can be integrated into existing operating systems and devices like VR or AR glasses, smartphones, and TVs. A representative for the company tells me that vMocion’s platform can use any existing game to create that sense of motion. Game developers technically wouldn’t need to add any additional code to their games, provided the platform they’re developing for supports vMocion’s technology. It would automatically sync movement seen onscreen to four stimulation points, thus delivering that believable sensation of movement to the inner ear.
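vMocion hasn’t published an SDK or explained how that syncing actually works, so treat the following as a purely hypothetical sketch of the general idea: watch how the in-game camera accelerates each frame and translate that into small signals at four stimulation points. Every name here (GvsDriver, the electrode labels, the gain values) is invented for illustration and reflects nothing about the company’s real platform.

```python
# Hypothetical sketch only: maps on-screen camera acceleration to four
# stimulation points. Class names, electrode placement, and numbers are all
# made up for illustration; this is not vMocion's or Mayo Clinic's API.

from dataclasses import dataclass


@dataclass
class CameraState:
    forward_vel: float  # m/s along the in-game view direction
    lateral_vel: float  # m/s to the right of the view direction


class GvsDriver:
    """Hypothetical stand-in for hardware driving four stimulation points
    (say, behind each ear plus the front and back of the head)."""

    def apply(self, left, right, front, back):
        # Real hardware would set tiny electrode currents; here we just print.
        print(f"L={left:+.2f}  R={right:+.2f}  F={front:+.2f}  B={back:+.2f}")


def sync_motion(prev, cur, dt, driver, gain=0.1, max_intensity=1.0):
    """Map this frame's on-screen acceleration onto the four points so the
    inner ear is 'told' about the motion the eyes are seeing."""
    forward_acc = (cur.forward_vel - prev.forward_vel) / dt
    lateral_acc = (cur.lateral_vel - prev.lateral_vel) / dt

    def clamp(value):
        return max(-max_intensity, min(max_intensity, value * gain))

    f, s = clamp(forward_acc), clamp(lateral_acc)
    # Forward/backward acceleration biases the front/back pair; sideways
    # acceleration biases the left/right pair.
    driver.apply(left=-s, right=s, front=f, back=-f)


# Example: the player's avatar lurches forward and slightly left in one frame.
sync_motion(CameraState(0.0, 0.0), CameraState(3.0, -0.5), dt=1 / 90,
            driver=GvsDriver())
```

However the real system is wired, the pitch is the same: the inner ear gets told the same story the eyes are being shown.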