Even if you are a true VR evangelist with an Oculus or Vive logo tattooed on your shoulder, you have to admit current-gen headsets will never reach mass adoption. Because for now, Virtual Reality means being tethered to an expensive PC inside the prison cell of a Guardian system.
That's just not the virtual freedom we were all dreaming about.
However, there is one crucial thing we forgot. It's called "progress." And it smashes everything in its way.
In less than two years of the consumer VR era, headsets have become completely wireless, the field of view has doubled, and the resolution has tripled.
In fact, VR technologies are evolving much faster than consumer versions of headsets can be released.
In this series of weekly articles, called "Behold The Next Generation VR Technology," I will guide you through the world of the latest and most promising tech that will finally make VR the next computing platform.
Though mostly at an early stage of development, all of this tech will find its way into consumer headsets over the next 10 years. Some sooner, some later.
I divided this series into parts — one part for every vital aspect of VR technology. This one is about:
What a beautiful feeling it must be to kick a virtual enemy. My number one dream every time I play SuperHOT — their glassy red little butts are begging to be kicked.
Unfortunately, full-body tracking is not available to regular players, simply because no such hardware ships with current-gen VR HMDs.
However, game developers have already learned how to work around this limitation. In most VR experiences, they just cut off the legs and torso of your avatar (not literally, of course). No legs = no problem.
Others are more generous. They use IK — Inverse Kinematics. It's an animation technique that works out how every part of the virtual body must move to match the parts that are actually tracked. For example, if I move both of my virtual hands to the left, my virtual torso should follow.
Current-gen VR systems basically track only three points: the headset itself and a pair of controllers. That gives us head and arm motion — not much to reconstruct the whole body's movements from. What IK does is try to mimic the pose of the entire body using only these three points' motion data.
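To make this less abstract, here's a minimal sketch of how an IK solver might position a single arm. It's the classic two-bone, law-of-cosines approach in 2D — not any particular engine's implementation, and the function name and lengths are made up for illustration:

```python
import math

def two_bone_ik(target_x, target_y, upper_len, lower_len):
    """Find shoulder and elbow angles so the wrist reaches the target.

    The shoulder sits at the origin; angles are in radians, with
    elbow = 0 meaning a fully straight arm. The target is clamped to
    the arm's reachable range to avoid math domain errors.
    """
    dist = math.hypot(target_x, target_y)
    dist = min(dist, upper_len + lower_len)
    dist = max(dist, abs(upper_len - lower_len), 1e-9)
    # Law of cosines gives the elbow bend from the triangle formed by
    # the upper arm, the forearm, and the shoulder-to-target line.
    cos_elbow = (upper_len**2 + lower_len**2 - dist**2) / (2 * upper_len * lower_len)
    elbow = math.pi - math.acos(cos_elbow)
    # The shoulder aims at the target, corrected for the bent elbow.
    cos_offset = (upper_len**2 + dist**2 - lower_len**2) / (2 * upper_len * dist)
    shoulder = math.atan2(target_y, target_x) - math.acos(cos_offset)
    return shoulder, elbow
```

A full-body IK system chains many of these solves together and adds constraints (elbows don't bend backwards), but the core trigonometry stays about this simple.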
While the results of IK are quite impressive, it faces another problem: your brain is smart (oh, yes!). It knows very precisely where your body parts are.
To confirm that, close your eyes and try to touch your nose with a forefinger. Did it work? This sense is called proprioception, and now you know what I'm talking about.
No matter what method developers use, this ability of the brain leads us to only one conclusion: even the most advanced workaround technology won’t be precise enough to fool it.
Does it mean there won’t be any immersive butt kicking games anytime soon?
Hell no! A couple of great solutions are already being developed.
The first is the HTC Vive Tracker: a puck-shaped sensor you can strap to your feet and hips. Add a few of those to the HTC Vive itself and two Vive controllers, and you get a smooth multi-point tracking system. It generates much more motion data and can finally track legs and torso.
Although it's great for low-budget motion capture and you can already buy it, it has some flaws as a product. You see, spending a shit ton of time putting on sensors every time you want to play some VR butt-kicking game will frustrate you very quickly. Or not. Time will tell.
Anyway, there’s an alternative to this solution.
I've always believed that the best technology is invisible technology. The kind that solves the problem without requiring any action from you.
Meet Vico VR, a standalone body-tracking sensor. Like Kinect (or the iPhone X Face ID sensor), it projects thousands of light dots in the infrared spectrum, invisible to the human eye. This carpet of dots forms a pattern the sensor can see. Depending on how large or small areas of the pattern appear, the sensor can tell how far away the surface they landed on is. That's how Vico VR determines the depth of the scene.
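Vico VR's exact internals aren't public, but structured-light sensors like the original Kinect turn a dot pattern into depth by simple triangulation: the projector and camera sit a known distance apart, so how far a dot shifts in the camera image tells you how far away it landed. A hedged sketch, with purely illustrative numbers (not real Vico VR specs):

```python
def depth_from_disparity(disparity_px, focal_len_px, baseline_m):
    """Triangulate distance to the surface a projected dot landed on.

    disparity_px: how far the dot shifted in the image versus a
        reference pattern captured at a known distance (pixels).
    focal_len_px: camera focal length expressed in pixels.
    baseline_m: projector-to-camera distance in meters.
    """
    return focal_len_px * baseline_m / disparity_px

# Illustrative values: 580 px focal length, 7.5 cm baseline,
# and a dot shifted by 10 px.
print(depth_from_disparity(10.0, 580.0, 0.075))  # → 4.35 (meters)
```

Run this per dot across the whole pattern and you get a depth map of the scene, thirty-odd times a second.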
But how does it recognize the player's body parts? Here's where machine learning comes in. I'm not sure exactly what algorithm Vico VR uses, but usually developers train a neural network to separate parts of the body from other objects (and from each other) in the depth data coming from the sensor.
Once the algorithm recognizes the player's body parts, it can build a skeleton — an essential component of every character's 3D model. It's the skeleton that moves the avatar's virtual body.
The final step is transferring the motion of the generated skeleton to the avatar's skeleton. When the player raises an arm, that movement is applied to the avatar's skeleton, and your virtual self raises its arm too. That's how it works. Simplified, but you get the idea.
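In code, that retargeting step can be as simple as copying rotations across a joint-name mapping. This is a toy sketch, not any real SDK's API; the joint names and the pose format (per-joint Euler angle triples) are assumptions:

```python
# Hypothetical mapping from the sensor's joint names to avatar bone names.
TRACKED_TO_AVATAR = {
    "head": "Head",
    "left_elbow": "LeftForeArm",
    "right_knee": "RightLeg",
}

def retarget(tracked_pose, joint_map=TRACKED_TO_AVATAR):
    """Copy each tracked joint's rotation onto the mapped avatar bone.

    tracked_pose maps joint names to rotations (here, Euler angle
    triples). Joints the sensor lost this frame simply don't appear,
    so the avatar keeps its previous pose for those bones.
    """
    return {joint_map[j]: rot for j, rot in tracked_pose.items() if j in joint_map}

# The player raises an arm: only that bone gets a new rotation.
pose = retarget({"left_elbow": (0.0, 0.0, 90.0)})
```

Real engines do this with quaternions and bone-space offsets rather than raw copies, but the shape of the problem — map sensor joints onto avatar bones, every frame — is the same.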
Cool, right? You can seamlessly transfer all your kicking-butt combos and smooth dancing moves straight to your avatar. No cords, no suits needed. Just one sensor.
However, despite sounding like a dream come true, this branch of motion-tracking technology is still in its early days. It suffers from noise and delays. The first company to show a sensor precise and fast enough to transfer your movements at a believable level will win the market. Or at least will be bought by the vacuum-cleaner-like Oculus.
But that's not all. There's another frictionless technology: radar. Yes, a millimeter-wave radar chip that can detect even the most elusive movements. It's made by Google together with Infineon and is called Project Soli.
A fingernail-sized chip emits electromagnetic (radio-frequency) waves in a broad beam. Objects around it reflect some portion of the energy back. This data is then processed with algorithms that work out shape, speed, rotation, distance, and even material properties.
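Google hasn't published Soli's full pipeline, but Soli has been described as an FMCW (frequency-modulated continuous-wave) radar, and in FMCW the distance measurement itself is one line of math: the chip mixes the transmitted frequency sweep ("chirp") with its echo, and the resulting "beat" frequency is proportional to range. The chirp parameters below are illustrative, not Soli's real specs:

```python
def beat_frequency_to_range(beat_hz, bandwidth_hz, chirp_time_s):
    """Convert an FMCW radar beat frequency into target distance.

    During one chirp the transmit frequency sweeps bandwidth_hz in
    chirp_time_s. An echo delayed by the round trip produces a beat
    frequency f_b = 2 * R * B / (c * T), so R = c * f_b * T / (2 * B).
    """
    c = 3.0e8  # speed of light, m/s
    return c * beat_hz * chirp_time_s / (2 * bandwidth_hz)

# Illustrative chirp: 7 GHz of bandwidth swept in 1 ms. A hand at
# half a meter then shows up as a ~23.3 kHz beat tone.
range_m = beat_frequency_to_range(23333.3, 7e9, 1e-3)
```

In practice the chip runs an FFT over the mixer output to find those beat tones, and further processing on top of the range data recovers velocity and the fine gesture signatures.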
They also claim this tech can recognize gestures at frame rates up to 10,000 frames per second. Well, not bad.
You won't find an enemy submarine with this tech, but it can be a great alternative to the previous two solutions, although it's still very crude. For now, it works only with finger motions, in line with Project Soli's original purpose. BUT in his paper, Soli's creator — Ivan Poupyrev — says that full-body tracking with RF waves "is not impossible." And there are experiments at MIT showing radar can detect human postures even through walls. Once again, all you have to do is deal with the noise.
So, we've got three solutions: wearable sensors, a Kinect-like standalone sensor, and a radar chip. Which one do you like best? Which technology will finally bring full-body tracking to the market?
I believe feeling the presence of your own body in a virtual world will change the experience forever. Being able to interact with the environment (or with each other?) using hands, legs, and other virtual body parts will take immersive experiences to a whole new level.
Unquestionably, it's only a matter of time until our avatars talk like us and move like us. But will they look like us? We'll see in the next episode.