Authors: Paulius Jurcys and Whitney DeLuca
Bloomberg has just reported that Apple is working on a new depth-sensing technology that could come to iPhones as soon as 2019. This time, the depth-sensing hardware would be installed on the rear of the phone, enabling even more striking augmented and virtual reality experiences.
In particular, rear-facing depth sensors would let each of us capture the surrounding space in three dimensions and open up new possibilities for communicating in virtual reality. Just imagine: you pull your phone out of your pocket, spontaneously capture the space around you, and interact with it.
Rumors about Apple’s plans to implement this new depth-sensing technology have circulated among industry specialists for a while, but it was unclear when such hardware would actually appear on iPhones. In early 2017 there were hopes that the next iPhones would include it. As time went by, however, Tim Cook and his company revealed their intention to focus on augmented reality instead and to hold off on rear depth-sensing hardware until 2019.
It should be noted, however, that the new iPhone X already ships with truly spectacular face recognition technology. Unfortunately, that hardware sits on the front of the phone and performs essentially one function: unlocking the device. The TrueDepth sensor system relies on a structured-light technique: a laser projects a pattern of 30,000 dots onto the user’s face, and the system measures the distortion of that pattern to generate an accurate 3-D image of the face for authentication.
Apple’s TrueDepth technology to unlock iPhone X
“We’re already seeing things that will transform the way you work, play, connect and learn.” (Tim Cook)
The TrueDepth system used in the iPhone X contains four components: an ordinary camera, an infrared camera, a dot projector, and a flood illuminator. Had this technology also been implemented on the rear of the iPhone X, we would already be able to enjoy true virtual reality experiences. Yet recent reports show that Apple has been struggling to manufacture the iPhone X on schedule, and the delays appear to be caused by the precision required to assemble the TrueDepth sensors.
That said, we should note that (besides unlocking the device) TrueDepth technology is currently used only for such novel purposes as controlling the facial expressions of animated emoji. So VR/AR enthusiasts and companies working in this field will have to arm themselves with patience until 2019, when similar hardware is expected to appear on the rear of mobile devices.
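For developers who want to experiment in the meantime, the face-tracking data behind those emoji is already accessible through ARKit. Below is a minimal sketch, assuming iOS 11 and a TrueDepth-equipped device; ARFaceTrackingConfiguration and ARFaceAnchor are standard ARKit names, while the delegate class itself is purely illustrative.

```swift
import ARKit

// Minimal sketch: reading TrueDepth face-tracking data with ARKit.
// Requires iOS 11+ and a device with the TrueDepth camera (iPhone X).
class FaceTrackingDelegate: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is supported only on TrueDepth-equipped devices.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // blendShapes maps facial features to 0...1 activation values;
            // coefficients like these are what drive animated emoji.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("jawOpen: \(jawOpen), smileLeft: \(smile)")
        }
    }
}
```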
Apple is not the first hi-tech company to explore the potential of 3-D sensing technologies. In fact, Google had been leading the 3-D sensor race with Project Tango since 2014. Google then put the brakes on Tango after Apple announced ARKit, which would bring AR-lite experiences to hundreds of millions of iPhones in one fell swoop, and followed suit with ARCore in an effort to harness the massive Android install base. Now it is Apple that is following suit by bringing rear-facing sensors to its devices.
Tango technology combines a set of sensors with computer vision software that enables mobile devices to understand space and motion much as humans do. Tango has three capabilities: motion tracking, depth perception, and area learning. Tango-powered devices can therefore track users’ motion as they move through an area, detect how far away the floor, walls, and other surfaces are, and remember the key visual features of a physical space in three dimensions.
Tango depth-sensing hardware currently ships in two commercially distributed phones: the Lenovo Phab 2 Pro (released in 2016) and the ASUS ZenFone AR (released in July 2017).
There are endless possibilities with depth-sensing technology
Over time these depth sensors will improve and be able to capture longer distances. Currently, the TrueDepth sensors installed in the iPhone X work at distances of two to three feet, whereas the Tango hardware in the ASUS ZenFone AR can capture objects up to 12–15 feet away from the device.
According to Bloomberg, Apple has started discussions with Infineon AG (the same German company that collaborated with Google on Tango) as well as Sony Corp., STMicroelectronics NV, and Panasonic Corp. Apple’s target of releasing mobile devices with depth-sensing capabilities in 2019 seems reasonable given the time required for testing.
Bridging the gap between now and 2019
The news of Apple’s plan to add depth-sensing technology to iPhones in 2019 is a huge incentive for the VR/AR industry. Where software developers were crafting strategies for bridging the gap between now and the mass adoption of 3-D sensor-enabled mobile devices, they now find themselves with sufficient time to get their products to market ahead of that impending mass adoption.
In other words, startups working on VR/AR projects now have much more certainty when raising capital and developing VR/AR apps. According to SuperData estimates, the market for VR could be worth between $30B and $215B in 2020. Other estimates are equally astounding: according to Statista, the combined market for AR and VR will reach $215B in 2021.
These estimates of the VR/AR market size are hard to assess. Generally, they rest on assumptions about economies of scale. So far, only … Lenovo Phab 2 Pro phones have been sold, and even fewer ASUS ZenFones. Companies in the mobile VR industry are now waiting for popular brands such as Apple and Samsung to incorporate depth-sensing technology into their mobile devices, which will take another year or two.
Meanwhile, both Apple and Google have been quite bullish about augmented reality: just a few months ago, in September 2017, the two companies announced the release of ARKit and ARCore, respectively. These are toolkits for developers of augmented reality applications that work on most recent mobile devices (in Apple’s case, from the iPhone 6s onwards).
Current augmented reality apps rely on existing mobile devices. However, those devices can identify only flat surfaces (a table, a bed, a floor), which limits the applicability of AR. For instance, we can place various digital objects (a Monopoly board on the table) and move the digital pieces around. The problem is that users cannot truly interact with those objects: it is not possible to pick up the Monopoly pieces and play with them. Introducing a depth sensor on the back of the phone will make such interactions possible, as the sketch below illustrates.
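To make that limitation concrete, here is a minimal ARKit sketch, assuming iOS 11, of how today’s world tracking reports only flat planes; ARWorldTrackingConfiguration and ARPlaneAnchor are standard ARKit names, while the delegate class itself is illustrative.

```swift
import ARKit

// Minimal sketch: today's ARKit world tracking reports only flat planes.
// Without a rear depth sensor, arbitrary 3-D geometry stays invisible to the app.
class PlaneDetectionDelegate: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // In iOS 11, only horizontal surfaces (tables, floors) are detected.
        configuration.planeDetection = .horizontal
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            // Each detected plane is just a flat rectangle (center + extent):
            // enough to place a Monopoly board, but nothing you can pick up.
            print("Found plane with extent: \(plane.extent)")
        }
    }
}
```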
Kids are always asking their parents for the latest technology they see advertised on YouTube. Once every second kid has a depth-sensing smartphone, we can say there truly is a multi-billion-dollar market for VR. Some expect around 4.5 billion AR-capable mobile devices to be on the market by 2019; as some VR geeks put it, if we lined up every ARKit and ARCore phone end to end, the line might stretch to the moon and back.
We are now at the forefront of a life-changing technology. The technology has existed for years; all that remains is the time needed to refine the hardware and the knowledge of how to apply it. Who knows, perhaps we will be able to “teleport” using VR/AR by 2020?
Thank you for reading this post! If you liked it, click and hold the 👏 button on the left, or leave a comment.
I publish a new story every week. Follow me and you won’t miss my latest insights on innovation, creativity, and the recent trends in Silicon Valley and beyond.