“Genuinely excited.” Maybe those are the words I was looking for. But wait, I should not get ahead of myself. When we built “In Shadows” in 2014, it was a concept for an augmented-reality, iBeacon-enabled game of tag. The world’s first! Revolutionary at the time. Maybe too revolutionary. Despite runs in Singapore and Tokyo, it failed spectacularly.
The app failed for many reasons, and we might have rebooted it had we not built it on the Metaio SDK. Metaio, the German company behind it, was acquired by Apple shortly after our launch. The SDK then disappeared, and I was left in the dark about when Apple would finish integrating it into its ecosystem. With the upcoming release of iOS 11, that time seems to have come. And the promise of what ARKit might be able to do has me genuinely excited.
Of course, there was the unreal Wingnut AR demonstration. But that is not all of it.
Finally it makes sense. With the impending launch of iOS 11, literally overnight millions of iDevices will evolve into the world’s largest augmented reality platform. Boom! And finally I understand the vision behind Apple installing dual cameras on iDevices since the iPhone 7 Plus; even though I still believe Apple dropped the ball on the headphone jack.
Dual cameras enable better zoom, but also better depth sensing. If you have two different viewpoints and you know the distance between them, you can triangulate the distance to a given object point. That means that for every pixel the camera captures, the phone can calculate a depth map of what it sees. Our brains work in a similar way, allowing us to perceive the world in 3D; the effect is called stereopsis. The benefit of this technology is that software can work out which objects are in the foreground and which are in the background, that is, the relative positions of objects to each other. One example of what can then be done is blurring the background, an effect known as bokeh.
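As a rough sketch of that triangulation (plain Python, not Apple’s actual pipeline; the focal length and baseline numbers are made up for illustration), the depth of a point seen by two parallel cameras falls out of its disparity, the horizontal shift of the point between the two images:

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Triangulate depth for one pixel from a rectified stereo pair.

    focal_length_px: camera focal length in pixels
    baseline_m:      distance between the two camera viewpoints in metres
    disparity_px:    horizontal shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("point must be visible in both views")
    # Similar triangles: depth / baseline = focal_length / disparity
    return focal_length_px * baseline_m / disparity_px


# A point that shifts 20 px between two cameras 10 mm apart,
# seen through a 700 px focal length, sits 35 cm away:
print(depth_from_disparity(700, 0.01, 20))  # 0.35
```

Note the inverse relation: distant points barely shift between the two lenses, which is why a short baseline like a phone’s limits depth precision at range.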
Irrespective of the visual aesthetic, this technology gives ARKit the depth information it needs to blend digital content with real-world objects.
World tracking aims to create the illusion that the virtual world is part of the real one. That includes correct shadows, changing scale and perspective, and hit-testing digital props against flat surfaces. World tracking does not try to build a full representation of the world; instead it works through a technique called visual-inertial odometry (VIO). VIO pins virtual objects to feature points and tracks those points across frames in the video signal. ARKit estimates the relations between these points through three-dimensional projective geometry.
The steps here are usually: detect distinctive feature points in each camera frame, track them from frame to frame, fuse that visual motion with readings from the accelerometer and gyroscope, and estimate the camera’s pose from the combined data. Clearly, the better these estimates are, the more accurately ARKit is able to track points in the environment through the iDevice’s camera and motion sensors.
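To make the geometry behind VIO concrete, here is a minimal sketch (plain Python with NumPy; the focal length and point positions are invented for illustration) of how fixed world points project into two frames taken from slightly different camera positions. The resulting displacement pattern is what VIO inverts to recover motion and depth:

```python
import numpy as np


def project(points_3d, camera_position, focal_length_px=700.0):
    """Pinhole projection of world points into pixel coordinates
    for a camera at camera_position looking down the +z axis."""
    relative = points_3d - camera_position       # points in the camera frame
    depths = relative[:, 2]
    return focal_length_px * relative[:, :2] / depths[:, None]


# Two feature points in the world (metres), one nearer than the other.
points = np.array([[0.0, 0.0, 2.0],
                   [0.5, 0.0, 4.0]])

# The same points seen from two camera positions 10 cm apart.
frame_a = project(points, np.array([0.0, 0.0, 0.0]))
frame_b = project(points, np.array([0.1, 0.0, 0.0]))

# Nearby points shift more between frames than distant ones; from the
# shifts plus the sensed camera motion, depth and pose can be estimated.
shift = np.linalg.norm(frame_a - frame_b, axis=1)
print(shift)  # [35.  17.5] -- the 2 m point moves twice as far as the 4 m one
```

Real VIO also has to estimate the camera motion itself rather than being handed it, which is where the fused accelerometer and gyroscope readings come in.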
Existing technologies like hand-gesture recognition, accelerometers, Bluetooth LE, and of course GPS, combined with ARKit, provide a treasure chest of opportunities for us app developers.
Surely we all know Apple, and despite the continual bad press, they have not arrived here by accident. Besides the aforementioned Metaio, Apple acquired LinX, a company that had developed multi-aperture mobile camera modules enabling effects like background focus blur, parallax images, and 3D picture capture. Remember bokeh? In addition, in 2013 Apple acquired PrimeSense, the company that licensed its 3D-sensing technology to Microsoft for the Kinect. Go figure.
I truly believe we are looking at the first augmented reality platform with millions of users, making apps like these viable again. Is that the end for Magic Leap? Time will tell. We at tenqyu, however, will revisit our AR apps and will surely look into relaunching some of our previous concepts with ARKit enabled. Great times ahead.
Btw, here is the trailer from back in the day for your enjoyment.