Apple’s AirPods Will Usher in Augmented Reality

Written by rob.leclerc | Published 2016/09/11
Tech Story Tags: apple | tech | airpods | augmented-reality | ubiquitous-computing


When Apple announced their new AirPods, most people derided them as expensive wireless headphones. They're not. The AirPods are Apple's newest wearable device; you just don't know it yet.

At the heart of the new AirPods is a pair of W1 processors that interact with optical sensors and accelerometers, control microphone input and speaker output, and wirelessly link to core computing devices like your Apple Watch or iPhone. These smart chips make the AirPods more than a dumb pair of wireless headphones; they naturally extend the computing experience and capabilities of your iPhone or Apple Watch in ways that wouldn't be possible with either of those devices alone.

The first generation of AirPods is already pretty smart. Unlike clunky Bluetooth systems, they pair seamlessly with the iPhone, and they use dual optical sensors and accelerometers to know when they're docked in the ear: take one out and it turns itself off while the other stays active. Similarly, the chip takes advantage of the accelerometers to provide tap-based input for prompting Siri.
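To make the tap input concrete, here is a minimal sketch of how a tap might be detected from raw accelerometer samples. The sample rate, spike threshold, and debounce window are illustrative assumptions, not Apple's actual parameters or APIs.

```swift
import Foundation

/// Toy tap detector over accelerometer magnitudes (in g).
/// Assumed values: ~100 Hz sampling, a 2.5 g spike threshold, and a
/// ~200 ms debounce window. Illustrative only, not Apple's parameters.
struct TapDetector {
    var threshold = 2.5          // spike magnitude that counts as a tap
    var debounceSamples = 20     // ignore ringing for ~200 ms after a tap
    var cooldown = 0

    mutating func process(_ sample: Double) -> Bool {
        if cooldown > 0 { cooldown -= 1; return false }
        if abs(sample) > threshold {
            cooldown = debounceSamples
            return true
        }
        return false
    }
}

// Usage: feed magnitudes as they arrive and prompt Siri on a detected tap.
var detector = TapDetector()
for sample in [1.0, 1.1, 3.2, 2.9, 1.0] where detector.process(sample) {
    print("Tap detected: prompt Siri")
}
```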

While this may still be underwhelming for some of you, remember that the first iPhone doesn't look all that magical by today's standards. So let's think about how Apple might build on this, starting with one of the biggest advancements, which I'd expect to see within the next generation or two.

AirPods Are the Path to Augmented Reality

By most accounts Google Glass was a disaster, and one of the major problems was that it was difficult to be unobtrusive with Google Glass on your face. And since everyone knew that Glass might be recording what the wearer was looking at, its presence made people uncomfortable, enough so that the term Glasshole entered our lexicon and more than a couple of early adopters got beatdowns.

But AirPods augmented with cameras won't have this problem, because people will already be accustomed to seeing these devices in people's ears. So in a future generation, Apple will add small wide-angle cameras to the AirPods, all within the same form factor. What this offers is a less obtrusive and far more socially acceptable path toward a first-person-perspective video input device. Pair them with an iPhone or Apple Watch (or both), and suddenly that supercomputer in your pocket is computing and anticipating the world from your perspective. And this would open up a whole new world of applications.

Outfitted with cameras, the AirPods could look for hand gestures to control volume, or you could write on your hand just like you can with the Apple Watch.

Cameras would also give us a leap forward toward ubiquitous augmented reality. Rather than using your arm to wave around the camera on your phone, stereoscopic AirPods would see the world in 3D from your perspective, allowing you to hold your device more subtly and glance down to see your augmented view.

Stereoscopic cameras paired with an on-board digital compass could also greatly improve the maps experience by bringing sub-foot positioning. Walk into your house and the stereoscopic cameras can use distance information to calculate your exact position.
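To give a sense of the geometry behind that distance information: a stereo pair recovers depth from disparity, with distance proportional to the camera baseline and focal length and inversely proportional to how far a point shifts between the two views. The numbers below (a roughly ear-to-ear 15 cm baseline and a 500-pixel focal length) are assumptions purely for illustration, not the specs of any real hardware.

```swift
import Foundation

/// Depth from stereo disparity: Z = f * B / d, where f is the focal length
/// in pixels, B is the baseline between the two cameras in meters, and d is
/// the disparity in pixels between the left and right views.
func depth(focalLengthPx f: Double, baselineM b: Double, disparityPx d: Double) -> Double? {
    guard d > 0 else { return nil }   // zero disparity: the point is effectively at infinity
    return f * b / d
}

// A feature that shifts 20 pixels between the two views, with a ~15 cm baseline:
if let z = depth(focalLengthPx: 500, baselineM: 0.15, disparityPx: 20) {
    print(String(format: "Estimated distance: %.2f m", z))   // prints 3.75 m
}
```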

A stereoscopic view could also lead to future iPhones with 3D displays, or maybe by that time Apple will have contact lenses that give you a heads-up display for the full augmented reality experience.

AirPods Get Smarter with Siri

Right now you still need to activate Siri by tapping one of the AirPods. This is a lot better than pulling your phone out of your pocket, double-tapping the home button, bringing the phone close to your face, and issuing a command to Siri. However, double-tapping the earbud is still one more step.

In the future, you won't need physical interaction cues. With AirPods, Siri will take advantage of a stereo pair of microphones, positioned in a standardized way on the speaker, to better recognize the activation word and separate it from background noise.
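One simple way a fixed pair of microphones helps with that separation is delay-and-sum beamforming: because the spacing between the two AirPods is roughly the same on every head, audio arriving from the direction of the wearer's mouth can be time-aligned and summed, reinforcing speech while partially cancelling off-axis noise. The sketch below is a bare-bones illustration of that idea with an assumed sample lag, not Apple's actual voice pipeline.

```swift
import Foundation

/// Delay-and-sum beamforming for two microphones: advance the channel that
/// hears the target later by the known lag, then average the two channels.
/// On-axis speech adds coherently; off-axis noise tends to average out.
func delayAndSum(left: [Double], right: [Double], rightLagSamples: Int) -> [Double] {
    let n = min(left.count, right.count)
    return (0..<n).map { i in
        let j = i + rightLagSamples
        let alignedRight = j < n ? right[j] : 0.0
        return (left[i] + alignedRight) / 2.0
    }
}

// Usage: here the right mic hears the voice two samples after the left mic
// (an assumed geometry), so aligning by that lag lines up the speech peaks.
let left:  [Double] = [0.0, 0.1, 0.4, 0.9, 0.4, 0.1, 0.0, 0.0]
let right: [Double] = [0.0, 0.0, 0.0, 0.1, 0.4, 0.9, 0.4, 0.1]
print(delayAndSum(left: left, right: right, rightLagSamples: 2))
```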

Imagine walking into an elevator and, with proximity sensors (or on-board cameras) knowing where you are, simply saying: "12th floor." Maybe you're rushing to your next meeting and you ask Siri to call you an Uber to meet you outside. Or how about being able to dictate notes and reminders while you're working on your computer, and schedule a meeting with a colleague without switching applications?

Or maybe you start to tap into a growing army of business-focused chatbots to perform specialist functions that go way beyond Siri. For instance, a clerk at a grocery store could issue a command like "Price check this Chobani yogurt" without having to carry a more cumbersome device.

Of course, these kinds of functions become greatly improved when the device has access to your visual perspective, because that gives your personal computing devices more information about your situation and environment to make smarter decisions. For instance, Siri could see the email you have open and schedule a meeting without tedious explanation. Or it could direct you to your Uber among the hundreds of cars outside the airport.

New technologies often seem like toys, which is why so many people make the mistake of dismissing them. But as most programmers know, you often need to start with a simple "Hello World" program and then build up from there. Apple has Moore's law pushing this technology forward, making it better, faster, and smaller, and as of right now, Apple has a huge jump on the competition.

Disclosure: I own shares of Apple.


