
Apple Quietly Takes a Bite of the AI Cherry

by Adrien Book, June 15th, 2023

Too Long; Didn't Read

Apple’s WWDC 2023 keynote didn’t mention the term “AI” once. The technology was referred to, of course, but always as “machine learning.” This is in complete contrast to what happened recently at other Big Tech events, such as Google I/O.

If you watched Apple’s WWDC keynote in June 2023, you may have noticed that the term “AI” was never uttered; not even once. The technology was referred to, of course, but always as “machine learning”, a more subtle and technically accurate description.

This is in complete contrast to what happened recently at other Big Tech companies’ events, such as Google I/O or Microsoft Build.

Apple, as it often does, took a different route. Instead of highlighting AI as an omnipotent force in its own right, the company pointed only to the features it developed by baking machine learning into its products.

Here’s a list of the machine learning / AI features that Apple unveiled:

Improved Auto-correct🤖

An enhanced auto-correct feature, powered by a transformer language model. This machine learning model improves auto-correction and sentence completion as users type.
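Apple’s on-device model isn’t public, but the core mechanic behind this kind of feature is next-token prediction. Here’s a minimal sketch using an open model as a stand-in (GPT-2 via the Hugging Face transformers library; the model choice and function name are illustrative assumptions, not Apple’s stack):

```python
# Illustrative sketch only: next-word suggestion with an open transformer
# language model (GPT-2). Apple's on-device model and API are not public.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

def suggest_next_words(prefix: str, k: int = 3) -> list[str]:
    """Return the k most likely next words for a typed prefix."""
    inputs = tokenizer(prefix, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0, -1]  # scores for the next token
    top_ids = torch.topk(logits, k).indices.tolist()
    return [tokenizer.decode([i]).strip() for i in top_ids]

print(suggest_next_words("I'll see you at the"))  # prints three candidate words
```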

Personalized Volume Feature for AirPods🔊

This feature uses machine learning to adapt playback volume to environmental conditions and the user’s listening preferences.
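Apple hasn’t detailed how the adaptation works; conceptually, it blends ambient conditions with a learned preference. A toy sketch, where every name and weight is an assumption (the learned part would supply user_offset):

```python
# Toy sketch of adaptive volume (all names and weights are made up):
# blend ambient-noise compensation with a learned user preference offset.
def adapted_volume(base: float, ambient_db: float, user_offset: float) -> float:
    """Return a playback volume clamped to [0, 1]."""
    noise_boost = max(0.0, (ambient_db - 60.0) / 100.0)  # boost above 60 dB
    return min(1.0, max(0.0, base + noise_boost + user_offset))

print(adapted_volume(base=0.5, ambient_db=80.0, user_offset=0.05))  # -> 0.75
```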

Enhanced Smart Stack on watchOS⌚

The upgraded Smart Stack (the feature that lets you pin widgets for quick access) uses machine learning to surface relevant information to users.

Journal App📖

This new app employs on-device machine learning to intelligently curate customized journaling prompts based on contacts, photos, music, workouts, location data, and even podcasts.
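Apple hasn’t said how the curation works; one way to picture it is ranking candidate prompts by how many recent personal signals they touch. A toy sketch, where the signal names and data are invented for illustration:

```python
# Toy sketch: rank candidate journaling prompts by how many of the day's
# signals they touch. Signal names and data are invented for illustration.
from dataclasses import dataclass

@dataclass
class Prompt:
    text: str
    signals: set[str]  # e.g. {"photos", "location"}

todays_signals = {"photos", "workouts", "location"}  # hypothetical day data

def curate(prompts: list[Prompt], k: int = 2) -> list[str]:
    """Return the k prompts with the most overlap with today's signals."""
    ranked = sorted(prompts, key=lambda p: len(p.signals & todays_signals),
                    reverse=True)
    return [p.text for p in ranked[:k]]

candidates = [
    Prompt("Write about today's hike.", {"workouts", "location"}),
    Prompt("Revisit last week's photos.", {"photos"}),
    Prompt("What song is stuck in your head?", {"music"}),
]
print(curate(candidates))  # the two prompts with the most signal overlap
```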

3D Avatars for Video Calls on Vision Pro😃

Apple introduced advanced machine learning techniques for generating 3D avatars for video calls on the newly announced Vision Pro. The feature largely went under the radar due to the hype around the Vision Pro itself.

Transformer-Based Speech Recognition🗣️

Apple announced a new transformer-based speech recognition model (called Dictation) that improves… dictation accuracy using Apple’s “Neural Engine”.
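Apple’s model isn’t available outside its devices, but transformer-based speech recognition is easy to demo with open models. A minimal sketch using Whisper as a stand-in (the model choice and the audio file name are assumptions, not Apple’s stack):

```python
# Illustrative stand-in: transformer-based speech-to-text with an open model
# (Whisper). Apple's Dictation model runs on-device and is not public.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="openai/whisper-tiny")
result = asr("memo.wav")  # "memo.wav" is a hypothetical local audio file
print(result["text"])     # the transcribed text
```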

Apple M2 Ultra Chip🍟

This chip, with a 32-core Neural Engine capable of performing 31.6 trillion operations per second, supports up to 192GB of unified memory. It can train large transformer models, demonstrating a significant leap in AI applications… and helping Apple secure its future through continued control of its hardware stack.
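To put 192GB of unified memory in perspective, here’s a rough back-of-the-envelope calculation (assuming 16-bit weights at 2 bytes per parameter, and ignoring activations, KV cache, and optimizer state):

```python
# Back-of-the-envelope: how many 16-bit model parameters fit in 192GB of
# unified memory? (Ignores activations, KV cache, and optimizer state.)
memory_bytes = 192e9      # 192 GB of unified memory
bytes_per_param = 2       # one fp16/bf16 weight
max_params = memory_bytes / bytes_per_param
print(f"~{max_params / 1e9:.0f}B parameters")  # ~96B parameters
```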

By underplaying the role of AI during its presentation, Apple accomplishes a few things. Firstly, it makes the company look like the adult in the room. They’re saying, “We don’t need to over-sell; the proof will be in the pudding.”

Secondly, they’re ensuring that any future AI failures (and they will come) will not tarnish the brand.

Thirdly, and most importantly, they’re insulating themselves from future regulation by putting some distance between themselves and AI technology.

While Apple’s rivals are building massive models with server farms, supercomputers, and terabytes of data, Apple embeds AI models on its devices. On-device AI bypasses many of the data privacy issues that cloud-based AI faces: when the model can run on the phone itself, Apple needs to collect far less data to operate it.
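The privacy difference is structural, not a policy choice. A conceptual sketch (the endpoint URL, function names, and model object are all made up for illustration):

```python
# Conceptual contrast only (the URL and names are made up, not a real API):
# cloud inference ships the raw input off-device; on-device inference doesn't.
import requests

def cloud_predict(text: str) -> str:
    # The user's raw input leaves the device and lands on a server.
    resp = requests.post("https://api.example.com/predict", json={"text": text})
    return resp.json()["label"]

def on_device_predict(text: str, local_model) -> str:
    # The input never leaves local memory; only model weights were shipped.
    return local_model(text)
```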

It’s like they’re saying, “Hey, if there’s gonna be any irrational hype around here, it’s gotta be about our brand only.”

Good luck out there.
