A summary of AR at WWDC 2018

by Mike Post, June 12th, 2018

This summer I’ve started building an AR & machine learning prototype with a friend. We’re seeing if the technology is ready for an app idea we’d love to work on.

As part of the knowledge ramp up, I’ve decided to compile a list with the key points on each AR-related session from this year’s WWDC. I find that WWDC is sometimes information overload, and I’m sometimes a little fragmented about what areas a collection of sessions focuses on.

My essential WWDC sessions that I had to see live.

It’s not easy to curate the sessions based on Apple’s website and app. Thankfully they do make it easy to integrate the WWDC app with your calendar, so if you’re anal about setting yourself alerts like me, you can.

However, a title isn’t enough. I just need a small set of bullet points summing up what’s in each session. So I’m attempting to do that myself, with the objective of making it a little easier to find what I might need to refer back to in future, and hopefully helping others do the same.

This will focus less on the announcements of what’s new in AR, and more on the content breakdown of each talk. The sessions below are listed in date order.

Platforms State of the Union

A summary of the AR related concepts that will be covered over WWDC, and an intro into USDZ.

  • Because this session is a general keynote for developers, AR isn’t mentioned until about 1:21:45 in.
  • The first reveal of USDZ is in this session — the new “universal” AR format. It mentions some of the content creation tools for USDZ, and the format of USDZ itself.
  • Adobe introduces its platform for USDZ, Project Aero.
  • A general overview is given of the topics that are going to be covered in later sessions regarding ARKit 2 — improved face tracking, environment texturing (making objects look more realistic), image detection, 3D object detection (this was already in iOS 11’s Vision framework to an extent, but the object tracking looks to be improved), and saving and sharing AR maps.

What’s New in ARKit 2

Good recaps on most of what ARKit does, app demos on new features, and some light dives into coding examples.

  • Starts off with a broad overview of what ARKit is, and showcases some demo apps. The bedrock is ARAnchor objects, which are delivered at up to 60 FPS via each ARFrame.
  • Explains the importance of ARWorldMap, and how mapping can now be persisted (see the sketch after this list).
  • A worldMappingStatus variable on an ARFrame will indicate the accuracy of the mapping.
  • A brief demo on realistic texturing matching the environment, to give control over lighting, shadows, reflections, etc., on your AR objects.
  • Demos on tracking real-world images, an extension of object detection for images. This is done with ARImageAnchor, via ARFrame updates.
  • Object detection on 3D real world objects (as opposed to image detection & tracking on 2D images). Again, this is done via the ARFrame updates, but with the new ARObjectAnchor.
  • Enhancements on face tracking via the ARFaceAnchor, including Apple’s very proud accomplishment of tongue tracking.
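To make the persistence point concrete, here’s a minimal sketch of saving and restoring an ARWorldMap with the iOS 12 API; the function names, file URL handling, and run options are my own placeholder choices:

```swift
import ARKit

// Save the session's current world map to disk (iOS 12+).
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("World map unavailable: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        do {
            let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true)
            try data.write(to: url)
        } catch {
            print("Failed to save world map: \(error)")
        }
    }
}

// Restore a saved map so anchors reappear in the same real-world positions.
func restoreWorldMap(from url: URL, into session: ARSession) throws {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```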

Screenshot from Apple’s WWDC 2018.

Integrating Apps and Content with AR Quick Look

A deeper dive into a new feature in iOS that provides a way to preview any AR object from a USDZ file.

  • There’s a great sequence diagram presented (I wish more sessions would have these!) for previewing USDZ objects, in which the QLPreviewController plays a central role (see the sketch after this list).

Screenshot from Apple’s WWDC 2018.

  • For web developers, it covers HTML samples for how to preview USDZ objects in Safari.
  • Then it goes into a deep dive on how to create the actual USDZ objects, with more examples on new AR texturing capabilities.
  • There’s also a quick overview on how to optimize the files to keep the size down, and a breakdown of the files that make up the USDZ format.
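For reference, here’s a minimal sketch of that QLPreviewController flow; the class name and the bundled “teapot.usdz” asset are placeholders:

```swift
import UIKit
import QuickLook

// AR Quick Look in a nutshell: QLPreviewController can preview any USDZ file.
class ModelPreviewViewController: UIViewController, QLPreviewControllerDataSource {

    func presentARQuickLook() {
        let previewController = QLPreviewController()
        previewController.dataSource = self
        present(previewController, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // NSURL conforms to QLPreviewItem, so a file URL is all that's needed.
        let url = Bundle.main.url(forResource: "teapot", withExtension: "usdz")!
        return url as NSURL
    }
}
```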

Inside SwiftShot: Creating an AR Game

Covers world map sharing, networking, and the physics of how to build an AR game, as well as some design insight (I have limited game dev experience, so I’ll do the best I can below).

  • Pointers to remember when designing an AR game, such as “encouraging” the user to slowly move the device for world mapping!
  • It demonstrates the usage of image & object detection, world map sharing, and iBeacons for the game (see the sketch after this list).
  • Integrating ARKit with SceneKit and Metal, including the translation of physics data between each — position, velocity, and orientation.
  • Performance enhancement with the BitStreamCodable protocol.
  • A small look at how audio was integrated into the game.
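As an illustration of the world map sharing piece, here’s a minimal sketch of sending the current ARWorldMap to peers over MultipeerConnectivity, assuming an already-connected MCSession (SwiftShot’s actual networking layer, with BitStreamCodable, is more elaborate):

```swift
import ARKit
import MultipeerConnectivity

// Serialize the current world map and send it to all connected peers.
// Error handling is elided for brevity.
func shareWorldMap(from arSession: ARSession, over mcSession: MCSession) {
    arSession.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
    }
}
```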

Creating Great AR Experiences

Best practices, mainly from a UX & design perspective (there are no code samples in this session).

  • Logical dos and don’ts that may be useful if you need help thinking about the product and empathizing with the user.
  • They emphasize the importance of using transitions between AR scapes.
  • Why AR is a special combination of touch and movement.
  • They advise that minimal battery impact should be a huge focus! This is a challenge, given that they recommend rendering at 60 FPS to avoid latency.
  • There’s a lengthy demonstration of creating an AR fireplace, with complex texturing, etc. It looks great, but unfortunately there were no coding samples accompanying the demo.

Object Tracking in Vision

While technically not AR, the Vision framework (new in iOS 11) works closely with AR and machine learning in iOS.

  • It seems to go into better detail than WWDC 2017 did in describing exactly how to use Vision. They break the explanations down into requests (VNRequest), request handlers (VNImageRequestHandler or VNSequenceRequestHandler), and results (VNObservation); see the sketch after this list.
  • A breakdown of the different types of requests and what to use them for.
  • A breakdown of the request handlers and when to use them, and what is contained in the observations.
  • Performance optimization, and where to place your request(s).
  • Covers what’s new in iOS 12, including better face detection and face tracking. Distinguishes between face detection and tracking, because they are different.
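Here’s a minimal sketch of that request/handler/observation flow applied to object tracking; the initial bounding box is a placeholder that would normally come from a detection request or a user-drawn rectangle:

```swift
import Vision
import CoreGraphics

// One sequence handler is reused across frames so Vision can track state.
let sequenceHandler = VNSequenceRequestHandler()

// Placeholder seed observation (normalized coordinates, origin bottom-left).
var lastObservation = VNDetectedObjectObservation(
    boundingBox: CGRect(x: 0.4, y: 0.4, width: 0.2, height: 0.2))

func track(on pixelBuffer: CVPixelBuffer) {
    // Build a tracking request from the most recent observation.
    let request = VNTrackObjectRequest(detectedObjectObservation: lastObservation)
    request.trackingLevel = .accurate  // trade speed for accuracy

    try? sequenceHandler.perform([request], on: pixelBuffer)

    if let observation = request.results?.first as? VNDetectedObjectObservation {
        lastObservation = observation  // feed into the next frame's request
    }
}
```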

Understanding ARKit Tracking and Detection

A good broad overview of all of the main AR concepts.

  • This is such a good intro into not only AR on iOS, but AR in general, that it should have been part of 2017’s sessions when ARKit was first introduced. Better late than never. If you’re only going to watch one session, watch this one!
  • It recaps the main features of ARKit — orientation tracking, world tracking, and plane detection — and demos all of these in depth with coding samples (see the sketch after this list).
  • It then demos the new features of ARKit 2 — shared world mapping, image tracking, and object detection (which has been available in the Vision framework recapped above, but is now also accessible in ARKit).
  • A good explanation of a core AR principle, Visual Inertial Odometry, is given. Short of going into the actual physics equations behind it, this should give you a great understanding of VIO.
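To ground the recap, here’s a minimal sketch of running a world tracking session with plane detection, assuming a view controller that owns an ARSCNView named sceneView (the names are mine, not the session’s):

```swift
import UIKit
import ARKit

class TrackingViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        // World tracking gives 6DOF tracking; plane detection finds surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    // Called whenever ARKit adds an anchor, e.g. a newly detected plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Detected plane with extent \(planeAnchor.extent)")
    }
}
```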

So all up, there were 5 quality sessions on AR in 2018! 7 if you include the State of the Union, and the Vision session with shared concepts. Apple may not have released a dedicated AR device (yet), but they’ve really upped their game since introducing their AR framework in 2017.

The thing is, if Apple actually releases an AR device in the next 2, 5, or even 10 years, these concepts and frameworks will form the foundation for building AR apps on any future device.

Familiarizing ourselves with these concepts now will better prepare us for the future platforms and devices that Apple (or Google’s better Glass, or Snap, or someone else) will inevitably create.