3D Touch
Last week Apple revealed their new offerings for their most successful product and brand, the iPhone.
Most things announced in the keynote were well within the expectations of Apple’s product evolution: new improvements at the top of the line, including a more modern and bigger phone with a 6.5-inch diagonal, their largest phone ever.
iPhone XS
However, there was a massive diversion in their product strategy with the introduction of the iPhone XR. This move is an extremely rare product decision that will likely shape the future of the iPhone and change the way it’s produced and marketed.
For those who didn’t watch the keynote or haven’t read much about it, the XR is a weird phone. Here are some reasons why this phone is completely different from anything we have seen from Apple before:
The last point was, for me, the most important reveal of Apple’s upcoming strategy. It seems that Apple is preparing us for a future in which 3D Touch is dropped from the top flagship line.
So this raises the question: why did this feature fail, and what does it mean for the future of the iPhone? I took a couple of days to break down the UX and history of the infamous 3D Touch and try to shed some light on how its disappearance will reshape the future of Apple’s most important product.
Apple introduced 3D Touch / Force Touch in 2014 as an embedded technology in their evolved trackpads. The technology came as a companion to the Taptic Engine, whose primary goal was to recreate the haptic feel of mechanically pressed or actuated buttons. This was a significant evolution in Apple’s industrial design strategy, which has always favored the reduction of mechanical parts that are prone to break with use.
Force Touch in the MacBook’s Trackpad
The technology made its way into the iPhone and Apple Watch in 2015, with the idea of bringing a new dimension of interaction to touch-capable devices. The “game-changing” technology was one of the main selling points of the iPhone 6S.
Here is one of those now-iconic Jony Ive videos explaining the technology:
iPhone 6S with 3D Touch
As a designer, I remember watching this video for the first time and getting extremely excited about this new level of interaction. But watching it three years later makes one thing loudly clear and evident: this was a solution begging for a problem. None of the interactions demoed in the video are remotely useful.
For example, the interactions that Jony calls “peek and pop” are merely gimmicky alternatives for opening resources like photos and URLs. The video even fails to show how a complete shortcut flow enabled by 3D Touch works, and focuses only on the contextual menus.
The video presents the technology as a step forward in touch sensing, but the reality is that even in a marketing piece it had very little practical use. It seems that Apple’s strategy with 3D Touch was to offer the technology as a primitive in their sensor stack and let the developer community figure out creative and smart ways to use it to augment their experiences. It wouldn’t be the first time Apple had done this; some of their most groundbreaking technologies came from a similar rationale.
So why didn’t that happen? After doing an in-depth breakdown of the feature, here is what I found.
This may be either a cause or a consequence of the failure of 3D Touch. Either way, it’s a fact.
I tested the app icon shortcut menu on a sample of 200 popular iPhone apps. Only 40% of them had a 3D Touch shortcut menu. This adoption rate doesn’t sound terrible until you start digging into the details of the implementations.
For example, Google seems to include a 3D Touch shortcut menu in every app, but I was surprised to find a lack of consistency in their implementations: the Sheets app doesn’t have a shortcut menu while the Docs app does.
This is a weird inconsistency for a group of apps that are part of the same suite.
Google Sheets — No 3D Touch Menu
Google Docs — 3D Touch Menu
Popular apps like Lyft and Bumble don’t have a shortcut menu, and Uber has one only for its ride app, not for the UberEats app.
Lyft — No 3D Touch Menu
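This spotty adoption is hard to excuse given how little work a static shortcut menu requires. A handful of entries in the app’s Info.plist is enough; here is a minimal sketch (the type string and title below are illustrative, not taken from any real app):

```xml
<key>UIApplicationShortcutItems</key>
<array>
  <dict>
    <!-- Opaque identifier your app receives when the shortcut is tapped -->
    <key>UIApplicationShortcutItemType</key>
    <string>com.example.app.search</string>
    <!-- Label shown in the 3D Touch menu -->
    <key>UIApplicationShortcutItemTitle</key>
    <string>Search</string>
    <!-- One of the system-provided icon types -->
    <key>UIApplicationShortcutItemIconType</key>
    <string>UIApplicationShortcutIconTypeSearch</string>
  </dict>
</array>
```

The app then handles the tap in its app delegate’s `application(_:performActionFor:completionHandler:)` method, which is where many of the broken deep links I ran into presumably go wrong.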
Many apps have shortcuts that don’t even work. While testing different shortcuts, I noticed some apps attempting to deep-link into the described view or functionality and getting stuck on white screens. This happened so often that I didn’t even bother to document it. Just go and test it yourself.
Most of the apps that provide a 3D Touch-enabled shortcut menu don’t offer a lot of value in their menus either. Take, for instance, the DoorDash app, whose shortcut menu provides a single option: “Search.”
DoorDash 3D Touch Shortcut Menu
The shortcuts are dull, repeated flows that can be achieved without the 3D Touch interaction. I elaborate more on this in the following point.
Let’s imagine for a minute that 3D Touch was the godsend productivity and time-saving feature it claimed to be. That would mean that by using it, you would speed up your workflows and achieve things in fewer steps. Right?
Well, this is far from the truth. **In all my tests I wasn’t able to find a single shortcut that was more practical and usable than merely using the app with the standard touch capabilities.** Instagram is an excellent example of this failure. Let’s take, for instance, the camera shortcut. If I want to open the camera via the 3D Touch menu, I have to do the following: 1) locate the Instagram icon, 2) force touch it, 3) tap the Camera menu item. Now, if I want to open the camera via the traditional touch interactions, I have to do the following: 1) locate the Instagram icon, 2) tap it to open the app, 3) tap the camera icon.
Instagram 3D Touch Shortcut Menu
Given that it takes the same number of steps to achieve the same path with both interaction methods, there’s no good incentive to diverge from the traditional inputs. This problem is also real for other menu options that are slightly more functional and shortcut-y, like switching accounts. There’s no reason to rely on an interaction that brings limited improvements, and sometimes no improvement at all.
This point is perhaps the most well-known issue with this technology. 3D Touch is exceptionally undiscoverable in the UI layer. If you want to understand the type of actions this interaction enables, you have to force touch everything on your screen and hope to get some output. And sometimes when you do get output, it’s hard to understand what kind of augmentation or functionality the interaction is enabling.
Apple doesn’t even attempt to provide guidance on how to increase the discoverability of 3D Touch. Their Human Interface Guidelines don’t provide context on this topic and only explain the nuances of the useless “peek and pop” concept.
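For context, peek and pop is also cheap to adopt on the developer side, which makes its low mileage even more telling. A minimal UIKit sketch looks roughly like this (class names and the destination view controller are illustrative placeholders; this is the iOS 9–12 API, later deprecated in favor of context menus):

```swift
import UIKit

// Minimal peek-and-pop sketch. FeedViewController and
// PhotoViewController are hypothetical names for illustration.
class FeedViewController: UIViewController, UIViewControllerPreviewingDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Opt in to 3D Touch previews only when the hardware supports it.
        if traitCollection.forceTouchCapability == .available {
            registerForPreviewing(with: self, sourceView: view)
        }
    }

    // "Peek": return the view controller to preview at the touch location.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           viewControllerForLocation location: CGPoint) -> UIViewController? {
        return PhotoViewController() // placeholder destination
    }

    // "Pop": commit to the full view controller when pressure increases.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           commit viewControllerToCommit: UIViewController) {
        show(viewControllerToCommit, sender: self)
    }
}

class PhotoViewController: UIViewController {} // placeholder
```

Note that nothing in this API surfaces the feature to the user; unless the app draws its own hint, the preview is invisible until someone happens to press hard on the right element.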
If you want to experience a bit of what arthritis feels like, I suggest you 3D touch things on your iPhone for a full day. This thing is an ergonomic nightmare. Its main problem is how hard it is to determine the right amount of pressure to trigger the interaction. At its default sensitivity setting (medium), sometimes it seems that a light touch will trigger it, but most of the time it won’t. After a failed attempt to trigger 3D Touch, most users will then apply extreme pressure to counterbalance the apparent amount of force required to enable the interaction.
A home-made test using a food scale and a hand-to-hand comparison revealed that a user could sometimes apply well above 100 grams of pressure to trigger 3D Touch. I’m not saying that’s the actual force required to trigger the interaction, but it might very well be what’s in the user’s mind once the first attempt fails. The fact that you sometimes find yourself applying a quarter of a pound of pressure to trigger 3D Touch makes this feature utterly impractical for daily use.
The iPhone is primarily a touch device. It has other input mechanisms, like the mic, the accelerometer/gyroscope combo, and the camera, but none of them can compare with the effectiveness and efficiency of touching a screen to register intent. Touch is such a dominant input mechanism that the anatomy of the human hand is expected to change based on how we use technology devices with our hands.
This rationale might be why Apple thought that amplifying the number of available touch interactions was a no-brainer. They did it quite successfully with the introduction of multi-touch capacitive screens and the range of gestures and interactions that came from that technology. But 3D Touch is different. It doesn’t provide any practical advantage over a typical capacitive touch. In fact, it does the complete opposite: by relying on physical pressure, 3D Touch sits in a place in the interaction spectrum that clashes with and negates the continued success of the light touch interactions enabled by capacitive screens.
Remember how frustrating it was to use a phone or device with a resistive touchscreen? 3D Touch is a technology that brought back all the unnecessary impedance that made pressure-based screens so frustrating.
Nokia 5800 with a Resistive Screen. The most frustrating phone ever.
While it makes sense to find ways to amplify the range of available interactions, a technology like 3D Touch was inevitably going to suffer an existential crisis, especially since it indirectly competed with one of the features that made the iPhone so beloved and successful.
As mentioned in one of the earlier points, 3D Touch is an unreliable technology.
It’s too hard to determine the amount of pressure required to trigger 3D Touch, which makes it hard to use consistently. But the input is not even the worst part of this technology. 3D Touch is so limited within the UI layer that it cannibalizes the potential experience benefits that other technologies, like the Taptic Engine, could bring to normal touch interactions.
Since 3D Touch interactions usually come paired with haptic feedback produced by the Taptic Engine, the job of that fantastic technology is reduced to the role of “peasant 3D Touch chaperone.”
Although this is not necessarily a reason why 3D Touch failed, it shows how limiting the micro-universe created by 3D Touch is. The feature doesn’t really add value to the final UX of the iPhone, but it’s dominant enough to feel like a limiting factor rather than just an unused technology.
With the introduction of the iPhone XR and the removal of 3D Touch from that device, Apple’s intentions regarding the future of the technology are crystal clear. However, removing the feature from the top of the line is a more challenging process than simply not adding it to future phones.
Apple’s strategy of testing price inelasticity on their high-end models seems to be working. But this strategy only works if Apple keeps adding features in a way that justifies the price increments.
Removing 3D Touch would be a challenge mainly because it means removing one of the technologies that justifies the price of their high-end devices.
It’s unlikely that Apple will remove the technology without first finding a comparable replacement, even if it’s just a software-based alternative input like knuckle detection.
There’s also a small chance that Apple is looking into a new iteration of this technology based on other approaches, like the one described in this patent, or even some sort of weight-sensing screen that would allow devices like the iPad to work as small food scales.
Of course, anything is possible, and the XR could be an indication that Apple’s product strategy for high-end models will also be shaped by the approach to technology integration and deployment seen in this model.
Or maybe the replacement for 3D Touch, in what would be the final nail in the coffin of Steve Jobs’ legacy, could be the addition of Apple Pencil support (something that has been expected for the last two years) and the introduction of an Apple Pencil Mini and special iPhone cases with pencil holders.
In any case, the present reality is that 3D touch is dead and Apple is still dealing with its body…
So what are your thoughts? Are you a hardcore user of 3D Touch and will you miss it if it’s gone? Let me know in the comments.
My name is Juan J. Ramirez (JJ). I’m currently a UX Designer at Amazon Web Services. If you enjoy my writing and ideas don’t hesitate to visit Waveguide, a design knowledge base where I dissect UX patterns and document design. Also make sure to follow me on Twitter to keep up with my projects and articles.
Occasionally I do on-demand UX tear-downs and UX strategy projects for companies and individuals. If you’re interested drop me a line on my website www.whoisjuan.me.