Citing a source with knowledge of Apple's plans for its new iPhones, Fast Company reports that Apple will introduce a rear-facing 3D sensor array in new iPhone models. Apple plans to buy the laser components for the array from Lumentum, the California-based company that already supplies the front-facing TrueDepth lasers found in today's iPhones.
The publication’s source says that Apple engineers have been working on the rear-facing 3D camera for two years, and it is currently planned for inclusion in at least one model later this year. However, the timing could still change.
Apple would not be alone in putting this feature in a 2020 flagship phone. Samsung's new Galaxy S20+ and S20 Ultra, announced just last month, have rear-facing time-of-flight (ToF) sensors. They are used for Live Focus (an optional depth-of-field blur effect in photos) and Quick Measure (which lets users measure objects in front of them).
Apple, though, has made much more progress developing APIs and tools that third-party developers and its own software teams can use to create new experiences and functionality.
Apple introduced the similar front-facing TrueDepth array in the iPhone X in 2017. Its key feature is Face ID, which scans the user’s face to authenticate them before unlocking access to personalized files and services on the device. It’s also used for Animojis and Memojis, animated avatars used by some in the Messages app.
However, there is significant potential in the tech that remains largely untapped. Apple provided developers with the tools they needed to use the TrueDepth features in their apps, and there have been some applications, but they have mostly been gimmicks and games like Nathan Gitter's eyebrow-controlled Rainbrow.
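To give a sense of what those tools look like, here is a minimal sketch of ARKit face tracking, the kind of API an eyebrow-driven game like Rainbrow builds on: the TrueDepth camera tracks the face and reports expression coefficients ("blend shapes") every frame. The class and property names are ARKit's; the threshold and the game logic are placeholders for illustration.

```swift
import ARKit

// Minimal sketch: use the TrueDepth camera to track the user's face and read
// an eyebrow "blend shape," the kind of signal an app like Rainbrow builds on.
final class FaceExpressionReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires TrueDepth hardware (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Blend shapes are coefficients from 0.0 (neutral) to 1.0 (fully expressed).
            let browRaise = face.blendShapes[.browInnerUp]?.floatValue ?? 0
            if browRaise > 0.5 {            // arbitrary threshold, for illustration
                // e.g. move the on-screen character up
            }
        }
    }
}
```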
Apple may hope that adding these sensors to the rear of the device will inspire new, more useful applications. And then there’s the obvious augmented reality connection.
Cupertino introduced ARKit, a set of tools and features available to developers for making augmented reality apps on iOS, in 2017. We wrote a detailed explainer about it in 2018 after ARKit 2 was introduced. The latest version is ARKit 3, introduced alongside iOS and iPadOS 13 in late 2019. ARKit 3 introduced people occlusion, built-in motion-capture tools, multiple face tracking, and the ability for multiple users to work simultaneously in the same AR environment, among other things.
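As a rough sketch of how a few of those ARKit 3 capabilities surface to developers (the configuration classes and properties below are ARKit's; the structure and feature checks are only illustrative):

```swift
import ARKit

// Sketch: opt in to a few ARKit 3 features on devices that support them.
func makeWorldTrackingConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()

    // People occlusion: virtual content is hidden behind real people, using an
    // estimated per-frame depth map of detected bodies.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    // Shared sessions: the session emits collaboration data that peers exchange,
    // so multiple users can see and place anchors in the same AR environment.
    config.isCollaborationEnabled = true
    return config
}

// Multiple face tracking with the front-facing TrueDepth camera.
func makeFaceTrackingConfiguration() -> ARFaceTrackingConfiguration {
    let config = ARFaceTrackingConfiguration()
    config.maximumNumberOfTrackedFaces = ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    return config
}
```

Motion capture lives in a separate configuration, ARBodyTrackingConfiguration, which reports a skeleton for a tracked person.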
Apple CEO Tim Cook has previously said he believes augmented reality will at some point be a watershed moment akin to the introduction of the App Store, and there has been plenty of evidence that the company has been working internally on an AR glasses product. Earlier this week, 9to5Mac claimed to have uncovered evidence of a new AR app in leaked iOS 14 code, as well as indications that a rear ToF sensor is also coming to new iPad Pro models. The app would reportedly let users get heads-up information on products and other objects in the spaces around them and would be used both in Apple Stores and at Starbucks locations.
Up to this point, AR apps on the iPhone have relied on the differences between the multiple traditional cameras on the back of newer iPhones to estimate depth, but that approach is not as precise as what this new sensor array would allow. Quoted in the Fast Company article, Lumentum VP Andre Wong says that AR apps haven't taken off in a huge way partly because of the lack of this depth-sensing capability and what that means for the quality of the experiences:
When you use AR apps without depth information, it’s a bit glitchy and not as powerful as it ultimately could be… Now that ARKit and (Google’s) ARCore have both been out for some time now, you’ll see new AR apps coming out that are more accurate in the way they place objects within a space.
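For context, here is a rough sketch, using Apple's AVFoundation classes, of how an app can already stream that camera-based depth from a dual-camera iPhone: the disparity between the two lenses is converted into per-pixel depth, and a ToF sensor would presumably feed better data into this same kind of pipeline. Error handling and the selection of a depth-capable capture format are trimmed for brevity.

```swift
import AVFoundation

// Sketch: stream depth derived from the rear dual camera's two viewpoints.
// This is the camera-based depth that current rear-facing AR effects lean on.
final class RearDepthReader: NSObject, AVCaptureDepthDataOutputDelegate {
    let session = AVCaptureSession()
    let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        guard let camera = AVCaptureDevice.default(.builtInDualCamera,
                                                   for: .video,
                                                   position: .back) else { return }
        session.beginConfiguration()
        session.addInput(try AVCaptureDeviceInput(device: camera))
        session.addOutput(depthOutput)
        // Note: choosing a depth-capable activeDepthDataFormat is omitted here.
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth-queue"))
        session.commitConfiguration()
        session.startRunning()
    }

    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // Convert disparity to metric depth before using it for placement or measurement.
        let depth = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
        _ = depth.depthDataMap   // CVPixelBuffer of per-pixel depth in meters
    }
}
```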
Introducing these more precise tools will almost certainly improve AR on iPhones, but it's probably not enough to bring about the watershed moment Cook has predicted. There's still something awkward about using AR experiences on a phone, and the success of the rumored glasses product might be needed to start a developer gold rush.
In the meantime, though, Apple seems to be staying focused on AR, building out the hardware and software that let developers experiment and bring new ideas to the App Store. As we wrote in our ARKit 2 explainer, it's a long game: a vast, refined array of APIs and tools would be necessary for third-party developers to adopt AR quickly once a glasses product arrives. Building all that now on the admittedly flawed-for-this-purpose iPhone platform would mean Apple could hit the ground running when and if its glasses finally come to market.
More immediately, the new rear sensors would likely enable some neat new camera features, and the camera is the main battleground in the features arms race between Apple and its Android-wielding competitors like Samsung.