
Apple plans to add a 3D sensor to the back of the iPhone, too

(Source: arstechnica.com)

The iPhone X’s front-facing TrueDepth sensor array could be used for more than just Face ID authentication, and it fits neatly into Apple’s broader push into augmented reality on the iPhone. For AR, though, the iPhone X’s rear still relies on a combination of motion sensors and its two conventional cameras. That could change in a future iPhone; sources cited by Bloomberg claim that Apple plans to add 3D camera technology to the rear of next year’s iPhone, in addition to the TrueDepth array already on the iPhone X’s front.

The rear camera might not use the same technology as the TrueDepth sensor array that powers Face ID on the front of the iPhone X, however. Instead, the rear array might use time-of-flight sensors, which map objects in 3D space by measuring how long laser light takes to bounce back from objects in the sensor’s field of view. Bloomberg’s sources say that adoption of this technology is not certain, but it appears to be what Apple is testing right now. The technology is in development at Sony, Panasonic, Infineon Technologies, and STMicroelectronics.
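As a rough sketch of the time-of-flight principle, not Apple’s or any supplier’s actual implementation, the distance to an object follows directly from the round-trip travel time of a light pulse (the measured time below is a hypothetical value):

```swift
import Foundation

// Illustrative time-of-flight distance estimate.
let speedOfLight = 299_792_458.0      // metres per second
let roundTripTime = 20.0e-9           // 20 nanoseconds, a made-up measurement

// The pulse travels out to the object and back, so halve the round trip.
let distance = speedOfLight * roundTripTime / 2.0
print(String(format: "Estimated distance: %.2f m", distance))   // about 3.00 m
```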

In the iPhone X, Apple aligned the telephoto and wide-angle rear cameras vertically (instead of horizontally, as on the iPhone 8 Plus) to make augmented reality applications more effective. But without a more advanced way to read and track 3D space, AR apps will remain limited. Unlike more robust hardware such as Microsoft’s HoloLens, the current iPhones’ rear cameras can’t deal well with surfaces that aren’t flat. They also can’t tell when a real object sits between the camera and a virtual one; current iPhone AR apps place virtual objects relative to a detected flat surface but can’t partially hide them behind a real-world obstacle, for example.
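To make that limitation concrete, here is a minimal sketch of how an ARKit session is typically configured today; the class name is a placeholder, not code from any shipping app. ARKit hands the app anchors for flat surfaces it detects, but it builds no depth map of the rest of the scene, which is why virtual objects can’t be hidden behind real ones:

```swift
import UIKit
import SceneKit
import ARKit

// Placeholder view controller showing a typical ARKit plane-detection setup.
class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self

        // Current ARKit world tracking detects horizontal planes only; there is
        // no sensor-level depth for arbitrary geometry, so occlusion by real
        // objects isn't possible.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit finds a flat surface; apps attach virtual content
    // relative to anchors like this one.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Place virtual objects on `node` here.
    }
}
```

A rear-facing depth sensor would give apps distance information for the whole scene rather than just a set of flat planes, which is what richer occlusion and surface tracking would require.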

The addition of 3D sensors to the rear of the iPhone would address those limitations, allowing for much more realistic—and in some cases, more useful—AR experiences.

Apple CEO Tim Cook has been aggressively promoting AR to both consumers and investors. In a recent interview with The Independent, Cook said that he expects the adoption and impact of AR to be as dramatic as that of mobile apps when the Apple App Store launched more than nine years ago. There are also reports that Apple is working on an AR headset in a company group called T288, which has already produced ARKit, Apple’s AR software toolset for app developers.

The AR app marketplace is nascent now, but Apple wants AR to be more meaningful than Pokémon Go and a neat IKEA furniture shopping app. Even Warby Parker’s impressive virtual try-on app for glasses is just a hint of what might come later.

But if a future iPhone adds this rear-facing sensor, fragmentation of Apple’s installed base could be a challenge; between the 2019 iPhone, the iPhone X, the iPhone 8 series, and earlier ARKit-capable iPhones like the iPhone 6S and 7, Apple and third-party app developers will have to support four different AR hardware configurations. The prospects for AR are promising, but realizing them is going to be a bit messy.

Correction: This article originally stated that this feature was being developed for a 2018 release. The report actually indicated 2019. The article has been updated accordingly.

More Info: arstechnica.com
