The goal? To make the smartphone the leading augmented reality (AR) device thanks to both front- and rear-facing 3D sensing capabilities. Apple is evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X, according to Bloomberg, quoting unnamed “people familiar with the plan.”
The TrueDepth camera for the iPhone X works by using a projector to cast 30,000 dots on your face, which it then reads with an infrared camera. Apple has already demonstrated applications of the system, Face ID and Animoji, and more should follow once the TrueDepth SDK (software development kit) is available to developers. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser to bounce off surrounding objects to create a 3D picture of the environment, according to Bloomberg.
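The time-of-flight idea is simple at its core: a pulse of light travels to an object and back, so distance is the speed of light times the round-trip time, divided by two. Here's a back-of-the-envelope sketch of that math (the numbers and function are illustrative, not anything from Apple's hardware):

```python
# Time-of-flight depth sensing in a nutshell: emit a laser pulse, time its
# round trip to an object and back, and halve the distance light traveled.
# Hypothetical example values; real sensors do this per-pixel in hardware.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to an object given the laser pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after roughly 6.67 nanoseconds came from about 1 meter away.
print(round(tof_distance(6.67e-9), 2))
```

The tiny timescales involved (nanoseconds per meter) are why time-of-flight needs precise timing circuitry, but they also let a sensor map a whole room rather than just a face at arm's length.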
Chances this rumor is true, according to the Sellers Research Group (that’s me): 90%. It fits with Apple’s focus on AR via the iPhone X, TrueDepth cameras, and ARKit.