Apple will launch at least one new iPhone model with a triple-lens rear camera next year, according to the Taiwanese website Economic Daily News, which cites a research note from Deutsche Securities analyst Jialin Lu.
According to MacRumors, Lu thinks the triple-lens camera system will enable advanced 3D sensing via stereoscopic vision, with two of the sensors able to capture images of a single object from different angles. A triangulation method would then be used to obtain the distance between the iPhone and the object.
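The stereoscopic approach described above boils down to familiar geometry: if two cameras a known distance apart see the same point at slightly different image positions, the depth follows from the disparity. Here's a minimal sketch of that math, assuming an idealized pinhole stereo model; the function name and numbers are illustrative, not anything Apple has published.

```python
def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate depth from stereo disparity: Z = f * B / d.

    focal_length_px: camera focal length, in pixels
    baseline_m: distance between the two camera centers, in meters
    disparity_px: horizontal shift of the object between the two images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical example: 1000 px focal length, 2 cm baseline, 40 px disparity
print(stereo_depth(1000, 0.02, 40))  # 0.5 (object is half a meter away)
```

Note how depth falls off as disparity shrinks: distant objects shift by only a pixel or two, which is why stereo baselines this small work best at close range.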
The goal? To make the smartphone into the leading augmented reality (AR) device thanks to both front and rear-facing 3D sensing capabilities. Apple is purportedly evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X.
The TrueDepth camera in the iPhone X works by using a projector to cast 30,000 dots on your face, which it then reads with an infrared camera. Apple has already demonstrated applications of the system, Face ID and Animoji, both of which promise implications of their own once the TrueDepth SDK (software development kit) is available to developers. The planned rear-facing sensor would instead use a time-of-flight approach, calculating how long it takes a laser pulse to bounce off surrounding objects to build a 3D picture of the environment.
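The time-of-flight calculation itself is simple physics: light travels at a known speed, so half the round-trip time multiplied by that speed gives the distance. A quick sketch, with illustrative names and values (not based on any Apple implementation):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # meters per second

def tof_distance(round_trip_time_s: float) -> float:
    """Distance from a time-of-flight measurement.

    The laser pulse travels out to the object and back, so the
    one-way distance is half the round trip.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# Hypothetical example: a pulse returning after ~6.67 nanoseconds
# corresponds to an object about 1 meter away
print(round(tof_distance(6.67e-9), 3))
```

The tiny timescales involved (nanoseconds per meter) are exactly why time-of-flight sensing needs dedicated hardware rather than a general-purpose camera.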
Chances this rumor is true, according to the Sellers Research Group (that’s me): 90%. It makes sense with Apple’s focus on AR with the iPhone X, TrueDepth cameras, and ARKit.
Like this article? Consider supporting Apple World Today with a $5 monthly Team AWT membership.