Apple Is Working Hard On An iPhone Rear-Facing 3D Laser For AR And Autofocus: Source

By Mark Sullivan, July 12, 2017

We speculated in June that Apple’s announcement of a new augmented reality development kit (ARKit) telegraphed the addition of new AR iPhone components in the very near future. This turns out to be exactly the case. A source with knowledge of the situation tells Fast Company that Apple is working hard to add a rear-facing 3D laser system to a future iPhone.


The new sensor system will enable better depth detection for augmented reality apps, and a more accurate type of autofocus for photography, the source tells me.

The source said the VCSEL laser system may be intended for the 10th anniversary iPhone (which may be called the iPhone 8 or the iPhone Pro or, hopefully, the iPhone X). Whether the sensor will be included in that phone, or a 2018 iPhone, depends on the progress Apple’s engineers make in integrating the laser system into the phone, our source says.

An Apple spokesperson declined to comment.

We reported the existence of 3D sensors in the new iPhone back in February, but it was unclear at the time where they would go on the phone and what they would do. The picture has come into better focus since then. KGI Securities analyst Ming-Chi Kuo (who is usually accurate on iPhone plans) believes the iPhone 8 front-facing camera will use a laser for 3D selfie effects and to recognize the user’s face for authentication purposes.

Augmenting Augmented Reality

This may be true, but the rear of the phone is where the real action is for 3D lasers. The lasers are a huge part of Apple’s AR story. Right now, ARKit apps rely only on the iPhone’s single camera to map and measure the real world into which digital content is placed. While the resulting experience is already surprisingly good, a 3D laser system on the back of the phone would dramatically improve depth measurement and make the AR experience even more lifelike.

Put in this context, the idea that Apple is working on a rear-facing 3D sensor for the iPhone is no huge revelation. The question is how long Apple can wait after the introduction of ARKit to get the laser into the iPhone.


(Phone-based AR is a clunky, device-in-front-of-face experience, but we’re stuck with this interim step toward AR glasses for the time being, and accurate depth sensing is needed in any event.)

VCSEL laser systems measure the time it takes light to travel from the laser to the target and back to the sensor, a Time of Flight (TOF) measurement from which distance is calculated. The system consists of a source (the VCSEL laser), a lens, a detector (sensor), and a processor. The whole thing costs about $2 per phone, our source says.
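To make the arithmetic concrete, here is a minimal sketch (purely illustrative, not Apple’s implementation) of how a measured round-trip time converts into a distance; the function name and the example timing value are assumptions for illustration.

```swift
import Foundation

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Convert a measured round-trip time (in seconds) into a distance (in meters).
/// The pulse travels out to the target and back, so the one-way distance is half.
func tofDistance(roundTripSeconds: Double) -> Double {
    return speedOfLight * roundTripSeconds / 2.0
}

// A target roughly 1 meter away returns the pulse after about 6.7 nanoseconds.
print(String(format: "%.3f m", tofDistance(roundTripSeconds: 6.67e-9)))
```

At these ranges the detector is resolving differences of nanoseconds, which is why the module pairs the laser and lens with a dedicated sensor and processor.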

Autofocus

The laser system would have big implications for the iPhone camera, too. Laser autofocus offers a faster and more accurate way of measuring the distance to objects in the frame than the more passive method iPhones use now. The laser sends out light beams that bounce off objects and return to the sensor, indicating the precise distance of each. The camera lens can then focus on the desired part of the shot in milliseconds. Laser autofocus systems are already used in smartphones from Google, Huawei, OnePlus, and Asus.
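In simplified terms (an illustrative toy, not how any shipping phone actually works), laser autofocus amounts to reading back per-region distances and choosing one to drive the lens toward; the type and selection rule below are assumptions.

```swift
import Foundation

// Illustrative only: one depth reading for a region of the frame, in meters.
struct DepthSample {
    let regionID: Int
    let distanceMeters: Double
}

/// Pick the distance the lens should focus on. This toy rule simply takes the
/// nearest region; a real pipeline would also weight the user's tap point,
/// detected faces, and so on.
func focusDistance(from samples: [DepthSample]) -> Double? {
    return samples.map { $0.distanceMeters }.min()
}

let samples = [
    DepthSample(regionID: 0, distanceMeters: 2.4),
    DepthSample(regionID: 1, distanceMeters: 0.8),  // nearest subject
    DepthSample(regionID: 2, distanceMeters: 5.1),
]
if let target = focusDistance(from: samples) {
    print("Focus lens at \(target) meters")
}
```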

Current iPhones use another kind of autofocus called phase detection autofocus, which was introduced with the iPhone 6 in 2014 under the name “Focus Pixels.” In general terms, phase detection systems capture light rays coming through opposite sides of the camera lens and compare the resulting pair of images to determine whether the lens is focused too close or too far away. The offset between the two images is used to continually adjust the camera’s focus.
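A rough sketch of the idea (a deliberately simplified toy, not Apple’s Focus Pixels code): correlate the two intensity profiles captured through opposite sides of the lens and read off the offset between them; the sign of that offset says which way the lens is out of focus. All names and sample values here are invented for illustration.

```swift
import Foundation

/// Find the integer shift that best aligns two intensity profiles captured
/// through opposite sides of the lens. A shift of zero means in focus; the
/// sign indicates whether focus is too near or too far.
func phaseShift(left: [Double], right: [Double], maxShift: Int) -> Int {
    var bestShift = 0
    var bestScore = -Double.infinity
    for shift in -maxShift...maxShift {
        var score = 0.0
        for i in 0..<left.count {
            let j = i + shift
            if j >= 0 && j < right.count {
                score += left[i] * right[j]
            }
        }
        if score > bestScore {
            bestScore = score
            bestShift = shift
        }
    }
    return bestShift
}

// The same edge pattern appears two samples apart in the two views.
let left  = [0.0, 0.0, 1.0, 3.0, 1.0, 0.0, 0.0, 0.0]
let right = [0.0, 0.0, 0.0, 0.0, 1.0, 3.0, 1.0, 0.0]
print("Measured shift: \(phaseShift(left: left, right: right, maxShift: 3))")
```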

About The Vendors

Our source says Lumentum will be providing the bulk of the VCSEL lasers for the new iPhone, while Finisar and II-VI (pronounced “two-six”) will contend for a smaller share. Apple has already qualified all three companies as potential suppliers, the source says. In June, executives at both Lumentum and Finisar discussed parts orders in coming quarters of such scale that some analysts surmise that they could have come only from Apple.

The TOF sensor will come from STMicro, Infineon, or AMS, the source says. Apple will likely buy the whole system in module form from either LG Innotek, STMicro, AMS, or Foxconn, said the source.


The iPhone’s Next Chapter

The iPhone 8 arrives on the 10th anniversary of the iPhone, and Apple clearly wants it to represent a new chapter in the life of the device. The new phone is expected to pack several features that are brand new to iPhones, including wireless charging, an edge-to-edge OLED display, and, possibly, sealed side buttons that respond with haptic feedback and are completely waterproof. And, as we have reported, some of these new technologies are difficult to implement.

Nor are they cheap. The iPhone 8 is widely expected to cost hundreds of dollars more than earlier iPhones.

The “iPhone 8” will be one of three new iPhones announced by Apple in the fall. The other two will be the successors to the current iPhone 7 and 7 Plus, and will likely be called the iPhone 7S and 7S Plus.

The addition will make future iPhones far better suited to the AR experiences being created by developers using the ARKit development platform.

