What is Apple's new lidar tech and what can it do for the iPhone 12 Pro?

The iPhone 12 Pro's lidar sensor opens up a lot of possibilities for AR.

Apple

The iPhone 12 and 12 Pro are on sale now, but one of the key differences between the Pro and non-Pro models this year is a depth-sensing technology. If you look closely at one of the new iPhone 12 Pros, or the most recent iPad Pro, you'll see a little black dot near the camera lenses, about the same size as the flash. That's a lidar sensor.

Apple's been very bullish about lidar as a way of adding depth-sensing and new augmented reality features to its pro-end tablets and phones. It could help a lot with camera focus, too.

But why is Apple making a big deal about lidar, and what will it be able to do if you buy the iPhone 12 Pro or iPhone 12 Pro Max? It's a term you'll start hearing a lot now, so let's break down what we know, what Apple is going to use it for, and where the tech could go next.

What does lidar mean?

Lidar stands for Light Detection and Ranging, and it's been around for a while. It works by bouncing laser pulses off objects and timing their return trip to the source, measuring distance from the travel time.
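The timing math itself is simple: with round-trip time t and the speed of light c, the distance is d = c·t/2, because the pulse covers the gap twice. A minimal sketch (the function name is ours, not Apple's):

```python
# Speed of light in a vacuum, metres per second
C = 299_792_458.0

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to whatever reflected the pulse: d = c * t / 2
    (the pulse travels out and back, so halve the round trip)."""
    return C * round_trip_s / 2.0

# A pulse that returns after ~33 nanoseconds came from ~5 m away,
# roughly the quoted range of the iPhone 12 Pro's sensor
d = tof_distance_m(33.356e-9)   # ≈ 5.0 m
```

The nanosecond scale is why lidar needs specialised sensor hardware rather than an ordinary camera shutter.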


The iPad Pro released in the spring also has lidar.

Scott Stein/CNET

How does lidar work to sense depth?

Lidar is a type of time-of-flight camera.

Some other smartphones already measure depth with a single light pulse, whereas lidar sends out waves of light pulses in a spray of infrared dots and measures each one with its sensor, creating a field of points that maps out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the naked eye (they're infrared), but you could see them with a night-vision camera.
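As a rough illustration of that field of points, each dot's timing can be turned into a 3D position once you know the direction its ray left the sensor. This is a sketch under assumed inputs (per-dot round-trip times and unit ray directions, our simplification, not Apple's actual sensor format):

```python
# Speed of light in a vacuum, metres per second
C = 299_792_458.0

def dots_to_points(round_trip_times_s, ray_directions):
    """Convert each infrared dot's round-trip time, plus the unit
    direction of its ray, into a 3D point. Together the points form
    the field that 'meshes' out a space."""
    points = []
    for t, (dx, dy, dz) in zip(round_trip_times_s, ray_directions):
        rng = C * t / 2.0            # distance along this dot's ray
        points.append((rng * dx, rng * dy, rng * dz))
    return points

# Two dots: one straight ahead, one angled off to the side
pts = dots_to_points([20e-9, 30e-9],
                     [(0.0, 0.0, 1.0), (0.6, 0.0, 0.8)])
```

A real sensor fires thousands of such dots per frame; the principle per dot is the same.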

Isn't this like Face ID on the iPhone?

It is, but with longer range. The idea's the same: Apple's Face ID-enabling TrueDepth camera also shoots out an array of infrared lasers, but it only works up to a few feet away.

The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to five meters.

Remember the Kinect?

Sarah Tew/CNET

Lidar's already in a lot of other tech

Lidar is a tech that's sprouting up everywhere: it's used in self-driving and assisted-driving cars, in robotics, and in drones.

Augmented reality headsets like the HoloLens 2 have similar tech, mapping out room spaces before layering 3D virtual objects into them. But depth-sensing has a pretty long history: Microsoft's old Xbox accessory, the Kinect, was a camera with infrared depth-scanning, too.

In fact, PrimeSense, the company that helped make the Kinect's tech, was acquired by Apple in 2013. Now we have Apple's face-scanning TrueDepth camera and rear lidar sensors.

The iPhone 12 Pro camera could work better with lidar

Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better low-light focus, saying it'll be up to 6x faster in low-light conditions.

The lidar depth-sensing will also be used to improve night portrait mode effects. Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images, too. Although that element hasn't been laid out yet, Apple's front-facing depth-sensing TrueDepth camera has been used in a similar way with apps.

Snapchat's already enabling AR lenses using the iPhone 12 Pro's lidar.

Snapchat

It will also enhance augmented reality a lot

Lidar will allow iPhone 12 Pros to launch AR apps a lot more quickly, and to build a fast map of a room to add more detail.

A lot of Apple's AR updates in iOS 14 are taking advantage of lidar to hide virtual objects behind real ones (called occlusion), and place virtual objects within more complicated room-mappings, like on a table or chair. But there's an extra potential beyond that, with a longer tail. A lot of companies are dreaming of headsets that will blend virtual objects and real ones: these AR glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap, and most likely Apple and others, will rely on having advanced 3D maps of the world to layer virtual objects onto.
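Occlusion itself boils down to a per-pixel depth comparison: the virtual object is only drawn where it sits closer to the camera than the real surface lidar measured. A hedged sketch of that idea (names and flat pixel lists are our simplification, not ARKit's API):

```python
def composite(virtual_depths, real_depths, virtual_px, camera_px):
    """Per-pixel occlusion test: show the virtual pixel only where
    the virtual surface is in front of the lidar-measured one.
    A depth of None means the virtual object doesn't cover that pixel."""
    out = []
    for vd, rd, vp, cp in zip(virtual_depths, real_depths,
                              virtual_px, camera_px):
        out.append(vp if vd is not None and vd < rd else cp)
    return out

# A virtual cup 1.0 m away, partly hidden behind a real chair 0.8 m away
frame = composite([1.0, 1.0, None],      # virtual object's depth per pixel
                  [0.8, 2.5, 2.5],       # lidar depth map per pixel
                  ["cup"] * 3, ["room"] * 3)
# frame -> ["room", "cup", "room"]: the chair hides the first pixel
```

Without a depth map, AR systems have to guess which surface is in front, which is why virtual objects used to float unconvincingly on top of everything.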

Those 3D maps are being built now with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But, there's a possibility that people's own devices could eventually help crowdsource that info, or add extra on-the-fly data. AR headsets like Magic Leap and HoloLens already pre-scan your environment before layering things into it, and Apple's lidar-equipped AR tech works the same way.

In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part...and could pave the way for Apple to make its own glasses eventually.

A 3D room scan from Occipital's Canvas app, enabled by depth-sensing lidar on the iPad Pro. Expect the same for the iPhone 12 Pro, and maybe more.

Occipital

3D scanning could be the killer app

Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical things like home improvement, or even social media and journalism.

The ability to capture 3D data and share that info with others could open up these lidar-equipped phones and tablets to be 3D content-capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
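Once you have that point cloud, basic measurements fall straight out of the geometry. A minimal sketch under our own naming (not a real scanning app's API):

```python
import math

def extents_m(points):
    """Axis-aligned width, height and depth of a scanned point cloud."""
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

def span_m(p, q):
    """Straight-line distance between two scanned points, e.g. to
    size up a wall without a tape measure."""
    return math.dist(p, q)

# Three corners of a hypothetical tabletop, scanned 2 m from the phone
table = [(0.0, 0.0, 2.0), (1.2, 0.0, 2.0), (1.2, 0.75, 2.0)]
size = extents_m(table)   # (1.2, 0.75, 0.0): a 1.2 m x 0.75 m flat top
```

Apps like Apple's own Measure do something conceptually similar, with the lidar data tightening up the estimates.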

Remember Google Tango? It had depth-sensing, too.

Josh Miller/CNET

Apple isn't the first to explore tech like this on a phone

Google had this same idea in mind when it created Project Tango, an early AR platform that appeared on only two phones.

The advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces.

Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that have done estimated depth sensing on cameras without needing the same hardware.

But Apple's iPhone 12 Pro looks like the much more advanced successor.
