
What’s a LiDAR sensor doing on an iPhone anyway?


For our regular readers and the geospatial community in general, Light Detection and Ranging or LiDAR is part of common parlance. For everyone else, it’s a cool new tech that Apple first incorporated in iPad Pro and has now adapted for its iPhone 12 Pro models.

You will find the sensor integrated into the rear camera cluster of the Pro series. But what exactly is a LiDAR scanner doing on an iPad or iPhone in the first place?

By measuring how long it takes for emitted light pulses to reach an object and reflect back, a LiDAR sensor provides incredible depth-sensing capabilities. And unlike other depth sensors, which perform best in daylight, LiDAR works just as well in low-light environments. For Apple products, this pans out broadly into two main application areas:
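The time-of-flight principle behind this can be sketched in a few lines: distance is the round-trip travel time of a light pulse multiplied by the speed of light, halved. This is a minimal illustration of the math, not Apple's actual sensor pipeline, and the function name is ours.

```python
# Time-of-flight ranging: a LiDAR sensor times how long a light pulse
# takes to travel to an object and back, then halves the round trip.
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_round_trip(seconds: float) -> float:
    """Distance to the target given the round-trip travel time of a pulse."""
    return SPEED_OF_LIGHT * seconds / 2

# A pulse returning after roughly 33 nanoseconds indicates a target
# about 5 metres away -- comfortably within the scanner's indoor range.
print(round(distance_from_round_trip(33.356e-9), 2))
```

The halving is the key step: the timed interval covers the trip out *and* back, so forgetting it would double every measured distance.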

Improved AR experiences

The LiDAR scanner on iPhone and iPad can be used to create incredible AR experiences that interact with real-world objects. Developers can instantly place AR objects in the real world and take advantage of the device’s depth information to create life-like experiences with real-world physics, object occlusion, and lighting effects. Think better home design tools, interactive shopping apps, and immersive gaming.

iPhone 12 Pro and Pro Max users will first see this technology being leveraged in Snapchat’s iOS app. The social networking company plans to launch a multitude of LiDAR-powered Lenses specifically for the new iPhone and iPad.

“The addition of the LiDAR Scanner to iPhone 12 Pro models enables a new level of creativity for augmented reality,” says Eitan Pilipski, Snap’s SVP of Camera Platform, pointing out how LiDAR allows Snapchat’s camera to see a metric scale mesh of the scene, understanding the geometry and meaning of surfaces and objects.

The ongoing global pandemic has already shown us how technologies like AR and 3D scanning can open up a world of opportunities for businesses. In the last 3 months alone, spatial data company Matterport has seen more than 100,000 downloads of its iPhone app that can be used to create virtual walkthroughs of 3D spaces.

“Organizations opened virtual museum tours globally, enabled insurance and restoration claims to continue, as well as allowed apartments, offices, and warehouses to continue to be rented,” explains Robin Daniels, CMO, Matterport. “With the announcement of the iPhone 12 Pros with LiDAR, Apple has taken a big step in making powerful 3D sensing hardware mainstream. We couldn’t be more excited about this development and its applications.”

Better low-light photography

Apart from providing more realistic AR experiences, the LiDAR sensor on iPhone 12 Pro models will improve the camera's autofocus by up to 6x in low-light scenes. This translates into greater accuracy and reduced capture time in photos and videos. Users will also be able to capture Night mode portraits with a beautiful low-light bokeh effect.

The Pro series is priced a couple of hundred dollars above the iPhone 12, which is not a bad bargain for cutting-edge tech like a LiDAR sensor. What do you think?

Ishveena is a geospatial enthusiast and a freelance technology writer who has been named among Geospatial World's 50 Rising Stars 2021. With 13 years of mainstream journalism and digital content writing experience, Ishveena is passionate about bringing to the fore the value of location technology to the economy and society. Her clients include GIS corporations, proptech companies, fintech leaders, and some of the world's top drone manufacturers and service providers.
