What’s a LiDAR sensor doing on an iPhone anyway?
For our regular readers and the geospatial community in general, Light Detection and Ranging, or LiDAR, is part of common parlance. For everyone else, it’s a cool new tech that Apple first incorporated in the iPad Pro and has now adapted for its iPhone 12 Pro models.
You will find the sensor integrated into the rear camera cluster of the Pro series. But what exactly is a LiDAR scanner doing on an iPad or iPhone in the first place?
By measuring how long it takes for light to reach an object and reflect back, a LiDAR sensor provides incredible depth-sensing capabilities. Unlike other depth sensors that perform better in daylight conditions, LiDAR works just as well in low-light environments. For Apple products, this pans out broadly into two main application areas:
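The depth measurement described above boils down to simple time-of-flight arithmetic: distance is the round-trip travel time of a light pulse multiplied by the speed of light, halved. A minimal sketch in Python (the function name and example timing are illustrative, not Apple's actual implementation):

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def distance_from_round_trip(seconds: float) -> float:
    """Distance to an object, given a light pulse's round-trip time."""
    # The pulse travels out to the object and back, so halve the path.
    return SPEED_OF_LIGHT_M_S * seconds / 2

# A pulse returning after roughly 33 nanoseconds implies an object about
# 5 metres away -- around the range reported for Apple's scanner.
print(distance_from_round_trip(33.356e-9))
```

The halving step is the part newcomers most often miss: the sensor times the full out-and-back journey, not a one-way trip.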
Improved AR experiences
The LiDAR scanner on iPhone and iPad can be used to create incredible AR experiences that interact with real-world objects. Developers can instantly place AR objects in the real world and take advantage of the device’s depth information to create life-like experiences with real-world physics, object occlusion, and lighting effects. Think better home design tools, interactive shopping apps, and immersive gaming.
iPhone 12 Pro and Pro Max users will first see this technology being leveraged in Snapchat’s iOS app. The social networking company plans to launch a multitude of LiDAR-powered Lenses specifically for the new iPhone and iPad.
“The addition of the LiDAR Scanner to iPhone 12 Pro models enables a new level of creativity for augmented reality,” says Eitan Pilipski, Snap’s SVP of Camera Platform, pointing out how LiDAR allows Snapchat’s camera to see a metric scale mesh of the scene, understanding the geometry and meaning of surfaces and objects.
The ongoing global pandemic has already shown us how technologies like AR and 3D scanning can open up a world of opportunities for businesses. In the last three months alone, spatial data company Matterport has seen more than 100,000 downloads of its iPhone app, which can be used to create virtual walkthroughs of 3D spaces.
“Organizations opened virtual museum tours globally, enabled insurance and restoration claims to continue, as well as allowed apartments, offices, and warehouses to continue to be rented,” explains Robin Daniels, CMO, Matterport. “With the announcement of the iPhone 12 Pros with LiDAR, Apple has taken a big step in making powerful 3D sensing hardware mainstream. We couldn’t be more excited about this development and its applications.”
Better low-light photography
Apart from enabling more realistic AR experiences, the LiDAR sensor on iPhone 12 Pro models speeds up the camera’s autofocus by up to 6x in low-light scenes. This translates into greater accuracy and reduced capture time for photos and videos. Users will also be able to capture Night mode portraits with a beautiful low-light bokeh effect, such as the one you see below:
The Pro series is priced a couple of hundred dollars above the iPhone 12, which isn’t a bad bargain for tech as cool as a LiDAR sensor. What do you think?
TomTom is collecting data from Uber to update its navigation maps
TomTom and Uber are joining forces to ensure our digital maps mirror real-road conditions. Uber has joined TomTom’s Map Editing Partnership (MEP) – a program wherein TomTom opens up its map-making process to trusted partners, allowing them to provide fresh, accurate location information that would help keep the maps up to date.
With the Uber app being used across more than 10,000 cities around the world, the ride-hailing and delivery company is uniquely placed to offer TomTom the latest on-the-ground insights from completed trips and deliveries – including observations about new roads, turn restrictions, street closures, etc.
In return, Uber will get to integrate TomTom’s ever-more-accurate maps, traffic data, and Maps APIs across its global platform to provide a seamless experience to users. This extends the existing partnership between TomTom and Uber that began in 2015.
But what about map quality?
TomTom’s MEP trainers and subject matter experts teach trusted Uber drivers how to best edit a map. Once Uber’s editors gain certification through TomTom’s customized map editing curriculum, they are able to make changes to the live map.
At the same time, TomTom follows a stringent vetting process to ensure all changes are of high quality. All edits undergo the same quality checks that are required of TomTom editors. These real-time quality checks also give Uber editors a level of comfort when making edits and ensure the most accurate map possible.
Also, teaming up with Uber doesn’t mean that TomTom will stop collecting data from its existing sources – survey vehicles, government agencies, TomTom map users, vehicle sensors, and more. Even in the MEP program, TomTom plans to partner with more global leaders in the mobility, on-demand, logistics, and automotive industries.
TomTom regularly processes close to two billion map changes per month, and the MEP program currently supports an additional three million partner edits to the map every month across more than 70 countries.
“In these fast-changing times, having maps that reflect real-road conditions is more important than ever,” Shashi Vaijayanthi Rangarajan, VP Customer Program Management at TomTom, says. “That’s been especially true in recent months, as protests across the US blocked streets, flash floods washed out roads in southern Europe, and the coronavirus pandemic closed borders worldwide.”
Michael Weiss-Malik, Director of Product, Maps and GSS at Uber, concurs, “Our partnership will help create even more dynamic mapping experiences in our global marketplace.”