Google is calculating tree coverage from aerial imagery it collects for Maps
With extreme temperatures becoming common in urban jungles, policymakers are increasingly looking to increase shade on warming city streets. However, a well-planned (and hence more effective) tree-planting effort requires budget and resources that few cities can spare, especially during an unprecedented pandemic. Google wants to change that through its Environmental Insights Explorer platform, a tool that makes it easier for cities to measure, plan, and reduce carbon emissions.
Launching the new ‘Tree Canopy Lab’ initiative with the City of Los Angeles in the United States, the technology giant is leveraging Google AI and Google Earth Engine’s data analysis capabilities to pinpoint all the trees in a neighborhood and measure their density.
A specialized tree-detection AI has been developed to automatically scan the aerial imagery that Google airplanes capture during the spring, summer, and fall seasons. The algorithm then detects the presence of trees and produces a map that shows the density of the tree cover, also known as ‘tree canopy.’
Google explains that the imagery is analyzed in combination with 3D digital surface models to generate a vegetation probability model. Any values above a certain threshold are classified as a tree canopy. The analysis is also said to include near-infrared data, when available.
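Google has not published the pipeline’s code, but the thresholding step it describes can be sketched roughly as follows. The probability values, the 0.5 cutoff, and the tiny raster are all illustrative assumptions, not Google’s actual data:

```python
import numpy as np

# Illustrative vegetation-probability raster (values in [0, 1]).
# In the real pipeline these probabilities would come from a model
# fusing aerial imagery, 3D surface data, and near-infrared bands.
vegetation_prob = np.array([
    [0.92, 0.15, 0.40],
    [0.75, 0.60, 0.05],
    [0.10, 0.88, 0.51],
])

THRESHOLD = 0.5  # assumed cutoff; Google does not disclose its value

# Pixels above the threshold are classified as tree canopy.
canopy_mask = vegetation_prob > THRESHOLD

# Canopy coverage for this tile = fraction of canopy pixels.
coverage = canopy_mask.mean()
print(f"Canopy coverage: {coverage:.0%}")  # -> Canopy coverage: 56%
```

Aggregating such per-tile coverage fractions over a neighborhood is what yields the density maps the article describes.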
Cities can use tree coverage insights in multiple ways, such as comparing tree coverage by neighborhoods and identifying residential blocks with high tree planting potential. Policymakers can learn about the tree coverage, average heat health index, and average population density per area and locate sidewalks that are vulnerable to higher temperatures due to low canopy coverage. Google is also equipping Tree Canopy Lab with pre-calculated visualizations to help city officials understand how heat and population density correlate with tree coverage.
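The kind of cross-referencing described above, combining canopy coverage with heat and population metrics to rank planting candidates, might look like this in miniature. All neighborhood names, numbers, and cutoffs here are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical per-neighborhood metrics (all values illustrative).
neighborhoods = [
    {"name": "A", "canopy_pct": 8.0,  "heat_index": 0.9, "pop_density": 12000},
    {"name": "B", "canopy_pct": 25.0, "heat_index": 0.4, "pop_density": 4000},
    {"name": "C", "canopy_pct": 6.5,  "heat_index": 0.8, "pop_density": 9000},
]

# Flag areas with low canopy but high heat risk: candidates
# for tree-planting priority.
priority = [
    n["name"] for n in neighborhoods
    if n["canopy_pct"] < 10 and n["heat_index"] > 0.7
]
print(priority)  # -> ['A', 'C']
```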
Google’s analysis in Los Angeles, for example, has found that more than 50 percent of Angelenos live in areas with less than 10 percent tree canopy coverage, and 44 percent of the city’s population lives in areas with extreme heat risk.
As Eric Garcetti, Mayor of Los Angeles, says, “Every tree we plant can help stem the tide of the climate crisis, and when we expand our urban forest, we can sow the seeds of a healthier, more sustainable and equitable future for communities hit hardest by rising temperatures and intensifying heatwaves. Google’s technology will help us bring the power of trees to families and households across Los Angeles – adding greenery to our public spaces, injecting beauty into our city, and bringing cooler temperatures to our neighborhoods.”
Google plans to make Tree Canopy Lab available to hundreds of cities in 2021.
Velodyne eyes improved roadway safety with sub-$500 LiDAR sensor
In 2007, Velodyne charged almost $80,000 for each LiDAR unit it developed to advance safety in self-driving vehicles. Today, the company is preparing to offer one for less than $500.
With mass manufacturing expected to be underway by the second half of 2021, Velodyne’s solid-state Velarray H800 LiDAR sensor is architected for automotive-grade performance. Focusing on safe navigation and collision avoidance in ADAS and autonomous mobility applications, the sensor has been built using Velodyne’s breakthrough proprietary micro-lidar array architecture (MLA). It is compact enough to fit neatly behind the windshield of a vehicle or be mounted seamlessly on the automobile’s exterior.
Velodyne Lidar CEO Anand Gopalan acknowledges that the sensor comes in response to the need to deliver a self-driving mapping product at a price that makes economic sense for automakers. “The world needs enhanced safety in consumer vehicles and the Velarray product line makes that available to end consumers creating safer roadways and cars for all,” Gopalan stresses.
More specifically, Velodyne’s new LiDAR can support advancements in autonomy and advanced driver assistance systems (ADAS) from Level 2 to Level 5. The Velarray H800 boasts a 120-degree horizontal by 16-degree vertical field of view (FOV) and provides perception data at a range of up to 200 meters, which means it can support safe stopping distances even at highway speeds. The company also assures that the sensor will offer the rich point cloud density required for high-resolution mapping and object classification tasks.
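A back-of-the-envelope calculation shows why a 200-meter range is enough headroom for highway stopping distances. The speed, reaction time, and braking deceleration below are illustrative assumptions, not Velodyne figures:

```python
# Does a 200 m sensing range cover highway stopping distances?
SPEED_KMH = 120          # assumed typical highway speed
REACTION_S = 1.5         # assumed driver/system reaction time
DECEL = 7.0              # m/s^2, assumed firm braking on dry pavement

v = SPEED_KMH / 3.6                           # speed in m/s
stopping = v * REACTION_S + v**2 / (2 * DECEL)  # reaction + braking distance
print(f"Stopping distance: {stopping:.0f} m (sensor range: 200 m)")
# -> Stopping distance: 129 m (sensor range: 200 m)
```

Under these assumptions the vehicle needs roughly 130 meters to stop, comfortably inside the H800’s stated 200-meter range.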
“We want to help build the public’s trust in automated vehicle systems,” says Marta Hall, Velodyne founder and CMO. “Velodyne is introducing [the Velarray H800] as a key sensor for systems designed as building blocks for vehicle safety. Once the public experiences the benefits of reliable automated safety systems, they will welcome more products like this.”
The Velarray H800 is Velodyne’s first new sensor to be launched since the company went public earlier this year.