Massive Autonomous Vehicle Sensor Data – What Does It Mean?
Ford has reported that connected-car sensors generate 25 gigabytes of data per hour; the WSJ that a typical autonomous vehicle generates 4 terabytes in 90 minutes; and Intel, 45 terabits per hour. All are massive numbers, but why so different? The discrepancy points to a bandwidth problem. The raw data is beyond any auto OEM's ability to manage, even with 5G, so the reported amounts may reflect raw data in some cases and prioritized subsets in others, namely the data that supports core vehicle diagnostics and operating-behavior improvements.
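A rough back-of-the-envelope conversion makes the spread concrete. (The unit conventions here, decimal SI with 1 TB = 1,000 GB and 8 bits per byte, are my assumption; the original reports don't specify them.)

```python
# Normalize the three reported figures to gigabytes per hour
# so they can be compared directly.
ford_gb_per_hr = 25                  # Ford: 25 GB per hour
wsj_gb_per_hr = 4 * 1000 / 1.5       # WSJ: 4 TB per 90 minutes
intel_gb_per_hr = 45 * 1000 / 8      # Intel: 45 terabits per hour

for label, rate in [("Ford", ford_gb_per_hr),
                    ("WSJ", wsj_gb_per_hr),
                    ("Intel", intel_gb_per_hr)]:
    print(f"{label}: {rate:,.0f} GB/hour")
```

Even on a common scale, the figures span more than two orders of magnitude, from 25 GB/hour to over 5,600 GB/hour, which is consistent with some numbers describing raw sensor output and others describing filtered or prioritized streams.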
One mitigating factor is local bandwidth. In-vehicle computing offers the potential to share the computational workload. Sensor systems like Mobileye's onboard camera integrate sophisticated machine-learning vision algorithms, and lidar innovator Innoviz is building a software stack with a similar strategy. One can also imagine a future in which non-critical, near-redundant data capture is efficiently crowdsourced.
Nevertheless, today's vehicles carry more than 100 sensors, and the auto sensor market is expected to grow over 100% per year, reaching nearly 200 million units by 2021. Advances in the more powerful sensors (camera, lidar) will produce richer data and require more bandwidth. Real-time updating of the vehicle's model of the physical world, and sensor fusion to collate and act on that data, will add further bandwidth demands. Taken together, we can be sure bandwidth will remain a challenge for OEMs.
So, what does it mean for the OEM?
In an autonomous-vehicle future, OEMs will differentiate on onboard data processing. OEMs become vehicle operators and software companies, and their success metrics will lean on navigation performance and incident rates. Theoretically, as software companies with massive real-time data, they can choose how to split data processing between vehicle operating behavior and the environment beyond the vehicle. Practically, OEMs will be compelled to prioritize vehicle operating behavior.
So, what does it mean for real-time third-party applications?
Even strong third-party applications may be deprioritized. Weather is a good example. Weather modeling today is coarse and heavily interpolated. With onboard environmental and weather sensors, the ability to mine highly detailed, real-time data would enable ultra-precise weather models of interest to utilities, infrastructure companies, and smart cities. Interestingly, Continental has been testing vehicle-swarm crowdsourcing to capture hyper-local weather sensor data.
So, what does it mean for AV technology companies?
While AV technology companies will continue to innovate and differentiate based on performance and system architecture, they will also increasingly be tested on bandwidth consumption.
Waymo and Lyft Team Up on Self-Driving Car Tech
The ever-changing Self-Driving Car landscape!
Waymo and Lyft have a common competitor in Uber, and that is a strong reason for them to partner (today). That, of course, doesn't mean that Waymo doesn't harbor plans for its own ride-hailing service. The company currently offers an Early Rider program for residents of the Phoenix metropolitan area, Arizona, USA, to try its self-driving cars and provide the Waymo team with feedback.
Interestingly, Google filed a patent detailing how self-driving cars will determine pick-up and drop-off locations. The patent was filed back in 2015, before Waymo existed as a separate company, so it's quite possible that it belongs to Waymo today.
The partnership with Lyft is a step towards Waymo bringing its self-driving car tech to market sooner, and potentially towards a deal down the road with General Motors, a major investor in Lyft.
It will be interesting to learn the details of the deal as more information emerges, but the self-driving car landscape is changing every day. Interesting times!