Google Maps turns 15 with new logo, updates, and reflections from C-suite
15 years ago, Google Maps started out with a simple but ambitious goal: to map the entire world. At the time, getting step-by-step directions to wherever you were headed seemed revolutionary. Today, more than 1 billion people rely on Google Maps not just to navigate from point A to point B, but also to shave hours off their commutes, explore the world, and discover local businesses. As Google Maps turns 15 on February 8, 2020, it is getting a whole new look and a host of product updates.
The revamped app is organized into five easy-to-access tabs – Explore, Commute, Saved, Contribute and Updates. The ‘Explore’ tab is home to information, ratings, and reviews for some 200 million places around the world, including restaurants and tourist attractions. ‘Commute’ is where you get real-time traffic updates, travel times, and suggestions for alternative routes. The ‘Saved’ tab lets you view all the spots you have saved on Google Maps in one convenient place, while the new ‘Contribute’ tab lets you easily share local knowledge, add missing places, and post business reviews and photos. ‘Updates’, meanwhile, will keep you engaged with a steady stream of trending, must-see spots from local experts and publishers.
To commemorate this momentous occasion, several top leaders at Google have also shared their reflections on Google’s geo efforts. For Elizabeth Reid, VP of Engineering, Google Maps, watching Maps grow into what it is today has been an unbelievable experience. “As we’ve added features and capabilities, Google Maps has evolved into much more than a website that gives you turn-by-turn directions. Today, it’s a gateway to exploring the world—both digitally and in real life, on foot or by car, via public transit or a wheelchair. Pardon the pun, but it’s been a long road!” Reid smiles.
And while Google CEO Sundar Pichai cannot stop crediting Google Maps for helping him discover some great burrito places, he is quick to acknowledge how useful the technology can be in places like India, where the address system doesn’t always follow a clear structure.
“Not only do maps make it easier to get around; they also can give you a sense of identity when you see your street on the map for the first time. That was one of the revelations of MapMaker. Launched by two Google engineers in 2008, it was a way for people to add streets and local landmarks to improve the experience of Google Maps, starting in India. It quickly evolved to help map floods in the Philippines and Pakistan, and later to allow people in the US to add a new road to their neighborhood. Its legacy continues today with Local Guides,” says Pichai, adding that one of the next frontiers for Maps will be to help the billions of people who live without a physical address get a digital one via Plus Codes.
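The Plus Codes Pichai refers to are built on Open Location Code, Google’s open scheme that encodes a latitude/longitude pair into a short base-20 string using a 20-symbol alphabet. Below is a minimal sketch of the core idea, assuming the standard 10-digit global code; the official open-location-code libraries additionally handle code shortening, padding, and validation, which this sketch omits.

```python
# Minimal sketch of Plus Code (Open Location Code) encoding/decoding.
# Covers only full 10-digit global codes; not a replacement for the
# official open-location-code libraries.
ALPHABET = "23456789CFGHJMPQRVWX"  # the 20-symbol Plus Code alphabet

def encode(lat, lng):
    """Encode a latitude/longitude into a 10-digit Plus Code like 7J4VXHCV+JR."""
    # Shift coordinates into positive ranges: 0..180 (lat) and 0..360 (lng).
    lat = min(max(lat + 90.0, 0.0), 180.0 - 1e-9)
    lng = (lng + 180.0) % 360.0
    digits = []
    place = 20.0  # degrees of latitude/longitude covered by the first digit pair
    for _ in range(5):
        d = int(lat // place)
        digits.append(ALPHABET[d])
        lat -= d * place
        d = int(lng // place)
        digits.append(ALPHABET[d])
        lng -= d * place
        place /= 20.0  # each successive pair refines the cell 20x
    code = "".join(digits)
    return code[:8] + "+" + code[8:]

def decode(code):
    """Return the south-west corner of the cell named by a 10-digit code."""
    digits = code.replace("+", "")
    lat = lng = 0.0
    place = 20.0
    for i in range(0, 10, 2):
        lat += ALPHABET.index(digits[i]) * place
        lng += ALPHABET.index(digits[i + 1]) * place
        place /= 20.0
    return lat - 90.0, lng - 180.0
```

A 10-digit code pins a location down to a cell roughly 14 by 14 meters at the equator, which is why it can stand in for a street address where none exists.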
Jen Fitzpatrick, Senior Vice President, Google Maps, points to the role of new technologies like machine learning and artificial intelligence in propelling the future of Maps. “In Lagos, Nigeria alone, machine learning has helped us add 20,000 street names, 50,000 addresses, and 100,000 new businesses—lighting up the map with local places and businesses where there once was little detailed information.”
Maintaining that a truly helpful map reflects local insights and helps users find places and experiences that are right for them – instead of just labeling streets and addresses – Fitzpatrick sums up: “When we set out to map the world, we knew it would be a challenge. But 15 years in, I’m still in awe of what a gargantuan task it is. It requires building and curating an understanding of everything there is to know about the physical world, and then bringing that information to people in a way that helps you navigate, explore and get things done in your world. The real world is infinitely detailed and always changing, so our work of reflecting it back to you is never done.”
Why Descartes Labs is convinced it can make geospatial AI your core business competency
As a spatial analytics firm that uses machine learning and artificial intelligence to detect objects and patterns hidden inside petabytes of remotely sensed data, Descartes Labs has been creating custom geospatial AI solutions for global enterprises for quite some time. But now, the Santa Fe, New Mexico-based company is making its platform and tools publicly available so even non-traditional users can leverage geospatial AI as a core business competency.
The Descartes Labs Platform has been designed to handle nearly all geospatial modeling functions through a single cloud-based solution. It is made up of three components:
- Data Refinery: Hosts petabytes of analysis-ready geospatial data, with the ability to rapidly ingest, clean, calibrate, and draw on any internal or third-party data source
- Workbench: A cloud-based data science environment that combines the Descartes Labs Platform APIs, visualization tools, and a model repository with a hosted JupyterLab interface
- Applications: Give users the ability to rapidly deploy models and applications
The company believes that its solution can prove especially critical for commodity-focused companies facing sustainability and efficiency challenges. It would allow organizations to quickly evaluate model output, speed development and proof-of-concept creation, and transform their decision-making, ultimately saving them millions of dollars in time and resources.
To understand how the platform will unlock the power of global-scale geospatial data for organizations that have never been able to access it before, we talked to Sam Skillman, head of engineering at Descartes Labs.
What are some of the challenges that you strived to solve while building custom geospatial AI solutions for some of your early customers? How have those lessons been incorporated into the Descartes Labs Platform?
Due to the complex nature of geospatial data, many businesses have hit roadblocks when trying to apply geospatial AI at an enterprise-scale on their own. While this data adds invaluable insights not possible from other sources, it can also be challenging to acquire and work with.
It has traditionally been very hard to find, access, and utilize remote sensing data from satellites or aerial sensors, especially at a large scale and across multiple data sources. The Data Refinery component of the Descartes Labs Platform makes data aggregation, normalization, and filtering easy for customers.
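As a generic illustration of the kind of normalization step such a refinery automates (this is not the Descartes Labs API, and the scale factor is a hypothetical sensor-specific constant), here is how raw satellite digital numbers for two bands might be turned into an analysis-ready vegetation index:

```python
import numpy as np

# Hypothetical DN-to-reflectance scale factor; real sensors publish their own.
DEFAULT_SCALE = 1e-4

def ndvi(red_dn, nir_dn, scale=DEFAULT_SCALE):
    """Normalized Difference Vegetation Index from raw red/NIR band arrays.

    Converts raw digital numbers (DN) to reflectance via a linear scale,
    then computes (NIR - red) / (NIR + red), guarding against divide-by-zero.
    """
    red = red_dn.astype(np.float64) * scale
    nir = nir_dn.astype(np.float64) * scale
    with np.errstate(divide="ignore", invalid="ignore"):
        out = (nir - red) / (nir + red)
    # Pixels with zero signal in both bands become 0 instead of NaN.
    return np.clip(np.nan_to_num(out), -1.0, 1.0)
```

In practice a refinery repeats this pattern – ingest, rescale, mask, index – across every source and revisit, which is exactly the drudgery Skillman says the platform takes off customers’ hands.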
It can also be difficult for data scientists to build robust modeling environments that interact quickly and easily with storage and compute resources. We’ve made a lot of progress in solving this by building a hosted modeling environment using JupyterLab that sits next to the Descartes Labs Platform APIs. Having the data readily available in the Data Refinery, with quick and easy access to the modeling environment in Workbench, all powered by the best cloud compute and storage resources, makes Descartes Labs a trusted vendor that can solve business problems for companies of all sizes.
We developed the Descartes Labs Platform to give data scientists and their line-of-business colleagues the best geospatial data and modeling tools in one complete package, something no other company has successfully done before. The platform is the first-ever comprehensive cloud-based geospatial analytics platform that provides the data, scalable compute, and modeling tools all in one solution that helps turn AI into a core competency. As a result, data scientists are able to rapidly model using data that is already cleaned, processed and ready for use, all the while having the flexibility to add their own data and ultimately transform their decision-making capabilities.
Our aim when building the platform was to develop custom geospatial AI solutions for major businesses, proving out multiple use cases for the platform technology. By enabling the rapid and collaborative development of global-scale commodity analytics, forecasts, monitoring, and exploration capabilities, we are able to help enterprises understand the physical world at scale in near real-time.
Give us a real-world example of how an industry could benefit from your platform.
The drive for continuous improvement in physical forecasting and prediction has become a primary goal for producers and consumers across the full spectrum of global trade. This is especially true in 2020, as companies face more rigorous sustainability and efficiency challenges across input materials, manufacturing, and downstream product use. In many cases, sustainability and efficiency are two sides of the same coin. Companies often find it difficult to generate revenue and improve margins without making progress on both fronts.
Through geospatial data and predictive analytics, the platform provides transparency into supply chains and the larger market, supporting key business decisions. The modeling tool enables forecasting capabilities across industries, including agriculture, energy, sustainability, mining, shipping, financial services, and insurance, to facilitate everything from agricultural monitoring to mineral exploration. This is especially critical for commodity-focused companies facing sustainability and efficiency challenges, saving them millions of dollars by transforming the business quickly and cost-effectively.
We have energy customers using the platform today to solve all sorts of business problems. These might range from something as simple as having one system to rapidly aggregate and serve geospatial datasets to multiple users and applications, to specific workflows like pipeline encroachment monitoring, well pad detection, or methane monitoring. The combination of source data across different resolutions and timescales, easy-to-use modeling tools, and powerful storage and compute means that energy users can start building production models quickly, without requiring as much domain-specific expertise or cloud computing experience.
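The monitoring workflows Skillman mentions generally reduce to change detection between revisits of the same scene. The sketch below is a deliberately simplified stand-in for those workflows (Descartes Labs’ production models are trained classifiers, not fixed thresholds): it flags pixels in a co-registered before/after index raster whose value shifted beyond a threshold, and reports the changed fraction as an alerting metric.

```python
import numpy as np

def changed_pixels(before, after, threshold=0.2):
    """Boolean mask of pixels whose index value shifted more than `threshold`.

    `before` and `after` are co-registered rasters of the same shape,
    e.g. NDVI from two acquisition dates. The 0.2 default is illustrative.
    """
    diff = np.abs(after.astype(np.float64) - before.astype(np.float64))
    return diff > threshold

def change_fraction(before, after, threshold=0.2):
    """Fraction of the scene flagged as changed; a simple alerting metric."""
    return float(changed_pixels(before, after, threshold).mean())
```

A pipeline-encroachment or well-pad monitor would run something like this over each new acquisition and raise an alert when the changed fraction inside a buffer zone exceeds an operator-set limit.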
Let’s talk about the trends driving the geospatial AI revolution…
Natural resources and supply chain management have become more global, competitive, and increasingly subject to sustainable investing standards. The rise of climate-related financial risk investing is changing incentives for public companies. At the same time, supply and demand shocks are being priced in more rapidly and volatility is lower in today’s highly connected global environment. Adverse weather and changing environmental conditions make the picture even more opaque. They pose risks to physical assets and increase the need for more accurate assessments of insured value. Given the many gaps in the global picture, these uncertainties are magnified when it comes to the impact on your market, the value of your products, your costs, and the risks to your bottom line. Most critically, the window to decide and act is dramatically shorter today than even just a few years ago. The market is learning faster and faster.
When combined with the incredible growth and variety of remote sensing instruments being launched or deployed, there is tremendous opportunity to combine very disparate datasets to answer some of today’s most important questions.
Geospatial AI can help resolve these challenges, and so has become more of a requirement for commodity-focused companies. Especially as many face ongoing sustainability and efficiency challenges, it’s becoming harder and harder to improve margins without this capability. In this context, we equate geospatial AI with the prediction of phenomena in the physical world.