1.5 Million Walks, Runs, and Bike Rides from RunKeeper mapped on Mapbox
Two Mapbox developers, Garrett Miller and Eric Fischer, worked on a cool project. They pulled data from RunKeeper, a jogging app that recently partnered with Mapbox, and visualized 1.5 million walks, runs, and bike rides in cities all over the world.
The effect is really cool. The colors of the tracks represent the length of each track. Shorter workouts range from cool blue to hot pink, mid-length routes glow white, and longer routes over 25 km appear yellow. Of course, some cities have more data available than others. Nonetheless, there are several patterns common to most urban spaces, e.g. people like to run by the water.
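The length-based color ramp described above can be sketched as a simple piecewise interpolation. This is a hypothetical illustration: the thresholds and exact RGB values are assumptions based on the description, not Mapbox's actual palette.

```python
# Hypothetical sketch of a length -> color ramp: short tracks fade from
# cool blue to hot pink, mid-length tracks fade toward glowing white,
# and routes over 25 km are yellow. All values are illustrative.

def lerp(a, b, t):
    """Linearly interpolate between two RGB triples, t in [0, 1]."""
    return tuple(round(a[i] + (b[i] - a[i]) * t) for i in range(3))

BLUE = (0, 90, 255)
PINK = (255, 20, 147)
WHITE = (255, 255, 255)
YELLOW = (255, 220, 0)

def length_to_color(km):
    if km < 5:                       # short workouts: cool blue -> hot pink
        return lerp(BLUE, PINK, km / 5)
    if km < 25:                      # mid-length: fade toward glowing white
        return lerp(PINK, WHITE, (km - 5) / 20)
    return YELLOW                    # long routes over 25 km

print(length_to_color(2.0))
print(length_to_color(30.0))  # yellow
```

Rendering millions of tracks then comes down to drawing each polyline in its computed color, with additive blending so overlapping routes glow brighter.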
We’ve seen a similar project from Strava, which launched a commercial platform called Strava Metro, offering similar data designed for urban planners. Check out this video to learn how the platform has been used in Portland:
Side by side, the Mapbox and Strava platforms are similar. What I like about Mapbox is that it visualizes additional data about the length of each trip. Strava, on the other hand, allows for much more detailed zoom than Mapbox. In the end, the most important thing is the volume of the data, and I guess that varies from city to city.
Apple patents laser beam mapping for iPhone
Apple has recently patented a technology which might change the way we measure and map our surroundings. It is based on a laser that would be mounted inside the iPhone and used together with the built-in motion sensors to generate a map of any surface it is pointed at.
Using laser technology would allow measuring the distance from the device to an object. Combined with precise indoor/outdoor positioning technology, it could allow for creating 3D maps of rooms and even buildings, something that currently requires land surveying equipment.
The technology would gather single points, more like laser distance-measuring devices such as the Disto than the point clouds we know from Lidar, which makes sense because it offers much more flexibility and requires less computing power.
The technology itself is not very complex and has been in use for a while already. The main challenge is developing an app which would process the data from all the sensors, generate a usable 3D model, and possibly also combine it with a camera image.
It seems that this patent is an answer to Google’s Project Tango, which launched earlier this year.
source: Patently Apple