A glimpse of the new Mapbox Vision SDK. (Image: Mapbox)
Right now, a lot of people are very excited about the future of technologies like AR, VR, AI, and autonomous vehicles. However, as I’ve written before, most of these technologies are relatively useless without contextual awareness. I have also written about the importance of image sensors and how they enable AI and autonomous systems to better understand the world around them. Combining location awareness and vision is incredibly difficult, and it is fundamentally what enables app developers to anchor digital assets in the real world for augmented reality. Currently, only two companies are capable of doing this: Google and Mapbox. Today I want to talk about the lesser-known of the two.
Mapbox announces new SDKs and partnerships
Mapbox has a leg up on Google in that it provides more flexible options for linking image sensors and contextual awareness. In just the last month, Mapbox announced numerous partnerships and initiatives to further improve location awareness. First, Mapbox announced a partnership with Arm, the world leader in mobile chip design, to implement its new Vision SDK. Mapbox claims the Vision SDK will fuse visual and location data to improve the accuracy and overall experience of AR. The Vision SDK is arguably one of the biggest announcements out of Mapbox in quite some time: it expands the company’s capabilities while also giving its developers more tools to work with when it comes to live location. It will help developers enable more robust AR in places like automotive navigation. The more developers build Mapbox’s platform into their applications, the more Mapbox will thrive.
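To make the "fusion of visual and location data" idea concrete, here is a toy sketch of the underlying concept. This is my own illustration, not Mapbox's algorithm or API: a one-dimensional complementary filter that blends smooth-but-drifting dead reckoning from visual odometry with noisy-but-absolute GPS fixes. All names here are hypothetical.

```python
# Toy sketch of visual/location fusion (NOT Mapbox's implementation):
# a 1-D complementary filter. Visual odometry (VO) gives smooth relative
# motion that drifts over time; GPS gives absolute fixes that are noisy.
# Blending the two yields an estimate that is both smooth and anchored.

def fuse(gps_fixes, vo_deltas, alpha=0.9):
    """Blend VO dead reckoning with GPS fixes.

    alpha weights the smooth-but-drifting VO prediction;
    (1 - alpha) pulls the estimate back toward the absolute GPS fix.
    Expects len(vo_deltas) == len(gps_fixes) - 1.
    """
    estimate = gps_fixes[0]          # initialize from the first GPS fix
    track = [estimate]
    for fix, delta in zip(gps_fixes[1:], vo_deltas):
        predicted = estimate + delta                  # dead reckoning step
        estimate = alpha * predicted + (1 - alpha) * fix  # correct toward GPS
        track.append(estimate)
    return track

# Ground truth moves +1.0 m per step; GPS readings jitter around it,
# while VO reports a clean +1.0 m displacement each step.
gps = [0.0, 1.4, 1.7, 3.3, 3.8]
vo = [1.0, 1.0, 1.0, 1.0]
print(fuse(gps, vo))
```

Real systems use far more sophisticated estimators (Kalman filters and visual-inertial odometry), but the design principle is the same: each sensor covers the other's weakness.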
Mapbox also announced the React Native AR SDK and the SceneKit SDK for iOS, two developer kits geared toward building AR experiences for mobile. These announcements have the potential to be game-changers. Most AR applications today behave very similarly to VR applications, confined to a single room or a single surface (like a table). Because of this, many consumers and businesses don’t find AR much more compelling than VR; both confine you to a certain space. The real world is where AR really shines, but only if applications can use live location correctly and with a reasonable amount of accuracy. I believe Mapbox’s live AR mapping capabilities will enable the next phase of world-scale AR apps, bringing AR much closer to realizing its full potential. Mapbox is investing heavily to enable AR virtually anywhere, which is why you see the company continuing to add SDK support and features that make world-scale AR easier to implement.
While Google and Mapbox offer similar capabilities, it’s worth noting that some of the biggest applications in the world run on Mapbox’s mapping platform, including Foursquare, Snapchat, Tinder, Uber, and many more that rely on map accuracy and live location. Mapbox’s status as a trustworthy, independent third party likely appeals to many of these companies. As more users become aware of how their data is used by companies like Google, they will likely become more concerned about how their location data is gathered, and by whom.
Ultimately, I believe these new SDKs from Mapbox will help usher in the next generation of AR, AI, and autonomous vehicle applications. The company’s flexibility and independent status make it an attractive option for developers wary of Google, and I think we’re going to see its platform integrated into more and more applications in the coming years. We’re only at the very beginning of what’s possible with this technology, and I look forward to seeing what’s next.