NVIDIA Reinforces Automotive AI Lead With Uber And Volkswagen Announcements At CES 2018

Under the chaos and fog that is the Consumer Electronics Show (CES) this week in Las Vegas, few companies break through the noise and stand out. CES is the preeminent show for consumer technology vendors. This year the primary focus of this mega conference is on advancements in robotics (machine intelligence), drones, medical devices, gaming, 3D printing, sports technologies, and especially autonomous/self-driving vehicles. One of the companies that enables much of the innovation on display at CES is NVIDIA. Its graphics processing unit (GPU) chips are at the core of many machine learning and artificial intelligence (AI) based solutions – effectively, applications that can see, hear, smell, and learn like humans.

NVIDIA has always been at the forefront of machine learning. As difficult as it is to stand out at CES, NVIDIA is making several announcements this week that I believe are sure to turn heads and rise above the noise. I'd like to walk through the three announcements and provide insight into what they mean for the company and the industry.

NVIDIA and Volkswagen are integrating AI to enhance co-pilot capabilities

By infusing AI into its cars, Volkswagen can deploy automobiles with improved convenience and assistance systems, such as AI-based voice assistance, facial recognition, and gaze tracking that alerts drivers to distractions, while leveraging real-time data from sensors inside and outside the vehicle to help drivers make better driving decisions.

This announcement by NVIDIA is similar to Intel's November 2017 partnership with Warner Bros. to provide an intelligent, immersive experience for drivers using AI and augmented reality. Further, Volkswagen is investing six billion euros in electromobility to ensure the company has a leadership seat at the table – an investment that will, at a minimum, be matched by its competitors. In my opinion, the market for immersive automobile experiences is nascent, but investment in advanced automotive features and functions has always been how manufacturers differentiate themselves. I anticipate that as the war for self-driving cars heats up, the real battle will be for improved and immersive experiences for both the driver and the passenger. Co-pilot-infused vehicles will only enhance that experience.

Uber will power its self-driving cars and trucks using NVIDIA's AI technology

Since 2015, Uber has deployed dozens of self-driving (Level 3–4) trials across the country. Uber's self-driving vehicles have logged over two million miles and provided over 50,000 rides.

This announcement makes a ton of sense. Even though Uber has stumbled from a security and privacy perspective, it is leading much of the innovation in the autonomous vehicle (AV) marketplace. Uber has traditionally been very secretive about the technologies it leverages to deploy its autonomous fleet, so this announcement is significant. I believe NVIDIA lends credibility to Uber by bolstering safety, reliability, and especially scalability as Uber deploys its army of self-driving vehicles. Although Uber's AV strategy is still a novelty, it is on the fast track to becoming a reality. The real challenge will be winning not just the technology race but the user experience race as well. Thankfully, Intel and NVIDIA are enabling the innovation while automotive companies and manufacturing firms like Jabil Inc. are enhancing the user experience (UX).

As part of NVIDIA’s DRIVE PX Pegasus AI platform, Xavier processors will be delivered to customers in Q1 2018

Xavier is the result of a reported $2 billion NVIDIA investment to expand processing power and capabilities across the AV marketplace, from auto cruise to fully autonomous vehicles and robotaxis. NVIDIA’s DRIVE PX can pull and process information from cameras, LIDAR, radar, and other sensors to build a holistic view of the environment around the AV. Xavier has the processing power to apply deep learning and artificial intelligence to improve HD mapping, vehicle and pedestrian tracking, and decision-making, even when the car is not driving itself.
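To make the sensor-fusion idea concrete, here is a deliberately simplified, hypothetical sketch of how detections from multiple sensors might be merged into a single estimate per tracked object. This is not NVIDIA's implementation; the class, the confidence-weighting scheme, and the sample readings are all invented for illustration.

```python
# Hypothetical sketch of multi-sensor fusion: each sensor reports a
# distance estimate with its own confidence, and the fusion step merges
# them into one confidence-weighted view of the scene. All names and
# numbers below are invented for illustration.
from dataclasses import dataclass


@dataclass
class Detection:
    sensor: str        # e.g. "camera", "lidar", "radar"
    obj_id: int        # track ID of the object seen
    distance_m: float  # estimated distance to the object, meters
    confidence: float  # sensor's confidence in the estimate, 0.0-1.0


def fuse(detections):
    """Merge per-sensor detections into one distance estimate per object."""
    by_object = {}
    for d in detections:
        by_object.setdefault(d.obj_id, []).append(d)
    fused = {}
    for obj_id, dets in by_object.items():
        total = sum(d.confidence for d in dets)
        # Confidence-weighted average of the per-sensor distance estimates.
        fused[obj_id] = sum(d.distance_m * d.confidence for d in dets) / total
    return fused


readings = [
    Detection("camera", 1, 42.0, 0.6),
    Detection("lidar",  1, 40.0, 0.9),
    Detection("radar",  1, 41.0, 0.8),
]
print(fuse(readings))  # one fused distance estimate for object 1
```

In a real AV stack the fusion step is far more sophisticated (Kalman filters, occupancy grids, learned models), but the principle is the same: no single sensor is trusted alone, and the combined estimate is more robust than any individual reading.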

Improved innovation and faster processing are going to be the accelerant that gets us to Level 5, fully autonomous vehicles. While Xavier potentially represents a step change for NVIDIA’s DRIVE PX platform, the messaging and roll-out feel a bit muddled. Given the advertised leap in performance, NVIDIA could have led with use cases from the 25 companies using Pegasus to bring production-ready vehicles to market, and with why the Pegasus AI platform is the right choice and direction.

The AV marketplace is driving much of the innovation and excitement in the technology world. Face it: we use our cars for communications, entertainment, information, and sometimes even transportation. The AV market represents millions of small mobile data centers that require petabytes of data management and backhaul per day, faster-than-real-time processing of information, and algorithms that can predict and learn our behaviors more quickly than we can ourselves. NVIDIA is providing much of the thought leadership and innovation in this disruptive marketplace. Based on what I have seen at CES and from NVIDIA thus far, it is going to be a fun ride.

Note: This blog contains contributions by Patrick Moorhead, principal analyst at Moor Insights & Strategy