Some folks say that NVIDIA has lost its early pole position in autonomous driving. Tesla de-selected NVIDIA to power its autonomous vehicle computing, and BMW picked Intel to be its technology supplier for autonomous driving systems. However, considering the company NVIDIA keeps and its wide array of technology, expertise, and services, I wouldn’t count it out.
In fact, NVIDIA invests heavily in its autonomous vehicle platforms, offering a complete suite of the hardware and software needed to develop and run AI in the car. NVIDIA recently announced the availability of its next-generation DRIVE AGX platform and the DRIVE Constellation simulator. To counter the bad news from Tesla, the company is garnering support from companies such as Daimler, Bosch, Continental, Audi, Toyota, and VW.
However, if you think this story ends with NVIDIA trying to sell AI platforms to automakers, you might be surprised to learn about the infrastructure the company is building to help develop and test driving AI models. In addition to fast AI chips, NVIDIA is providing its clients with a cloud platform to train and test AI models, built around a "software-defined car" to accelerate the development and testing of safe autonomous vehicles. NVIDIA's investment here is impressive, with thousands of engineers and thousands of dedicated GPUs already in place. The obstacles a safe autonomous vehicle must overcome are staggering, and NVIDIA is applying AI, VR, and supercomputing to get to market first with as many auto companies as possible.
NVIDIA’s autonomous vehicle portfolio of technology and services includes DRIVE AGX on-board computers, DRIVE Constellation driving simulation software, and Project MagLev.
NVIDIA DRIVE AGX Platforms
DRIVE AGX comes in two flavors: DRIVE AGX Xavier, targeting Level 2+ to Level 4, and the higher-performance DRIVE AGX Pegasus for Level 5 robotaxis and driverless shuttles. As I covered in this article, Xavier is an SoC that integrates sensor fusion and processing, vehicle localization, and path planning into a 30-watt package delivering 30 trillion operations per second (TOPS). With an eye on eventual Level 5 autonomous driving, NVIDIA adds two next-generation Tensor Core GPUs in the Pegasus version, which delivers 320 TOPS. To help customers develop their autonomous vehicle systems, NVIDIA recently announced development kits for both the Xavier and Pegasus platforms, adding sensor cabling and DRIVE Software 1.0 to both AGX systems.
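As a quick back-of-the-envelope check on those figures (my own arithmetic from the numbers quoted above, not an NVIDIA specification):

```python
# Rough comparison of the quoted DRIVE AGX performance figures.
xavier_tops = 30    # DRIVE AGX Xavier: 30 TOPS...
xavier_watts = 30   # ...in a 30-watt package
pegasus_tops = 320  # DRIVE AGX Pegasus (Pegasus wattage is not stated in the article)

print(xavier_tops / xavier_watts)   # Xavier efficiency, in TOPS per watt
print(pegasus_tops / xavier_tops)   # Pegasus vs. Xavier raw throughput ratio
```

That works out to 1 TOPS per watt for Xavier, with Pegasus offering roughly ten times Xavier's raw throughput.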
Figure 1: DRIVE AGX Developer kits are available from NVIDIA on October 1. NVIDIA
To understand the problem NVIDIA is trying to help its customers solve, imagine trying to test a (real) smart vehicle’s behavior in a myriad of driving conditions (sun, rain, snow, twilight, night, backlit, etc.). Then factor in the thousands of possible impediments it could encounter (bikes, motorcycles, trikes, trains, pedestrians, stop signs, potholes, black ice, traffic circles, etc.), on billions of miles of roads. It just isn’t remotely possible to test all of this with real vehicles. Automakers are building their own road databases and tools to support the development and training of these autonomous vehicles. Meanwhile, NVIDIA is paralleling those efforts and offering a comprehensive simulation platform to the automakers. By enlisting NVIDIA to help test their models and control systems, auto companies will be able to focus more on their vehicles’ designs and functions.
As I covered here in more depth, DRIVE Constellation is a simulation platform for autonomous vehicle software development and testing. In short, DRIVE Constellation is a closed-loop drive-sim platform in which one server uses photo-realistic imaging to create the sensor output of an autonomous vehicle. It then sends the sensor data to the DRIVE AGX Pegasus platform to navigate and virtually “drive” the vehicle. At that point, the DRIVE AGX system sends the control output back to the first server to update the software-defined car, all at 30 frames per second. This allows NVIDIA to test thousands of virtual vehicles on billions of miles of virtual highways.
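The closed-loop cycle described above can be sketched in Python. Every name below is a hypothetical illustration of the sensor → AV stack → control → world-update loop, not NVIDIA's actual APIs:

```python
# Hypothetical sketch of a closed-loop drive simulation running at 30 Hz.
# The real system renders photo-realistic sensor data on one server and
# runs the AV stack on a DRIVE AGX Pegasus; here both are toy placeholders.

from dataclasses import dataclass

FRAME_RATE_HZ = 30

@dataclass
class SensorFrame:
    time_s: float
    camera: str   # stand-ins for rendered camera / RADAR / LIDAR data
    radar: str
    lidar: str

@dataclass
class ControlOutput:
    steering: float  # radians
    throttle: float  # 0..1

def render_sensors(world_state: dict, t: float) -> SensorFrame:
    """Simulation server side: photo-realistic rendering stands in here."""
    return SensorFrame(t, camera=f"img@{t:.3f}", radar="...", lidar="...")

def drive_agx(frame: SensorFrame) -> ControlOutput:
    """AV stack side: perception, planning, and control collapsed into
    a trivial placeholder that always drives gently forward."""
    return ControlOutput(steering=0.0, throttle=0.3)

def apply_controls(world_state: dict, ctrl: ControlOutput, dt: float) -> dict:
    """Update the software-defined car with the returned controls."""
    world_state["position_m"] += ctrl.throttle * 10.0 * dt  # toy dynamics
    return world_state

def run_sim(seconds: float) -> dict:
    world = {"position_m": 0.0}
    dt = 1.0 / FRAME_RATE_HZ
    for i in range(int(seconds * FRAME_RATE_HZ)):
        frame = render_sensors(world, i * dt)   # server 1: render sensors
        ctrl = drive_agx(frame)                 # server 2: "drive" the car
        world = apply_controls(world, ctrl, dt) # server 1: update the world
    return world

print(run_sim(1.0))  # one simulated second = 30 closed-loop steps
```

The key design point the sketch preserves is that the AV stack never touches the simulated world directly: it sees only rendered sensor data and emits only control outputs, exactly as a real vehicle would.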
Now that we understand what DRIVE AGX and DRIVE Constellation do, how will NVIDIA’s customers use these tools to create the AIs they will embed in their vehicles? Project MagLev, NVIDIA’s end-to-end auto model development platform, provides the on-ramp for auto customers and partners. With over 370 partners developing on NVIDIA DRIVE, NVIDIA provides scalable AI training models with traceability to code and data, petabyte-scale AI testing platforms, pre-curated data for model development, and workflow automation to speed development.
Figure 2: NVIDIA’s Project “MagLev” is a platform to develop AI deep learning models, and to manage the workflow as the model is developed, tested, and optimized. The platform is supported by a dedicated 4000-GPU cluster. NVIDIA
NVIDIA started with a small fleet of 30 vehicles, each equipped with 12 sensors (cameras, RADAR, and LIDAR), that actively collect 1 petabyte of road data every week. If 30 vehicles seem inadequate, buckle your virtual seatbelts: this is where AI and simulation come in to magnify that data. Those 30 cars have so far created a 15-petabyte (PB) dataset for training neural networks to run on the DRIVE AGX system and for enabling the DRIVE Constellation virtual testing platform.
When developing supervised learning models, raw data is relatively useless without labels. To address this, NVIDIA employs 1,500 people to label the objects in the database at a rate of 20 million objects every month. So far, this work has produced 20 deep learning models. These are then simulated on a 4,000-GPU cluster, which is fed the control output from 100 DRIVE AGX Pegasus systems (DRIVE Constellations). NVIDIA’s goal is to have a curated dataset of hundreds of PBs, representing millions of miles of actual data, and thousands of DRIVE Constellations simulating billions of miles of driving. All of this, just to support its customers’ efforts to build real cars and production driving databases.
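To put those fleet and labeling figures in perspective, here is some rough arithmetic on the numbers cited above (my calculations, using decimal units where 1 PB = 1,000 TB; these are not NVIDIA-published rates):

```python
# Rough arithmetic on the article's fleet and labeling figures.
fleet_vehicles = 30
fleet_pb_per_week = 1            # the fleet collects ~1 PB of road data weekly
dataset_pb = 15                  # size of the training dataset so far
labelers = 1500
objects_per_month = 20_000_000   # objects labeled per month

tb_per_vehicle_week = fleet_pb_per_week * 1000 / fleet_vehicles
weeks_to_dataset = dataset_pb / fleet_pb_per_week
objects_per_labeler_month = objects_per_month / labelers

print(f"{tb_per_vehicle_week:.1f} TB collected per vehicle per week")
print(f"{weeks_to_dataset:.0f} weeks of fleet driving behind the 15 PB dataset")
print(f"{objects_per_labeler_month:.0f} labeled objects per person per month")
```

In other words, each car generates on the order of 33 TB of sensor data per week, and each labeler handles over 13,000 objects a month, which underlines why the real leverage comes from simulation rather than from growing the physical fleet.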
“If this technology is so great, why didn’t Tesla select NVIDIA for its driving platforms?” you astutely ask. Great question. Tesla is arguably among the most advanced companies developing its own AI hardware and software for autonomous driving. One likely outcome is that Tesla will use its own chip(s) for driving analysis and control, and perhaps use Intel for infotainment and UI.
The challenges the automotive industry must address in self-driving cars are arguably harder than, say, landing a man on the moon. Considering this, it seems to me that NVIDIA’s end-to-end approach would be very attractive to companies like Toyota, Daimler, VW, Audi, and Volvo, not to mention tier 1 suppliers like Bosch, Continental, and ZF. The hardware provides a wide set of choices for compute, power, and price, upon which automakers can develop unique driving experiences that support their brands.
In short, to borrow from a very old advertising slogan, NVIDIA is telling its customers, “Leave the driving to us.”