This week I virtually attended NVIDIA’s GTC 2021 event. The event had plenty of updates from NVIDIA’s AI, data science, high-performance computing, graphics, edge computing, networking, and autonomous machine sectors, and I was eager to dive into them. I’d like to focus on the company’s automotive business, which currently sits at a $600M annual run rate but, I believe, carries a sizeable backlog. In April, I wrote about the autonomous vehicle product updates at the last NVIDIA GTC event, and you can read that write-up here.
At this NVIDIA GTC event, the company launched new solutions focused on improving the experience and safety of autonomous driving. Autonomous vehicles will undoubtedly be part of our transportation systems in the future, but there are plenty of hurdles to get over in their current state. NVIDIA Automotive seems up to the task and is making the investments. The GTC announcements included a good mix of new hardware and software. Let’s dig in.
DRIVE Hyperion 8
One of the first announcements was the NVIDIA DRIVE Hyperion 8. The new DRIVE Hyperion 8 is a computer architecture and sensor set designed for self-driving vehicles. I consider it a platform. The architecture is modular in design, so developers can choose which features the system will utilize for their specific application. DRIVE Hyperion 8 focuses on functional safety and cybersecurity, and it’s good to see that NVIDIA already has support from some of the biggest sensor suppliers, including Continental, Hella, Valeo, Sony, and Luminar. The new DRIVE Hyperion 8 product is available now for the development of 2024 vehicle models.
I believe the two NVIDIA DRIVE Orin SoCs within the DRIVE Hyperion 8 platform offer headroom for AI computing. That means enough computing to enable level 4 self-driving capabilities and intelligent cockpits as well. For those who aren’t familiar with NVIDIA Orin, the product is NVIDIA’s system-on-a-chip solution tailored for autonomous vehicle systems. Since autonomous vehicles need to make decisions in rapidly changing environments, it takes a powerful SoC to keep up with these software-defined vehicles.
The other piece of the puzzle with DRIVE Hyperion 8 is sensors. Sensors are critical to safety while riding in an autonomous vehicle. NVIDIA enables a broad sensor suite with Hyperion 8, including 12 cameras, nine radars, 12 ultrasonic sensors, and a front-facing lidar sensor. The sensor systems are also modular, so developers can test and validate the technology and then tailor their sensor solutions to what works best on the individual vehicle.
Flexibility matters a lot when developing an autonomous vehicle solution, and NVIDIA enables much flexibility for its developers using DRIVE Hyperion 8. The full-stack solution gives customers the same tools that NVIDIA engineers use every day to record, capture, and process AV data.
Luminar lidar selected for DRIVE Hyperion reference platform
Another announcement was the selection of Luminar lidar as part of the sensor suite for the DRIVE Hyperion reference platform. Luminar Technologies is a leader in lidar hardware and software technology for the automotive industry, so I am not surprised that the company is part of NVIDIA’s AV reference platform.
Luminar lidar will work with NVIDIA’s existing compute and AI software to provide a full sensor suite for developing autonomous vehicles. Lidar is an essential piece of an autonomous vehicle’s safety and autonomy as it uses lasers to determine ranges of objects in front of the car and feeds that information to the vehicle’s computer.
Luminar Technologies is an excellent addition to NVIDIA’s reference platform partners. Giving customers all of the hardware and software they need to develop new autonomous vehicles isn’t something NVIDIA can do on its own, so it’s good to see the company partnering with other industry leaders. NVIDIA is relying on the strength of other companies in addition to its own to package up a full-stack autonomous vehicle solution.
DRIVE Concierge and DRIVE Chauffeur
Another announcement was the new DRIVE Concierge and DRIVE Chauffeur AI platforms. These new additions are both AI platforms attempting to redefine the autonomous driving experience by making it safer and more convenient. These new AI platforms work with the NVIDIA DRIVE Orin computers within the NVIDIA DRIVE Hyperion 8 platform. It takes a mix of different hardware and software working in conjunction to define new experiences, which is the approach NVIDIA is taking here.
With DRIVE Concierge, NVIDIA creates an Omniverse Avatar for you to communicate and interact with while driving. For example, imagine a 3D emoji-like figure that can see, speak, and converse back and forth with you on the road. The character is interactive and can help you with various tasks ranging from phone calls and alerts to booking reservations and making recommendations. For the Omniverse Avatar to work, NVIDIA blends speech AI, computer vision, natural language understanding, recommendation engines, and simulation to create a personalized virtual driving assistant. I think it would be beneficial to have the assistance of an Omniverse Avatar, especially since it seems like all my most significant issues arise when I am driving down the road.
A couple of other use cases for DRIVE Concierge that NVIDIA called out were the on-demand valet and guardian capabilities. Your vehicle will be able to park itself with the help of the on-demand valet. The guardian feature uses interior cameras and multimodal interaction to track and notify the vehicle’s driver when something on the roadway needs attention. DRIVE Concierge and DRIVE Chauffeur work together to give the driver a 360-degree, 4D visualization of their vehicle so the user can sit back and watch the vehicle drive and maneuver from a bird’s-eye view. Since the two systems work together, they enable more functionality, like summoning a vehicle and searching for parking spots. Parking is a complex task: other cars are constantly moving in and out of parking spots, it may not be street legal to park in certain places, and you have to avoid hitting street signs and make sure not to park in time-restricted areas. I believe the self-parking feature is going to be extremely valuable to end users. This is just the beginning; as the DRIVE Concierge AI platform matures, I am sure the use cases and applications will expand.
Another AI platform announced in conjunction with DRIVE Concierge was DRIVE Chauffeur. DRIVE Chauffeur is an AI-assisted driving platform built for highway and urban traffic environments. The system is based on NVIDIA DRIVE AV SDK technology. In short, DRIVE Chauffeur allows the driver to delegate control of the vehicle and monitoring of the outside environment to the AI system. DRIVE Chauffeur uses DRIVE Hyperion 8 compute and sensors to perceive the world in 4D and navigate from place to place safely and effectively. An example use case would be inputting the address of a friend across town and then allowing your vehicle to navigate safely to your destination with no human intervention. If you still want to control the vehicle yourself, the system will enable active safety features and intervene in dangerous scenarios. I saw some level 4 demonstrations of this technology on display during the conference, and I was impressed to see the navigation with no human intervention.
With the additions of DRIVE Chauffeur, DRIVE Concierge, and DRIVE Hyperion 8 at NVIDIA GTC, the company looks all-in on improving the safety, efficiency, and experience of driving in an autonomous vehicle. It isn’t a single solution that radically changes the autonomous vehicle experience, but rather a combination of hardware and software solutions gathered together to craft a better user experience. That is what NVIDIA is doing with its new announcements. NVIDIA keeps adding more features and capabilities to its autonomous vehicle stack, and it’s great to see. It is enabling autonomous vehicle developers with the hardware and software needed to build future vehicles, and it takes a deep commitment and many dollars to do that. I am eager to see what solutions developers will create in the 2024 timeframe when enabled with a whole stack of autonomous vehicle products.
There are plenty of kinks to iron out and hurdles to overcome, but I believe few companies are better suited than NVIDIA to take on the massive compute and software challenges that powering an autonomous vehicle requires. I was encouraged by the demos I saw during the virtual event and look forward to taking a ride in an NVIDIA-powered level 4 autonomous vehicle soon. Good job, NVIDIA.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.