
Nvidia makes autonomous vehicle simulator Drive Constellation generally available

Nvidia Drive Constellation. Image Credit: Nvidia



In an announcement timed to coincide with its 2019 GPU Technology Conference in San Jose this week, Nvidia today said that it’s making Drive Constellation, its cloud-based autonomous vehicle simulation platform, generally available.

The news comes months after Nvidia seeded Constellation to simulation partners in September, and weeks after Uber open-sourced Autonomous Visualization System, the web-based vehicle data platform used by its Advanced Technologies Group (ATG), the division charged with developing Uber’s autonomous car platform.

Drive Constellation made its premiere at last year’s GTC, where Nvidia CEO Jensen Huang broke it down into its component parts onstage. Constellation runs on two different kinds of servers. The first, Constellation Simulator, powers Nvidia Drive Sim, a software platform that simulates a driverless car’s sensors. The second, Constellation Vehicle, contains an Nvidia Drive AGX Pegasus computer (a pair of Xavier processors and two GPUs rated at a combined 320 trillion operations per second) and runs a complete autonomous vehicle software stack. Constellation Vehicle processes the simulated data from the Constellation Simulator as if it had been recorded from the sensors of a real car driving on the road; the driving commands from Drive AGX Pegasus are fed back to the simulator, completing the digital feedback loop 30 times every second.
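For a concrete picture of that hardware-in-the-loop cycle, here is a minimal, purely illustrative sketch in Python. The class and function names are hypothetical stand-ins, not Nvidia’s actual Drive Sim or Drive Constellation APIs; it only shows the shape of the loop described above, assuming a 30 Hz cycle.

```python
import time

SIM_HZ = 30  # the feedback loop described above closes 30 times per second


class SensorSimulator:
    """Stand-in for the Constellation Simulator server: renders one
    simulated frame of camera/lidar/radar data per tick (hypothetical)."""

    def render_frame(self, tick: int) -> dict:
        return {"tick": tick, "camera": b"...", "lidar": b"...", "radar": b"..."}

    def apply_commands(self, commands: dict) -> None:
        # Update the simulated vehicle's state using the driving stack's output.
        pass


class VehicleStack:
    """Stand-in for the Constellation Vehicle server: consumes sensor data
    as if it came from a real car and returns driving commands."""

    def process(self, frame: dict) -> dict:
        return {"steering": 0.0, "throttle": 0.2, "brake": 0.0}


def run_loop(seconds: float) -> None:
    sim, stack = SensorSimulator(), VehicleStack()
    for tick in range(int(seconds * SIM_HZ)):
        frame = sim.render_frame(tick)    # simulated sensor data out
        commands = stack.process(frame)   # driving decisions back
        sim.apply_commands(commands)      # loop closed, 30 times per second
        time.sleep(1.0 / SIM_HZ)


if __name__ == "__main__":
    run_loop(seconds=1.0)
```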

Constellation can generate photorealistic data streams to create a vast range of testing environments, simulating a variety of weather conditions, such as rainstorms and snowstorms, along with different road surfaces and terrain. It can also mimic the effects of blinding glare at different times of day and limited visibility at night. And because the platform is cloud-based and open to partners, developers can upload their own traffic scenarios, integrate their own vehicle and sensor models, and drive entire fleets of test vehicles “billions” of simulated miles.
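To make the scenario idea tangible, here is a small, hypothetical example of how a batch of test scenarios with weather, lighting, and road-surface parameters might be described in code. The field names and values are invented for illustration and are not Drive Constellation’s actual scenario format.

```python
from dataclasses import dataclass


@dataclass
class Scenario:
    """Illustrative test-scenario description (hypothetical fields)."""
    name: str
    weather: str = "clear"            # e.g. "rainstorm", "snowstorm"
    time_of_day: str = "noon"         # e.g. "dusk" to mimic blinding glare
    road_surface: str = "dry_asphalt"
    traffic_density: float = 0.5      # 0 = empty roads, 1 = heavy traffic
    fleet_size: int = 1               # simulated vehicles run in parallel
    miles_per_vehicle: float = 100.0


scenarios = [
    Scenario("glare_at_dusk", weather="clear", time_of_day="dusk"),
    Scenario("night_snow", weather="snowstorm", time_of_day="night",
             road_surface="packed_snow", traffic_density=0.8),
]

# Total simulated mileage across the fleet for this batch of scenarios.
total_miles = sum(s.fleet_size * s.miles_per_vehicle for s in scenarios)
print(f"{total_miles:.0f} simulated miles queued")
```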




Toward that end, digital simulation company Cognata announced today that its scenario and traffic models are supported on Drive Constellation, and automotive simulation company IPG Automotive said it’s working with Nvidia to build high-fidelity vehicle, powertrain, suspension, and vehicle control system models with its simulation solution, CarMaker.

According to Nvidia, global testing, certification, inspection, and training provider TÜV SÜD is already using the platform to formulate self-driving validation standards. And arguably more significantly, Toyota Research Institute-Advanced Development (TRI-AD) is Constellation’s first customer.


“We believe large-scale simulation tools for software validation and testing are critical for automated driving systems,” said TRI-AD CEO Dr. James Kuffner. “Our vision is to enable self-driving vehicles with the ultimate goal of reducing fatalities to zero, enabling smoother transportation, and providing mobility for all, [and our] technology collaboration with Nvidia is important to realizing this vision.”

The partnership builds on Nvidia’s ongoing relationship with Toyota to use its Drive AGX Xavier computer, and is based on “close development” between teams from Nvidia, TRI-AD in Japan, and TRI in the U.S. Together, they intend to design future AI computing infrastructures using Nvidia GPUs, simulations using Constellation, and in-car computers based on Drive AGX Xavier or Drive AGX Pegasus.

For the uninitiated, Drive AGX is Nvidia’s development platform for driverless cars. The hardware kit, announced at GTC Japan last year, includes a Drive AGX Xavier computer, along with a vehicle harness to connect the platform to the car, an international power supply, a camera sensor, and other accessories. A single Drive Xavier system-on-chip packs 9 billion transistors, an 8-core CPU, a 512-core GPU based on Nvidia’s Volta architecture, and components tailored to accelerate deep learning, computer vision, and 8K video processing.

“Self-driving vehicles for everyday use and commercial applications in countless industries will soon be commonplace. Everything that moves will be autonomous,” Huang said. “Producing all these vehicles at scale will require a connected collaboration for all elements of the system. Our relationship with TRI-AD and TRI is a model for that collaboration.”