Optimus Ride taps Nvidia for its level 4 autonomous cars

Image Credit: Optimus Ride

Optimus Ride, an autonomous technology startup based in Boston, today revealed Nvidia’s Drive AGX Xavier as its development platform of choice for driverless cars.

The company plans to upgrade all of its autonomous vehicles deployed in Boston’s Seaport District and at the Union Point smart city development in Weymouth, Massachusetts, with Drive AGX Xavier. It is also looking to incorporate the hardware and software platform into future prototypes at yet-to-be-announced sites.

Optimus CEO and cofounder Ryan Chin, who formerly led the City Science Initiative at the MIT Media Lab, said Xavier will help the company build its first fully driverless fleet service for geofenced deployments. The cars will be capable of level 4 autonomous driving, meaning they can operate without human intervention, but only under specific conditions and in specific locations (as defined by the Society of Automotive Engineers).

“With the introduction of Nvidia Drive AGX Xavier, our vehicles will be able to adapt and respond to their environments faster than ever before, comprehending what’s happening around them in real time,” he said. “With this level of insight and responsiveness, we’ll be able to introduce fully autonomous services in our deployment regions by the end of the year.”


Optimus — an MIT spinout founded by a team of DARPA Urban Challenge competitors and other autonomous driving engineers — has flown mostly under the radar since October 2017, when it announced a partnership with real estate developer LStar Ventures that saw the 1,550-acre Union Point neighborhood gain self-driving car service.

Optimus became one of the first firms to secure a driverless vehicle permit from the Massachusetts Department of Transportation in 2016, with tests of its 25-plus car fleet starting in Raymond L. Flynn Marine Park in the Seaport District. It previously piloted its software — a suite capable of mapping, controlling vehicles, coordinating vehicle fleets, detecting and avoiding objects, and more — on the campus of the Perkins School for the Blind in Watertown, Massachusetts.

If all goes according to plan, Optimus will join an exclusive club of companies that have deployed level 4 autonomous passenger cars and taxis. Baidu launched level 4 autonomous shuttle buses in more than 10 regions across China earlier this year, and Google spinoff Waymo has tested level 4 vehicles with passengers participating in its Early Rider Program in Chandler, Arizona. Startup Drive.ai, meanwhile, is operating fleets of level 4 cars in Arlington and Frisco, Texas.

In November 2017, Optimus announced an $18 million funding round led by Greycroft Partners, with participation from Emerson Collective, Fraser McCombs Capital, and MIT Media Lab director Joi Ito. To date, it has raised $23.25 million.

Drive AGX

Nvidia — another Optimus Ride investor — claims that Drive AGX is capable of delivering 30 trillion operations per second. The developer kit, which was announced at GTC Japan, includes a Drive AGX Xavier computer, along with a vehicle harness to connect the platform to the car, an international power supply, a camera sensor, and other accessories.

“Optimus Ride’s adoption of the scalable Nvidia Drive AGX platform gives them the computational horsepower and sophisticated AI … stack essential for safe self-driving systems,” said Rishi Dhall, vice president of automotive business development at Nvidia.

A single Drive Xavier system-on-chip packs 9 billion transistors with an 8-core CPU, a 256-core GPU based on Nvidia’s Volta architecture, and components tailored to accelerate deep learning, computer vision, and 8K video processing. It draws just 30 watts of power, according to Nvidia, and it’s at the heart of the company’s AGX Pegasus platform — a pair of Xavier processors and GPUs rated at 320 trillion operations per second.

Nvidia claims that each car running Drive can produce up to a petabyte of data every week from lidar, radar, cameras, and other sensors inside and outside the vehicle, and that nearly 1,500 human workers label 20 million objects from that data each month.
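For a rough sense of scale, here is a quick back-of-the-envelope calculation based on those figures (our own arithmetic, assuming a petabyte means 10^15 bytes and a seven-day week; Nvidia has not published this breakdown):

```python
# Back-of-the-envelope arithmetic on the data figures quoted above.
# Assumes 1 petabyte = 1e15 bytes and a 7-day week (our assumptions).
BYTES_PER_PETABYTE = 1e15
SECONDS_PER_WEEK = 7 * 24 * 3600          # 604,800 seconds

sustained_rate = BYTES_PER_PETABYTE / SECONDS_PER_WEEK
print(f"Average data rate per car: {sustained_rate / 1e9:.2f} GB/s")   # ~1.65 GB/s

labels_per_worker = 20_000_000 / 1_500    # objects labeled per worker per month
print(f"Objects labeled per worker per month: {labels_per_worker:,.0f}")  # ~13,333
```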

AGX Xavier runs Nvidia Drive Software 1.0, which comes with data recording, navigation, and visualization tools. The release also includes DriveNet, a deep neural network that detects and classifies objects within view, along with LaneNet and OpenRoadNet, AI systems that identify lane markings and detect drivable space.
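To illustrate how such a perception stack fits together, here is a minimal, hypothetical sketch of a per-frame loop that composes an object-detection network, a lane-detection network, and a free-space network. The class and function names are illustrative placeholders, not Nvidia’s actual Drive Software API.

```python
# Hypothetical sketch of a per-frame perception loop composing the kinds of
# networks described above (object detection, lane detection, free-space
# detection). Names are illustrative placeholders, not Nvidia's real API.
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str          # e.g. "car", "pedestrian", "traffic_light"
    bbox: tuple         # (x, y, width, height) in image coordinates
    confidence: float

def run_perception(frame, object_net, lane_net, freespace_net) -> dict:
    """Run three perception networks on one camera frame.

    object_net, lane_net, and freespace_net stand in for DNNs analogous to
    DriveNet, LaneNet, and OpenRoadNet; each is assumed to expose a simple
    infer(frame) method returning its task-specific output.
    """
    objects: List[Detection] = object_net.infer(frame)   # classified obstacles
    lanes = lane_net.infer(frame)                         # lane-marking polylines
    drivable = freespace_net.infer(frame)                 # drivable-space mask

    # Downstream planning would fuse these outputs with lidar/radar tracks.
    return {"objects": objects, "lanes": lanes, "drivable_space": drivable}
```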

In October, Volvo announced it would adopt Nvidia’s Drive AGX Xavier for its next generation of vehicles.