
Aurora urges autonomous vehicle industry to adopt better safety metrics

Above: A prototypical Aurora self-driving car. Image Credit: Aurora



Amazon-backed autonomous vehicle company Aurora is in search of better metrics by which to compare the safety of driverless systems. In a Medium post this morning, CEO Chris Urmson noted that the commonly cited disengagement number, which indicates how often a vehicle switches from autonomous to manual mode, doesn’t adequately capture improvements or their impact over time.

“Historically, the industry and media have turned to tallying on-road miles and calculating disengagement rates as measurements of progress,” he wrote. “If we drive 100 million miles in a flat, dry area where there are no other vehicles or people, and few intersections, is our ‘disengagement rate’ really comparable to driving 100 miles in a busy and complex city like Pittsburgh?”

That’s not to suggest disengagements don’t have their place, noted Urmson, who formerly led Google’s self-driving car program (now Waymo). Aurora records two types internally: reactionary disengagements, where a vehicle operator disengages the system because they believe an unsafe situation might occur, and policy disengagements, where an operator proactively disengages ahead of an on-road situation the system hasn’t been taught to handle.
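
To make the metric concrete, here is a minimal sketch of how a per-mile disengagement rate might be tallied and broken out by those two types; the log schema, field names, and numbers are invented for illustration and are not Aurora’s actual telemetry.

```python
from dataclasses import dataclass

@dataclass
class Disengagement:
    """One takeover event; 'kind' is "reactionary" or "policy" (hypothetical schema)."""
    kind: str

def disengagement_rates(events: list[Disengagement], miles: float) -> dict[str, float]:
    """Disengagements per 1,000 miles, overall and broken out by type."""
    rates = {"overall": len(events) / miles * 1_000}
    for kind in ("reactionary", "policy"):
        rates[kind] = sum(1 for e in events if e.kind == kind) / miles * 1_000
    return rates

# A low rate earned on empty desert highways is not comparable to a higher
# rate earned in a dense city; the rate alone doesn't say which is which.
desert_highway = disengagement_rates([Disengagement("policy")], miles=1_000)
dense_city = disengagement_rates(
    [Disengagement("reactionary"), Disengagement("policy")], miles=20
)
print(desert_highway)  # {'overall': 1.0, 'reactionary': 0.0, 'policy': 1.0}
print(dense_city)      # {'overall': 100.0, 'reactionary': 50.0, 'policy': 50.0}
```

Nothing in the rate records where or how the miles were driven, which is exactly the comparability problem Urmson describes.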

But Urmson claims that technical or engineering velocity is a superior measure of progress because it captures advancements made to the core technology. To this end, Aurora taps a custom pipeline to label, evaluate, and build tests from scenarios its lidar-, radar-, and camera-equipped Lincoln MKZs (which might in the next year be swapped out for Chrysler Pacifica minivans) encounter on public roads. Once new versions of the software pass those tests, they’re assumed to be performing better than their predecessors.
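
As a rough illustration of that pass-the-tests gate, a minimal sketch might look like the following; the scenario names, scores, and pass criterion are invented for the example and stand in for a replay-in-simulation evaluator, not Aurora’s real pipeline.

```python
# Toy regression gate: a new software build ships only if it handles every
# scenario harvested from on-road logs at least as well as the build before it.
# evaluate() and its scores are stand-ins; a real pipeline would replay the
# logged scenario in simulation and score the vehicle's behavior.

SCORES = {
    ("v1", "unprotected_left"): 0.90, ("v2", "unprotected_left"): 0.94,
    ("v1", "jaywalker"): 0.85,        ("v2", "jaywalker"): 0.88,
    ("v1", "lane_merge"): 0.97,       ("v2", "lane_merge"): 0.97,
}

def evaluate(build: str, scenario: str) -> float:
    """Stand-in scorer: safety score in [0, 1] for a build on one scenario."""
    return SCORES[(build, scenario)]

def passes_gate(new: str, old: str, scenarios: list[str]) -> bool:
    """True when the new build scores at least as well as the old on every test."""
    return all(evaluate(new, s) >= evaluate(old, s) for s in scenarios)

print(passes_gate("v2", "v1", ["unprotected_left", "jaywalker", "lane_merge"]))  # True
```

The gate captures the “engineering velocity” idea: progress is measured by how many harvested scenarios each successive build clears, not by odometer totals.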


“One persistent misconception in the self-driving space is that the team with the most on-road development miles will ‘win,’” Urmson wrote. “We disagree. Our goal has always been to scale our data pipelines so that we collect useful data. Quality is more important than quantity in on-road testing, and we try to make every mile count.”

Urmson proposes that the industry double down on simulation testing and “squeez[ing] more ‘value’ out of … on-road miles.” He notes that Aurora has prioritized investment in its Virtual Testing Suite — which allows it to run millions of off-road tests a day — and that its engineers continue to feed driving decisions into the company’s motion planning models, allowing the system to learn from experience. Like GM-owned Cruise Automation’s The Matrix and Waymo’s Carcraft, the Virtual Testing Suite enables Aurora to model tests involving pedestrians, lane merging, and parked cars. Urmson estimates that a single virtual mile can be just as insightful as 1,000 miles collected on the open road.

“These priorities — advancing our core technology, securing a path to profitability, integrating [our system] into trucks, and continuing to develop a safety case — will get us closer to reaching our ultimate goal: developing self-driving technology that can safely move people and goods without the need of a human driver,” wrote Urmson.

Aurora says that after a year of focusing on capabilities including merging, nudging, and unprotected left-hand turns, its autonomous system — the Aurora Driver, which has been integrated into six different types of vehicles to date, including sedans, SUVs, minivans, commercial vans, and Class 8 freight trucks — can perform each seamlessly “even in dense urban environments.” In 2020, as it expands its vehicle fleets for data collection, testing, and validation, the company plans to improve how the Driver predicts and accounts for “non-compliant actors,” or people who aren’t following the rules of the road, like jaywalkers and drivers who aggressively cut into the lane.

Urmson believes that over the next five years, commercial fleets of autonomous vehicles will begin moving people and goods, with broad adoption to follow.

The blog post comes a week after Kyle Vogt, cofounder and CTO of Cruise, posited that it might be time for a new metric for reporting the safety of self-driving cars. “Keep in mind that driving on a well-marked highway or wide, suburban roads is not the same as driving in a chaotic urban environment,” Vogt wrote. “The difference in skill required is just like skiing on green slopes vs. double black diamonds.”

Vogt and Urmson aren’t the only ones who have voiced their disapproval of disengagement-based safety measures. In a conversation with VentureBeat at the 2020 Consumer Electronics Show (CES), Dmitry Polishchuk, the head of Russian tech giant Yandex’s autonomous car project, noted that Yandex hasn’t released a disengagement report to date for that reason. “We have kind of been waiting for some sort of industry standard,” he said. “Self-driving companies aren’t following the exact same protocols for things. [For example, there might be a] disengagement because there’s something blocking the right lane or a car in the right lane, and [the safety driver realizes] as a human that [this object or car] isn’t going to move.”

Unfortunately for companies like Yandex, Cruise, and Aurora, less regulatory guidance, not more, seems the likelier near-term path, at least in the U.S. At CES on January 8, Transportation Secretary Elaine Chao announced Automated Vehicles 4.0 (AV 4.0), a set of guidelines for self-driving cars that seeks to promote “voluntary consensus standards” among autonomous vehicle developers. The framework requests, but doesn’t mandate, regular assessments of self-driving vehicle safety, and it permits automakers to complete those assessments themselves rather than submit them to a standards body.

Advocacy groups including the Advocates for Highway and Auto Safety almost immediately criticized the policy for its vagueness. “Without strong leadership and regulations … [autonomous vehicle] manufacturers can and will continue to introduce extremely complex supercomputers-on-wheels onto public roads … with meager government oversight,” Advocates president Cathy Chase said in a statement. “Voluntary guidelines are completely unenforceable, will not result in adequate performance standards, and fall well short of the safeguards that are necessary to protect the public.”

In the U.S., legislation remains stalled at the federal level. More than two years ago, the House unanimously passed the SELF DRIVE Act, which would create a regulatory framework for autonomous vehicles, but the bill has yet to be taken up by the Senate, which in 2018 tabled a separate measure, the AV START Act, that cleared committee in November 2017.