Surprise: Automated driving biz finds automated driving safer than letting you get behind the wheel
Waymo recreated 72 fatal crashes and it turns out its simulated AI driver isn't a careless, distracted jerk
Keen to prove that automated cars are safer than human-driven vehicles, Waymo, the robotaxi biz spun out of Google in 2016, set out to simulate 72 fatal crashes that occurred from 2008 through 2017 in the vicinity of Chandler, Arizona.
Chandler, a popular venue for testing automated cars in the United States due to its favorable weather and minimal regulation, became a Waymo test ground in 2016. It's about 16 miles from Tempe, where in 2018 an Uber self-driving car collided with and killed pedestrian Elaine Herzberg, an incident for which safety driver Rafaela Vasquez was charged last year with negligent homicide and has pleaded not guilty.
Trent Victor, director of safety research and best practices at Waymo, said in a blog post on Monday that the company's latest research supports data released in October 2020, which showed its AI system, referred to as the Waymo Driver, was involved in only minor collisions over six million miles on the road.
The automatic auto firm recreated the 72 fatal crashes in 91 simulations, putting the Waymo Driver in the role of both the initiating driver and the responding driver for accidents involving two vehicles, which is how 72 crashes produced 91 simulation runs. And in the simulator at least, the company's software outperformed the humans involved in the recreated incidents.
"When we swapped in the Waymo Driver as the simulated initiator (52 simulations), it avoided every crash by consistent, competent driving, and obeying the rules of the road—yielding appropriately to traffic, executing proper gap selection, and observing traffic signals," said Victor.
1/3 Today, we release results that show how the Waymo Driver likely would have performed in the majority of fatal crashes that occurred on the same roads over a 10 year period. This builds upon the research we released in October 2020. Read our findings: https://t.co/RrIdwfWoBb pic.twitter.com/CtGsiYgqnY
— Waymo (@Waymo) March 8, 2021
The implication is that humans could do as well, if they actually obeyed road rules, made a concerted effort to drive carefully when behind the wheel, and were never distracted or impaired.
When the AI system acted as the responding driver, it avoided a collision in 82 per cent of the simulations, and in a further 10 per cent it took action that reduced the severity of the recreated crash.
The remaining eight per cent of responder simulations, in which Waymo's code couldn't improve on the outcome, were all rear-end collisions – crashes human drivers also have a hard time avoiding.
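For a rough sense of how those figures fit together: of the 91 simulations, 52 put the software in the initiator role, leaving 39 responder-role runs, and applying the quoted percentages to those implies roughly 32 crashes avoided, four mitigated, and three unchanged. The Python sketch below simply tallies such outcome labels into the quoted shares; the counts are back-of-envelope numbers implied by the article's figures, not data from Waymo's paper.

```python
# Back-of-envelope sketch, not Waymo's methodology: tally per-simulation
# outcome labels for the responder-role runs and report each share.
from collections import Counter

# Counts inferred from the quoted 82/10/8 split over 39 responder runs
# (91 total simulations minus 52 initiator runs) -- illustrative only.
outcomes = ["avoided"] * 32 + ["mitigated"] * 4 + ["unchanged"] * 3

counts = Counter(outcomes)
total = len(outcomes)
for label in ("avoided", "mitigated", "unchanged"):
    share = 100 * counts[label] / total
    print(f"{label}: {counts[label]}/{total} simulations ({share:.0f}%)")
```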
Victor argues that, because 94 per cent of crashes involve human error according to NHTSA data, Waymo's robotaxis have an opportunity to improve road safety.
The research paper [PDF] describing Waymo's findings is a bit more circumspect, allowing that the tests have limitations and real-world outcomes may differ. For example, the paper notes that the simulations did not include any other vehicle traffic, so any effect such traffic might have on the Waymo Driver's sensors is not modeled.
It also observes that "the collection of potential failure modes for [automated driving systems] may be different than that of a human," which alludes to potential problems arising from sensor or electronics snafus that would not hinder a human driver.
More than 60 companies currently hold permits to test self-driving vehicles in California. As one of them, Waymo is obligated to report its disengagement rate, the rate at which a human safety driver has to take over from the Waymo Driver.
Last year Waymo reported one disengagement for almost every 30,000 miles driven, an improvement on one per 13,219 miles in 2019. The mech chauffeur biz has nonetheless argued that the disengagement rate is not representative of its system's capabilities.
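The metric itself is simple arithmetic: total autonomous miles divided by the number of reported disengagements. A minimal sketch, using made-up totals chosen only to land near the "almost 30,000 miles" figure above rather than numbers from Waymo's DMV filing:

```python
# Minimal sketch of the disengagement-rate arithmetic; the inputs below are
# hypothetical, not figures from Waymo's California DMV report.
def miles_per_disengagement(autonomous_miles: float, disengagements: int) -> float:
    """Average autonomous miles driven between human takeovers."""
    if disengagements == 0:
        return float("inf")  # no takeovers recorded
    return autonomous_miles / disengagements

# Example: 600,000 autonomous miles with 20 reported takeovers
print(f"{miles_per_disengagement(600_000, 20):,.0f} miles per disengagement")
```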
Waymo is still working on driving in snow. ®