Uber Suspends Driverless Testing After Car Kills Pedestrian
By Carolyn Said
In what appears to be the first pedestrian fatality involving a self-driving car, an Uber vehicle operating in autonomous mode Sunday night struck a Tempe, Ariz., woman, who later died of her injuries at a local hospital.
Uber immediately suspended autonomous vehicle operations in all four cities, including San Francisco, where it tests the cars, while experts said the consequences for the nascent industry could be far-reaching.
The self-driving Volvo SUV had a human backup driver behind the wheel when it hit Elaine Herzberg, 49, who was walking a bicycle across the street outside a crosswalk, according to Tempe police. The accident was reported at 10 p.m. on an overcast evening.
Videos captured by the self-driving car make it "very clear it would have been difficult to avoid this collision in any kind of mode (self-driving or human driven) based on how (Herzberg) came from the shadows right into the roadway," said Sylvia Moir, Tempe police chief, in an interview.
Local media showed images of Herzberg's mangled bike, laden with plastic shopping bags, on the ground next to the car with its distinctive sensors jutting from the roof.
"Some incredibly sad news out of Arizona," tweeted Uber CEO Dara Khosrowshahi, who may face one of his biggest challenges from this incident. "We're thinking of the victim's family as we work with local law enforcement to understand what happened."
California is weeks away from allowing self-driving cars with no human drivers and no manual controls onto public roads. So far those plans don't appear to be affected by the Arizona accident.
Dozens of companies, from large carmakers to tech startups, are racing to develop self-driving cars, which most experts agree will transform transportation in coming years. A primary motivation is the promise that self-driving vehicles will outperform human drivers, since they don't text, drink, doze off or get distracted. An overwhelming majority of the world's 1.3 million annual traffic fatalities are caused by human error.
But the Arizona accident raised fresh concerns among consumer advocates and others that the nation is rushing to deploy the cars without enough due diligence. A commentator on CNET's Roadshow site called it the industry's "Apollo 1 moment," referring to the fatal fire in that spacecraft's command module that could have scrubbed the nation's fledgling space program.
"There should be a national moratorium on all robot car testing on public roads until the complete details of this tragedy are made public and are analyzed," said John Simpson, privacy and technology project director at Consumer Watchdog. "Arizona has been the wild west of robot car testing, with virtually no regulations in place. That's why Uber and Waymo test there. When there's no sheriff in town, people get killed."
Arizona has prided itself on a welcoming environment for robot-car makers. Waymo, the self-driving unit of Google parent Alphabet, tests cars in Arizona with no human backup drivers, although Waymo employees sit in a back seat. It plans to soon test the driverless cars with paying passengers.
California, a hotbed for autonomous vehicle testing with 50 companies piloting almost 400 driverless cars here, currently requires the vehicles to have backup drivers who can take control to avoid accidents -- though Uber had a backup driver in the Tempe incident.
The California Department of Motor Vehicles, which regulates autonomous cars and plans to permit cars with no backup drivers onto public roads starting April 2, said it will ask Uber for more information on the fatal crash. "The California DMV has many requirements in place for testing permit holders and requires collision reports and annual disengagement reports," said spokeswoman Jessica Gonzalez. Disengagement reports track how often a human driver needed to intervene. The reported California accidents largely have been fender-benders attributable to human error. Often the self-driving cars are rear-ended by other drivers.
California's reports are the most detailed that are required in any state. Arizona has no such requirements.
Wendy Ju, an information science professor at Cornell Tech in New York and an expert on human-robot interactions, said the death could lead to better data-sharing regulations.
"Public oversight and accountability (are) almost impossible except in extreme incidents such as this one," she wrote in an email. She hopes that the federal government will set policies "that prevent technology companies from shopping municipalities for the lowest regulation and oversight."
However, proposed federal rules, known as the AV START Act (S. 1885), do not include such safeguards.
Sen. Dianne Feinstein, D-Calif., and four other senators expressed concerns last week in a letter saying the act needs stronger safety standards, provisions for data collection and analysis, and measures to safeguard cybersecurity and consumer privacy.
"Congress is on the brink of opening the floodgates for (autonomous vehicles) to be sold in vast numbers, without having to meet a single safety standard specific to this new technology, while hamstringing the states from protecting their own citizens," Rosemary Shahan, executive director of Consumers for Auto Reliability and Safety, wrote in an email.
Akshay Anand, executive analyst for Kelley Blue Book, a car shopping research site, said the accident could shape public perception.
"A lot of people still have trepidation about autonomous vehicles, and this could reinforce those wary perceptions," he said.
Prior to the accident, Uber operated ride-hailing autonomous vehicles with backup drivers for paying customers in Arizona and Pittsburgh. In San Francisco, it had just started giving commute rides to employees in its self-driving division. It also has done testing in Toronto.
Uber's autonomous-car program has been rocky, to say the least.
Seeing robot taxis as integral to the ride-hailing company's future, ex-CEO and co-founder Travis Kalanick aggressively pursued their development. That led to a nasty lawsuit filed by Waymo, which claimed Uber stole key self-driving secrets. The companies settled in February several days into a jury trial, with Uber paying Waymo $245 million in stock and Khosrowshahi expressing "regret" but not admitting wrongdoing.
Uber's first forays into autonomous driving in San Francisco were ill-fated. In December 2016, it said it would offer autonomous rides to ride-hailing passengers here, but the DMV forced it to yank the program because it refused to get the required permits. During their few days of operation, the Uber cars racked up several reports of incidents, such as running red lights, that Uber attributed to human error. The company soon loaded the vehicles onto trucks to take them to Arizona, known for lighter regulations.
Eventually Uber returned to San Francisco testing with the proper permits, though it did not carry paying passengers here.
The National Transportation Safety Board and the U.S. National Highway Traffic Safety Administration said they are sending teams to investigate the crash.
While the fatal crash of a Tesla in Autopilot mode two years ago drew widespread attention, the driver in that case was faulted for relying on the technology, which was not intended to replace human drivers -- although Tesla was also blamed because its car allowed the driver to misuse it.
By contrast, Sunday's accident involved a fully self-driving car. "You can plan for 999,999 potential scenarios for an autonomous vehicle, but there's always that 1 millionth one," Anand said. "It will be a while before we truly know what happened."
(c)2018 the San Francisco Chronicle