The safety driver behind the wheel of a self-driving Uber that struck and killed a woman in 2018 has been charged with a crime. Prosecutors in Maricopa County, Arizona, said Tuesday that the driver, Rafaela Vasquez, has been indicted for criminal negligence. But Uber, her employer and the company that built the automated system involved in the fatal collision, won’t face charges.
The attorney for neighboring Yavapai County declined to prosecute Uber last year, writing in a letter that the office found “no basis for criminal liability.” (Yavapai took over the Uber part of the case because Maricopa County had worked with Uber on an anti-drunk-driving campaign.) Yavapai County attorney Sheila Polk declined to elaborate on her decision. A spokesperson for Uber declined to comment.
What happens when humans and machines work together to hurt others? The question isn’t new. As the anthropologist Madeleine Clare Elish noted earlier this year after an investigation into automation in the aviation sector, “conceptions of legal liability and responsibility did not adequately keep pace with advances in technology.” It has, in other words, been difficult—though not impossible—for the legal system to hold people responsible for the technology they build. Instead, the human in the loop, the person behind the wheel or the screen, has borne the bulk of the responsibility.
As a practical matter, it’s easier for prosecutors to sell juries on a story they already know. Vasquez was behind the wheel of a car and allegedly watching her cell phone instead of the darkened road in front of her when the car struck and killed a woman named Elaine Herzberg. People know about distracted driving. “That’s a simple story, that her negligence was the cause of [Herzberg’s] death,” says Ryan Calo, a law professor who studies robotics at the University of Washington School of Law. “Bring a case against the company, and you have to tell a more complicated story about how driverless cars work and what Uber did wrong.”
The story is more complicated, and more technical. Last year, the National Transportation Safety Board released its final report on the crash, the country’s first fatal one involving an autonomous vehicle. After combing through documents and software and interviewing Uber staffers, the safety panel determined that responsibility for the collision was widely shared.
“Safety starts at the top,” NTSB chair Robert Sumwalt said. “The collision was the last link of a long chain of actions and decisions made by an organization that unfortunately did not make safety the top priority.” Among the culprits: Vasquez and Uber self-driving execs, who created what the NTSB called an “inadequate safety culture.”
Herzberg had been pushing a bicycle and crossing the road roughly 300 feet away from a crosswalk when she died. According to the NTSB investigation, Uber’s software system did not consider the possibility of pedestrians crossing roads outside of crosswalks, or of a person pushing a bicycle while on foot. Instead, Uber’s technology repeatedly tried to “categorize” the woman as a different kind of object and predict her path accordingly. When the vehicle first “saw” Herzberg, 5.6 seconds before impact, it classified her as a vehicle. For the next four and a half seconds, it couldn’t settle on what she was, classifying her as “other,” then a vehicle again, then “other,” then a bicycle, then “other,” then a bicycle once more. At 1.2 seconds before impact, the vehicle recognized it would hit Herzberg. It held off braking for one second. Just 0.2 seconds before the collision, it sounded an audio alarm.
Vasquez did not react in time. Maybe that’s not surprising: For decades, research has shown that it’s very, very difficult to keep human attention focused on partially automated tasks. Initial hearings in Vasquez’s case are set for next month.
Uber reached a quick legal settlement with Herzberg’s family just over a week after the crash, the terms of which have not been disclosed. There’s a reason that outcome may feel unsatisfying, says Calo, the law professor. In a civil lawsuit, Uber faced “the woman who was killed and her descendants,” he says. “But in a criminal case, the other side of the [case] is the state, the people.” A criminal case against Uber might grapple with what it means to build faulty technology. But that won’t happen here. “That, to me, is perhaps symbolic,” says Calo.