A self-driving Uber SUV struck and killed a pedestrian on a street in Tempe, Arizona, in late March. This was the first known U.S. pedestrian fatality involving a fully autonomous car, and the incident has spurred discussion about whether tests of self-driving vehicles should continue — and, if so, how aggressively.
The self-driving Volvo XC90 SUV killed a 49-year-old woman as she walked her bicycle across a street, according to the Tempe Police Department. Preliminary information indicates the car was traveling approximately 40 mph in a 35 mph zone. A test driver was at the wheel, but the car was in autonomous mode at the time of the crash; police found no indication that the driver was impaired. In 2017, Uber briefly pulled its self-driving vehicles from the roads after a test vehicle flipped onto its side, also in Tempe. The National Transportation Safety Board (NTSB) is investigating the fatal crash.
Why two accidents in Arizona? The state sees less inclement weather — particularly rain and snow — than most of the U.S., which makes it a practical location to test self-driving cars at this stage of their development. Many companies have targeted 2020 as the year self-driving technology could begin to transform America’s roads. Whether developers can keep to that optimistic schedule remains to be seen.
Problems with Tesla Self-Driving Vehicles?
Since autonomous vehicles began street testing, motorcyclists and bicyclists have expressed concerns about self-driving cars. For the most part, those concerns have fallen on deaf ears within the industry, the news media, and government.
Fred Heppell, an avid British cyclist who had participated in bike tours all over the world, was reportedly riding on a straight country road in good weather with excellent visibility in November 2017 when a Tesla Model S 90D struck him from behind. The accident killed the 80-year-old Heppell and raised concerns among international road riders about whether the Tesla had been in “autopilot” mode at the time. Neither the police nor Tesla has publicly commented on that question. So far, the case has received little attention from the news media in Great Britain and virtually none in the United States. If it turns out the Tesla vehicle was on autopilot, it could further impact testing in the U.S. and other countries.
The autonomous vehicle industry, a loose configuration of traditional auto companies and tech start-ups, is aggressively selling the promise of rational, computer-controlled vehicles free from road rage and driver error. But according to cycling and consumer advocates, the industry has been woefully short on specifics about how it can ensure the safety of those with whom autonomous vehicles will share the road.
The League of American Bicyclists has been lobbying Congress to require a “vision test” on all semi-autonomous or fully-autonomous cars before they can be deployed on public roads. “When human drivers apply for a driver’s license, we have to pass a vision test…automated driving systems should first have to pass a ‘vision test’ as well,” the cycling organization writes on its website.
As more self-driving vehicles are tested on our roads, more accidents will almost certainly occur — and controversy will follow. Drivers, bicyclists, motorcyclists, and pedestrians are justified in taking a wary, “wait and see” attitude.
If you have questions about personal injury cases or would like to schedule a free consultation, call Kaplan Lawyers PC to talk to our team.