
Are Self-Driving Cars Causing Accidents By Following the Law?


Autonomous vehicles take the human element out of driving entirely: no second-guessing, no assumptions, no distractions. Self-driving cars are robots, and they operate exactly as they are programmed. So when self-driving cars are programmed to follow the law, are they causing more accidents?

The California DMV reports that of the 43 autonomous vehicle accidents in the state, 13 involved the autonomous vehicle being rear-ended. Nearly all 43 occurred at intersections, at low speeds, and with no injuries.

Most human drivers speed and roll through stop signs. That doesn't make it legal or the right way to drive, but it's how the world has come to expect the roads to operate. Self-driving vehicles, however, didn't get that memo, which throws human drivers off their game.

“They don’t drive like people,” said Mike Ramsey, an analyst at Gartner, a research and advisory firm that covers automotive technology. “They drive like robots. They’re odd, and that’s why they get hit.”

So how do manufacturers replicate the human driving experience without breaking the law? They can hardly program autonomous vehicles to speed and run stop signs.

Waymo, the self-driving car company spun out of Google, has been taking concrete steps toward the seamless integration of autonomous vehicles. The company is working to program its vehicles to make wider turns and “inch forward at flashing yellow lights,” according to The Drive.
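To make the idea of "programming" a driving style a little more concrete, here is a minimal, purely hypothetical sketch in Python of the kind of tunable behavior parameters a developer might expose. The names and values are illustrative assumptions only, not anything from Waymo's actual software.

# Hypothetical illustration only -- not Waymo's (or anyone's) real software.
# A simple set of style parameters a developer might expose to make an
# autonomous driving policy feel more "human" while staying within the law.
from dataclasses import dataclass

@dataclass
class DrivingStyle:
    turn_radius_scale: float = 1.0          # 1.0 = tightest legal turn; >1.0 = wider, more human-like arc
    creep_at_flashing_yellow: bool = False  # inch forward instead of stopping dead
    creep_speed_mps: float = 0.5            # speed while creeping, in meters per second
    stop_line_buffer_m: float = 0.3         # how far short of the stop line to halt

# Default "robotic" style: by-the-book and abrupt, which can confuse drivers behind.
robotic = DrivingStyle()

# Tuned style in the spirit of the changes described above:
# wider turns and gentle creeping at flashing yellows, still fully legal.
human_like = DrivingStyle(
    turn_radius_scale=1.3,
    creep_at_flashing_yellow=True,
    creep_speed_mps=0.7,
)

print(robotic)
print(human_like)

The point of a sketch like this is simply that "human-like" behavior can be expressed as adjustable parameters layered on top of the rules of the road, rather than by bending those rules.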

General Motors (GM) has also been taking steps toward autonomous driving recently, with the addition of lidar startup Strobe to Cruise Automation, a self-driving tech company under GM's umbrella. Cruise Automation CEO Kyle Vogt says the autonomous versions of the Chevrolet Bolt are “designed to emulate human driving behavior but with the human mistakes omitted.” How the company plans to emulate that behavior is not yet clear.

Some argue that self-driving technology will only work once every car on the road is autonomous, and that the real danger arises when robots and humans share the road. Others counter that these low-speed accidents are the fault of the humans, not the robots: the robots are simply following the law, the way humans are supposed to.

News Source: The Drive