The News Wheel

Lessons From Tesla Autopilot Crashes Reveal Weaknesses for Whole Industry


Self-driving vehicle technology has come under heavy scrutiny following the fatal self-driving Uber crash, but suspicion had been building even before that, thanks to several Tesla crashes in which cars with self-driving mode engaged ran into stationary objects like traffic barriers. As more details emerge on those crashes, the primary lessons for the industry are becoming clear.

As it turns out, highway driver-assist technology is often programmed to ignore non-moving objects.

This odd-sounding quirk is a legacy of how adaptive cruise control systems were developed. Most of the early systems used radar to track the car ahead and maintain a safe following distance.

However, radar doesn’t produce a very clear image of what it sees, so the car gets only a rudimentary picture of its surroundings and has trouble distinguishing what is on the road, like cars, from what is beside it, like concrete barriers. What radar does measure well is how fast something is moving, so to make the system work, engineers simply programmed the car to ignore objects that aren’t moving, figuring that drivers would be watching the road to prevent collisions anyway. The result: the car keeps a safe distance from other moving cars, and nothing more.
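The filtering described above can be sketched in a few lines of code. This is purely illustrative, not any manufacturer's actual logic, and all the names and numbers here (the function, the 2 m/s cutoff, the sample returns) are assumptions for the sake of the example:

```python
# Illustrative sketch of how a simple radar-based adaptive cruise control
# might filter its list of radar returns. A return is reduced to a range
# and a relative speed; the object's absolute speed is the host car's
# speed plus that relative speed (negative when the object is closing).

def select_targets(radar_returns, host_speed_mps, min_speed_mps=2.0):
    """Keep only returns that appear to be moving vehicles.

    radar_returns: list of (range_m, relative_speed_mps) tuples.
    Anything whose absolute speed is near zero -- a barrier, a sign,
    a stopped car -- gets discarded, which is exactly the behavior
    described in the article.
    """
    targets = []
    for range_m, rel_speed in radar_returns:
        absolute_speed = host_speed_mps + rel_speed
        if abs(absolute_speed) >= min_speed_mps:
            targets.append((range_m, absolute_speed))
    return targets

# Host car traveling at 30 m/s (about 67 mph):
returns = [
    (80.0, -5.0),   # car ahead doing 25 m/s: kept, followed at a distance
    (60.0, -30.0),  # stationary barrier, closing at full host speed: dropped
]
print(select_targets(returns, host_speed_mps=30.0))
```

Note that the stationary barrier is the *fastest-closing* object in the list, yet it is the one the system throws away, which is why these crashes seem so counterintuitive.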

These systems were later improved with the addition of lane-keeping assistance, but according to experts, the two technologies are not truly integrated: they often use different sensors (lane-keep assist typically relies on cameras) and come from different suppliers. As a result, they rarely share data, and so they aren't used together for any sort of path planning.

So, the next question is, “What about automatic braking systems?” As it turns out, these systems are often programmed to work only at low speeds, mostly to avoid accidentally causing a crash.

At low speeds, slamming on the brakes still likely gives any following vehicles enough time to brake as well. At highway speeds, however, the car needs to start braking much earlier to avoid a collision, which makes it far more likely that the system will misinterpret the situation (for example, mistaking an object near the road for one in the road) and cause a crash of its own by both startling the driver and surprising following traffic.

Finally, automatic braking systems are likewise not integrated with the two driver-assist systems above.

So, as the whole automotive industry moves closer and closer to self-driving cars and introduces more highway driver-assist suites like GM’s Super Cruise or Nissan’s ProPILOT Assist, perhaps it can take note: keeping these systems as distinct, non-integrated entities could cause fatal crashes in a moment of driver distraction, and studies have shown that such moments are very, very likely.

News Source: Ars Technica