Can we trust Tesla Autopilot? Head-to-Head

Tesla’s Autopilot system has ushered science fiction into reality, but how much should we trust driverless cars?

On May 7th, 2016, Joshua Brown was killed when his Tesla Model S, driving in Autopilot mode, failed to react to a tractor-trailer crossing the road ahead of it. An investigation into the cause of the accident is currently underway.

In this week's Head-to-Head, Andy Vandervell (@AndyVan) and Sean Keach (@SeanKeach) discuss whether it’s safe to trust autopilot systems and driverless cars.

ANDY SAYS…

I have serious concerns about Tesla's Autopilot system.

While the ultimate cause of, and fault for, this accident are still under investigation, it seems ridiculous to me that a system dubbed a 'beta' should be allowed on public roads.

Tesla claims its cars have travelled over 130 million miles safely in Autopilot mode and, while that is impressive, I have to wonder whether this is a reliable statistic. For example, Autopilot can only be used on 'highways' with marked lanes, but most accidents occur on smaller country roads and in built-up areas. So, while Autopilot appears better than human drivers, it isn't a genuine apples-to-apples comparison.

The name is also misleading. For most people, "Autopilot" suggests full autonomy, but drivers are still required to have their hands on the wheel and be concentrating on the road ahead. For me, it's closer to a "smart cruise control" system and should be described as such.

SEAN SAYS…

The Autopilot software is in a public beta, so Andy is technically correct – it’s not “ready”.

Despite this, Tesla's safety record speaks for itself. Across all vehicles in the US, there is one fatality for every 94 million miles driven. Compare that with the 130 million miles driven safely on Tesla's Autopilot, and you're statistically safer using the driverless system.

It's up to you whether you trust driverless cars; Tesla isn't forcing anyone to use the system. It's disabled by default and has to be actively engaged, which means only drivers who are comfortable testing the software will use it.

While the investigation is ongoing, there remains the possibility that the accident was in fact caused by human error. There are reports the driver was distracted at the time and did not have his hands on the wheel, as required by the system.

Ultimately, the driver is still responsible for the car, so any failure to pay proper attention is human error, just like any other accident.

WHAT DO YOU THINK?

Have Andy and Sean got it all wrong? Would you trust a driverless car?

Related: 5 things you need to know about the Tesla Model 3

Have your say in the debate and let us know your thoughts on autopilots and driverless cars in the comments below or tweet us @trustedreviews.