
Video: Can we trust Tesla Autopilot? Head-to-Head

Tesla’s Autopilot system has ushered science fiction into reality, but how much should we trust driverless cars?

On May 7th 2016, Joshua Brown was killed when his Tesla Model S, driving in Autopilot mode, failed to react to a tractor-trailer turning across the road ahead of it. An investigation is currently underway to establish the cause of the accident.

In this week’s Head-to-Head, Andy Vandervell (@AndyVan) and Sean Keach (@SeanKeach) discuss whether it’s safe to trust autopilot systems and driverless cars.

ANDY SAYS…

I have serious concerns about Tesla’s Autopilot system.

While the ultimate cause and fault of this accident are still under investigation, it seems ridiculous to me that a system dubbed a ‘beta’ should be allowed on public roads.

Tesla claims its cars have travelled over 130 million miles safely in Autopilot mode and, while that is impressive, I have to wonder whether it’s a reliable statistic. For example, Autopilot can only be used on ‘highways’ with marked lanes, but most accidents occur on smaller country roads and in built-up areas. So, while Autopilot’s record appears better than human drivers’, it isn’t a genuine apples-to-apples comparison.

The name is also misleading. To most people, “Autopilot” suggests full autonomy, but drivers are still required to keep their hands on the wheel and concentrate on the road ahead. For me, it’s closer to a “smart cruise control” system and should be described as such.

SEAN SAYS…

The Autopilot software is in a public beta, so Andy is technically correct – it’s not “ready”.

Despite this, Tesla’s safety record speaks for itself. Among all vehicles in the US, there is a fatality every 94 million miles. Compare that to the 130 million miles of safe driving with Tesla’s Autopilot (one fatality in 130 million miles versus one every 94 million) and you’re statistically safer using the system.

It’s up to you whether you trust driverless cars; Tesla isn’t forcing drivers to use Autopilot. The system is disabled by default and you have to actively engage it, which means only drivers who are comfortable testing the software will use it.

While the investigation is ongoing, there remains the possibility that the accident was in fact caused by human error. There are reports the driver was distracted at the time and did not have his hands on the wheel, as required by the system.


Ultimately, the driver is still responsible for the car, so any failure to pay proper attention is human error, just as it would be in any other accident.

WHAT DO YOU THINK?

Have Andy and Sean got it all wrong? Would you trust a driverless car?


Related: 5 things you need to know about the Tesla Model 3

Have your say in the debate and let us know your thoughts on autopilots and driverless cars in the comments below or tweet us @trustedreviews.

