Denseman on the Rattis

Formerly known as the Widmann Blog

transport

Tesla’s model doesn’t work

Testing the Tesla autopilot (self-driving mode).
It’s been revealed that, for the first time, a person has been killed in a crash involving a self-driving car:

The 7 May accident occurred in Williston, Florida, after the driver, Joshua Brown, 40, of Ohio, put his Model S into Tesla’s autopilot mode, which is able to control the car during highway driving.

Against a bright spring sky, the car’s sensor system failed to distinguish a large white 18-wheel truck and trailer crossing the highway, Tesla said. The car attempted to drive full speed under the trailer, “with the bottom of the trailer impacting the windshield of the Model S”, Tesla said in a blog post.

In spite of this, I still believe self-driving cars will take over. However, the accident does highlight one fallacy: the idea that a human driver can be expected to supervise a near-perfect self-driving car.

Just think about it: If your car had been driving perfectly for a whole year, would you find it easy to keep your eyes glued to the road and your hands on the steering wheel, just in case the car’s computer has a nervous breakdown? Wouldn’t you start playing with your smartphone, eating a sandwich or even dozing off for ten minutes?

What this accident shows is that Google’s model (where the car is fully autonomous and the passengers don’t have access to a steering wheel) is correct, and Tesla’s is doomed. If a car is driving on its own, nobody should pretend that a human is ultimately in charge.
