Yesterday a story regarding a possible Tesla Autopilot failure was making the rounds. Here’s one article about it, written by Jason Torchinsky and presented on jalopnik.com:
Here is a news report about the accident, presented on Taiwan’s news (the accident occurred there).
Mr. Torchinsky, in the article linked to above, makes the following statement:
Tesla’s Autopilot system primarily uses cameras for its Autopilot system, and previous wrecks have suggested that situations like this, a light-colored large immobile object on the road on a bright day can be hard for the system to distinguish.
This is very much true and reminds me of a similar incident which happened to a Tesla fanatic whose car, on Autopilot, plowed into a truck towing a trailer. The accident occurred in 2019 and the driver was killed.
If you’ve read my ramblings here, you should know that I own a Tesla Model 3. You should also know that I absolutely love the car to death and, yes, I have used the Autopilot quite a bit, especially when on long highway trips.
I find the Autopilot a great driving aid.
Yet I never let my guard down and am always focused on what’s going on in front of me when using the device.
There are those who criticize Elon Musk and Tesla because they’ve been playing fast and loose with the whole “Autopilot” and “Full Self-Driving” terminology.
The two are very different things and you need to understand what each is.
Full Self-Driving is, so far, a theoretical idea and not a reality. The idea is that you get into your car in your driveway and instruct it to take you to, say, your workplace. The car does everything from that point on: backing out of your driveway, moving from street to street, stopping at traffic lights and stop signs, merging from regular streets onto highways and back again, and dropping you off at your destination.
No, that doesn’t exist yet, though Tesla and several other companies are working hard to make it a reality.
Autopilot, on the other hand, is like an advanced version of Cruise Control and it does exist in Tesla vehicles.
In Autopilot, the car “sees” what is in front of and around it and adjusts the driving for you, slowing down when cars are stopped ahead of you and speeding up to the maximum speed you have set. However, it doesn’t take you to destinations and, once off a highway, it will disengage.
I tend to use Autopilot only in situations where the car is driving mostly “straight.” Yes, Autopilot can take curves (and has done so), but in city driving Autopilot will not take you from your home all the way to your job. In the version I have (there have been advances since then, and I’m still waiting to get my updated central processor), it does not see traffic lights or stop signs.
Autopilot is an aid to be used for mostly straight, highway-style driving and nothing more.
The accident presented above and the fatal accident from 2019 both highlight a problem with Tesla’s Autopilot that the company needs to address: the system sometimes seems to get confused when a large stationary object, especially a white one, is directly in front of the car on a straightaway.
The accident in Taiwan and the fatal accident from 2019 share the same general elements: a Model 3 moving down a straight highway or road with a large white object, be it a trailer or an overturned truck, ahead of it. In both cases, Autopilot did not detect that the object was there.
Having said that…
It appears these two accidents are the only ones thus far that have happened under these circumstances. The loss of life in the earlier accident is a tragedy, but some 90 people die in automotive accidents every day.
The fact that we have two accidents (one non-fatal) involving this Autopilot technology in a span of some two years (2019 and 2020, thus far) indicates this is a situation that occurs incredibly infrequently.
Still, Tesla should get on top of this situation, infrequent as it may be.