Tesla Full Self-Driving is getting more dangerous as it gets better

I just got back from driving about 200 miles (320 km) with Tesla’s (supervised) Full Self-Driving, and the system is getting better, but it’s also getting more dangerous as it improves.

The risk of complacency is terrifying.

Last weekend, I went on a road trip covering about 200 miles from Shawinigan to Quebec City and back, and I used Tesla’s (supervised) Full Self-Driving (FSD), v12.5.4.1 to be exact, for almost the entire trip.

Here’s the good and the bad, and the fact that the former merges into the latter.

The good

The system increasingly feels natural. The way it handles merging, lane changes, and intersections feels less robotic and more like a human driver.

The new camera-based driver monitoring system is a massive upgrade from the steering torque sensor that Tesla has used for years. I had only one problem with it: it kept warning me to pay attention to the road even though I was doing just that, and it eventually shut down FSD for the rest of the drive because of it.

But this has happened only once in the few weeks I’ve been using the latest update.

For the first time, I can get through good chunks of city driving without any intervention or disengagement. It’s still far from perfect, but there is a marked improvement.

It stopped to let pedestrians cross the street, it handled roundabouts pretty well, and it drives at more natural speeds on highways (most of the time).

The system is becoming good to the point that it can induce some dangerous complacency. More on that later.

As I’ve been saying for years, if Tesla had developed this technology in a vacuum and hadn’t sold it to the public as being on the verge of unsupervised self-driving, most people would be impressed.

The bad

During those ~200 miles, I had five disengagements, including a few that were getting really dangerous. It was apparently about to run a red light once, and a stop sign a second time.

I say apparently because it’s sometimes hard to tell, since FSD now often approaches intersections with stop signs and red lights more aggressively.

It used to drive closer to how I’ve always driven my electric cars: decelerating slowly with regenerative braking when approaching a stop. But this latest FSD update often carries a higher speed into these intersections and brakes harder, often using the mechanical brakes.

It’s a strange behavior that I don’t like, but I was at least starting to get a feel for it, which makes me fairly convinced that FSD would have blown through the red light and the stop sign on those two occasions.

Another disengagement appeared to be due to sunlight hitting the front cameras. I get more of that at this time of year, since I drive more often during sunset, which comes earlier in the day.

That seems to be a real problem with Tesla’s current FSD configuration.

On top of the disengagements, I had an unmanageable number of interventions. An intervention is when the driver has to input a command, but it’s not enough to disengage FSD. These were mainly because I kept activating my turn signal to tell the system to move back into the right lane after passing.

FSD only moves back into the right lane after passing if a car comes up close behind you in the left lane.

I shared this finding on X and was disappointed with the responses I received. I suspect this might be because American drivers make up a significant part of the training data, and, no offense, as this is a problem everywhere, but American drivers tend on average not to respect the guidelines (and, in some places, the law) that the left lane is for passing only.

I feel this could be an easy fix or at least an option to add to the system for those who want to be good drivers even when FSD is active.

I also had an intervention where I had to press the accelerator to tell FSD to turn left on a flashing green light, which it hesitated to do while I was holding up traffic behind me.

Electrek’s Take

The scariest part for me is how good FSD is getting. If I took someone with no FSD experience on a short 10-15 mile drive, there’s a good chance I wouldn’t need a single intervention, and they would come away really impressed.

It’s the same with a regular Tesla driver who consistently gets good FSD experiences.

This can breed complacency in drivers and result in less attention being paid to the road.

Fortunately, the new driver monitoring system can greatly help with that, as it tracks the driver’s attention, unlike Tesla’s previous system. However, it only takes a second of inattention to get into an accident, and the system still allows you that second of inattention.

Also, the system has become so good at handling intersections that even when you are paying attention, you can end up blowing through a red light or stop sign, as I mentioned above. Feeling confident that FSD will stop, you let it keep going even when it doesn’t start braking as early as you’d like, and before you know it, it isn’t braking at all.

There’s a four-way stop near my place on Montreal’s South Shore that I’ve driven through many times with FSD without issue, and yet FSD v12.5.4 was apparently about to blow right through it the other day.

Again, it’s possible it was just slowing down late, but it was far too late for me to feel comfortable.

Although it has been getting better at a more noticeable pace lately, the crowdsourced data, which is the only data available since Tesla refuses to release any, indicates that FSD is still years away from being capable of unsupervised self-driving:

Tesla would need roughly a 1,000x improvement in miles between disengagements.
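For a rough sense of what that 1,000x figure implies, here is a minimal back-of-the-envelope sketch. The current-miles number is my own illustrative assumption based on the rough ballpark of the crowdsourced tracking, not a figure from Tesla or from this article:

```python
# Back-of-the-envelope math for the ~1,000x figure cited above.
# Both inputs are illustrative assumptions, not official Tesla data.
current_miles_between_critical = 200   # assumed: rough crowdsourced ballpark
required_improvement = 1_000           # the ~1,000x improvement cited above

target_miles = current_miles_between_critical * required_improvement
print(f"Implied target: ~{target_miles:,} miles between critical disengagements")
# Prints: Implied target: ~200,000 miles between critical disengagements
```

Under those assumed inputs, the system would need to go from hundreds of miles between critical disengagements to hundreds of thousands, which is why incremental version-to-version gains don’t close the gap anytime soon.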

I’ve lost a lot of faith that Tesla will get there because of things like the company’s recent claim that it accomplished its September goals for FSD, which included a “3x improvement in miles between critical disengagement,” with no evidence that this actually happened.

In fact, the crowdsourced data shows a regression on that front between v12.3 and v12.5.

I fear that Elon Musk’s attitude and repeated assertions that FSD is incredible, combined with the fact that it is actually getting better and that his minions are raving about it, could lead to dangerous complacency.

Let’s be honest. FSD accidents are inevitable, but I think Tesla could do more to reduce the risk – mainly by being more realistic about what it’s accomplishing here.

It’s developing a really impressive vision-based ADAS system, but it’s nowhere near being unsupervised self-driving.
