It’s Reasonable To Conclude Musk Knew About Autopilot ‘Failing To Detect Cross Traffic’ Says Florida Judge


Tesla’s boldly named Autopilot system has become a controversial feature thanks to a number of fatal incidents in which the driver assist has been implicated. The matter has landed the automaker in court, where Tesla is alleged to have allowed the system to be used despite significant flaws in its performance. Now, a judge has said there is reasonable evidence the automaker was well aware of those problems before certain tragic incidents.

As covered by Reuters, the case concerns a fatal 2019 crash involving owner Stephen Banner. Banner was killed when his Tesla Model 3, operating under Autopilot, drove under the trailer of an 18-wheeler semi-truck that had pulled onto the road in front of the vehicle, shearing the roof off the car.

Banner’s wife brought a lawsuit over the crash, alleging that Autopilot failed to take any evasive action to avoid the accident. In response, Tesla denied liability, claiming Autopilot to be safe and fit for purpose when properly supervised. In documents submitted to the court, it noted the requirement that drivers must pay attention while driving and keep their hands on the wheel. Notably, though, Banner’s attorneys were able to depose Tesla executives, securing internal documents that they believed indicated Musk and Tesla were aware of Autopilot’s limitations before the crash.

Tesla would go on to refer to later systems as “Full Self Driving,” though the company has thus far only delivered Level 2 driver assist systems that are incapable of fully autonomous operation.

Now, it appears that Judge Reid Scott may agree. Presiding over the Circuit Court in Palm Beach County, Scott ruled that the case could proceed to trial, with the plaintiff able to seek punitive damages against Tesla for gross negligence and intentional misconduct. Scott noted the case was “eerily similar” to a previous fatal crash that claimed the life of Joshua Brown, whose Tesla Model S crashed into a tractor-trailer while under Autopilot control. Scott’s overall opinion is, thus far, quite damning. “It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic,” Scott said.

Tesla’s approach to marketing and promoting Autopilot is at the root of the issue. Judge Scott found that the automaker “engaged in a marketing strategy that painted the products as autonomous,” something readily reflected in how many owners treated the system. Scott laid the blame for this to a degree at Musk’s door, noting that his public statements “had a significant effect on the belief about the capabilities of the products.” Scott also cited a 2016 promotional video that claimed the Tesla vehicle involved was driving itself.

It’s early days, with the case still to proceed to trial; it was originally supposed to begin in October but has not yet been properly rescheduled. If Tesla is found to have been negligent in its promotion and delivery of an unsafe Autopilot system, it could be liable for damages not just in this case, but in all manner of other crashes in recent years.

However, Tesla won’t be playing softball when the case moves ahead. It will be coming off a recent victory in California, where a jury found 9-3 that the company was not responsible for the death of Micah Lee in 2019. Lee died after his Model 3 veered off the road into a tree, reportedly while under Autopilot control, but the jury sided with Tesla after the company’s lawyers argued the crash was down to human error. That verdict followed a similar win in a case involving a non-fatal crash heard in California earlier this year.

If Tesla is found to be at fault, this case would buck that trend, which could be a boon to other individuals seeking restitution of their own over incidents involving potential Autopilot failures. Banner’s attorneys will have their work cut out for them as they battle what is likely a very seasoned legal team that has fought these cases before. Drivers, owners, and the auto industry at large will be watching this case as a bellwether for whether an automaker will be held responsible for the actions of its cars under so-called “self driving” control.

Image credits: Tesla, Tesla via YouTube screenshot


18 thoughts on “It’s Reasonable To Conclude Musk Knew About Autopilot ‘Failing To Detect Cross Traffic’ Says Florida Judge”

  1. Tesla has always been clear in their owner’s manual and website that FSD requires the driver to be touching the steering wheel and paying full attention to the road.

    Unfortunately, a certain number of people are flaming idiots. Tractor-trailer rigs are kinda hard to miss, and they certainly aren’t moving fast when crossing in front of you.

    I would expect this case will not result in any liability for Tesla.

    If it were to go the other way, I suppose yet another Pyrrhic victory for idiots everywhere.

    1. Yet there’s no sensor ensuring the driver actually IS touching the steering wheel – much less in the driver’s seat of the car – while misleadingly-named “Full Self Driving” is in operation.

      Unlike other manufacturers, which have more advanced systems and require more driver interaction while Super Cruise or whatever is in play.

      Heck – Mercedes-Benz now even makes owners watch a training video in the car before they’re allowed to enable their new Level 3 systems. Who at Tesla is ensuring drivers are reading their owner’s manuals and understanding their Level 2 limitations?

      1. I own a Tesla, and yes, there is a sensor to ensure you are touching the wheel when Autopilot is engaged. It gives you a screen warning.

        If you don’t respond with pressure on the wheel, it turns off Autopilot for the rest of your trip.

        1. Since when was that added?
          Because from the videos of people out of their driver’s seat and sleeping in their cars while rolling down the road, it would seem not to be the case.

          1. My car was built in August 2023. I should also clarify there are 3 levels of “AutoPilot” as Tesla calls it.

            My car is currently using the base Autopilot, which would be considered Level 2 autonomous driving, meaning it essentially controls lane position, speed, and distance from the car in front of you without driver intervention.

            It most definitely requires you to apply some level of touch on the steering wheel or the nannies scold you immediately.

    2. The manual says that, but Musk used to say that it worked as he claimed and they only had to keep a driver in control for legal reasons. So the messaging is certainly mixed.

      Of course, his attorneys will claim that was hyperbole and that reasonable people would heed safety warnings. And there’s some truth to that. If you trust Musk over the folks who wrote the manual, that’s a mistake.

    3. Yeah, also wtf is it when these darn emergency vehicles are in the middle of the road blocking the path of the vehicles of Lord and Savior Musk? How dare they be there when his great creation plows into stopped traffic.

  2. At least until these systems are actually and reliably self-driving, there need to be established rules on what these things can be called, common to every automaker and based on the level of actual assistance they provide.

  3. Cross traffic detection is a weakness of almost every forward collision mitigation system. Of course Tesla is aware of the weakness. The big question is, was the limitation properly disclosed to the owner?

    I owned a Subaru with EyeSight and a Grand Cherokee with a radar-based system. Both manuals made me aware of the cross traffic limitation.

  4. I kind of get why Highland is only released outside the US.
    I would also avoid selling any product in the US that potentially could cause bodily harm: {Scissors, toothpicks, cars, cotton swabs, etc}

    1. Yes, it’s because of competition. Lots of new EVs (including from China) are showing up in the rest of the world. The US EV market is a little mellower, so upgrades can wait.

  5. Oh, come on, Musk can’t be bothered with every little detail, especially when he’s been so preoccupied lately dealing with that annoying British secret agent who keeps snooping around his launch facilities.

    1. He tried to put his two go-to guys on it but the first one couldn’t get his favorite hat through TSA and the second quit when he was disqualified from the company dental plan. Can’t find good help these days.
