Dear Reader,
Recently, I came across an article I wanted to share with you. It highlights some of the risks we’re seeing with self-driving cars. The story is about a driver who was sitting in her parked car at a mall in Maryland when a Tesla, with no one behind the wheel, crashed into her. Yes, you read that right: no one behind the wheel.
The Tesla owner was trying out a feature called Summon for the first time. It’s supposed to let the car drive itself to you in a parking lot, guided by your phone’s GPS. But things went wrong. The Tesla scraped the back of her car and kept moving even after the impact. Can you imagine? The woman said she couldn’t believe her eyes when she saw this car coming toward her with no one inside. As for the Tesla driver, he was just as shocked and said, “It’s not supposed to do this!”
The feature he was using is part of an update called Smart Summon, which allows the car to steer and navigate on its own from up to 200 feet away. The technology is impressive in theory, but it’s still in beta, meaning it’s not fully developed or thoroughly tested yet. Tesla has put it out there for drivers to try, but it’s not quite ready for everyday use. That’s why we’re seeing issues like this crash. Even though the feature is supposed to help, it’s clearly not foolproof.
The article goes on to talk about how the I-Team tested this feature with a Tesla at the Insurance Institute for Highway Safety. During their testing, sometimes the car worked perfectly—it stopped for a simulated pedestrian and drove around obstacles. But other times, it wasn’t so smooth. One time, the Tesla started veering toward another car, and another time, it actually hit the curb. It just goes to show that while this technology is exciting, it’s not fully reliable yet.
What’s concerning is that Tesla is letting regular drivers test these features in real-world settings. Experts say that Tesla and other companies are releasing these beta features too early, before they’ve been fully proven safe. Michael Brooks from the Center for Auto Safety even said that this kind of release is dangerous because the technology hasn’t been thoroughly validated yet. The National Highway Traffic Safety Administration (NHTSA) is keeping an eye on this, and it tracks crashes involving vehicles with these types of advanced driver-assistance systems. Since 2023, more than 900 crashes have been reported, and Tesla accounts for the bulk of those reports.
The insurance industry is also scratching its head over this. Think about it—if there’s no one behind the wheel, who’s responsible when a crash like this happens? This situation in Maryland was the first of its kind for both the Tesla driver’s insurance agent and the woman’s insurance agent. We’re in uncharted territory when it comes to how insurance handles accidents involving self-driving cars.
At the end of the day, every expert agrees that as advanced as these cars might be, there’s no replacement for having an engaged, attentive driver behind the wheel. Technology can only go so far, and right now, it’s clear that even features like Smart Summon aren’t quite ready to take over completely. Drivers still need to be fully in control and ready to step in when things go wrong—because, as we’re seeing, things can go wrong.
I’m sharing this with you because I think it’s important for you to know the risks that come with self-driving features. We’re in a world where technology is moving fast, but that doesn’t mean it’s always safe, especially when companies are rolling out features before they’re truly ready. If you or someone you care about is ever involved in an accident like this—where technology fails and causes harm—give me a call. I’m here to help and make sure you get the justice you deserve when someone else’s negligence or a faulty product puts you at risk.
Until next time, please be safe, keep your hands on the wheel, and even if you’re in a self-driving car, never text while driving.
Attorney Paul Samakow
703-761-4343 | 301-949-1515