Tesla driver charged with manslaughter in Autopilot crash

It's the first time Autopilot has been involved in a felony case. Not that Tesla will care.

Photo: A damaged Tesla, right, and a Honda Civic sit on tow trucks after a collision in Seal Beach, CA. (MediaNews Group/Orange County Register via Getty Images)

A Tesla driver who ran a red light while using Autopilot, hitting a car and killing two passengers, is being charged with two counts of vehicular manslaughter in Los Angeles County. Kevin George Aziz Riad, the defendant in the case, has pleaded not guilty to both counts and is currently free on bail.

Riad exited the freeway at high speed, ran a red light, and crashed into a Honda Civic. Riad and a passenger in the Tesla were hospitalized with non-life-threatening injuries; the two people in the Civic died at the scene. The National Highway Traffic Safety Administration (NHTSA) has confirmed that Autopilot was enabled when the crash occurred.

The 27-year-old is, as far as we know, the first person to face felony charges for a crash involving Autopilot. Whether the courts ultimately hold him responsible for the two deaths will likely set an important precedent for future cases involving driver-assist software. And don’t even get us started on Tesla’s silence on the matter.

A known problem — Though this is the first felony charge connected to Autopilot, it’s certainly not the first Autopilot-related crash we’ve seen. There have been so many at this point that the NHTSA has opened a formal investigation into Autopilot’s overall safety. Tesla is legally required to report any crashes involving its semi-autonomous or self-driving technology.

Autopilot’s general inability to function is an open secret. Take, for example, the fact that it regularly has trouble distinguishing between a traffic light and the face of the moon. Letting software with such blatant limitations take a hand in your driving probably isn’t going to end well.

Only the beginning — Confronted with the real-world dangers of their own software, some companies would take a step back and re-evaluate how best to mitigate those risks. Not Tesla. The company won’t even acknowledge Autopilot’s shortcomings, never mind assume culpability for them.

Tesla has instead decided to barrel forward with its Full Self-Driving software — before even the partial self-driving software is actually safe to use. Tesla owners are out on public roads using Full Self-Driving right now, despite its known penchant for putting people directly in harm’s way.

Tesla, it seems, is dead-set on providing “features” that really just make its vehicles more dangerous. Elon Musk must get off on it or something; why else would you allow your company to create software that explicitly carries out illegal driving maneuvers?

Riad’s case would be a great reason for Tesla to reconsider its priorities: Autopilot has now been implicated in a felony case. But at this point, we know better than to hold out hope that Tesla will change.