As Tesla driver's case ends, Autopilot's ethical questions persist

Tesla says its cars are not autonomous, but critics say the electric vehicle maker continues a misleading marketing campaign implying that vehicles using Autopilot can drive themselves.

US safety regulators are probing Tesla's partially automated driving systems in at least 35 crashes and 17 deaths nationwide since 2016. Photo: AP Archive

When a criminal prosecution against a Tesla driver in Los Angeles County ends, it will mark the close of what is believed to be the first US case in which prosecutors brought felony charges against a motorist who was using a partially automated driving system.

But the conclusion of driver Kevin Aziz Riad's case offers little solace to Lorena Ochoa, whose spouse was one of two people killed in the 2019 crash in a Los Angeles suburb. She believes both Tesla and Aziz Riad, who was sentenced to probation, should face harsher consequences.

Aziz Riad faces a restitution hearing on Tuesday, where a judge will determine how much money he owes the families of Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez. Aziz Riad was using Autopilot, and the case has raised legal and ethical questions about the technology, particularly as Tesla sales grow and more automakers equip cars with similar systems.

The victims' families have separately filed civil lawsuits against Aziz Riad and Tesla that are ongoing.

Tesla says on its website that its cars require human supervision and are not autonomous, but critics say the electric vehicle maker continues a misleading marketing campaign implying that vehicles using Autopilot can drive themselves.

“They make cars that they know cause accidents, and they don’t care,” said Ochoa, Alcazar Lopez’s spouse, in an interview in Spanish last week. “Families are broken, lives are lost and they don’t care.”

Authorities say Aziz Riad, a limousine service driver, was at the wheel of a Tesla Model S that was moving at 119 kph (74 mph) when it left a freeway and ran a red light on a local street in Gardena, California, on Dec. 29, 2019. The Tesla struck a Honda Civic at an intersection, and Alcazar Lopez and Nieves-Lopez died at the scene.

Tesla says Autopilot technology can keep a car in its lane, maintain some distance from vehicles ahead and make lane changes. But Autopilot has had trouble stopping for emergency vehicles parked on roads, and it is also under investigation by the National Highway Traffic Safety Administration for braking without driver input.

US safety regulators are probing Tesla’s partially automated driving systems in at least 35 crashes and 17 deaths nationwide since 2016. The automaker did not respond to requests for comment.

Aziz Riad, the Tesla driver, pleaded no contest to two counts of vehicular manslaughter with gross negligence. Though he faced more than seven years behind bars, a judge sentenced him to probation in June.

Bryant Walker Smith, a University of South Carolina law professor who follows automated vehicles, says the law has to balance two arguments that are both correct. One is that people should be held accountable when they fail to control a two-ton vehicle. The other is that, in Aziz Riad's case, there is no evidence he intended to kill anyone.


Can Tesla be made safer?

The question of civil liability is even more complex. Is Aziz Riad responsible for the deaths, since he was behind the wheel, or is Tesla?

It's possible to argue that Tesla engineers should know people will become over-reliant on driver-assist systems and place too much trust in them, Walker Smith said.

For years, he and others have said Tesla can do more to make its technology safer. Their suggestions include limiting Autopilot use to freeways, as well as upgrading a driver-monitoring system that currently allows drivers to “check out” while behind the wheel. Walker Smith also wants Tesla's technology to shut down faster if it determines drivers are not watching the road.

Similar technology from Ford and General Motors, for instance, monitors drivers with infrared cameras to make sure they're paying attention. If drivers aren't, the systems warn them and will turn off. Both automakers also confine their systems mostly to limited-access freeways and turn them off on city streets, which are more complex and present more dangers.
