
A Tesla driver in California has been charged with manslaughter in what appears to be the first fatal crash involving Autopilot to result in felony charges.

TOM KRISHER and STEFANIE DAZIO | Associated Press

DETROIT — California prosecutors have filed two counts of manslaughter against a Tesla driver whose car, operating on Autopilot, ran a red light, slammed into another car and killed two people in 2019.

The defendant appears to be the first person in the United States to be charged with a felony for a fatal crash involving a motorist who was using a partially automated driving system. The Los Angeles County District Attorney’s Office filed the charges in October, but they came to light only last week.

The driver, 27-year-old Kevin George Aziz Riad, has pleaded not guilty. Riad, a limousine service driver, is free on bail while the case is pending.

Misuse of Autopilot, which can control steering, speed and braking, has occurred on numerous occasions and is the subject of investigations by two federal agencies. The filing of charges in the California crash could serve as a warning to drivers who use systems like Autopilot that they cannot rely on them to operate their vehicles.

The criminal charges are not the first to involve an automated driving system, but they are the first to involve widely used driver technology. In 2020, Arizona authorities charged a driver that Uber had hired to take part in testing a fully autonomous vehicle on public roads with negligent homicide. The Uber vehicle, an SUV with a human backup driver on board, struck and killed a pedestrian.

By contrast, Autopilot and other driver assistance systems are widely used on roads around the world. An estimated 765,000 Tesla vehicles are equipped with Autopilot in the United States alone.

Police said the Tesla Model S was moving at a high rate of speed when it left a freeway and ran a red light in the Los Angeles suburb of Gardena before striking a Honda Civic at an intersection on December 29, 2019. Two people who were in the Civic, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez, died at the scene. Riad and a woman in the Tesla were hospitalized with non-life-threatening injuries.

The prosecution documents do not mention Autopilot. But the National Highway Traffic Safety Administration, which sent investigators to the scene of the crash, confirmed last week that Autopilot was in use in the Tesla at the time of the crash.

Riad’s attorney did not respond to requests for comment last week, and the Los Angeles County District Attorney’s Office declined to discuss the case. A preliminary hearing in Riad’s case is scheduled for February 23.

The NHTSA and the National Transportation Safety Board are examining widespread misuse of Autopilot by drivers, whose overconfidence and inattention have been blamed for multiple crashes, including fatal ones. In one crash report, the NTSB referred to such misuse as “automation complacency.”

The agency said that in a 2018 crash in Culver City, California, in which a Tesla struck a fire truck, the design of the Autopilot system had “permitted the driver to disengage from the driving task.” No one was hurt in that crash.

Last May, a California man was arrested after officers noticed his Tesla traveling down a freeway with the man in the back seat and no one behind the wheel.

Teslas using Autopilot have also crashed into highway barriers and into tractor-trailers crossing roadways. Since 2016, the NHTSA has sent teams to investigate 26 crashes involving Autopilot in which at least 11 people were killed.

Messages were left seeking comment from Tesla, which has disbanded its media relations department. Since the Autopilot crashes began, Tesla has updated the software to try to make it harder for drivers to abuse it. The company has also tried to improve Autopilot’s ability to detect emergency vehicles.

Tesla has said that Autopilot and a more sophisticated “Full Self-Driving” system cannot drive vehicles themselves, and that drivers must pay attention and be ready to react at all times. “Full Self-Driving” is being tested by hundreds of Tesla owners on public roads in the US.

Bryant Walker Smith, a University of South Carolina law professor who studies automated vehicles, said this is the first U.S. case he is aware of in which serious criminal charges have been filed over a fatality involving a partially automated driver assistance system. Tesla could be “criminally, civilly or morally culpable” if it is found to have put a dangerous technology on the road, he said.

Donald Slavik, a Colorado-based lawyer who has served as a consultant in automotive technology lawsuits, including against Tesla, said he, too, is unaware of any previous felony charges filed against a U.S. driver who was using partially automated driving technology involved in a fatal crash.

The Lopez and Nieves-Lopez families have sued Tesla and Riad in separate lawsuits. They allege negligence by Riad and accuse Tesla of selling defective vehicles that can accelerate suddenly and lack an effective automatic emergency braking system. A joint trial is scheduled for mid-2023.

The Lopez family alleges in court documents that the car “suddenly and unintentionally accelerated to an excessive, unsafe and uncontrollable speed.” The Nieves-Lopez family further asserts that Riad was an unsafe driver, with multiple moving violations on his record, and could not handle the high-performance Tesla.

Separately, the NHTSA is investigating a dozen crashes in which Teslas on Autopilot struck parked emergency vehicles. In the crashes under investigation, at least 17 people were injured and one was killed.
