Tesla faces fresh safety probe following fatal accident

Regulators at the National Highway Traffic Safety Administration (NHTSA) are opening a probe into a fatal crash involving a Tesla Model Y. In the accident, which occurred on July 19 in Virginia, the Tesla struck a tractor-trailer, fatally injuring the car's driver. Regulators believe the 57-year-old Tesla driver was relying on the company's advanced driver assistance systems at the time of the accident, according to a report by Reuters.

The Fauquier County Sheriff's Office provided more details on the accident, saying that the tractor-trailer was attempting to turn onto a highway from a truck stop when the Tesla struck its side and slid underneath the trailer. The Tesla driver was pronounced dead at the scene. As for the truck driver, authorities issued a summons for reckless driving.

The summons indicates that authorities blame the truck's driver for the incident, but Tesla's driver assistance software is supposed to account for mistakes made by other people on the road, hence the NHTSA investigation. To date, the safety regulator has opened more than three dozen investigations into crashes involving Tesla vehicles and their advanced driver assistance systems. All told, the agency suspects the software has been involved in 23 deaths since 2016.

In 2021, the National Transportation Safety Board (NTSB) urged the NHTSA to issue stricter regulations for autonomous driving, stating in its letter that “Tesla is testing on public roads a highly automated AV technology but with limited oversight or reporting requirements.”

Tesla's proprietary Autopilot technology is intended to steer, accelerate and brake within the vehicle's lane, while an enhanced system assists with changing lanes on highways. Tesla says the system isn't truly automated and requires active human supervision. The company hasn't responded to Reuters' request for comment regarding this latest accident and the newly opened probe.

Uber safety driver involved in fatal self-driving car crash pleads guilty

The Uber safety driver behind the wheel during the first known fatal self-driving car crash involving a pedestrian has pleaded guilty to, and been sentenced for, an endangerment charge. Rafaela Vasquez will serve three years of probation for her role in the 2018 collision in Tempe, Arizona that killed Elaine Herzberg as she jaywalked at night. The sentence matches what prosecutors sought and is stiffer than the six months the defense team requested.

The prosecution maintained that Vasquez was ultimately responsible. While an autonomous car was involved, Vasquez was supposed to concentrate on the road and take over if necessary. The modified Volvo XC90 in the crash was operating at Level 3 autonomy and could be driven hands-free in limited conditions, but it required the driver to take over at a moment's notice. The vehicle detected Herzberg, but didn't respond to her presence.

The defense case hinged on placing part of the blame on Uber. Executives at the company thought it was just a matter of time before a crash occurred, according to reportedly leaked conversations. The National Transportation Safety Board's (NTSB) collision findings also noted that Uber had disabled the XC90's emergency braking system, so the vehicle couldn't come to an abrupt stop.

Tempe police maintained that Vasquez had been watching a show on Hulu and wasn't paying attention in the moments leading up to the crash. Defense attorneys insisted that Vasquez was paying attention and had only been momentarily distracted.

The plea and sentencing could influence how other courts handle similar cases. There's long been a question of liability surrounding mostly driverless cars: is the human responsible for a crash, or is the manufacturer at fault? This outcome suggests humans will still face penalties if they're able to take control, even if the punishment isn't as stiff as it would be in conventional situations.

Fatal crashes involving autonomy aren't new. Tesla has been at least partly blamed for collisions that occurred while Full Self-Driving was active. The pedestrian case is unique, though, and it looms in the background of more recent Level 4 (fully driverless in limited situations) offerings and tests from Waymo and GM's Cruise. While the technology has evolved since 2018, there are still calls to freeze robotaxi rollouts over fears the machines could pose safety risks.

NTSB: Autopilot was not a factor in fatal Tesla Model S crash

Tesla's Autopilot was not at fault in a 2021 crash in which two people died, according to the National Transportation Safety Board (NTSB). In a final report spotted by Ars Technica, the agency determined that the 2019 Model S accelerated just before hitting a tree in Spring, Texas, just north of Houston. Neither occupant was in the driver's seat when they were found, leading to questions about the use of Autopilot.

Based on information provided by Tesla, the NTSB found (PDF) that the car's rapid acceleration from 39MPH to 67MPH in the two seconds before the crash, and the resulting loss of control of the EV, were likely due to "impairment from alcohol intoxication in combination with the effects of two sedating antihistamines, resulting in a roadway departure, tree impact and post-crash fire." The NTSB says data indicated that Autopilot had not been used "at any time during this ownership period of the vehicle." Investigators did not find any "evidence of mechanical deficiencies" that could have caused or contributed to the crash.
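
For a sense of scale, going from 39MPH to 67MPH in roughly two seconds works out to about 0.64 g of sustained acceleration, which is very hard acceleration by everyday driving standards. Here's the back-of-the-envelope arithmetic (our own illustration, not a figure from the NTSB report):

# Back-of-the-envelope check of the acceleration the NTSB describes.
# The conversion factors are standard; the framing is ours, not the report's.
MPH_TO_MS = 0.44704  # metres per second in one mile per hour
G = 9.81             # standard gravity, m/s^2

delta_v = (67 - 39) * MPH_TO_MS  # speed gained: about 12.5 m/s
accel = delta_v / 2.0            # over two seconds: about 6.3 m/s^2

print(f"{accel:.1f} m/s^2, or {accel / G:.2f} g")  # 6.3 m/s^2, or 0.64 g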

One of the occupants was found in the front passenger seat, while the other was in the rear. Investigators presume the driver ended up in the back seat because he was trying to escape after the crash. Security footage showed that the men were in the front seats as they set off, and data showed that both front seatbelts were buckled at the time of the crash. The car left the road around 550 feet from the driver's home, and the men died as a result of the collision and the post-crash battery fire.

Jury finds Tesla just ‘1%’ responsible for a Florida teen’s crash

Tesla is receiving minimal blame for a fiery 2018 crash in South Florida that killed two teenagers and injured another. A jury today found Tesla just one percent responsible for the crash, reports the AP, which means the company is only responsible for paying $105,000 of the $10.5 million awarded to the teen's family. Ninety percent of the blame was placed on the teen driver, Barrett Riley, while his father, James Riley, received nine percent.

According to an NTSB investigation, Barrett Riley was driving at 116 mph in a 30 mph zone near Fort Lauderdale Beach. The agency concluded he most likely lost control of the vehicle. James Riley initially sued Tesla over the crash, claiming that it would have been survivable if the electric car's lithium-ion batteries hadn't "burst into an uncontrollable and fatal fire." He also noted that the company had removed a speed limiter meant to keep the vehicle under 85 mph. An investigation later found that his son had asked a Tesla dealership to remove that limiter.

Tesla's lawyers argued that Riley's parents were negligent in allowing him to drive the car despite his record of reckless driving and speeding, and they denied negligence on the company's part. After the 2018 crash, Tesla released an update that lets owners set their own speed limits, a feature it dedicated to Barrett Riley.

NHTSA deepens its probe into Tesla collisions with stationary emergency vehicles

The National Highway Traffic Safety Administration (NHTSA) has upgraded (PDF) its investigation into a series of Tesla crashes involving first responders to an engineering analysis. As The Washington Post explains, that's the final stage of an investigation, and the agency typically decides within a year whether a vehicle should be recalled or the probe should be closed. In addition to upgrading the probe's status, the investigation now covers 830,000 units, or almost all of the Model Y, Model X, Model S and Model 3 vehicles Tesla has sold since 2014.

This development expands on the investigation the NHTSA initiated back in 2021 following 11 collisions between Tesla vehicles and parked emergency vehicles and trucks. Since then, the agency has identified and added six more incidents that occurred over the past couple of years. In most of those crashes, Autopilot relinquished vehicle control less than one second before impact, though Automatic Emergency Braking intervened in at least half of them.

The NHTSA also found that the first responders on the road would have been visible to the drivers an average of eight seconds before impact. Plus, forensic data showed no driver took evasive action in the two to five seconds before impact, even though they all had their hands on the wheel. Nine of the 11 vehicles originally involved in the investigation exhibited no driver-engagement visual or chime alerts until the last minute before the collision, and four of them exhibited no such alerts at all.

The NHTSA also looked into 191 crashes not limited to incidents involving first responders. In 53 of those collisions, the agency found that the driver was "insufficiently responsive," as evidenced by their failure to intervene when needed. All of this suggests that while drivers are complying with Tesla's instructions to keep their hands on the wheel at all times, they're not necessarily paying attention to their environment.

That said, the NHTSA noted in its report that "a driver's use or misuse of vehicle components, or operation of a vehicle in an unintended manner does not necessarily preclude a system defect." As University of South Carolina law professor Bryant Walker Smith told The Post, monitoring the position of a driver's hands isn't effective enough, because it doesn't ensure a driver's capability to respond to what they encounter on the road. 

In addition, the NHTSA noted that the way a driver may interact with the system is an important design consideration for Level 2 autonomous driving technologies. These systems still aren't fully autonomous and still depend heavily on the human driver, after all. "As such, ensuring the system facilitates the driver's effective performance of this supervisory driving task presents an important safety consideration," the agency wrote.

Tesla Autopilot under investigation following crash that killed three people

A recent Model S crash that killed three people has sparked another federal probe into Tesla's Autopilot system, The Wall Street Journal has reported. The National Highway Traffic Safety Administration (NHTSA) is conducting the investigation and said it's currently looking into more than 30 incidents involving Tesla's Autopilot.

The accident occurred on May 12th on Newport Beach's Mariners Mile strip, according to the Orange County Register. The EV reportedly struck a curb and ran into construction equipment, killing all three occupants. Three construction workers were also sent to the hospital with non-life-threatening injuries. Police declined to say whether Tesla's Autopilot was involved.

Tesla is one of a number of automakers that have released Level 2 driver assistance systems designed to ease driving chores. Those systems are far from full self-driving (Level 4 or 5), though, and Tesla specifically instructs drivers to pay attention to the road and keep their hands on the wheel.

The NHTSA said last August that it was opening an investigation into Autopilot following 11 crashes with parked first responder vehicles since 2018 that resulted in 17 injuries and one death. 

The NHTSA itself has been criticized by the National Transportation Safety Board (NTSB) for not ensuring automakers include the right safety features in their Level 2 autonomous vehicles. NTSB chair Jennifer Homendy has called Tesla's use of the term "Full Self-Driving" for its latest Autopilot system "misleading and irresponsible," saying "it has clearly misled numerous people to misuse and abuse technology." 

Tesla driver in fatal California crash first to face felony charges involving Autopilot

A Tesla owner is facing the first felony charges filed against someone using a partially automated driving system in the US, according to AP. The defendant, Kevin George Aziz Riad, was driving a Model S in 2019 when he ran a red light and crashed into a Honda Civic at a California intersection. The crash killed the Civic's two occupants, while Riad and his companion sustained non-life-threatening injuries. California prosecutors filed two counts of vehicular manslaughter against Riad in October last year.

The court documents reportedly didn't mention anything about Autopilot. However, the National Highway Traffic Safety Administration (NHTSA), which has been investigating the incident over the past couple of years, recently confirmed that the system was switched on at the time of the crash. The NHTSA formally opened a probe into Tesla's driver assistance system in August last year following a string of 11 crashes involving parked first responder vehicles that resulted in 17 injuries and one death. It's also investigating other types of crashes involving Tesla vehicles, including one complaint blaming the beta version of the company's Full Self-Driving technology for a collision in California.

As AP notes, Riad is the first to face charges involving a widely used driver assistance technology, but he's not the very first person using an automated driving system to be charged in the US. In 2020, an Uber backup driver was charged with negligent homicide after the company's autonomous test vehicle struck and killed a pedestrian in Arizona. According to an investigation by the National Transportation Safety Board (NTSB), Uber's technology detected the victim more than five seconds before the crash but wasn't able to identify her as a pedestrian. The driver could have avoided the crash if she had been paying attention. 

The NHTSA told AP in a statement that "every vehicle requires the human driver to be in control at all times" even if it has a partially automated system. On its Autopilot page, Tesla says that Autopilot is "intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment."

Apple is reportedly working on a way for iPhones to detect car crashes and auto-dial 911

Your iPhone might gain a new capability as soon as next year: detecting a car accident and automatically dialing 911. Apple plans to unveil a feature called "crash detection" for both iPhones and Apple Watches, according to a Wall Street Journal report. The feature would supposedly use sensors like the accelerometer built into Apple devices.

Apple has reportedly been working on the feature for several years and testing it with real-world data. According to documents seen by the WSJ, Apple has been collecting data shared anonymously by iPhone and Watch users. It has detected more than 10 million suspected vehicle impacts, more than 50,000 of which were accompanied by a call to 911. Apple has been using that data to improve the accuracy of its crash-detection algorithm, since a 911 emergency call is pretty solid confirmation of a serious crash.
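
Apple hasn't published how crash detection actually works, but the general shape of such an algorithm is easy to sketch: watch for a sustained, high-g spike in accelerometer readings rather than a single jolt. The toy example below illustrates that idea; the thresholds, function name and structure are our own illustrative assumptions, not Apple's implementation:

# Illustrative sketch only; Apple hasn't published its algorithm.
# Thresholds, names and structure here are assumptions for demonstration.
def detect_suspected_crash(samples, g_threshold=4.0, min_consecutive=3):
    """Flag a suspected vehicle impact from raw accelerometer data.

    samples: sequence of (x, y, z) acceleration readings in g's.
    A production system would fuse more signals before auto-dialing
    emergency services.
    """
    streak = 0
    for x, y, z in samples:
        magnitude = (x * x + y * y + z * z) ** 0.5
        if magnitude >= g_threshold:
            streak += 1  # require a sustained spike, not a one-off jolt
            if streak >= min_consecutive:
                return True
        else:
            streak = 0
    return False

# Quiet driving, then a brief ~6 g spike across several readings
readings = [(0.0, 0.0, 1.0)] * 10 + [(5.0, 2.0, 1.5)] * 4 + [(0.0, 0.0, 1.0)] * 10
print(detect_suspected_crash(readings))  # True

The dataset the WSJ describes would be useful for tuning exactly these sorts of thresholds, with the 50,000 impacts followed by 911 calls serving as confirmed positive examples.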

Apple certainly isn't first out of the gate with this. Google introduced a similar feature for the Pixel 3 and Pixel 4 via its Personal Safety app, which can detect when you've been in a car crash and alert emergency services. GM has offered it for years in its cars through OnStar, and recently brought crash detection to smartphones via the OnStar Guardian app. OnStar's in-vehicle service reportedly responds to over 6,000 crash notifications a month, as the WSJ noted.

Apple introduced fall detection with the Apple Watch Series 4; it can automatically call emergency services and contact your loved ones if you don't respond to a prompt within a certain amount of time. The crash-detection feature is supposed to come to iPhones and Apple Watches in 2022, provided everything goes to plan.

The Dutch government claims it can decrypt Tesla’s hidden driving data

Tesla's closely guarded driving data has been decrypted for the first time, according to a Dutch government-run forensic lab. The Netherlands Forensic Institute (NFI) said it discovered a wealth of information about Tesla's Autopilot, along with data covering speed, accelerator pedal position, steering wheel angle and more. The findings will allow the government to "request more targeted data" to help determine the cause of accidents, the investigators said.

The researchers already knew that Tesla vehicles encrypt and store accident-related data, but not which data, or how much. As such, they reverse-engineered the system and succeeded in "obtaining data from the models S, Y, X and 3," which they described in a paper presented at an accident analysis conference.

These data contain a wealth of information for forensic investigators and traffic accident analysts and can help with a criminal investigation after a fatal traffic accident or an accident with injury. 

With knowledge of how to decrypt the storage, the NFI carried out tests with a Tesla Model S so it could compare the logs against real-world data. It found that the vehicle logs were "very accurate," with deviations of less than 1 km/h (about 0.6 MPH).
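
The NFI hasn't spelled out its validation procedure in detail, but the underlying check is straightforward: align the decrypted log entries with independently measured reference values and look at the worst-case difference. A minimal sketch of that comparison (the numbers and field layout are illustrative, not the NFI's actual log format):

# Illustrative comparison of decrypted vehicle-log speeds against
# reference instrumentation from a test drive. The values and layout
# are assumptions for demonstration, not the NFI's actual data.
logged_kmh = [48.2, 51.0, 53.7, 55.1, 49.8]      # from the decrypted logs
reference_kmh = [48.5, 50.6, 53.9, 55.4, 50.1]   # from test instruments

deviations = [abs(log - ref) for log, ref in zip(logged_kmh, reference_kmh)]
worst = max(deviations)

print(f"worst-case deviation: {worst:.1f} km/h")  # 0.4 km/h
assert worst < 1.0  # the NFI reported deviations under 1 km/h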

The NFI also analyzed several accidents using the raw data it acquired. In one case, a Tesla on Autopilot collided with a car ahead that suddenly braked. Normally, if Autopilot doesn't brake in time, the driver is supposed to take over.

"In this case, the investigation showed that the driver did indeed intervene and also within the expected response time," said researcher Aart Spek. "The fact that it turned out to be a collision was because the following distance [chosen by Autopilot] was too tight in the busy traffic situation. That makes it interesting, because who is responsible for the following distance: the car or the driver?" 

It used to be possible to extract Autopilot data from Tesla EVs, but it's encrypted in recent models, the investigators said. Tesla encrypts data for good reason, they acknowledged, including protecting its own IP from other manufacturers and guarding drivers' privacy. The team also noted that the company does provide specific data to authorities and investigators when requested.

However, the team said the extra data it extracted would allow for more detailed accident investigations, "especially into the role of driver assistance systems." It added that it would be ideal to know whether other manufacturers store the same level of detail over long periods of time. "If we would know better which data car manufacturers all store, we can also make more targeted claims through the courts or the Public Prosecution Service," said NFI investigator Frances Hoogendijk. "And ultimately that serves the interest of finding the truth after an accident."