The Dutch government claims it can decrypt Tesla’s hidden driving data

Tesla's closely guarded driving data has been decrypted for the first time, according to a Dutch government-run forensic lab. The Netherlands Forensic Institute (NFI) said it uncovered a wealth of information about Tesla's Autopilot, along with data on speed, accelerator pedal position, steering wheel angle and more. The findings will allow the government to "request more targeted data" to help determine the cause of accidents, the investigators said.

The researchers already knew that Tesla vehicles encrypt and store accident-related data, but not which data or how much. As such, they reverse-engineered the system and succeeded in "obtaining data from the models S, Y, X and 3," which they described in a paper presented at an accident analysis conference.

That data contains a wealth of information for forensic investigators and traffic accident analysts, and it can help with criminal investigations following a fatal crash or an accident involving injury.

With knowledge of how to decrypt the storage, the NFI carried out tests with a Tesla Model S so it could compare the logs with real-world measurements. It found that the vehicle logs were "very accurate," with deviations of less than 1 km/h (about 0.6 mph).
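The paper doesn't spell out how that comparison was carried out; as a rough illustration of the kind of accuracy check described, here is a minimal sketch (hypothetical field names and made-up sample values, not the NFI's actual data or method) that computes the worst-case deviation between logged speeds and reference measurements:

```python
# Hypothetical accuracy check: compare speeds recorded in decrypted vehicle
# logs against an independent reference (e.g. a calibrated GPS logger).
# Field names and sample values are illustrative, not from the NFI paper.

# (timestamp_s, speed_kmh) pairs from the decrypted vehicle log
logged = [(0.0, 50.2), (1.0, 52.1), (2.0, 54.0), (3.0, 55.8)]

# Reference measurements taken at the same timestamps during the test drive
reference = [(0.0, 50.0), (1.0, 51.9), (2.0, 53.6), (3.0, 55.3)]

def max_speed_deviation(log, ref):
    """Return the largest absolute speed difference (km/h) at matching timestamps."""
    ref_by_time = dict(ref)
    return max(abs(v - ref_by_time[t]) for t, v in log if t in ref_by_time)

deviation = max_speed_deviation(logged, reference)
print(f"Maximum deviation: {deviation:.1f} km/h")  # 0.5 km/h here, within the <1 km/h the NFI reported
```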

The NFI also analyzed several accidents using the raw data it acquired. In one case, a Tesla on Autopilot collided with a car ahead that suddenly braked. Normally, if Autopilot doesn't brake in time, the driver is supposed to take over.

"In this case, the investigation showed that the driver did indeed intervene and also within the expected response time," said researcher Aart Spek. "The fact that it turned out to be a collision was because the following distance [chosen by Autopilot] was too tight in the busy traffic situation. That makes it interesting, because who is responsible for the following distance: the car or the driver?" 

It used to be possible to extract Autopilot data from Tesla EVs, but it's encrypted in recent models, the investigators said. Tesla encrypts the data for good reason, they acknowledged, including protecting its own IP from other manufacturers and guarding drivers' privacy. They also noted that the company does provide specific data to authorities and investigators when requested.

However, the team said the extra data it extracted would allow for more detailed accident investigations, "especially into the role of driver assistance systems." It added that it would be ideal to know whether other manufacturers store the same level of detail over long periods of time. "If we would know better which data car manufacturers all store, we can also make more targeted claims through the courts or the Public Prosecution Service," said NFI investigator Frances Hoogendijk. "And ultimately that serves the interest of finding the truth after an accident."

Automakers must report crashes involving self-driving and driver-assist systems

The National Highway Traffic Safety Administration (NHTSA) has implemented a new policy that requires car companies to report incidents involving semi- and fully autonomous driving systems within one day of learning of an accident. In an order spotted by The Washington Post, NHTSA mandates that automakers fill out an electronic incident form and submit it to the agency when one of their systems was active either during a crash or immediately before it. They must report an accident anytime there's a death, an injury that requires hospital treatment, a vehicle that's towed away, an airbag deployment or a pedestrian or cyclist involved. The order covers everything from Level 2 advanced driver-assistance systems to Level 5 fully autonomous vehicles, meaning it spans the gamut from Tesla cars with Autopilot to Waymo taxis.
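To make the reporting triggers easier to follow, here is a minimal sketch that paraphrases the criteria described above as decision logic (this is an illustration only, with hypothetical field names; it is not NHTSA's rule text or any official tooling):

```python
# Illustrative paraphrase of the one-day reporting triggers described above.
# Not NHTSA's official rule text or tooling; field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Crash:
    system_level: int        # SAE automation level of the system involved (2-5)
    system_engaged: bool     # active during the crash or immediately before it
    fatality: bool
    hospital_treated_injury: bool
    vehicle_towed: bool
    airbag_deployed: bool
    pedestrian_or_cyclist: bool

def must_report_within_one_day(c: Crash) -> bool:
    """True if the crash appears to meet the reporting triggers described above."""
    covered_system = 2 <= c.system_level <= 5 and c.system_engaged
    severity_trigger = (c.fatality or c.hospital_treated_injury or c.vehicle_towed
                        or c.airbag_deployed or c.pedestrian_or_cyclist)
    return covered_system and severity_trigger

# Example: a Level 2 system active during a tow-away crash would need to be reported
print(must_report_within_one_day(Crash(2, True, False, False, True, False, False)))  # True
```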

"This action will enable NHTSA to collect information necessary for the agency to play its role in keeping Americans safe on the roadways, even as the technology deployed on the nation's roads continues to evolve," the regulator said. NHTSA said it would also require automakers to send in monthly reports detailing all incidents with injuries or property damage involving their automated driving systems. Companies that fail to comply with the order could face fines of up to $22,992 per day, according to The Post.

NHTSA's order comes some two months after a 2019 Tesla Model S was involved in a high-profile crash where investigators initially said there was no one behind the car's wheel. The National Transportation Safety Board (NTSB) later said it examined home security footage that showed the owner got into the driver's seat before the fatal accident. Mere weeks ahead of that incident, Robert Sumwalt, the chair of the NTSB, sent a letter to NHTSA in which he called on the agency to implement stricter regulation related to automated vehicle technology. NHTSA "must act" to "develop a strong safety foundation," he said, citing Tesla frequently in his letter.