Waymo issued a recall after two robotaxis crashed into the same pickup truck

Last year, two Waymo robotaxis in Phoenix "made contact" with the same pickup truck that was in the midst of being towed, which prompted the Alphabet subsidiary to issue a recall on its vehicles' software. A "recall" in this case meant rolling out a software update after investigating the issue and determining its root cause. 

In a blog post, Waymo revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane. Apparently, the tow truck didn't pull over after the incident, and another Waymo vehicle came into contact with the pickup truck a few minutes later. Waymo didn't elaborate on what it meant by saying that its robotaxis "made contact" with the pickup truck, but it did say that the incidents resulted in no injuries and only minor vehicle damage. The self-driving vehicles involved in the collisions weren't carrying any passengers. 

After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to a "persistent orientation mismatch" between the towed vehicle and the one towing it. The company developed and validated a software fix to prevent similar incidents and began deploying the update to its fleet on December 20. 

Waymo's rival Cruise was involved in a more serious incident last year, in which one of its robotaxis dragged a pedestrian, who had just been hit by another vehicle, a few dozen feet down a San Francisco street. California then suspended Cruise's license to operate in the state, and the company eventually paused all robotaxi operations, even those with a human driver behind the wheel, as part of a safety review. Meanwhile, it's business as usual for Waymo, which recently announced that it will start testing driverless vehicles on highways and freeways in and around Phoenix. 

This article originally appeared on Engadget at https://www.engadget.com/waymo-issued-a-recall-after-two-robotaxis-crashed-into-the-same-pickup-truck-055708611.html?src=rss

GM’s Cruise division is being investigated by the DoJ and SEC following a pedestrian accident

GM's driverless Cruise division is under investigation by both the Department of Justice (DoJ) and Securities and Exchange Commission (SEC), The Washington Post has reported. The probes follow an incident last year in which a jaywalking pedestrian was struck by a Cruise autonomous vehicle and then dragged 20 feet, worsening her injuries.

At the same time, Cruise yesterday released its own third-party findings regarding the accident, which took place on October 2 and involved another vehicle (a Nissan). The company said it "failed to live up to the justifiable expectations of regulators and the communities we serve... [and] also fell woefully short of our own expectations," adding that it's "fully cooperating" with investigators. Judging by the report's findings, that's putting it mildly. 

According to the report, Cruise withheld crucial information from officials during a briefing the day after the accident. Specifically, the company failed to mention that its autonomous vehicle (AV) had dragged the victim 20 feet at around 7 MPH, causing serious injuries. That happened because the vehicle mistakenly detected a side (rather than a frontal) collision and attempted to pull over instead of stopping. 

At least 100 Cruise employees, including members of senior leadership, legal and others, were aware of the dragging incident — but failed to disclose it during October 3 meetings with the San Francisco Mayor's Office, NHTSA, DMV and other officials, the report states.

The company said it intended to let a video of the dragging incident speak for itself, then answer questions about it. However, the video didn't play clearly and fully due to internet connection issues, and Cruise employees failed to verbally confirm the pullover maneuver and the dragging of the pedestrian. As if that weren't bad enough, the third-party findings state:

Cruise leadership was fixated on correcting the inaccurate media narrative that the Cruise AV, not the Nissan, had caused the Accident. This myopic focus led Cruise to convey the information about the Nissan hit-and-run driver having caused the Accident to the media, regulators and other government officials, but to omit other important information about the Accident. Even after obtaining the Full Video, Cruise did not correct the public narrative but continued instead to share incomplete facts and video about the Accident with the media and the public.

The report says the failings came about due to "poor leadership, mistakes in judgment, lack of coordination, an 'us versus them' mentality with regulators, and a fundamental misapprehension of Cruise’s obligations of accountability and transparency to the government and the public." 

Prior to the crash, Cruise was already facing problems with its AVs, including failures to recognize children and the frequency with which human operators had to take control. According to former CEO Kyle Vogt, human operators needed to intervene in trips every four to five miles. 

Cruise had its license to operate suspended in California back in October. The company also laid off 24 percent of its workforce late last year, following the resignation of co-founder Daniel Kan and the departure of its CEO Kyle Vogt. On top of the two federal investigations, the company is also facing a lawsuit from the city of San Francisco. 

This article originally appeared on Engadget at https://www.engadget.com/gms-cruise-is-being-investigated-by-the-doj-and-sec-following-a-pedestrian-accident-104030508.html?src=rss

FAA grounds roughly 171 Boeing 737 Max 9 planes after a cabin panel blew out during flight

The Federal Aviation Administration (FAA) has ordered airlines to temporarily ground some Boeing 737 Max 9 planes for safety inspections after an Alaska Airlines plane carrying about 180 people lost a cabin panel during a flight on Friday. The plane, which had only been in service since November, according to the New York Times, was able to safely land back at Portland International Airport in Oregon, where it had taken off from. There were no major injuries, though the Alaska division of the Association of Flight Attendants said workers described “explosive” decompression in the cabin and reported that one flight attendant sustained minor injuries.

“The FAA is requiring immediate inspections of certain Boeing 737 Max 9 planes before they can return to flight,” FAA Administrator Mike Whitaker said. “Safety will continue to drive our decision-making as we assist the NTSB’s investigation into Alaska Airlines Flight 1282.” 

Immediately following the incident, Alaska Airlines CEO Ben Minicucci put out a statement saying the company would be grounding its fleet of 65 Boeing 737-9 aircraft for what it expects to be a few days as it conducts safety checks. “Each aircraft will be returned to service only after completion of full maintenance and safety inspections,” Minicucci said. The FAA order extends the grounding to “approximately 171 airplanes worldwide” that are either operated by US airlines or in US territory.

Minicucci also said that the National Transportation Safety Board is investigating what happened with Flight 1282 and “we will fully support their investigation.” The plane had been on its way to Ontario, California. Reuters, citing FlightRadar24, reported that the blowout occurred at around 16,000 feet. In social media posts shared with Reuters and the NYT, passengers can be seen sitting right next to the gaping hole and the fully exposed sky.

Boeing's 737 Max was previously grounded for almost two years after fatal crashes in 2018 and 2019. All 189 people on board the plane were killed in the 2018 crash in Indonesia, and another 157 died in the 2019 crash in Ethiopia. In 2021, Boeing agreed to pay $2.5 billion in a settlement with the Department of Justice to avoid criminal charges over the crashes.

This article originally appeared on Engadget at https://www.engadget.com/faa-grounds-roughly-171-boeing-737-max-9-planes-after-a-cabin-panel-blew-out-during-flight-210331403.html?src=rss

Waze will now warn you if a road has a history of crashes

Waze's latest feature focuses on safety, giving you the information you need to make an informed choice about the route you're taking. The Google-owned navigation app has launched crash history alerts, which notify you when you're driving along a crash-prone road. Before you reach, say, a particularly tricky curve, Waze will display an in-app prompt that says "history of crashes." That way, you can slow down and watch out for anything that could run you off the road. 

The app decides whether to show you a notification based on reports from the Waze community and an AI analysis of your route, including its traffic levels, its elevation and whether it's a highway or a smaller local road. To minimize distractions, it won't show crash alerts for routes you usually take, which suggests that its main purpose is to give you a heads-up when you should drive with more caution than usual in places you're not familiar with. 

Over the years, Waze has released several protective features intended to keep you safe on the routes you take. A few years ago, it started sending out real-time accident data so that you can take an alternate route if needed and first responders can get to accident sites sooner. In 2020, it also rolled out guidance prompts telling you to get in the right lane for an upcoming merge or exit before you reach it. 

This article originally appeared on Engadget at https://www.engadget.com/waze-will-now-warn-you-if-a-road-has-a-history-of-crashes-130011100.html?src=rss

Tesla’s Autopilot was not to blame for fatal 2019 Model 3 crash, jury finds

A California jury has found that Tesla was not at fault for a fatal 2019 crash that allegedly involved its Autopilot system, in the first US trial of a case claiming the software directly caused a death. The lawsuit alleged that Tesla knowingly shipped cars with a defective Autopilot system, leading to a crash that killed a Model 3 owner and severely injured two passengers, Reuters reports.

Per the lawsuit, 37-year-old Micah Lee was driving his Tesla Model 3 on a highway outside of Los Angeles at 65 miles per hour when it turned sharply off the road and slammed into a palm tree before catching fire. Lee died in the crash. The company was sued for $400 million plus punitive damages by Lee’s estate and the two surviving victims, including a boy who was 8 years old at the time and was disemboweled in the accident, according to an earlier report from Reuters.

Lawyers for the plaintiffs argued that Tesla sold Lee defective, “experimental” software when he bought a Model 3 in 2019 that was billed to have full self-driving capability. The FSD system was and still is in beta. In his opening statement, their attorney Jonathan Michaels also said that the “excessive steering command is a known issue at Tesla.”

Tesla’s defense argued that there was no such defect, and that the analysis cited by the plaintiffs’ lawyers as identifying a steering issue was actually a search for theoretically possible problems; according to the company, a fix preventing the issue from ever occurring was engineered as a result of that analysis. Tesla blamed human error for the crash, pointing to tests showing Lee had consumed alcohol before getting in the car, and argued that there’s no certainty Autopilot was in use at the time.

The jury ultimately found there was no defect, and Tesla was cleared on Tuesday. Tesla has faced lawsuits over its Autopilot system before, but this was the first involving a fatality. The company is scheduled to go to trial in several other such cases in the coming months, and today's verdict is likely to set the tone for those ahead.

This article originally appeared on Engadget at https://www.engadget.com/teslas-autopilot-was-not-to-blame-for-fatal-2019-model-3-crash-jury-finds-210643301.html?src=rss