Sonos Ace headphones hands-on: Joining your home theater setup with the push of a button

After years of rumors and leaks, Sonos has finally pulled the wraps off its much-anticipated entry into a new product category. Today, Sonos announced the Ace headphones: a meticulously designed, feature-packed set of premium cans from the company that made its name with multi-room audio and stellar sound. But that reputation was built on speakers and soundbars, and now Sonos is lending its mix of aesthetics, acoustics and tech to headphones. The Ace is first and foremost a set of Bluetooth noise-canceling headphones that can be used on the go, but it's also got some unique home theater chops that work in tandem with the company's soundbars. You'll have to wait a bit longer to try the $449 headphones, but you can pre-order them now if you're already convinced. 

Design-wise, these Sonos headphones have a refined look that draws some inspiration from the company's speakers. Sonos opted for a mix of matte finishes, stainless steel and leather for its high-end look, keeping everything black on one version while using white with silver accents on the other. Even with the premium materials, the Ace weighs 11 ounces (312 grams). That's lighter than the AirPods Max, which is 13.6 ounces (385 grams), thanks to some use of plastic. 

"It's all in the interest of doing something that's going to make this light and comfortable for the customer," Sonos CEO Patrick Spence told Engadget. "We knew it had to be premium, just like all the speakers that we've designed, but we felt like we could do this in a different way than anybody else." 

A key aspect of the Ace's design is the hidden hinge, which Sonos has placed in the ear cup. The company says this puts less stress on cabling than a folding mechanism, but it also argues that it just looks better. Sonos chose physical controls rather than a touchpad, assigning those functions to a multi-purpose button it calls the Content Key. Here, you have volume and playback controls along with the ability to switch between ANC and transparency modes. A single button on the opposite side handles power and pairing. Like Apple, Sonos uses removable, magnetic ear pads on its headphones, and plans to sell replacements in the future.

Inside, 40mm custom dynamic drivers power the Ace's sound. Sonos promises "impeccable precision and clarity" across the EQ with spatial audio and dynamic head tracking for increased immersion. These headphones also support lossless audio over Bluetooth if you're streaming from a device with Qualcomm's Snapdragon Sound. They also offer lossless listening over USB-C if you prefer a wired connection for that purpose. And if the stock tuning doesn't suit you, the company allows you to adjust bass, treble and loudness from the Sonos app. 

White headphones lying flat on a small table, showing the buttons on both sides. (Billy Steele for Engadget)

Active noise cancellation (ANC) is onboard the Sonos headphones, and there's an Aware mode for when you need to let in ambient sounds. The company says the Ace is equipped with eight beamforming microphones that pull double duty with ANC and voice targeting, so you'll be able to use them during calls. The headphones also have wear detection sensors that will automatically pause movies or music when you take them off. Sonos says you'll be able to use the Ace for up to 30 hours on a charge with ANC on, 10 hours more than the AirPods Max and on par with Sony's WH-1000XM5, our current top pick for best wireless headphones.

None of this is a surprise given how many of the details broke cover before the official reveal, but Sonos did manage to keep secret how the Ace would interact with its other products. While the company's app will carry key features for the headphones, the interaction with other Sonos speakers is unique here. The Ace has a feature called TV Audio Swap that sends the audio from a Sonos soundbar to the headphones as long as you're in range. To make this happen, the company says the Ace switches to Bluetooth LE to maintain a connection with the app for controls and settings while Wi-Fi allows it to sync with a soundbar. At launch, the swap functionality will only work with the Sonos Arc, but the company says it will come to both generations of Beam and Ray in the future. 

"What we realized is for the majority of the population, and for the many use cases of headphones, the best way to do it is the Bluetooth first with connectivity to the system," Spence said. "Because what's more important to the customer is power management and battery life." 

There's also a version of the company's TruePlay tuning on the Ace, but it's called TrueCinema. When it arrives later this year, the feature will map the room your soundbar is in to create a complete virtual surround system inside the headphones. The goal here is to mimic the acoustics of the room you're in so that maybe you'll forget you're even wearing headphones. 

"It's more natural, because often times the headphones will be tuned to a perfect room," Spence explained. "We thought it was better to have it tuned to the room that you're actually in because it would create the effects that you would expect."

After some time listening to both music and movie clips on the Ace, I'm impressed with what the company has built in terms of sound quality. There's pristine detail and heightened immersion with Dolby Atmos content that make the headphones a complement to a home theater setup. However, the most surprising thing about the Ace to me was how well the TV Audio Swap feature works. 

Once the headphones have been added to your collection of devices in the app, all you have to do is press the Content Key button to switch the sound to what's coming from your soundbar. It's quick and easy, and there's no jumping, popping or other distractions when you hop back and forth. I can see a lot of people using them so that they can still hear the finer details of Dune or every shot of John Wick 4 when their family has gone to bed.

Even if your content isn't 7.1.4-channel Dolby Atmos, Sonos' 3D virtualization tech will upscale it so it sounds comparable. The company has also developed its own head tracking processing that learns from your position and the direction you're looking so that it's not constantly recentering if you look down at your phone. Unfortunately, the head tracking, spatial audio and the TV audio swap with Sonos Arc will only be available in the iOS version of the Sonos app at launch. Android compatibility is coming "shortly after."  

The Sonos Ace headphones are available for pre-order today from the company's website for $449 and will begin shipping on June 5th. While that's more expensive than flagship models from Bose, Sony and others, it's $100 less than the AirPods Max. 

This article originally appeared on Engadget at https://www.engadget.com/sonos-ace-headphones-hands-on-joining-your-home-theater-setup-with-the-push-of-a-button-130045023.html?src=rss

How to pre-order the Sonos Ace headphones

Sonos might be known for its high-quality speaker systems, but the company has finally announced its first foray into the personal listening space: the Sonos Ace headphones. We no longer have to rely on leaked information and can confidently say the wireless headphones will launch on June 5 and are available for $449 in either black or white. While that's not too long a wait, you can pre-order the Sonos Ace headphones now through the company's website.

The Sonos Ace wireless headphones offer features like active noise cancellation and aware modes, spatial audio with Dolby Atmos and lossless audio. Plus, they have two custom-designed drivers and eight beamforming microphones. The headphones will use Sonos' upcoming TrueCinema technology, which maps your space, aiming to provide a surround sound experience. Sonos also claims the headphones last up to 30 hours of listening or talking on a charge and can get three hours of battery life from just a three-minute charge.

In a statement, Sonos CEO Patrick Spence said the Sonos Ace headphones leverage "everything we've learned over two decades as an audio leader to bring stunning sound, sleek design and long-standing comfort to one of the largest and most popular audio categories worldwide." Even the physical design reflects this, with a matte finish, a memory foam interior wrapped in vegan leather and a lightweight build. 

You can pre-order the Sonos Ace headphones today with orders shipping on June 5. For all of the details and our initial impressions, you can read our hands-on here.

This article originally appeared on Engadget at https://www.engadget.com/how-to-pre-order-the-sonos-ace-headphones-130040879.html?src=rss

Adobe Lightroom gets its own AI eraser tool

Adobe is adding another AI-powered tool to its belt with the announcement of Generative Remove for Lightroom. As the name indicates, Generative Remove lets you get rid of any unwanted objects in a photo and then creates "pixel perfect generations" that make it seem as if nothing was ever there. These items could be anything from an ugly trash can in a beautiful photo to a lamp post that blocks an otherwise clear skyline. It's pretty much Adobe's version of Google's Magic Eraser.

The new tool uses Adobe Firefly, a generative AI creation model launched in March 2023. Firefly trains on licensed content, such as that from Adobe Stock, and can improve image quality, create photos from a description and use Generative Fill and Expand to add to, remove from or broaden an image. It exists across Adobe products like Photoshop, Illustrator and InDesign. 

Generative Remove is currently available as an early access feature for Lightroom, which Adobe claims will make it available to millions of people. Adobe has also made Lens Blur, which adds "aesthetic blur effects to parts of a photograph," generally available, complete with new automatic presets. 

This article originally appeared on Engadget at https://www.engadget.com/adobe-lightroom-gets-its-own-ai-eraser-tool-130003020.html?src=rss

Sony’s WH-1000XM5 headphones are $72 off right now

There are a lot of really great headphones out there, but for us, none compare to Sony's WH-1000XM5. We're excited to say our favorite wireless headphones are now on sale, down to $328 from $400 — an 18 percent discount. While this isn't a record-low deal, it is the cheapest we've seen them so far this year.

Sony's WH-1000XM5 headphones came out two years ago, yet they're still an amazing choice. We gave them a 95 in our initial review thanks to features like incredibly crisp and clear sound quality with punchy bass. The M5, which offers active noise cancellation, also doubled the number of microphones and processors over its predecessor while adding an optimizer to ensure you can truly block out the rest of the world. The headphones also have up to 30 hours of battery life, and you can get another three hours with just three minutes of charging.

Despite the upgrades, the M5 is actually 0.14 ounces lighter than its predecessor and has better weight distribution, making for a more comfortable experience. The entire look is sleeker, though one of the few negatives of the M5s is that they don't fold, so they can be a bit bulky to carry around.

Follow @EngadgetDeals on Twitter and subscribe to the Engadget Deals newsletter for the latest tech deals and buying advice.

This article originally appeared on Engadget at https://www.engadget.com/sonys-wh-1000xm5-headphones-are-72-off-right-now-123008959.html?src=rss

The Morning After: Microsoft introduces its AI-centric Copilot+ PCs

Microsoft couldn’t wait until its Build conference today. It just revealed a bunch of new hardware and plans for Windows. Copilot+ PCs were the big announcement, designed to run generative AI processes locally instead of in the cloud. Of course, Microsoft had new Surface devices to showcase these features, but the usual PC suspects also have new laptops that meet the spec requirements — and include Copilot+ in their name for added chaos. The company also claims Copilot+ PCs are 58 percent faster than the M3-powered MacBook Air.


We’ll drill into some other announcements down below.

— Mat Smith

Another patient will get Neuralink’s brain implant

Intel-powered Copilot+ PCs will be available this fall

Here are all of the just-announced Copilot+ PCs with Snapdragon X Chips

Volvo and Aurora introduce their first self-driving truck

​​You can get these reports delivered daily direct to your inbox. Subscribe right here!


The new Surface Laptop is a redesigned PC with thinner bezels in 13.8- and 15-inch sizes and Qualcomm’s Arm-based Snapdragon X Elite chip. Microsoft says this is the brightest display it has ever shipped, at 600 nits, and the new Studio Camera is now in the bezel, so no visible notch.

Will the Snapdragon X Elite give better performance? Expect potent battery life. Microsoft claims the 15-inch model will run for up to 22 hours on a single charge while playing videos locally and up to 15 hours while actively browsing the web. We’ve got some hands-on impressions right here, but we’ve got reservations. Devices like the Surface Pro 9, which ran Windows on Arm, still didn’t feel as fast or responsive as their more traditional x86-based counterparts.

Continue reading.

Microsoft says it has rebuilt core components of Windows 11 to better support Arm-based hardware and AI. That includes a new kernel, compiler and, most importantly, an emulator named Prism for running older x86 and x64 apps. Thanks to a powerful new Neural Processing Unit (NPU) in the Snapdragon X Elite chips, Copilot+ PCs can run more than 40 trillion operations per second (a measure of a chip’s AI performance), more than four times the performance of today’s AI PCs.

Continue reading.

This sounds very good. Microsoft also announced Recall, a new feature to make local Windows PC searches as quick and effective as web searches, tapping into AI to add more contextual search parameters. Microsoft product manager Caroline Hernandez gave the example of searching for a blue dress on Pinterest using a Windows PC with Recall. She searched the Recall timeline for ‘blue dress’ (using her voice), which pulled up all of her recent searches, saving her from having to sift through browser history. She then refined the query with more specific details like ‘blue pantsuit with sequined lace for Abuelita,’ and Recall delivered relevant results. Microsoft says it can start with exact information or vague contextual clues to find what you want — and it’s apparently all done locally. It is, however, a Copilot+ exclusive.

Continue reading.

AI companies love to tap Scarlett Johansson’s star power, but this time one of the biggest players in AI has drawn her ire. Johansson accused OpenAI of copying her voice for one of the ChatGPT voice assistants, despite her having denied the company permission to do so. Johansson’s statement on Monday came hours after OpenAI said it would no longer use the voice. “The voice of Sky is not Scarlett Johansson’s, and it was never intended to resemble hers,” an OpenAI spokesperson said in a statement sent to Engadget. The Her actor said OpenAI only stopped using the voice after she hired legal counsel.

Continue reading.

This article originally appeared on Engadget at https://www.engadget.com/the-morning-after-microsoft-introduces-its-ai-centric-copilot-pcs-111916490.html?src=rss

Volvo and Aurora introduce their first self-driving truck

Volvo and Aurora have unveiled their first production autonomous truck, three years after the companies initially announced that they were teaming up. At ACT Expo in Las Vegas, they showed off the Volvo VNL Autonomous truck, which was designed by autonomous trucking and robotaxi company Aurora but will be manufactured by Volvo. 

It's powered by Aurora Driver, a level 4 autonomous driving system that uses high-resolution cameras, imaging radars, a LiDAR sensor that can detect objects up to 400 meters away and even more sensors. Aurora's technology has driven billions of virtual miles for training, as well as 1.5 million commercial miles on actual public roads. For safety purposes, the truck has "redundant steering, braking, communication, computation, power management, energy storage and vehicle motion management systems."

According to TechCrunch, the vehicle will still have a human driver behind the wheel to take over whenever needed when it starts ferrying cargo across North America over the next few months. An Aurora spokesperson told the publication that it will be announcing pilot programs with its clients that are planning to use Volvo's truck sometime later this year. It didn't name any companies, but the startup previously ran pilot programs with FedEx and Uber Freight. 

The autonomous vehicle company also intends to deploy 20 fully driverless trucks between Dallas and Houston soon, but it's unclear whether this inaugural fleet will be composed of Volvo's trucks or those of its other manufacturing partners. The companies did say at the Las Vegas event, though, that Volvo has already started manufacturing a test fleet of the VNL Autonomous truck at its New River Valley assembly facility in Virginia. Nils Jaeger, President of Volvo Autonomous Solutions, called this truck the "first of [the company's] standardized global autonomous technology platform." Jaeger added that it will enable Volvo "to introduce additional models in the future."

This article originally appeared on Engadget at https://www.engadget.com/volvo-and-aurora-introduce-their-first-self-driving-truck-080058835.html?src=rss

Senua’s Saga: Hellblade II review: A series of unfortunate events

Senua’s Saga: Hellblade II shoves hopeless brutality in your face and screams at you to recognize its beauty. Its landscapes are littered with disemboweled corpses and shrines of rotting flesh, and its shadows conceal monsters carrying blood-stained blades. Every chapter in Hellblade II is a parade of suffering and pain, and every scene has a backdrop of taunting whispers. After just a few minutes of playtime it can feel like you’re strapped down, trapped in Senua’s claustrophobic reality on the Icelandic coast, suffocating under the dark waves with her.

And here’s the thing — it is absolutely beautiful.

Hellblade II is a third-person narrative adventure set in Iceland in the 10th century, and it’s the sequel to Ninja Theory’s 2017 game, Hellblade: Senua’s Sacrifice. Senua is a young warrior who hears a cacophony of disembodied voices in her mind, judging her every move. The whispers are a critical and permanent part of Senua’s psyche — a lesson she learned in the first game, after realizing the depths of her father’s abuse when she was a child. In Hellblade II, Senua is figuring out how to live outside of her father’s influence, but his voice still rumbles in her head at inopportune times, drowning out the female whispers that she’s come to view as her allies.

Senua's Saga: Hellblade II
Ninja Theory

Senua regularly sees the ghosts of the warriors and civilians that have died around her, and she carries their souls as a mental shroud, fueling her intense desire to save as many oppressed people as she can. She has targets right away: Hellblade II opens with Senua and others held captive on a slave ship during a catastrophic storm. The ship is thrown ashore, scattering slavers and the enslaved on the black rocks. This is where Senua finds her sword.

She makes her way inland, where she finds companions and learns about a plague of giants terrorizing the local villages, slaughtering entire communities and destroying precious farmland. Throughout her journey, Senua fights the giants and hordes of human-sized enemies — always one at a time — with a single broadsword and a bit of magic. The whispers provide a constant soundtrack of criticism, encouragement, warnings, fear, anger and doubt, hisses slicing through the silence and interrupting moments of dialogue, inescapable.

Hellblade II is more of an extended, extremely anxious and violent vibe than a traditional adventure game. Its combat is OK, its puzzles are slightly tedious and, emotionally, it’s one-dimensional — but as an interactive brutality visualizer, Hellblade II is outstanding. Senua fights until her pores ooze blood, screaming through each swing of her sword as the whispers surround her, cuing her when to strike and telling her to ignore the pain. Every fight is close combat and one-on-one, warriors waiting in a circle of fog for their turn to rush in, punch her in the face and slice her to pieces. The sounds of flesh smacking against flesh join the whispers and the screen splatters red when Senua is hit. Hellblade II revels in physical violence.


Outside of combat, Senua’s body is sacrificed to the elements, caught in swirling riptides and burned to ash in bursts of hellfire, pieces of broken earth floating around her. Senua steps gingerly into dark pools filled with the ghosts of the damned, shadowy figures wailing and trying to pull her down. A giant’s hand slams to the ground, smashing a body with a wet splat beneath its palm. Reality bends and shatters, and Senua is trapped in torturous, psychedelic nightmares narrated by her father’s booming voice. She burns herself at the stake. Her face fills the screen, panic spewing from her lips and horror sharp in her eyes. In every scene, the whispers persist.

Basically, it’s metal as hell.

I played Hellblade II on Xbox Series S without headphones and on a high-end gaming PC with headphones. The game is stunning in both formats, though of course the details and lighting looked a bit crisper on PC. There may be no laughter in Senua's life, but there are breathtaking landscapes dotted with delicate shrubs and rough boulders, fine red dust stretching to the horizon. There are towering cave systems lit by the soft glow of blue flames; there are beaches with roiling waves; there are snow-capped mountains backlit by a golden setting sun. The environments in Hellblade II are all phenomenally detailed, which is great news for the game's Photo Mode.


Whether on console or PC, I encourage every Hellblade II player to wear headphones in order to fully enjoy the binaural audio. In this format, the whispers surround your head as they do Senua’s, spawning from various directions in a terrifying way. With no UI in the entire game, stellar acting from Melina Juergens as Senua, and headphones full of soft, hissing judgements, Hellblade II can get incredibly immersive.

Outside of its aesthetic and tonal merits, combat in Hellblade II is simplistic and sometimes frustrating. The game has pared-down mechanics, and Senua has just a standard and heavy sword attack, plus an evade move, a parry and a special power-up. Timing is everything in battles, and Senua moves sluggishly enough that last-second adjustments rarely land. Additionally, Senua’s special move is — dare I say — overpowered, and it basically guarantees she’ll kill whatever minion she’s fighting. Between the annoying timing structure and a too-strong power-up, it’s difficult to find a rhythm with any enemy in Hellblade II. There may indeed be a rhythm to be found in these battles, as the original Hellblade demonstrated, but I couldn’t identify it in my initial playthrough.


The most successful puzzles in Hellblade II involve Senua magically changing the landscape around her, focusing on specific vortices to manifest platforms where she needs them. These riddles aren’t particularly challenging and they don’t gain layers of complexity as the game progresses, but their settings are gorgeous, generally contained to cave systems that glow like the night sky. The game’s most annoying puzzles are the symbol-hunting ones, where Senua has to find pieces of an ancient language in the environment. These moments feel like filler; they don’t advance the story in any meaningful way and they make me squint at the screen uncomfortably. That last one might be because I need new glasses, but still, I could do without the symbol-seeking puzzles entirely.

Hellblade II uses a limited set of inputs — just directional, sprint, focus and interact — and portions of it are fully playable with one hand, whispers shadowing Senua’s steps. Everything that Senua does is cloaked in apocalyptic framing; every conversation she has, whether with herself or her companions, is drenched in anxiety and urgency. There is no joy in Senua’s life, no respite from the pressure to save everyone, nowhere to run from the guilt that already weighs heavily on her mind. Senua’s singular emotion is desperation, her trauma is repeated over and over, her ghosts are explained again and again.

It’s as if Ninja Theory built Hellblade II to be an art installation at a busy museum — like they expected its audience to be milling in and out of the room, paying attention for spurts at a time, and they wanted to make sure every scene told the full story. Though Senua gathers a few allies along her journey, every character in this harsh world feels transitory, and their presence lacks lasting impact. Hellblade II is stuck at 11 on the emotional scale and it doesn’t offer any opportunities for falling or rising tension, causing its climax to fall a bit flat. The final scenes are epic, like the rest of the game, but they also feel like… the rest of the game. So much so that I was surprised when the credits started to roll.


As a side note, I’ve never had a game trigger my phobia of crashing waves quite like Hellblade II. I had to close my eyes for half of one segment because it involved Senua getting swept up by the raging sea in regular pulses, waves repeatedly crushing her body and the camera, and it was making my stomach turn. This is far from the only deep-water horrorscape in the game, so fair warning.

Hellblade II is an impressive sensory experience interrupted every so often by tedious symbol-hunting puzzles. It’s an epic poem in video game form, violent and timeless.

Senua's Saga: Hellblade II is available on Xbox Series X/S and PC, included in Game Pass.

This article originally appeared on Engadget at https://www.engadget.com/senuas-saga-hellblade-ii-review-a-series-of-unfortunate-events-080047631.html?src=rss

Another patient will get Neuralink’s brain implant

Neuralink will be able to surgically implant its device into another patient’s brain. The Wall Street Journal reports that the company was approved to move forward with a second procedure months after Noland Arbaugh became the first person to receive the brain implant.

Elon Musk said last week that the company was “accepting applications for the second participant” in the trial. The company began recruiting potential participants for its first clinical trial last year with the goal of bringing the technology to people with ALS, spinal cord injuries or other conditions that cause quadriplegia.

Neuralink has also reportedly come up with a potential fix for an issue that caused Arbaugh’s implant to malfunction about a month after his surgery. The company said earlier this month that some of the implant’s threads “retracted from the brain,” causing the device to lose some of its capabilities. Arbaugh recently told Bloomberg that software updates have since restored many of them. Neuralink has shared clips of Arbaugh, who is paralyzed from the neck down, playing chess, controlling a music player app and performing other activities. 

According to The Journal, Neuralink told the FDA that in a second procedure it would place the implant’s threads deeper into the patient’s brain to prevent them from moving as much as they did in Arbaugh’s case. The FDA is apparently on board with the changes. The company reportedly wants to complete the second surgery in June and has seen more than 1,000 people sign up for a chance to participate in the trial.

This article originally appeared on Engadget at https://www.engadget.com/another-patient-will-get-neuralinks-brain-implant-235059248.html?src=rss

Scarlett Johansson says OpenAI used her likeness without permission for its ‘Sky’ voice assistant

Actor Scarlett Johansson has accused OpenAI of copying her voice for one of the voice assistants in ChatGPT despite her having denied the company permission to do so. Johansson’s statement on Monday came hours after OpenAI said that the company would no longer use the voice in ChatGPT but did not provide a reason why.

“Last September, I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system,” Johansson wrote in the statement shared by her public relations team with Engadget (NPR first reported the story). “He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and AI. He said he felt that my voice would be comforting to people.” Johansson added that she declined the offer after “much consideration and for personal reasons,” but when OpenAI demoed GPT-4o, the company’s latest large language model last week, “my friends, family, and the general public all noted how much the newest system named ’Sky’ sounded like me.” 

When Johansson saw OpenAI’s newest demo, she said she was “shocked, angered and in disbelief that Mr. Altman would pursue a voice that sounded so eerily similar to mine that my closest friends and news outlets could not tell the difference.” She also revealed that Altman had contacted her agent just two days before the company revealed GPT-4o and asked her to reconsider, but released the system anyway before she had a chance to respond. 

"The voice of Sky is not Scarlett Johansson's, and it was never intended to resemble hers," an OpenAI spokesperson said in a statement sent to Engadget that the company attributed to Altman, OpenAI's co-founder and CEO. "We cast the voice actor behind Sky’s voice before any outreach to Ms. Johansson. Out of respect for Ms. Johansson, we have paused using Sky’s voice in our products. We are sorry to Ms. Johansson that we didn’t communicate better." 

Even though “Sky” has been one of the voice assistants in ChatGPT since September 2023, GPT-4o, which the company announced last week, takes things a step further. The company said that the new model is closer to “much more natural human-computer interaction” and demoed its executives having nearly human-like conversations with the voice assistant in ChatGPT. This invited comparisons to Samantha, the virtual voice assistant played by Johansson in the 2013 movie Her, who has an intimate relationship with a human being. Shortly after the event, Altman tweeted a single word — “her” — in an apparent reference to the film.


On Monday, OpenAI said that it was pausing the use of “Sky” in ChatGPT and released a lengthy post revealing how the company hired professional voice actors to create its own virtual assistants, and denying any similarities with Johansson’s voice.

"We believe that AI voices should not deliberately mimic a celebrity's distinctive voice — Sky’s voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using her own natural speaking voice," OpenAI wrote and added that each of its performers, who it declined to name for privacy reasons, was paid “above top-of-market rates, and this will continue for as long as their voices are used in our products.”

This move, Johansson said in her statement, only came after she hired legal counsel who wrote two letters to Altman and OpenAI asking for an explanation. “In a time when we are all grappling with deepfakes and the protection of our own likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity,” Johansson wrote. “I look forward to resolution in the form of transparency and the passage of appropriate legislation to help ensure that individual rights are protected.”

Update, May 20 2024, 9:09 PM ET: This story has been updated to include a statement from OpenAI.

This article originally appeared on Engadget at https://www.engadget.com/scarlett-johansson-says-openai-used-her-likeness-without-permission-for-its-sky-voice-assistant-233451958.html?src=rss

Meta approved ads in India that called for violence and spread election conspiracy theories

Meta’s advertising policies are once again in the spotlight as a watchdog group says the company approved more than a dozen “highly inflammatory” ads that broke its rules. The ads targeted Indian audiences and contained disinformation, calls for violence and conspiracy theories about the upcoming elections.

The ads are detailed in a new report from Ekō, a nonprofit watchdog organization. The group says it submitted the ads as a “stress test” of Meta’s advertising systems, but that the spots “were created based upon real hate speech and disinformation prevalent in India.”

In all, the group was able to get 14 of 22 ads approved through Meta’s advertising tools even though all of them should have been rejected for breaking the company’s rules. The group didn’t disclose the exact wording of the ads, but said they “called for violent uprisings targeting Muslim minorities, disseminated blatant disinformation exploiting communal or religious conspiracy theories prevalent in India's political landscape, and incited violence through Hindu supremacist narratives.” Researchers at Ekō pulled the ads before they ran and they were never seen by actual Facebook users, according to the report.

It’s not the first time Ekō has gotten inflammatory ads approved by Meta in an effort to draw attention to its advertising systems. The group previously got a batch of hate-filled Facebook ads targeting users in Europe approved, though the ads never ran.

In its latest report, Ekō says it also used generative AI tools to create images for the ads. Researchers at the organization said none of the ads were flagged by Meta as containing AI-generated material, despite the company’s statements that it’s working on systems to detect such content.

Meta didn’t immediately respond to a request for comment. In a response to Ekō, the company pointed to its rules requiring political advertisers to disclose their use of AI and a blog post about its efforts to prepare for the Indian elections.

Update May 21, 2024 6:10 PM ET: "As part of our ads review process—which includes both automated and human reviews—we have several layers of analysis and detection, both before and after an ad goes live," a Meta spokesperson said in a statement. "Because the authors immediately deleted the ads in question, we cannot comment on the claims made." 

This article originally appeared on Engadget at https://www.engadget.com/meta-approved-ads-in-india-that-called-for-violence-and-spread-election-conspiracy-theories-225510165.html?src=rss