The team behind the upcoming Minecraft movie shared a new clip during Minecraft Live that expands on the brief crafting moment we saw in the polarizing first teaser. The scene comes in the middle of a discussion between Mojang creative director Torfi Frans Olafsson and A Minecraft Movie director Jared Hess, at 4:51. The segment also gives us our first look at the movie’s interpretation of a Minecraft bee, which I’m not quite sure how to feel about yet. You can find that toward the end of the video.
A Minecraft Movie is slated for release in April 2025 and stars Jack Black as Steve, alongside Jason Momoa, Danielle Brooks, Emma Myers and Sebastian Eugene Hansen. Plans for it were first announced a decade ago, and potential release dates were set and scrapped on multiple occasions in the time since. At long last, it’s actually now happening — for better or worse.
This article originally appeared on Engadget at https://www.engadget.com/entertainment/tv-movies/heres-a-peek-at-how-a-minecraft-movie-will-handle-crafting-220454126.html?src=rss
Over the last year, Meta has made its AI assistant so ubiquitous in its apps it’s almost hard to believe that Meta AI is only a year old. But, one year after its launch at the last Connect, the company is infusing Meta AI with a load of new features in the hopes that more people will find its assistant useful.
One of the biggest changes is that users will be able to have voice chats with Meta AI. Up till now, the only way to speak with Meta AI was via the Ray-Ban Meta smart glasses. And like last year’s Meta AI launch, the company tapped a group of celebrities for the change.
Meta AI will be able to take on the voices of Awkwafina, Dame Judi Dench, John Cena, Keegan-Michael Key and Kristen Bell, in addition to a handful of more generic voices. While the company is hoping the celebrities will sell users on Meta AI’s new abilities, it’s worth noting that the company quietly phased out its celebrity chatbot personas that launched at last year’s Connect.
In addition to voice chat support, Meta AI is also getting new image capabilities. Meta AI will be able to respond to requests to change and edit photos from text chats within Instagram, Messenger and WhatsApp. The company says that users can ask the AI to add or remove objects or to change elements of an image, like swapping a background or clothing item.
[Image: Meta is testing AI-generated content recommendations in the main feed of Facebook and Instagram. Credit: Meta]
The new abilities arrive alongside the company’s latest Llama 3.2 model. The new iteration, which comes barely two months after the Llama 3.1 release, is the first to have vision capabilities and can “bridge the gap between vision and language by extracting details from an image, understanding the scene, and then crafting a sentence or two that could be used as an image caption to help tell the story.” Llama 3.2 is “competitive” on “image recognition and a range of visual understanding tasks” compared with similar offerings from ChatGPT and Claude, Meta says.
The social network is testing other, potentially controversial, ways to bring AI into the core features of its main apps. The company will test AI-generated translation features for Reels with “automatic dubbing and lip syncing.” According to Meta, that “will simulate the speaker’s voice in another language and sync their lips to match.” It will arrive first to “some creators’ videos” in English and Spanish in the US and Latin America, though the company hasn't shared details on rollout timing.
Meta also plans to experiment with AI-generated content directly in the main feeds on Facebook and Instagram. With the test, Meta AI will surface AI-generated images that are meant to be personalized to each user’s interests and past activity. For example, Meta AI could surface an image “imagined for you” that features your face.
This article originally appeared on Engadget at https://www.engadget.com/social-media/meta-ai-can-now-talk-to-you-and-edit-your-photos-172853219.html?src=rss
Meta has secured deals with several actors, including Kristen Bell, John Cena and Judi Dench, to use their voices for the Meta AI chatbot, Reuters reports. Users will be able to talk to the chatbot while listening to answers in the voice of their favorite celebrities. Other celebrities include Awkwafina and Keegan-Michael Key, a source told Reuters.
Besides these five voices, the source also said that there are more generic voice options if users prefer them. All voices will be available this week in the US and other English-speaking regions, though the source didn’t give any other specific locations.
The news follows a report from last month which claimed Meta was negotiating with actors to secure the rights to use their voices for its AI projects. Now, the deals have reportedly been struck, and the chatbot available across Facebook, Messenger, Instagram and WhatsApp could feature these famous voices soon. The company had intended to finalize agreements before the Connect conference, where Reuters’ source says it will announce the new voice options.
This article originally appeared on Engadget at https://www.engadget.com/ai/metas-ai-chatbot-will-soon-speak-in-the-voices-of-john-cena-and-other-celebrities-160603365.html?src=rss
Danny Boyle’s zombie sequel 28 Years Later was shot using several iPhone 15 Pro Max smartphones, according to a report by Wired. With a budget of around $75 million, that makes it the biggest movie ever made using iPhones.
There are some major caveats worth going over. First of all, the sourcing on the story is anonymous, as the film’s staff was required to sign an NDA. Also, the entire film wasn’t shot using last year’s high-end Apple smartphone. Engadget has confirmed that Boyle and his team used a bunch of different cameras, with the iPhone 15 Pro Max being just one tool.
Finally, it’s not like the director just plopped the smartphone on a tripod and called it a day. Each iPhone looks to have been adapted to integrate with full-frame DSLR lenses. Speaking of, those professional-grade lenses cost a small fortune. The phones were also nestled in protective cages.
Even if the phones weren’t exclusively used to make this movie, it’s still something of a full-circle moment for Boyle and his team. The original 28 Days Later was shot primarily on a prosumer-grade camcorder that cost $4,000 at the time. This camcorder recorded footage to MiniDV tapes.
28 Years Later is the third entry in the franchise and is due to hit theaters in June 2025. The film stars Jodie Comer, Aaron Taylor-Johnson, Ralph Fiennes and Cillian Murphy. This will be the first of three new films set in the universe of fast-moving rage zombies. Plot details are non-existent, but all three upcoming movies are being written by Alex Garland. He co-wrote the first one and has since gone on to direct genre fare like Ex Machina, Annihilation and, most recently, Civil War. He also made a truly underrated TV show called Devs.
As for the intersection of smartphones and Hollywood, several films have been shot with iPhones. These include Sean Baker’s Tangerine and Steven Soderbergh’s Unsane.
This article originally appeared on Engadget at https://www.engadget.com/entertainment/tv-movies/28-years-later-was-partially-shot-on-an-iphone-15-pro-max-182036483.html?src=rss
At its in-person fan event for Geeked Week this year, Netflix showed teasers and sneak peeks of its upcoming shows, including the second season of Avatar: The Last Airbender. In addition to revealing that the new season is already in production, Netflix announced that Miya Cech (Are You Afraid of the Dark?) is playing earthbending master Toph.
A teaser for Squid Game season 2 shows Lee Jung-jae wearing his player 456 uniform again to compete in another round of deadly games with other contestants hoping to win millions of dollars. The next season of Squid Game will start streaming on December 26.
The streaming giant has also revealed that One Piece live action's Mr. 0 and Miss All-Sunday will be portrayed by Joe Manganiello and Lera Abova, respectively. And for Wednesday fans, Netflix has released a teaser for the second season of Wednesday that will arrive sometime in 2025.
For animation fans, Netflix has released a teaser for Tom Clancy's Splinter Cell: Deathwatch, with Liev Schreiber voicing protagonist Sam Fisher. It has also given viewers a short look at a new Devil May Cry animated series by Korean company Studio Mir, which is coming in April 2025.
Netflix has teased a new Tomb Raider animated series that's coming in October and a Rebel Moon game that's arriving in 2025, as well. Finally, the company has given Arcane fans a clear schedule for the final chapter of the critically acclaimed show: Act 1 will be available to stream on November 9, followed by Act 2 on November 16. A third and final Act will close out the show with a proper ending on November 23.
✨ flash warning ✨ A new fighter has entered the ring. Experience Vi's journey in the final chapter of Arcane when Act 1 drops on November 9th, Act 2 drops on November 16th and Act 3 drops on November 23rd only on Netflix. #GeekedWeek pic.twitter.com/A6EN448Tli
This article originally appeared on Engadget at https://www.engadget.com/entertainment/streaming/netflix-teases-the-next-seasons-of-avatar-squid-game-and-arcane-at-geeked-week-035246559.html?src=rss
Get ready to question humanity’s control over the technology that surrounds us because another season of Netflix’s Black Mirror is in the works. Earlier today, the official Black Mirror X page revealed the cast of the new season coming next year along with some other interesting clues and Easter eggs.
The video features an old, flickering computer screen that appears to unload a complete data dump of the entire cast for season 7. Some of the names that jumped out at us include Oscar nominee Paul Giamatti, Doctor Who star and Oscar winner Peter Capaldi (he won in 1995 with his live-action short film Franz Kafka’s It’s a Wonderful Life), Awkwafina, Issa Rae, Tracee Ellis Ross and Rashida Jones.
The list also included some of the cast who played virtual crew members of the USS Callister from the iconic fourth season episode of the same name. The names from the USS Callister episode that appeared on the list include Cristin Milioti, Jimmi Simpson, Billy Magnussen, Milanka Brooks and Osy Ikhile.
We’ve known for a while now that series creator Charlie Brooker has been planning to revisit the crew of the USS Callister. The season 4 opening episode starred Jesse Plemons as Robert, the chief technology officer of a top-tier game studio and a big fan of a Star Trek-esque TV show called Space Fleet. By day, he gets pushed around by his colleagues and staff and receives little credit for the company’s success. Away from work, he uses immersive virtual reality technology to play as Space Fleet Capt. Robert Daly aboard a virtual starship, taking out his frustrations and anger on the crew in increasingly cruel and inhumane ways. The crew members were replicated in the game using DNA from his boss and staff members that Robert obtained without their permission or knowledge. The crew revolt and escape to the open internet, leaving a seething “Capt. Robert” stranded in the game.
Of course, this wouldn’t be a true Black Mirror reveal if it didn’t contain some clues and hidden items in the teaser. The loading screen features the studio name Tuckersoft, a reference to the game studio in the interactive “Bandersnatch” movie. The cast names are listed in alphabetical order by first name, but they’ve been broken into eight groups. There are some cryptic phrases between the scrolling group names like “Too soon?”, “A rose for a rose” and “Shields 58 percent.”
Could these be episode titles? The latter definitely sounds like a reference to the USS Callister, and Brooker and company love symbolic episode titles taken from songs for their tech hell stories, like “Shut Up and Dance” and “Hang the DJ.” The new Black Mirror episodes haven’t even landed yet and they’re already screwing with our heads.
This article originally appeared on Engadget at https://www.engadget.com/entertainment/streaming/black-mirror-season-7-cast-revealed-in-a-cryptic-computer-message-203343148.html?src=rss
The film adaptation of the immensely popular sci-fi novel Mickey7 has been in the works for years, but now we finally have a trailer and it’s filled with surprises. For one thing, it’s now called Mickey17 and, well, fans of the book know exactly what that implies. It means they’re in for an even crazier experience than what’s written on the page.
The movie is written and directed by one of the modern masters, Bong Joon Ho, who seems to have taken some liberties with the source material. Light spoilers, but the book follows a series of clones of the titular Mickey as they perform the grunt work of colonizing an exoplanet. The book chronicles seven (ish) Mickey variants, but the movie is amping this up to at least 17. This will give us plenty more darkly hilarious clone deaths, which the trailer shows quite a lot of.
The novel is right up Bong Joon Ho’s alley. Clones are basically second-class citizens who exist to die for their corporate overlords. This leaves plenty of room for social satire in the vein of both Snowpiercer and Parasite. The trailer leans into this stuff and the results look truly entertaining and, believe it or not, really funny. We love to see unique IPs in the cinema, don’t we folks?
The various Mickeys are played by Robert Pattinson, so that’ll get some butts in the seats. The cast also includes Naomi Ackie, Steven Yeun, Toni Collette and Mark Ruffalo. As a book reader, I know who everyone is playing except for Ruffalo. That looks like a brand-new character, though he could be an amalgamation of a couple of minor players. Adaptations require some dark alchemy at times.
This could be the first big hit of 2025. It arrives in theaters on January 31. There’s also some franchise potential here, as the book already has one sequel and author Edward Ashton has been toying with ideas for a third entry.
This article originally appeared on Engadget at https://www.engadget.com/entertainment/tv-movies/bong-joon-hos-mickey17-trailer-is-even-crazier-than-the-book-170004844.html?src=rss