Many fans of 2014’s Alien: Isolation video game praised its graphics, story and general gameplay, making it a bit of a standout from many other games adapted from a movie. It even received seven separate DLC packs. Gamers have been enjoying this cult classic on mobile and Switch for a while now, but today, on the game’s 10th anniversary, the developers announced that they’re developing a sequel.
Yes, you heard us right. Alien: Isolation now has a sequel in early development. If you don’t believe us, take the official X account’s word for it.
We currently don’t have any other details on this sequel, but a look back at reviews for Alien: Isolation should give you hope we’ll get another solid game here. If the sequel is anything like the original, then you can expect horror, stealth and second-guessing yourself just as the claws of an alien take your life.
This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/sequel-to-2014s-alien-isolation-is-now-in-development-162213148.html?src=rss
At this point, you probably either love the idea of making realistic videos with generative AI, or you think it's a morally bankrupt endeavor that devalues artists and will usher in a disastrous era of deepfakes we'll never escape from. It's hard to find middle ground. Meta isn't going to change minds with Movie Gen, its latest video creation AI model, but no matter what you think of AI media creation, it could end up being a significant milestone for the industry.
Movie Gen can produce realistic videos alongside music and sound effects at 16 fps or 24 fps at up to 1080p (upscaled from 768 by 768 pixels). It can also generate personalized videos if you upload a photo, and crucially, it appears to be easy to edit videos using simple text commands. Notably, it can also edit normal, non-AI videos with text. It's easy to imagine how that could be useful for cleaning up something you've shot on your phone for Instagram. Movie Gen is purely research at the moment; Meta won't be releasing it to the public, so we have a bit of time to think about what it all means.
The company describes Movie Gen as its "third wave" of generative AI research, following its initial media creation tools like Make-A-Scene, as well as more recent offerings using its Llama AI model. It's powered by a 30 billion parameter transformer model that can make 16-second-long 16 fps videos, or 10-second-long 24 fps footage. It also has a 13 billion parameter audio model that can make 45 seconds of 48kHz audio, like "ambient sound, sound effects (Foley), and instrumental background music" synchronized to video. There's no synchronized voice support yet "due to our design choices," the Movie Gen team wrote in their research paper.
Meta says Movie Gen was initially trained on "a combination of licensed and publicly available datasets," including around 100 million videos, a billion images and a million hours of audio. The company's language is a bit fuzzy when it comes to sourcing: Meta has already admitted to training its AI models on data from every Australian user's account, and it's even less clear what the company is using outside of its own products.
As for the actual videos, Movie Gen certainly looks impressive at first glance. Meta says that in its own A/B testing, people have generally preferred its results compared to OpenAI's Sora and Runway's Gen3 model. Movie Gen's AI humans look surprisingly realistic, without many of the gross telltale signs of AI video (disturbing eyes and fingers, in particular).
"While there are many exciting use cases for these foundation models, it’s important to note that generative AI isn’t a replacement for the work of artists and animators," the Movie Gen team wrote in a blog post. "We’re sharing this research because we believe in the power of this technology to help people express themselves in new ways and to provide opportunities to people who might not otherwise have them."
It's still unclear what mainstream users will do with generative AI video, though. Are we going to fill our feeds with AI video, instead of taking our own photos and videos? Or will Movie Gen be deconstructed into individual tools that can help sharpen our own content? We can already easily remove objects from the backgrounds of photos on smartphones and computers, and more sophisticated AI video editing seems like the next logical step.
This article originally appeared on Engadget at https://www.engadget.com/ai/metas-movie-gen-looks-like-a-huge-leap-forward-for-ai-video-but-you-cant-use-it-yet-165717605.html?src=rss
Meta has spent the last few years saying that “young adults” are crucial to the future of Facebook. Now, the company is introducing a number of changes to its 20-year-old social network in an effort to get younger users to spend more time in the app.
The updates include a new “local” section in the Facebook app that aims to surface information relevant to your local community, a renewed focus on events planned on the service and a new “Communities” feature for Messenger. The changes, Meta claims, will help young adults “explore their interests and connect with the world beyond their close friends.”
Emphasizing events isn’t an entirely new strategy for the company. It launched a standalone events app in 2016 and then rebranded it a year later to focus on “local” businesses and happenings. It quietly killed the app in 2021.
Meta is taking a slightly different approach this time. The new “local” section will surface Marketplace listings, Reels and posts from Facebook groups alongside event listings from your community. Local news, which Meta has also previously boosted, is notably absent from Meta’s announcement.
In addition to the local tab, the company is also trying to make events more prominent in Facebook. Facebook will now provide personalized event recommendations in the form of a weekly and weekend digest that will be pushed to users via in-app notifications. The company is also changing how invitations to Facebook events work so users can send invites to their connections on Instagram and via SMS and email.
Groups on Facebook, which Meta has said are among the features most used by young adults, are also getting attention in this update. Meta is experimenting with a “customizable Group AI” that allows admins to create a bot that can chat with members to answer questions based on posts that have been shared in the group. Elsewhere in the app, Meta is starting to test an Instagram-like Explore section and a dedicated space for Reels inside of Facebook.
On Messenger, Meta is adding a new “Communities” feature, a concept it previously introduced on WhatsApp. Communities allows “small to medium-sized” groups to organize their conversations and interact in a way that’s more like a Facebook group. Members can create topic-based chats, and there are built-in moderation and admin tools for controlling who can join.
The changes are part of a broader effort by Meta to bring younger people back to its app with features tailored around how they use social media. “Facebook is still for everyone, but in order to build for the next generation of social media consumers, we’ve made significant changes with young adults in mind,” the Facebook app’s head, Tom Alison, wrote in May.
Whether Meta’s latest efforts will be successful, though, is unclear. The company says there are more than 40 million young adults on Facebook in the US and Canada, a number that’s “the highest it’s been in more than 3 years.” But that’s still a relatively small percentage of its total users in the region and an even tinier slice of its users overall.
This article originally appeared on Engadget at https://www.engadget.com/social-media/facebook-is-pushing-local-content-and-events-to-try-to-win-back-young-adults-161742961.html?src=rss
Meta just announced several updates coming to Facebook during the company’s IRL event in Austin. It's testing an Explore tab and adding a new video tab.
Let’s start with the Explore tab. If you’ve ever perused Instagram, you likely know exactly how this will work. This tab will house “a variety of content tailored to your interests.”
Meta says that the algorithm has been designed to serve up “content that doesn’t just entertain, but helps you dive deeper into your interests.” Here’s hoping I get nothing but content about wild traversal strategies in The Legend of Zelda: Echoes of Wisdom. In any event, the new Explore tab is still in the testing phase so it could be a bit before a wide rollout.
The video tab is also getting a major update to accommodate Reels. All of the video content on Facebook will now be housed behind this tab. The content will stream on a full-screen video player that lets users “seamlessly watch the best short-form, long-form and live videos in a single experience.”
The updated video tab starts rolling out to users in the “coming weeks.” This is definitely an attempt by Meta to capture some of those younger eyeballs, as the announcement was accompanied by statistics indicating that young adults on Facebook spend around 60 percent of their time watching videos and Reels.
I got news for you, Meta. My dad, who is not a young adult, also spends all of his time on Facebook watching videos and Reels. So we’ll all benefit from this expanded video tab.
This article originally appeared on Engadget at https://www.engadget.com/social-media/facebook-is-testing-an-instagram-like-explore-tab-and-introducing-a-new-video-tab-for-reels-153033149.html?src=rss
Paramount just announced that it's going ahead with a new video game based on Avatar: The Last Airbender, which will be developed by Saber Interactive. For the uninitiated, Saber is behind titles like Snowrunner and Teardown. It also has plenty of experience making licensed content, as it published Evil Dead: The Game and World War Z: Aftermath, among others.
A new game in the Avatar-verse isn’t that notable on its own. After all, there have been plenty already. Paramount is already crowing about the title, though, calling it a “AAA RPG” and claiming it’ll be the “biggest video game in franchise history.” That’s not exactly a high bar, given the cartoon’s rocky history in gaming. There was that one good Bayonetta-like game that featured Avatar Korra, but everything else is pretty much trash.
This upcoming RPG won’t follow Aang or Korra. Players will control “an all-new, never-before-seen Avatar.” The game’s set “thousands of years” before the events of Avatar: The Last Airbender and The Legend of Korra. The story has been “developed in close collaboration with Avatar Studios”, though we don’t know if franchise creators Michael Dante DiMartino and Bryan Konietzko are involved in any way.
This looks to be an action RPG and not a turn-based affair, as a press release suggests “dynamic combat” and a quest to “master all four elements.” However, there’s no release date and no suggestion as to how far along the game is. Paramount says it’ll be available “soon”, but the company hasn't released a trailer or even artwork, so one person’s “soon” is another person’s “probably sometime in 2026.”
In any event, sign me up. I’m a big-time cabbage head, or honorary member of the Aang Gang or whatever fans are called. Saber Interactive has proven itself worthy with other pre-existing IPs, so why not this one? It could work.
The Avatar franchise has been relatively quiet lately, though the live-action Netflix show was renewed for two more seasons to finish up the story. Franchise creators DiMartino and Konietzko are making an animated film that follows an adult Aang and friends, but it’s been a while since we’ve heard anything about that.
This article originally appeared on Engadget at https://www.engadget.com/gaming/saber-interactive-is-making-a-aaa-rpg-based-on-avatar-the-last-airbender-171655351.html?src=rss
Google is adding more AI to search. On Thursday, the company unveiled a long list of changes, including AI-organized web results, Google Lens updates (including video and voice) and placing links and ads inside AI Overviews.
It's easy to suspect that AI-organized search results are where Google will eventually move across the board, but the rollout starts with a narrow scope. Beginning with recipes and meal inspiration, Google’s AI will create a “full-page experience” that includes relevant results based on your search. The company says the AI-collated pages will consist of “perspectives from across the web,” like articles, videos and forums.
Google’s AI Overviews, the snippets of AI-generated info you see above web results, are getting some enhancements, too. The company is incorporating a new link-laden design with more “prominent links to supporting webpages” within the section. Google says its tests have shown the design increased traffic to the supporting websites it links to.
Ads are also coming to AI Overviews — an inevitable outcome if ever there was one. The company says they’re rolling out in the US, so don’t be shocked if you start seeing them soon.
left to right: Google Lens speak to search, ads in AI Overviews, Lens video search
Circle to Search is getting Shazam-like capabilities. The feature will now instantly search for songs you hear without switching apps. Google also noted that Circle to Search is now available on over 150 million Android devices, as it’s expanded in reach and capabilities since its January launch.
Google Lens, the company’s seven-year-old visual search feature for mobile, is getting some upgrades, too. It can now search via video and voice, letting you ask “complex questions about moving images.” The company provides the example of seeing fish at an aquarium and using Lens to ask it aloud, “Why are they swimming together?” According to Google, the AI will use the video clip and your voice recording to identify the species and explain why they hang out together.
Along similar lines, you can now ask Google Lens questions with your voice while taking a picture. “Just point your camera, hold the shutter button and ask whatever’s on your mind — the same way you’d point at something and ask your friend about it,” the company wrote.
Google Lens is also upgrading its shopping chops. The company describes the upgraded visual product search as “dramatically more helpful” than its previous version. The AI results will now include essential information about the searched product, including reviews, prices across different retailers and where to buy.
The Google Lens capabilities are all rolling out now, although some require an opt-in. Video searches are available globally for Search Labs users; you’ll find them in the “AI Overviews and more” experiment. Voice input for Lens is now available for English users in the Google app on Android and iOS. Finally, enhanced shopping with Lens starts rolling out this week.
This article originally appeared on Engadget at https://www.engadget.com/ai/google-stuffs-more-ai-into-search-160003918.html?src=rss
Soon after launching AI playlists in the US, Spotify is adding a new way to keep the music going when you lose your internet connection. The new Offline Backup feature for iOS and Android automatically creates a playlist of your queued and recently played tracks, ready for listening on flights or off-the-grid excursions. Offline Backup is for Premium users only.
The feature complements Spotify’s existing offline mode for user-triggered downloads. In contrast, the Offline Backup playlist doesn’t require any manual downloads. So, think of it as more preparation with less planning. (And, of course, the standard offline mode will still be there.)
Spotify says the playlist will “evolve,” learning your habits as you continue to listen. It will also include the tracks already cached on your device from regular use.
Once you go offline, the Offline Backup playlist will appear automatically in your Home feed. After it populates, you can filter and sort songs within it to more easily nail down the artist, genre or vibe you’re feeling. Spotify also lets you add the playlist to your library for easier access.
You’ll need to turn it on manually to start using the feature. You’ll find it under Data Saving and Offline or Storage in the Spotify app’s settings. Turn on the toggle for Offline Listening to activate Offline Backup.
Offline Backup is available now for Spotify Premium subscribers globally. (You’ll also need to have listened to at least five songs recently.) If you don’t see it after toggling it on and going offline, the company recommends checking for updates to the Spotify app.
This article originally appeared on Engadget at https://www.engadget.com/apps/spotify-can-now-automatically-create-a-playlist-for-airplane-mode-120038259.html?src=rss
Can you hear the soft, cherubic voices of corporate executives singing in unison? That can only mean one thing. They’ve figured out a new way to squeeze money out of our eyeballs. Amazon is adding even more ads to Prime Video, according to reporting by the Financial Times. This uptick in corporate-sponsored splendor will go into effect early next year.
This comes less than a year after Amazon forced ads onto its streaming video platform, which is something all of the major streamers do now. We pay money to watch ads. It’s pretty darn cool. In any event, it remains unclear as to how many more ads will infest that next episode of Reacher or where they’ll be placed. Modern streaming shows aren’t made with advertisements in mind, so these ads just kinda pop up wherever.
Ads have turned into a serious revenue stream for Amazon because, again, they sit on top of our monthly Prime memberships that we already pay for. It costs extra to go ad-free. The company recently crowed that it drew more than $1.8 billion in advertising commitments at an upfront event in September. This exceeded the company’s own targets. Amazon also revealed that the ad tier of Prime Video reaches 19 million monthly users in the UK alone. This tier is used by over 100 million people in the US each month.
Kelly Day, vice-president of Prime Video International, told the Financial Times that the platform launched with “a very light load” of ads at first, so as to prepare consumers for the coming onslaught. She said the initial rollout was a deliberate “gentle entry into advertising.”
“We know it was a bit of a contrarian approach to take,” she said. “But it’s actually gone much better than we even anticipated.” Day added that the company has not seen “a groundswell of people churning out or canceling" after it brought in advertisements.
The company is also readying an interactive ad experience that will allow Prime Video watchers to add an item to their cart straight from the video stream. This will work with physical remotes and on the app. Sweet, sweet corporate synergy. Yay!
This article originally appeared on Engadget at https://www.engadget.com/big-tech/more-ads-are-coming-to-amazon-prime-video-182906957.html?src=rss
Meta is consolidating its three creator monetization programs for Facebook to make it easier for users to start earning on the social network. The company currently offers creators three ways to earn: in-stream ads, ads on Reels and performance bonuses. Each one has a different eligibility requirement and sign-up process. The new Facebook Content Monetization program will simplify things for creators who want to earn on the platform, since they'll only need to apply and go through the onboarding process once.
In its announcement, Meta said it paid creators more than $2 billion for their Reels, videos, photos and text posts over the past year. However, it also said that creators aren't able to maximize what they could make on the platform, and only one-third of them earn from more than one of its programs. The consolidated scheme will work just like its older programs in that it has a performance-based payout model. Monetized users can still earn from the ads in their reels, longer videos, photos and text posts. Meta will give them access to a new Insights tab, though, which shows how much money they're making on different content formats. They can also see which videos and posts are making the most money. Previously, the company had separate insights tabs for each program.
The new monetization feature is still in beta and will remain there until next year. This week, Meta will start inviting 1 million creators already earning on the social network to take part in its beta testing, and it will continue sending invites to more people in the coming months. Creators don't have to take part in the test if they don't want to, but if they do, they can't rejoin Facebook's standalone monetization schemes. Those who don't get an invitation anytime soon but want to join the new program can express their interest through Facebook's official content monetization page.
This article originally appeared on Engadget at https://www.engadget.com/social-media/meta-wants-to-make-it-easy-for-creators-to-earn-on-facebook-150037046.html?src=rss
Palworld could be on its way to a mobile device near you. Krafton, the publisher of PUBG: Battlegrounds, has agreed to a licensing deal with the game’s maker, Pocketpair, to bring the smash hit to mobile.
Krafton’s PUBG Studios will develop the mobile version. No other details have been announced, other than to note that PUBG Studios will “reinterpret” Palworld’s gameplay for mobile devices, per an automated translation of a press release (which is in Korean). So it’s not completely clear whether this will be a faithful port of the full game or a spinoff that has some of the same features.
However, there’s a reason that Palworld isn’t available on PS5 in Japan for now. The game’s similarity to Pokémon (here, you also catch a variety of monsters, but some of ‘em have guns and you can also eat them) caught the attention of Nintendo and The Pokémon Company. The latter indicated in January that it was investigating the would-be rival. In September, the two companies filed suit against Pocketpair in Japan for alleged patent infringement.
This article originally appeared on Engadget at https://www.engadget.com/gaming/palworld-is-bound-for-mobile-thanks-to-the-maker-of-pubg-141104110.html?src=rss