With Recall, Microsoft is using AI to fix Windows’ eternally broken search

At its Build 2024 conference, Microsoft unveiled Recall, a new feature that aims to make local Windows PC searches as quick and effective as web searches. Similar to third-party apps like Rewind, Microsoft’s Recall for Copilot+ PCs uses AI to retrieve virtually anything you’ve seen on your PC. Microsoft describes it as giving your PC a photographic memory.

At Monday’s event, Microsoft Product Manager Caroline Hernandez gave the example of searching for a blue dress on Pinterest using a Windows PC with Recall. Returning later, she searched the Recall timeline for “blue dress” (using her voice), which pulled up all of her recent searches, saving her from having to sift through browser history. She then refined the query with more specific details, like “blue pantsuit with sequined lace for Abuelita,” and Recall brought up the relevant results.

It can also quickly find specific emails, documents or chat threads you’ve had on your PC. Microsoft says Recall uses semantic associations to make connections. For example, it connected the term “peacock” to blue hues in the dress search.
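Microsoft hasn’t published how Recall’s semantic matching works, but embedding-based similarity search is the standard way to make this kind of “peacock means blue” connection. The toy sketch below is purely illustrative, not Microsoft’s implementation: the snapshot names and hand-picked three-dimensional vectors are made up, standing in for the embeddings a language model would produce, and query results are ranked by cosine similarity.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Pretend embeddings: in a real system these would come from a language
# model, which is why "peacock" can land near "blue dress" in vector space.
snapshots = {
    "blue dress on Pinterest": [0.9, 0.8, 0.1],
    "quarterly budget spreadsheet": [0.1, 0.0, 0.9],
    "peacock photo in chat": [0.8, 0.7, 0.2],
}

def search(query_vec, top_k=2):
    # Rank stored snapshots by similarity to the query embedding.
    ranked = sorted(snapshots.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# A "blue dress"-like query vector surfaces both the dress and the
# peacock snapshot, while the unrelated spreadsheet ranks last.
print(search([0.85, 0.75, 0.15]))
```

Real systems replace the hand-made vectors with model-generated embeddings and a vector index, but the ranking principle is the same.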

Other examples included Hernandez using Recall to find a specific PowerPoint slide with her voice. Microsoft says the feature can work from exact information or vague contextual clues to find what you want. Another example in the demo was a marketing line from a Teams meeting that Hernandez couldn’t remember; by giving Recall contextual clues, she surfaced it despite not remembering the exact phrase.

Microsoft says Recall’s processing is all done locally and won’t be used to train future AI models, so your data should remain private, secure and offline. The company says more than 40 local multimodal small language models, which can recognize text, images, video and more, are used to process Recall’s data.

Recall will be available exclusively on Copilot+ PCs after installing the latest Windows Updates on June 18.


This article originally appeared on Engadget at https://www.engadget.com/with-recall-microsoft-is-using-ai-to-fix-windows-eternally-broken-search-172510698.html?src=rss

Microsoft unveils Copilot+ PCs with generative AI capabilities baked in

We’ve been hearing rumblings for months now that Microsoft was working on so-called “AI PCs.” At a pre-Build event, the company spelled out its vision.

Microsoft is calling its version Copilot+ PCs, which CEO Satya Nadella described as a "new class of Windows PCs." These contain hardware designed to handle more generative AI Copilot processes locally, rather than relying on the cloud. Doing so requires a chipset with a neural processing unit (NPU), and manufacturers such as Qualcomm have been laying the groundwork with chips like the Snapdragon X Elite.

Microsoft is taking a partner-first approach to making Copilot+ PCs. Along with chipmakers like AMD, Intel and Qualcomm, major OEMs including Acer, ASUS, Dell, HP and Lenovo are on board. The first Copilot+ laptops are available to preorder today and they'll ship on June 18. Prices start at $999.

Yusuf Mehdi, Microsoft EVP and Consumer Chief Marketing Officer, said during the event that the company has completely reimagined what a Windows PC is. He claimed that Copilot+ PCs are the most powerful PCs ever (we'll need to see if that assertion holds up in real-world testing). Despite that, Mehdi said, the first generation of laptops are "unbelievably thin, light and beautiful." 

Other AI PCs on the market deliver around 10 TOPS (trillions of operations per second). To be dubbed a Copilot+ PC, a system will need to deliver at least 40 TOPS of NPU performance and have at least 16GB of RAM and 256GB of storage. Qualcomm claims the Snapdragon X Elite delivers up to 75 TOPS overall. But the pure specs matter less than what Microsoft is able to actually do with the hardware.
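The Copilot+ bar amounts to a simple eligibility check. The thresholds below are the ones Microsoft stated (40 NPU TOPS, 16GB of RAM, 256GB of storage); the class and function names are my own illustration, not anything Microsoft ships.

```python
from dataclasses import dataclass

@dataclass
class PCSpec:
    npu_tops: float   # NPU throughput in trillions of operations per second
    ram_gb: int
    storage_gb: int

def is_copilot_plus(spec: PCSpec) -> bool:
    # Minimums Microsoft announced for the Copilot+ PC badge.
    return (spec.npu_tops >= 40
            and spec.ram_gb >= 16
            and spec.storage_gb >= 256)

# A Snapdragon X Elite machine (45 NPU TOPS, per the event) qualifies;
# an earlier "AI PC" with a ~10 TOPS NPU falls well short.
print(is_copilot_plus(PCSpec(45, 16, 512)))   # True
print(is_copilot_plus(PCSpec(10, 16, 512)))   # False
```

Note the check is on NPU TOPS specifically, which is why Qualcomm's 75-TOPS overall figure isn't the number that matters for the badge.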

Mehdi also suggested Copilot+ PCs are 58 percent faster than M3-powered MacBook Airs (though it's worth noting Apple has more powerful M3 chips in its laptops already and M4 chips on the way very soon). The company suggested that Copilot+ laptops will offer up to 22 hours of battery life while playing videos locally and up to 15 hours while browsing the web.

To help make all of this happen, the Windows Copilot Runtime has more than 40 AI models that are part of a new Windows 11 layer. They're said to be deeply integrated into Windows to help them more efficiently access hardware and to power more robust privacy and security options. The models can work across any app, Microsoft says.

As far as the Windows features go, one aspect of Copilot+ PCs is something that's been rumored for a while. It’s called Recall, and you can think of it as a more advanced version of the Timeline feature from Windows 10. You'll be able to use natural language prompts to get your PC to resurface information based on what you remember about it. You'll be able to scroll through apps, documents and messages on an explorable timeline.

According to Mehdi, Microsoft built Recall with responsible AI standards in mind. Data from it will stay on your PC and it won't be used to train Microsoft's AI models.

Additionally, you'll be able to restore old snaps in Windows Photos using a tool called Super Resolution. In addition, the app will offer an option to tell a story based on your photos with the help of an AI narrator. Live Captions, meanwhile, will offer real-time captioning and translations into English from more than 40 languages (with more to come) from both live and pre-recorded video.

Microsoft now has its own upscaling tech (akin to NVIDIA's DLSS) for games too. It's called Auto Super Resolution and it's said to use AI to upscale the resolution of graphics and improve refresh rates in real-time without impacting performance.

There’s also a new Copilot app that you can use as a standalone window, sidebar or in full screen. You’ll be able to drag and drop elements into Copilot from elsewhere in Windows. Thanks to the new Copilot key on keyboards, you’ll be able to fire up the app with the touch of a physical button. Copilot will eventually be able to let you adjust Windows settings too.

Given that Qualcomm uses Arm architecture, it's perhaps little surprise that Microsoft has rebuilt Windows 11 for Arm-based chips. Microsoft has been trying to make Arm-based Windows PCs a thing for some time, with mixed results. We had major reservations about the Arm-powered Surface Pro 9 a couple of years ago. But perhaps the company has finally cracked that nut this time around. 

To help with that, Microsoft has developed an emulator called Prism that is said to be as efficient as Apple's Rosetta 2. The aim is to help users run legacy x86/x64 apps without a hitch. Major apps such as Zoom, Chrome, Spotify and Photoshop will run natively on Arm-based Windows.

The Copilot+ PC is the natural progression of something we've seen in flagship Android phones over the last couple of years. The most recent Google Pixel devices, for instance, handle many generative AI processes on-device by tapping into the power of the company's Tensor chips.

Meanwhile, Apple is largely expected to move into the generative AI space in a major way at its Worldwide Developers Conference next month. The M4 chip that recently debuted in the new iPad Pro is said to be capable of powering GAI experiences and that chipset should be coming to Macs later this year. Apple's also said to be working on a deal with OpenAI, perhaps to bring its generative AI tech to Siri.



The Morning After: What to expect from Microsoft Build 2024

Normally, Microsoft’s Build is a straightforward (often dry) showcase of the company’s software and hardware developments, with a dash of on-stage coding to excite the developer crowd. But this year, the company is likely to make some huge AI moves, following its 2023 debut of ChatGPT-powered Bing Chat. Then, there’s new Surface hardware.

In fact, Microsoft has a showcase for new Surfaces and AI in Windows 11 on May 20, while Build itself kicks off a day later. And you know what? The Surface event might be the most impactful.

Rumors suggest we’ll see some of the first systems with Qualcomm’s Arm-based Snapdragon X Elite chip alongside new features in the next major Windows 11 update.

A refresh for its consumer PCs is likely to consist of new 13- and 15-inch Surface Laptop 6 models with thinner bezels, larger trackpads, improved port selection and that X Elite chip. We might even see an Arm-based version of the Surface Pro 10 too.

While Intel confirmed Microsoft is already working on ways to make Copilot local, we could see that reach consumers as well. By local, I mean the AI assistant could answer simpler questions, like basic math or queries about files on your system, without an internet connection.

— Mat Smith

Apple will reportedly offer higher trade-in credit for old iPhones for the next two weeks

Indie developers are trying to make horse games that don’t suck

X-Men 97 didn’t have to go that hard



Just like it slimmed down the latest iPad Pro, Apple may try to do the same to the iPhone. To be more precise, the company is working on a “significantly thinner” device that could arrive in 2025, according to The Information. An upgraded front-facing camera could sit alongside Face ID sensors in a smaller pill-shaped cutout, while the rear camera array could move to the center of the phone. The screen size would reportedly be between that of the current base iPhone and the iPhone Pro Max — so between 6.12 and 6.69 inches.


Slack is training its machine learning models on user messages, files and other content, without explicitly asking for permission. This means your private data is being used by default. To opt out, you need your organization’s Slack administrator (IT, HR, etc.) to contact Slack on your behalf.

In response to concerns, Slack recently clarified its data use in a blog post, assuring users that customer data is not used to train generative AI products, which typically rely on external large language models (LLMs). The company uses this data to train machine learning models for features like channel and emoji recommendations and search results. However, it’s misleading, at best, to say customers can opt out when “customers” doesn’t include the employees working within an organization. And the assurance implies all your data is safe from AI training, when the company apparently gets to pick and choose which AI models the statement covers.



Two reliable leaks have shown off the entry-level Moto Razr 50 and high-end Razr 50 Ultra (likely branded as the 2024 Razr and Razr+ in the US) before Motorola has even told us about them. The entry-level Razr (2024) will supposedly have a 3.63-inch cover display, quite a step up from the piddly 1.5-inch cover display on the 2023 version I tested. Sadly, there’s no sign of the wood option included in the Edge 50 phone series unveiled last month.



What to expect from Microsoft Build 2024: The Surface event, Windows 11 and AI

If you can't tell by now, just about every tech company is eager to pray at the altar of AI, for better or worse. Google's recent I/O developer conference was dominated by AI features, like its seemingly life-like Project Astra assistant. Just before that, OpenAI debuted GPT 4o, a free and conversational AI model that's disturbingly flirty. Next up is Microsoft Build 2024, the company's developer conference that's kicking off next week in Seattle.

Normally, Build is a fairly straightforward celebration of Microsoft's devotion to productivity, with a dash of on-stage coding to excite the developer crowd. But this year, the company is gearing up to make some more huge AI moves, following its debut of the ChatGPT-powered Bing Chat in early 2023. Take that together with rumors around new Surface hardware, and Build 2024 could potentially be one of the most important events Microsoft has ever held.

But prior to Build, Microsoft is hosting a showcase for new Surfaces and AI in Windows 11 on May 20. (It won't be livestreamed, but Engadget will be liveblogging the Surface event starting 1 PM ET.) Build kicks off a day later on May 21 (you can watch the Build event livestream on Engadget). For the average Joe, the Surface event is shaping up to be the more impactful of the two, as rumors suggest we will see some of the first systems featuring Qualcomm’s Arm-based Snapdragon X Elite chip alongside new features coming in the next major Windows 11 update.

That's not to say it's all rosy for the Windows maker. Build 2024 is the point where we'll see if AI will make or break Microsoft. Will the billions in funding towards OpenAI and Copilot projects actually pay off with useful tools for consumers? Or is the push for AI, and the fabled idea of "artificial general intelligence," inherently foolhardy as it makes computers more opaque and potentially untrustworthy? (How, exactly, do generative AI models come up with their answers? It's not always clear.)

Here are a few things we expect to see at Build 2024:

While Microsoft did push out updates to the Surface family earlier this spring, those machines were meant more for enterprise customers, so they aren’t available for purchase in regular retail stores. A Microsoft spokesperson told us at the time that it "absolutely remain[s] committed to consumer devices," and that the commercial-focused announcement was "only the first part of this effort."

Instead, the company's upcoming refresh for its consumer PCs is expected to consist of new 13- and 15-inch Surface Laptop 6 models with thinner bezels, larger trackpads, improved port selection and the aforementioned X Elite chip. There’s a good chance that at the May 20 showcase, we’ll also see an Arm-based version of the Surface Pro 10, which will sport a similar design to the business model that came out in March, but with revamped accessories including a Type Cover with a dedicated Copilot key.

According to The Verge, Microsoft is confident that these new systems could outmatch Apple's M3-powered MacBook Air in raw speed and AI performance.

The company has also reportedly revamped emulation for x86 software in its Arm-based version of Windows 11. That's a good thing, since poor emulation was one of the main reasons we hated the Surface Pro 9 5G, a confounding system powered by Microsoft's SQ3 Arm chip. That mobile processor was based on Qualcomm's Snapdragon 8cx Gen 3, which was unproven in laptops at the time. Using the Surface Pro 9 5G was so frustrating we felt genuinely offended that Microsoft was selling it as a "Pro" device. So you can be sure we're skeptical about any amazing performance gains from another batch of Qualcomm Arm chips.

It'll also be interesting to see if Microsoft's new consumer devices look any different than their enterprise counterparts, which were basically just chip swaps inside of the cases from the Surface Pro 9 and Laptop 5. If Microsoft is actually betting on mobile chips for its consumer Surfaces, there's room for a complete rethinking of its designs, just like how Apple refashioned its entire laptop lineup around its M-series chips.

Aside from updated hardware, one of the biggest upgrades on these new Surfaces should be vastly improved on-device AI and machine learning performance thanks to the Snapdragon X Elite chip, which can deliver up to 45 TOPS (trillions of operations per second) from its neural processing unit (NPU). This is key because Microsoft has previously said PCs will need at least 40 TOPS in order to run Windows AI features locally. This leads us to some of the additions coming in the next major build of Microsoft’s OS, including something the company is calling its AI Explorer, expanded Studio effects and more.

According to Windows Central, AI Explorer is going to be Microsoft’s catch-all term covering a range of machine learning-based features. This is expected to include a revamped search tool that lets users look up everything from websites to files using natural language input. There may also be a new timeline that will allow people to scroll back through anything they've done recently on their computer and the addition of contextual suggestions that appear based on whatever they're currently looking at. And building off of some of the Copilot features we’ve seen previously, it seems Microsoft is planning to add support for tools like live captions, expanded Studio effects (including real-time filters) and local generative AI tools that can help create photos and more on the spot.

Microsoft wants an AI Copilot in everything. The company first launched GitHub Copilot in 2021 as a way to let programmers use AI to deal with mundane coding tasks. At this point, all of the company's other AI tools have also been rebranded as "Microsoft Copilot" (that includes Bing Chat, and Microsoft 365 Copilot for productivity apps). With Copilot Pro, a $20 monthly offering launched earlier this year, the company provides access to the latest GPT models from OpenAI, along with other premium features.

But there's still one downside to all of Microsoft's Copilot tools: They require an internet connection. Very little work is actually happening locally, on your device. That could change soon, though, as Intel confirmed that Microsoft is already working on ways to make Copilot local. That means it may be able to answer simpler questions, like basic math or queries about files on your system, more quickly without hitting the internet at all. As impressive as Microsoft's AI assistant can be, it still typically takes a few seconds to deal with your questions.

After all the new hardware and software are announced, Build is positioned to help developers lay even more groundwork to better support those new AI and expanded Copilot features. Microsoft has already teased things like Copilot on Edge and Copilot Plugins for 365 apps, so we’re expecting to hear more on how those will work. And by taking a look at some of the sessions already scheduled for Build, we can see there’s a massive focus on everything AI-related, with breakouts for Customizing Microsoft Copilot, Copilot in Teams, Copilot Extensions and more.

While Microsoft will surely draw a lot of attention, it’s important to mention that it won’t be the only manufacturer coming out with new AI PCs. Alongside revamped Surfaces, we’re expecting to see a whole host of other laptops featuring Qualcomm’s Snapdragon X Elite chip (or possibly the X Plus) from major vendors like Dell, Lenovo and more.

Admittedly, following the intense focus Google put on AI at I/O 2024, the last thing people may want to hear about is yet more AI. But at this point, like most of its rivals, Microsoft is betting big on machine learning to grow and expand the capabilities of Windows PCs.



Meta’s Threads gets its own Tweetdeck clone

The web version of Threads could soon be much more useful. Meta is starting to test custom Tweetdeck-like feeds that will allow users to track multiple topics, searches and accounts in a single view.

People who are part of the test can set up “pinned columns” that will track updates around specific topics, tags, accounts or search terms. Users can also opt to have these columns automatically refresh with new content. For now, Threads will support up to 100 different columns, though a Meta spokesperson said that number may change as the test progresses.

Based on screenshots shared by Mark Zuckerberg, the new Threads columns look a lot like Tweetdeck, the desktop app long favored by Twitter’s power users. The app is now called X Pro and only available to X’s paid subscribers.

The test is the latest sign Meta is looking to make Threads a more reliable source for real-time information. The company has also added a “recent” tab and trending topics to search. But being able to track multiple feeds of updates at once is even more useful. It could also address long-running complaints about Threads’ algorithmic “for you” feed, which tends to surface a random mix of days-old posts and bizarre personal stories from unconnected accounts.

It’s not clear how many people will be part of Meta’s initial test of the feature, though Adam Mosseri said the company is looking for feedback on the changes. The company has often rolled out major Threads changes to a small group of users first before making them more widely available.

Update May 16, 2024, 2:15 PM ET: Added details about how many columns Threads will support.



Google’s accessibility app Lookout can use your phone’s camera to find and recognize objects

Google has updated some of its accessibility apps to add capabilities that will make them easier to use for people who need them. It has rolled out a new version of the Lookout app, which can read text and even lengthy documents out loud for people with low vision or blindness. The app can also read food labels, recognize currency and can tell users what it sees through the camera and in an image. Its latest version comes with a new "Find" mode that allows users to choose from seven item categories, including seating, tables, vehicles, utensils and bathrooms.

When users choose a category, the app will be able to recognize objects associated with them as the user moves their camera around a room. It will then tell them the direction or distance to the object, making it easier for users to interact with their surroundings. Google has also launched an in-app capture button, so they can take photos and quickly get AI-generated descriptions. 

[Image: A screenshot showing object categories in Google Lookout, such as Seating & Tables, Doors & Windows, Cups, etc. Credit: Google]

The company has updated its Look to Speak app, as well. Look to Speak enables users to communicate with other people by using eye gestures to select phrases they want the app to speak out loud. Now, Google has added a text-free mode that gives them the option to trigger speech by choosing from a photo book containing various emojis, symbols and photos. Even better, they can personalize what each symbol or image means for them.

Google has also expanded its screen reader capabilities for Lens in Maps, so that it can tell the user the names and categories of the places it sees, such as ATMs and restaurants. It can also tell them how far away a particular location is. In addition, it's rolling out improvements for detailed voice guidance, which provides audio prompts that tell the user where they're supposed to go. 

Finally, Google has made Maps' wheelchair information accessible on desktop, four years after it launched on Android and iOS. The Accessible Places feature allows users to see if the place they're visiting can accommodate their needs — businesses and public venues with an accessible entrance, for example, will show a wheelchair icon. They can also use the feature to see if a location has accessible washrooms, seating and parking. The company says Maps has accessibility information for over 50 million places at the moment. Those who prefer looking up wheelchair information on Android and iOS will now also be able to easily filter reviews focusing on wheelchair access. 

Google made all these announcements at this year's I/O developer conference, where it also revealed that it open-sourced more code for the Project Gameface hands-free "mouse," allowing Android developers to use it for their apps. The tool allows users to control the cursor with their head movements and facial gestures, so that they can more easily use their computers and phones. 

Catch up on all the news from Google I/O 2024 right here!

This article originally appeared on Engadget at https://www.engadget.com/googles-accessibility-app-lookout-can-use-your-phones-camera-to-find-and-recognize-objects-160007994.html?src=rss

Intel’s Thunderbolt Share makes it easier to move large files between PCs

Intel has launched a new software application called Thunderbolt Share that will make controlling two or more PCs a more seamless experience. It will allow you to sync files between PCs through its interface, or see multiple computers' folders so you can drag and drop specific documents, images and other file types. That makes collaboration easier when you're transferring particularly hefty files, such as raw photos or unedited videos, to a colleague. You can also use the app to transfer data from an old PC to a new one, so you don't have to use an external drive to facilitate the move.

When it comes to screen sharing, Intel says the software can retain the resolution of the source PC without compression, as long as the source display tops out at Full HD at up to 60 frames per second. The mouse cursor and keyboard also remain smooth and responsive between PCs, thanks to Thunderbolt's high bandwidth and low latency.

The company says it's licensing Thunderbolt Share to OEMs as a value-add feature for their upcoming PCs and accessories. You will need Windows computers with Thunderbolt 4 or 5 ports to be able to use it, and they have to be directly connected with a Thunderbolt cable, or connected to the same Thunderbolt dock or monitor. The first devices that support the application will be available in the second half of 2024 and will be coming from various manufacturers, including Lenovo, Acer, MSI, Razer, Kensington and Belkin.

This article originally appeared on Engadget at https://www.engadget.com/intels-thunderbolt-share-makes-it-easier-to-move-large-files-between-pcs-123011505.html?src=rss