Meta’s Orion prototype offers a glimpse into our AR future

If you’re excited, or even just a little curious, about the future of augmented reality, Meta’s Orion prototype makes the most compelling case yet for the technology.

For Meta, Orion is about more than finally making AR glasses a reality. It’s also the company’s best shot at becoming less dependent on Apple and Google’s app stores, and the rules that come with them. If Orion succeeds, then maybe we won’t need smartphones for much at all. Glasses, CEO Mark Zuckerberg has speculated, might eventually become “the main way we do computing.”

At the moment, it’s still way too early to know if Zuckerberg’s bet will actually pay off. Orion is, for now, still a prototype. Meta hasn’t said when it might become widely available or how much it might cost. That's partly because the company, which has already poured tens of billions of dollars into AR and VR research, still needs to figure out how to make Orion significantly more affordable than the $10,000 it reportedly costs to make the current version. It also needs to refine Orion’s hardware and software. And, perhaps most importantly, the company will eventually need to persuade its vast user base that AI-infused, eye-tracking glasses offer a better way to navigate the world.

Still, Meta has been eager to show off Orion since its reveal at Connect. And, after recently getting a chance to try out Orion for myself, it’s easy to see why: Orion is the most impressive AR hardware I’ve seen.

Meta has clearly gone to great lengths to make its AR glasses look, well, normal. While Snap has been mocked for its oversized Spectacles, Orion’s shape and size are closer to a traditional pair of frames.

Even so, they’re still noticeably wide and chunky. The thick black frames, which house an array of cameras, sensors and custom silicon, may work on some face shapes, but I don’t think they are particularly flattering. And while they look less cartoonish than Snap’s AR Spectacles, I’m pretty sure I’d still get some funny looks if I walked around with them in public. At 98 grams, the glasses were noticeably bulkier than my typical prescription lenses, but never felt heavy.

In addition to the actual glasses, Orion relies on two other pieces of kit: a 182-gram “wireless compute puck,” which needs to stay near the glasses, and an electromyography (EMG) wristband that allows you to control the AR interface with a series of hand gestures. The puck I saw was equipped with its own cameras and sensors, but Meta told me it has since simplified the remote control-shaped device so that it’s mainly used for connectivity and processing.

When I saw the three-piece Orion setup at Connect, my first thought was that it was an interesting compromise to keep the glasses smaller. But after trying it all together, it doesn’t feel like a compromise at all.

What the Orion glasses look like on.
The glasses were a bit wider than my face.
Karissa Bell for Engadget

You control Orion’s interface through a combination of eye tracking and gestures. After a quick calibration the first time you put the glasses on, you can navigate the AR apps and menus by glancing around the interface and tapping your thumb and index finger together. Meta has been experimenting with wrist-based neural interfaces for years, and Orion’s EMG wristband is the result of that work. The band, which feels like little more than a fabric watch band, uses sensors to detect the electrical signals that occur with even subtle movements of your wrist and fingers. Meta then uses machine learning to decode those signals and send them to the glasses.
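
To make the idea of decoding EMG signals a little more concrete, here is a minimal, purely illustrative sketch: slice the wristband’s multi-channel signal into short windows, compute simple per-channel features and feed them to an off-the-shelf classifier. Everything in it, from the synthetic data to the window length and feature choices, is an assumption for illustration; Meta hasn’t disclosed its actual pipeline.

```python
# Illustrative EMG gesture decoding, NOT Meta's actual pipeline.
# Synthetic data stands in for a real multi-channel wrist EMG recording.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
N_CHANNELS = 4
WINDOW = 200  # samples per analysis window (e.g. 200 ms at 1 kHz, assumed)

def make_window(gesture):
    """Fake an EMG window; a 'pinch' adds a burst of activity on two channels."""
    x = rng.normal(0.0, 1.0, size=(N_CHANNELS, WINDOW))
    if gesture == "pinch":
        x[:2] += rng.normal(0.0, 4.0, size=(2, WINDOW))
    return x

def features(window):
    """Classic EMG features per channel: RMS and mean absolute value."""
    rms = np.sqrt((window ** 2).mean(axis=1))
    mav = np.abs(window).mean(axis=1)
    return np.concatenate([rms, mav])

# Build a toy training set and fit a simple classifier.
labels = ["rest", "pinch"] * 200
X = np.array([features(make_window(g)) for g in labels])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)

# "Live" decoding: classify a fresh window and hand the gesture to the UI.
print(clf.predict([features(make_window("pinch"))]))  # ['pinch']
```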

That may sound complicated, but I was surprised by how intuitive the navigation felt. The combination of quick gestures and eye tracking felt much more precise than hand tracking controls I’ve used in VR. And while Orion also has hand-tracking abilities, it feels much more natural to quickly tap your fingers together than to extend your hands out in front of your face.

Meta walked me through a number of demos meant to show off Orion’s capabilities. I asked Meta AI to generate an image, and to come up with recipes based on a handful of ingredients on a shelf in front of me. The latter is a trick I’ve also tried with the Ray-Ban Meta Smart Glasses, except with Orion, Meta AI was also able to project the recipe steps onto the wall in front of me.

I also answered a couple of video calls, including one from a surprisingly lifelike Codec Avatar. I watched a YouTube video, scrolled Instagram Reels and dictated a response to an incoming message. If you’ve used a mixed reality headset, much of this will sound familiar; little of it was dramatically different from what you can already do in VR.

The magic of AR, though, is that everything you see is overlaid onto the world around you and your surroundings are always fully visible. I particularly appreciated this when I got to the gaming portion of the walkthrough. I played a few rounds of a Meta-created game called Stargazer, where players control a retro-looking spacecraft by moving their head to avoid incoming obstacles while shooting enemies with finger tap gestures. Throughout that game, and a subsequent round of AR Pong, I was able to easily keep up a conversation with the people around me while I played. As someone who easily gets motion sick from VR gaming, I appreciated that I never felt disoriented or less aware of my surroundings.

Orion’s displays rely on silicon carbide lenses, micro-LED projectors and waveguides. The actual lenses are clear, though they can dim depending on your environment. One of the most impressive aspects is the 70-degree field of view. It was noticeably wider and more immersive than what I experienced with Snap’s AR Spectacles, which have a 46-degree field of view. At one point, I had three windows open in one multitasking view: Instagram Reels, a video call and a messaging inbox. And while I was definitely aware of the outer limits of the display, I could easily see all three windows without physically moving my head or adjusting my position. It’s still not the all-encompassing AR of sci-fi flicks, but it was wide enough I never struggled to keep the AR content in view.

What was slightly disappointing, though, was the resolution of Orion’s visuals. At 13 pixels per degree, the colors all seemed somewhat muted and projected text was noticeably fuzzy. None of it was difficult to make out, but it was much less vivid than what I saw on Snap’s AR Spectacles, which have a 37 pixels per degree resolution.
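
For a rough sense of how those numbers trade off, pixels per degree multiplied by field of view approximates how many pixels span the display horizontally. The snippet below is just a back-of-envelope calculation plugging in the figures quoted above, not an official spec sheet.

```python
# Back-of-envelope: horizontal pixels ≈ field of view (degrees) × pixels per degree.
displays = {
    "Meta Orion": {"fov_deg": 70, "ppd": 13},
    "Snap AR Spectacles": {"fov_deg": 46, "ppd": 37},
}

for name, d in displays.items():
    approx_px = d["fov_deg"] * d["ppd"]
    print(f"{name}: ~{approx_px} pixels across ({d['ppd']} ppd over {d['fov_deg']} degrees)")

# Meta Orion: ~910 pixels across (13 ppd over 70 degrees)
# Snap AR Spectacles: ~1702 pixels across (37 ppd over 46 degrees)
```

In other words, Orion spreads far fewer pixels over a much wider view, which is why it feels more immersive but noticeably softer.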

Meta’s VP of Wearable Devices, Ming Hua, told me that one of the company’s top priorities is to increase the brightness and resolution of Orion’s displays. She said that there’s already a version of the prototype with twice the pixel density, so there’s good reason to believe this will improve over time. She’s also optimistic that Meta will eventually be able to bring the cost of its AR tech down to something “similar to a high end phone.”

Leaving my demo at Meta’s headquarters, I was reminded of the first time I tried out a prototype of the wireless VR headset that would eventually become known as Quest, back in 2016. The prototype, then called Santa Cruz, made it immediately obvious, even to an infrequent VR user, that a wireless, room-tracking headset was the future of the company’s VR business. Now, it’s almost hard to believe there was a time when Meta’s headsets weren’t fully untethered.

Orion has the potential to be much bigger. This time, Meta isn’t just trying to create a more convenient form factor for mixed reality hobbyists and gamers. It’s offering a glimpse into how it views the future, and what our lives might look like when we’re no longer tethered to our phones.

For now, Orion is still just that: a glimpse. It’s far more complex than anything the company has attempted with VR. Meta still has a lot of work to do before that AR-enabled future can be a reality. But the prototype shows that much of that vision is closer than we think.

ChromeOS update makes it easier to avoid distractions

Google has released ChromeOS M130 to the stable channel, which means an update is now making its way to your Chromebook if you haven't gotten it yet. The latest version of the OS comes with a lengthy list of new features, starting with a Focus panel where you can quickly enable or disable Do Not Disturb, create new Google Tasks or select from existing ones, and play focus sounds or YouTube Music Premium if you have a subscription. Google is also making it easier to insert emojis, GIFs and even Google Drive links with M130's new Launcher + f shortcut. In addition, the Quick Insert physical key on the Samsung Galaxy Chromebook Plus will be available on more devices coming out next year.

To cut the time you need to find specific files, Google has added a Suggestions section in Tote, the space where you'll find your most recently downloaded items and latest screenshots. You'll now also be able to access all your starred Drive files right on the ChromeOS shelf, even when you're offline. And if you want to pick up where you left off every time you switch on your computer, you can enable "Welcome Recap" in Settings, which lets you preview and instantly restore apps and tabs from your previous session.

If you use your Chromebook to record videos or audio, you can take advantage of ChromeOS M130's studio-style mic function, which adds "advanced balancing, reconstruction of fine details and room adaptation" to the standard mic's noise cancellation and de-reverberation effects. You can also use Google's AI-powered Recorder app, which debuts with the new OS: its speech-to-text capabilities can create transcripts that label each speaker, as well as summarize recorded content.

ChromeOS M130 also integrates appearance effects into the platform's video call controls, adds support for multiple calendars and allows you to move Picture-in-Picture (PiP) windows to one side of your screen to free up space. Finally, if you have a Chromebook Plus device, you'll be able to access an AI-powered feature called "Help me read" that makes it easy to find information in any text you're reading in your browser or in your Gallery.

The Morning After: Nintendo made its own music streaming service

Addressing the needs of… someone, Nintendo has announced its own music streaming service on a mobile app for both Android and iPhone. Encompassing the music of Nintendo’s own gaming properties, from Mario to Metroid, Nintendo Music has a user interface that pretty much looks like Spotify. It’s a new perk for Switch Online subscribers, so it’s not free, but it’s a convenient extra if you’re already paying. Nintendo Music will even suggest and curate music based on your Switch activity.

One unique feature here is spoiler prevention. If you add a game, the app hides tracks and details that could give away a surprise twist, unexpected final boss or other potential spoilers, like that nihilistic ending of Animal Crossing: Happy Home Designer.

— Mat Smith

The biggest tech stories you missed

Playdate is officially getting a season two with ‘about a dozen games’ next year

Samsung could launch its extended reality wearable device next year

The next version of Android will arrive in early 2025

ChatGPT Search will do the legwork for you

OpenAI’s latest feature searches the web in response to your natural language queries, delivering “fast, timely answers with links to relevant web sources.” OpenAI says the feature looks for “original, high-quality content from the web,” integrating it into conversational answers. This includes trusted news media sources and data providers, like AccuWeather.

How to use Apple’s AirPods Pro 2 as a hearing aid

Now that iOS 18.1 is available to the masses, Apple’s new hearing aid feature is ready for use. With an up-to-date iPhone and a pair of AirPods Pro 2, you can employ hearing assistance tools without visiting a doctor or buying pricey dedicated hearing aids. After making sure your iPhone and AirPods Pro 2 are updated, the test itself is a little hidden away inside the Health app. Here’s how to find it.

Microsoft’s Recall AI tool for Copilot+ PCs faces a third delay

After a delay in June and a second in August, Recall now won’t be available to test until December. Microsoft is once more pushing back testing of the feature intended for its Copilot+ PCs, according to The Verge. Pitched as a sort of photographic memory for Windows, it’s meant to improve the search process on PCs. But since that demands a high degree of access to your data, it has been the target of privacy and security concerns.

Microsoft’s Recall AI tool for Copilot+ PCs faces a third delay

It's deja vu all over again for Microsoft's AI-powered Recall tool. After a delay in June and then a second one in August, Microsoft is once more pushing back testing of the feature intended for its Copilot+ PCs. The Verge reported that Recall now won't enter previews for Windows Insiders until December.

"We are committed to delivering a secure and trusted experience with Recall," Brandon LeBlanc, senior product manager of Windows, told the publication. "To ensure we deliver on these important updates, we’re taking additional time to refine the experience before previewing it with Windows Insiders."

When it was introduced, Microsoft positioned Recall as a way to give your computer a photographic memory, improving the search process on PCs. But since that photographic memory would demand a high degree of access to a computer's systems and data, Recall has been the target of privacy and security concerns. Microsoft has tried to assuage those worries by presenting Recall as an opt-in feature, so users will have to give explicit permission for the AI assistant to log their computing activity. The company has also detailed other privacy protections, but today's third delay could mean that it's proving more difficult than expected to keep security on lock.

Humane recalls its troubled AI Pin’s Charge Case due to overheating

It’s getting harder and harder not to view the Humane AI Pin as destined to go down as one of tech’s all-time stinkers and cautionary tales. After reviews questioning why it existed, returns that outpaced its sales and a warning that its Charge Case could pose a “fire safety risk,” the company is now recalling the latter. The issue stems from the case’s battery cells, supplied by a third-party vendor, which could overheat and cause a fire hazard.

Humane posted on Thursday that it’s conducting the voluntary recall “out of an abundance of caution.” The startup says its charging case is the only accessory affected — not the battery booster, charging pad or Pin itself. “The issue is isolated to battery cells used in the Charge Case Accessory,” Humane wrote. “It is not related to its hardware design.”

The company says one of its battery suppliers is to blame. “Our investigation determined that the battery supplier was no longer meeting our quality standards and that battery cells supplied by this vendor can pose a fire risk,” Humane wrote. The company says it’s severed ties with the supplier and is currently evaluating a new one.

The Humane AI Pin on a wool top.
Hayato Huseman for Engadget

In fairness to Humane, the recall stems from (in its words) a single incident in which a user charged the case with a third-party USB-C cable and power source. It hasn’t received reports of injuries or damage. As easy as it is to poke fun at an overhyped company’s other shoe dropping, at least it’s informing consumers and conducting the recall voluntarily rather than trying to bury it for the sake of PR. Perhaps Humane can look to Samsung for inspiration on rebounding from a product that catches on fire — and not in a good way.

The Consumer Product Safety Commission (CPSC) posted a blurb about the recall with more detail. It says consumers who bought the Charge Case separately will receive a $149 refund. Those who got the case as part of the Humane AI Pin Complete System will get $129 back. In addition, Humane will supply replacement charging cases, but don’t expect them anytime soon: The estimated wait is three to six months. The CPSC says about 10,500 units are affected.

Humane advises charge case owners to “dispose of the product in accordance with any local and state laws” rather than chucking it in the trash. Presumably, that’s to avoid a real dumpster fire to match the metaphorical one at Humane.

The next version of Android will arrive in early 2025

Android users had to wait longer than usual for the release of Android 15 this fall, but Google is already setting the timeline for the next two operating system updates. In a change of pace, the next major release for Android will arrive in the second quarter of 2025.

"We’re planning the major release for Q2 rather than Q3 to better align with the schedule of device launches across our ecosystem, so more devices can get the major release of Android sooner," the company said in a blog post addressing developers. That's good news for third-party phone manufacturers that have historically had to wait a few months before they get the latest OS updates.

In addition to the main release in the first half of the year, there will also be a minor update to Android slated for the fourth quarter of 2025. The Q2 release will be the only one next year to have behavior changes that can impact apps. The smaller release toward the end of the year will focus on "feature updates, optimizations and bug fixes," but will not have any behavior changes.

How to use Apple’s AirPods Pro 2 as a hearing aid

Now that iOS 18.1 is available to the masses, Apple’s new hearing aid feature is ready for use. The tool is one of three hearing health items the company announced alongside the iPhone 16 in September. Another one of those, the “clinically-validated” hearing test, is an essential part of being able to use the AirPods Pro 2 as a hearing aid. With an up-to-date iPhone and those earbuds, you can employ hearing assistance tools without visiting a doctor or buying off-putting hearing aids. Simply take a five-minute test, and if the software determines you have mild to moderate hearing loss, you can immediately enable Apple’s FDA-approved hearing aid. Here’s a step-by-step guide on how to use it.

Despite the unchanged design, Apple has packed an assortment of updates into the new AirPods Pro. All of the conveniences from the 2019 model are here as well, alongside additions like Adaptive Transparency, Personalized Spatial Audio and a new touch gesture. There’s room to further refine the familiar formula, but Apple has given iPhone owners several reasons to upgrade.
Billy Steele for Engadget

Before you can access Apple’s hearing aid, you’ll need to make sure your iPhone is updated to iOS 18.1 and your AirPods Pro 2 have the latest firmware (7B19). None of the new hearing health features will show up in the AirPods settings or in the Apple Health app if you don’t have both of those updates. What’s more, you won’t be able to run the hearing test or use the hearing aid feature on the first-gen AirPods Pro.

You can check your current iOS version from the iPhone Settings menu. Scroll down to General and tap Software Update. From here, you can see which version of iOS you’re running and if you’ve got a pending update that’s ready to download and install. Once again, you’re looking for iOS 18.1 here since this is the software version that delivers the suite of hearing health features.

To check the firmware on your AirPods Pro 2, connect the earbuds to your iPhone and navigate to the Settings menu. Here, your AirPods Pro 2 should appear near the top of the list and tapping that option will take you into the settings. You can also access AirPods Pro 2 details from the Bluetooth menu by tapping the “i” icon next to the device name.

Once you’re in the AirPods settings menu, scroll all the way down to the bottom of the main screen. One of the last things you’ll see is a bunch of firmware info, including the current version for the AirPods Pro 2. If you see 7B19, you’re good to go. If not, your earbuds haven’t updated yet, but you can try to force them to do so instead of waiting for the over-the-air process to take place on its own.

To do this, connect the AirPods Pro 2 to your iPhone for at least 30 seconds and play music to confirm the connection is stable. Then put the earbuds back in the charging case and close the lid, keeping the AirPods Pro 2 in range of the iPhone. Now check Bluetooth settings: if the AirPods Pro 2 stay connected for more than 10 seconds while in the charging case with the lid closed, that should indicate that the update is in progress.

Apple's hearing test is a quick, easy-to-follow evaluation right in your pocket.
Billy Steele for Engadget

After you’ve confirmed that you have the necessary updates for your phone and earbuds, you’ll have to take Apple’s hearing test before the hearing aid features will show up. The only way around this is to upload an audiogram from your doctor in the Apple Health app. Either way, you’ll need to exhibit mild to moderate hearing loss (26-60 dBHL) for the Hearing Assistance section of the AirPods Pro 2 menu to be available to you.

Apple gives you two places to access its hearing test, and both of them are easy to find. The first is in the AirPods menu, which you can get to from the main Settings menu or from the Bluetooth menu. The Hearing Health section is prominently displayed on the main screen, just under the Noise Control options. In Hearing Health, Take a Hearing Test will be the third item after Hearing Protection and Hearing Assistance, and it will appear in blue.

In the Health app, the fastest way to get to the hearing test is to tap Browse on the menu on the main Summary screen. From there, select Hearing with the blue ear icon and scroll down to Get More From Health. Here, you’ll see the option to take the hearing test with the AirPods Pro 2.

There are two places to find Apple's hearing test on your iPhone.
Billy Steele for Engadget

After you take Apple’s hearing test, or upload your results from your doctor in the Health app, you’ll be able to access the Hearing Assistance section of the Hearing Health features in the AirPods settings. The hearing aid feature resides here, where you can turn it on or off as needed. It’s worth noting that Apple will ask if you want to set up Hearing Assistance immediately if your hearing test results meet the criteria for mild to moderate hearing loss.

On the main Hearing Assistance screen, you’ll see options for enabling/disabling the hearing aid and Media Assist. There are options for adjusting the hearing aid feature and choosing how the system applies Media Assist. The latter tool uses your hearing profile to improve the sound for music, videos and calls. You can choose to have it apply the personalization only to music and videos, or only to calls and FaceTime. By default, it will re-tune the audio for all of them.

Under Adjustments beneath the hearing aid toggle, you’ll have the ability to tweak amplification, balance, tone and ambient noise reduction via individual sliders. You can also enable/disable the swipe gesture on AirPods Pro 2 that adjusts amplification when hearing aid mode is active (versus volume control for normal use). At the bottom of this menu, you can enable/disable Conversation Boost, the voice-targeting tool Apple debuted in 2021. When the hearing aid is enabled, you’ll see a second slider in the Control Center with an ear icon where you can adjust amplification, and you can also tweak this setting on an Apple Watch.

Hearing aid will only be enabled when Noise Control is set to transparency, but Media Assist will still work in Adaptive, ANC and off modes. What’s more, the hearing aid and hearing protection features can be used simultaneously in transparency mode, with the latter being active by default. And once again, you can turn the hearing aid tool off entirely at any time in the Hearing Assistance menu from the AirPods settings.

It could take a few days, or even a few weeks, for you to acclimate to the hearing aid feature. You can use the AirPods Pro 2 as hearing aids for up to six hours on a charge, and you’ll want to wear them as much as possible when you first start using them for this purpose. Once your hearing profile is enabled on the AirPods Pro 2, you shouldn’t share the earbuds with anyone else: the sound has been adjusted to compensate for the specific frequencies you have trouble hearing, and that personalization would sound strange to anyone with different hearing.
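
Apple hasn’t published how it turns an audiogram into a tuning curve, but a classic rule of thumb from hearing aid fitting, the “half-gain rule,” gives a rough feel for why the profile is so personal: each frequency band gets boosted by roughly half the measured loss at that frequency. The sketch below uses made-up example thresholds and is purely illustrative, not Apple’s algorithm.

```python
# Illustrative only: a textbook "half-gain rule" mapping audiogram thresholds
# (dB HL, higher = worse hearing) to per-frequency amplification.
audiogram_db_hl = {250: 15, 500: 20, 1000: 30, 2000: 45, 4000: 55, 8000: 50}  # example values

def prescribed_gain(threshold_db_hl):
    """Rule of thumb: amplify each band by roughly half the measured loss."""
    return max(0.0, 0.5 * threshold_db_hl)

for freq_hz, loss in audiogram_db_hl.items():
    print(f"{freq_hz:>5} Hz: {loss} dB HL loss -> ~{prescribed_gain(loss):.0f} dB gain")
```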

ChatGPT Search will do the legwork for you

ChatGPT Search is here to try to combine the best of chatbots and web searches. OpenAI’s latest feature searches the web in response to your natural language queries, delivering “fast, timely answers with links to relevant web sources.”

When using ChatGPT, the bot will search the web depending on what you ask. Or, if you want to manually override its decision-making, you can tap a new web search icon below the input bar. OpenAI says the feature looks for “original, high-quality content from the web,” integrating it into conversational answers. This includes trusted news media sources and data providers like AccuWeather. The data will encompass things like weather, stocks, sports, news and maps.

Under each ChatGPT Search reply, you’ll see a Sources button. Click that, and a sidebar with references and links will open.

Weather results and a Sources button for ChatGPT Search
OpenAI

OpenAI says ChatGPT Search uses a fine-tuned version of GPT-4o, post-trained “using novel synthetic data generation techniques.” This included distilling outputs from the company’s OpenAI o1-preview. That model is much slower than GPT-4o, so perhaps training on it (rather than directly using it) will help the new feature to pinch some of its reasoning skills without laboring as long over answers.

The company used feedback from its SearchGPT test run to help tune the feature. “We brought the best of the SearchGPT experience into ChatGPT,” the company wrote.

The feature is available today for ChatGPT Plus and Team subscribers, in the ChatGPT mobile and desktop apps and on the web. OpenAI says Enterprise and Edu users will get access in the next few weeks, with the feature trickling down to free users in the coming months.
