Apple’s second-generation AirPods Pro are back on sale for $190

Apple’s second-generation AirPods Pro have dipped to under $200 in a deal from Amazon. The AirPods Pro, which normally cost $250, are $60 off right now, bringing the price down to just $190. That’s the same price we saw during Amazon’s Big Spring Sale. The AirPods Pro offer a number of premium features over the standard AirPods, including active noise cancellation for when you want to shut out the world, and an impressive transparency mode for when you want to hear your surroundings.

The second-generation AirPods Pro came out in 2022 and brought Apple’s H2 chip to the earbuds for a notable performance boost. They offer Adaptive Audio, which automatically switches between Active Noise Cancellation and Transparency Mode based on what’s going on around you. With Conversation Awareness, they can lower the volume when you start speaking and boost other people's voices so they're easier to hear.

We gave this version of the AirPods Pro a review score of 88, and it’s one of our picks for the best wireless earbuds on the market. The second-generation AirPods Pro are dust, sweat and water resistant, so they should hold up well for workouts, and they achieve better battery life than the previous generation. They can get about six hours of battery life with features like ANC enabled, and that goes up to as much as 30 hours with the charging case. Apple says popping the AirPods Pro in the case for 5 minutes will give you an hour of additional listening or talking time.

AirPods Pro also offer Personalized Spatial Audio with head tracking for more immersive listening while you’re watching TV or movies. The gesture controls that were introduced with this generation of the earbuds might take some getting used to, though. With AirPods Pro, you can adjust the volume by swiping the touch control.

Follow @EngadgetDeals on Twitter and subscribe to the Engadget Deals newsletter for the latest tech deals and buying advice.

This article originally appeared on Engadget at https://www.engadget.com/apples-second-generation-airpods-pro-are-back-on-sale-for-190-142626914.html?src=rss

Apple officially allows retro game emulators on the App Store

In addition to updating its developer guidelines to allow music streaming apps to link to external websites, Apple has also added new language that allows game emulators on the App Store. The updated guidelines, first noticed by 9to5Mac, now say that retro gaming console emulator apps are welcome and can even offer downloadable games. Apple also reportedly confirmed to developers in an email that they can create and offer emulators on its marketplace.

Emulator software wasn't allowed on the App Store prior to this update, though developers have been finding ways to distribute it to iOS users. To install an emulator, users usually had to jailbreak their device or turn to sideloading tools and unsanctioned alternate app stores first. This rule update potentially eliminates the need for users to go to all those lengths and could bring more of the emulators popular on Android to iOS.

Apple warns developers, however, that they "are responsible for all such software offered in [their] app, including ensuring that such software complies with these Guidelines and all applicable laws." To be clear, allowing emulators on the App Store doesn't mean Apple is allowing pirated games, too. Any app offering titles for download that the developer doesn't own the rights to is a no-no, so fans of specific consoles will just have to hope that their companies are planning to release official emulators for iOS. While these latest changes to Apple's developer guidelines seem to be motivated by the EU's Digital Markets Act regulation, which targets big tech companies' anti-competitive practices, the new rule on emulators applies to all developers worldwide.

This article originally appeared on Engadget at https://www.engadget.com/apple-officially-allows-retro-game-emulators-on-the-app-store-130044937.html?src=rss

Apple Vision Pro owners now have more decent controller options

The Apple Vision Pro is an impressive piece of hardware, and the eye-tracking/hand gesture input combo is fantastic for navigating menus and the like. It’s not so great for gaming. There haven't been many easy ways to connect a third-party controller for playing iPad or cloud games. This is changing, however, as accessory manufacturer 8BitDo just announced Vision Pro compatibility for a number of its controllers.

These accessories are officially supported by Apple, so they should work as soon as you make a Bluetooth connection. No muss and no fuss. All told, eight devices got the Apple seal of approval here. One such gadget is the company’s Ultimate Bluetooth Controller, which we basically called the perfect gamepad for PC.

[Image: A person wearing a headset and playing Pac-Man. Credit: 8BitDo]

Other compatible devices include various iterations of the SN30 Pro controller, the Lite 2 and the NES-inspired N30 Pro 2. The integration isn’t just for game controllers, as 8BitDo also announced AVP compatibility for its Retro Mechanical Keyboard. Of course, the Vision Pro works out of the box with most Bluetooth keyboards.

This is pretty big news, however, as media consumption is one of the best parts of the Vision Pro experience. Video games fall squarely in that category. Just about every iPad title works on the device. If playing Cut the Rope on a giant virtual screen doesn’t do it for you, the headset also integrates with Xbox Cloud Gaming and Nvidia GeForce Now for access to AAA titles. 

8BitDo announced official controller support for Apple devices last year, though this was primarily for smartphones, tablets and Mac computers. The integration was thanks to new controller firmware and Apple's recent iOS 16.3, iPadOS 16.3, tvOS 16.3 and macOS 13.2 updates. It looks like all of the accessories that work with iPhones and iPads also work with the Vision Pro. 

This article originally appeared on Engadget at https://www.engadget.com/apple-vision-pro-owners-now-have-more-decent-controller-options-150055872.html?src=rss

Who exactly is YouTube’s multicam Coachella stream for?

YouTube is hyping its exclusive Coachella streaming coverage, which starts next week. The headlining feature is the platform’s multiview experience (already familiar to sports fans) for the two-weekend festival. Our question from this announcement is, who wants to watch several different artists’ sets at the same time — when you can only listen to one?

The multiview experience will let you watch up to four stages simultaneously, letting you pick which one to hear: exactly how multiview works for March Madness, NFL games or any other sporting event. Here’s how YouTube pitches the feature: “Two of your favorite bands playing on different stages at the same time? No problem, multiview will have you and your friends covered to catch both sets at the same time via the YouTube app on TV at no additional cost.”

Maybe I’m of the wrong generation and have too long of an attention span, but who wants to watch an artist’s set without hearing it? That’s what will happen to the three stages you aren’t listening to. Wouldn’t it be better to... watch the one you’re hearing? And then catch up on the others on-demand when you can listen to them as well?

Sports multiview makes sense because there are scores to track and timeouts, halftimes and blowouts to divert your attention to another game. You don’t need to hear an NBA game to keep an eye on the ball. (Depending on the commentators, you may prefer not to listen to it.) It’s primarily a visual experience; the audio is secondary.

But music, even when played live with all the light shows, fog machines and dancing accompanying it, is still an auditory experience first and foremost. If multiple artists you like play at once, you still can’t (and wouldn’t want to) hear more than one simultaneously. In YouTube’s multiview, you pick one stage to hear and the rest to… watch them sing and dance on mute in a little box alongside two other muted performances. Yay?

It sounds like a solution looking for a problem — YouTube applying its existing tech (which, to be fair, works very well with sports) to a music festival. Never mind that it doesn’t make a lot of sense.

Perplexed rants aside, YouTube will have six livestream feeds to bounce between (but, again, only four at once in multiview). That includes Sonora for the first weekend and Yuma for the second. This year’s headliners include Lana Del Rey, Doja Cat, No Doubt and Tyler, the Creator.

Between sets, YouTube will stream “special editorial content” from the artists onsite. Each day after the night’s final set, YouTube’s Coachella channel will repeat that day’s sets until the livestream returns the next day. That sounds like a better way to catch up on the sets you didn’t see live.

The event takes place in Indio, California, about 130 miles east of LA, from April 12 to 14 and April 19 to 21. You can tune in on YouTube’s Coachella channel.

This article originally appeared on Engadget at https://www.engadget.com/who-exactly-is-youtubes-multicam-coachella-stream-for-183744741.html?src=rss

Microsoft may have finally made quantum computing useful

The dream of quantum computing has always been exciting: What if we could build a machine working at the quantum level that could tackle complex calculations exponentially faster than a computer limited by classical physics? But despite iterative hardware announcements from IBM, Google and others, quantum computers still aren't being used for any practical purposes. That might change with today's announcement from Microsoft and Quantinuum, which say they've developed the most error-free quantum computing system yet.

While classical computers and electronics rely on binary bits as their basic unit of information (they can be either on or off), quantum computers work with qubits, which can exist in a superposition of two states at the same time. The trouble with qubits is that they're prone to error, which is the main reason today's quantum computers (known as Noisy Intermediate Scale Quantum [NISQ] computers) are just used for research and experimentation.

Microsoft's solution was to group physical qubits into virtual qubits, which allows it to apply error diagnostics and correction without destroying them, and run it all over Quantinuum's hardware. The result was an error rate that was 800 times better than relying on physical qubits alone. Microsoft claims it was able to run more than 14,000 experiments without any errors.
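The intuition behind grouping physical qubits into more reliable logical ones can be illustrated with a classical analogy: a repetition code that stores one bit as several copies and recovers it by majority vote, so a single flipped copy doesn't corrupt the stored value. This is only a loose sketch (real quantum error correction uses stabilizer codes and can't simply copy qubit states), but it shows why combining noisy physical units drives the effective error rate down:

```python
import random

def flip(bit, p, rng):
    """Flip a bit with probability p (a simulated physical error)."""
    return bit ^ (rng.random() < p)

def logical_readout(bit, p, rng, copies=3):
    """Encode one bit as `copies` physical bits, let each one flip
    independently, then recover the value by majority vote."""
    votes = sum(flip(bit, p, rng) for _ in range(copies))
    return int(votes > copies // 2)

rng = random.Random(0)
p, trials = 0.01, 100_000
raw_errors = sum(flip(0, p, rng) for _ in range(trials))
logical_errors = sum(logical_readout(0, p, rng) for _ in range(trials))
print(raw_errors / trials)      # ~0.01: the physical error rate
print(logical_errors / trials)  # ~3 * p^2: orders of magnitude lower
```

With a 1 percent physical error rate, the three-copy majority vote fails only when at least two copies flip, which happens roughly 3p² of the time; adding more copies suppresses errors further, which is the same leverage (in a far more sophisticated form) that logical qubits provide.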

According to Jason Zander, EVP of Microsoft's Strategic Missions and Technologies division, this achievement could finally bring us to "Level 2 Resilient" quantum computing, which would be reliable enough for practical applications.

"The task at hand for the entire quantum ecosystem is to increase the fidelity of qubits and enable fault-tolerant quantum computing so that we can use a quantum machine to unlock solutions to previously intractable problems," Zander wrote in a blog post today. "In short, we need to transition to reliable logical qubits — created by combining multiple physical qubits together into logical ones to protect against noise and sustain a long (i.e., resilient) computation."

Microsoft's announcement is a "strong result," according to Aram Harrow, a professor of physics at MIT focusing on quantum information and computing. "The Quantinuum system has impressive error rates and control, so it was plausible that they could do an experiment like this, but it's encouraging to see that it worked," he said in an e-mail to Engadget. "Hopefully they'll be able to keep maintaining or even improving the error rate as they scale up."

[Image: Microsoft Quantum Computing. Credit: Microsoft]

Researchers will be able to get a taste of Microsoft's reliable quantum computing via Azure Quantum Elements in the next few months, where it will be available as a private preview. The goal is to push even further to Level 3 quantum supercomputing, which will theoretically be able to tackle incredibly complex issues like climate change and exotic drug research. It's unclear how long it'll take to actually reach that point, but for now, at least we're moving one step closer to practical quantum computing.

"Getting to a large-scale fault-tolerant quantum computer is still going to be a long road," Professor Harrow wrote. "This is an important step for this hardware platform. Along with the progress on neutral atoms, it means that the cold atom platforms are doing very well relative to their superconducting qubit competitors."

This article originally appeared on Engadget at https://www.engadget.com/microsoft-may-have-finally-made-quantum-computing-useful-164501302.html?src=rss

The best smartphone cameras for 2024: How to choose the phone with the best photography chops

I remember begging my parents to get me a phone with a camera when the earliest ones were launched. The idea of taking photos wherever I went was new and appealing, but it’s since become less of a novelty and more of a daily habit. Yes, I’m one of those. I take pictures of everything — from beautiful meals and funny signs to gorgeous landscapes and plumes of smoke billowing in the distance.

If you grew up in the Nokia 3310 era like me, then you know how far we’ve come. Gone are the 2-megapixel embarrassments that we used to post to Friendster with glee. Now, many of us use the cameras on our phones to not only capture precious memories of our adventures and loved ones, but also to share our lives with the world.

I’m lucky enough that I have access to multiple phones thanks to my job, and at times would carry a second device with me on a day-trip just because I preferred its cameras. But most people don’t have that luxury. Chances are, if you’re reading this, a phone’s cameras may be of utmost importance to you. But you’ll still want to make sure the device you end up getting doesn’t fall flat in other ways. At Engadget, we test and review dozens of smartphones every year; our top picks below represent not only the best phone cameras available right now, but also the most well-rounded options out there.

What to look for when choosing a phone for its cameras

Before scrutinizing a phone’s camera array, you’ll want to take stock of your needs — what are you using it for? If your needs are fairly simple, like taking photos and videos of your new baby or pet, most modern smartphones will serve you well. Those who plan to shoot for audiences on TikTok, Instagram or YouTube should look for video-optimizing features like stabilization and high frame rate support (for slow-motion clips).

Most smartphones today have at least two cameras on the rear and one up front. Those that cost more than $700 usually come with three, including wide-angle, telephoto or macro lenses. We’ve also reached a point where the number of megapixels (MP) doesn’t really matter anymore — most flagship phones from Apple, Samsung and Google have sensors that are either 48MP or 50MP. You’ll even come across some touting resolutions of 108MP or 200MP, in pro-level devices like the Galaxy S24 Ultra.

Most people won’t need anything that sharp, and in general, smartphone makers combine the pixels to deliver pictures that are the equivalent of 12MP anyway. The benefits of pixel-binning are fairly minor in phone cameras, though, and you’ll usually need to blow up an image to fit a 27-inch monitor before you’ll see the slightest improvements.
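The pixel-binning described above can be sketched in a few lines of NumPy, assuming a simple average over each 2×2 block of sensor pixels (actual sensors bin at the hardware or ISP level, often with more sophisticated weighting): each group of four physical pixels becomes one larger effective pixel, which is how a 48MP readout yields a 12MP photo.

```python
import numpy as np

def bin_2x2(sensor):
    """Average each 2x2 block of pixels into one output pixel,
    quartering the resolution (e.g. 48MP -> 12MP)."""
    h, w = sensor.shape
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Toy "sensor" readout: an 8000 x 6000 grid would be 48MP;
# a tiny 4x4 grid keeps the example readable.
sensor = np.arange(16, dtype=float).reshape(4, 4)
binned = bin_2x2(sensor)
print(binned.shape)  # (2, 2): one quarter the pixel count
```

Averaging four noisy readings also reduces noise in each output pixel, which is why binned shots tend to look cleaner in low light even though they're "only" 12MP.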

In fact, smartphone cameras tend to be so limited in size that there’s often little room for variation across devices. They typically use sensors from the same manufacturers and have similar aperture sizes, lens lengths and fields of view. So while it might be worth considering the impact of sensor size on things like DSLRs or mirrorless cameras, on a smartphone those differences are minimal.

Sensor size and field of view

If you still want a bit of guidance on what to look for, here are some quick tips: By and large, the bigger the sensor the better, as it will capture more light and detail. Not many phone makers list the sensor size in spec sheets, so you’ll have to dig around for this info. A larger aperture (indicated by a smaller f-number, like f/1.7 versus f/2.2) is ideal for the same reason, and it also affects the depth of field (or background blur) that isn't added via software. Since portrait modes are available on most phones these days, though, a big aperture isn’t as necessary to achieve this effect.

When looking for a specific field of view on a wide-angle camera, know that the most common offering from companies like Samsung and Google is about 120 degrees. Finally, most premium phones like the iPhone 15 Pro Max and Galaxy S24 Ultra offer telephoto systems that go up to 5x optical zoom with software taking that to 20x or even 100x.
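The software zoom that extends a 5x optical lens to 20x or 100x is, at its core, a center crop blown back up to full resolution (phones layer computational sharpening on top, which this sketch omits). A minimal, hypothetical version with nearest-neighbor upscaling:

```python
import numpy as np

def digital_zoom(image, factor):
    """Crop the central 1/factor of the frame, then upscale it back
    to the original size with nearest-neighbor interpolation -- a
    crude stand-in for a phone's software zoom."""
    h, w = image.shape[:2]
    ch, cw = int(h / factor), int(w / factor)
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = image[top:top + ch, left:left + cw]
    # Nearest-neighbor upscale via index repetition.
    rows = np.arange(h) * ch // h
    cols = np.arange(w) * cw // w
    return crop[rows][:, cols]

frame = np.arange(64, dtype=float).reshape(8, 8)
zoomed = digital_zoom(frame, 2)
print(zoomed.shape)  # (8, 8): same frame size, only the central detail
```

Because the crop contains fewer real pixels, quality drops as the zoom factor climbs, which is why a longer optical telephoto is worth paying for if you shoot distant subjects often.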

Processing and extra features

These features will likely perform at a similar quality across the board; where you really see a difference is in the processing. Samsung traditionally renders pictures that are more saturated, while Google’s Pixel phones take photos that are more neutral and evenly exposed. iPhones have historically produced pictures with color profiles that seem more accurate, though in comparison to images from the other two, they can come off yellowish. However, that was mostly resolved when Apple introduced Photographic Styles on the iPhone 13, a feature that lets you set a profile with customizable contrast levels and color temperature that applies to every picture taken with the native camera app.

Pro users who want to manually edit their shots should see if the phone they’re considering can take images in RAW format. Those who want to shoot a lot of videos while on the move should look for stabilization features and a decent frame rate. Most of the phones we’ve tested at Engadget record at either 60 frames per second at 1080p or 30 fps at 4K. It’s worth checking to see what the front camera shoots at, too, since they’re not usually on par with their counterparts on the rear.

Finally, while the phone’s native editor is usually not a dealbreaker (since you can install a third-party app for better controls), it’s worth noting that the latest flagships from Samsung and Google all offer AI tools that make manipulating an image a lot easier. They also offer a lot of fun, useful extras, like erasing photobombers, moving objects around or making sure everyone in the shot has their eyes open.

How we test smartphone cameras

For the last few years, I’ve reviewed flagships from Google, Samsung and Apple, and each time, I do the same set of tests. I’m especially particular when testing their cameras, and usually take all the phones I’m comparing out on a day or weekend photo-taking trip. Any time I see a photo- or video-worthy moment, I whip out all the devices and record what I can, doing my best to keep all factors identical and maintain the same angle and framing across the board.

It isn’t always easy to perfectly replicate the shooting conditions for each camera, even if I have them out immediately after I put the last one away. Of course, having them on some sort of multi-mount rack would be the most scientific way, but that makes framing shots a lot harder and is not representative of most people’s real-world use. Also, just imagine me holding up a three-prong camera rack running after the poor panicked wildlife I’m trying to photograph. It’s just not practical.

For each device, I make sure to test all modes, like portrait, night and video, as well as all the lenses, including wide, telephoto and macro. When there are new or special features, I test them as well. Since different phone displays can affect how their pictures appear, I wanted to level the playing field: I upload all the material to Google Drive in full resolution so I can compare everything on the same large screen. Because the photos from today’s phones are of mostly the same quality, I usually have to zoom in very closely to see the differences. I also frequently get a coworker who’s a photo or video expert to look at the files and weigh in.

This article originally appeared on Engadget at https://www.engadget.com/best-camera-phone-130035025.html?src=rss