Apple confirms expanded language support for Apple Intelligence in 2025

The rollout of Apple Intelligence will be fairly slow-paced, with Apple gradually adding new features and support for more languages over the coming months. The company has now confirmed several more: Apple Intelligence will be available in German, Italian, Korean, Portuguese and Vietnamese in 2025. That’s in addition to previously announced support for Chinese, French, Japanese and Spanish.

Apple will initially offer Apple Intelligence in the US in English with the release of iOS 18.1, iPadOS 18.1 and macOS Sequoia 15.1 in October. As such, you won’t have access to the tools immediately if you pick up an iPhone 16 when Apple’s latest smartphone lineup ships on Friday.

The tools will be available in localized English in Australia, Canada, New Zealand, South Africa and the UK in December. Apple will also start rolling out the features in India and Singapore in English next year. Further language support is to be announced.

There is one key thing worth noting as part of the Apple Intelligence rollout, however. Apple is not planning to broadly offer the tools in the European Union or Chinese mainland right away. So while you’ll be able to use Apple Intelligence in Portuguese or French, you might not necessarily be able to do so while you’re in Portugal or France.

“Apple Intelligence will not currently work if you are in the EU and if your Apple ID Country/Region is also in the EU,” Apple notes in a support article. “If traveling outside of the EU, Apple Intelligence will work when your device language and Siri language are set to a supported language.”

Also, as things stand, Apple Intelligence won’t work on phones bought on the Chinese mainland. Those traveling to China with an iPhone they bought elsewhere also won’t have access to the tools if their Apple ID Country/Region is set to mainland China.

Apple is hoping to bring Apple Intelligence to the EU and China, however. The company told TechCrunch that it's in talks with regulators in both markets over the issue. Apple is initially withholding the AI tools from the EU over concerns related to the Digital Markets Act.

Update 9/18 10:41AM ET: Added a note that Apple is in discussions with the EU and China over Apple Intelligence. 

Apple reveals how it’s made the iPhone 16 series (much) easier to repair

Apple has slowly been making its devices easier to fix, but the iPhone 15 fell short in a couple of key areas, according to the repairability site iFixit. Namely, the battery was hard to remove and the device suffered from a "parts pairing" issue that meant you couldn't easily replace the LiDAR sensor with one from another phone. With those two problems, iFixit gave the iPhone 15 a relatively low 4/10 repairability score.

Apple has now shared new details on iPhone 16 repairability and appears to have addressed both of those issues and a bunch more. The company says it tries to strike a balance between durability and repairability, and it focused particularly on the repairability side with its latest devices.

There's now an entirely new way to remove the battery that's supposed to make it easier. By running a low voltage electrical current through the new ionic liquid battery adhesive (using a 9V cell, for instance), the battery will release itself from the enclosure. This makes removal faster and safer compared to previous stretch release adhesives, according to the company.

At the same time, Apple made changes to the Face ID sensor hardware starting with the iPhone 16 and iPhone 16 Pro. Now, the TrueDepth Camera can be swapped from one unit to another without compromising security or privacy. Before, only Apple was able to do that type of repair.

Another big change is the new Repair Assistant, designed to address parts pairing issues. That lets customers and repair professionals configure both new and used Apple parts directly on the device, with no need to contact Apple personnel. Repair shops previously needed to order official components directly from Apple and get on the phone with an employee before iOS would accept individual parts replacements.

Apple added newly repairable modules too, saying the TrueDepth Camera can now be configured on-device for iPhone 12 and later, eliminating the need for a tethered Mac. In addition, the LiDAR scanner on iPhone Pro models is now serviceable with the rear camera module.

Another big change is on-device access to diagnostics. Starting with iOS 18, Apple diagnostics for repair will be available on device, so customers can determine which parts need to be replaced without the need for a second device.

Finally, the company announced new support for third-party and used Apple parts. If a third-party part can't be calibrated on Apple's cloud-based servers, the iPhone or other device will try to activate the part and operate it to its full capability, while showing the repair history within settings. Used Apple parts can soon be calibrated and will appear as a "used" part in the device's repair history. Another future update will enable True Tone for third-party displays and battery health for third-party batteries. In addition, the LiDAR Scanner and front camera will still work when the module is replaced and left unconfigured. 

All told, the iPhone 16 series looks to have one of the biggest jumps in repairability yet, with improvements in physical access, parts compatibility and parts pairing. We'll soon see if that's reflected in iFixit's impending repairability score.

iPhone 16 Pro and Pro Max review: Cameras and customization

It may seem like Apple is behind the competition a lot of the time. The company appeared slow to adopt developments like widgets, bezel-less displays with camera notches and screens with high refresh rates. And with the iPhone 16 Pro, it appears to once again be late to the party, bringing generative-AI features and a real button for the camera to its 2024 flagship. But if you'll allow me to play therapist for a moment, I think it's not that Apple is slow. I think Apple is cautious. Perhaps overly so.

Caution on its own isn't a bad trait — in fact, it could be considered thoughtful. Rather than rush to the cutting edge with its peers, Apple deliberates, usually finding a slightly different approach that is often an improvement on what's out there. Just look at the Vision Pro headset or Apple Silicon. Or even the iPod, the iPad and the AirPods, which were far from the first of their kind when they launched.

With the iPhone 16 Pro, the focus is on cameras and Apple Intelligence. The problem is, Apple Intelligence isn't quite here yet. We can test some features in the developer beta that's currently available, but that's not necessarily the same as the experience the public will get when the update rolls out in October. It’s not unprecedented for new iPhones to launch without some marquee features, sure, and thankfully there's still plenty that the iPhone 16 Pro brings. From Camera Control, the Fusion Camera and other video-related updates to slightly bigger displays and iOS 18, the iPhone 16 Pro and Pro Max are intriguing successors, even absent the vaunted Intelligence features that are still to come.

I’m getting deja vu. Looking back at my review of the iPhone 15 Pro, I see a picture of that phone and its predecessor lined up side by side to show just how much thinner the bezels are. Apple has once again trimmed the borders on its flagship phones, but while doing that enabled it to reduce the handsets’ size in 2023, this year it allowed the company to cram in larger screens without much change in footprint.

The iPhone 16 Pro and Pro Max displays have increased in size from 6.1 inches and 6.7 inches up to 6.3 inches and 6.9 inches, respectively. Both handsets have grown ever so slightly, too, by just under 1mm in width and about 3mm in height.

Basically, the iPhone 16 Pro and Pro Max are a hair wider and taller than their predecessors, but maintain the same 8.25mm (0.32-inch) profile. And yet, in spite of this minimal change, you won’t be able to keep your old cases if you’re upgrading from an iPhone 15 Pro to an iPhone 16 Pro.

Not only would the cases not quite fit, you’d also need something with either a cutout or a sapphire crystal and conductive layer to be able to use the new Camera Control. Of course, Apple sells compatible cases, as do some third parties like Otterbox, so you have plenty of options.

I’ve spent most of this year’s hardware review season remarking how Samsung and Google’s flagships feel like iPhones, and I’ve now reached a strange inception point. As I’ve been comparing competing phones for this review, I’ve been surrounded by about a dozen handsets from all these different companies on my couch, including last year’s iPhones, the Galaxy S24 Plus and the Pixel 9 Pro and Pro XL. Trying to figure out which one is the iPhone has become more confusing than ever, as they all feel similar in build. The best way to verify at a glance is looking at their camera arrays or my wallpaper.

All that is to say that the iPhone 16 Pro feels similar to its predecessor, which is what these other companies have been attempting to emulate. Apple would be right to feel flattered by this imitation, and yet I have to wonder if it’s time to do something different. Google’s Pixel 9 Pro is actually a whole six grams lighter than the iPhone 16 Pro at 221 grams (7.79 ounces), and I’m absolutely smitten by its rich pink hue and shiny edges. Though I like the new golden Desert color for the iPhone 16 Pro, I do wish Apple’s premium flagship had more fun and vibrant exteriors. That said, I do love the base iPhone 16 in pink, teal and Ultramarine.

Close-up shots of the bottom half of the iPhone 16 lineup, featuring, from left to right, pink, teal, white and gold phones. (Brian Oh for Engadget)

Arguably the biggest change to the iPhone 16 lineup, not to mention the iPhone 16 Pro, is the introduction of Camera Control. This is a button on the right side of the device, which has touch and pressure sensors on it to enable greater control with swipes and semi-presses. (That’s in addition to the Action Button on the top left that was added to last year’s Pros, and carries over to the iPhone 16 and iPhone 16 Plus, too.)

One of the things this was supposed to do was let you push lightly on the button to trigger focus, similar to what half pressing a DSLR shutter button would do. That function won’t be available at launch, so I can’t say if it’s effective.

But by and large, Camera Control is a very Apple approach to a feature that has been around for years. From phones by Sony and Nokia with dedicated shutter buttons to Android handsets with hardware-based double-click shortcuts, the notion of quick access to your camera without having to futz with the screen is a popular one. For good reason, too — I’ve hated having to swipe or long-press the icon on my iPhone’s lock screen in the past, and even though I could set the iPhone 15 Pro’s Action button to open the camera, it just wasn’t positioned well and I’d have to give up my mute button.

So Apple isn’t breaking new ground with its hardware shortcut for a frequently used app. But it does do a few things differently with the touch sensor. You can swipe on it to tweak things like exposure, zoom levels and tone, and the half-press still works as a way to select options or go back out of menus within the new Camera Control interface. In theory, it’s a nice way to make changes on the fly.

In reality, there were a few issues, and they largely have to do with placement. The button sits a little farther from the base of the phone than I’d like, so my fingers have to reach a bit more to press it, whether I was in landscape or portrait mode. This wasn’t usually a problem when I had both hands free and could steady the iPhone with my other hand and readjust my grip.

But if you’re trying to take a quick shot with just one hand, the button’s location can feel unintuitive. Of course, everyone has different finger lengths and ratios, so it’s entirely possible that other people find this logical. It also depends on your grip — if you’re cradling the bottom of the device in your palm, it’s harder to maneuver. If you’re covering part of the screen and reaching for the button head on, it’s slightly easier to use Camera Control.

The iPhone 16 Pro held up in mid-air by two hands, with the camera app open and showing people walking on a New York City street. (Brian Oh for Engadget)

Still, even for those with the strongest claws, swiping and half-pressing and double-half-pressing on the sensor is tricky. I was only ever really able to do that if I had my thumb holding up the bottom edge and my middle, ring and little fingers steadying the right end of the phone. Maybe this is a new camera grip I just need to relearn for this button.

The awkward placement is a minor gripe compared to what I found most annoying: the button’s touch sensor. Not only was it difficult to swipe through different settings when holding the device with one hand, it also reacts to accidental touches and swipes. Sometimes, the phone would slide down my palm and change the exposure or zoom level, completely ruining the vibe. I should point out that you can go into accessibility settings to either tweak the swipe sensitivity or turn it off altogether, if it really bothers you. Honestly, if you’re planning on making adjustments with Camera Control, it’s best to have time, patience and both hands free.

In those situations, I had a lot of fun editing settings and watching them be reflected in the viewfinder in real time. I also liked zooming in and out of subjects, recomposing a shot and tweaking exposure till I liked what I saw, before then pushing down to snap the picture. (This action does lead to some small issues, but more on the actual photo quality later.) I especially loved this while recording video, since it makes slowly zooming in or out of a subject smoother than using the onscreen slider.

Then again, for scenarios where I just want to fire off a quick shot without worrying about exposure or zoom settings, the pain of finagling with the sensor mostly goes away. In exchange, being able to rapidly snap pictures is a joy. I found myself taking more pictures than ever thanks to Camera Control, which, if you know me, is a feat worthy of the Guinness Book of Records.

A random person cut me off in line? Click. Funny sign on a building I pass by in a Lyft? Click, click. From your lock screen, you’ll have to press the button twice — once to wake the phone up and once to open the camera. Then press again to take the photo. It’s not ideal, but not too far off the same process on a Pixel phone, for instance. Plus, you can long-press the iPhone’s button to start recording a video, and it’ll automatically stop when you let go.

Close-up of the iPhone 16 Pro's rear cameras with greenery in the background. (Cherlynn Low for Engadget)

This sort of rapid access to the camera is the best thing about the new button, and I could see it being potentially useful not just for shutterbugs like me, but for the upcoming Visual Intelligence feature that Apple teased at its launch event. The company’s version of Google Lens could allow people to ask questions about things in the real world around them. But of course, since this wasn’t available during my review period, I wasn’t able to test it.

For now, you can go into Settings to change the number of clicks it takes to trigger the camera app, remap the button to the Code Scanner or the Magnifier tool, or disable it altogether. Since you can also set up the Action Button to do these things, you now have more choices over where you want your camera shortcut, or you can free up the Action Button (which replaced the old mute switch) to do something else.

Even if you’re not a glutton for buttons, there are still some camera updates that might intrigue you. This year’s flagships sport what Apple calls a 48-megapixel Fusion Camera, which has a faster quad-pixel sensor. This enables what the company describes as “zero shutter lag,” which is wording it has used repeatedly over the years. In this case, it’s referring to how quickly the camera will capture a shot after you press the shutter button (onscreen or hardware).

I will admit I was initially confused by this update, in part because it requires relearning some behaviors I had adopted to mitigate the shortfalls of older cameras. Basically, the iPhone 16 Pro’s cameras are now so fast that when I asked someone to throw something so I could capture it in motion to see how still the images were, my shots ended up being of the person holding the object.

Our video producer and I were very confused, and it wasn’t until the “zero shutter lag” concept was explained more clearly to me that I got it. I had become used to pressing the shutter early since cameras, in my experience, would be fractions of a second slow. Apple’s camera has become so fast that it actually captured the literal moment I tapped the button, instead of the split second after, when the object was in mid-air.

A woman flinging two cushions to her left, in a sample photo demonstrating the speed of the iPhone 16 Pro's 48-megapixel Fusion Camera. (Brian Oh for Engadget)

This is going to change how people take jump shots, I’m sure, but basically if you and your friends are taking pictures of yourselves floating in the sky, the photographer doesn’t have to hit capture before telling you to jump. I know this is a very specific and silly example, but it’s also the most relatable illustration of how much quicker the Fusion camera is.

Also, why can’t camera stories be silly and fun? That’s what a lot of the best moments in life are, and some of the new features are great in those situations. The support for 4K video at 120 fps in Dolby Vision, for example, led to some beautiful high-quality, rich and colorful clips of my friend’s adorable pomeranian trotting along on a walk. Her little tongue slowly peeking out as she bounded towards the camera looked crisp and smooth when I played it back at 25 percent and 20 percent speeds, too.

Depending on your mood, the new Photographic Styles can be fun or serious. Apple’s tweaked the built-in camera filters to not only offer more options but also give you greater control. Because the company has refined its processing each year, there’s also an improved depth map captured when it detects a face in the scene. This, combined with a greater focus on color science around skin tones, has led to what might be my favorite new iPhone 16 feature.

Whether I shot them in Portrait mode or not, photos of people that I took using the iPhone 16 Pro were a dream to edit. Simply switching between the Standard, Natural, Luminous, Quiet or Ethereal styles already resulted in improvements to the colors and shadow, but I could also tap on each thumbnail to access the new editing touchpad and drag a dot around. This let me more precisely tweak the hues and contrast levels, and an additional slider below let me adjust how warm the image was.

A composite of four sample photos featuring a woman gazing into the camera, each with a different Photographic Style applied. A label at the bottom right of each image shows which Style is used; from left to right, they are Standard, Ethereal, Luminous and Vibrant. (Cherlynn Low for Engadget)

An ugly selfie with my cousin in the hideous overhead lights of a meeting room became a beautiful snapshot after I switched to the Ethereal or Luminous styles. Both of those are quickly becoming my favorites, but I’m more impressed with how well Apple was able to segment the subject from the background. In almost every shot I edited, adjusting the slider mostly only changed the background, keeping people and their complexions within the realm of reality instead of applying harsh oversaturation or extreme contrast levels to them. The styles also added a background blur that lent a pleasant soft-focus effect, and most of the time the system accurately identified the outlines of people in the scene.

Perhaps my favorite part is the fact that you can change between styles after you’ve shot the photo on the iPhone 16. As someone who dwells on her Instagram filters and edit tools for some time before each post, I definitely appreciate how much nicer Apple’s versions are and only wish I could retroactively apply them to photos I had taken at a recent wedding. Alas, since the edits are dependent on information captured when the photos were taken, these new retouching features will only work for pictures taken with an iPhone 16 or 16 Pro.

One final camera update I’ll touch on before telling you about actual photo quality is Audio Mix. This uses the spatial audio now recorded by default with the new studio mics on the iPhone 16 Pro (or even the system on the iPhone 16 and 16 Plus) to understand the direction of sound sources in your footage. Then, when you edit the clip, you can choose between Standard, In-frame, Studio and Cinematic mixes, as well as drag a slider to reduce background noise.

You’ll have to be recording in fairly specific acoustic scenarios to get the most out of Audio Mix. I tested it in a variety of situations, like my cousin talking on his phone on a busy New York street, me interviewing my fellow gym buddies after a tiring workout with the background music quietly playing or my friend talking to me while his wife talks about something else off-camera in their fairly quiet kitchen.

For the most part, going to Cinematic or Studio modes from Standard resulted in a noticeable reduction in environmental noise. My favorite is Studio, which generally seemed to improve voice clarity as well, making people sound like they could be talking on a podcast. In-frame, however, rarely did what I expected and occasionally produced some warped distortion. It appears there might need to be more distance between various sources of sound for this to work best, and I’ll have to spend more time testing to better understand this tool. You can check out our review video for examples of a clip with different audio mixes, but for now, while the promised improvements aren’t what I expected, there at least appears to be some benefit to Audio Mix.

A composite of sample images from the iPhone 16 Pro and the Pixel 9 Pro. (Cherlynn Low for Engadget)

On to the actual photos and how they hold up against the competition. I’ve long considered Google’s Pixel phones to be the gold standard in smartphone photography, since I prefer the company’s color and detail processing. I know some people feel that Google tends to oversharpen, so bear in mind that, as with most things, your preference may be different from mine.

When I compared photos I took with both phones on the same laptop screen, the differences were minimal. Occasionally, Google would expose better, being more able to retain shadows near a bright light source than the iPhone 16 Pro. But the Pixel’s nightscape shots had more light leakage into the sky, whereas Apple was more adept at keeping the background dark against the outline of a skyscraper.

Honestly, at this point we’re really nitpicking and pixel-peeping to find differences. Both companies deliver great cameras, and though I still prefer Google’s approach to Portrait shots, Apple has been slowly but surely closing the gap with improvements to its depth maps every year.

I will mention, though, that a lot more of the photos I shot on the iPhone 16 Pro came out blurry than those I took on the Pixel 9 Pro, and it might have to do with the fact that I was using Camera Control to snap them. This was the issue I alluded to earlier, where using a physical button to take a picture is more likely to introduce shake than a software shutter. It’s not like Samsung or Google phones are immune to this problem, though I will say that the way Camera Control is built, where the recessed button depresses into the phone’s frame, does leave it a bit more vulnerable to this than, say, using a volume rocker might.

A composite of sample images from the iPhone 16 Pro and the Pixel 9 Pro, featuring colorful motorcycle parts on a table. (Cherlynn Low for Engadget)

Oh, and finally, a quick note for my Gen Z readers: I know how much you all prefer flash photography compared to night modes in low-light scenarios. (Thanks to my much younger cousin for the valuable insight.) I’ve done the testing and can say that I prefer Google’s Pixel 9 Pro for its softer, warmer flash compared to the iPhone 16 Pro’s, which is stronger and brighter, leading to my face looking washed out.

It’s been about two months since the public beta for iOS 18 was released, and it was nice to get a taste of upcoming features like the newly customizable home screen, expanded Tapback reactions and the redesigned Photos app. With the iPhone 16 launch, iOS 18 is basically ready for primetime… with some caveats.

This year, more than ever, it’s hard to figure out what’s coming to your iPhone and what isn’t. With the release of Apple Intelligence slated for October, features like writing tools, Cleanup for photos and the redesigned Siri won’t be ready till next month. And even then, your non-pro iPhone 15 won’t be compatible.

Plus, some features that were teased at WWDC, like Genmoji, still haven’t been added to the iOS 18.1 developer beta, which is where most Apple Intelligence features have been arriving as a preview for app makers. Within the iPhone 16 lineup, too, there are things coming only to the Pro models, like multilayer recording in Voice Memos.

It’s confusing, and can make choosing your iPhone a trickier decision. But for this review at least, the iPhone 16 Pro and Pro Max are getting everything. I cannot wait to try out multi-track recording in Voice Memos, and I hope Apple sees this yearning as a sign that it should bring the feature to more devices.

It was nice to get time with iOS 18, even in the absence of Apple Intelligence. Honestly, I’m not even sure I’d like those features that much. In a similar way, Gemini AI was nice on the Pixel 9 Pro series, but its features didn’t feel like must-haves.

One of the new iOS 18 touches I noticed immediately was the refreshed Control Center, which took some getting used to, as I had to relearn how to swipe back to the home screen now that there are more pages to scroll through. I especially enjoyed seeing the new little chat bubble appear on my voice recordings, indicating that a transcript had been generated for them. And though I haven’t exchanged messages with Android-toting friends yet, I’m glad to see RCS support is finally live this week.

The bottom half of both the iPhone 16 Pro and iPhone 16 Pro Max standing on a table. (Brian Oh for Engadget)

Though I was excited for the new custom routes tool in Maps, I struggled to actually create them. You can set your start and end points and have the app close the loop for you, or just tap landmarks or points on the map to get the route to basically connect the dots. Unfortunately, no matter how many times I tried to get the route to cut through a building where I knew a pedestrian walkway existed, Maps resisted me at every turn, forcing the route to go through more established (and therefore more crowded) paths instead. It’s not unreasonable, but certainly not the open-world route-creation feature I was envisioning.

The best thing about iOS 18, and about some of the new features in the iPhone 16 lineup (like Camera Control), is the customizability. I do appreciate that if you don’t like something, you can usually turn it off. With the new ability to place apps outside of a rigid grid, you can now lay your home screen out just the way you like. The redesigned Photos app lets you create and pin collections so you can more easily find the pictures most important to you. And again, I’m glad Apple is giving people the option to turn off Camera Control altogether or adjust its sensitivity.

The iPhone 16 Pro and Pro Max are powered by Apple’s A18 Pro chip, which is built on “second-generation 3-nanometer technology and [features] a new architecture with smaller, faster transistors.” All this is meant to deliver “unprecedented efficiency,” according to Apple’s press release.

Some small software glitches aside, I’ve never run into slowdown on the iPhone 16 Pro, but I was certainly surprised by the smaller handset’s battery life. In general, the iPhone 16 Pro would barely last a full day, which is reminiscent of the iPhone 15 Pro, too. It’s worth noting that before this review I was primarily using an iPhone 15 Pro Max as my daily driver, which usually gets through a day and a half with no problem, so the drop in endurance is even more pronounced for me.

Most days, I’d pick up the iPhone 16 Pro at about 9AM and would get to about 9PM before getting low battery alerts. If I started the day a bit later, closer to 11AM for instance, I got to 1AM before the iPhone 16 Pro ran completely dry. On Sunday, I unplugged the phone at about 9:30AM and was shocked on the train home to get a warning that remaining power was at just 20 percent. It was only 6:50PM, and the night had barely started!

You’ll get significantly better battery life on the iPhone 16 Pro Max, which delivers the same almost two-day runtime as its predecessor. And sure, a phone with a smaller battery not lasting as long makes mathematical sense. But considering the Pixel 9 Pro is a comparably sized handset and manages to last about two days, there’s no excuse for the iPhone 16 Pro to conk out before the night is up.

A white iPhone 16 Pro and a Desert iPhone 16 Pro Max standing on a table. (Brian Oh for Engadget)

One of the best things about the iPhone 16 Pro lineup is that, unlike last year, there isn’t much of a tradeoff in cameras if you opt for the smaller device. The iPhone 15 Pro Max had a 5x telephoto zoom camera, while the iPhone 15 Pro only went up to 3x. As a budding photographer of skittish wild animals, I opted for the Max, especially since it was much lighter than its predecessor thanks to the titanium build.

With the iPhone 16 Pro having essentially the same camera system as the Pro Max, I thought it was time for me to go back to a size that was much easier on my hands. Alas, with the disappointing battery performance, I might just have to stick with a Max, and you might too.

There are also the non-Pro iPhone 16 models to consider, and just as there were fewer differences than ever between the Pro and Pro Max, the tradeoffs aren’t as significant this year, either. Apple brought the previously Pro-exclusive Action button to the iPhone 16 and iPhone 16 Plus, while also including the Camera Control on its less-premium phones. The main things that set the two lines apart this year are processors, screen quality, camera sensors and onboard mics. You’ll lose support for ProRaw photos and multi-layer recording by opting for the cheaper devices, too.

Otherwise, you’ll still have all the iOS 18 and Apple Intelligence features coming to the Pros, as well as spatial audio recording, which enables the Audio Mix I described in the camera section earlier.

The iPhone 16 Pro and iPhone 16 Pro Max held in mid-air with their backs facing up. (Cherlynn Low for Engadget)

Apple’s caution is sometimes warranted. Especially at a time when mistrust of AI-generated content runs rampant, the company taking its time to get Apple Intelligence right is understandable. But its deliberation doesn’t always lead to winners. While I appreciate the attempt to differentiate Camera Control with the touch sensor for more versatility, I’m not yet convinced of its usefulness.

The good news is, and I cannot stress this enough, you have the option to tune it to your liking. And that’s a theme I’m seeing in recent Apple features that hint at more thoughtfulness than usual. If you don’t like something, or if something isn’t right for your needs, you can adjust or disable it. In iOS 18, you have greater control over your home screen’s app layout and can pin custom collections for easier reach in the Photos app. The Action button introduced last year could have been a spectacular fail had Apple not let you still keep it as a mute switch, but it managed to give people more functionality while maintaining the status quo for those who are just as resistant to change.

Change is scary. Change is hard. But without change there is no progress. Apple’s cautious approach is a tricky balancing act that’s evident on the iPhone 16 Pro. Some new features, like Audio Mix and custom routes in Maps, deliver mixed results. Others, like Photographic Styles, are hits. Then there are the basic ingredients, like good battery life and durable, attractive designs, that Apple cannot neglect.

The iPhone 16 Pro’s subpar battery life holds it back from beating the competition, which is stiffer than ever this year, especially from Google. Luckily for Apple, most people who have iPhones are going to stick with iPhones — it’s just easier. For those already sucked into the ecosystem, the iPhone 16 Pro (and particularly the Pro Max) are worth the upgrade from a model that’s at least two years old. If you already have an iPhone 15 Pro (or even a 14 Pro), for the sake of our planet and your wallet, you might prefer to hold off on upgrading, especially since this year’s devices aren’t that much different.

Apple halts iPadOS 18 update for M4 iPad Pro after bricking reports

Apple has temporarily paused the rollout of iPadOS 18 for M4 iPad Pro models, some of the most expensive iPads that the company sells, after some users complained that the update bricked their devices. Apple acknowledged the issue in a statement to Engadget, saying, “We have temporarily removed the iPadOS 18 update for M4 iPad Pro models as we work to resolve an issue that is impacting a small number of devices.”

The issue first came to light through Reddit, where a growing number of M4 iPad Pro users described how their iPads became unusable after they tried installing the latest version of iPadOS. “At some point during the update my iPad turned off, and would no longer turn on,” a user named tcorey23 posted on Reddit. “I just took it to the Apple Store who confirmed it’s completely bricked, but they said they had to send it out to their engineers before they can give me a replacement even though I have Apple care.”

Another Reddit user called Lisegot wrote that the Apple Store they took their bricked M4 iPad Pro to did not have a replacement in stock, which meant they would need to wait five to seven days for a working iPad. “No one was particularly apologetic and they even insinuated that there was no way for them to know whether the update caused this,” they wrote.

Having a software bug brick an iPad is rare. Ars Technica, which first reported this story, pointed out that iPads can typically be put into recovery mode if a software update goes bad.

If you own an M4 iPad Pro, Apple will no longer offer you iPadOS 18 until it fixes the issue. It’s not clear when it will be fixed.

Apple Music brings its audio haptics feature to all users as part of iOS 18

Apple’s Music Haptics feature is now live, as part of the official release of iOS 18. This is an accessibility tool that integrates with Apple Music on iPhones. Simply put, it uses the phone’s built-in haptics hardware, which the company refers to as the Taptic Engine, to play “taps, textures and refined vibrations to the audio of the song.”

This is quite obviously aimed toward those affected by hearing loss, allowing them to feel the music. It works with Apple Music, but also with Apple Music Classical and Shazam. The company says it’ll also integrate with some third-party apps, so long as the iPhone is connected to Wi-Fi or cellular. 

To get started, just head into the Accessibility settings menu and turn on “Music Haptics.” An easily identifiable logo will appear on the Now Playing screen in the Apple Music app when activated. Tapping this logo will pause the feature and tapping it again will turn it back on. Music Haptics is supported globally on iPhone 12 and later, as long as the device is updated to iOS 18.

To commemorate the launch, Apple Music has released a series of playlists that take advantage of the haptic technology. These channels have names like Haptics Beats and Haptics Bass, so they are filled with songs with plenty of opportunity for taps and vibrations.

People have already been experimenting with the feature. Some users have suggested that it “sounds like an Atari game” when a phone is placed on a box with Music Haptics turned on. I don’t agree but, well, listen for yourself.

AirPods Pro 2’s new features have arrived. Here’s what to expect

Prior to iOS 18's arrival, Apple released a firmware update for the AirPods Pro 2 that will deliver new features the company announced at WWDC in June. Now that the latest version of the mobile OS is available, your iPhone can fully employ the new tools, which include Siri Interactions, Voice Isolation and more. Your AirPods Pro 2 should have already installed the update and be ready to go when you upgrade to iOS 18, so here's what to expect when you use the new features. 

Siri Interactions allow you to interact with your phone at times when you can't or don't want to speak or reach for your phone. Machine learning on the H2 chip and transformer models on a source device (iPhone, iPad, Mac and Apple Watch) can detect when you nod affirmatively or shake your head. This can be used any time Siri asks a yes or no question, like accepting or rejecting calls, responding to or dismissing messages and engaging with or dismissing notifications. 

So far, Siri Interactions have worked as described for me. I like that the tech recognizes smaller head movements, so you don't have to exaggerate them to get the system to respond. I've found the feature most helpful for incoming calls and texts, especially when my hands are full or when I'm in a setting where I can't immediately speak.

Voice Isolation is a new feature that taps the AirPods Pro 2 H2 chip and the source device (iPhone, iPad or Mac) for advanced machine learning to enhance how you sound on calls. The tech isolates your voice so it can effectively cancel significant amounts of background noise, and for some distractions, it will eliminate them entirely. During my tests, Voice Isolation totally blocked a noisy fan and running water. It's truly impressive how the roar that's otherwise obvious on a call is completely absent when this is enabled. It's also great that the tool works its magic with minimal impact to overall voice quality.

AirPods Pro (2022) review photo. (Billy Steele/Engadget)

The feature is enabled automatically in your microphone settings, where you'll find options for Automatic, Standard and Voice Isolation. Here, you can activate Voice Isolation while you're on a call if you don't want the system to handle things on its own. The tool will also be supported in FaceTime and any third-party apps that use CallKit. Those include WebEx, Zoom, WhatsApp and many more. 
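
Voice Isolation is applied at the system level rather than inside each app, so it covers call audio from apps that report their calls through Apple's CallKit framework. As a rough illustration of what "uses CallKit" means on the developer side, here's a minimal, hypothetical Swift sketch; the CallManager class and its method names are made up for this example and aren't Apple's or any particular app's code.

```swift
import CallKit

// Minimal sketch of a VoIP app reporting calls through CallKit.
// Apps that present calls this way get the native call UI, and their call
// audio can pick up system microphone modes such as Voice Isolation.
final class CallManager: NSObject, CXProviderDelegate {
    private let provider: CXProvider

    override init() {
        let configuration = CXProviderConfiguration()
        configuration.supportsVideo = false
        configuration.maximumCallsPerCallGroup = 1
        provider = CXProvider(configuration: configuration)
        super.init()
        provider.setDelegate(self, queue: nil)
    }

    // Tell the system about an incoming VoIP call so it shows the standard call screen.
    func reportIncomingCall(from caller: String, uuid: UUID = UUID()) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: caller)
        provider.reportNewIncomingCall(with: uuid, update: update) { error in
            if let error = error {
                print("Failed to report incoming call: \(error)")
            }
        }
    }

    // MARK: - CXProviderDelegate

    func providerDidReset(_ provider: CXProvider) {
        // Clean up any in-progress calls here.
    }

    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // Start the app's audio session and connect the call, then:
        action.fulfill()
    }

    func provider(_ provider: CXProvider, perform action: CXEndCallAction) {
        // Tear down the call, then:
        action.fulfill()
    }
}
```

Apps structured this way hand call presentation and audio routing to the system, which is presumably how Apple can extend Voice Isolation to WebEx, Zoom, WhatsApp and the rest without each app building its own noise suppression.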

As a reminder, Siri Interactions and Voice Isolation are also available on the AirPods 4.

In addition to those two headliners, the update equips the AirPods Pro 2 with "the best wireless audio latency Apple has ever delivered for mobile gaming." What's more, gamers can expect improved voice quality, thanks to 16-bit, 48kHz audio when chatting during sessions. Apple says it also improved Personalized Volume on the AirPods Pro 2, but didn't go into specifics there. Personalized Volume is the tool that adjusts the media levels on your AirPods Pro 2 based on changes in environmental conditions and your volume preferences. Apple says that the feature learns your listening preferences over time to fine-tune adjustments as they're needed.

One of the biggest announcements from the iPhone 16 event was Apple's plan to turn the AirPods Pro 2 into a set of over-the-counter hearing aids for people with mild to moderate hearing loss. While the company has received FDA approval for the first software-based hearing aid solution that will be available without a prescription, the feature and the accompanying Hearing Test aren't ready just yet. Apple is planning to release the suite of hearing features as part of an update sometime this fall. 

The AirPods Pro 2 update is available for free over the air from your iPhone. You can check the version number under the AirPods settings when the earbuds are connected to an iOS device. You'll want to look for 7A294 to be sure you're running the latest version. If not, you can trigger the update by listening to music for around 30 seconds and then putting the AirPods Pro back in the case. If you notice that the earbuds don't immediately disconnect on the Bluetooth menu, that means the update is happening, so keep the case closed and near your phone until it completes. AirPods Pro will disconnect when the process is over. You'll need to make sure your iPhone is updated to iOS 18 as well. 

Apple has released iOS 18. Here’s how to update your iPhone

Finally out of beta, iOS 18 became available to the public on Monday afternoon. You can download and install it if your device is compatible, and it comes preloaded on all iPhone 16, iPhone 16 Plus and iPhone 16 Pro models, which will be available on September 20. Those with eligible devices can update by going to Settings > General > Software Update and starting the download and installation process.

To see if your device is eligible, we have a list of iPhone models that can support iOS 18. Check it out and see if yours will work.

Some of the “hidden” features our editor Cherlynn spotted include Apple Maps upgrades, Calendar integration with Reminders and expanded Tapback options in Messages, letting you see who reacted with which emoji. Safari is getting a “Highlights” function, which generates a summary of web pages you’re on via machine learning. Our UK bureau chief Mat Smith also tried out some early iOS 18 features in July, and his main takeaway was that Apple Intelligence is the real star. Unfortunately, Apple Intelligence isn’t out today, but its first features will become available in October as part of a subsequent update.

Besides iOS 18, all of Apple's other major operating system updates are available as well. That includes iPadOS 18, visionOS 2, macOS Sequoia, tvOS 18 and watchOS 11, all of which are coming to their respective devices today. Make sure to check if your devices are eligible for the update and that they have enough space. You may have to free up a few gigabytes of storage first.

Update, September 16, 8:17PM ET: Added more complete list of additional Apple OS updates that are now available, and additional context about Apple Intelligence (some, not all, of the features are arriving beginning in October).  

Apple’s 13-inch M2 iPad Air is back on sale for $720

It’s not too late to get that bright student in your life a back-to-school gift that could help with their studies. An iPad can be useful for note taking and writing papers (especially with a keyboard attachment), carrying out research and definitely not streaming a new TV obsession during class.

So if you’ve been lagging on a back-to-school gift or even just want to treat yourself to one of Apple’s iPads, you may be pleased to learn that the iPad Air is currently on sale. The 13-inch M2 iPad Air with 128GB of storage is available in purple, space gray and starlight for $720. That's a discount of $79, which is almost a record low. Be sure to clip the coupon on Amazon before heading to the checkout to get the full discount.

The 11-inch M2 iPad Air is also on sale. It's $50 off at $549 in the space gray, starlight and blue colorways.

The most recent version of the iPad Air is our pick for the best iPad for most people. Sure, it's more expensive than the base iPad, but it's much more capable. Other than the screen real estate, the main difference between the two iPad Air sizes is that the 13-inch variant has a brighter display (600 nits vs 500 nits). Although the fully laminated display is an improvement from the previous iPad Air, the refresh rate is 60Hz, so it doesn't look as smooth as the iPad Pro's 120Hz OLED panel.

It can handle basic tasks like web browsing and video streaming with ease, thanks to its M2 chipset and 8GB of RAM. The hardware can handle more demanding tasks too — it can run high-end games like Death Stranding and the Resident Evil 4 remake. The tablet is also compatible with Apple Intelligence, the suite of AI tools that Apple will start rolling out in October.

On top of all that, the M2 iPad Air should run for up to 10 hours or so on a single charge. It has a USB-C port for charging and peripherals, while the Touch ID fingerprint scanner is built into the power button.

Engadget review recap: Foldable, wearable, floatable

Hardware season is in full swing. Apple launched the iPhone 16, AirPods 4 and Apple Watch Series 10 on Monday this week. On the same day, at the very same time, the review embargo for the Pixel Watch 3 lifted, and we managed to get most of our piece up then. Not only that, we also saw Sony announce the PlayStation 5 Pro this week, plus in the last two weeks there has been plenty of news out of the IFA conference in Berlin. There were things like Huawei's tri-fold phone, reMarkable's Paper Pro tablet, DJI's $200 Neo drone, a new GoPro as well as more concept Lenovo laptops.

As you can imagine, it's been a hectic couple of weeks for those of us who cover consumer tech, and the events are far from over. Reviews of all the big products announced recently will also be coming soon, if they haven't already, and I am once again back to help you catch up on all the reviews we published in the last two weeks. I will also explain why there are some products we haven't written up, like the OnePlus Pad 2.

by Cherlynn Low and Sam Rutherford

Since our foldables expert Sam Rutherford is on parental leave, the task of reviewing the Pixel 9 Pro Fold became mine. But Sam, being the responsible and helpful reviewer that he is, took time out to share his thoughts and impressions with me. He even took the review photos for our piece, and I especially appreciate his using mahjong tiles as an interesting backdrop for his pictures. While I focused my testing on the Pixel 9 Pro Fold as a viable smartphone alternative and its use as a multimedia consumption device, Sam provided his insight by comparing Google's foldable to Samsung's Galaxy Z Fold 6.

Our review brings together those two perspectives, making for a fairly comprehensive analysis, if I do say so myself. We've got camera comparisons between Google and Samsung's offerings, with evaluation of both their software, battery performance, build, shape and more. 

I was also able to shoot a video encompassing all our reviews of the Pixel 9 family of phones, which covers the Pixel 9, Pixel 9 Pro and Pro XL as well as the Pixel 9 Pro Fold. The footage goes into some extra detail around things like the Add Me and Made You Look camera updates, as well as what Emergency SOS via Satellite looks like on a Pixel phone. Check it out at the top of this article!

by Cherlynn Low

Google didn't make our lives very easy with its Monday embargo on September 9th, especially with Apple's iPhone 16 launch event happening the same day. But the good news is, our review units had arrived about two weeks prior, so we had enough time with the Pixel Watch 3 to get a better sense for it in the real world. I spent my time with the smaller 41mm model while Sam was able to share some testing insight of the new larger 45mm variant.

Within a couple of days, Sam and I were trading notes about how impressed we were with the Pixel Watch 3's battery life. Since I hate wearing watches to sleep, Sam graciously filled me in on the watch's sleep-tracking and auto bedtime features. Once again, teamwork made the dream work here (quite literally for me, as I would not have been able to fall asleep otherwise). 

On my end, I focused on workouts and activity-tracking, double-wristing the Pixel Watch 3 with my Apple Watch Series 9 everywhere I went for two weeks. I was stoked that the Google smartwatch was better at automatically detecting my every walk, run and bike ride, but found it a little too thick compared to the competition. I also enjoyed the new customizable run workouts that let me set sprint and rest segments during my treadmill sessions.

I know that a day and a half sounds just about average in terms of battery life for modern smartwatches, but considering older models could barely last 24 hours, the improvements to runtime feel huge. Together with some Google and Pixel integrations, the Pixel Watch 3 finally feels like it's ready to take on the likes of Samsung and Apple's flagship wearables. 

by Billy Steele 

In this review, Billy once again shows us what to look for when getting gear for a party outside. Judging by the beautiful pictures accompanying his review, it's clear that when he's not testing headphones or grilling meats in his backyard, Mr. Grilly Steele spends ample time at the beach for (work-sanctioned) speaker testing. I support it.

With the UE Everboom, Billy makes clear that the sound quality isn't stellar, and the music lacks presence in the midrange. But if it's volume that you want, the Everboom delivers, beaming sound out in 360 degrees. Plus, it does so in a rugged, waterproof body that can also survive a toss into the pool, since, like the company's other speakers, it floats!

However, with a score of just 75, the Everboom didn't quite make the cut to be one of the products we award the Recommended title. You'll likely find a better device for your needs from competing brands like Marshall and Beats. 

With everything that's been happening in the industry and in our own lives lately, we have yet to review the OnePlus Pad 2. Or the 2024 Moto Razrs or Galaxy Watch Ultra, either, for that matter. We continue to test our review units so that experience can inform our evaluation of other products we write up. But time is a resource we never have enough of, and with companies constantly launching new products, it's hard to keep up. 

Our lead tablet reviewer (and deputy editor) Nathan Ingraham has been spending time with the OnePlus Pad 2, and he does have some thoughts to share. He's a fan of its build, finding it light and well-balanced despite being fairly large with a 12.1-inch screen. He also likes the display, appreciating its 301ppi pixel density. In fact, he called this "one of the nicest tablet screens I've seen outside of the iPad." 

I'll have to get Nate to look at a Samsung Galaxy Tab with a nice AMOLED panel before we make that official, but there are other things that set the OnePlus Pad 2 apart. If you own a OnePlus phone, Nate noted that "there are some smart software features" that could make this tablet a better option than an iPad or Galaxy Tab. But, as Nate points out, "the Android software situation, as always, is a rather unimpressive mixed bag." He called out multitasking and a lack of apps that make good use of larger screens as two areas that need improvement.

Still, for $550 (and currently going for $499 direct from OnePlus), Nate thinks the Pad 2 is a fair value. "I still can't recommend it over an iPad, but at least it doesn't break the bank."

Like I said at the start of this recap, there are plenty of gadgets hiding in our homes, ready to be tested. There's plenty more to come, including new Copilot+ PCs with AMD and Intel chips, as well as cameras, earbuds and more. If anyone watching this week's news were so inclined, they could probably make a very educated guess as to what we're getting ready to publish reviews of, especially given established timelines from previous years. That's all I'll say for now. 

I did want to say how thankful we are for your patience as we make it through this intensely hectic time of year, and you may see reviews go up this month that are updated after publication with additional details. This might be done in an effort to get articles up in a timely manner while still being able to provide comprehensive insight on our experiences over time. For instance, my Pixel Watch 3 review went up on Monday, but I was able to update it on Wednesday morning with a whole section on the Wear OS and Fitbit app experience. 

It's not anything that was new or that impacted my score, but was simply extra detail that I didn't have the time to write up while concurrently preparing for the iPhone event. With the packed tech launch calendar coming up, you may see us adopt a similar approach on reviews that might have tight deadlines. 

As always, we appreciate all your time reading and watching our work. Have a wonderful weekend.

iPads will support third-party app stores in Europe starting September 16

Apple has revealed it will allow iPad users in the EU to install third-party app stores on their tablets (without having to sideload them) starting on September 16. You'll need to install iPadOS 18, which will be available broadly on Monday, to do so.

Back in April, the European Commission designated iPadOS as a "core platform service," meaning that like iOS, the App Store and Safari, the operating system is subject to stricter rules under the bloc's Digital Markets Act. As TechCrunch notes, Apple had six months to update iPadOS so that it complied with the DMA, which included opening up the platform to third-party app marketplaces.

Epic Games has already pledged to bring its app marketplace to iPadOS, meaning that folks in the EU should be able to play Fortnite and Fall Guys natively on compatible iPads in the near future. Several other third-party app stores have arrived on iOS in the EU since Apple added official support in March.

While the likes of AltStore PAL and the Epic Games Store aren't subject to Apple's usual app review policies, the company notarizes them for security purposes. The developers of third-party app marketplaces also need to pay a Core Technology Fee to Apple once they meet certain thresholds (the EU opened an investigation into this fee in March).

One other key change coming to iPads with the rollout of iPadOS 18 is under the surface, but it may ultimately change how EU users browse the web on their iPads. Apple will allow third-party browsers to use their own engines on iPadOS instead of having to employ its WebKit. This means that the likes of Mozilla and Google will be able to offer iPad versions of Firefox and Chrome that run on their own tech.
