Apple iPhone 16 and iPhone 16 Plus review: Closing the gap to the Pro

The “regular” iPhone has become like a second child. Year after year, this model has gotten the hand-me-downs from the previous version of the iPhone Pro – the older, smarter sibling. The iPhone 15 received the iPhone 14 Pro’s Dynamic Island and A16 Bionic processor, and the iPhone 14 before that got the A15 Bionic chip and a larger Plus variant with the same screen size as the iPhone 13 Pro Max. For the iPhone 16 ($799 & up), there are trickle-down items once more. But this time around, that’s not the entire story for the Apple phone that’s the best option for most people.

Surprisingly, Apple gave some of the most attractive features it has for 2024 to both the regular and Pro iPhones at the same time. This means you won’t have to wait a year to get expanded camera tools and another brand-new button. Sure, Apple Intelligence is still in the works, but that’s the case for the iPhone 16 Pro too. The important thing is that the iPhone 16 will be just as ready as the Pro when the AI features arrive.

So, for perhaps the first time – or at least the first time in years – Apple has closed the gap between the iPhone and iPhone Pro in a significant way. ProRAW stills and ProRes video are still exclusive to the priciest iPhones, and a new “studio-quality” four-microphone setup is reserved for them too. Frustratingly, you’ll still have to spend more for a 120Hz display. But, as far as the fun new tools that will matter to most of us, you won’t have to worry about missing out this time.

Another year has passed and we still don’t have a significant redesign for any iPhone, let alone the base-level model. As such, I’ll spend my time here discussing what’s new. Apple was content to add new colors once again, opting for a lineup of ultramarine (bluish purple), teal, pink, white and black. The colors are bolder than what was available on the iPhone 15, although I’d like to see a true blue and perhaps a bright yellow or orange. Additionally, there’s no Product Red option once again — we haven’t seen that hue since the iPhone 14.

The main change in appearance on the iPhone 16 is the addition of two new buttons. Of course, one of those, the reconfigurable action button above the volume rockers, comes from the Pro-grade iPhones. By default, the control does the task of the switch it replaces: activating silent mode. But, you can also set the action button to open the camera, turn on the flashlight, start a Voice Memo, initiate a Shazam query and more. You can even assign a custom shortcut if none of the presets fit your needs.
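
That last option is worth dwelling on for a moment: anything in the Shortcuts app can be assigned to the action button, and apps feed Shortcuts through Apple’s App Intents framework. Purely as a rough sketch, with an invented intent name and behavior, here’s the sort of in-app action the button could end up triggering:

```swift
import AppIntents
import Foundation

// Hypothetical example: a tiny intent that tallies coffees. The name, key
// and dialog are invented for illustration; only the AppIntent protocol
// itself is the real Apple API surface here.
struct LogCoffeeIntent: AppIntent {
    static var title: LocalizedStringResource = "Log a Coffee"
    static var description = IntentDescription("Adds one coffee to today's tally.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A shipping app would persist this properly; UserDefaults keeps the sketch self-contained.
        let defaults = UserDefaults.standard
        let count = defaults.integer(forKey: "coffeeCount") + 1
        defaults.set(count, forKey: "coffeeCount")
        return .result(dialog: "Coffee logged.")
    }
}
```

Once an app exposes an intent like this, it shows up in the Shortcuts app automatically, and from there it’s one tap away from living on the action button.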

While Apple undoubtedly expanded the utility of this switch by making it customizable, regular iPhone users will have to get used to the fact that the volume control is no longer the top button on the left. This means that when you reach for the side to change the loudness, you’ll need to remember it’s the middle and bottom buttons. Of course, the action button is smaller than the other two, so with some patience you can differentiate them by touch.

The new Camera Control button can open the camera app from anywhere.
Billy Steele for Engadget

Near the bottom of the right side, there’s a new Camera Control button for quick access to the camera and its tools. A press will open the camera app from any screen, and a long press will jump straight to 4K Dolby Vision video capture at 60 fps. Once you’re there, this button becomes a touch-sensitive slider for things like zoom, exposure and lens selection. With zoom, for example, you can scroll through all of the options with a swipe. Then with a double “light press,” which took a lot of practice to finally master, you can access the other options. Fully pressing the button once will take a photo — you won’t have to lift a finger to tap the onscreen buttons.

Around back, Apple rearranged the cameras so they’re stacked vertically instead of diagonally. It’s certainly cleaner than the previous look, and the company still favors a smaller bump in the top left over something that takes up more space or spans the entire width of the rear panel (Hi Google). The key reason the company reoriented the rear cameras is to allow for spatial photos and videos, since the layout now enables the iPhone 16 to capture stereoscopic info from the Fusion and Ultra Wide cameras.

The iPhone 16 and 16 Plus have a new 48-megapixel Fusion camera that packs a quad-pixel sensor for high resolution and fine detail. Essentially, it’s two cameras in one, combining – or fusing, hence the name – a 48MP frame and a 12MP one that’s fine-tuned for light capture. By default, you’ll get a 24MP image, one that Apple says offers the best mix of detail, low-light performance and an efficient file size. There’s also a new anti-reflective coating on the main (and ultrawide) camera to reduce flares.

The 12MP ultrawide camera got an upgrade too. This sensor now has a faster aperture and larger pixels, with better performance in low-light conditions. There’s a new macro mode, unlocked by autofocus and able to capture minute detail. This is one of my favorite features, since sharp images of small objects have never been in the regular iPhone’s arsenal (only the Pros’), and the macro tool has worked well for me so far.

The iPhone 16, like its predecessors, takes decent stills. You’ll consistently get crisp, clean detail in well-lit shots and realistic color reproduction that doesn’t skew too warm or too cool. At a concert, I found that the iPhone 16’s low-light performance is noticeably better than the iPhone 15’s. Where the previous model struggled at times in dimly lit venues, my 2x zoom shots with this new model produced better results. There wasn’t a marked improvement across the board, but most of the images were certainly sharper.

Macro mode on the iPhone 16 camera is excellent.
Billy Steele for Engadget

The most significant update to the camera on the iPhone 16 is Photographic Styles. Apple has more computational image data from years of honing its cameras, so the system has a better understanding of skin tones, color, highlights and shadows. Plus, the phone is able to process all of this in real time, so you can adjust skin undertones and mood styles before you even snap a picture. Of course, you can experiment with them after shooting, and you can also assign styles to a gallery of images simultaneously.

Photographic Styles are massively expanded and way more useful, especially when you use them to preview a shot before you commit. My favorite element of the updated workflow is a new control pad where you can swipe around to adjust tone and color. There’s also a slider under it to alter the color intensity of the style you’ve selected. For me, the new tools in Photographic Styles make me feel like I don’t need to hop over to another app immediately to edit since I have a lot more options available right in the Camera app.

As I’ve already mentioned, Camera Control is handy for getting quick shots, and the touch-sensitivity is helpful with settings, but I have some gripes with the button. Like my colleague Cherlynn Low mentioned in her iPhone 16 Pro review, the placement causes issues depending on how you hold your phone, and may lead to some inadvertent presses. You can adjust the sensitivity of the button, or disable it entirely, which is a customization you might want to explore. What’s more, the touch-enabled sliding controls are more accurately triggered if you hold the phone with your thumbs along the bottom while shooting. So you may need to alter your grip to get the best performance.

Like I noted earlier, the new camera layout enables spatial capture of both video and photos on the iPhone 16. This content can then be viewed on Apple Vision Pro, with stills in the HEIC format and footage at 1080p/30fps. It’s great that this isn’t reserved for the iPhone 16 Pro, but the downside (for any iPhone) is file size. When you swipe over to Spatial Mode in the camera app, you’ll get a warning that a minute of spatial video is 130MB and a single spatial photo is 5MB. I don’t have one of Apple’s headsets, so I didn’t spend too much time here since the photos and videos just appear normal on an iPhone screen.

I’d argue the most significant advantage of Spatial Mode is Audio Mix. Here, the iPhone 16 uses the sound input from the spatial capture along with “advanced intelligence” to isolate a person’s voice from background noise. There are four options for Audio Mix, offering different methods for eliminating or incorporating environmental sounds. Like Cherlynn discovered on the iPhone 16 Pro, I found the Studio and Cinematic options work best, with each one taking a different approach to background noise. The former makes it sound like the speaker is in a studio while the latter incorporates environmental noise in surround sound with voices focused in the center – like in a movie. However, like her, I quickly realized I need a lot more time with this tool to get comfortable with it.

Plain ol' black is an option this time around.
Billy Steele for Engadget

Apple proudly proclaimed the iPhone 16 is “built for Apple Intelligence,” but you’ll have to wait a while longer to use it. When features like AI-driven writing tools, summaries of audio transcripts and a prioritized inbox arrive, they’ll work on the base iPhone 16, so you won’t need a Pro to use them. Genmoji and the Clean Up photo-editing assist are sure to be popular as well, and I’m confident we’re all ready for a long-overdue Siri upgrade. There’s a lot to look forward to, but none of it is ready for the iPhone 16’s debut. The iOS 18.1 public beta arrived this week, though, so we’re inching closer to a proper launch.

Sure, it would’ve been nice for the excitement around the new iPhones to include the first crack at Apple’s AI. But, I’d rather the company fine-tune things before a wider release to make sure Apple Intelligence is fully ready and, more importantly, fully reliable. Google has already debuted some form of AI on its Pixel series, so Apple is a bit behind. Still, I don’t mind waiting longer for a useful tool if the alternative is rushing out buggy software.

What will be available on launch day is iOS 18, which delivers a number of handy updates to the iPhone, many of which deal with customization. For the first time, Apple is allowing users to customize more than the layout of their Home Screen. You can now apply tints and colors to icons, resize widgets and apps, and lock certain apps to hide sensitive info. The Lock Screen controls can be customized too, letting you swap in the tools you use most often, which is handier now that the iPhone 16 has a dedicated camera button on its frame. There’s a big overhaul to the Photos app as well, mostly focused on organization, that provides a welcome bit of automation.

The iPhone 16 uses Apple’s new A18 chip with a 6-core CPU and 5-core GPU. There’s also a 16-core Neural Engine, which is the same as both the iPhone 15 and the iPhone 16 Pro. With the A18, the base-level iPhone jumped two generations ahead compared to the A16 Bionic inside the iPhone 15. The new chip provides the necessary horsepower for Apple’s AI and demanding camera features like Photographic Styles and the Camera Control button. I never noticed any lag on the iPhone 15, even with resource-heavy tasks, and those shouldn’t be a problem on the iPhone 16, either. But, we’ll have to wait and see how well the iPhone 16 handles Apple Intelligence this fall.

Of course, the A18 is more efficient than its predecessors, which is a benefit that extends to battery life. Apple promises up to 22 hours of local video playback on the iPhone 16 and up to 27 hours on the 16 Plus. For streaming video, those numbers drop to 18 and 24 hours respectively, and they’re all slight increases from the iPhone 15 and 15 Plus.

Starting at 7AM, I ran my battery test on the iPhone 16 and had 25 percent left at midnight. That’s doing what I’d consider “normal” use: a mix of calls, email, social, music and video. I also have a Dexcom continuous glucose monitor (CGM) that’s running over Bluetooth and I used the AirPods 4 several times during the day. And, of course, I was shooting photos and a few short video clips to test out those new features. While getting through the day with no problem is good, I’d love it if I didn’t have to charge the iPhone every night, or rely on low-power mode to avoid doing so.

On a related note, Apple has increased charging speeds via MagSafe: you can now get a 50 percent top-up in around 30 minutes, thanks to 25W charging when using a 30W or higher power adapter.

With the iPhone 16, Apple has almost closed the gap between its best phone for most people and the one intended for the most demanding power users. It’s a relief to not pine for what could be coming on the iPhone 17 since a lot of the new features on the iPhone 16 Pro are already here. And while some of them will require time to master, it’s great that they’re on the iPhone 16 at all. There are some Pro features you’ll still have to spend more for, like ProRAW photos, ProRes video, a 120Hz display, a 5x telephoto camera and multi-track recording in Voice Memos. But those are luxuries not everyone needs. For this reason, the regular iPhone will likely suit your needs just fine, since splurging on the high-end model has become more of an indulgence than a necessity.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/apple-iphone-16-and-iphone-16-plus-review-closing-the-gap-to-the-pro-120050824.html?src=rss

The iOS 18.1 public beta is here, bringing Apple Intelligence (almost) to the masses

Apple Intelligence is edging closer to being ready for primetime. Apple has released the public beta of iOS 18.1, which includes some of the major generative AI features that the company has been talking up over the last few months.

We'll have to wait a few more weeks for the public versions of iOS 18.1, iPadOS 18.1 and macOS Sequoia 15.1 to bring Apple Intelligence features to everyone with a compatible device. The public betas should be more stable and less risky to install than the developer betas, but it's still definitely worth backing up your data to your computer and/or iCloud before putting this build of iOS 18.1 on your iPhone.

Right now, the only iPhones that support Apple Intelligence are the iPhone 15 Pro and iPhone 15 Pro Max, but that will change on Friday when Apple ships the iPhone 16 lineup. M-series iPads and Macs will support Apple Intelligence too.

For now, you'll need to have your device and Siri language set to US English to access Apple Intelligence tools. If you want to use Apple Intelligence in a language other than English (or in a localized version of English), you may need to wait until at least December for the public versions of the operating systems that support it.

Apple is gradually rolling out Apple Intelligence tools over the coming months, so not all of them will be available right away. The initial wave of features includes the ability to transcribe phone calls (and audio notes in the Notes app) and get summaries of the key details. Writing tools (rewriting, proofreading and summarizing), email prioritization and smart replies, notification summaries and the Clean Up photo-editing tool are also on the docket. You'll be able to create memories in the revamped Photos app and check out the first incarnation of the redesigned, glowing Siri (including the ability to type requests to the assistant).

You'll need to wait longer for certain other features, including ChatGPT integration, Genmoji, Image Playground (i.e. image generation) and Siri's ability to better understand personal context. Apple will roll those out over the coming months.

On your iPhone, go to Settings > General > Software Update > Beta Updates and select the iOS 18 public beta option. Once the iOS 18.1 public beta is available for your device, you'll be able to see it on the software update page. You might need to free up some space before you can install the beta. To enable Apple Intelligence, go to Settings > Apple Intelligence & Siri > Join the Apple Intelligence waitlist.

The public beta installation process is almost identical on iPad. On your Mac, you'll need to go to System Settings > General > Software Update. Click the info symbol next to the "Beta Updates" option and you should be able to install the macOS Sequoia 15.1 public beta from there when it's available.

This article originally appeared on Engadget at https://www.engadget.com/ai/the-ios-181-public-beta-is-here-bringing-apple-intelligence-almost-to-the-masses-175248580.html?src=rss

Early Prime Day deals include this Anker 10K magnetic power bank for only $40

I've been a big fan of Anker ever since I picked up the Nano Portable Charger a while back. And now I'm tempted to pick up another of the brand's power banks thanks to early Prime Day deals. There's currently a 50 percent discount on an older version of our top pick for iPhones in our best power banks guide. The Anker 633 magnetic battery pack is on sale for $40, down from $80 — a new all-time low price.

The MagGo charger has a 10,000mAh battery and offers 20W of high-speed power with a USB-C charging cable (which it includes). The power bank is also a great wireless option, with magnets grabbing hold of your phone and a kickstand keeping it elevated during charging. When you're on the go or storing it, that kickstand folds right in to create a solid, smooth block. 

On a related note, Anker has just recalled some of its products sold between January 3 and September 17, 2024. Anker found that some of the lithium-ion batteries it used have a manufacturing defect that can present a fire risk. The power bank on sale here is not affected, but two other magnetic power banks are. You can see exactly which items and specific serial numbers have been recalled here, and thankfully, Anker has already scrubbed those listings from Amazon so no one can buy them anymore.

Follow @EngadgetDeals on Twitter for the latest tech deals and buying advice in the lead up to October Prime Day 2024.

This article originally appeared on Engadget at https://www.engadget.com/deals/early-prime-day-deals-include-this-anker-10k-magnetic-power-bank-for-only-40-141229742.html?src=rss

Apple confirms expanded language support for Apple Intelligence in 2025

The rollout of Apple Intelligence will be fairly slow-paced, with Apple gradually adding new features and support for more languages over the coming months. The company has now confirmed support for several more languages as Apple Intelligence will be available in German, Italian, Korean, Portuguese and Vietnamese in 2025. That’s in addition to previously announced support for Chinese, French, Japanese and Spanish.

Apple will initially offer Apple Intelligence in the US in English with the release of iOS 18.1, iPadOS 18.1 and macOS Sequoia 15.1 in October. As such, you won’t have access to the tools immediately if you pick up an iPhone 16 when Apple’s latest smartphone lineup ships on Friday.

The tools will be available in localized English in Australia, Canada, New Zealand, South Africa and the UK in December. Apple will also start rolling out the features in India and Singapore in English next year. Further language support is to be announced.

There is one key thing worth noting as part of the Apple Intelligence rollout, however. Apple is not planning to broadly offer the tools in the European Union or Chinese mainland right away. So while you’ll be able to use Apple Intelligence in Portuguese or French, you might not necessarily be able to do so while you’re in Portugal or France.

“Apple Intelligence will not currently work if you are in the EU and if your Apple ID Country/Region is also in the EU,” Apple notes in a support article. “If traveling outside of the EU, Apple Intelligence will work when your device language and Siri language are set to a supported language.”

Also, as things stand, Apple Intelligence won’t work on phones bought on the Chinese mainland. Those traveling to China with an iPhone they bought elsewhere also won’t have access to the tools if their Apple ID Country/Region is set to mainland China.

Apple is hoping to bring Apple Intelligence to the EU and China, however. The company told TechCrunch that it's in talks with regulators in both markets over the issue. Apple is initially withholding the AI tools from the EU over concerns related to the Digital Markets Act.

Update 9/18 10:41AM ET: Added a note that Apple is in discussions with the EU and China over Apple Intelligence. 

This article originally appeared on Engadget at https://www.engadget.com/ai/apple-confirms-expanded-language-support-for-apple-intelligence-in-2025-140548274.html?src=rss

Apple reveals how it’s made the iPhone 16 series (much) easier to repair

Apple has slowly been making its devices easier to fix, but the iPhone 15 fell short in a couple of key areas, according to the repairability site iFixit. Namely, the battery was hard to remove and the device suffered from a "parts pairing" issue that meant you couldn't easily replace the LiDAR sensor with one from another phone. With those two problems, iFixit gave the iPhone 15 a relatively low 4/10 repairability score.

Apple has now released new details on iPhone 16 repairability, and it appears to have addressed both of those issues and a bunch more. The company says it tries to strike a balance between durability and repairability, and it focused particularly on the latter with its latest devices.

There's now an entirely new way to remove the battery that's supposed to make it easier. By running a low voltage electrical current through the new ionic liquid battery adhesive (using a 9V cell, for instance), the battery will release itself from the enclosure. This makes removal faster and safer compared to previous stretch release adhesives, according to the company.

At the same time, Apple made changes to the Face ID sensor hardware starting with the iPhone 16 and iPhone 16 Pro. Now, the TrueDepth Camera can be swapped from one unit to another without compromising security or privacy. Before, only Apple was able to do that type of repair.

Another big change is the new Repair Assistant, designed to address parts pairing issues. That lets customers and repair professionals configure both new and used Apple parts directly on the device, with no need to contact Apple personnel. Repair shops previously needed to order official components directly from Apple and get on the phone with an employee before iOS would accept individual parts replacements.

Apple added newly repairable modules too, saying the TrueDepth Camera can now be configured on-device for iPhone 12 and later, eliminating the need for a tethered Mac. In addition, the LiDAR scanner on iPhone Pro models is now serviceable with the rear camera module.

Another big change is on-device access to diagnostics. Starting with iOS 18, Apple diagnostics for repair will be available on device, so customers can determine which parts need to be replaced without the need for a second device.

Finally, the company announced new support for third-party and used Apple parts. If a third-party part can't be calibrated on Apple's cloud-based servers, the iPhone or other device will try to activate the part and operate it to its full capability, while showing the repair history within settings. Used Apple parts can soon be calibrated and will appear as a "used" part in the device's repair history. Another future update will enable True Tone for third-party displays and battery health for third-party batteries. In addition, the LiDAR Scanner and front camera will still work when the module is replaced and left unconfigured. 

All told, the iPhone 16 series looks to have one of the biggest jumps in repairability yet, with improvements in physical access, parts compatibility and parts pairing. We'll soon see if that's reflected in iFixit's impending repairability score.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/apple-reveals-how-its-made-the-iphone-16-series-much-easier-to-repair-120055256.html?src=rss

iPhone 16 Pro and Pro Max review: Apple focuses on cameras and customization

It may seem like Apple is behind the competition a lot of the time. The company appeared to be slow to developments like widgets, bezel-less displays with camera notches and screens with high refresh rates. And with the iPhone 16 Pro, it appears to once again be late to the party, bringing generative-AI features and a real button for the camera to its 2024 flagship. But if you'll allow me to play therapist for a moment, I think it's not that Apple is slow. I think Apple is cautious. Perhaps overly so.

Caution on its own isn't a bad trait — in fact, it could be considered thoughtful. Rather than rush to the cutting edge with its peers, Apple deliberates, usually finding a slightly different approach that is often an improvement on what's out there. Just look at the Vision Pro headset or Apple Silicon. Or even the iPod, the iPad and the AirPods, which were far from the first of their kind when they launched.

With the iPhone 16 Pro, the focus is on cameras and Apple Intelligence. The problem is, Apple Intelligence isn't quite here yet. We can test some features in the developer beta that's currently available, but that's not necessarily the same as the experience the public will get when the update rolls out in October. It’s not unprecedented for new iPhones to launch without some marquee features, sure, and thankfully there's still plenty that the iPhone 16 Pro brings. From Camera Control, the Fusion Camera and other video-related updates to slightly bigger displays and iOS 18, the iPhone 16 Pro and Pro Max are intriguing successors, even absent the vaunted Intelligence features that are still to come.

I’m getting deja vu. Looking back at my review of the iPhone 15 Pro, I see a picture of that phone and its predecessor lined up side by side to show just how much thinner the bezels are. Apple has once again trimmed the borders on its flagship phones, but while doing that enabled it to reduce the handsets’ size in 2023, this year it allowed the company to cram in larger screens without much change in footprint.

The iPhone 16 Pro and Pro Max displays have increased in size from 6.1 inches and 6.7 inches up to 6.3 inches and 6.9 inches, respectively. Both handsets have grown ever so slightly, too, by just under 1mm in width and about 3mm in height.

Basically, the iPhone 16 Pro and Pro Max are a hair wider and taller than their predecessors, but maintain the same 8.25mm (0.32-inch) profile. And yet, in spite of this minimal change, you won’t be able to keep your old cases if you’re upgrading from an iPhone 15 Pro to an iPhone 16 Pro.

Not only would the cases not quite fit, you’d also need something with either a cutout or a sapphire crystal and conductive layer to be able to use the new Camera Control. Of course, Apple sells compatible cases, as do some third parties like Otterbox, so you have plenty of options.

I’ve spent most of this year’s hardware review season remarking how Samsung and Google’s flagships feel like iPhones, and I’ve now reached a strange inception point. As I’ve been comparing competing phones for this review, I’ve been surrounded by about a dozen handsets from all these different companies on my couch, including last year’s iPhones, the Galaxy S24 Plus and the Pixel 9 Pro and Pro XL. Trying to figure out which one is the iPhone has become more confusing than ever, as they all feel similar in build. The best way to verify at a glance is looking at their camera arrays or my wallpaper.

All that is to say that the iPhone 16 Pro feels similar to its predecessor, which is what these other companies have been attempting to emulate. Apple would be right to feel flattered by this imitation, and yet I have to wonder if it’s time to do something different. Google’s Pixel 9 Pro XL is actually a whole six grams lighter than the iPhone 16 Pro Max at 221 grams (7.79 ounces), and I’m absolutely smitten by its rich pink hue and shiny edges. Though I like the new golden Desert color for the iPhone 16 Pro, I do wish Apple’s premium flagship had more fun and vibrant exteriors. That said, I do love the base iPhone 16 in pink, teal and Ultramarine.

Close-up shots of the bottom half of the iPhone 16 lineup, featuring, from left to right, pink, teal, white and gold phones.
Brian Oh for Engadget

Arguably the biggest change to the iPhone 16 lineup, not to mention the iPhone 16 Pro, is the introduction of Camera Control. This is a button on the right side of the device, which has touch and pressure sensors on it to enable greater control with swipes and semi-presses. (That’s in addition to the Action Button on the top left that was added to last year’s Pros, and carries over to the iPhone 16 and iPhone 16 Plus, too.)

One of the things this was supposed to do was let you push lightly on the button to trigger focus, similar to what half pressing a DSLR shutter button would do. That function won’t be available at launch, so I can’t say if it’s effective.

But by and large, Camera Control is a very Apple approach to a feature that has been around for years. From phones by Sony and Nokia with dedicated shutter buttons to Android handsets with hardware-based double-click shortcuts, the notion of quick access to your camera without having to futz with the screen is a popular one. For good reason, too — I’ve hated having to swipe or long-press the icon on my iPhone’s lock screen in the past, and even though I could set the iPhone 15 Pro’s Action button to open the camera, it just wasn’t positioned well and I’d have to give up my mute button.

So Apple isn’t breaking new ground with its hardware shortcut for a frequently used app. But it does do a few things differently with the touch sensor. You can swipe on it to tweak things like exposure, zoom levels and tone, and the half-press still works as a way to select options or go back out of menus within the new Camera Control interface. In theory, it’s a nice way to make changes on the fly.

In reality, there were a few issues, and they largely have to do with placement. The button sits a little farther from the base of the phone than I’d like, so my fingers have to reach a bit more to press it, whether I was in landscape or portrait mode. This wasn’t usually a problem when I had both hands free and could steady the iPhone with my other hand and readjust my grip.

But if you’re trying to take a quick shot with just one hand, the button’s location can feel unintuitive. Of course, everyone has different finger lengths and ratios, so it’s entirely possible that other people find this logical. It also depends on your grip — if you’re cradling the bottom of the device in your palm, it’s harder to maneuver. If you’re covering part of the screen and reaching for the button head-on, it’s slightly easier to use Camera Control.

The iPhone 16 Pro held up in mid air by two hands, with the camera app open and showing people walking on a New York City street.
Brian Oh for Engadget

Still, even for those with the strongest claws, swiping and half-pressing and double-half-pressing on the sensor is tricky. I was only ever really able to do that if I had my thumb holding up the bottom edge and my middle, ring and little fingers steadying the right end of the phone. Maybe this is a new camera grip I just need to relearn for this button.

The awkward placement is a minor gripe compared to what I found most annoying: the button’s touch sensor. Not only was it difficult to swipe through different settings when holding the device with one hand, it also reacts to accidental touches and swipes. Sometimes, the phone would slide down my palm and change the exposure or zoom level, completely ruining the vibe. I should point out that you can go into accessibility settings to either tweak the swipe sensitivity or turn it off altogether, if it really bothers you. Honestly, if you’re planning on making adjustments with Camera Control, it’s best to have time, patience and both hands free.

In those situations, I had a lot of fun editing settings and watching them be reflected in the viewfinder in real time. I also liked zooming in and out of subjects, recomposing a shot and tweaking exposure till I liked what I saw, before then pushing down to snap the picture. (This action does lead to some small issues, but more on the actual photo quality later.) I especially loved this while recording video, since it makes slowly zooming in or out of a subject smoother than using the onscreen slider.

Then again, for scenarios where I just want to fire off a quick shot without worrying about exposure or zoom settings, the pain of finagling with the sensor mostly goes away. In exchange, being able to rapidly snap pictures is a joy. I found myself taking more pictures than ever thanks to Camera Control, which, if you know me, is a feat worthy of the Guinness Book of Records.

A random person cut me off in line? Click. Funny sign on a building I pass by in a Lyft? Click, click. From your lock screen, you’ll have to press the button twice — once to wake the phone up and once to open the camera. Then press again to take the photo. It’s not ideal, but not too far off the same process on a Pixel phone, for instance. Plus, you can long-press the iPhone’s button to start recording a video, and it’ll automatically stop when you let go.

Close up of the iPhone 16 Pro's rear cameras with greenery in the background.
Cherlynn Low for Engadget

This sort of rapid access to the camera is the best thing about the new button, and I could see it being potentially useful not just for shutterbugs like me, but for the upcoming Visual Intelligence feature that Apple teased at its launch event. The company’s version of Google Lens could allow people to ask questions about things in the real world around them. But of course, since this wasn’t available during my review period, I wasn’t able to test it.

For now, you can go into Settings to either change the number of clicks it takes to trigger the camera app, remap it to a Code scanner or the Magnifier tool or disable it altogether. Since you can also set up the Action button to do these things, you now have more choices over where you want your camera shortcut, and you can free up the Action button (the former mute switch) to do something else.

Even if you’re not a glutton for buttons, there are still some camera updates that might intrigue you. This year’s flagships sport what Apple calls a 48-megapixel Fusion Camera, which has a faster quad-pixel sensor. This enables what the company describes as “zero shutter lag,” which is wording it has used repeatedly over the years. In this case, it’s referring to how quickly the camera will capture a shot after you press the shutter button (onscreen or hardware).

I will admit I was initially confused by this update, in part because it requires relearning some behaviors I had adopted to mitigate the shortfalls of older cameras. Basically, the iPhone 16 Pro’s cameras are now so fast that when I asked someone to throw something so I could capture it in motion to see how still the images were, my shots ended up being of the person holding the object.

Our video producer and I were very confused, and it wasn’t until the “zero shutter lag” concept was explained to me more clearly that I got it. I had become used to pressing the shutter early since cameras, in my experience, would be fractions of a second slow. Apple’s camera is now so fast that it actually captured the literal moment I tapped the button, instead of the split second after, when the object was in mid-air.

A woman flinging two cushions to her left, in a sample photo demonstrating the iPhone 16 Pro's 48-megapixel Fusion Camera's speed.
Brian Oh for Engadget

This is going to change how people take jump shots, I’m sure, but basically if you and your friends are taking pictures of yourselves floating in the sky, the photographer doesn’t have to hit capture before telling you to jump. I know this is a very specific and silly example, but it’s also the most relatable illustration of how much quicker the Fusion camera is.

Also, why can’t camera stories be silly and fun? That’s what a lot of the best moments in life are, and some of the new features are great in those situations. The support for 4K video at 120 fps in Dolby Vision, for example, led to some beautiful high-quality, rich and colorful clips of my friend’s adorable Pomeranian trotting along on a walk. Her little tongue slowly peeking out as she bounded towards the camera looked crisp and smooth when I played it back at 25 percent and 20 percent speeds, too.

Depending on your mood, the new Photographic Styles can be fun or serious. Apple’s tweaked the built-in camera filters not only to offer more options but also to give you greater control. Because the company has refined its processing each year, there’s also an improved depth map captured when it detects a face in the scene. This, combined with a greater focus on color science around skin tone, has led to what might be my favorite new iPhone 16 feature.

Whether I shot them in Portrait mode or not, photos of people that I took using the iPhone 16 Pro were a dream to edit. Simply switching between the Standard, Natural, Luminous, Quiet or Ethereal styles already resulted in improvements to the colors and shadow, but I could also tap on each thumbnail to access the new editing touchpad and drag a dot around. This let me more precisely tweak the hues and contrast levels, and an additional slider below let me adjust how warm the image was.

A composite of four sample photos featuring a woman gazing into the camera, each with a different Photographic Style applied. A label at the bottom right of each image shows which Style is used and they are, from left to right, Standard, Ethereal, Luminous and Vibrant.
Cherlynn Low for Engadget

An ugly selfie with my cousin in the hideous overhead lights of a meeting room became a beautiful snapshot after I switched to the Ethereal or Luminous styles. Both of those are quickly becoming my favorites, but I’m more impressed with how well Apple was able to segment the subject from the background. In almost every shot I edited, adjusting the slider mostly only changed the background, keeping people and their complexions within the realm of reality instead of applying harsh oversaturation or extreme contrast levels to them. They also added a background blur that lent a pleasant soft focus effect, and most of the time the system accurately identified outlines of people in the scene.

Perhaps my favorite part is the fact that you can change between styles after you’ve shot the photo on the iPhone 16. As someone who dwells on her Instagram filters and edit tools for some time before each post, I definitely appreciate how much nicer Apple’s versions are and only wish I could retroactively apply them to photos I had taken at a recent wedding. Alas, since the edits are dependent on information captured when the photos were taken, these new retouching features will only work for pictures taken with an iPhone 16 or 16 Pro.

One final camera update I’ll touch on before telling you about actual photo quality is Audio Mix. This uses the spatial audio now recorded by default with the new studio mics on the iPhone 16 Pro (or even the system on the iPhone 16 and 16 Plus) to understand the direction of sound sources in your footage. Then, when you edit the clip, you can choose between Standard, In-frame, Studio and Cinematic mixes, as well as drag a slider to reduce background noise.

You’ll have to be recording in fairly specific acoustic scenarios to get the most out of Audio Mix. I tested it in a variety of situations: my cousin talking on his phone on a busy New York street, me interviewing my gym buddies after a tiring workout with background music quietly playing, and my friend talking to me while his wife discussed something else off-camera in their fairly quiet kitchen.

For the most part, going to Cinematic or Studio modes from Standard resulted in a noticeable reduction in environmental noise. My favorite is Studio, which generally seemed to improve voice clarity as well, making people sound like they could be talking on a podcast. In-frame, however, rarely did what I expected and occasionally produced some warped distortion. It appears there might need to be more distance between various sources of sound for this to work best, and I have to spend more time testing to better understand this tool. You can check out our review video for examples of a clip with different audio mixes, but for now, while the promised improvements aren’t what I expected, there at least appears to be some benefit to Audio Mix.

A composite of sample images from the iPhone 16 Pro and the Pixel 9 Pro.
Cherlynn Low for Engadget

On to the actual photos and how they hold up against the competition. I’ve long considered Google’s Pixel phones to be the gold standard in smartphone photography, since I prefer the company’s color and detail processing. I know some people feel that Google tends to oversharpen, so bear in mind that, as with most things, your preference may be different from mine.

When I compared photos I took with both phones on the same laptop screen, the differences were minimal. Occasionally, Google would expose better, being more able to retain shadows near a bright light source than the iPhone 16 Pro. But the Pixel’s nightscape shots had more light leakage into the sky, whereas Apple was more adept at keeping the background dark against the outline of a skyscraper.

Honestly, at this point we’re really nitpicking and pixel-peeping to find differences. Both companies deliver great cameras, and though I still prefer Google’s approach to Portrait shots, Apple has been slowly but surely closing the gap with improvements to its depth maps every year.

I will mention, though, that a lot more of the photos I shot on the iPhone 16 Pro came out blurrier than on the Pixel 9 Pro, and it might have to do with the fact that I was using Camera Control to snap them. This was the issue I alluded to earlier: using a physical button to take a picture is more likely to introduce shake than a software shutter. Samsung and Google phones aren’t immune to this problem either, but the way Camera Control is built, with a recessed button that depresses into the phone’s frame, leaves it a bit more vulnerable to shake than, say, a volume rocker.

A composite of sample images from the iPhone 16 Pro and the Pixel 9 Pro, featuring colorful motorcycle parts on a table.
Cherlynn Low for Engadget

Oh, and finally, a quick note for my Gen Z readers: I know how much you all prefer flash photography to night modes in low-light scenarios. (Thanks to my much younger cousin for the valuable insight.) I’ve done the testing and can say that I prefer the Pixel 9 Pro’s softer, warmer flash to the iPhone 16 Pro’s, which is stronger and brighter, leaving my face looking washed out.

It’s been about two months since the public beta for iOS 18 was released, and it was nice to get a taste of upcoming features like the new customizable home pages, expanded Tapback reactions and the redesigned Photos app. With the iPhone 16 launch, iOS 18 is basically ready for primetime… with some caveats.

This year, more than ever, it’s hard to figure out what’s coming to your iPhone and what isn’t. With the release of Apple Intelligence slated for October, features like writing tools, Clean Up for photos and the redesigned Siri won’t be ready till next month. And even then, your non-Pro iPhone 15 won’t be compatible.

Plus, some features that were teased at WWDC, like Genmoji, still haven’t been added to the iOS 18.1 developer beta, which is where most Apple Intelligence features have been arriving as a preview for app makers. Within the iPhone 16 lineup, too, there are things coming only to the Pro models, like multi-track recording in Voice Memos.

It’s confusing, and it can make choosing your iPhone a trickier decision. But for this review, at least, the iPhone 16 Pro and Pro Max are getting everything. I cannot wait to try out multi-track recording in Voice Memos, and I hope Apple sees this yearning as a sign that it should bring the feature to more devices.

It was nice to get time with iOS 18, even in the absence of Apple Intelligence. Honestly, I’m not even sure I’d like those features that much. In a similar way, Gemini AI was nice on the Pixel 9 Pro series, but it didn’t feel like a must-have.

One of the new iOS 18 touches I noticed immediately was the refreshed Control Center, which took some getting used to, as I had to relearn how to swipe back to the home page now that there are more pages to scroll through. I especially enjoyed seeing the new little chat bubble appear on my voice recordings, indicating that a transcript had been generated for them. And though I haven’t exchanged messages with Android-toting friends yet, I’m glad to see RCS support is finally live this week.

The bottom half of both the iPhone 16 Pro and iPhone 16 Pro Max standing on a table.
Brian Oh for Engadget

Though I was excited for the new custom routes tool in Maps, I struggled to actually create them. You can set your start and end points and have the app close the loop for you, or just tap landmarks or points on the map to get the route to basically connect the dots. Unfortunately, no matter how many times I tried to get the route to cut through a building where I knew a pedestrian walkway existed, Maps resisted me at every turn, forcing the route to go through more established (and therefore more crowded) paths instead. It’s not unreasonable, but certainly not the open-world route-creation feature I was envisioning.

The best thing about iOS 18, and about some new features in the iPhone 16 lineup (like Camera Control), is the customizability. I do appreciate that if you don’t like something, you can usually turn it off. With the new ability to place apps outside of a rigid grid, you can now lay your home screen out just the way you like. The redesigned Photos app lets you create and pin collections so you can more easily find the pictures most important to you. And again, I’m glad Apple is giving people the option to turn off Camera Control altogether or adjust its sensitivity.

The iPhone 16 Pro and Pro Max are powered by Apple’s A18 Pro chip, which is built on “second-generation 3-nanometer technology and [features] a new architecture with smaller, faster transistors.” All this is meant to deliver “unprecedented efficiency,” according to Apple’s press release.

Some small software glitches aside, I’ve never run into slowdown on the iPhone 16 Pro, but I was certainly surprised by the smaller handset’s battery life. In general, the iPhone 16 Pro would barely last a full day, which is reminiscent of the iPhone 15 Pro, too. It’s worth noting that before this review I was primarily using an iPhone 15 Pro Max as my daily driver, which usually gets through a day and a half with no problem, so the drop in endurance is even more pronounced for me.

Most days, I’d pick up the iPhone 16 Pro at about 9AM and would get to about 9PM before getting low battery alerts. If I started the day a bit later, closer to 11AM for instance, I got to 1AM before the iPhone 16 Pro ran completely dry. On Sunday, I unplugged the phone at about 9:30AM and was shocked on the train home to get a warning that remaining power was at just 20 percent. It was only 6:50PM, and the night had barely started!

You’ll get significantly better battery life on the iPhone 16 Pro Max, which delivers the same almost two-day runtime as its predecessor. And sure, a phone with a smaller battery not lasting as long makes mathematical sense. But considering the Pixel 9 Pro is a comparably sized handset and manages to last about two days, there’s no excuse for the iPhone 16 Pro to conk out before the night is up.

A white iPhone 16 Pro and a desert iPhone 16 Pro Max standing on a table.
Brian Oh for Engadget

One of the best things about the iPhone 16 Pro lineup is that, unlike last year, there isn’t much of a tradeoff in cameras if you opt for the smaller device. The iPhone 15 Pro Max had a 5x telephoto zoom camera, while the iPhone 15 Pro only went up to 3x. As a budding photographer of skittish wild animals, I opted for the Max, especially since it was much lighter than its predecessor thanks to the titanium build.

With the iPhone 16 Pro having essentially the same camera system as the Pro Max, I thought it was time for me to go back to a size that was much easier on my hands. Alas, with the disappointing battery performance, I might just have to stick with a Max, and you might too.

There are also the non-Pro iPhone 16 models to consider, and you can check out my colleague Billy Steele's review of the iPhone 16 and iPhone 16 Plus for more details. Just as there are fewer differences than ever between the Pro and Pro Max, the tradeoffs between the two lines aren’t as significant this year, either. Apple brought the previously Pro-exclusive Action button to the iPhone 16 and iPhone 16 Plus, while also including Camera Control on its less-premium phones.

The main things that set the two lines apart this year are processors, screen quality, camera sensors and onboard mics. You’ll lose support for ProRAW photos and multi-track recording by opting for the cheaper devices, too. Basically, if you want all the best features Apple has to offer, or you plan on using your phone to create high-quality videos and get 5x telephoto zoom in your photos, the Pros are the way to go.

Otherwise, you’ll still have all the iOS 18 and Apple Intelligence features coming to the Pros, as well as spatial audio recording, which enables the Audio Mix I described in the camera section earlier.

The iPhone 16 Pro and iPhone 16 Pro Max held in mid air with their backs facing up.
Cherlynn Low for Engadget

Apple’s caution is sometimes warranted. Especially at a time when mistrust of AI-generated content runs rampant, the company taking its time to get Apple Intelligence right is understandable. But its deliberation doesn’t always lead to winners. While I appreciate the attempt to differentiate Camera Control with the touch sensor for more versatility, I’m not yet convinced of its usefulness.

The good news is, and I cannot stress this enough, you have the option to tune it to your liking. And that’s a theme I’m seeing in recent Apple features that hints at more thoughtfulness than usual. If you don’t like something, or if something isn’t right for your needs, you can adjust or disable it. In iOS 18, you have greater control over your home screen’s app layout and can pin custom collections for easier reach in the Photos app. The Action button introduced last year could have been a spectacular fail had Apple not let you keep it as a mute switch, but it managed to give people more functionality while maintaining the status quo for those resistant to change.

Change is scary. Change is hard. But without change there is no progress. Apple’s cautious approach is a tricky balancing act that’s evident on the iPhone 16 Pro. Some new features, like Audio Mix and custom routes in Maps, deliver mixed results. Others, like Photographic Styles, are hits. Then there are the basic ingredients, like good battery life and durable, attractive designs, that Apple cannot neglect.

The iPhone 16 Pro’s subpar battery life holds it back from beating the competition, which is stiffer than ever this year, especially from Google. Luckily for Apple, most people who have iPhones are going to stick with iPhones — it’s just easier. For those already sucked into the ecosystem, the iPhone 16 Pro (and particularly the Pro Max) are worth the upgrade from a model that’s at least two years old. If you already have an iPhone 15 Pro (or even a 14 Pro), for the sake of our planet and your wallet, you might prefer to hold off on upgrading, especially since this year’s devices aren’t that much different.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/iphone-16-pro-and-pro-max-review-apple-focuses-on-cameras-and-customization-120052459.html?src=rss

Apple halts iPadOS 18 update for M4 iPad Pro after bricking reports

Apple has temporarily paused the rollout of iPadOS 18 for M4 iPad Pro models, some of the most expensive iPads that the company sells, after some users complained that the update bricked their devices. Apple acknowledged the issue in a statement to Engadget, saying, “We have temporarily removed the iPadOS 18 update for M4 iPad Pro models as we work to resolve an issue that is impacting a small number of devices.”

The issue first came to light through Reddit, where a growing number of M4 iPad Pro users described how their iPads became unusable after they tried installing the latest version of iPadOS. “At some point during the update my iPad turned off, and would no longer turn on,” a user named tcorey23 posted on Reddit. “I just took it to the Apple Store who confirmed it’s completely bricked, but they said they had to send it out to their engineers before they can give me a replacement even though I have Apple care.”

Another Reddit user called Lisegot wrote that the Apple Store they took their bricked M4 iPad Pro to did not have a replacement in stock, which meant they would need to wait five to seven days for a working iPad. “No one was particularly apologetic and they even insinuated that there was no way for them to know whether the update caused this,” they wrote.

Having a software bug brick an iPad is rare. Ars Technica, which first reported this story, pointed out that iPads can typically be put into recovery mode if a software update goes bad.

If you own an M4 iPad Pro, Apple will no longer offer you iPadOS 18 until it fixes the issue. It’s not clear when it will be fixed.

This article originally appeared on Engadget at https://www.engadget.com/mobile/tablets/apple-halts-ipados-18-update-for-m4-ipad-pro-after-bricking-reports-000258237.html?src=rss

Apple Music brings its audio haptics feature to all users as part of iOS 18

Apple’s Music Haptics feature is now live as part of the official release of iOS 18. This is an accessibility tool that integrates with Apple Music on iPhones. Simply put, it uses the phone’s vibration hardware, which the company calls the Taptic Engine, to play “taps, textures and refined vibrations to the audio of the song.”

This is quite obviously aimed toward those affected by hearing loss, allowing them to feel the music. It works with Apple Music, but also with Apple Music Classical and Shazam. The company says it’ll also integrate with some third-party apps, so long as the iPhone is connected to Wi-Fi or cellular. 

To get started, just head into the Accessibility settings menu and turn on “Music Haptics.” An easily identifiable logo will appear on the Now Playing screen in the Apple Music app when activated. Tapping this logo will pause the feature and tapping it again will turn it back on. Music Haptics is supported globally on iPhone 12 and later, as long as the device is updated to iOS 18.
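
Apple hasn’t said exactly how Music Haptics converts a song into vibrations, but the Taptic Engine it drives is the same hardware any app can reach through the Core Haptics framework. Purely as an illustrative sketch, and not Apple’s actual pipeline, here’s how the two building blocks that description mentions, a sharp “tap” and a softer sustained “texture,” are played in code:

```swift
import CoreHaptics

// Illustrative only: plays one sharp "tap" followed by a soft "texture."
// This is generic Core Haptics usage, not the Music Haptics implementation.
func playBeat() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // A transient event is a short, crisp tap.
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0
    )
    // A continuous event is a sustained, softer rumble.
    let texture = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
        ],
        relativeTime: 0.05,
        duration: 0.4
    )

    let pattern = try CHHapticPattern(events: [tap, texture], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

Scale that idea up to patterns timed against an entire track and you get the general shape of the effect, though, again, Apple hasn’t documented its actual approach.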

To commemorate the launch, Apple Music has released a series of playlists that take advantage of the haptic technology. These channels have names like Haptics Beats and Haptics Bass, and they’re filled with songs that offer plenty of opportunities for taps and vibrations.

People have already been experimenting with the feature. Some users have suggested that it “sounds like an Atari game” when a phone is placed on a box with Music Haptics turned on. I don’t agree but, well, listen for yourself.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/apple-music-brings-its-audio-haptics-feature-to-all-users-as-part-of-ios-18-184753345.html?src=rss

AirPods Pro 2’s new features have arrived. Here’s what to expect

Prior to iOS 18's arrival, Apple released a firmware update for the AirPods Pro 2 that will deliver new features the company announced at WWDC in June. Now that the latest version of the mobile OS is available, your iPhone can fully employ the new tools, which include Siri Interactions, Voice Isolation and more. Your AirPods Pro 2 should have already installed the update and be ready to go when you upgrade to iOS 18, so here's what to expect when you use the new features. 

Siri Interactions let you respond to Siri at times when you can't or don't want to speak or reach for your phone. Machine learning on the H2 chip and transformer models on a source device (iPhone, iPad, Mac and Apple Watch) can detect when you nod affirmatively or shake your head. This can be used any time Siri asks a yes-or-no question, like accepting or rejecting calls, responding to or dismissing messages and engaging with or dismissing notifications.

So far, Siri Interactions have worked as described for me. I like that the tech recognizes smaller head movements, so you don't have to exaggerate them to get the system to respond. I've found the feature most helpful for incoming calls and texts, especially when my hands are full or when I'm in a setting where I can't immediately speak.

Voice Isolation is a new feature that taps the AirPods Pro 2 H2 chip and the source device (iPhone, iPad or Mac) for advanced machine learning to enhance how you sound on calls. The tech isolates your voice so it can effectively cancel significant amounts of background noise, and for some distractions, it will eliminate them entirely. During my tests, Voice Isolation totally blocked a noisy fan and running water. It's truly impressive how the roar that's otherwise obvious on a call is completely absent when this is enabled. It's also great that the tool works its magic with minimal impact to overall voice quality.

AirPods Pro (2022) review
Billy Steele/Engadget

The feature is enabled automatically in your microphone settings, where you'll find options for Automatic, Standard and Voice Isolation. Here, you can activate Voice Isolation while you're on a call if you don't want the system to handle things on its own. The tool will also be supported in FaceTime and any third-party apps that use CallKit. Those include WebEx, Zoom, WhatsApp and many more. 
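
For the curious, Voice Isolation is a system-level microphone mode rather than a per-app API, which is why it shows up across CallKit apps automatically: users pick the mode from Control Center, and apps can only read the current mode or surface the system picker. A minimal AVFoundation sketch of that check (the function name is my own invention):

```swift
import AVFoundation

// Sketch only: apps can't force Voice Isolation on. They can read the
// system microphone mode (meaningful while audio capture is active) and
// present the Control Center mic-mode picker to the user.
func promptForVoiceIsolationIfNeeded() {
    if AVCaptureDevice.activeMicrophoneMode == .voiceIsolation {
        print("Voice Isolation is already active")
    } else {
        // Brings up the system "Mic Mode" UI over the app.
        AVCaptureDevice.showSystemUserInterface(.microphoneModes)
    }
}
```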

As a reminder, Siri Interactions and Voice Isolation are also available on the AirPods 4.

In addition to those two headliners, the update equips the AirPods Pro 2 with "the best wireless audio latency Apple has ever delivered for mobile gaming." What's more, gamers can expect improved voice quality, thanks to 16-bit, 48kHz audio when chatting during sessions. Apple says it also improved Personalized Volume on the AirPods Pro 2, but didn't go into specifics there. Personalized Volume is the tool that adjusts the media levels on your AirPods Pro 2 based on changes in environmental conditions and your volume preferences. Apple says the feature learns your listening preferences over time to fine-tune adjustments as they're needed.

One of the biggest announcements from the iPhone 16 event was Apple's plan to turn the AirPods Pro 2 into a set of over-the-counter hearing aids for people with mild to moderate hearing loss. While the company has received FDA approval for the first software-based hearing aid solution that will be available without a prescription, the feature and the accompanying Hearing Test aren't ready just yet. Apple is planning to release the suite of hearing features as part of an update sometime this fall. 

The AirPods Pro 2 update is available for free over the air from your iPhone. You can check the version number under the AirPods settings when the earbuds are connected to an iOS device. You'll want to look for 7A294 to be sure you're running the latest version. If not, you can trigger the update by listening to music for around 30 seconds and then putting the AirPods Pro back in the case. If you notice that the earbuds don't immediately disconnect on the Bluetooth menu, that means the update is happening, so keep the case closed and near your phone until it completes. AirPods Pro will disconnect when the process is over. You'll need to make sure your iPhone is updated to iOS 18 as well. 

This article originally appeared on Engadget at https://www.engadget.com/audio/headphones/airpods-pro-2s-new-features-have-arrived-heres-what-to-expect-172023882.html?src=rss

Apple has released iOS 18. Here’s how to update your iPhone

Finally out of beta, iOS 18 became publicly available on Monday afternoon. You can download and install it if your device is compatible, and it comes preinstalled on all iPhone 16, iPhone 16 Plus and iPhone 16 Pro models, which will be available on September 20. Those with eligible devices can update by going to Settings > General > Software Update and starting the download and installation process.

To see if your device is eligible, we have a list of iPhone models that can support iOS 18. Check it out and see if yours will work.

Some of the “hidden” features our editor Cherlynn spotted include Apple Maps upgrades, Calendar integration with Reminders and expanded Tapback options in Messages, letting you see who reacted with which emoji. Safari is getting a “Highlights” function, which generates a summary of web pages you’re on via machine learning. Our UK bureau chief Mat Smith also tried out some early iOS 18 features in July, and his main takeaway was that Apple Intelligence is the real star. Unfortunately, Apple Intelligence isn’t out today, but its first features will become available in October as part of a subsequent update.

Besides iOS 18, all of Apple's other major operating system updates are available as well. That includes iPadOS 18, visionOS 2, macOS Sequoia, tvOS 18 and watchOS 11, all of which are coming to their respective devices today. Make sure to check if your devices are eligible for the update and that they have enough space. You may have to free up a few gigabytes of storage first.

Update, September 16, 8:17PM ET: Added more complete list of additional Apple OS updates that are now available, and additional context about Apple Intelligence (some, not all, of the features are arriving beginning in October).  

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/apple-has-released-ios-18-heres-how-to-update-your-iphone-171444043.html?src=rss