macOS Sequoia review: iPhone mirroring is more useful than you think

Apple's macOS updates have been so dull lately that the most interesting part of last year's macOS Sonoma ended up being widgets. Widgets! Thankfully, macOS Sequoia has a lot more going on — or at least it will, once Apple Intelligence rolls out over the next few months. For now, though, Sequoia delivers a few helpful features like iPhone Mirroring, a full-fledged Passwords app and automatic transcription in the Notes app. At the very least, it's got a lot more going on than widgets.

Heading into WWDC earlier this year, I was hoping that Apple would let Vision Pro users mirror their iPhones just as easily as they can mirror their Macs. Well, we didn't get that, but iPhone Mirroring on macOS Sequoia is close to what I'd want on the Vision Pro. Once you've got a Mac (with an Apple Silicon chip, or one of the last Intel models with a T2 security chip) running the new OS, as well as an iPhone running iOS 18, you can easily pair the two using the iPhone Mirroring app.

Once that connection is made, you'll see a complete replication of your phone within the app. It took me a few minutes to get used to navigating iOS with a trackpad and keyboard (there are a few new hotkeys worth learning), but once I did, I had no trouble opening my usual iPhone apps and games. If you're spoiled by the 120Hz ProMotion screen from an iPhone Pro, you'll notice that the mirrored connection doesn't look nearly as smooth, but from my testing it held a steady 60fps throughout games and videos. I didn't notice any annoying audio or video lag either.

macOS Sequoia
Apple

While it's nice to be able to launch my iPhone from my Mac, I was surprised at what ended up being the most useful aspect of this feature: Notifications. Once you've connected your phone, its alerts pop up in your Mac's Notification Center, and it takes just one click to launch the app it's tied to. That's useful for alerts from Instagram, DoorDash and other popular apps that have no real Mac options, aside from launching their websites in a browser.

iPhone Mirroring is also a sneaky way to get in a few rounds of Vampire Survivors during interminably long meetings or classes. (Not that I would ever do such a thing.) While many mobile games have made their way over to the Mac App Store, there are still thousands that haven't, so it's nice to have a way to access them on a larger screen. Not every game works well on Macs — it's just tough to replicate a handheld touchscreen experience with a large trackpad — but mirroring is a decent option for slower-paced titles. I didn't encounter any strange framerate or lagging issues, and sound carried over flawlessly as well.

I typically have my phone within reach, even when I'm working at a desk. But picking it up would inevitably disrupt my workflow — it's just far too easy to get a notification and find yourself scrolling TikTok or Instagram, with no memory of how you got there. With iPhone Mirroring, I can just keep working on my Mac without missing any updates from my phone. It's also been useful when my iPhone is connected to a wireless charger and I desperately need more power before I run out of the house.

If you're the sort of person who leaves your phone around your home, I'd bet mirroring would also be helpful. The feature requires having both Bluetooth and Wi-Fi turned on, and the connection range is around 50 feet, or what I'd expect from Bluetooth. Thick walls and other obstructions can also reduce that range significantly. In my testing, I could leave my iPhone in my backyard and still be able to mirror it in my living room 40 feet away. Naturally, the further you get, the choppier the experience.

Sure, Apple isn't the first company to bring smartphone mirroring to PCs. Samsung and other Android phone makers have been offering it for years, and Microsoft also has the "Phone Link" app (formerly Your Phone) for mirroring and file syncing. But those implementations differ dramatically depending on the smartphone you're using, they don't seamlessly integrate notifications and, simply put, they often failed to connect. Once you set up iPhone Mirroring, getting into your phone takes just a few seconds. It just works. And after testing the feature for weeks, I haven't run into any major connection issues.

Apple macOS Sequoia
Photo by Devindra Hardawar/Engadget

It's 2024 and Apple has finally made it easier to position Mac windows around your monitor. Now you can drag apps to the sides or corners of your screen, and they'll automatically adjust themselves. It's allowed me to quickly place a browser I'm using for research alongside an Evernote window or Google Doc. Similar to Stage Manager in macOS Ventura, the tiling shortcuts are a significant shift for Mac window management.

And, of course, they're also clearly similar to Windows 10 and 11's snapping feature. Given that much of Apple's UI focus is on iOS, iPadOS and VisionOS these days, it's easy to feel like the Mac has been left behind a bit. I don't blame Apple for cribbing Microsoft's UI innovations, especially when it makes life easier for Mac users.

Apple macOS Sequoia
Photo by Devindra Hardawar/Engadget

Apple has offered lighting adjustments and portrait background blurring in video chats for years, and now it's using that same machine learning technology to completely replace your backgrounds. Admittedly, this isn't a very new or exciting feature. But it's worth highlighting because it works across every video chat app on your Mac, and since it's relying on Apple's Neural Engine, it looks much better than software-based background replacements.

Apple's technology does a better job of keeping your hair and clothes in focus while still separating them from artificial backgrounds. And best of all, it doesn't look like a cheap green screen effect. You can choose from a few color gradients, shots of Apple Park or your own pictures or videos.

Here are a few other upgrades I appreciated:

  • The Passwords app does a decent job of collecting your stored passwords, but it's clearly just a first attempt. It's not nearly as smart about plugging my passwords into browser fields as apps like 1Password and LastPass.

  • The Notes app now lets you record voice notes and automatically transcribes them. You can also continue to jot down text during a voice recording, making it a useful way to keep track of interviews and lectures. I'm hoping future updates add features like multi-speaker detection.

  • Being able to jot down math equations in Notes is cool, but it's not something I rely on daily. I'm sure it'll be very useful to high school and college kids taking advanced math courses, though.

  • Messages finally gets rich text formatting and a send later option. Huzzah!

You’d be forgiven for completely ignoring the last batch of macOS updates, especially if you haven’t been excited about Stage Manager or, sigh, widgets. But if you’re a Mac and iPhone owner, Sequoia is worth an immediate upgrade. Being able to mirror your iPhone and its notifications is genuinely useful, and it’s stuffed with other helpful features. And of course, if you want to get some Apple Intelligence action next month, you’ll have no choice but to upgrade. (We’ll have further impressions on all of Apple’s AI features as they launch.)

Sure, it’s a bit ironic that Apple’s aging desktop OS is getting a shot of life via its mobile platform, but honestly, the best recent Mac features have been directly lifted from iOS and iPadOS. It’s clear that Apple is prioritizing the devices that get updated far more frequently than laptops and desktops. I can’t blame the company for being realistic – for now, I’m just glad it’s thoughtfully trying to make its devices play nice together. (And seriously, just bring iPhone mirroring to the Vision Pro already.)

This article originally appeared on Engadget at https://www.engadget.com/computing/macos-sequoia-review-iphone-mirroring-is-more-useful-than-you-think-140008463.html?src=rss

Google TV Streamer goes on sale today with home panel, sports page and more

Last month the Google TV Streamer was announced as a replacement for the Chromecast line, and it's arriving in stores today for $100. As part of that, Google is bringing TV Streamer features like the smart home panel and AI tools to Google TVs from Hisense and others.

A key feature is the previously announced home panel that's now coming to Google TVs as well. You can see and control all compatible smart home devices (lights, thermostats, cameras, etc.) directly on your TV, either on screen or with your voice via Google Assistant. New doorbell notifications also show you who's at the front door without pausing your program.

Google also announced a few new features we haven't seen yet. If you want to use your TV as a picture frame, the Ambient screensaver displays Google Photos and even lets you create AI-generated designs through a series of prompts. The latter is a pretty novel feature, if you don't mind looking at soulless machine-created art.

Google TV Streamer arrives and its features are coming to all Google TVs
Google

Speaking of AI, Google is adding "enhanced" Gemini-created AI overviews (above) for popular movies and series on streaming services like HBO. "These overviews include full summaries, audience reviews and season-by-season breakdowns" to help you choose a show, Google said. 

Other new features include a new sports page in the For You tab that puts games, YouTube highlights, commentary and more in one place. Google also added a guide for its live TV Freeplay service (previously soft-launched) so you can better keep track of its 150 or so free channels. The new features are set to roll out today on the Google TV Streamer and smart Google TVs from Hisense and TCL, or projectors from Epson and XGIMI.

This article originally appeared on Engadget at https://www.engadget.com/entertainment/streaming/google-tv-streamer-goes-on-sale-today-with-home-panel-sports-page-and-more-140005554.html?src=rss

The Apple Watch Ultra 2 drops to a record-low price ahead of October Prime Day

The Apple Watch Ultra 2 is the high-end option in the company's smartwatch lineup, meaning it costs a pretty penny. Ahead of October Prime Day, however, the Apple Watch Ultra 2 has dropped to a record-low price of $689, down from $799. The 14 percent discount is available on models with a Rugged Titanium Case for small to large wrists. This includes watches fitted with the Blue Alpine Loop, Indigo Alpine Loop, Orange Ocean Band and more. 

Apple released the Ultra 2 in 2023 and, despite rumors, has yet to announce a successor. Our review gave the Ultra 2 an 85 thanks to features like its long battery life. It's rated for about 36 hours, though in our testing we could easily stretch it to three days without a recharge. Apple also claims it can reach about 60 hours with Low Power Mode enabled, a setting that turns off the Always On Display and features like cellular connections and heart rate notifications.

We named the Ultra 2 the best Apple Watch for adventurers due to its sizable battery life, compass app, water temperature gauge and loud onboard siren in case you get into trouble. Plus, it can set waypoints and offers the Wayfinder watch face in Night mode. One of our quibbles with the watch, though, was that it was too easy to accidentally press the action button instead of the crown.

Follow @EngadgetDeals on Twitter for the latest tech deals and buying advice in the lead up to October Prime Day 2024.

This article originally appeared on Engadget at https://www.engadget.com/deals/the-apple-watch-ultra-2-drops-to-a-record-low-price-ahead-of-october-prime-day-133416959.html?src=rss

See the iPhone 16’s game-changing battery removal process in new iFixit teardown

Apple introduced some major repairability improvements with the iPhone 16 lineup, but nothing stands out as much as the new battery removal process for the base iPhone 16. Doing away with the usual pull tabs, Apple is using an adhesive that debonds in response to a low electrical current. It only takes about a minute and a half for it to come unstuck, per Apple’s repair guide. A teardown by iFixit shows the process in action, and it sure looks easier than ever. iFixit tech Shahram Mokhtari said, “I’m not sure we’ve ever had a battery removal process go so cleanly and smoothly.”

Only the iPhone 16 and iPhone 16 Plus have the new adhesive, and they’ve earned a 7/10 on iFixit’s repairability scale. “Apple definitely seems to be leveling up on repairability,” Mokhtari said, adding that Apple has “landed another repairability win” with this year’s base iPhones thanks to the new battery removal procedure.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/see-the-iphone-16s-game-changing-battery-removal-process-in-new-ifixit-teardown-213911136.html?src=rss

Jony Ive confirms he’s working with Sam Altman on a secret project

Rumors emerged last year of a collaboration between former Apple designer Jony Ive and OpenAI CEO Sam Altman, but the two have until now kept quiet about it. In a profile by The New York Times that was published this weekend, though, Ive confirms his company LoveFrom is leading the design on an AI product being built with Altman. Also on board are Tang Tan and Evans Hankey, both of whom held big design roles at Apple.

So far, a team of about 10 employees is involved with the project, based in a San Francisco office building that’s one of several properties Ive has purchased on a single city block, according to the Times. But we still don’t know much about the product they’re working on. The report describes Tan and Hankey wheeling chairs between the LoveFrom properties that were “topped by papers and cardboard boxes with the earliest ideas for a product that uses A.I. to create a computing experience that is less socially disruptive than the iPhone.”

Since Ive left Apple in 2019 to start LoveFrom, the design firm has worked on a typeface and even a red clown nose, but we haven’t seen much in the way of hardware — just that $60,000 turntable. While an AI product seems to be on the horizon, there’s currently no timeline for when it’ll make its debut.

This article originally appeared on Engadget at https://www.engadget.com/ai/jony-ive-confirms-hes-working-with-sam-altman-on-a-secret-project-163201291.html?src=rss

Early Prime Day deals include the Pixel Buds A-Series for only $64

Amazon’s next Prime Day event is right around the corner, and the deals have already started trickling in. Google’s Pixel Buds A-Series earbuds have dropped down to just $64 from their normal price of $99. The A-Series, released in 2021, was Google’s more budget-friendly version of its 2020 Pixel Buds. They lack more advanced features like wireless charging and active noise cancellation, but the sound quality and battery life are decent for the price. If you’re an Android user looking for a good pair of earbuds that won’t break the bank, you might want to check these out.

The Pixel Buds A-Series may be a few years old now, but it’s still a nice pair of earbuds. We gave the Pixel Buds A-Series a score of 84 in our review when the model was first released, and were especially impressed with the sound quality, Google Assistant integration and comfort. The A-Series buds have a small “stabilizer arc” appendage to help them sit securely in the ears. There are some on-board controls, including play/pause, answer call and skip tracks, but they don’t have physical volume controls — for that, you’d need to use Google Assistant or adjust the volume on your device.

Google says the Pixel Buds A-Series earbuds get about five hours of listening time on a charge, or 2.5 hours of talk time. With the charging case, listening time goes up to about 24 hours. With the current deal, you can get the Pixel Buds A-Series earbuds in Clearly White or Dark Olive for $35 off the usual price. For a dollar more, you can grab them in Charcoal. (The pale blue Sea color option unfortunately isn’t covered in the discount). At $64, the Pixel Buds A-Series is almost at a record low, and cheaper even than during Prime Day in July.


This article originally appeared on Engadget at https://www.engadget.com/deals/early-prime-day-deals-include-the-pixel-buds-a-series-for-only-64-152116488.html?src=rss

The Arc browser that lets you customize websites had a serious vulnerability

One of the features that separates the Arc browser from its competitors is the ability to customize websites. The feature, called "Boosts," allows users to change a website's background color, switch to a font they prefer or one that's easier to read, and even remove unwanted elements from the page entirely. Those alterations aren't supposed to be visible to anyone else, but users can share them across their own devices. Now, Arc's creator, the Browser Company, has admitted that a security researcher found a serious flaw that would've allowed attackers to use Boosts to compromise their targets' systems.

The company used Firebase, which the security researcher known as "xyzeva" described as a "database-as-a-backend service" in their post about the vulnerability, to support several Arc features. For Boosts in particular, it's used to share and sync customizations across devices. In their post, xyzeva showed how the browser relies on a creator's identification (creatorID) to load Boosts on a device. They also showed how someone could change that element to their target's identification tag and assign the target Boosts that the attacker had created.

If a bad actor makes a Boost with a malicious payload, for instance, they can just change their creatorID to the creatorID of their intended target. When the victim then visits the website on Arc, they could unknowingly download the hacker's malware. And as the researcher explained, it's pretty easy to get user IDs for the browser. A user who refers someone to Arc shares their ID with the recipient, and if they themselves signed up via a referral, the person who sent it has their ID too. Users can also share their Boosts with others, and Arc has a page of public Boosts that contains the creatorIDs of the people who made them.
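
The flaw xyzeva describes amounts to a classic insecure direct object reference: the backend trusted a client-supplied creatorID instead of the authenticated user's own identity. Here's a minimal sketch of that pattern in Python, using hypothetical names and structure rather than Arc's or Firebase's actual code:

```python
# Hypothetical sketch of the vulnerable pattern xyzeva describes (an
# insecure direct object reference), not Arc's or Firebase's real code.
BOOSTS = {}  # maps (creator_id, site_url) -> customization payload

def save_boost(session_user_id, creator_id, site_url, payload):
    # VULNERABLE: creator_id comes from the client and is never checked
    # against the authenticated session, so an attacker can file a
    # malicious Boost under any victim's ID.
    BOOSTS[(creator_id, site_url)] = payload

def load_boosts(session_user_id, site_url):
    # The victim's browser loads whatever is stored under their own ID,
    # including a payload someone else planted there.
    return BOOSTS.get((session_user_id, site_url))

def save_boost_fixed(session_user_id, creator_id, site_url, payload):
    # FIX: derive ownership from the authenticated session instead of
    # trusting the client-supplied ID.
    if creator_id != session_user_id:
        raise PermissionError("cannot write Boosts for another user")
    BOOSTS[(creator_id, site_url)] = payload
```

With the vulnerable version, an attacker who knows a victim's ID can store a payload that the victim's browser will load the next time they visit the site; the fixed version rejects the write outright.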

In its post, the Browser Company said xyzeva notified it about the security issue on August 25 and that it issued a fix a day later with the researcher's help. It also assured users that nobody exploited the vulnerability and that no users were affected. The company has since implemented several security measures to prevent a similar situation, including moving off Firebase, disabling JavaScript on synced Boosts by default, establishing a bug bounty program and hiring a new senior security engineer.

This article originally appeared on Engadget at https://www.engadget.com/cybersecurity/the-arc-browser-that-lets-you-customize-websites-had-a-serious-vulnerability-133053134.html?src=rss

Twitch will do a better job of telling rulebreakers why their accounts were suspended

TwitchCon San Diego is taking place this weekend and, as always, the platform had some news to share during the opening ceremony. For one thing, Twitch CEO Dan Clancy said the service will offer streamers and viewers who break the rules more clarity over why their accounts were suspended.

Soon, Twitch will share any chat excerpt that led to a suspension with the user in question via email and the appeals portal. Eventually, this will expand to clips, so streamers can see how they were deemed to have broken the rules on a livestream or VOD. "We want to give you this information so that you can see what you did, what policies were violated, and if you feel our decision was incorrect, you can appeal," Twitch wrote in a blog post.

The service is also aware that permanent strikes on an account can pose a problem for long-time streamers who may eventually get banned for a smaller slip up. To that end, Twitch is bringing in a strike expiration policy starting in early 2025. "Low-severity strikes will no longer put streamers’ livelihoods at risk, but we’ll still enforce the rules for major violations," Twitch said. "Plus, we’re adding more transparency by showing you exactly what led to a strike."

On the broadcasting front, viewers of streamers who are using Twitch's Enhanced Broadcasting feature will be able to watch streams in 2K starting early next year. This option will be available in select regions at first, with Twitch planning to expand it elsewhere throughout 2025. Also of note, Clancy said that "we're working on 4K."

Also coming in 2025 is the option for those using Enhanced Broadcasting to stream vertical and landscape video at the same time. The idea here is to offer viewers an optimal experience depending on which device they're using to watch streams.

Elsewhere, Twitch is planning some improvements to navigation in its overhauled mobile app, such as letting you access your Followed channels with a single swipe and prioritizing audio from the picture-in-picture player. Streamers will have access to a feature called Clip Carousel, which will highlight the best clips from their latest stream and make them easy to share on desktop and mobile. The platform says it'll be easier for viewers to create clips on mobile devices too.

In addition, Twitch will roll out a shared chat option in the Stream Together feature next week, allowing up to six creators who are streaming together to combine their chats. Streamers' mods will be able to moderate all of the messages in a shared chat and time out or ban anyone who crosses a line. Creators who hop on a Stream Together session can also turn off Shared Chat for their own community.

Last but not least, Twitch will expand its Unity Guilds and Creator Clubs. The idea behind both is to help streamers forge connections, learn from each other and grow with the help of Twitch staff. Over the last year, Twitch has opened up the Black Guild, Women’s Guild and Hispanic and Latin Guild, and it just announced a Pride Guild for the LGBTQIA+ community. All four guilds will expand to accept members from around the world next year.

Creator Clubs are a newer thing that Twitch debuted last month for the DJ and IRL categories. Twitch says that engagement has been higher than expected. Four more Creator Clubs are coming soon for the Artists/Makers, Music, VTubers and Coworking/Coding categories.

This article originally appeared on Engadget at https://www.engadget.com/entertainment/streaming/twitch-will-do-a-better-job-of-telling-rulebreakers-why-their-accounts-were-suspended-191502111.html?src=rss

‘We’ve got to make it happen’: How Apple designed AirPods 4 for effective ANC

The AirPods story actually begins with the iPod.

With Apple’s popular personal music player, the company shipped its first set of earbuds. Sure, they were wired and very basic, but the accessory laid the groundwork for what would eventually become AirPods. Along the way, Apple’s earbuds were bundled with the iPhone starting in 2007, and the 2012 EarPods redesign produced something more akin to what would eventually become the first-gen AirPods in 2016. The work the company did to improve the fit of EarPods continues to pay off as Apple prepares to ship the noise-canceling AirPods 4.

“We had started trying to learn a bit about human physiology and what shapes would fit better in people's ears,” Apple’s Vice President of Hardware Engineering Kate Bergeron told me about those early days. “We started doing some MRI scans and trying to figure out how to gather data, but we didn't have a sense of how many scans we'd be looking for, or how many different kinds of ears we needed.”

Over the years, Apple has developed more efficient methods for gathering data, so it was able to build out its database of ear shapes quicker than in the early days of EarPods. Bergeron explained that she expects the company to be “continuing that journey” forever when it comes to developing new versions of AirPods.

The AirPods 4 have a smaller case than the third-gen version.
Billy Steele for Engadget

During what Bergeron described as “the dark days of COVID,” a small group from the AirPods team was trying to solve a dilemma. They wanted to bring effective active noise cancellation (ANC) to the open design of the “regular” AirPods. The crew had already successfully done so on two models of the AirPods Pro and on the AirPods Max headphones. But this time around, it was essential that the open nature of the AirPods remain while also providing the technology to block out distractions.

So in 2021, over the course of several days, Bergeron and AirPods marketing director Eric Treski met up at one of Apple’s acoustic labs for a demo. At that point, the team was unsure if they had anything viable, but they wanted the executives’ feedback on it nonetheless.

“We were just blown away,” Bergeron recalled. “We said ‘we absolutely have something here, we need to go after this and we’ve got to make it happen.’” Acoustic and computational work that was required for an effective ANC algorithm was happening simultaneously with iterations on improving the fit and overall comfort for the AirPods 4.

After testing the AirPods 4, I can say that the fit and comfort have improved since the third-gen model. But Apple also expanded the earbuds’ capabilities with the H2 chip and microphones from the AirPods Pro 2. This combination of advanced tech enables Apple to continuously monitor fit in a user’s ear, updating the ANC algorithm in real time so that the noise blocking is still effective even as the AirPods move around.

“It’s even computationally more intense in many ways than it is with the AirPods Pro,” Bergeron said. “The ear tip gives you a fit that’s pretty consistent.”

Apple’s journey with ANC began with the development of the first-generation AirPods Pro that debuted in 2019. Effective active noise cancellation was usually more common on over-ear headphones, with a few exceptions, but Apple realized that making a distraction-free listening experience “pocketable” was attractive to its users. Of course, the company would follow up with its own headphones, the AirPods Max, before the powerful second-gen AirPods Pro.

Treski explained that the ANC setup, or the third generation of Adaptive EQ as he described it, is constantly managing and adjusting any equalizers for both active noise cancellation and audio quality at the same time – and in real time. So in addition to the revised shape, the acoustic architecture of the AirPods 4 is also instrumental in providing effective ANC on the open earbuds.

“It’s really, really hard to create this great ANC quality in a non-ear-tip product,” he said. “The power of the H2 allows that, so we’re actually doing a lot with the H2 chip to manage ANC quality and listen from the mics for environmental noise to make sure we’re canceling as much as possible.”
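
The mic-driven cancellation Treski describes is conventionally built on adaptive filtering, where filter weights are continually updated to minimize the residual noise at the ear. Below is a toy least-mean-squares (LMS) sketch of that general technique; it is not Apple's algorithm, and the acoustic path and parameters are invented for illustration:

```python
import numpy as np

def lms_cancel(noise_ref, primary, taps=4, mu=0.01):
    """Toy LMS adaptive filter: learn weights so the filtered reference
    (outside-mic) signal cancels the noise in the primary (ear) signal.
    A generic illustration of adaptive ANC, not Apple's implementation."""
    w = np.zeros(taps)
    residual = np.zeros(len(primary))
    for n in range(taps - 1, len(primary)):
        x = noise_ref[n - taps + 1:n + 1][::-1]  # most recent samples first
        y = w @ x                  # estimated noise reaching the ear
        e = primary[n] - y         # what's left after cancellation
        w += 2 * mu * e * x        # nudge weights toward a smaller residual
        residual[n] = e
    return residual

rng = np.random.default_rng(0)
noise = rng.standard_normal(5000)
# The ear "hears" the noise through a simple, made-up acoustic path.
heard = 0.6 * noise + 0.3 * np.roll(noise, 1)
residual = lms_cancel(noise, heard)
# Once the filter converges, the residual energy is a small fraction
# of the original noise energy.
```

In a real product, the reference would come from outward-facing mics and the error signal from an in-ear mic, and the filter would also have to track a constantly changing acoustic fit, which is what makes the open-earbud case so hard.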

Apple refined the shape on the AirPods 4 for a better fit.
Billy Steele for Engadget

The lack of an ear tip on the AirPods 4 also creates a challenge for transparency mode. Treski noted that it’s “arguably even harder” than mastering ANC on open earbuds, since you’re blending ambient sound from the microphones with what you’re hearing naturally through your unplugged ears. There’s a perfect mix that will seem real to your brain, and it all has to happen with extremely low latency so the automatic adjustments don’t lead to any delays in what comes through the AirPods.

The new shape for the AirPods 4 also provided an opportunity to improve overall sound quality on the earbuds. The front end of the buds, which Bergeron revealed the team calls the “snorkel,” is very different from the AirPods 3. Since the previous model was more open, she said, the engineers had more freedom to operate. With the new version, the driver had to be adjusted so that it didn’t reflect sound to the internal microphone that monitors noise inside your ear. That’s why the drivers are now pointed down your ear canal, and why they’re slightly recessed.

“In order to get the improved fit, that necessitated adjusting the driver and the front of the product,” she said. “The mechanical engineers are doing the packaging of the entire product, trying to fit everything in. Acoustic engineers are saying, ‘okay, based on those constraints, this is the best place that we can put the driver.’”

The design overhaul on the AirPods 4 extends to the case as well. Apple was able to slim down the accessory while also simplifying how you interact with it. The end result is “the same magic experience,” Bergeron noted, but ditching the physical button allowed engineers to reduce the case’s overall thickness and rely on an accelerometer instead. Removing the button also gets rid of one place where liquid could potentially get in, so the case has the same IP54 rating as the new AirPods.

“We get a double win there for sure,” Bergeron said.

This article originally appeared on Engadget at https://www.engadget.com/audio/headphones/weve-got-to-make-it-happen-how-apple-designed-airpods-4-for-effective-anc-130008844.html?src=rss

Apple iPhone 16 and iPhone 16 Plus review: Closing the gap to the Pro

The “regular” iPhone has become like a second child. Year after year, this model has gotten the hand-me-downs from the previous version of the iPhone Pro – the older, smarter sibling. The iPhone 15 received the iPhone 14 Pro’s Dynamic Island and A16 Bionic processor, and the iPhone 14 before that got the A15 Bionic chip and a larger Plus variant with the same screen size as the iPhone 13 Pro Max. For the iPhone 16 ($799 & up), there are trickle-down items once more. But this time around, that’s not the entire story for the Apple phone that’s the best option for most people.

Surprisingly, Apple gave some of the most attractive features it has for 2024 to both the regular and Pro iPhones at the same time. This means you won’t have to wait a year to get expanded camera tools and another brand new button. Sure, Apple Intelligence is still in the works, but that’s the case for the iPhone 16 Pro too. The important thing there is that the iPhone 16 is just as ready when the AI features arrive.

So, for perhaps the first time – or at least the first time in years – Apple has closed the gap between the iPhone and iPhone Pro in a significant way. ProRAW stills and ProRes video are still exclusive to the priciest iPhones, and a new “studio-quality” four-microphone setup is reserved for them too. Frustratingly, you’ll still have to spend more for a 120Hz display. But, as far as the fun new tools that will matter to most of us, you won’t have to worry about missing out this time.

Another year has passed and we still don’t have a significant redesign for any iPhone, let alone the base-level model. As such, I’ll spend my time here discussing what’s new. Apple was content to add new colors once again, opting for a lineup of ultramarine (blueish purple), teal, pink, white and black. The colors are bolder than what was available on the iPhone 15, although I’d like to see a blue and perhaps a bright yellow or orange. Additionally, there’s no Product Red option once again — we haven’t seen that hue since the iPhone 14.

The main change in appearance on the iPhone 16 is the addition of two new buttons. Of course, one of those, the reconfigurable action button above the volume rockers, comes from the Pro-grade iPhones. By default, the control does the task of the switch it replaces: activating silent mode. But, you can also set the action button to open the camera, turn on the flashlight, start a Voice Memo, initiate a Shazam query and more. You can even assign a custom shortcut if none of the presets fit your needs.

While Apple undoubtedly expanded the utility of this switch by making it customizable, regular iPhone users will have to get used to the fact that the volume control is no longer the top button on the left. This means that when you reach for the side to change the loudness, you’ll need to remember it’s the middle and bottom buttons. Of course, the action button is smaller than the other two, so with some patience you can differentiate them by touch.

The new Camera Control button can open the camera app from anywhere.
Billy Steele for Engadget

Near the bottom of the right side, there’s a new Camera Control button for quick access to the camera and its tools. A press will open the camera app from any screen, and a long press will jump straight to 4K Dolby Vision video capture at 60 fps. Once you’re there, this button becomes a touch-sensitive slider for things like zoom, exposure and lens selection. With zoom, for example, you can scroll through all of the options with a swipe. Then with a double “light press,” which took a lot of practice to finally master, you can access the other options. Fully pressing the button once will take a photo — you won’t have to lift a finger to tap the onscreen buttons.

Around back, Apple rearranged the cameras so they’re stacked vertically instead of diagonally. It’s certainly cleaner than the previous look, and the company still favors a smaller bump in the top left over something that takes up more space or spans the entire width of the rear panel (Hi Google). The key reason the company reoriented the rear cameras is to allow for spatial photos and videos, since the layout now enables the iPhone 16 to capture stereoscopic info from the Fusion and Ultra Wide cameras.

The iPhone 16 and 16 Plus have a new 48-megapixel Fusion camera that packs a quad-pixel sensor for high resolution and fine detail. Essentially, it’s two cameras in one, combining – or fusing, hence the name – a 48MP frame and a 12MP one that’s fine-tuned for light capture. By default, you’ll get a 24MP image, one that Apple says offers the best mix of detail, low-light performance and an efficient file size. There’s also a new anti-reflective coating on the main (and ultrawide) camera to reduce flares.

The 12MP ultrawide camera got an upgrade too. The sensor now has a faster aperture and larger pixels for better performance in low-light conditions. There’s also a new macro mode, enabled by autofocus, that can capture minute detail. This is one of my favorite features, since sharp close-ups of small objects have never been in the base iPhone camera’s arsenal (only the Pros’), and the macro tool has worked well for me so far.

The iPhone 16, like its predecessors, takes decent stills. You’ll consistently get crisp, clean detail in well-lit shots and realistic color reproduction that doesn’t skew too warm or too cool. At a concert, I found the iPhone 16’s low-light performance noticeably better than the iPhone 15’s. Where the previous model struggled at times in dimly lit venues, my 2x zoom shots with the new phone produced better results. There wasn’t a marked improvement across the board, but most of the images were certainly sharper.

Macro mode on the iPhone 16 camera is excellent.
Billy Steele for Engadget

The most significant update to the camera on the iPhone 16 is Photographic Styles. Apple has more computational image data from years of honing its cameras, so the system has a better understanding of skin tones, color, highlights and shadows. Plus, the phone is able to process all of this in real time, so you can adjust skin undertones and mood styles before you even snap a picture. Of course, you can experiment with them after shooting, and you can also assign styles to a gallery of images simultaneously.

Photographic Styles are massively expanded and way more useful, especially when you use them to preview a shot before you commit. My favorite element of the updated workflow is a new control pad where you can swipe around to adjust tone and color. There’s also a slider under it to alter the color intensity of the style you’ve selected. For me, the new tools in Photographic Styles mean I don’t feel the need to immediately hop over to another app to edit, since I have a lot more options right in the Camera app.

As I’ve already mentioned, Camera Control is handy for getting quick shots, and the touch sensitivity is helpful with settings, but I have some gripes with the button. Like my colleague Cherlynn Low mentioned in her iPhone 16 Pro review, the placement causes issues depending on how you hold your phone, and may lead to some inadvertent presses. You can adjust the sensitivity of the button, or disable it entirely, which is a customization worth exploring. What’s more, the touch-enabled sliding controls trigger more reliably if you hold the phone with your thumbs along the bottom while shooting, so you may need to alter your grip for the best results.

Like I noted earlier, the new camera layout enables spatial capture of both video and photos on the iPhone 16. This content can then be viewed on Apple Vision Pro, with stills in the HEIC format and footage at 1080p/30fps. It’s great that this isn’t reserved for the iPhone 16 Pro, but the downside (for any iPhone) is file size. When you swipe over to Spatial Mode in the camera app, you’ll get a warning that a minute of spatial video is 130MB and a single spatial photo is 5MB. I don’t have one of Apple’s headsets, so I didn’t spend too much time here since the photos and videos just appear normal on an iPhone screen.
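Those per-minute and per-photo figures from the camera app's warning make it easy to roughly budget storage for a spatial shooting session. A quick back-of-envelope sketch (the 130MB and 5MB figures are the ones Apple's warning quotes; everything else here is illustrative):

```python
# Rough storage estimate for spatial capture on the iPhone 16, using the
# figures from the camera app's warning: ~130 MB per minute of spatial
# video and ~5 MB per spatial photo.
MB_PER_MINUTE_SPATIAL_VIDEO = 130
MB_PER_SPATIAL_PHOTO = 5

def spatial_storage_mb(video_minutes: float, photo_count: int) -> float:
    """Approximate storage footprint (in megabytes) of a spatial session."""
    return (video_minutes * MB_PER_MINUTE_SPATIAL_VIDEO
            + photo_count * MB_PER_SPATIAL_PHOTO)

# A 10-minute spatial video plus 50 spatial photos:
print(spatial_storage_mb(10, 50))  # 1550.0 MB, or about 1.5 GB
```

In other words, a single afternoon of spatial capture can eat a couple of gigabytes, which is why the warning exists at all.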

I’d argue the most significant advantage of Spatial Mode is Audio Mix. Here, the iPhone 16 uses the sound input from the spatial capture along with “advanced intelligence” to isolate a person’s voice from background noise. There are four options for Audio Mix, offering different methods for eliminating or incorporating environmental sounds. Like Cherlynn discovered on the iPhone 16 Pro, I found the Studio and Cinematic options work best, with each one taking a different approach to background noise. The former makes it sound like the speaker is in a studio while the latter incorporates environmental noise in surround sound with voices focused in the center – like in a movie. However, like her, I quickly realized I need a lot more time with this tool to get comfortable with it.

Plain ol' black is an option this time around.
Billy Steele for Engadget

Apple proudly proclaimed that the iPhone 16 is “built for Apple Intelligence,” but you’ll have to wait a while longer to use it. That means things like AI-driven writing tools, summaries of audio transcripts, a prioritized inbox and more will work on the base iPhone 16 when they arrive, so you won’t need a Pro to use them. Genmoji and the Clean Up photo-editing assist are sure to be popular as well, and I’m confident we’re all ready for a long overdue Siri upgrade. There’s a lot to look forward to, but none of it is ready for the iPhone 16’s debut. The iOS 18.1 public beta arrived this week, though, so we’re inching closer to a proper rollout.

Sure, it would’ve been nice for the excitement around the new iPhones to include the first crack at Apple’s AI. But I’d rather the company fine-tune things before a wider release to make sure Apple Intelligence is fully ready and, more importantly, fully reliable. Google has already debuted some form of AI on its Pixel series, so Apple is a bit behind. Still, I’d rather wait longer for a useful tool than rush a company into shipping buggy software.

What will be available on launch day is iOS 18, which delivers a number of handy updates to the iPhone, many of which deal with customization. For the first time, Apple is letting users customize more than the layout of their Home Screen. You can now apply tints and colors to icons, resize widgets and apps, and lock certain apps to hide sensitive info. The Lock Screen controls can also be swapped out for the things you use most often, which is handier now that the iPhone 16 has a dedicated camera button on its frame. There’s a big overhaul to the Photos app too, mostly focused on organization, that provides a welcome bit of automation.

The iPhone 16 uses Apple’s new A18 chip with a 6-core CPU and 5-core GPU. There’s also a 16-core Neural Engine, the same core count as both the iPhone 15 and the iPhone 16 Pro. With the A18, the base-level iPhone jumped two generations ahead compared to the A16 Bionic inside the iPhone 15. The new chip provides the necessary horsepower for Apple’s AI and demanding camera features like Photographic Styles and the Camera Control button. I never noticed any lag on the iPhone 15, even with resource-heavy tasks, and those shouldn’t be a problem on the iPhone 16, either. But we’ll have to wait and see how well the iPhone 16 handles Apple Intelligence this fall.

Of course, the A18 is more efficient than its predecessors, a benefit that extends to battery life. Apple promises up to 22 hours of local video playback on the iPhone 16 and up to 27 hours on the 16 Plus. For streaming video, those numbers drop to 18 and 24 hours respectively, and they’re all slight increases over the iPhone 15 and 15 Plus.

Starting at 7AM, I ran my battery test on the iPhone 16 and had 25 percent left at midnight. That’s doing what I’d consider “normal” use: a mix of calls, email, social, music and video. I also have a Dexcom continuous glucose monitor (CGM) that’s running over Bluetooth and I used the AirPods 4 several times during the day. And, of course, I was shooting photos and a few short video clips to test out those new features. While getting through the day with no problem is good, I’d love it if I didn’t have to charge the iPhone every night, or rely on low-power mode to avoid doing so.
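For the curious, the test above works out to a fairly gentle drain rate. A quick sketch of the arithmetic (a linear extrapolation, which is a simplifying assumption since real drain varies with use):

```python
# Extrapolating from the battery test in this review: a 7 AM start with
# 25 percent remaining at midnight. Assumes a constant drain rate, which
# real-world mixed use won't match exactly.
HOURS_ELAPSED = 17          # 7 AM to midnight
PERCENT_USED = 100 - 25     # 75 points consumed

drain_per_hour = PERCENT_USED / HOURS_ELAPSED   # ~4.41 points per hour
estimated_runtime = 100 / drain_per_hour        # ~22.7 hours to empty
print(round(drain_per_hour, 2), round(estimated_runtime, 1))
```

That back-of-envelope runtime lands in the same ballpark as Apple's 22-hour video playback claim, though mixed daily use and looping video aren't directly comparable workloads.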

On a related note, Apple has increased charging speeds via MagSafe: with a 30W or higher power adapter, the iPhone 16 can now charge at 25W, good for a 50 percent top-up in around 30 minutes.
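If you want a feel for what that rate means in practice, here's a minimal estimator built on Apple's quoted figure of 50 percent in roughly 30 minutes. Treating the rate as linear is an assumption on my part; real charging tapers as the battery fills, so this only holds for roughly the first half of a charge:

```python
# Rough MagSafe charge-time estimator for the iPhone 16, based on Apple's
# quoted figure: a 50 percent top-up in about 30 minutes at 25W.
# Assumes a linear rate, which is optimistic above ~50-80 percent.
QUOTED_PERCENT = 50
QUOTED_MINUTES = 30

def minutes_to_charge(start_pct: float, end_pct: float) -> float:
    """Estimate minutes to go from start_pct to end_pct at the quoted rate."""
    if not 0 <= start_pct <= end_pct <= 100:
        raise ValueError("percentages must satisfy 0 <= start <= end <= 100")
    return (end_pct - start_pct) * QUOTED_MINUTES / QUOTED_PERCENT

print(minutes_to_charge(20, 70))  # 30.0 -- the quoted 50-point top-up
```

By this estimate, a quick 15-minute top-up before heading out would add about 25 points, which tracks with how I ended up using the phone during testing.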

With the iPhone 16, Apple has almost closed the gap between its best phone for most people and the one intended for the most demanding power users. It’s a relief not to be pining for whatever’s coming on the iPhone 17, since a lot of the new features from the iPhone 16 Pro are already here. And while some of them will require time to master, it’s great that they’re on the iPhone 16 at all. There are some Pro features you’ll still have to pay more for, like ProRAW photos, ProRes video, a 120Hz display, a 5x telephoto camera and multi-track recording in Voice Memos. But those are luxuries not everyone needs. The regular iPhone will likely suit your needs just fine; splurging on the high-end model has become more of an indulgence than a necessity.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/apple-iphone-16-and-iphone-16-plus-review-closing-the-gap-to-the-pro-120050824.html?src=rss