iPhone 16 Pro and Pro Max review: Cameras and customization

It may seem like Apple is behind the competition a lot of the time. The company appeared slow to adopt developments like widgets, bezel-less displays with camera notches and screens with high refresh rates. And with the iPhone 16 Pro, it appears to once again be late to the party, bringing generative-AI features and a real camera button to its 2024 flagship. But if you'll allow me to play therapist for a moment, I think it's not that Apple is slow. I think Apple is cautious. Perhaps overly so.

Caution on its own isn't a bad trait — in fact, it could be considered thoughtful. Rather than rush to the cutting edge with its peers, Apple deliberates, usually finding a slightly different approach that is often an improvement on what's out there. Just look at the Vision Pro headset or Apple Silicon. Or even the iPod, the iPad and the AirPods, which were far from the first of their kind when they launched.

With the iPhone 16 Pro, the focus is on cameras and Apple Intelligence. The problem is, Apple Intelligence isn't quite here yet. We can test some features in the developer beta that's currently available, but that's not necessarily the same as the experience the public will get when the update rolls out in October. It’s not unprecedented for new iPhones to launch without some marquee features, sure, and thankfully there's still plenty that the iPhone 16 Pro brings. From Camera Control, the Fusion Camera and other video-related updates to slightly bigger displays and iOS 18, the iPhone 16 Pro and Pro Max are intriguing successors, even absent the vaunted Intelligence features that are still to come.

I’m getting deja vu. Looking back at my review of the iPhone 15 Pro, I see a picture of that phone and its predecessor lined up side by side to show just how much thinner the bezels are. Apple has once again trimmed the borders on its flagship phones, but while doing that enabled it to reduce the handsets’ size in 2023, this year it allowed the company to cram in larger screens without much change in footprint.

The iPhone 16 Pro and Pro Max displays have increased in size from 6.1 inches and 6.7 inches up to 6.3 inches and 6.9 inches, respectively. Both handsets have grown ever so slightly, too, by just under 1mm in width and about 3mm in height.

Basically, the iPhone 16 Pro and Pro Max are a hair wider and taller than their predecessors, but maintain the same 8.25mm (0.32-inch) profile. And yet, in spite of this minimal change, you won’t be able to keep your old cases if you’re upgrading from an iPhone 15 Pro to an iPhone 16 Pro.

Not only would the cases not quite fit, you’d also need something with either a cutout or a sapphire crystal and conductive layer to be able to use the new Camera Control. Of course, Apple sells compatible cases, as do some third parties like OtterBox, so you have plenty of options.

I’ve spent most of this year’s hardware review season remarking how Samsung and Google’s flagships feel like iPhones, and I’ve now reached a strange inception point. As I’ve been comparing competing phones for this review, I’ve been surrounded by about a dozen handsets from all these different companies on my couch, including last year’s iPhones, the Galaxy S24 Plus and the Pixel 9 Pro and Pro XL. Trying to figure out which one is the iPhone has become more confusing than ever, as they all feel similar in build. The best way to tell at a glance is to look at their camera arrays or my wallpaper.

All that is to say that the iPhone 16 Pro feels similar to its predecessor, which is what these other companies have been attempting to emulate. Apple would be right to feel flattered by this imitation, and yet I have to wonder if it’s time to do something different. Google’s Pixel 9 Pro is actually a whole six grams lighter than the iPhone 16 Pro at 221 grams (7.79 ounces), and I’m absolutely smitten by its rich pink hue and shiny edges. Though I like the new golden Desert color for the iPhone 16 Pro, I do wish Apple’s premium flagship had more fun and vibrant exteriors. That said, I do love the base iPhone 16 in pink, teal and Ultramarine.

Close up shots of the bottom half of the iPhone 16 lineup, featuring from left to right a pink, teal, white and gold phones.
Brian Oh for Engadget

Arguably the biggest change to the iPhone 16 lineup, not to mention the iPhone 16 Pro, is the introduction of Camera Control. This is a button on the right side of the device, which has touch and pressure sensors on it to enable greater control with swipes and semi-presses. (That’s in addition to the Action button on the top left that was added to last year’s Pros, and carries over to the iPhone 16 and iPhone 16 Plus, too.)

One of the things this was supposed to do was let you push lightly on the button to trigger focus, similar to what half pressing a DSLR shutter button would do. That function won’t be available at launch, so I can’t say if it’s effective.

But by and large, Camera Control is a very Apple approach to a feature that has been around for years. From phones by Sony and Nokia with dedicated shutter buttons to Android handsets with hardware-based double-click shortcuts, the notion of quick access to your camera without having to futz with the screen is a popular one. For good reason, too — I’ve hated having to swipe or long-press the icon on my iPhone’s lock screen in the past, and even though I could set the iPhone 15 Pro’s Action button to open the camera, it just wasn’t positioned well and I’d have to give up my mute button.

So Apple isn’t breaking new ground with its hardware shortcut for a frequently used app. But it does do a few things differently with the touch sensor. You can swipe on it to tweak things like exposure, zoom levels and tone, and the half-press still works as a way to select options or go back out of menus within the new Camera Control interface. In theory, it’s a nice way to make changes on the fly.

In reality, there were a few issues, and they largely have to do with placement. The button sits a little farther from the base of the phone than I’d like, so my fingers had to reach a bit more to press it, whether I was in landscape or portrait orientation. This wasn’t usually a problem when I had both hands free and could steady the iPhone with my other hand and readjust my grip.

But if you’re trying to take a quick shot with just one hand, the button’s location can feel unintuitive. Of course, everyone has different finger lengths and ratios, so it’s entirely possible that other people find this logical. It also depends on your grip — if you’re cradling the bottom of the device in your palm, it’s harder to maneuver. If you’re covering part of the screen and reaching for the button head on, it’s slightly easier to use Camera Control.

The iPhone 16 Pro held up in mid air by two hands, with the camera app open and showing people walking on a New York City street.
Brian Oh for Engadget

Still, even for those with the strongest claws, swiping and half-pressing and double-half-pressing on the sensor is tricky. I was only ever really able to do that if I had my thumb holding up the bottom edge and my middle, ring and little fingers steadying the right end of the phone. Maybe this is a new camera grip I just need to relearn for this button.

The awkward placement is a minor gripe compared to what I found most annoying: the button’s touch sensor. Not only was it difficult to swipe through different settings when holding the device with one hand, it also reacts to accidental touches and swipes. Sometimes, the phone would slide down my palm and change the exposure or zoom level, completely ruining the vibe. I should point out that you can go into accessibility settings to either tweak the swipe sensitivity or turn it off altogether, if it really bothers you. Honestly, if you’re planning on making adjustments with Camera Control, it’s best to have time, patience and both hands free.

In those situations, I had a lot of fun editing settings and watching them be reflected in the viewfinder in real time. I also liked zooming in and out of subjects, recomposing a shot and tweaking exposure till I liked what I saw, before then pushing down to snap the picture. (This action does lead to some small issues, but more on the actual photo quality later.) I especially loved this while recording video, since it makes slowly zooming in or out of a subject smoother than using the onscreen slider.

Then again, for scenarios where I just want to fire off a quick shot without worrying about exposure or zoom settings, the pain of finagling with the sensor mostly goes away. In exchange, being able to rapidly snap pictures is a joy. I found myself taking more pictures than ever thanks to Camera Control, which if you know me is a feat worthy of the Guinness Book of Records.

A random person cut me off in line? Click. Funny sign on a building I pass by in a Lyft? Click, click. From your lock screen, you’ll have to press the button twice — once to wake the phone up and once to open the camera. Then press again to take the photo. It’s not ideal, but not too far off the same process on a Pixel phone, for instance. Plus, you can long-press the iPhone’s button to start recording a video, and it’ll automatically stop when you let go.

Close up of the iPhone 16 Pro's rear cameras with greenery in the background.
Cherlynn Low for Engadget

This sort of rapid access to the camera is the best thing about the new button, and I could see it being potentially useful not just for shutterbugs like me, but for the upcoming Visual Intelligence feature that Apple teased at its launch event. The company’s version of Google Lens could allow people to ask questions about things in the real world around them. But of course, since this wasn’t available during my review period, I wasn’t able to test it.

For now, you can go into Settings to change the number of clicks it takes to trigger the camera app, remap the button to the Code scanner or the Magnifier tool, or disable it altogether. Since you can also set up the Action button to do these things, you have more choice over where you want your camera shortcut, or you can free up the former mute switch to do something else.

Even if you’re not a glutton for buttons, there are still some camera updates that might intrigue you. This year’s flagships sport what Apple calls a 48-megapixel Fusion Camera, which has a faster quad-pixel sensor. This enables what the company describes as “zero shutter lag,” which is wording it has used repeatedly over the years. In this case, it’s referring to how quickly the camera will capture a shot after you press the shutter button (onscreen or hardware).

I will admit I was initially confused by this update, in part because it requires relearning some behaviors I had adopted to mitigate the shortfalls of older cameras. Basically, the iPhone 16 Pro’s cameras are now so fast that when I asked someone to throw something so I could capture it in motion and see how well the camera froze the action, my shots ended up being of the person holding the object.

Our video producer and I were very confused, and it wasn’t until the “zero shutter lag” concept was explained to me more clearly that I got it. I had become used to pressing the shutter early since cameras, in my experience, would be fractions of a second slow. Apple’s camera is now so fast that it captured the literal moment I tapped the button, instead of the split second after, when the object was in mid-air.

A woman flinging two cushions to her left, in a sample photo demonstrating the iPhone 16 Pro's 48-megapixel Fusion Camera's speed.
Brian Oh for Engadget

This is going to change how people take jump shots, I’m sure, but basically if you and your friends are taking pictures of yourselves floating in the sky, the photographer doesn’t have to hit capture before telling you to jump. I know this is a very specific and silly example, but it’s also the most relatable illustration of how much quicker the Fusion camera is.

Also, why can’t camera stories be silly and fun? That’s what a lot of the best moments in life are, and some of the new features are great in those situations. The support for 4K video at 120 fps in Dolby Vision, for example, led to some beautiful high-quality, rich and colorful clips of my friend’s adorable Pomeranian trotting along on a walk. Her little tongue slowly peeking out as she bounded towards the camera looked crisp and smooth when I played it back at 25 percent and 20 percent speeds, too.

Depending on your mood, the new Photographic Styles can be fun or serious. Apple’s tweaked the built-in camera filters to not only offer more options but also give you greater control. Because the company has refined its processing each year, there’s also an improved depth map captured when the camera detects a face in the scene. This, combined with a greater focus on color science around skin tone, has led to what might be my favorite new iPhone 16 feature.

Whether I shot them in Portrait mode or not, photos of people that I took using the iPhone 16 Pro were a dream to edit. Simply switching between the Standard, Natural, Luminous, Quiet or Ethereal styles already resulted in improvements to the colors and shadow, but I could also tap on each thumbnail to access the new editing touchpad and drag a dot around. This let me more precisely tweak the hues and contrast levels, and an additional slider below let me adjust how warm the image was.

A composite of four sample photos featuring a woman gazing into the camera, each with a different Photographic Style applied. A label at the bottom right of each image shows which Style is used and they are, from left to right, Standard, Ethereal, Luminous and Vibrant.
Cherlynn Low for Engadget

An ugly selfie with my cousin in the hideous overhead lights of a meeting room became a beautiful snapshot after I switched to the Ethereal or Luminous styles. Both of those are quickly becoming my favorites, but I’m more impressed with how well Apple was able to segment the subject from the background. In almost every shot I edited, adjusting the slider mostly changed the background, keeping people and their complexions within the realm of reality instead of applying harsh oversaturation or extreme contrast levels to them. The styles also added a background blur that lent a pleasant soft-focus effect, and most of the time the system accurately identified the outlines of people in the scene.

Perhaps my favorite part is the fact that you can change between styles after you’ve shot the photo on the iPhone 16. As someone who dwells on her Instagram filters and edit tools for some time before each post, I definitely appreciate how much nicer Apple’s versions are and only wish I could retroactively apply them to photos I had taken at a recent wedding. Alas, since the edits are dependent on information captured when the photos were taken, these new retouching features will only work for pictures taken with an iPhone 16 or 16 Pro.

One final camera update I’ll touch on before telling you about actual photo quality is Audio Mix. This uses the spatial audio now recorded by default with the new studio mics on the iPhone 16 Pro (or even the system on the iPhone 16 and 16 Plus) to understand the direction of sound sources in your footage. Then, when you edit the clip, you can choose between Standard, In-frame, Studio and Cinematic mixes, as well as drag a slider to reduce background noise.

You’ll have to be recording in fairly specific acoustic scenarios to get the most out of Audio Mix. I tested it in a variety of situations: my cousin talking on his phone on a busy New York street, me interviewing my fellow gym buddies after a tiring workout with background music quietly playing, and my friend talking to me while his wife talked about something else off-camera in their fairly quiet kitchen.

For the most part, going to Cinematic or Studio modes from Standard resulted in a noticeable reduction in environmental noise. My favorite is Studio, which generally seemed to improve voice clarity as well, making people sound like they could be talking on a podcast. In-frame, however, rarely did what I expected and occasionally produced some warped distortion. It appears there might need to be more distance between various sources of sound for this to work best, and I have to spend more time testing to better understand this tool. You can check out our review video for examples of a clip with different audio mixes, but for now, while the promised improvements aren’t what I expected, there at least appears to be some benefit to Audio Mix.

A composite of sample images from the iPhone 16 Pro and the Pixel 9 Pro.
Cherlynn Low for Engadget

On to the actual photos and how they hold up against the competition. I’ve long considered Google’s Pixel phones to be the gold standard in smartphone photography, since I prefer the company’s color and detail processing. I know some people feel that Google tends to oversharpen, so bear in mind that, as with most things, your preference may be different from mine.

When I compared photos I took with both phones on the same laptop screen, the differences were minimal. Occasionally, the Pixel would expose a scene better, retaining detail in shadows near a bright light source where the iPhone 16 Pro lost it. But the Pixel’s nightscape shots had more light leakage into the sky, whereas Apple was more adept at keeping the background dark against the outline of a skyscraper.

Honestly, at this point, we’re really nitpicking and pixel-peeping to find differences. Both companies deliver great cameras, and though I still prefer Google’s approach to Portrait shots, Apple has been slowly but surely closing the gap with improvements to its depth maps every year.

I will mention, though, that a lot more of the photos I shot on the iPhone 16 Pro came out blurry compared to those from the Pixel 9 Pro, and it might have to do with the fact that I was using Camera Control to snap them. This was the issue I alluded to earlier: using a physical button to take a picture is more likely to introduce shake than a software shutter. It’s not like Samsung or Google phones are immune to this problem, though I will say that the way Camera Control is built, where the recessed button depresses into the phone’s frame, does leave it a bit more vulnerable to this than, say, using a volume rocker might.

A composite of sample images from the iPhone 16 Pro and the Pixel 9 Pro, featuring colorful motorcycle parts on a table.
Cherlynn Low for Engadget

Oh and finally, a quick note for my Gen Z readers: I know how much you all prefer flash photography to night modes in low-light scenarios. (Thanks to my much younger cousin for the valuable insight.) I’ve done the testing and can say that I prefer Google’s Pixel 9 Pro for its softer, warmer flash compared to the iPhone 16 Pro’s, which is stronger and brighter, leading to my face looking washed out.

It’s been about two months since the public beta for iOS 18 was released, and it was nice to get a taste of upcoming features like the new customizable home pages, expanded Tapback reactions and the redesigned Photos app. With the iPhone 16 launch, iOS 18 is basically ready for primetime… with some caveats.

This year, more than ever, it’s hard to figure out what’s coming to your iPhone and what isn’t. With the release of Apple Intelligence slated for October, features like Writing Tools, Clean Up for photos and the redesigned Siri won’t be ready till next month. And even then, your non-Pro iPhone 15 won’t be compatible.

Plus, some features that were teased at WWDC, like Genmoji, still haven’t been added to the iOS 18.1 developer beta, which is where most Apple Intelligence features have been arriving as a preview for app makers. Within the iPhone 16 lineup, too, there are things coming only to the Pro models, like multilayer recording in Voice Memos.

It’s confusing, and can make choosing your iPhone a trickier decision. But for this review, at least, the iPhone 16 Pro and Pro Max are getting everything. I cannot wait to try out multi-track recording in Voice Memos, and I hope Apple sees this yearning as a sign that it should bring the feature to more devices.

It was nice to get time with iOS 18, even in the absence of Apple Intelligence. Honestly, I’m not even sure I’d like those features that much. Similarly, the Gemini AI features were nice on the Pixel 9 Pro series, but they didn’t feel like must-haves.

One of the new iOS 18 touches I noticed immediately was the refreshed Control Center, which took some getting used to, as I had to relearn how to swipe back to the home page now that there are more pages to scroll through. I especially enjoyed seeing the new little chat bubble appear on my voice recordings, indicating that a transcript had been generated for them. And though I haven’t exchanged messages with Android-toting friends yet, I’m glad to see RCS support is finally live this week.

The bottom half of both the iPhone 16 Pro and iPhone 16 Pro Max standing on a table.
Brian Oh for Engadget

Though I was excited for the new custom routes tool in Maps, I struggled to actually create them. You can set your start and end points and have the app close the loop for you, or just tap landmarks or points on the map to get the route to basically connect the dots. Unfortunately, no matter how many times I tried to get the route to cut through a building where I knew a pedestrian walkway existed, Maps resisted me at every turn, forcing the route to go through more established (and therefore more crowded) paths instead. It’s not unreasonable, but certainly not the open-world route-creation feature I was envisioning.

The best thing about iOS 18, and about some new features in the iPhone 16 lineup (like Camera Control), is the customizability. I do appreciate that if you don’t like something, you can usually turn it off. With the new ability to place apps outside of a rigid grid, you can now lay your home screen out just the way you like. The redesigned Photos app lets you create and pin collections so you can more easily find the pictures most important to you. And again, I’m glad Apple is giving people the option to turn off Camera Control altogether or adjust its sensitivity.

The iPhone 16 Pro and Pro Max are powered by Apple’s A18 Pro chip, which is built on “second-generation 3-nanometer technology and [features] a new architecture with smaller, faster transistors.” All this is meant to deliver “unprecedented efficiency,” according to Apple’s press release.

Some small software glitches aside, I never ran into slowdown on the iPhone 16 Pro, but I was certainly surprised by the smaller handset’s battery life. In general, the iPhone 16 Pro would barely last a full day, which is reminiscent of the iPhone 15 Pro. It’s worth noting that before this review I was primarily using an iPhone 15 Pro Max as my daily driver, which usually gets through a day and a half with no problem, so the drop in endurance was even more pronounced for me.

Most days, I’d pick up the iPhone 16 Pro at about 9AM and would get to about 9PM before getting low battery alerts. If I started the day a bit later, closer to 11AM for instance, I got to 1AM before the iPhone 16 Pro ran completely dry. On Sunday, I unplugged the phone at about 9:30AM and was shocked on the train home to get a warning that remaining power was at just 20 percent. It was only 6:50PM, and the night had barely started!

You’ll get significantly better battery life on the iPhone 16 Pro Max, which delivers the same almost two-day runtime as its predecessor. And sure, a phone with a smaller battery not lasting as long makes mathematical sense. But considering the Pixel 9 Pro is a comparably sized handset and manages to last about two days, there’s no excuse for the iPhone 16 Pro to conk out before the night is up.

A white iPhone 16 Pro and a desert iPhone 16 Pro Max standing on a table.
Brian Oh for Engadget

One of the best things about the iPhone 16 Pro lineup is that, unlike last year, there isn’t much of a tradeoff in cameras if you opt for the smaller device. The iPhone 15 Pro Max had a 5x telephoto zoom camera, while the iPhone 15 Pro only went up to 3x. As a budding photographer of skittish wild animals, I opted for the Max, especially since it was much lighter than its predecessor thanks to the titanium build.

With the iPhone 16 Pro having essentially the same camera system as the Pro Max, I thought it was time for me to go back to a size that was much easier on my hands. Alas, with the disappointing battery performance, I might just have to stick with a Max, and you might too.

There are also the non-Pro iPhone 16 models to consider, and just as there are fewer differences than ever between the Pro and Pro Max, the tradeoffs between the two lines aren’t as significant this year, either. Apple brought the previously Pro-exclusive Action button to the iPhone 16 and iPhone 16 Plus, while also including Camera Control on its less-premium phones. The main things that set the two lines apart this year are processors, screen quality, camera sensors and onboard mics. You’ll lose support for ProRAW photos and multi-layer recording by opting for the cheaper devices, too.

Otherwise, you’ll still have all the iOS 18 and Apple Intelligence features coming to the Pros, as well as spatial audio recording, which enables the Audio Mix I described in the camera section earlier.

The iPhone 16 Pro and iPhone 16 Pro Max held in mid air with their backs facing up.
Cherlynn Low for Engadget

Apple’s caution is sometimes warranted. Especially at a time when mistrust of AI-generated content runs rampant, the company taking its time to get Apple Intelligence right is understandable. But its deliberation doesn’t always lead to winners. While I appreciate the attempt to differentiate Camera Control with the touch sensor for more versatility, I’m not yet convinced of its usefulness.

The good news is, and I cannot stress this enough, you have the option to tune it to your liking. And that’s a theme I’m seeing in recent Apple features that hint at more thoughtfulness than usual. If you don’t like something, or if something isn’t right for your needs, you can adjust or disable it. In iOS 18, you have greater control over your home screen’s app layout and can pin custom collections for easier reach in the Photos app. The Action button introduced last year could have been a spectacular failure had Apple not let you keep it as a mute switch; instead, it gave people more functionality while maintaining the status quo for those resistant to change.

Change is scary. Change is hard. But without change there is no progress. Apple’s cautious approach is a tricky balancing act that’s evident on the iPhone 16 Pro. Some new features, like Audio Mix and custom routes in Maps, deliver mixed results. Others, like Photographic Styles, are hits. Then there are the basic ingredients, like good battery life and durable, attractive designs, that Apple cannot neglect.

The iPhone 16 Pro’s subpar battery life holds it back from beating the competition, which is stiffer than ever this year, especially from Google. Luckily for Apple, most people who have iPhones are going to stick with iPhones — it’s just easier. For those already sucked into the ecosystem, the iPhone 16 Pro (and particularly the Pro Max) are worth the upgrade from a model that’s at least two years old. If you already have an iPhone 15 Pro (or even a 14 Pro), for the sake of our planet and your wallet, you might prefer to hold off on upgrading, especially since this year’s devices aren’t that much different.

This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/iphone-16-pro-and-pro-max-review-cameras-and-customization-120052372.html?src=rss

The Morning After: Our verdict on the Apple Watch Series 10

The reviews don’t stop. This morning, we’re checking out how Apple’s latest wearable compares to its predecessors and competition. Deputy Editor Cherlynn Low says that, while the Series 10 is noticeably lighter, you wouldn’t notice many differences from the Series 9 unless the two were side by side. The latest Apple Watch is ever so slightly bigger (46mm), but if you need something even bigger, you should consider the Watch Ultra 2, which has a 49mm screen. Apple is also using a new wide-angle OLED to make its latest watch easier to read, even if your wrist is resting to the side. Again, the change is noticeable but not in a huge way. Upgraded charging should bring the Series 10 back to 80 percent in 30 minutes, but we’re still not hugely impressed with its battery life.

You might notice we haven’t scored the Apple Watch Series 10. While she’s sharing what’s important for folks considering buying one, Cherlynn needs more time to test its sleep features. However, thanks to its similarity to its predecessors (and watchOS 11 bringing many similar features to the Series 9, Watch Ultra 2 and more), it’s hard to recommend to anyone wearing a Series 9 or Ultra. If you’re coming from the Series 8 or older, the update might be worth it. Check out the full review.

— Mat Smith

Snap

Snap’s latest augmented reality glasses have a completely new — and completely bonkers — design, larger field of view and support for full hand tracking. But the company is only making the fifth-generation Spectacles available to approved developers willing to commit to a year-long $99/month subscription to start. These aren’t for consumers, but given how they look, I could have told you that. Karissa Bell tested them and was impressed. But competition is already on the horizon: Meta will show off the first version of its long-promised augmented reality glasses next week at its developer event.


Kremlin-affiliated Russian troll farms are running disinformation campaigns discrediting Kamala Harris and Tim Walz before this year’s US presidential elections, according to Microsoft. The Microsoft Threat Analysis Center noted several approaches: One video depicted a supposed attack by Harris supporters on Trump rally attendees. Another video used an actor to accuse Harris of being involved in a 2011 hit-and-run incident, which paralyzed a 13-year-old girl. A second troll farm shared a fake video showing a New York City billboard claiming Harris wants to change children’s gender. Microsoft warned we should expect more Russian-made disinformation materials online, including more staged and AI-edited videos, as we get closer to the election.


This article originally appeared on Engadget at https://www.engadget.com/general/the-morning-after-engadget-newsletter-111537179.html?src=rss

Apple halts iPadOS 18 update for M4 iPad Pro after bricking reports

Apple has temporarily paused the rollout of iPadOS 18 for M4 iPad Pro models, some of the most expensive iPads that the company sells, after some users complained that the update bricked their devices. Apple acknowledged the issue in a statement to Engadget, saying, “We have temporarily removed the iPadOS 18 update for M4 iPad Pro models as we work to resolve an issue that is impacting a small number of devices.”

The issue first came to light through Reddit, where a growing number of M4 iPad Pro users described how their iPads became unusable after they tried installing the latest version of iPadOS. “At some point during the update my iPad turned off, and would no longer turn on,” a user named tcorey23 posted on Reddit. “I just took it to the Apple Store who confirmed it’s completely bricked, but they said they had to send it out to their engineers before they can give me a replacement even though I have Apple care.”

Another Reddit user called Lisegot wrote that the Apple Store they took their bricked M4 iPad Pro to did not have a replacement in stock, which meant they would need to wait five to seven days for a working iPad. “No one was particularly apologetic and they even insinuated that there was no way for them to know whether the update caused this,” they wrote.

Having a software bug brick an iPad is rare. Ars Technica, which first reported this story, pointed out that iPads can typically be put into recovery mode if a software update goes bad.

If you own an M4 iPad Pro, Apple will no longer offer you iPadOS 18 until the issue is resolved. It’s not clear when that will be.

This article originally appeared on Engadget at https://www.engadget.com/mobile/tablets/apple-halts-ipados-18-update-for-m4-ipad-pro-after-bricking-reports-000258237.html?src=rss

Zynga says it will fight $45 million fine for infringing decades-old IBM patents

The internet is so core to how modern life operates that it's easy to forget how much of the technology that went into building the world wide web has patent protections. And some of those patents are still being enforced today. Zynga may be learning that the hard way, as a court ruled last week that the gaming company infringed on IBM patents dating back to the pre-internet telecom platform Prodigy from the 1980s. As a result, Zynga could be facing damages of $44.9 million. IBM's "Method for presenting advertising in an interactive service" patent from 1993 accounts for $40 million of the recommended damages.

For anyone still playing the once-ubiquitous Zynga games, this decision shouldn't interrupt your game time. The company said in an SEC filing that it would not have to modify or end operation of its games as a result of the court decision. Intriguingly, not every game in the Zynga catalog was found to be infringing on the patents. For instance, Crosswords with Friends was deemed an offender, but none of the Words With Friends titles were. A representative from Take-Two told Ars Technica that the company would appeal the ruling.

IBM has a long legacy of collecting intellectual property rights. Zynga, which was acquired by Take-Two Interactive in 2022, isn't its first target for potentially infringing on these Prodigy patents, and it's likely not the last. The computer company has had many online businesses in its crosshairs over the years, from the long-time giants (like Amazon and X, formerly Twitter) to the flashes in the pan (like Groupon). But some defendants, like pet retail platform Chewy, have successfully fended off IBM's legal charges.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/zynga-says-it-will-fight-45-million-fine-for-infringing-decades-old-ibm-patents-214316611.html?src=rss

Logitech drops an analog keyboard and new Pro Superlight mice

Logitech is revealing plenty of new gaming accessories and gear at Logi Play 2024, which is happening right now. Of the many new offerings, two keyboards and two mice caught our eye.

Let’s start with the G Pro X TKL Rapid Wired Gaming Keyboard, which features magnetic analog switches, a first for the G Pro line. These switches offer adjustable actuation points, rapid trigger functionality and key priority. In short, the keyboard lets you customize how far keys must travel to register, recognizes repeated presses almost instantly and can prioritize one of two keys when both are pressed at once.

You can also use the multi-point feature in the G Hub keyboard customization software to assign more than one command to a key depending on how far it’s pressed down. As the name suggests, this is a tenkeyless model (no number pad), and you can get it for $170 in November. The three available colors are black, white and pink.

The next keyboard is the G915 X series, a trio of new members of the G915 family (we reviewed the G915 TKL back in 2020). The mechanical keyboards all have a height of 23mm and redesigned galvanic switches with a 1.3mm actuation point. They retain the original volume roller, G key and media buttons, but the Keycontrol feature allows for more macros, even letting users combine the G key with other keys.

G915 X Lightspeed
Logitech

The G915 X series includes the G915 X Lightspeed ($230), G915 X Lightspeed TKL ($200) and G915 X Wired Gaming Keyboard ($180). The G915 X Lightspeed TKL is a tenkeyless version of the G915 X Lightspeed, while the G915 X doesn’t support wireless connections but is identical in almost every way to the G915 X Lightspeed. The Lightspeed models come in black or white, but the wired model is only available in black. They’re all available right now.

Moving on to the mice, the G Pro X Superlight 2 Dex Lightspeed wireless gaming mouse is an upgrade of the Pro X Superlight and Pro X Superlight 2, both of which are favorites among current and former Engadget staffers. This new mouse was designed with the help of pro esports athletes and boasts a maximum of 44K DPI, 888 IPS tracking speed and steady 8kHz polling rate performance.

G Pro X Superlight 2 Dex Lightspeed
Logitech

The Superlight 2 Dex Lightspeed has five buttons and Lightforce switches while weighing only 60 grams. It’s also compatible with Logitech’s PowerPlay wireless charging system. If you’re interested, you can get it now for $160 in black, white or pink.

For those who like the original G Pro mouse, consider the Pro 2 Lightspeed wireless gaming mouse, an improvement over the old model. The Hero 2 sensor on this one is rated for 32K DPI and over 500 IPS tracking speed. The highest polling rate for the Pro 2 Lightspeed is 1kHz.

Pro 2 Lightspeed
Logitech

Similar to the first G Pro, this one weighs 80 grams, perfect for gamers who prefer something heavier. It doesn’t support wireless charging but can work with the Pro Lightspeed receiver for 8kHz polling rates; the receiver will be sold separately for $30 in October. The mouse is available now for $140 in black, white and pink.

This article originally appeared on Engadget at https://www.engadget.com/computing/accessories/logitech-drops-an-analog-keyboard-and-new-pro-superlight-mice-180113818.html?src=rss

Snap’s fifth-generation Spectacles bring your hands into augmented reality

Snap’s latest augmented reality glasses have a completely new — but still very oversized — design, larger field of view and all-new software that supports full hand tracking abilities. But the company is only making the fifth-generation Spectacles available to approved developers willing to commit to a year-long $99/month subscription to start.

It’s an unusual strategy, but Snap says it’s taking that approach because developers are, for now, best positioned to understand the capabilities and limitations of augmented reality hardware. They are also the ones most willing to commit to a pricey $1,000+ subscription to get their hands on the tech.

Developers, explains Snap’s director of AR platform Sophia Dominguez, are the biggest AR enthusiasts. They’re also the ones who will build the kinds of experiences that will eventually make the rest of Snapchat’s users excited for them too. “This isn't a prototype,” Dominguez tells Engadget. “We have all the components. We're ready to scale when the market is there, but we want to do so in a thoughtful way and bring developers along with our journey.”

Snap gave me an early preview of the glasses ahead of its Partner Summit event, and the Spectacles don’t feel like a prototype the way its first AR-enabled Spectacles did in 2021. The hardware and software are considerably more powerful. The AR displays are sharper and more immersive, and they already support over two dozen AR experiences, including a few from big names like Lego and Niantic (Star Wars effects house Industrial Light & Magic also has a lens in the works, according to Snap.)

To state the obvious, the glasses are massive. Almost comically large. They are significantly wider than my face, and the arms stuck out past the end of my head. A small adapter helped them fit around my ears more snugly, but they still felt like they might slip off my face if I jerked my head suddenly or leaned down.

Still, the new frames look slightly more like actual glasses than the fourth-generation Spectacles, which had a narrow, angular design with dark lenses. The new frames are made of thick black plastic and have clear lenses that darken when you move outside, sort of like transition lenses.

The fifth-generation Spectacles are the first to have clear lenses.
Karissa Bell for Engadget

The lenses house Snap’s waveguide tech that, along with “Liquid Crystal on Silicon micro-projectors,” enable their AR abilities. Each pair is also equipped with cameras, microphones and speakers.

Inside each arm is a Qualcomm Snapdragon processor. Snap says the dual processor setup has made the glasses more efficient and prevents the overheating issues that plagued their predecessor. The change seems to be an effective one. In my nearly hour-long demo, neither pair of Spectacles I tried got hot, though they were slightly warm to the touch after extended use. (The fifth-generation Spectacles have a battery life of about 45 minutes, up from 30 minutes with the fourth-gen model.)

Snap's newest AR Spectacles are extremely thick.
Karissa Bell for Engadget

Snap has also vastly improved Spectacles’ AR capabilities. The projected AR content was crisp and bright. When I walked outside into the sun, the lenses dimmed, but the content was very nearly as vivid as when I had been indoors. At a resolution of 37 pixels per degree, I wasn’t able to discern individual pixels or fuzzy borders like I have on some other AR hardware.

But the most noticeable improvement from Snap’s last AR glasses is the bigger field of view. Snap says it has almost tripled the field of view from its previous generation of Spectacles, increasing the window of visible content to 46 degrees. Snap claims this is equivalent to having a 100-inch display in the room with you, and my demo felt significantly more immersive than what I saw in 2021.

The fourth-generation Spectacles (above) were narrow and not nearly as oversized as the fifth-gen Spectacles (below).
Karissa Bell for Engadget

It isn’t, however, fully immersive. I still found myself at times gazing around the room, looking for the AR effects I knew were around me. At other points, I had to physically move around my space in order to see the full AR effects. For example, when I tried out a human anatomy demo, which shows a life-sized model of the human body and its various systems, I wasn’t able to see the entire figure at once. I had to move my head up and down in order to view the upper and lower halves of the body.

The other big improvement to the latest Spectacles is the addition of full hand tracking abilities. Snap completely redesigned the underlying software powering Spectacles, now called Snap OS, so the entire user interface is controlled with hand gestures and voice commands.

You can pull up the main menu on the palm of one hand, sort of like Humane’s AI Pin, and simply tap on the corresponding icon to do things like close an app or head back to the lens explorer carousel. There are also pinch and tap gestures to launch and interact with lenses. While Snap still calls these experiences lenses, they look and feel more like full-fledged apps than the AR lens effects you’d find in the Snapchat app.

Lego has a game that allows you to pick up bricks with your hands and build objects. I also tried a mini golf game where you putt a golf ball over an AR course. Niantic created an AR version of its Tamagotchi-like character Peridot, which you can place among your surroundings.

The interface for Snapchat's AI assistant, MyAI, on Spectacles.
Snap

You can also interact with Snapchat’s generative AI assistant, MyAI, or “paint” the space around you with AR effects. Some experiences are collaborative, so if two people with Spectacles are in a room together, they can view and interact with the same AR content together. If you only have one pair of Spectacles, others around you can get a glimpse of what you’re seeing via the Spectacles mobile app. It allows you to stream your view to your phone, a bit like how you might cast VR content from a headset to a TV.

The new gesture-based interface felt surprisingly intuitive. I occasionally struggled with lenses that required more precise movements, like picking up and placing individual Lego bricks, but the software never felt buggy or unresponsive.

There are even more intriguing use cases in the works. Snap is again partnering with OpenAI so that developers can create multimodal experiences for Spectacles. “Very soon, developers will be able to bring their [OpenAI] models into the Spectacles experience, so that we can really lean into the more utilitarian, camera-based experiences,” Dominguez says. “These AI models can help give developers, and ultimately, their end customers more context about what's in front of them, what they're hearing, what they're seeing.”

CEO Evan Spiegel has spent years touting the promise of AR glasses, a vision that for so long has felt just out of reach. But if the company’s 2021 Spectacles showed AR glasses were finally possible, the fifth-generation Spectacles feel like Snap may finally be getting close to making AR hardware that’s not merely an experiment.

For now, there are still some significant limitations. The glasses are still large and somewhat unwieldy, for one. While the fifth-gen Spectacles passably resemble regular glasses, it’s hard to imagine walking around with them on in public.

Then again, that might not matter much to the people Snap most wants to reach. As virtual and mixed reality become more mainstream, people have been more willing to wear the necessary headgear in public. People wear their Apple Vision Pro headsets on airplanes, in coffee shops and other public spaces. As Snap points out, its Spectacles, at least, don’t cover your entire face or obscure your eyes. And Dominguez says the company expects its hardware to get smaller over time.

Snap's fifth-generation Spectacles are its most advanced, and ambitious, yet.
Karissa Bell for Engadget

But the company will also likely need to find a way to reduce Spectacles’ price. Each pair reportedly costs thousands of dollars to produce, which helps explain Snap’s current insistence on a subscription model, but it’s hard to imagine even hardcore AR enthusiasts shelling out more than a thousand dollars for glasses that have less than one hour of battery life.

Snap seems well aware of this too. The company has always been upfront with the fact that it’s playing the long game when it comes to AR, and that thinking hasn’t changed. Dominguez repeatedly said that the company is intentionally starting with developers because they are the ones “most ready” for a device like the fifth-gen Spectacles and that Snap intends to be prepared whenever the consumer market catches up.

The company also isn’t alone in finally realizing AR hardware. By all accounts, Meta is poised to show off the first version of its long-promised augmented reality glasses next week at its developer event. Its glasses, known as Orion, are also unlikely to go on sale anytime soon. But the attention Meta brings to the space could nonetheless benefit Snap as it tries to sell its vision for an AR-enabled world.

This article originally appeared on Engadget at https://www.engadget.com/social-media/snaps-fifth-generation-spectacles-bring-your-hands-into-into-augmented-reality-180026541.html?src=rss

watchOS 11 is out now, with new Sleep Apnea feature

Over three months after Apple introduced it at WWDC 2024, watchOS 11 is officially here. The 2024 Apple Watch update, which adds the new Vitals app, widget improvements and sleep apnea detection, is now available to install on your smartwatch.

Apple’s sleep apnea detection feature, which the company highlighted in its Apple Watch Series 10 reveal, will also work with a couple of year-old models. If you own the Apple Watch Series 9 or Apple Watch Ultra 2, you can try the feature before the new model makes it into customers’ hands later this week. Sleep apnea detection will send you an alert if the watch’s sensors detect overnight breathing disturbances. The health feature, similar to one Samsung included with the Galaxy Watch 7 earlier this year, received FDA approval last week.

watchOS 11 also introduces a new Vitals app, further beefing up Apple’s health-tracking features on its wearable. For those who wear their Apple Watch to bed for sleep tracking (and a handy alarm in the morning), Vitals collects your overnight data in one place. The app establishes baselines for your health metrics. It lets you know if any fall outside your typical range, potentially handy for spotting irregularities like oncoming illnesses or tracking the effects of alcohol use.

Similarly, the new Training Load feature measures the intensity of your workouts over time. After establishing an intensity baseline over 28 days, it shows how hard you’re pushing yourself in your workouts — comparing it with your standard averages. At launch, it supports 17 workout types, including walks, runs, cycling, rowing, swings and more. You’ll find your Training Load in the Activity app on your Apple Watch and the Fitness app on your iPhone.

Grid showing various features for watchOS 11.
Apple

Apple added a long-requested feature this year: the ability to pause and customize Activity ring goals. It hardly makes sense to keep pushing yourself (at your watch’s prodding) if you’re sick or need rest. The wearable now lets you take a break for a day, week, month or more without losing your award streaks. In addition, you can set different Activity ring goals for each day of the week and customize the data you care about most in the iOS 18 Fitness app.

The Apple Watch’s Smart Stack (the pile of widgets you see when you scroll down from your watch face) now shows widgets automatically based on context, such as rain alerts. In addition, Live Activities, which arrived on the iPhone two years ago, are also coming to the Apple Watch in the new update. You’ll find Live Activities for things like sports scores you track or an arriving Uber in the watchOS 11 Smart Stack.

Check In is a new feature that lets you notify a friend when you reach your destination. You can begin a Check In from the watchOS Messages app by tapping the plus button next to the text field, choosing Check In and entering where you’re going and when you expect to arrive. Similarly, when exercising, you can start a Check In from the Workout app: Swipe right from the workout screen and choose Check In from the controls. You can then pick a contact to share your exercise routine with.

Other features include new pregnancy tracking in the Cycles app and a Double Tap API that lets third-party developers incorporate hands-free controls.

To download watchOS 11, you’ll first need to install iOS 18 on your paired iPhone. After that, open the Watch app on your phone, then head to General > Software Update. It should then prompt you to update to the 2024 software.

This article originally appeared on Engadget at https://www.engadget.com/wearables/watchos-11-is-out-now-with-new-sleep-apnea-feature-182103629.html?src=rss

A plastic Apple Watch SE may still be happening, but not until next year

Apple announced a bunch of new products at the It’s Glowtime event on Sept. 9, but the rumored Apple Watch SE with a plastic shell wasn’t among them. That doesn’t necessarily mean we won’t see it at some point, though. According to Bloomberg’s Mark Gurman, who first reported that the company was developing an even cheaper version of the budget watch, the plastic SE “is still moving forward.” Sources told Gurman it could arrive next year.

The Apple Watch SE last got a refresh in 2022 with the release of the second-generation line. In addition to a plastic shell, plans for the rumored upcoming watch include bolder colors that would appeal to kids, Gurman reported. But Apple has apparently hit a manufacturing snag with the plastic design. Last Monday’s official Apple Watch news focused on the Series 10, which has a thinner build and larger display, as was expected in the leadup to the event. In a hands-on with the new wearable, Engadget’s Billy Steele called the Series 10’s brighter, wide-angle OLED display “a massive upgrade” compared to other recent models.

This article originally appeared on Engadget at https://www.engadget.com/wearables/a-plastic-apple-watch-se-may-still-be-happening-but-not-until-next-year-203534583.html?src=rss

Elgato’s latest Stream Deck is a $900 rackmount unit for pros

Elgato has introduced the Stream Deck Studio, a new version of its creative control tech that's firmly targeting professional broadcasters. This 19-inch rackmount console has 32 LCD keys and two rotary dials. The $900 price tag shows that this is not an entry-level purchase.

The company collaborated with broadcast software specialist Bitfocus on the Stream Deck Studio. The device can run the Companion software that works on other Stream Deck models, but also supports the company's new Buttons software. The Buttons app allows for additional interface customization designed specifically for the Stream Deck Studio.

Elgato has been expanding its Stream Deck line, which began life as a simple sidekick for livestreamers, to reach a broader range of users. For instance, it introduced an Adobe Photoshop integration aimed at visual artists. This push to reach more pro-tier customers could put Elgato into more frequent competition with rival brands like Loupedeck, which Logitech acquired last year, along with established broadcast brands like Blackmagic.

This article originally appeared on Engadget at https://www.engadget.com/computing/accessories/elgatos-latest-stream-deck-is-a-900-rackmount-unit-for-pros-215003305.html?src=rss

Google Wallet is testing turning passports into digital IDs

Google will beta test a new feature for Google Wallet that can turn US passports into a new form of digital identification, the company announced on its official blog.

Digital IDs made from passports and state-issued IDs can now be used as valid forms of identification at certain Transportation Security Administration (TSA) checkpoints at some US airports. That means you won’t have to play Beat the Clock, frantically searching your pockets for your wallet as you approach a TSA agent in line.

This new digital ID feature won’t work at every airport. As of Thursday, 21 states and the commonwealth of Puerto Rico have at least one airport that accepts digital ID. You can consult the TSA’s digital map on its website to find out if the state you’re leaving or flying to accepts digital IDs at security checkpoints.

Digital ID adoption has grown across the country for both Android and iPhone users even if it’s not available in every state. Major airports in Arizona, Colorado and Georgia started accepting state IDs from Google Wallet users. Last month, California started accepting IDs in Apple Wallet and Google Wallet at some airports.

This article originally appeared on Engadget at https://www.engadget.com/apps/google-wallet-is-testing-turning-passports-into-digital-ids-213526915.html?src=rss