You can save big today on the Elgato Stream Deck+ with $30 off the control panel on Amazon. Great for streamers or anyone who wants tactile shortcuts and dials for their workflow, the Stream Deck+ drops from its usual $200 to $170 with a discount and a clickable coupon.
Although the Stream Deck+ sacrifices some buttons compared to the cheaper Stream Deck MK.2, this model makes up for it with four dials and a touch strip. Each dial is customizable and clickable, allowing you to layer different dial shortcuts with each press inward. You can twist them to adjust things like volume, smart lights and in-game settings.
Its eight buttons are backlit and fully customizable. Streamers can use the Stream Deck desktop app to assign functions for things like muting mics, activating effects or triggering transitions. But you don’t need to be a YouTuber or Twitch streamer for it to be helpful. For example, I’m neither and use a Stream Deck daily to toggle preset macOS window arrangements through the third-party app Moom. It’s also handy for text expansion shortcuts or emojis.
The 4.2 x 0.5-inch touch strip displays labels and levels for each knob, giving you a clear visual cue about what you’re controlling with each twist. The touch-sensitive bar also supports custom long presses and page swipes.
Amazon’s sale covers both the black and white Stream Deck+ models. Make sure you click on the $10 coupon box on the product page to bring it down to $170.
This article originally appeared on Engadget at https://www.engadget.com/computing/accessories/elgatos-stream-deck-drops-to-a-record-low-of-170-in-this-early-prime-day-deal-163729012.html?src=rss
The “regular” iPhone has become like a second child. Year after year, this model has gotten the hand-me-downs from the previous version of the iPhone Pro – the older, smarter sibling. The iPhone 15 received the iPhone 14 Pro’s Dynamic Island and A16 Bionic processor, and the iPhone 14 before that got the A15 Bionic chip and a larger Plus variant with the same screen size as the iPhone 13 Pro Max. For the iPhone 16 ($799 and up), there are trickle-down items once more. But this time around, that’s not the entire story for the Apple phone that’s the best option for most people.
Surprisingly, Apple gave some of the most attractive features it has for 2024 to both the regular and Pro iPhones at the same time. This means you won’t have to wait a year to get expanded camera tools and another brand new button. Sure, Apple Intelligence is still in the works, but that’s the case for the iPhone 16 Pro too. The important thing there is that the iPhone 16 is just as ready when the AI features arrive.
So, for perhaps the first time – or at least the first time in years – Apple has closed the gap between the iPhone and iPhone Pro in a significant way. ProRAW stills and ProRes video are still exclusive to the priciest iPhones, and a new “studio-quality” four-microphone setup is reserved for them too. Frustratingly, you’ll still have to spend more for a 120Hz display. But, as far as the fun new tools that will matter to most of us, you won’t have to worry about missing out this time.
New buttons, new bump, old design
Another year has passed and we still don’t have a significant redesign for any iPhone, let alone the base-level model. As such, I’ll spend my time here discussing what’s new. Apple was content to add new colors once again, opting for a lineup of ultramarine (blueish purple), teal, pink, white and black. The colors are bolder than what was available on the iPhone 15, although I’d like to see a true blue and perhaps a bright yellow or orange. Additionally, there’s no Product Red option once again — we haven’t seen that hue since the iPhone 14.
The main change in appearance on the iPhone 16 is the addition of two new buttons. Of course, one of those, the reconfigurable action button above the volume rockers, comes from the Pro-grade iPhones. By default, the control does the task of the switch it replaces: activating silent mode. But, you can also set the action button to open the camera, turn on the flashlight, start a Voice Memo, initiate a Shazam query and more. You can even assign a custom shortcut if none of the presets fit your needs.
While Apple undoubtedly expanded the utility of this switch by making it customizable, regular iPhone users will have to get used to the fact that the volume control is no longer the top button on the left. This means that when you reach for the side to change the loudness, you’ll need to remember it’s the middle and bottom buttons. Of course, the action button is smaller than the other two, so with some patience you can differentiate them by touch.
Near the bottom of the right side, there’s a new Camera Control button for quick access to the camera and its tools. A press will open the camera app from any screen, and a long press will jump straight to 4K Dolby Vision video capture at 60 fps. Once you’re there, this button becomes a touch-sensitive slider for things like zoom, exposure and lens selection. With zoom, for example, you can scroll through all of the options with a swipe. Then with a double “light press,” which took a lot of practice to finally master, you can access the other options. Fully pressing the button once will take a photo — you won’t have to lift a finger to tap the onscreen buttons.
Around back, Apple rearranged the cameras so they’re stacked vertically instead of diagonally. It’s certainly cleaner than the previous look, and the company still favors a smaller bump in the top left over something that takes up more space or spans the entire width of the rear panel (Hi Google). The key reason the company reoriented the rear cameras is to allow for spatial photos and videos, since the layout now enables the iPhone 16 to capture stereoscopic info from the Fusion and Ultra Wide cameras.
Photographic stylin’
The iPhone 16 and 16 Plus have a new 48-megapixel Fusion camera that packs a quad-pixel sensor for high resolution and fine detail. Essentially, it’s two cameras in one, combining – or fusing, hence the name – a 48MP frame and a 12MP one that’s fine-tuned for light capture. By default, you’ll get a 24MP image, one that Apple says offers the best mix of detail, low-light performance and an efficient file size. There’s also a new anti-reflective coating on the main (and ultrawide) camera to reduce flares.
The 12MP ultrawide camera got an upgrade too. It now has a faster aperture and larger pixels for better performance in low-light conditions. There’s also a new macro mode, unlocked by autofocus and able to capture minute detail. This is one of my favorite features, since sharp images of smaller objects were previously reserved for the Pro models, and the macro tool has worked well for me so far.
The iPhone 16, like its predecessors, takes decent stills. You’ll consistently get crisp, clean detail in well-lit shots and realistic color reproduction that doesn’t skew too warm or too cool. At a concert, I found the iPhone 16’s low-light performance noticeably better than the iPhone 15’s. Where the previous model struggled at times in dimly lit venues, my 2x zoom shots with this new model produced better results. There wasn’t a marked improvement across the board, but most of the images were certainly sharper.
Macro mode on the iPhone 16 camera is excellent. (Billy Steele for Engadget)
The most significant update to the camera on the iPhone 16 is Photographic Styles. Apple has more computational image data from years of honing its cameras, so the system has a better understanding of skin tones, color, highlights and shadows. Plus, the phone is able to process all of this in real time, so you can adjust skin undertones and mood styles before you even snap a picture. Of course, you can experiment with them after shooting, and you can also assign styles to a gallery of images simultaneously.
Photographic Styles are massively expanded and way more useful, especially when you use them to preview a shot before you commit. My favorite element of the updated workflow is a new control pad where you can swipe around to adjust tone and color. There’s also a slider under it to alter the color intensity of the style you’ve selected. For me, the new tools in Photographic Styles make me feel like I don’t need to hop over to another app immediately to edit since I have a lot more options available right in the Camera app.
As I’ve already mentioned, Camera Control is handy for getting quick shots, and the touch-sensitivity is helpful with settings, but I have some gripes with the button. Like my colleague Cherlynn Low mentioned in her iPhone 16 Pro review, the placement causes issues depending on how you hold your phone, and may lead to some inadvertent presses. You can adjust the sensitivity of the button, or disable it entirely, which is a customization you might want to explore. What’s more, the touch-enabled sliding controls are more accurately triggered if you hold the phone with your thumbs along the bottom while shooting. So, this means you may need to alter your grip for prime performance.
Like I noted earlier, the new camera layout enables spatial capture of both video and photos on the iPhone 16. This content can then be viewed on Apple Vision Pro, with stills in the HEIC format and footage at 1080p/30fps. It’s great that this isn’t reserved for the iPhone 16 Pro, but the downside (for any iPhone) is file size. When you swipe over to Spatial Mode in the camera app, you’ll get a warning that a minute of spatial video is 130MB and a single spatial photo is 5MB. I don’t have one of Apple’s headsets, so I didn’t spend too much time here since the photos and videos just appear normal on an iPhone screen.
I’d argue the most significant advantage of Spatial Mode is Audio Mix. Here, the iPhone 16 uses the sound input from the spatial capture along with “advanced intelligence” to isolate a person’s voice from background noise. There are four options for Audio Mix, offering different methods for eliminating or incorporating environmental sounds. Like Cherlynn discovered on the iPhone 16 Pro, I found the Studio and Cinematic options work best, with each one taking a different approach to background noise. The former makes it sound like the speaker is in a studio while the latter incorporates environmental noise in surround sound with voices focused in the center – like in a movie. However, like her, I quickly realized I need a lot more time with this tool to get comfortable with it.
iOS 18 is still waiting on Apple Intelligence
Apple proudly proclaimed the iPhone 16 is “built for Apple Intelligence,” but you’ll have to wait a while longer to use it. That means things like AI-driven writing tools, summaries of audio transcripts, a prioritized inbox and more will work on the base iPhone 16 when they arrive, so you won’t need a Pro to use them. Genmoji and the Clean Up photo-editing assist are sure to be popular as well, and I’m confident we’re all ready for a long overdue Siri upgrade. There’s a lot to look forward to, but none of it is ready for the iPhone 16’s debut. The iOS 18.1 public beta arrived this week, though, so we’re inching closer to a proper launch.
Sure, it would’ve been nice for the excitement around the new iPhones to include the first crack at Apple’s AI. But I’d rather the company fine-tune things before a wider release to make sure Apple Intelligence is fully ready and, more importantly, fully reliable. Google has already debuted some form of AI on its Pixel series, so Apple is a bit behind. Still, I’d rather wait longer for a useful tool than rush a company into shipping buggy software.
What will be available on launch day is iOS 18, which delivers a number of handy updates to the iPhone, many of which deal with customization. For the first time, Apple is allowing users to customize more than the layout of their Home Screen. You can now apply tints and colors to icons, resize widgets and apps, and lock certain apps to hide sensitive info. The Lock Screen controls can also be swapped for the ones you use most often, which is handier now that the iPhone 16 has a dedicated camera button on its frame. There’s a big overhaul to the Photos app too, mostly focused on organization, that provides a welcome bit of automation.
Performance and battery life
The iPhone 16 uses Apple’s new A18 chip with a 6-core CPU and 5-core GPU. There’s also a 16-core Neural Engine, the same count as in both the iPhone 15 and the iPhone 16 Pro. With the A18, the base-level iPhone jumped two generations ahead compared to the A16 Bionic inside the iPhone 15. The new chip provides the necessary horsepower for Apple’s AI and demanding camera features like Photographic Styles and the Camera Control button. I never noticed any lag on the iPhone 15, even with resource-heavy tasks, and those shouldn’t be a problem on the iPhone 16, either. But we’ll have to wait and see how well the iPhone 16 handles Apple Intelligence this fall.
Of course, the A18 is more efficient than its predecessors, which is a benefit that extends to battery life. Apple promises up to 22 hours of local video playback on the iPhone 16 and up to 27 hours on the 16 Plus. For streaming video, those numbers drop to 18 and 24 hours respectively, and they’re all slight increases over the iPhone 15 and 15 Plus.
Starting at 7AM, I ran my battery test on the iPhone 16 and had 25 percent left at midnight. That’s doing what I’d consider “normal” use: a mix of calls, email, social, music and video. I also have a Dexcom continuous glucose monitor (CGM) that’s running over Bluetooth and I used the AirPods 4 several times during the day. And, of course, I was shooting photos and a few short video clips to test out those new features. While getting through the day with no problem is good, I’d love it if I didn’t have to charge the iPhone every night, or rely on low-power mode to avoid doing so.
On a related note, Apple has increased charging speeds via MagSafe, where you can get a 50 percent top up in around 30 minutes via 25W charging from a 30W power adapter or higher.
Wrap-up
With the iPhone 16, Apple has almost closed the gap between its best phone for most people and the one intended for the most demanding power users. It’s a relief to not pine for what could be coming on the iPhone 17 since a lot of the new features on the iPhone 16 Pro are already here. And while some of them will require time to master, it’s great that they’re on the iPhone 16 at all. There are some Pro features you’ll still have to spend more for, like ProRAW photos, ProRes video, a 120Hz display, a 5x telephoto camera and multi-track recording in Voice Memos. But those are luxuries not everyone needs. For this reason, the regular iPhone will likely suit your needs just fine, since splurging on the high-end model has become more of an indulgence than a necessity.
This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/apple-iphone-16-and-iphone-16-plus-review-closing-the-gap-to-the-pro-120050824.html?src=rss
Apple Intelligence is edging closer to being ready for primetime. Apple has released the public beta of iOS 18.1, which includes some of the major generative AI features that the company has been talking up over the last few months.
We'll have to wait a few more weeks for the public versions of iOS 18.1, iPadOS 18.1 and macOS Sequoia 15.1 to bring Apple Intelligence features to everyone with a compatible device. The public betas should be more stable and less risky to install than the developer betas, but it's still definitely worth backing up your data to your computer and/or iCloud before putting this build of iOS 18.1 on your iPhone.
Right now, the only iPhones that support Apple Intelligence are the iPhone 15 Pro and iPhone 15 Pro Max, but that will change on Friday when Apple ships the iPhone 16 lineup. M-series iPads and Macs will support Apple Intelligence too.
For now, you'll need to have your device and Siri language set to US English to access Apple Intelligence tools. If you want to use Apple Intelligence in a language other than English (or in a localized version of English), you may need to wait until at least December for the public versions of the operating systems that support it.
Apple is gradually rolling out Apple Intelligence tools over the coming months, so not all of them will be available right away. The initial wave of features includes the ability to transcribe phone calls (and audio notes in the Notes app) and get summaries of the key details. Writing tools (rewriting, proofreading and summarizing), email prioritization and smart replies, notification summaries and the Clean Up photo-editing feature are also on the docket. You'll be able to create memories in the revamped Photos app and check out the first incarnation of the redesigned, glowing Siri (including the ability to type requests to the assistant).
You'll need to wait longer for certain other features, including ChatGPT integration, Genmoji, Image Playground (i.e. image generation) and Siri's ability to better understand personal context. Apple will roll those out over the coming months.
How to get the new Apple Intelligence features
On your iPhone, go to Settings > General > Software Update > Beta Updates and select the iOS 18 public beta option. Once the iOS 18.1 public beta is available for your device, you'll be able to see it on the software update page. You might need to free up some space before you can install the beta. To enable Apple Intelligence, go to Settings > Apple Intelligence & Siri > Join the Apple Intelligence waitlist.
The public beta installation process is almost identical on iPad. On your Mac, you'll need to go to System Settings > General > Software Update. Click the info symbol next to the “Beta updates” option and you should be able to install the macOS Sequoia 15.1 public beta from there when it's available.
This article originally appeared on Engadget at https://www.engadget.com/ai/the-ios-181-public-beta-is-here-bringing-apple-intelligence-almost-to-the-masses-175248580.html?src=rss
Google is rolling out a really useful update for Google Password Manager, allowing users to sync passkeys across their many devices. Up until this point, folks could only save passkeys to Google Password Manager on Android, so the cross-device utility was limited. It was possible to use the passkeys on other devices, but it would require users to scan a QR code.
The update allows for passkey saving via Google Password Manager on Windows, macOS, Linux and, of course, Android. ChromeOS is currently being beta tested, so that functionality should come sooner rather than later. Google also says that iOS support is “coming soon.”
Once saved, the passkey automatically syncs across other devices using Google Password Manager. The company says this data is end-to-end encrypted, so it’ll be pretty tough for someone to go in and steal credentials.
For the uninitiated, a passkey is a digital credential that lets users sign in to an account without a password. Google has been using passkeys across its software suite since last year.
Today’s update also brings another layer of security to passkeys on Google Password Manager. The company has introduced a six-digit PIN that will be required when using passkeys on a new device. This would likely stop nefarious actors from logging into an account even if they've somehow gotten ahold of the digital credentials. Just don’t leave the PIN written on a sheet of paper lying right next to your computer.
Google passkeys can already be used with the company’s productivity software, of course, but also with Amazon, PayPal and WhatsApp. Google Password Manager is built right into Chrome and Android devices.
This article originally appeared on Engadget at https://www.engadget.com/apps/google-passkeys-can-now-sync-across-devices-on-multiple-platforms-160056596.html?src=rss
It may seem like Apple is behind the competition a lot of the time. The company appeared to be slow to developments like widgets, bezel-less displays with camera notches and screens with high refresh rates. And with the iPhone 16 Pro, it appears to once again be late to the party, bringing generative-AI features and a real button for the camera to its 2024 flagship. But if you'll allow me to play therapist for a moment, I think it's not that Apple is slow. I think Apple is cautious. Perhaps overly so.
Caution on its own isn't a bad trait — in fact, it could be considered thoughtful. Rather than rush to the cutting edge with its peers, Apple deliberates, usually finding a slightly different approach that is often an improvement on what's out there. Just look at the Vision Pro headset or Apple Silicon. Or even the iPod, the iPad and the AirPods, which were far from the first of their kind when they launched.
With the iPhone 16 Pro, the focus is on cameras and Apple Intelligence. The problem is, Apple Intelligence isn't quite here yet. We can test some features in the developer beta that's currently available, but that's not necessarily the same as the experience the public will get when the update rolls out in October. It’s not unprecedented for new iPhones to launch without some marquee features, sure, and thankfully there's still plenty that the iPhone 16 Pro brings. From Camera Control, the Fusion Camera and other video-related updates to slightly bigger displays and iOS 18, the iPhone 16 Pro and Pro Max are intriguing successors, even absent the vaunted Intelligence features that are still to come.
Video review of the iPhone 16 Pro
The iPhone 16 Pro’s design and displays
I’m getting deja vu. Looking back at my review of the iPhone 15 Pro, I see a picture of that phone and its predecessor lined up side by side to show just how much thinner the bezels are. Apple has once again trimmed the borders on its flagship phones, but while doing that enabled it to reduce the handsets’ size in 2023, this year it allowed the company to cram in larger screens without much change in footprint.
The iPhone 16 Pro and Pro Max displays have increased in size from 6.1 inches and 6.7 inches up to 6.3 inches and 6.9 inches, respectively. Both handsets have grown ever so slightly, too, by just under 1mm in width and about 3mm in height.
Basically, the iPhone 16 Pro and Pro Max are a hair wider and taller than their predecessors, but maintain the same 8.25mm (0.32-inch) profile. And yet, in spite of this minimal change, you won’t be able to keep your old cases if you’re upgrading from an iPhone 15 Pro to an iPhone 16 Pro.
Not only would the cases not quite fit, you’d also need something with either a cutout or a sapphire crystal and conductive layer to be able to use the new Camera Control. Of course, Apple sells compatible cases, as do some third parties like Otterbox, so you have plenty of options.
I’ve spent most of this year’s hardware review season remarking how Samsung and Google’s flagships feel like iPhones, and I’ve now reached a strange inception point. As I’ve been comparing competing phones for this review, I’ve been surrounded by about a dozen handsets from all these different companies on my couch, including last year’s iPhones, the Galaxy S24 Plus and the Pixel 9 Pro and Pro XL. Trying to figure out which one is the iPhone has become more confusing than ever, as they all feel similar in build. The best way to verify at a glance is looking at their camera arrays or my wallpaper.
All that is to say that the iPhone 16 Pro feels similar to its predecessor, which is what these other companies have been attempting to emulate. Apple would be right to feel flattered by this imitation, and yet I have to wonder if it’s time to do something different. Google’s Pixel 9 Pro is actually a whole six grams lighter than the iPhone 16 Pro at 221 grams (7.79 ounces), and I’m absolutely smitten by its rich pink hue and shiny edges. Though I like the new golden Desert color for the iPhone 16 Pro, I do wish Apple’s premium flagship had more fun and vibrant exteriors. That said, I do love the base iPhone 16 in pink, teal and Ultramarine.
Camera Control is (not) just a button
Arguably the biggest change to the iPhone 16 lineup, the iPhone 16 Pro included, is the introduction of Camera Control. This is a button on the right side of the device with touch and pressure sensors that enable greater control via swipes and semi-presses. (That’s in addition to the Action button on the top left, which was added to last year’s Pros and carries over to the iPhone 16 and iPhone 16 Plus, too.)
One of the things this was supposed to do was let you push lightly on the button to trigger focus, similar to half-pressing a DSLR shutter button. That function won’t be available at launch, so I can’t say whether it’s effective.
But by and large, Camera Control is a very Apple approach to a feature that has been around for years. From phones by Sony and Nokia with dedicated shutter buttons to Android handsets with hardware-based double-click shortcuts, the notion of quick access to your camera without having to futz with the screen is a popular one. For good reason, too — I’ve hated having to swipe or long-press the icon on my iPhone’s lock screen in the past, and even though I could set the iPhone 15 Pro’s Action button to open the camera, it just wasn’t positioned well and I’d have to give up my mute button.
So Apple isn’t breaking new ground with its hardware shortcut for a frequently used app. But it does do a few things differently with the touch sensor. You can swipe on it to tweak things like exposure, zoom levels and tone, and the half-press still works as a way to select options or go back out of menus within the new Camera Control interface. In theory, it’s a nice way to make changes on the fly.
In reality, there were a few issues, and they largely have to do with placement. The button sits a little farther from the base of the phone than I’d like, so my fingers have to reach a bit more to press it, whether I was in landscape or portrait mode. This wasn’t usually a problem when I had both hands free and could steady the iPhone with my other hand and readjust my grip.
But if you’re trying to take a quick shot with just one hand, the button’s location can feel unintuitive. Of course, everyone has different finger lengths and ratios, so it’s entirely possible that other people find this logical. It also depends on your grip — if you’re cradling the bottom of the device in your palm, it’s harder to maneuver. If you’re covering part of the screen and reaching for the button head on, it’s slightly easier to use Camera Control.
Still, even for those with the strongest claws, swiping and half-pressing and double-half-pressing on the sensor is tricky. I was only ever really able to do that if I had my thumb holding up the bottom edge and my middle, ring and little fingers steadying the right end of the phone. Maybe this is a new camera grip I just need to relearn for this button.
The awkward placement is a minor gripe compared to what I found most annoying: the button’s touch sensor. Not only was it difficult to swipe through different settings when holding the device with one hand, it also reacts to accidental touches and swipes. Sometimes, the phone would slide down my palm and change the exposure or zoom level, completely ruining the vibe. I should point out that you can go into accessibility settings to either tweak the swipe sensitivity or turn it off altogether, if it really bothers you. Honestly, if you’re planning on making adjustments with Camera Control, it’s best to have time, patience and both hands free.
In those situations, I had a lot of fun editing settings and watching them be reflected in the viewfinder in real time. I also liked zooming in and out of subjects, recomposing a shot and tweaking exposure till I liked what I saw, before then pushing down to snap the picture. (This action does lead to some small issues, but more on the actual photo quality later.) I especially loved this while recording video, since it makes slowly zooming in or out of a subject smoother than using the onscreen slider.
Then again, for scenarios where I just want to fire off a quick shot without worrying about exposure or zoom settings, the pain of finagling with the sensor mostly goes away. In exchange, being able to rapidly snap pictures is a joy. I found myself taking more pictures than ever thanks to Camera Control, which, if you know me, is a feat worthy of the Guinness Book of Records.
A random person cut me off in line? Click. Funny sign on a building I pass by in a Lyft? Click, click. From your lock screen, you’ll have to press the button twice — once to wake the phone up and once to open the camera. Then press again to take the photo. It’s not ideal, but not too far off the same process on a Pixel phone, for instance. Plus, you can long-press the iPhone’s button to start recording a video, and it’ll automatically stop when you let go.
This sort of rapid access to the camera is the best thing about the new button, and I could see it being potentially useful not just for shutterbugs like me, but for the upcoming Visual Intelligence feature that Apple teased at its launch event. The company’s version of Google Lens could allow people to ask questions about things in the real world around them. But of course, since this wasn’t available during my review period, I wasn’t able to test it.
For now, you can go into Settings to change the number of clicks it takes to trigger the camera app, remap the button to the Code Scanner or the Magnifier tool, or disable it altogether. Since you can also set up the Action button to do these things, you have more choices now over where you want your camera shortcut, and you can free up the former mute switch for something else.
The iPhone 16 Pro: Fusion camera for fast and slow moments
Even if you’re not a glutton for buttons, there are still some camera updates that might intrigue you. This year’s flagships sport what Apple calls a 48-megapixel Fusion Camera, which has a faster quad-pixel sensor. This enables what the company describes as “zero shutter lag,” which is wording it has used repeatedly over the years. In this case, it’s referring to how quickly the camera will capture a shot after you press the shutter button (onscreen or hardware).
I will admit I was initially confused by this update, in part because it requires unlearning behaviors I had adopted to mitigate the shortfalls of older cameras. Basically, the iPhone 16 Pro’s cameras are now so fast that when I asked someone to throw something so I could capture it in motion and see how well the camera froze the action, my shots ended up being of the person still holding the object.
Our video producer and I were very confused, and it wasn’t until the “zero shutter lag” concept was explained to me more clearly that I got it. I had become used to pressing the shutter early since cameras, in my experience, would be a fraction of a second slow. The iPhone 16 Pro is now so fast that it captured the literal moment I tapped the button, instead of the split second after, when the object was in mid-air.
This is going to change how people take jump shots, I’m sure, but basically if you and your friends are taking pictures of yourselves floating in the sky, the photographer doesn’t have to hit capture before telling you to jump. I know this is a very specific and silly example, but it’s also the most relatable illustration of how much quicker the Fusion camera is.
Also, why can’t camera stories be silly and fun? That’s what a lot of the best moments in life are, and some of the new features are great in those situations. The support for 4K video at 120 fps in Dolby Vision, for example, led to some beautiful high-quality, rich and colorful clips of my friend’s adorable Pomeranian trotting along on a walk. Her little tongue slowly peeking out as she bounded towards the camera looked crisp and smooth when I played it back at 25 percent and 20 percent speeds, too.
The iPhone 16’s new Photographic Styles are excellent
Depending on your mood, the new Photographic Styles can be fun or serious. Apple’s tweaked the built-in camera filters to not only offer more options but give you greater control. Thanks to the company’s yearly refinements to its processing, there’s also an improved depth map captured when it detects a face in the scene. This, combined with a greater focus on color science around skin tone, has led to what might be my favorite new iPhone 16 feature.
Whether I shot them in Portrait mode or not, photos of people that I took using the iPhone 16 Pro were a dream to edit. Simply switching between the Standard, Natural, Luminous, Quiet or Ethereal styles already resulted in improvements to the colors and shadow, but I could also tap on each thumbnail to access the new editing touchpad and drag a dot around. This let me more precisely tweak the hues and contrast levels, and an additional slider below let me adjust how warm the image was.
Cherlynn Low for Engadget
An ugly selfie with my cousin in the hideous overhead lights of a meeting room became a beautiful snapshot after I switched to the Ethereal or Luminous styles. Both of those are quickly becoming my favorites, but I’m more impressed with how well Apple was able to segment the subject from the background. In almost every shot I edited, adjusting the slider mostly changed just the background, keeping people and their complexions within the realm of reality instead of applying harsh oversaturation or extreme contrast levels to them. The styles also added a background blur that lent a pleasant soft focus effect, and most of the time the system accurately identified the outlines of people in the scene.
Perhaps my favorite part is the fact that you can change between styles after you’ve shot the photo on the iPhone 16. As someone who dwells on her Instagram filters and edit tools for some time before each post, I definitely appreciate how much nicer Apple’s versions are and only wish I could retroactively apply them to photos I had taken at a recent wedding. Alas, since the edits are dependent on information captured when the photos were taken, these new retouching features will only work for pictures taken with an iPhone 16 or 16 Pro.
Audio Mix on the iPhone 16 is… mixed
One final camera update I’ll touch on before telling you about actual photo quality is Audio Mix. This uses the spatial audio now recorded by default with the new studio mics on the iPhone 16 Pro (or the mic system on the iPhone 16 and 16 Plus) to understand the direction of sound sources in your footage. Then, when you edit the clip, you can choose between Standard, In-frame, Studio and Cinematic mixes, as well as drag a slider to reduce background noise.
You’ll have to be recording in fairly specific acoustic scenarios to get the most out of Audio Mix. I tested it in a variety of situations: my cousin talking on his phone on a busy New York street, me interviewing my fellow gym buddies after a tiring workout with background music quietly playing, and my friend talking to me while his wife talked about something else off-camera in their fairly quiet kitchen.
For the most part, going to Cinematic or Studio modes from Standard resulted in a noticeable reduction in environmental noise. My favorite is Studio, which generally seemed to improve voice clarity as well, making people sound like they could be talking on a podcast. In-frame, however, rarely did what I expected and occasionally produced some warped distortion. It appears there might need to be more distance between various sources of sound for this to work best, and I’d have to spend more time testing to better understand this tool. You can check out our review video for examples of a clip with different audio mixes, but for now, while the promised improvements aren’t what I expected, there at least appears to be some benefit to Audio Mix.
Cherlynn Low for Engadget
The iPhone 16 Pro’s photos versus the Pixel 9 Pro
On to the actual photos and how they hold up against the competition. I’ve long considered Google’s Pixel phones to be the gold standard in smartphone photography, since I prefer the company’s color and detail processing. I know some people feel that Google tends to oversharpen, so bear in mind that, as with most things, your preference may be different from mine.
When I compared photos I took with both phones on the same laptop screen, the differences were minimal. Occasionally, Google would expose better, retaining shadows near a bright light source where the iPhone 16 Pro couldn’t. But the Pixel’s nightscape shots had more light leakage into the sky, whereas Apple was more adept at keeping the background dark against the outline of a skyscraper.
Honestly, at this point we’re really nitpicking and pixel-peeping to find differences. Both companies deliver great cameras, and though I still prefer Google’s approach to Portrait shots, Apple has been slowly but surely closing the gap with improvements to its depth maps every year.
I will mention, though, that a lot more of the photos I shot on the iPhone 16 Pro came out blurrier than on the Pixel 9 Pro, and it might have to do with the fact that I was using the Camera Control to snap them. This was the issue I alluded to earlier: using a physical button to take a picture is more likely to introduce shake than a software shutter. It’s not like Samsung or Google phones are immune to this problem, but the way Camera Control is built, with a recessed button that depresses into the phone’s frame, leaves it a bit more vulnerable to shake than, say, a volume rocker.
Cherlynn Low for Engadget
Oh and finally, a quick note for my Gen Z readers: I know how much you all prefer flash photography to night modes in low light scenarios. (Thanks to my much younger cousin for the valuable insight.) I’ve done the testing and can say that I prefer the Google Pixel 9 Pro’s softer, warmer flash to the iPhone 16 Pro’s, which is stronger and brighter, leaving my face looking washed out.
iOS 18 is here, but not Apple Intelligence
It’s been about two months since the public beta for iOS 18 was released, and it was nice to get a taste of upcoming features like the new customizable home pages, expanded Tapback reactions and the redesigned Photos app. With the iPhone 16 launch, iOS 18 is basically ready for primetime… with some caveats.
This year, more than ever, it’s hard to figure out what’s coming to your iPhone and what isn’t. With the release of Apple Intelligence slated for October, features like writing tools, Cleanup for photos and the redesigned Siri won’t be ready till next month. And even then, your non-pro iPhone 15 won’t be compatible.
It’s confusing, and can make choosing your iPhone a trickier decision. But for this review, at least the iPhone 16 Pro and Pro Max are getting everything. I cannot wait to try out multi-track recording in Voice Memos, and I hope Apple sees this yearning as a sign that it should bring this to more devices.
It was nice to get time with iOS 18, even in the absence of Apple Intelligence. Honestly, I’m not even sure I’d like those features that much. Similarly, the Gemini AI features were nice on the Pixel 9 Pro series, but they didn’t feel like must-haves.
One of the new iOS 18 touches I noticed immediately was the refreshed Control Center, which took some getting used to, as I had to relearn how to swipe back to the home page now that there are more pages to scroll through. I especially enjoyed seeing the new little chat bubble appear on my voice recordings, indicating that a transcript had been generated for them. And though I haven’t exchanged messages with Android-toting friends yet, I’m glad to see RCS support is finally live this week.
Brian Oh for Engadget
Though I was excited for the new custom routes tool in Maps, I struggled to actually create them. You can set your start and end points and have the app close the loop for you, or just tap landmarks or points on the map to get the route to basically connect the dots. Unfortunately, no matter how many times I tried to get the route to cut through a building where I knew a pedestrian walkway existed, Maps resisted me at every turn, forcing the route to go through more established (and therefore more crowded) paths instead. It’s not unreasonable, but certainly not the open-world route-creation feature I was envisioning.
The best thing about iOS 18, and about some new features in the iPhone 16 lineup (like the Camera Control), is the customizability. I do appreciate that if you don’t like something, you can usually turn it off. With the new ability to place apps outside of a rigid grid, you can now lay your home screen out just the way you like. The redesigned Photos app lets you create and pin collections so you can more easily find the pictures most important to you. And again, I’m glad Apple is giving people the option to turn off Camera Control altogether or adjust its sensitivity.
Performance and battery life
The iPhone 16 Pro and Pro Max are powered by Apple’s A18 Pro chip, which is built on “second-generation 3-nanometer technology and [features] a new architecture with smaller, faster transistors.” All this is meant to deliver “unprecedented efficiency,” according to Apple’s press release.
Some small software glitches aside, I’ve never run into slowdown on the iPhone 16 Pro, but I was certainly surprised by the smaller handset’s battery life. In general, the iPhone 16 Pro would barely last a full day, which is reminiscent of the iPhone 15 Pro, too. It’s worth noting that before this review I was primarily using an iPhone 15 Pro Max as my daily driver, which usually gets through a day and a half with no problem, so the drop in endurance is even more pronounced for me.
Most days, I’d pick up the iPhone 16 Pro at about 9AM and would start getting low battery alerts by about 9PM. If I started the day a bit later, closer to 11AM for instance, I got to 1AM before the iPhone 16 Pro ran completely dry. On Sunday, I unplugged the phone at about 9:30AM and was shocked on the train home to get a warning that remaining power was at just 20 percent. It was only 6:50PM, and the night had barely started!
You’ll get significantly better battery life on the iPhone 16 Pro Max, which delivers the same almost two-day runtime as its predecessor. And sure, a phone with a smaller battery not lasting as long makes mathematical sense. But considering the Pixel 9 Pro is a comparably sized handset and manages to last about two days, there’s no excuse for the iPhone 16 Pro to conk out before the night is up.
Brian Oh for Engadget
Which iPhone 16 should you get?
One of the best things about the iPhone 16 Pro lineup is that, unlike last year, there isn’t much of a tradeoff in cameras if you opt for the smaller device. The iPhone 15 Pro Max had a 5x telephoto zoom camera, while the iPhone 15 Pro only went up to 3x. As a budding photographer of skittish wild animals, I opted for the Max, especially since it was much lighter than its predecessor thanks to the titanium build.
With the iPhone 16 Pro having essentially the same camera system as the Pro Max, I thought it was time for me to go back to a size that was much easier on my hands. Alas, with the disappointing battery performance, I might just have to stick with a Max, and you might too.
There are also the non-Pro iPhone 16 models to consider, and you can check out my colleague Billy Steele's review of the iPhone 16 and iPhone 16 Plus for more details. Just as there are fewer differences than ever between the Pro and Pro Max, the tradeoffs between the Pro and non-Pro lines aren’t as significant this year, either. Apple brought the previously Pro-exclusive Action button to the iPhone 16 and iPhone 16 Plus, while also including the Camera Control on its less-premium phones.
The main things that set the two lines apart this year are processors, screen quality, camera sensors and onboard mics. You’ll lose support for ProRaw photos and multi-layer recording by opting for the cheaper devices, too. Basically, if you want all the best features Apple has to offer, or you plan on using your phone to create high-quality videos and get 5x telephoto zoom in your photos, the Pros are the way to go.
Otherwise, the cheaper models still get all the iOS 18 and Apple Intelligence features coming to the Pros, as well as spatial audio recording, which enables the Audio Mix feature I described in the camera section earlier.
Cherlynn Low for Engadget
Wrap up
Apple’s caution is sometimes warranted. Especially at a time when mistrust of AI-generated content runs rampant, the company taking its time to get Apple Intelligence right is understandable. But its deliberation doesn’t always lead to winners. While I appreciate the attempt to differentiate the Camera Control with a touch sensor for more versatility, I’m not yet convinced of its usefulness.
The good news is, and I cannot stress this enough, you have the option to tune it to your liking. And that’s a theme I’m seeing in recent Apple features that hints at more thoughtfulness than usual. If you don’t like something, or if something isn’t right for your needs, you can adjust or disable it. In iOS 18, you have greater control over your home screen’s app layout and can pin custom collections for easier reach in the Photos app. The Action button introduced last year could have been a spectacular fail had Apple not let you keep it as a mute switch; instead, it gave people more functionality while maintaining the status quo for those resistant to change.
Change is scary. Change is hard. But without change there is no progress. Apple’s cautious approach is a tricky balancing act that’s evident on the iPhone 16 Pro. Some new features, like Audio Mix and custom routes in Maps, deliver mixed results. Others, like Photographic Styles, are hits. Then there are the basic ingredients, like good battery life and durable, attractive designs, that Apple cannot neglect.
The iPhone 16 Pro’s subpar battery life holds it back from beating the competition, which is stiffer than ever this year, especially from Google. Luckily for Apple, most people who have iPhones are going to stick with iPhones — it’s just easier. For those already sucked into the ecosystem, the iPhone 16 Pro (and particularly the Pro Max) are worth the upgrade from a model that’s at least two years old. If you already have an iPhone 15 Pro (or even a 14 Pro), for the sake of our planet and your wallet, you might prefer to hold off on upgrading, especially since this year’s devices aren’t that much different.
This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/iphone-16-pro-and-pro-max-review-apple-focuses-on-cameras-and-customization-120052459.html?src=rss
Since first introducing its generative AI assistant, Snap has been steadily ramping up the amount of AI in its app. Now, the company is adding a new slate of AI-powered features as it begins testing a larger redesign of the app.
Snap often brings new AI features to its Snapchat+ subscribers first, and the company is continuing the trend with a new feature called “My Selfie.” The feature uses selfies to create AI-generated images of users and their friends (if they also subscribe) in creative poses and situations. The company is also rolling out a new "grandparents lens" that uses AI to imagine what you might look like as a senior citizen.
Snapchat+ subscribers will also get access to a new AI feature in Memories, which tracks users’ previously saved snaps. With the change, Memories will be able to surface photos and videos that have been edited with AI-generated captions or new AR lens effects.
Additionally, Snap is making its ChatGPT-powered My AI assistant more powerful with the ability to “problem solve” based on photo snaps. The company says the assistant will be able to translate restaurant menus, identify plants and understand parking signs.
The new "simplified" Snapchat design.
Snap
The new AI capabilities arrive as Snap is starting to test a larger redesign of its app that’s meant to make Snapchat, long criticized for a confusing interface, simpler and more intuitive. The new design will bring conversations between friends and Stories content into a single view, with Stories at the top of conversations. (Interestingly, Snap combined users’ chats and Stories into a single feed once before, in a widely unpopular 2018 redesign.) The redesign will also eliminate a separate tab for the Snap Map, placing it instead in the “chat” tab.
And instead of keeping separate sections for Spotlight and Stories, Snap will combine the two into a single "Watch" feed that will algorithmically recommend content. While the current iteration of Snapchat has five distinct sections, the “simplified” version will have just three, including the camera, which will still be the first screen users see upon launching the app.
Snap has struggled with major redesigns in the past and the company says it intends to roll out the new look slowly, with only a small number of users getting the update to start.
This article originally appeared on Engadget at https://www.engadget.com/social-media/snap-is-redesigning-snapchat-and-adding-new-ai-powers-171552703.html?src=rss
After years of scrutiny over its handling of teen safety on its platform, Meta is introducing a new type of account that will soon be required for all teens under 16 on Instagram. The new “teen accounts” add more parental supervision tools and automatically opt teens into stricter privacy settings that can only be adjusted with parental approval.
The changes are unlikely to satisfy Meta’s toughest critics, who have argued that the company puts its own profits ahead of teens’ safety and wellbeing. But the changes will be significant for the app’s legions of younger users who will face new restrictions on how they use the app.
With teen accounts, kids younger than 16 will be automatically opted into Instagram’s strictest privacy settings. Many of these settings, like automatically private accounts, the inability to message strangers and the limiting of “sensitive content” have already been in place for teenagers on Instagram. But younger teens will now be unable to change these settings without approval from a parent.
And, once a parent has set up Instagram’s in-app supervision tools, they’ll be able to monitor which accounts their kids are exchanging messages with (parents won’t see the contents of those DMs, however) as well as the types of topics their children are seeing posts about in their feeds. Parents will also have the ability to limit the amount of time their kids spend in the app by setting up “sleep mode” — which will mute notifications or make the app inaccessible entirely — or reminders to take breaks.
Meta
The changes, according to Meta, are meant to “give parents greater oversight of their teens’ experiences.” While the company has had some parental supervision features since 2022, those features were optional and required teens to opt in to the controls. Teen accounts, on the other hand, will be mandatory for all teens younger than 16, and the more restrictive settings, like whether an account can be made public, can’t be adjusted without parental approval.
The company says it also has a plan to find teens who have already lied about their age when setting up their Instagram account. Beginning next year, the company will use AI to detect signs an account may belong to a teen, like the age of other linked accounts and the ages on the accounts they frequently interact with, to find younger users trying to avoid its new restrictions. The app will then prompt users to verify their age.
In the meantime, Meta will start designating new accounts created by 13 to 15-year-olds as “teen accounts” beginning today. The company will start switching existing teens over to the new accounts during the next two months in the US, Canada, UK and Australia, with a wider rollout in the European Union planned for “later this year.” Teen accounts will be available in other countries and on Meta’s other apps beginning in 2025.
This article originally appeared on Engadget at https://www.engadget.com/social-media/instagram-teen-accounts-with-parental-controls-will-be-mandatory-for-kids-under-16-120013852.html?src=rss
Over three months after Apple introduced it at WWDC 2024, watchOS 11 is officially here. The 2024 Apple Watch update, which adds the new Vitals app, widget improvements and sleep apnea detection, is now available to install on your smartwatch.
watchOS 11 introduces the new Vitals app, further beefing up Apple’s health-tracking features on its wearable. For those who wear their Apple Watch to bed for sleep tracking (and a handy alarm in the morning), Vitals collects your overnight data in one place. The app establishes baselines for your health metrics and lets you know if any fall outside your typical range, which is potentially handy for spotting irregularities like oncoming illnesses or tracking the effects of alcohol use.
Similarly, the new Training Load feature measures the intensity of your workouts over time. After establishing an intensity baseline over 28 days, it shows how hard you’re pushing yourself in your workouts — comparing it with your standard averages. At launch, it supports 17 workout types, including walks, runs, cycling, rowing, swimming and more. You’ll find your Training Load in the Activity app on your Apple Watch and the Fitness app on your iPhone.
Apple
Apple added a long-requested feature this year: the ability to pause and customize Activity ring goals. It hardly makes sense to keep pushing yourself (at your watch’s prodding) if you’re sick or need rest. The wearable now lets you take a break for a day, week, month or more without losing your award streaks. In addition, you can set different Activity ring goals for each day of the week and customize the data you care about most in the iOS 18 Fitness app.
The Apple Watch’s Smart Stack (the pile of widgets you see when you scroll down from your watch face) now shows widgets automatically based on context. (For example, rain alerts.) In addition, Live Activities, which arrived on the iPhone two years ago, is also coming to the Apple Watch in the new update. You’ll find Live Activities for things like sports scores you track or an arriving Uber in the watchOS 11 Smart Stack.
Check In is a new feature that lets you notify a friend when you reach your destination. You can begin a Check In from the watchOS Messages app by tapping the plus button next to the text field, choosing Check In and entering where you’re going and when you expect to arrive. Similarly, when exercising, you can start a Check In from the workouts app: Swipe right from the workout screen and choose Check In from the controls. You can then pick a contact to share your exercise routine with.
Other features include new pregnancy tracking in the Cycles app and a Double Tap API that lets third-party developers incorporate hands-free controls.
To download watchOS 11, you’ll first need to install iOS 18 on your paired iPhone. After that, open the Watch app on your phone, then head to General > Software Update. It should then prompt you to update to the 2024 software.
This article originally appeared on Engadget at https://www.engadget.com/wearables/watchos-11-is-out-now-with-new-sleep-apnea-feature-182103629.html?src=rss
While it's unclear if mainstream PC users are actually using Microsoft's Copilot AI, the company claims that businesses using MS 365 Copilot are seeing plenty of benefits. According to a Microsoft survey, Copilot users at Honeywell save up to 92 minutes per week, while customer service agents at Teladoc are saving up to five hours a week by using the AI tool to draft responses to questions. Now that we're a year beyond the MS 365 Copilot launch (at a costly $30 per seat), Microsoft is eager to throw more AI features at corporate drones.
Most intriguingly, Microsoft is upgrading its Business Chat app, which so far has been a way to interact with Copilot across your emails, calendar entries and other data, alongside data from your organization. Now it's getting better collaboration with the addition of Copilot Pages, which will serve as a sort of "multiplayer" way to share AI-generated content with your coworkers.
Copilot Pages in BizChat.
Microsoft
"With Pages, all the data in your organization — whether created by humans or AI — is persistent, accessible and valuable," Microsoft CVP Jared Spataro wrote in a blog post. "Pages takes ephemeral AI-generated content and makes it durable, so you can edit it, add to it, and share it with others... This is an entirely new work pattern — multiplayer, human to AI to human collaboration."
It's surprising that it took a year for Microsoft to bring better collaboration to the Business Chat app, as that's an expected feature of every workplace app these days. Having a place for employees to share their existing Copilot queries simply makes sense: Coworkers may want access to the same information, and it's also environmentally wasteful to have people running the same Copilot search multiple times. (Generative AI queries are far more costly for the environment than simple web searches.)
Microsoft says Pages will be available today to MS 365 Copilot users, and it'll also be coming to free Copilot customers with Microsoft Entra accounts "in the coming weeks."
In general, Microsoft says Copilot queries are more than two times faster now compared to launch, because it's relying on the newer GPT4o model. The company is also upgrading AI capabilities across the suite of MS 365 apps: Excel is getting Python support for more complex queries; PowerPoint's Narrative builder capability is widely available, allowing you to craft the story of your presentations with AI help; and Teams can now scan across meeting transcripts and their accompanying chats.
Outlook Prioritize my Inbox
Microsoft
The other Office apps aren't left out either. Outlook will soon let you choose topics, people and keywords to highlight for the "Prioritize my inbox" feature. You'll also be able to reference meetings and emails directly within Word documents, and OneDrive will let you summarize and compare files without opening them using Copilot.
And if you need even more Copilot AI help, businesses can also create Copilot Agents directly within Business Chat and SharePoint. They're like chatbots that can peer within your corporate files, and you can also tag them in comments like a typical coworker. While we still need to see these Agents in action to determine if they're actually useful, at the very least, you can feel less guilty about assigning them some menial information processing at the end of the work day.
This article originally appeared on Engadget at https://www.engadget.com/ai/microsoft-365-copilot-users-can-collaborate-with-ai-and-each-other-in-bizchat-pages-150042326.html?src=rss
A decade ago, Flappy Bird became a sensation among smartphone users, with many of us spending far too long getting the little yellow guy to flap higher and higher between pipes. But it didn't last long: the game was soon pulled from app stores. Here at Engadget, we were excited by the news last week that Flappy Bird is coming back to our devices in 2025. However, there's one person who isn't psyched: Flappy Bird's creator, Dong Nguyen. He took to X (formerly Twitter) to confirm he isn't involved in or profiting off the new version. "No, I have no related with their game. I did not sell anything. I also don't support crypto," he stated.
The team behind the new Flappy Bird iteration has been open about being a "new team of passionate fans." Nguyen's trademark was reportedly considered abandoned, and Gametech Holdings LLC picked it up for free. The new team then got the rights to Flappy Bird from Gametech.
It's unlikely Nguyen would have ever revived the game on his own. He released the original game in May 2013 and made about $50,000 a day from advertising when it blew up the following January. However, he took the game down only a month later, stating, "I cannot take this anymore." In an interview with Forbes then, Nguyen explained, "Flappy Bird was designed to play in a few minutes when you are relaxed. But it happened to become an addictive product. I think it has become a problem. To solve that problem, it's best to take down Flappy Bird. It's gone forever."
This article originally appeared on Engadget at https://www.engadget.com/apps/flappy-birds-creator-wants-you-to-know-hes-got-nothing-to-do-with-the-new-version-121532179.html?src=rss