Apple’s New ‘Visual Intelligence’ feature on the iPhone 16 basically makes Google Lens obsolete

Apple has a reputation for taking established ideas and refining them into seamless, intuitive features, and it looks like it has done so again with the new Visual Intelligence technology in the iPhone 16. In contrast to Google Lens, which primarily scans objects or text in photos and returns basic web-based results, Apple’s Visual Intelligence integrates advanced generative models and contextual awareness, creating a more profound, interactive experience. This blend of on-device intelligence and private cloud computing not only delivers more relevant information but does so in a way that feels personal and purposeful.

Let’s dive into why Apple’s Visual Intelligence may have just overshadowed Google Lens, and how it’s bringing more powerful insights to users right through the iPhone 16’s camera. Before we do, it’s important to note that Google HAS, in fact, demonstrated Gemini’s ability to ‘see’ the world around you and provide context-based insights… however, it seems like a lot of those features are limited to just Pixel phones because of their AI-capable Tensor chips. While Google Lens (an older product) is available across the board to both iOS and Android devices, Apple’s Visual Intelligence feature gives iPhones a highly powerful multimodal AI feature that would otherwise require existing Apple users to switch over to the Google Pixel.

Going Beyond Surface-Level Search

Google Lens has been a reliable tool for identifying objects, landmarks, animals, and text. It essentially acts as a visual search engine, allowing users to point their camera at something and receive search results based on Google’s vast index of web pages. While this is undoubtedly useful, it stops at merely recognizing objects or extracting text to launch a related Google search.

Apple’s Visual Intelligence, on the other hand, merges object recognition with contextual data retrieval. This means it can offer richer, more integrated information. During the Apple keynote, Craig Federighi demonstrated how users could point their iPhone at a restaurant and instantly retrieve operating hours, reviews, and options to make reservations—all without needing to open Safari or another app. Similarly, pointing the camera at a movie poster will not just yield a name or showtimes, but deeper context such as ratings, actor bios, and related media, providing a much more immersive and helpful experience.

The Power of Integration: Visual Intelligence and Third-Party Tools

One of the standout features of Apple’s Visual Intelligence is its seamless integration with third-party tools, offering expanded functionality. For instance, if you spot a bike you’re interested in, Visual Intelligence doesn’t just identify the brand and model; it can quickly connect you to third-party retailers via Google Search to check availability and pricing. This interplay between native intelligence and external databases exemplifies Apple’s mastery of pulling together useful, real-time data without breaking the user’s workflow.

But it doesn’t stop there. Apple has built-in support for querying complex topics with tools like ChatGPT. Imagine you’re reviewing lecture notes and stumble across a difficult concept. Simply hold your iPhone over the text and ask ChatGPT to explain it right on the spot. This deep contextual awareness and ability to provide real-time insights based on multiple external sources is something Google Lens simply cannot do at the same level.

Privacy at the Core

Another area where Apple shines is in its privacy-first approach to AI. All interactions through Visual Intelligence, such as identifying objects or pulling up information, are processed on-device or via Apple’s Private Cloud Compute, ensuring that no personal data is stored or shared unnecessarily. This is a stark contrast to Google’s cloud-based model, which has often raised concerns about the volume of user data being processed on external servers. By keeping the majority of computation on the device, Apple delivers peace of mind for privacy-conscious users—an area that Google has historically struggled with.

A Broader Reach: Enabling Personal Context

One of the most significant advantages of Apple’s approach is its deep integration into your phone’s personal data. Visual Intelligence doesn’t just analyze what’s in front of the camera; it connects the dots with your past interactions. For example, Siri, now supercharged with Visual Intelligence, can identify the contents of your messages or your calendar appointments and offer contextual suggestions based on what you’re viewing. If you’re looking at a flyer for an event, Visual Intelligence will not only retrieve details about the event but also cross-reference it with your schedule to automatically add it to your calendar—again, without having to lift a finger.

Google Lens, by comparison, lacks this deep personal integration. It’s effective as a standalone visual search tool but hasn’t yet reached the level of intuitive, user-centered design that Apple has mastered.

A New Era for Intelligent Photography

Apple’s innovation also extends into how we interact with our cameras. The new camera control on the iPhone 16 doubles as a gateway to Visual Intelligence. This means users can quickly snap a photo and receive actionable insights immediately. With a simple press of the camera control, users can tap into features like instant translations, object recognition, or even educational tools like ChatGPT.

Google Lens, while impressive in its object recognition, doesn’t offer this seamless experience. It requires users to jump between apps or tabs to get additional information, while Apple’s integration means the iPhone is one fluid tool—camera, intelligence, and action all in one place.

Apple Executes Where Google Initiated

Google Lens may have launched first, but Apple has undeniably refined and expanded the concept. It’s a tendency we’ve come to know and love about Apple: the company usually doesn’t believe in being first to market, but rather in executing features so well that people tend to ignore the competition. It did so with the Vision Pro, and again with Apple Intelligence. Visual Intelligence is a bold step forward, leveraging on-device power and privacy to deliver more meaningful, contextual insights. Where Google Lens excels at basic object recognition, Apple’s approach feels more like a true assistant, offering deeper information, smarter integrations, and a more secure experience.

The post Apple’s New ‘Visual Intelligence’ feature on the iPhone 16 basically makes Google Lens obsolete first appeared on Yanko Design.

Apple Unveils iPhone 16 Pro with New Camera, Video, and Audio Features Powered by A18 Pro Chip

Apple has introduced the iPhone 16 Pro, bringing significant updates to both its design and performance. The iPhone 16 Pro now features larger displays—6.3 inches for the Pro and 6.9 inches for the Pro Max—making them the biggest iPhone screens yet. Despite the larger size, Apple has minimized bezels, resulting in a sleek, nearly edge-to-edge look. The adaptive 120Hz ProMotion technology provides smooth scrolling, while the always-on display functionality gives users quick access to key information.

Designer: Apple

The iPhone 16 Pro is constructed from aerospace-grade titanium, offering both durability and lightness. This titanium body is available in four finishes: Black Titanium, White Titanium, Natural Titanium, and Desert Titanium. The device also incorporates a new thermal architecture that improves sustained performance while keeping the phone cool during heavy use. The Pro models are also water- and dust-resistant, ensuring longevity in various environments.

Photography: Real-Time Control and Customization

The iPhone 16 Pro empowers creativity through its 48-megapixel Fusion camera, which integrates a second-generation quad-pixel sensor capable of reading data twice as fast, allowing for zero shutter lag. Whether capturing fast-moving subjects or subtle details, this new system ensures uncompromised resolution and detail. The sensor’s high-speed data transfer to the A18 Pro chip lets users capture 48-megapixel ProRAW and HEIF photos effortlessly.

A new 48-megapixel ultra-wide camera complements the Fusion camera, offering high-resolution shots with autofocus. It excels at capturing wider scenes and stunning macro shots, delivering sharpness and clarity that make it indispensable for creative users. The 5x telephoto camera—with Apple’s longest focal length yet—provides incredible zoom capabilities from a distance, while the tetraprism design improves optical performance for more detailed, high-quality images.

To streamline the photography process, the iPhone 16 Pro introduces an upgraded camera control system. This interface lets users quickly switch between lenses and adjust the depth of field and exposure with a dedicated slider. Later this year, a two-stage shutter will be added, allowing users to lock focus and exposure with a light press, offering precision when reframing shots.

Advanced Photographic Styles and Real-Time Grading

Apple has enhanced the creative range of the iPhone 16 Pro with advanced photographic styles, allowing users to personalize their photos in real time. With the A18 Pro chip, the image pipeline dynamically adjusts skin tones, colors, highlights, and shadows, enabling a wider range of aesthetic choices. Users can apply styles like black-and-white, dramatic tones, and other custom looks that go beyond basic filters, and fine-tune the look with a new control pad that simultaneously adjusts tone and color.

What sets these styles apart is the ability to change them after capture, allowing greater flexibility in editing. The real-time preview enabled by the A18 Pro gives users a professional-level color grading experience as they shoot, a significant upgrade for photographers looking for more creative control.

Video: Cinema-Grade Capabilities

The iPhone 16 Pro significantly advances video recording. The new 4K 120fps recording in Dolby Vision is possible thanks to the faster sensor in the 48-megapixel fusion camera and the high transfer speeds of the Apple camera interface. The image signal processor (ISP) of the A18 Pro allows for frame-by-frame cinema-quality color grading, enabling professional-quality video capture directly on the iPhone.

One of the most exciting features is the ability to shoot 4K 120fps ProRes and Log video directly to an external storage device, perfect for high-end workflows that demand high frame rates and extensive color-grading control. Users no longer need to commit to frame rates upfront; they can adjust playback speed after capture. Whether for slow-motion effects or cinematic storytelling, the iPhone 16 Pro offers flexible playback options, including quarter-speed, half-speed, and 1/5-speed for 24fps cinematic moments.
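Those playback options map onto simple frame-rate arithmetic: footage captured at a high frame rate and conformed to a slower timeline plays back at timeline fps divided by capture fps of real-time speed. As a rough illustrative sketch (the frame rates come from the article; the `playback_speed` function is a hypothetical helper, not any Apple API):

```python
# Illustrative sketch of the slow-motion arithmetic described above.
# The frame rates are taken from the article; the function itself is
# a hypothetical helper, not part of any Apple API.

CAPTURE_FPS = 120  # 4K 120fps capture on the iPhone 16 Pro

def playback_speed(timeline_fps: float, capture_fps: float = CAPTURE_FPS) -> float:
    """Fraction of real-time speed when every captured frame is
    played back at the timeline's frame rate."""
    return timeline_fps / capture_fps

print(playback_speed(60))  # 0.5  -> half-speed
print(playback_speed(30))  # 0.25 -> quarter-speed
print(playback_speed(24))  # 0.2  -> 1/5-speed, the 24fps cinematic case
```

In other words, a clip shot at 120fps needs no upfront decision: dropping it onto a 24fps timeline yields 1/5-speed slow motion, while playing every frame back at 120fps keeps it in real time.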

The camera control interface supports third-party apps like FiLMiC Pro and Grid Composer, enabling advanced features such as precise framing based on the rule of thirds and other composition tools. This further solidifies the iPhone 16 Pro as a versatile tool for video creators.

Audio: Studio-quality sound and Spatial Audio

The iPhone 16 Pro also delivers significant audio upgrades. Four studio-quality microphones provide low noise levels for true-to-life sound capture, whether recording vocals or instruments. The reduced noise floor ensures high-quality audio, which is essential for professional recordings.

A new feature is spatial audio capture during video recording, enhancing the immersive experience when paired with AirPods or viewed on Apple Vision Pro. The spatial audio capture allows dynamic editing through the new audio mix feature, which uses machine learning to separate background elements from voices. This feature includes three voice options: in-frame mix, which isolates the person’s voice on camera; studio mix, which replicates a professional recording environment by eliminating reverb; and cinematic mix, which positions the vocal track upfront with surrounding environmental sounds in the background.

For content creators, Voice Memos now offers the ability to layer tracks on top of existing recordings. This is especially useful for musicians, who can now add vocals over a guitar track or any other instrumental recording. The system automatically isolates the voice from the background audio for a clean, professional result.

A18 Pro Chip: Powering Creativity and Performance

At the core of the iPhone 16 Pro’s new capabilities is the A18 Pro chip, built with second-generation 3-nanometer technology for improved performance and efficiency. The 16-core Neural Engine is designed for tasks requiring high computational power, such as machine learning and generative AI. With a 17% increase in system memory bandwidth, the iPhone 16 Pro can handle tasks such as ray tracing in gaming, 4K video editing, and more.

The A18 Pro chip’s enhanced image signal processor (ISP) enables real-time color grading and supports the advanced photo and video capabilities of the iPhone 16 Pro, ensuring that every shot and video benefits from professional-level quality. The chip’s GPU also provides 20% faster performance, allowing for smoother gaming and more efficient graphics rendering.

iPhone 16 Pro: Is It Time to Switch or Upgrade?

For professional creators, the iPhone 16 Pro delivers the performance and tools needed to meet demanding creative standards. Powered by the A18 Pro chip, it offers advanced photographic styles, pro-level video recording, and studio-quality audio. Whether capturing intricate details in images, producing cinematic-quality videos, or recording clear, high-fidelity audio, the iPhone 16 Pro provides the precision and control necessary to achieve your creative goals. This upgrade is a powerful creative tool designed to push the boundaries of your work, supporting and enhancing your vision with every use.


Apple Unveils iPhone 16 with Enhanced Camera, Customizable Controls, and A18 Chip

Apple introduced the iPhone 16, showcasing new durability, performance, and personalization features. The device retains its sleek look but incorporates significant upgrades, including color-infused glass in ultramarine, teal, and pink, alongside white and black. This design innovation gives the iPhone 16 a refreshed, vibrant appearance.

Designer: Apple

The iPhone 16 builds on the Action Button first seen in the iPhone 15 Pro, allowing users to access customizable functions quickly. A new dedicated camera button brings a more traditional camera experience, instantly launching the camera and functioning as a hardware shutter.

Powered by Apple’s new A18 chip, the iPhone 16 is built to handle advanced tasks, particularly those requiring machine learning and AI. The 16-core Neural Engine is twice as fast as the previous generation, significantly improving processing power for AI workloads. Apple increased memory bandwidth by 17%, allowing more efficient handling of intensive tasks like running large language models. The A18’s six-core CPU is 30% faster than the iPhone 15’s, with improved power efficiency and extended battery life. Apple claims the iPhone 16 can outperform some high-end desktop PCs.

The display has also seen upgrades, with brightness reaching 2,000 nits in sunlight and dropping to one nit in low-light environments for more comfortable viewing. The Ceramic Shield glass is 50% tougher than previous versions, adding to the iPhone’s overall durability and water and dust resistance.

Apple’s focus on AI brings Apple Intelligence to the iPhone 16. This system integrates generative models directly into the device, allowing for more natural language understanding and personalization. It can rewrite messages, suggest professional tones, and even create emojis based on descriptions. Additionally, users can use the new Visual Intelligence feature through the camera, which allows the device to identify objects, locations, and events simply by pointing the camera. This feature provides quick access to information such as restaurant details, event schedules, or product identification, bringing a new level of interaction to the iPhone.

The iPhone 16’s camera system has also been upgraded with a 48-megapixel main camera, allowing users to capture images with more detail. A new 2x telephoto option offers optical quality, made possible by the device’s computational photography capabilities. The ultra-wide camera, now with autofocus, allows for better low-light performance and macro photography, while spatial video and photo capture bring new dimensions to memory preservation, compatible with Apple’s Vision Pro headset.

Overall, the iPhone 16 introduces more power, control, and personalization options, making it a versatile tool for users who demand both performance and convenience in their daily tasks.


iPhone 16S concept mimics the Rabbit R1 format to reinstate that a phone is the best pocket AI device

We are still living with the iPhone 15 and its variants; the era of the iPhone 16 is still some months away. It’s customary for Apple to drop its new iPhones in September every year, and nothing suggests this year will be any different. Like every year since Steve Jobs revealed the first iPhone (it feels like a century ago), the iPhone 16 and iPhone 16 Pro variants will arrive with new features.

A lot of those features are leaking in bits and will continue to do so until the launch date. Irrespective of that, we will keep our own wishlists: longer battery life (please!), deeper AI integration into iOS, and perhaps even smaller screen real estate. While everyone else is putting their money on predicting the iPhone 16 Pro Max’s ever-larger display, Phone Industry is taking an ‘S’ route: a concept of an iPhone 16S that takes its design cues from the Rabbit R1.

Designer: Phone Industry

For reference, the Rabbit R1 isn’t a typical gadget, and neither is its design. The boxy little AI device is designed to learn from your commands and do more than the average smartphone can. That held, at least, until the recent debacle of reviews showing that the Rabbit’s real-world performance falls far short of its advertised capabilities. Anyhow, this isn’t about what the Rabbit R1 does; it’s about the identical-looking (minus the hold bars on the top and bottom) iPhone 16S concept, because the best AI device you can have in your pocket, for the foreseeable future, is a phone!

While the concept phone’s form factor may be lifted from the Rabbit R1, it does have some interesting ideas reinforcing its iPhone 16 identity (as the rumors hold it for now). The iPhone 16S adopts the Capture Button expected on the forthcoming iPhones, giving us the kind of physical shutter button we remember from the pocket cameras of yesteryear.

So the hypothetical Capture Button, sitting on the opposite side from where the iPhone 15 Pro has its Action Button, gives this iPhone a more camera-like feel. While Apple is reportedly considering reworking the camera array in the upcoming iPhone 16 lineup, this concept sticks to the S-series basics and uses just one (obviously multi-capability) camera on the rear. The highlight of the iPhone 16S concept for me, besides the square form factor, is its all-metal body and the interesting pattern around the Apple logo on the back. What do you think?



Apple’s Big Mistake with the iPhone 16 Series is Focusing TOO MUCH on the Camera

When Jobs took the stage to announce the iPhone back in 2007, he used three terms to describe the revolutionary device: a widescreen iPod with touch controls, a revolutionary mobile phone, and a breakthrough Internet communications device. However, ever since the iPhone 7 Plus introduced a dual-lens main camera system, the Apple team has become somewhat obsessed with making sure the iPhone has a great camera first, and phone-adjacent features later. Almost like a handheld camera with an App Store, today’s iPhones are just a shadow of what they could be. No foldable technology, no AI-based enhanced features, and not even a halfway-decent voice assistant. In fact, it took Apple YEARS to get 5G to their iPhones. Apple spends nearly 30-40% of each iPhone keynote talking about the camera and screen, and now rumors indicate that the iPhone 16 will introduce a dedicated ‘capture’ button that lets you click photos like you would with a professional camera. The problem is that this dilutes the very definition of a smartphone… and I feel like it might be deliberate.

Earlier this week, leaks showed a new hardware feature coming to the iPhone: a Capture Button that would sit on the top right corner of your phone when held in landscape mode. Surprising as it is, considering Apple has been trying to go buttonless and portless for a while now, the Capture Button seems like an odd addition to a phone. Hardly any other smartphone has a dedicated camera shutter button; in fact, the de facto approach is to turn your volume button into a capture button while the camera is running… so what’s driving Apple to add YET ANOTHER button to its phone, following the addition of the Action Button last year?

Leaked images of iPhone dummies used for case designs

Last month, I pointed out that the iPhone 16 is just going to be one of those boring phones worth missing, and this Capture Button seems to reinforce that. Every three years, Apple launches a ‘boring’ iPhone with a minor design upgrade just to keep things moving before a radical change, and it’s been two years since the Dynamic Island, so this is probably Apple’s boring year. But why a Capture Button? Nobody said we needed it, not a single Android competitor has one, and heck, if anything we’d appreciate Apple bringing the 3.5mm audio jack back. So why is Apple going ahead with this hardware change?

There are two ways to look at it. The first is the simpler explanation: Apple has run out of ideas. This is just one of those years where Apple pushes out something so it can tick the annual release box and make a few sales before something bigger and better arrives in 2025. The theory holds merit: the iPhone 8 was the ‘boring’ phone before the iPhone X, and the iPhone 13 hardly had any extra features (unless you count Cinematic Mode as a game-changing upgrade) before the iPhone 14 Pro ushered in the Dynamic Island. This basically means it’s business as usual and 2024 is just going to be a boring year for iPhones… but there’s yet another explanation.

Close-up of the purported Capture Button

The second explanation is a little more layered and vague, considering there’s no concrete proof behind it. It’s that Apple is pretty much resigning the iPhone to its fate: the camera. With the Vision Pro becoming Apple’s new breakthrough device, the iPhone will eventually take second place, quite like the iPod did 12 years ago. There are multiple rumors that Apple is building a cheaper Vision headset (without the ‘Pro’ title) for the mass market to immerse in spatial computing… and when that happens, the iPhone won’t be anything except a glorified photography device. That still doesn’t explain why Apple is adding a Capture Button to its phone, given that people already use the volume button to capture photos… but that’s the vague part, because we never really know what’s going on inside the heads of Tim Cook and the Apple team unless they tell us. As far as the iPhone 16 goes, I’d recommend you give it a miss unless you’re long overdue for a smartphone upgrade.

Renders by Sarang Sheth


The iPhone 16 might just be a snoozefest… History tells us why.

Historically, every three years, the iPhone’s design gets a ‘boring’ upgrade. Do you remember the iPhone 8 or the iPhone 13’s most exciting features? Neither do I.

The iPhone X and 11 had radical new designs with the notch, and the iPhone 12 introduced 5G and MagSafe… but after two consecutive years of exciting features, the iPhone 13 barely had anything worth talking about (unless you consider ‘Cinematic Mode’ to be a game-changing feature). Skip to the next year, and the iPhone 14 Pro had the Dynamic Island and Satellite Connectivity. The iPhone 15 had the Action Button, the USB-C port, and a titanium construction. All indications show that the upcoming iPhone 16 won’t really dazzle much. Aside from a few hardware upgrades and perhaps one or two extra camera features (probably tied to the Vision Pro), there isn’t any thrilling rumor regarding the upcoming iPhone 16’s design. Not that there needs to be; Apple’s entitled to take a short break every few years and just focus on fine-tuning the product rather than wowing people. If you’re thinking of upgrading to the 16 this year, I’d probably give it a miss and go for the 15 instead. The iPad, on the other hand, is due for a BIG refresh, with rumors of a glass back, MagSafe, and perhaps some more camera upgrades to support the Vision Pro.

The rumor mill for the latest iPhone often begins around a year prior to its release. Once a model of the iPhone launches, analysts and experts begin speculating what the next year’s model could look like. Speculations turn into rumors by January. Rumors turn into leaks by April or May. And renders emerge online by July or August, approximately a month before Apple announces its newest iPhone. So far, the rumors have been rather underwhelming at best, with some minor upgrades being touted for the iPhone 16.

So far, outlets like MacRumors haven’t specified any ‘game-changing’ new features for the iPhone 16. Sure, you have a chipset upgrade every year and the 16 Pro will run Apple’s latest A18 Bionic chip. Cameras get upgraded too, and there’s speculation that the Ultrawide camera could get a 48MP bump this year. The new iPhone 16 series will apparently have larger displays (so maybe smaller bezels), better 5G, WiFi 7 capabilities, and a new stacked battery architecture for better battery life. Visibly, the iPhone 16 might have a different camera layout, defaulting to the original vertical orientation seen with the iPhone 11 and 12 (although the bump around them may be capsule-shaped instead of square like older models). There’s also speculation about a new physical ‘capture’ button for clicking photos or recording videos… although all indications show that this might just be one of those rumors that end up staying a rumor. Apple’s famously trying to move away from buttons and ports, so adding an extra button to the new phone just doesn’t sound like something the company would do. Moreover, the volume buttons already work as capture buttons when the camera app’s active… so a dedicated capture button feels rather redundant.

The iPhone 16 Pro might see some extremely small incremental changes, with barely any visible differences. The rendering below shows a possible iPhone 16 Pro with a design that’s indistinguishable from last year’s 15 Pro model. Apple will almost certainly stick to titanium for the Pro series, potentially with newer colors to help differentiate them from last year’s models.

All eyes, however, are on Apple’s software development team this year. The company famously canceled its rumored Apple Car project, moving the entire Project Titan team to work for the in-house AI development department. Analysts like Ming-Chi Kuo speculate that Apple might announce AI-based features like a next-gen Siri powered by Apple’s own LLM, or other generative AI capabilities. These announcements, however, may just come with the iOS 18 debut during WWDC in June. To push the latest iPhone series, Apple may also limit these AI features only to the iPhone 16 range, forcing consumers to make the upgrade. However, until these speculations are confirmed, the iPhone 16 may just be worth a miss this year.

Images via MacRumors


Small iPhone 16 camera design change could have important implications

Apple is known for moving slowly but surely, changing things only when it really makes sense and after numerous tests. That is especially visible in the iPhone’s cameras, which don’t change year after year, unlike the general trend in the mobile industry. Whether it’s the hardware itself or the design of the camera bump, Apple has been careful and meticulous in introducing changes. That’s why it’s a bit of a big deal that Apple is rumored to be changing the iPhone 16’s camera design for the first time in three years, and it’s a change that will bring a significant shift in functionality, at least for the base iPhone models, and possibly a very different aesthetic as well.

Designer: Apple (via MacRumors)

Apple has traditionally been modest with its camera hardware because it is able to pull off great images even with sensors that look outdated on paper. It also doesn’t change the design of the camera island as often as other brands, introducing a pill-shaped bump in the iPhone 8 Plus in 2017 and only switching to a square enclosure in the 2019 iPhone 11. In the 2021 iPhone 13, it moved the position of the lenses so that they’d be diagonally opposite to each other, creating a visually more interesting composition than the plain vertical arrangement of the iPhone X.

According to rumors, however, Apple is switching back to a vertical design for the iPhone 16 cameras, and it’s unlikely to be just on a whim. There is a chance that this change will enable the Spatial Video function that’s currently only available on the iPhone 15 Pro models, a feature that allows recording video in three dimensions. This, however, requires using both the main wide camera and the ultrawide shooter in tandem, in which case the two sensors have to be aligned vertically or even horizontally.

What’s a little more interesting is that Apple is reportedly testing a different design for this new camera as well. There is one that looks exactly like the iPhone 11 and 12, with the cameras vertically arranged. Another potential design, however, places the vertical pill shape of the iPhone X on top of the square bump of the iPhone 15. As interesting as that may look visually, it’s a rather distracting design and would be a bit complicated and inconsistent with the current iPhone design language.

Changing the camera arrangement to enable Spatial Video recording is another sign that Apple is aiming to bring Pro features to the base iPhone 16. An earlier leak claimed that a redesigned Action Button will be available on all iPhone 16 models. That button will supposedly switch from a mechanical button to a pressure-sensitive capacitive surface, a feature that could expand what the button is capable of doing.
