Apple’s iPhone 16 & iPhone 16 Plus Are Official


Apple has officially announced the iPhone 16 and iPhone 16 Plus, marking a significant advancement in smartphone technology. Built for Apple Intelligence, these new models offer a personal intelligence system that understands user context to deliver relevant and helpful information while maintaining privacy. This innovative feature is set to transform how users interact with their […]


Apple’s New ‘Visual Intelligence’ feature on the iPhone 16 basically makes Google Lens obsolete

Apple has a reputation for taking established ideas and refining them into seamless, intuitive features, and it looks like they’ve done it again with their new Visual Intelligence technology in the iPhone 16. In contrast to Google Lens, which primarily scans objects or text in photos and returns basic web-based results, Apple’s Visual Intelligence integrates advanced generative models and contextual awareness, creating a more profound, interactive experience. This blend of on-device intelligence and private cloud computing not only delivers more relevant information but does so in a way that feels personal and purposeful.

Let’s dive into why Apple’s Visual Intelligence may have just overshadowed Google Lens, and how it’s bringing more powerful insights to users right through the iPhone 16’s camera. Before we do, it’s important to note that Google HAS, in fact, demonstrated Gemini’s ability to ‘see’ the world around you and provide context-based insights… however, many of those features appear to be limited to Pixel phones because of their AI-capable Tensor chips. And while Google Lens (an older product) is available across the board on both iOS and Android devices, Apple’s Visual Intelligence gives iPhones a powerful multimodal AI capability that would otherwise require existing Apple users to switch over to a Google Pixel.

Going Beyond Surface-Level Search

Google Lens has been a reliable tool for identifying objects, landmarks, animals, and text. It essentially acts as a visual search engine, allowing users to point their camera at something and receive search results based on Google’s vast index of web pages. While this is undoubtedly useful, it stops at merely recognizing objects or extracting text to launch a related Google search.

Apple’s Visual Intelligence, on the other hand, merges object recognition with contextual data retrieval. This means it can offer richer, more integrated information. During the Apple keynote, Craig Federighi demonstrated how users could point their iPhone at a restaurant and instantly retrieve operating hours, reviews, and options to make reservations—all without needing to open Safari or another app. Similarly, pointing the camera at a movie poster will not just yield a name or showtimes, but deeper context such as ratings, actor bios, and related media, providing a much more immersive and helpful experience.

The Power of Integration: Visual Intelligence and Third-Party Tools

One of the standout features of Apple’s Visual Intelligence is its seamless integration with third-party tools, offering expanded functionality. For instance, if you spot a bike you’re interested in, Visual Intelligence doesn’t just identify the brand and model; it can quickly connect you to third-party retailers via Google Search to check availability and pricing. This interplay between native intelligence and external databases exemplifies Apple’s mastery of pulling together useful, real-time data without breaking the user’s workflow.

But it doesn’t stop there. Apple has built-in support for querying complex topics with tools like ChatGPT. Imagine you’re reviewing lecture notes and stumble across a difficult concept. Simply hold your iPhone over the text and ask ChatGPT to explain it right on the spot. This deep contextual awareness and ability to provide real-time insights based on multiple external sources is something Google Lens simply cannot do at the same level.

Privacy at the Core

Another area where Apple shines is in its privacy-first approach to AI. All interactions through Visual Intelligence, such as identifying objects or pulling up information, are processed on-device or via Apple’s Private Cloud Compute, ensuring that no personal data is stored or shared unnecessarily. This is a stark contrast to Google’s cloud-based model, which has often raised concerns about the volume of user data being processed on external servers. By keeping the majority of computation on the device, Apple delivers peace of mind for privacy-conscious users—an area that Google has historically struggled with.

A Broader Reach: Enabling Personal Context

One of the most significant advantages of Apple’s approach is its deep integration into your phone’s personal data. Visual Intelligence doesn’t just analyze what’s in front of the camera; it connects the dots with your past interactions. For example, Siri, now supercharged with Visual Intelligence, can identify the contents of your messages or your calendar appointments and offer contextual suggestions based on what you’re viewing. If you’re looking at a flyer for an event, Visual Intelligence will not only retrieve details about the event but also cross-reference it with your schedule to automatically add it to your calendar—again, without having to lift a finger.

Google Lens, by comparison, lacks this deep personal integration. It’s effective as a standalone visual search tool but hasn’t yet reached the level of intuitive, user-centered design that Apple has mastered.

A New Era for Intelligent Photography

Apple’s innovation also extends into how we interact with our cameras. The new camera control on the iPhone 16 doubles as a gateway to Visual Intelligence. This means users can quickly snap a photo and receive actionable insights immediately. With a simple press of the camera control, users can tap into features like instant translations, object recognition, or even educational tools like ChatGPT.

Google Lens, while impressive in its object recognition, doesn’t offer this seamless experience. It requires users to jump between apps or tabs to get additional information, while Apple’s integration means the iPhone is one fluid tool—camera, intelligence, and action all in one place.

Apple Executes Where Google Initiated

Google Lens may have launched first, but Apple has undeniably refined and expanded the concept. It’s a tendency we’ve come to know and love about Apple: the company rarely tries to be first to market, preferring instead to execute features so well that people tend to ignore the competition. It did so with the Vision Pro AND with Apple Intelligence. Visual Intelligence is a bold step forward, leveraging on-device power and privacy to deliver more meaningful, contextual insights. Where Google Lens excels at basic object recognition, Apple’s approach feels more like a true assistant, offering deeper information, smarter integrations, and a more secure experience.


Rumors we got right (and wrong) with the September 2024 Apple Keynote

Apple’s September event remains every tech nerd’s most awaited time of the year for two reasons. For starters, it’s the announcement of brand-new gear that we can get our hands on… but more importantly, it’s a validation of everything we’ve been hearing and assuming over the years. There’s never a single day when people aren’t actively trying to find out details on what Apple will do next. The company has a rock-solid reputation for keeping its products under wraps until they’re ready to launch… but sometimes things slip through the cracks. We nerds love to speculate on these rumors, and come September, the keynote feels like judgment day, where some rumors get turned into reality, and others into rubble.

As Apple unveiled their latest slew of devices today, it’s interesting to see exactly where the rumors were right, and where we were absolutely off the mark. For starters, we all expected next-gen AirPods Max headphones (given that it’s been 4 years since their first launch) as well as a Watch Ultra 3. Apple conveniently skipped those devices, announcing only minor upgrades instead (a nice way of saying they’re already perfect the way they are). Meanwhile, rumors of a ‘capture’ button on the iPhone seemed as good as confirmed, with case-makers actually displaying iPhone 16 cases at IFA 4 days before Apple’s formal launch. So without further ado, here’s a look at all the rumors over the past year, and whether they made it or not.

Rumors we got wrong:

1. Apple Watch X would see a radical redesign for the 10th anniversary

Honestly, it feels a little heartbreaking to see that Apple didn’t give the Watch a 10th-anniversary makeover the way they did with the iPhone X back in 2017. A lot of us were pinning our hopes on a radical redesign (some speculated flat edges like on the iPhone), while others hinted at a new form factor. None of that turned out to be true, as Apple announced a nominally slimmer Watch Series 10 (measuring 9.7 mm thick) with a larger display. The design remains largely the same, except that the Watch now has a re-engineered speaker system that can play audio from podcasts and music apps, letting you listen clearly through your watch instead of needing AirPods.

2. Apple would announce a Watch Ultra 3

It seemed natural that a Watch Ultra 3 would drop this year, considering Apple refreshed the original Watch Ultra after just a single year. However, the Watch Ultra 2 only got a new color this year, a gorgeous satin black finish. Everything else about the Watch Ultra 2 remains the same on the design and hardware front, although Apple did announce a set of stunning Milanese metal straps for both the Natural and Satin Black finishes.

3. Launch of the AirPods Pro 3 and AirPods Max 2

Apple launched the AirPods Pro 2 in September 2022, and the AirPods Max way back in December 2020. It felt only natural to expect the company to give these devices their due upgrades, but it seemed like Apple had other plans. The company didn’t upgrade its highest-end earbuds and headphones; instead, the AirPods Pro 2 gain a set of new hearing health features that not only help keep your hearing intact over time, but also provide tools to measure hearing loss, something that’s usually a concern with prolonged earphone/headphone usage. Pending FDA approval, the AirPods Pro will offer clinically validated hearing test features, as well as an over-the-counter hearing aid feature. Oh, and while the AirPods Max didn’t get an upgrade, they DID get 5 new color options, along with USB-C charging… finally.

4. Launch of an iPhone Slim

I had my doubts about this, but when Apple announced the world’s slimmest iPad, clocking in at a mere 5.1 millimeters in thickness, it seemed like Apple had similar plans for the iPhone 16 series. However, none of that was true. Call it post-Bendgate trauma or just something Apple isn’t planning on working on, but the iPhone Slim never really became a thing. I’ll be honest, if Apple DID want to make a slim iPhone, the best way to do it would be to turn it into a foldable… but it seems like we’re years away from that for now.

Rumors we got right:

1. The Apple Watch would get ZERO AI features

This felt surprisingly sad even for a rumor, but when WWDC rolled around, everyone was quick to notice that watchOS didn’t get mentioned EVEN ONCE during the Apple Intelligence segment. I dismissed it as a mere oversight, hoping that Apple would announce big AI features for the 10th-anniversary Watch, but alas, the Watch Series 10 did NOT get any AI features. Sure, it has neural cores in its S10 SIP that use machine learning to detect heart problems, falls, and now even sleep apnea… but ‘intelligent Siri’ won’t be coming to the Watch any time soon. Or any Apple Intelligence feature, for that matter.

2. The AirPods Max would get upgraded to USB-C

It’s surprising that while Apple’s ENTIRE consumer product line got upgraded to USB-C, the AirPods Max got left behind. First announced in 2020 (when Lightning connectors were still a thing), Apple’s flagship headphones were practically ignored for the next 4 years (they still sold like hotcakes), and just as we were hoping for an AirPods Max 2, Apple decided to give its existing headphone collection 5 new color variants instead. However, along with the new colors, the AirPods Max DID finally get upgraded to USB-C charging, which means practically every mobile Apple device (AirPods, iPad, and iPhone) has officially ditched the Lightning connector. In fact, it’s just the Magic Mouse, Magic Trackpad, and Magic Keyboard that still have a Lightning port on them… but I guess the EU isn’t complaining about those.

3. The iPhone 16 series would have a Camera Control button

Arguably the biggest change to Apple’s iPhone since the Action Button, the new Camera Control button is surprisingly great. We speculated that it would just be a simple shutter button, but later reports indicated it would also have haptic feedback, along with a touch-slide gesture. Obviously, those rumors told only half the story, because it’s easy to speculate on hardware, but not on software. The Camera Control button’s features were finally announced at the keynote, highlighting just how capable this new button would be. Aside from opening the camera and clicking photos, the button can zoom in/out, adjust focus, switch through presets, and do a whole bunch of exciting new things.

4. A larger iPhone 16 Pro Max with a bigger battery, bigger 6.9″ display, and thinner bezels

Apple announcing their iPhone launch in the MIDDLE of IFA 2024 in Berlin felt like a strategic move… and it turns out it was one, because even though the iPhone 16 hadn’t been announced yet, cases for the new iPhone 16 series were on display at IFA, giving everyone a fair idea of what the new phones would look and feel like. Once I saw these cases on display, I obviously had to take a closer look… and upon doing so, I couldn’t help but notice how the 16 Pro Max case was significantly larger than my 15 Pro Max smartphone. It turns out Apple DID end up making their flagship phones bigger (with the iPhone 16 Pro Max sporting a whopping 6.9-inch display), while making the bezels thinner and cramming an even larger battery into their phones. The largest battery on any iPhone, as Apple is quick to point out.

5. The iPhone 16 would get a direct bump to the A18 chip

When Apple announced that their iPad Pros would skip the M3 and go directly to the M4, my jaw dropped. It seemed inconceivable that Apple would leapfrog ITSELF, but after that announcement, the rumor that the iPhone 16 would get an A18 chip seemed far more believable. Traditionally, the base-model iPhones get the same chip as the previous year’s Pro models… but Apple decided to be kind this year. Given how much of the new iPhone is centered around Apple Intelligence, it made sense to build a NEW chipset just for handling these AI tasks. The iPhone 16 series is the first to get Apple’s latest A18 chipset and, as Tim Cook says, is truly the first iPhone built from the ground up for Apple Intelligence.


Apple Unveils iPhone 16 Pro with New Camera, Video, and Audio Features Powered by A18 Pro Chip

Apple has introduced the iPhone 16 Pro, bringing significant updates to both its design and performance. The iPhone 16 Pro now features larger displays—6.3 inches for the Pro and 6.9 inches for the Pro Max—making them the biggest iPhone screens yet. Despite the larger size, Apple has minimized bezels, resulting in a sleek, nearly edge-to-edge look. The adaptive 120Hz ProMotion technology provides smooth scrolling, while the always-on display functionality gives users quick access to key information.

Designer: Apple

The iPhone 16 Pro’s design is constructed using aerospace-grade titanium, offering both durability and lightness. This titanium body is available in four finishes: Black Titanium, White Titanium, Natural Titanium, and Desert Titanium. The device also incorporates a new thermal architecture that improves sustained performance while keeping the phone cool during heavy use. The Pro models are also water and dust-resistant, ensuring longevity in various environments.

Photography: Real-Time Control and Customization

The iPhone 16 Pro empowers creativity through its 48-megapixel fusion camera, which integrates a second-generation quad-pixel sensor capable of reading data twice as fast, allowing for zero shutter lag. Whether capturing fast-moving subjects or subtle details, this new system ensures uncompromised resolution and detail. The sensor’s high-speed data transfer to the A18 Pro chip allows users to capture 48-megapixel ProRAW and HEIF photos effortlessly.

A new 48-megapixel ultra-wide camera complements the fusion camera, offering high-resolution shots with autofocus. It excels at capturing wider scenes and stunning macro shots, delivering sharpness and clarity that make it indispensable for creative users. The 5x telephoto camera, with Apple’s longest focal length yet, provides incredible zoom from a distance, while the tetraprism design improves optical performance for more detailed, high-quality images.

To streamline the photography process, the iPhone 16 Pro introduces an upgraded camera control system. This interface lets users quickly switch between lenses and adjust the depth of field and exposure with a dedicated slider. Later this year, a two-stage shutter will be added, allowing users to lock focus and exposure with a light press, offering precision when reframing shots.

Advanced Photographic Styles and Real-Time Grading

Apple has enhanced the creative range of the iPhone 16 Pro with advanced photographic styles, allowing users to personalize their photos in real-time. With the A18 Pro chip, the image pipeline dynamically adjusts skin tones, colors, highlights, and shadows, enabling a wider range of aesthetic choices. Users can apply styles like black-and-white, dramatic tones, and other custom looks that go beyond basic filters, and fine-tune the look with a new control pad that simultaneously adjusts tone and color.

What sets these styles apart is the ability to change them after capture, allowing greater flexibility in editing. The real-time preview enabled by the A18 Pro gives users a professional-level color grading experience as they shoot, a significant upgrade for photographers looking for more creative control.

Video: Cinema-Grade Capabilities

The iPhone 16 Pro significantly advances video recording. The new 4K 120fps recording in Dolby Vision is possible thanks to the faster sensor in the 48-megapixel fusion camera and the high transfer speeds of the Apple camera interface. The image signal processor (ISP) of the A18 Pro allows for frame-by-frame cinema-quality color grading, enabling professional-quality video capture directly on the iPhone.

One of the most exciting features is the ability to shoot 4K 120fps ProRes and Log video directly to an external storage device, perfect for high-end workflows that demand high frame rates and extensive color grading control. Users no longer need to commit to frame rates upfront—they can adjust playback speed after capture. Whether for slow-motion effects or cinematic storytelling, the iPhone 16 Pro offers flexible playback options, including half-speed, quarter-speed, and 1/5-speed for 24fps cinematic moments.
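For context, those ratios follow from simple frame-rate arithmetic (a back-of-the-envelope check on my part, not Apple’s wording, and the 60fps and 30fps playback rates here are illustrative assumptions): with capture fixed at 120fps, the apparent speed is just the playback frame rate divided by the capture frame rate.

\[
\frac{60}{120} = \tfrac{1}{2}, \qquad \frac{30}{120} = \tfrac{1}{4}, \qquad \frac{24}{120} = \tfrac{1}{5}
\]

So playing the same 120fps clip back at 24fps is what produces the 1/5-speed, cinematic slow-motion look mentioned above.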

The camera control interface supports third-party apps like FiLMiC Pro and Grid Composer, enabling advanced features such as precise framing based on the rule of thirds and other composition tools. This further solidifies the iPhone 16 Pro as a versatile tool for video creators.

Audio: Studio-quality sound and Spatial Audio

The iPhone 16 Pro also delivers significant audio upgrades. Four studio-quality microphones provide low noise levels for true-to-life sound capture, whether recording vocals or instruments. The reduced noise floor ensures high-quality audio, which is essential for professional recordings.

A new feature is spatial audio capture during video recording, enhancing the immersive experience when paired with AirPods or viewed on Apple Vision Pro. The spatial audio capture allows dynamic editing through the new audio mix feature, which uses machine learning to separate background elements from voices. This feature includes three voice options: in-frame mix, which isolates the person’s voice on camera; studio mix, which replicates a professional recording environment by eliminating reverb; and cinematic mix, which positions the vocal track upfront with surrounding environmental sounds in the background.

For content creators, Voice Memos now offers the ability to layer tracks on top of existing recordings. This is especially useful for musicians, who can now add vocals over a guitar track or any other instrumental recording. The system automatically isolates the voice from the background audio for a clean, professional result.

A18 Pro Chip: Powering Creativity and Performance

At the core of the iPhone 16 Pro’s new capabilities is the A18 Pro chip, built with second-generation 3-nanometer technology for improved performance and efficiency. The 16-core Neural Engine is designed for tasks requiring high computational power, such as machine learning and generative AI. With a 17% increase in system memory bandwidth, the iPhone 16 Pro can handle tasks such as ray tracing in gaming, 4K video editing, and more.

The A18 Pro chip’s enhanced image signal processor (ISP) enables real-time color grading and supports the advanced photo and video capabilities of the iPhone 16 Pro, ensuring that every shot and video benefits from professional-level quality. The chip’s GPU also provides 20% faster performance, allowing for smoother gaming and more efficient graphics rendering.

iPhone 16 Pro: Is It Time to Switch or Upgrade?

For professional creators, the iPhone 16 Pro delivers the performance and tools needed to meet demanding creative standards. Powered by the A18 Pro chip, it offers advanced photographic styles, pro-level video recording, and studio-quality audio. Whether capturing intricate details in images, producing cinematic-quality videos, or recording clear, high-fidelity audio, the iPhone 16 Pro provides the precision and control necessary to achieve your creative goals. This upgrade is a powerful creative tool designed to push the boundaries of your work, supporting and enhancing your vision with every use.


Apple Unveils iPhone 16 with Enhanced Camera, Customizable Controls, and A18 Chip

Apple introduced the iPhone 16, showcasing new durability, performance, and personalization features. The device retains its sleek look but incorporates significant upgrades, including color-infused glass in ultramarine, teal, and pink, alongside white and black. This design innovation gives the iPhone 16 a refreshed, vibrant appearance.

Designer: Apple

The iPhone 16 builds on the Action Button first seen in the iPhone 15 Pro, allowing users to access customizable functions quickly. A new dedicated camera button brings a more traditional camera experience, instantly launching the camera and functioning as a hardware shutter.

Powered by Apple’s new A18 chip, the iPhone 16 is built for handling advanced tasks, particularly those requiring machine learning and AI. The 16-core Neural Engine is twice as fast as the previous generation, significantly improving processing power for AI workloads. Apple increased memory bandwidth by 17%, allowing more efficient handling of large models and intensive tasks like large language models. The A18’s six-core CPU is 30% faster than the iPhone 15’s, with improved power efficiency and extended battery life. Apple claims the iPhone 16 can outperform some high-end desktop PCs.

The display has also seen upgrades, with brightness reaching 2,000 nits in sunlight and dropping to one nit in low-light environments for more comfortable viewing. The Ceramic Shield glass is 50% tougher than previous versions, adding to the iPhone’s overall durability and water and dust resistance.

Apple’s focus on AI brings Apple Intelligence to the iPhone 16. This system integrates generative models directly into the device, allowing for more natural language understanding and personalization. It can rewrite messages, suggest professional tones, and even create emojis based on descriptions. Additionally, users can use the new Visual Intelligence feature through the camera, which allows the device to identify objects, locations, and events simply by pointing the camera. This feature provides quick access to information such as restaurant details, event schedules, or product identification, bringing a new level of interaction to the iPhone.

The iPhone 16’s camera system has also been upgraded with a 48-megapixel main camera, allowing users to capture images with more detail. A new 2x telephoto option offers optical quality, made possible by the device’s computational photography capabilities. The ultra-wide camera, now with autofocus, allows for better low-light performance and macro photography, while spatial video and photo capture bring new dimensions to memory preservation, compatible with Apple’s Vision Pro headset.

Overall, the iPhone 16 introduces more power, control, and personalization options, making it a versatile tool for users who demand both performance and convenience in their daily tasks.


Apple Watch Ultra 2 Debuts in Satin Black with Sleep Apnea Detection

Apple has unveiled the Apple Watch Ultra 2, now available in a striking satin black finish and packed with new health features like sleep apnea detection. Announced at Apple’s September event, this refresh marks the first time the Ultra has been offered in more than one color since its launch two years ago.

Designer: Apple

The Ultra 2 uses its accelerometer and long-term motion tracking to detect signs of sleep apnea, enhancing the watch’s reputation as a health-focused device. Additionally, the new Vitals app monitors respiratory rate and sleep duration and flags any outliers in these key health metrics. Apple has also introduced the new Training Load feature, designed to help athletes balance exertion and recovery based on their activity levels.

Priced at $799, the Apple Watch Ultra 2 is available for preorder now, with deliveries beginning September 20th. It retains the rugged titanium case but now features a custom diamond-like carbon PVD coating, making it more scratch-resistant. The new titanium Milanese Loop, inspired by mesh used in scuba gear, provides a lightweight, corrosion-resistant band option.

The Ultra 2 is equipped with Apple’s S9 chip, which offers faster performance and enables on-device Siri processing for greater efficiency. It also sports a brighter display, ideal for outdoor activities, and remains Apple’s battery life leader, built to last through extended workouts and adventures.

watchOS 11, which powers the Ultra 2, updates the Smart Stack, now showing Live Activities similar to the iPhone. Offline maps, enhanced GPS accuracy, and detailed workout metrics make the watch a powerful tool for athletes, hikers, and divers alike. With automatic stroke detection for swimmers, track detection for runners, and cycling metrics like cadence and power, the Ultra 2 is designed to meet the needs of serious fitness enthusiasts.

This year’s addition of sleep apnea detection and a sleek black finish keeps the Ultra 2 as Apple’s premier wearable for those who demand advanced health tracking, durability, and performance in their smartwatch.


Apple Watch Series 10: Largest Display and Enhanced Usability Detailed

Apple’s latest wearable, the Apple Watch Series 10, pushes usability and design to new heights. It boasts the biggest, most advanced screen ever built into the brand’s smartwatch lineup. Slightly larger than the Apple Watch Ultra’s display, it provides up to 30% more area than earlier models, allowing users to increase font size without losing content and offering extra lines of text in apps like Messages and Mail.

Designer: Apple

In focusing on user experience, the company has made Series 10 easier to interact with. Rounded corners and a wider aspect ratio give the watch a sleek and softer appearance. One of the most significant upgrades is the first-ever wide-angle OLED display, designed to emit more light at wider viewing angles. This proves particularly useful when checking the watch from the side, such as while typing or walking. The display is also up to 40% brighter when viewed at an angle, making information easier to read in passing.

Battery efficiency remains a priority, and Series 10 features Always On mode, which updates once per second and displays a ticking second hand even when the wrist is down. A new watch face, Flux, takes full advantage of the large display, filling it with dynamic color. The case, made from a polished aluminum alloy, marks a first for Apple with its reflective Jet Black and warm Rose Gold finishes.

Comfort has been central to the design of the Series 10. The latest iteration is Apple’s thinnest watch, measuring 9.7 millimeters thick, almost 10% thinner than its predecessor. Achieving this thin profile required miniaturizing essential components, including the digital crown, SIP, and speaker system, which has been re-engineered to be 30% smaller without compromising sound quality. Users can now play music or podcasts directly through the watch’s speaker, eliminating the need for AirPods in certain situations.

Series 10 offers the fastest charging experience, thanks to a new metal back that integrates the antenna and enhances cellular performance. The design retains its 50-meter water resistance, making it ideal for swimming and surfing. Charging now reaches 80% in just 30 minutes, adding convenience to the user’s day.

Alongside the aluminum options, the tech giant has introduced polished titanium cases in three colors: natural, gold, and dark slate gray. These lightweight cases are nearly 20% lighter than the stainless steel Series 9, offering durability without extra bulk. The Milanese Loop and Classic Link bracelets have been updated to match these titanium finishes, creating a seamless metallic aesthetic.

Beyond functionality and design, the Series 10 reflects Apple’s commitment to sustainability. The titanium model is made with 95% recycled materials and produced using renewable electricity. The new S10 SIP, featuring a neural engine, powers key features like Siri dictation and crash detection, enhancing everyday usability and safety.

The Apple Watch Series 10 represents the company’s ongoing effort to combine thoughtful design with cutting-edge technology, delivering a seamless user experience worldwide.


Can You Build a Working iPhone From AliExpress Parts (Video)


Building an iPhone 13 Pro from scratch using parts sourced from AliExpress is an intriguing project that has captured the attention of tech enthusiasts and DIY aficionados alike. The awesome video from Phone Repair Guru delves into the feasibility, cost-effectiveness, and challenges associated with undertaking such an endeavor. The primary objective of this project is […]


More Last Minute Apple Event Details Leaked


The Apple event takes place later today, and recent leaks have provided a tantalizing glimpse into what’s in store. From the iPhone 16 lineup to the Apple Watch Series 10 and AirPods 4, the event promises to showcase a range of exciting updates and new features. However, it’s worth noting that some highly anticipated products […]


Awesome Back to School Apple Tech Tips You Need to Know


For students heading back to school, staying organized and productive is crucial for academic success. With the latest features in iOS 18 and macOS, Apple devices offer a wide range of tools to help you manage your schedules, research, notes, and notifications effectively. The awesome video from Stephen Robles provides 30 essential tips focused […]
