Meta Wants to Put an AI Health Tracker on Your Wrist in 2026. What Could Go Wrong??

Meta is building a smartwatch, and it wants to know your heart rate, your sleep patterns, your activity levels, and whatever else it can pull from a sensor pressed against your skin all day. The device is codenamed Malibu 2, it’s targeting a 2026 launch, and by most accounts it sounds like a perfectly competent health wearable. The problem isn’t the hardware. The problem is the company attached to it.

This is the same Meta that recently faced congressional scrutiny over social media addiction. The same Meta whose smart glasses are reportedly inching toward facial recognition. The same Meta that filed a patent for Project Lazarus, a system designed to generate posthumous content from deceased users, because apparently your data doesn’t stop being useful to them just because you do. Handing your most intimate biometric information to that company is an exercise in trust, and Meta’s track record is a case study in why you shouldn’t extend it.

Designer: Meta

To be fair, the product itself has a coherent logic behind it. Meta’s Ray-Ban Display glasses have been received surprisingly well by the press, and the neural wristband that ships with them, which uses electromyography to read muscle signals and translate them into gestures, only works with those glasses. That’s a real limitation. A smartwatch that absorbs that gesture-control functionality while adding health tracking and a persistent AI assistant would close a gap that currently makes the whole setup feel incomplete. From a pure product strategy standpoint, Malibu 2 makes sense.

The hardware ambitions have also matured since Meta’s first attempt at a smartwatch, which was scrapped in 2022 after accumulating plans for detachable cameras and metaverse tie-ins that never quite added up to a coherent device. Malibu 2 is reportedly focused on health tracking and Meta AI integration, which is a much cleaner pitch. The company already has a working partnership with Garmin, visible in the Oakley Vanguard sports glasses and a neural band demo at CES 2026 inside a Garmin-powered car concept. If there’s a natural manufacturing and platform partner for this watch, Garmin is the obvious candidate.

Meta is also reportedly developing the watch to sit alongside updated Ray-Ban Display glasses, internally called Hypernova 2, with both devices likely to be unveiled at Meta Connect in September. The Phoenix mixed reality glasses, meanwhile, have been pushed to 2027 partly because Meta’s executives were concerned about releasing too many devices at once and confusing consumers. That’s a reasonable concern. It’s also a little rich coming from a company whose current product lineup already includes smart glasses with a separate neural band that only controls one device.

The wearables market is genuinely ready for a credible third competitor alongside Apple and Samsung, and Meta has the AI infrastructure and the existing glasses ecosystem to make Malibu 2 compelling from launch. But compelling and trustworthy are different things, and Meta has spent twenty years demonstrating which one it prioritizes. Your Apple Watch data sits in Apple’s ecosystem, behind a company that has made privacy a marketing pillar and a legal battleground. Your Malibu 2 data sits with a company that patented a way to keep monetizing you after you die.

The post Meta Wants to Put an AI Health Tracker on Your Wrist in 2026. What Could Go Wrong?? first appeared on Yanko Design.

Meta Quest 3 feature turns any flat surface into functional Surface Keyboard with trackpad

Meta may have discontinued the Quest for Business program intended for its Quest 3, but Horizon OS v85 is set to introduce some niche features to the VR headset, starting with the Navigator UI replacing Horizon Feed as the default. On top of that, if the new Horizon OS v85 Public Test Channel (PTC) is any indication, the headset will be able to turn any flat surface, a table or a desk, into a virtual keyboard you can type on like a physical one.

The PTC for Quest OS v85 has started rolling out, and early YouTube hands-on videos and forum discussions reveal it’s available as an experimental feature exclusively on the Quest 3. The Quest 3S appears to have been left out for now (the reason isn’t apparent at the time of writing). The virtual keyboard itself materializes as if by magic on a table, turning it into a keyboard complete with a trackpad.

Designer: Meta

The feature is called Surface Keyboard, and it overlays a keyboard on top of any surface you want. With a tap on the handheld controllers, you can switch between the virtual keyboard and the controllers seamlessly. If mixed reality and hand tracking have always excited you, v85 of the operating system is going to take that experience on the Quest 3 to a new level.

To truly live this fiction, where you place your hands on a table for a couple of seconds, a keyboard appears out of nowhere right where your hands were, and you start typing with no buttons and no configuration, just your hands and a virtual keyboard, you will need to opt in to the Horizon OS PTC and install the pre-release build to tinker with.

If we remember correctly, Meta has been working on a virtual keyboard of this kind for the better part of a decade. In fact, Mark Zuckerberg demoed one back in 2023 and claimed he could reach 100 words per minute. Going by the videos and reviews floating around online, the keyboard will take some getting used to. That said, the setup is easy and straightforward.

Once you’ve opted in to the PTC, go to Movement Tracking and enable hand and body tracking, along with the double-tap-controllers gesture (so you can switch between the controllers and the keyboard). Next, enable Surface Keyboard under the Experimental and Unstable heading. Once that’s done, go to Devices, click Keyboard, run the setup, and you’re set. Place your hands flat on a surface, and in seconds, a keyboard will appear where your hands are.

The post Meta Quest 3 feature turns any flat surface into functional Surface Keyboard with trackpad first appeared on Yanko Design.

Meta Misread the Future Twice. Now They’re Sitting on a Golden Egg, But Don’t Know It

Mark Zuckerberg changed his company’s name to Meta in October 2021 because he believed the future was virtual. Not just sort-of virtual, like Instagram filters or Zoom calls, but capital-V Virtual: immersive 3D worlds where you’d work, socialize, and live a parallel digital life through a VR headset. Four years and roughly $70 billion in cumulative Reality Labs losses later, Meta is quietly dismantling that vision. In January 2026, the company laid off around 1,500 people from its metaverse division, shut down multiple VR game studios, killed its VR meeting app Workrooms, and effectively admitted that the grand bet on virtual reality had failed. Investors barely blinked. The stock went up.

The official line now is that Meta is pivoting to AI and wearables. Zuckerberg spent much of 2025 building what he calls a “superintelligence” lab, hiring top-tier AI talent with eye-watering compensation packages that are now one of the largest drivers of Meta’s 2026 expense growth. The company released Llama models that benchmark decently against OpenAI and Google, embedded chatbots into WhatsApp and Instagram, and talks constantly about “AI agents” and “new media formats.” But from a product and profit perspective, Meta’s AI strategy looks suspiciously like its metaverse strategy: lots of spending, vague promises, and no breakout consumer experience that people actually love. Meanwhile, the thing that is quietly working, the thing people are buying and using in the real world, is a pair of $300 smart glasses that Meta barely talks about. If this sounds like a pattern, that’s because it is. Meta has now misread the future twice in a row, and both times the answer was hiding in plain sight.

The Metaverse Was a $70 Billion Fantasy

Reality Labs has been hemorrhaging money since late 2020. As of early 2026, cumulative operating losses sit somewhere between $70 and $80 billion, depending on how you slice the quarters. In the third quarter of 2025 alone, Reality Labs posted a $4.4 billion loss on $470 million in revenue. For 2025 as a whole, the division lost more than $19 billion. These are not rounding errors or R&D investments that will pay off next year. These are structural losses tied to a product category, VR headsets and metaverse platforms, that the market simply does not want at the scale Meta imagined.

The vision sounded compelling in a keynote. You would strap on a Quest headset, meet your coworkers in a virtual conference room with floating whiteboards, then hop over to Horizon Worlds to hang out with friends as legless avatars. The problem was that almost no one wanted to do any of that for more than a demo. VR remained a niche gaming platform with occasional fitness and entertainment use cases, not the next paradigm shift in human interaction. Zuckerberg kept insisting the breakthrough was just around the corner. He was wrong, and the January 2026 layoffs and studio closures were the formal acknowledgment that Reality Labs as originally conceived was dead.

The irony is that Meta actually had a potential killer app inside Reality Labs, and it murdered it. Supernatural, a VR fitness game that Meta acquired for $400 million in 2023, was one of the few pieces of Quest software that generated genuine user loyalty and recurring revenue. People who used Supernatural regularly described it as the most effective home workout they had ever done, combining rhythm-based gameplay with full-body movement in a way that treadmills and Peloton bikes could not replicate. It had a subscription model, a dedicated community, and real retention. In January 2026, Meta moved Supernatural into “maintenance mode,” which is corporate speak for “we fired almost everyone and it will get no new content.” If you are trying to prove that VR has mainstream utility beyond gaming, fitness is one of the most obvious wedges. Meta had that wedge, and it chose to kill it in the same round of cuts that shuttered studios working on Batman VR games and other prestige titles. The message was clear: Zuckerberg had lost interest in Quest, even the parts that worked.

The AI Bet That Looks Like the Metaverse Bust 2.0

After spending years insisting the future was virtual worlds, Meta pivoted hard to AI in 2023 and 2024. Zuckerberg now talks about AI the way he used to talk about the metaverse: with sweeping language about paradigm shifts and transformative platforms. The company stood up an AI division focused on building what it calls “superintelligence,” hired aggressively from OpenAI and Anthropic, and made technical talent compensation the second-largest contributor to Meta’s 2026 expense growth behind infrastructure. This is not a side project. Meta is spending billions on AI research, training, and deployment, and Zuckerberg expects losses to remain near 2025 levels in 2026 before they start to taper.

From a technical standpoint, Meta’s AI work is solid. The Llama family of models is legitimately competitive with GPT-4 class systems and has found real adoption among developers who want open-source alternatives to OpenAI and Google. Meta’s internal AI is also driving real business value in ad targeting, content ranking, and moderation. Those systems work, and they contribute directly to Meta’s core revenue. But from a consumer product perspective, Meta’s AI feels scattered and often unnecessary. The company has embedded “Meta AI” chatbots into WhatsApp, Instagram, Messenger, and Facebook, none of which feel like natural places for a chatbot. Instagram’s feed is increasingly stuffed with AI-generated images and engagement bait that users actively complain about. Meta has launched character-based AI bots tied to influencers and celebrities, and approximately no one uses them. The gap between “we have impressive models” and “we have a product people love” is enormous, and it is the exact same gap that sank the metaverse.

What Meta is missing, again, is product intuition. OpenAI built ChatGPT and made it feel like the future because the interface was simple, the use cases were obvious, and it delivered consistent value. Google integrated Gemini into Search and productivity tools where users were already working. Meta, by contrast, seems to be throwing AI at every surface it controls and hoping something sticks. Zuckerberg talks about “an explosion of new media formats” and “more interactive feeds,” which in practice means more algorithmic slop and fewer posts from people you actually know. Analysts are starting to notice. One Bernstein note from early 2026 argued that the “winner” criterion in AI is shifting from model quality to product usage, which is a polite way of saying that having a great model does not matter if your product is annoying. Meta has a great model. Its products are annoying.

The financial picture is also murkier than Meta would like to admit. Reality Labs is still losing close to $20 billion a year, and while AI is not a separate reporting segment, the talent and infrastructure costs are clearly rising. Meta’s overall revenue growth is strong, driven by advertising, but the company is not yet showing a clear path to AI profitability outside of ad optimization. That puts Meta in the awkward position of having pivoted from one unprofitable moonshot (the metaverse) to another potentially unprofitable moonshot (consumer AI products) while the actual profitable parts of the business, social ads and engagement, keep the lights on. This is a pattern, and it is not a good one.

The Smart Glasses Lead That Meta Is Poised to Lose

Meta talks about the Ray-Ban smart glasses constantly. Zuckerberg calls them the “ultimate incarnation” of the company’s AI vision, and the pitch is relentless: sales more than tripled in 2025, the glasses represent the future of ambient computing, this is the post-smartphone platform. The problem is not that Meta is ignoring the glasses. The problem is that Meta is about to squander a massive early lead, and the competition is closing in fast. 2026 is shaping up to be a blockbuster year for smart glasses. Samsung confirmed its AR glasses are launching this year. Google is releasing its first pair of smart glasses since 2013, an audio-only pair similar to the Ray-Ban Meta glasses. Apple is reportedly pursuing its own smart glasses and shelved plans for a cheaper Vision Pro to prioritize the project. Meta dominated VR because it was early, cheap, and had no real competition. In smart glasses, that window is narrowing, and the field is getting crowded with all kinds of names, from smaller players like Looktech and Xgimi’s MemoMind to mid-sized brands like Xreal, to even larger ones like Google, TCL, and Xiaomi.

The Ray-Ban Meta glasses work because they are simple and focused. They take photos and videos, play music, make calls, and provide real-time answers through an AI assistant. Parents use them to record their kids hands-free. Travelers use them for translation. The form factor, actual Ray-Ban Wayfarers that cost around $300, means they do not scream “I am wearing a computer on my face.” This is the rare Meta hardware product that feels intuitive rather than forced, and it is selling because it solves boring, everyday problems without requiring users to change their behavior.

Then Meta made a critical mistake. To use the glasses, you have to route everything through the Meta AI app, which means you cannot simply use the hardware without engaging with Meta’s AI-slop ecosystem. Want to access your photos? Meta AI. Want to tweak settings? Meta AI. The app is the mandatory gateway, and it is stuffed with the same kind of algorithmic recommendations and AI-generated suggestions that clutter Instagram and Facebook. Instead of letting the glasses be a clean, utilitarian tool, Meta is using them as another vector to push its AI products. Google and Samsung are not going to make that mistake. Their glasses will integrate with Android XR and existing ecosystems without forcing users into a single AI app. Apple, if and when it launches, will almost certainly take a similar approach: clean hardware, seamless OS integration, optional AI features. Meta had a head start, Ray-Ban branding, and a product people actually liked. It is on track to waste all of that by prioritizing AI evangelism over product discipline, and the competition is going to eat its lunch.

What Happens When You Chase Narratives Instead of Products

The pattern across metaverse and AI is that Meta keeps betting on big, abstract visions rather than iterating on the things that work. Zuckerberg is a narrative-driven founder. He wants to define the future, not respond to it. That impulse gave us Facebook in 2004, when no one else saw the potential of real-identity social networks, but it has led Meta astray repeatedly in the 2020s. The metaverse was a narrative, not a product. The idea that billions of people would strap on headsets to work and socialize in 3D was always more science fiction than product roadmap, but Zuckerberg committed so hard to it that he renamed the company.

AI feels like the same mistake. The narrative is that foundation models and “agents” will transform every part of computing, and Meta wants to be seen as a leader in that transformation. The actual products, chatbots in WhatsApp and AI-generated feed content, do not meaningfully improve the user experience and in many cases make it worse. Meanwhile, the thing that is working, smart glasses, does not fit cleanly into the AI or metaverse narrative, so it gets less attention and investment than it deserves. Meta’s 2026 strategy, “shifting investment from metaverse to wearables,” is a tacit admission of this, but it is couched in language that still emphasizes AI rather than the hardware itself.

The other pattern is that Meta is willing to kill its own successes if they do not fit the broader narrative. Supernatural, the hit VR fitness game, was working. It had subscribers, retention, and cultural momentum within the VR fitness community. It was also a relatively small, specific product rather than a platform play, and that made it expendable when Meta decided to scale back Reality Labs. The same logic applies to Quest more broadly. The headset had carved out a niche in gaming and fitness, and with sustained investment in content and ecosystem development, it could have grown into a meaningful adjacent business. Instead, Meta is deprioritizing it because Zuckerberg has decided the future is AI and lightweight wearables. That might turn out to be correct, but the way Meta is executing the pivot, by shuttering studios and putting products in maintenance mode rather than spinning them out or finding partners, suggests a lack of product discipline.

Why Smart Glasses Might Actually Be the Next Facebook

If you step back and ask what Meta is actually good at, the answer is not virtual reality or language models. Meta is good at building social products with massive scale, capturing and distributing content, and monetizing attention through ads. The Ray-Ban Meta glasses fit all of those strengths. They make it easier to capture photos and video, which feeds into Instagram and Facebook. They use AI to provide contextual information, which ties into Meta’s model development. And they are a physical product that people wear in public, which is a form of distribution and branding that Meta has never had before.

The bigger story is that smart glasses as a category are exploding, and Meta happened to be early. It is not just Samsung, Google, and Apple entering the space. Meta itself is expanding the Ray-Ban line with Displays (which adds a heads-up display) and partnering with Oakley on HSTN, a sportier model aimed at action sports. Google is teaming up with Warby Parker for its glasses, which gives it instant credibility in eyewear design. And then there are the startups: Even Realities, Xiaomi, Looktech, MemoMind, and dozens more, all slated for 2026 releases. This feels exactly like the moment AirPods sparked the true wireless earbud movement. Apple defined the format, then everyone from Samsung to Sony to no-name brands flooded the market, and now you can buy HMD ANC earbuds for $28. Smart glasses are following the same trajectory, which means the form factor itself is validated, and Meta’s early lead matters less than whether it can keep iterating faster than everyone else.

The other underrated piece is that having an instant camera on your face is genuinely useful in ways that VR headsets never were. People are using Ray-Ban Meta glasses as GoPro alternatives while skateboarding, cycling, and doing action sports, because POV capture without holding a phone or mounting a camera is frictionless. Content creators are using them to shoot hands-free B-roll at events like CES. Parents are using them to record their kids playing without the weird “I am holding my phone up at the playground” vibe. Pet owners are capturing spontaneous moments with dogs and cats that would be impossible to get with a phone. These are not sci-fi use cases or metaverse fantasies. They are boring, real-world problems that the glasses solve immediately, and that is why they are selling. Meta has spent a decade chasing grand visions of the future, and it accidentally built a product that people want right now. The challenge is whether it can resist the urge to over-complicate it before Google, Samsung, and Apple catch up.

The Real Lesson Is About Focus

Meta has spent the last five years oscillating between grand visions, metaverse and AI, and neglecting the products that actually work. The Ray-Ban Meta glasses are proof that when Meta focuses on solving real problems with tangible products, it can still build things people want. The metaverse failed because it was a solution in search of a problem, and the AI push is struggling because Meta is shipping features rather than products. Smart glasses, by contrast, are succeeding because they make everyday tasks easier without requiring users to change their behavior or buy into a futuristic narrative.

If Zuckerberg can internalize that lesson, Meta might actually have a shot at owning the next platform. But that requires a level of product discipline and restraint that Meta has not shown in years. It means resisting the urge to turn every product into a platform, admitting when a bet has failed rather than pouring another $10 billion into it, and focusing on iteration over narration. The irony is that Meta already has the right product. It just needs to stop looking past it.

The post Meta Misread the Future Twice. Now They’re Sitting on a Golden Egg, But Don’t Know It first appeared on Yanko Design.

Should you upgrade to the new Ray-Ban Meta Wayfarer Limited-Edition Transparent Model?

Ray-Ban’s Meta Wayfarer glasses have quickly become the intersection of fashion and technology, combining classic style with advanced smart features. Recently, Ray-Ban and Meta unveiled the new Shiny Transparent Wayfarer, featuring exposed internal components and Clear to Sapphire Transitions lenses. While this new model pushes the boundaries of what smart glasses can look like, the big question is: should you upgrade, especially if you already own a pair? Let’s break it down.

Designer: Ray-Ban + Meta

If Money Is No Object, Then Yes—Go for It

If price isn’t a barrier, the decision to upgrade is straightforward. At $429, the Shiny Transparent Wayfarer offers a visually striking design that showcases the internal technology, creating a futuristic look that stands apart from the Matte Black version. The Clear to Sapphire Transitions lenses add another layer of sophistication, adapting to light conditions and giving the glasses a sleek sapphire tint when outdoors. This is an easy yes for those who enjoy staying at the forefront of wearable tech.

If You Want the New Lens Transition, It’s Worth Considering

If your current Ray-Ban Meta Wayfarer comes with standard clear lenses or basic non-adaptive sunglasses, upgrading to the new Transitions lenses could make a big difference in how you use the glasses day-to-day. The Clear to Sapphire Transitions lenses offer a smooth transition between indoor and outdoor settings, making it easier to adapt to different lighting conditions without needing to switch eyewear. When you’re indoors, the lenses remain clear, providing a natural and unobstructed view. However, once you step outside, they automatically darken to a sleek sapphire tint, adding a touch of style and protecting your eyes from harsh sunlight. For anyone who finds themselves frequently moving between environments, this flexibility could be a major convenience.

On the other hand, if you already own a pair with Clear to Green Transitions lenses, the upgrade may not offer enough of a difference to justify the change. Both lenses provide the same adaptive functionality, adjusting to light to enhance your vision while adding a color tint. The real difference lies in the aesthetic—whether you prefer the cooler sapphire tint or the more classic green hue. If you’re satisfied with the current performance and look of your lenses, there may be little reason to make the leap unless the sapphire color truly appeals to you.

If You Want a New Design with Exposed Tech, Then Yes

The most noticeable difference in the new model is the Shiny Transparent frame. This design exposes the inner workings of the glasses, giving them a high-tech look that contrasts with the more traditional Matte Black frame. The transparent frame brings an aesthetic shift, showcasing the cutting-edge technology that powers the glasses in a more visually pronounced way. It’s an intriguing design choice for those who appreciate a bold, futuristic look.

If you’re drawn to a more tech-forward, modern aesthetic, this new design is worth considering. The transparent frame is eye-catching and adds a fresh dimension to the Ray-Ban Meta Wayfarer collection. For those who want their eyewear to make a visual statement, the exposed components are a step forward in wearable tech design. However, if you prefer the more classic and understated look of the Matte Black Wayfarer, you might find that the new frame doesn’t offer enough reason to make the switch.

For Me, It’s a Hard No

For anyone who already owns the Matte Black Wayfarer with Clear to Green Transitions lenses, upgrading to the new Shiny Transparent model may not be necessary. Your current pair offers the same core features—AI-powered assistance, a 12MP camera, open-ear speakers, and a touchpad for easy control. The Clear to Green Transitions lenses provide excellent functionality, and if you’re happy with the design and tech you already have, there’s no pressing need to make the switch.

The Introduction of AI-Powered Features

With the recent updates, Ray-Ban and Meta have significantly improved the AI capabilities of the glasses. Now, you can use voice commands by simply saying “Hey Meta” and follow up with additional commands without repeating the wake word. The glasses can also remember important details like where you parked your car or set reminders for when you land after a flight. The ability to send voice messages via WhatsApp or Messenger while your hands are occupied adds an extra layer of convenience for staying connected on the go.

One of the more impressive AI features is real-time video assistance. Whether you’re exploring a new city or browsing the aisles of a grocery store, Meta AI can offer real-time help by identifying landmarks or suggesting meals based on the ingredients you’re looking at. Additionally, real-time language translation for Spanish, French, and Italian can remove language barriers, and future updates will likely support more languages.

Expanding Partnerships with Major Platforms

The glasses have long supported deeper integrations with platforms like Spotify and Amazon Music, and Ray-Ban has now expanded these offerings to include Audible and iHeart as well. You can use voice commands to search and play music or audiobooks without touching your phone. This makes the listening experience even more seamless, allowing you to ask questions like “What album is this from?” while on the move. These expanded partnerships deepen the glasses’ role in day-to-day media consumption.

The collaboration with Be My Eyes is another significant step in making the glasses more accessible. This app, designed for individuals who are blind or have low vision, pairs users with sighted volunteers who provide real-time assistance. The glasses’ camera allows the volunteer to see what the wearer sees, enabling them to help with tasks like reading mail or navigating new environments.

Are You Going for It?

Ultimately, the decision to upgrade comes down to personal preference and how much you value the new design and lens options. If money isn’t an issue or you’re drawn to the transparent frame and sapphire lenses, the upgrade makes sense. However, if you’re content with your current Matte Black Wayfarer with Clear to Green Transitions lenses, there’s no pressing reason to switch. The new features and design are exciting, but your existing pair still holds up as a stylish, highly functional piece of wearable tech.

The post Should you upgrade to the new Ray-Ban Meta Wayfarer Limited-Edition Transparent Model? first appeared on Yanko Design.

Meta’s futuristic Orion AR Glasses have Holographic Displays and Neural Control. Apple should take notes

At the Meta Connect 2024 keynote, Mark Zuckerberg not only debuted actual augmented reality with holographic displays and neural control, he did so in a device that’s smaller, lighter, and, one could argue, more socially acceptable (aka stylish) than Apple’s Vision Pro. Dubbed the Orion, it’s simply a developer prototype for now, but Meta hopes to refine the design, improve the displays, and actually sell it at an affordable price to consumers.

Designer: Meta

Orion is not a bulky headset—it’s a sleek, spectacle-like device that weighs under 100 grams, making it comfortable for extended use. This is an impressive feat considering the amount of technology packed into such a small form factor. While Meta Quest Pro and Apple’s Vision Pro are capable of mixed reality, Orion’s fully transparent, holographic display takes things to a different level. Instead of the passthrough experiences that blend digital elements on top of a live camera feed, Orion projects 3D objects directly into the real world using innovative waveguide technology. The frames are made from magnesium, a super-light metal known for its strength and ability to dissipate heat (something even NASA’s relied on for its space hardware).

The core of this magic is a set of tiny projectors embedded within the arms of the glasses. These projectors beam light into lenses that have nanoscale 3D structures, creating stunningly sharp holographic displays. Zuckerberg emphasized that you could go about your day—whether you’re working in a coffee shop or flying on a plane—while interacting with immersive AR elements like a cinema-sized virtual screen or multiple work monitors.

But it’s not just about visuals. The glasses also facilitate natural social interaction: you can maintain eye contact with others through the transparent lenses, and digital elements seamlessly overlay onto the real world. Need to send a message? Instead of fumbling for your phone, a hologram will appear before your eyes, letting you reply with a quick, subtle gesture. This fluid integration of the digital and physical worlds could set Orion apart from its competitors.

When it comes to control, the Orion glasses offer several interaction modes—voice, hand, and eye tracking—but the star of the show is the neural wristband. In contrast to the Vision Pro, which relies on hand gestures, eye-tracking, and voice commands, Orion takes the next step by reading neural signals at your wrist to control the device. This neural interface allows for discreet control. Imagine being in a meeting or walking down the street—gesturing in mid-air or speaking commands aloud isn’t always convenient. The wristband picks up the subtle electrical signals your motor neurons send to the muscles in your hand and translates them into actions, like tapping your fingers to summon a holographic card game or message a friend. This introduces a new level of human-computer interaction, far more intimate and nuanced than what’s currently available on the market.
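Meta hasn’t disclosed how the wristband decodes those signals, but the classic EMG pipeline — rectify the raw muscle signal, smooth it into an envelope, and fire a gesture when the envelope crosses a threshold — can be sketched in a few lines. The function names, window size, threshold, and sample values below are made-up illustrative choices, not Meta’s algorithm.

```python
def emg_envelope(samples, window=4):
    """Moving RMS of a raw EMG trace: rectification plus smoothing."""
    env = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        env.append((sum(x * x for x in chunk) / len(chunk)) ** 0.5)
    return env

def detect_taps(samples, threshold=0.4, window=4):
    """Count rising-edge threshold crossings of the envelope;
    each crossing is treated as one finger-tap gesture."""
    env = emg_envelope(samples, window)
    taps, above = 0, False
    for v in env:
        if v >= threshold and not above:
            taps += 1
        above = v >= threshold
    return taps
```

Fed a synthetic trace of quiet baseline interrupted by two bursts of muscle activity, `detect_taps` reports two taps; a flat baseline reports none. Real decoders classify far richer gestures with learned models, but the envelope-and-threshold idea is the usual starting point.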

While Apple’s Vision Pro and Meta’s previous Quest Pro have been praised for their intuitive interaction systems, Orion’s neural control represents a massive leap forward. It reduces the friction of interacting with digital elements by cutting down on the physical and vocal gestures required, creating a more seamless experience.

One of the key differentiators for Orion is its display technology. Unlike the Vision Pro or Meta Quest Pro, which rely on cameras to pass a live feed of the outside world onto a screen, Orion offers true augmented reality. The glasses project digital holograms directly into your field of view, blending with your surroundings. This isn’t just a camera feed of your environment with digital elements superimposed—it’s real-world AR with transparent lenses that you can see through as you would normal glasses. The holograms are bright enough to stand out even in varied lighting conditions and sharp enough to allow users to perceive fine details in their digital overlays.

Zuckerberg illustrated this with examples: receiving a message as a floating hologram or “teleporting” a distant friend’s avatar into your living room. The display architecture is entirely new, made possible by custom silicon chips and sensors integrated into the glasses, offering a level of immersion that’s more subtle yet more profound than the pass-through systems we’ve seen so far. In a private demo, he even played a metaverse version of Pong with key industry figures like Nvidia CEO Jensen Huang and investors like Gary Vaynerchuk and Daymond John of Shark Tank.

For all its innovation, Orion is still in the development phase. Zuckerberg was candid that Orion is not yet ready for consumers. Instead, it will serve as a development kit for Meta’s internal teams and a select group of external partners. This will help refine both the hardware and software, as well as grow the ecosystem of apps and experiences that will make Orion valuable when it eventually hits the consumer market. There’s also the matter of affordability—Zuckerberg mentioned the team is working to improve manufacturing processes to bring the cost down. As it stands, this isn’t a device you’ll see in stores next week, but it’s a crucial step in realizing Meta’s vision for the future of AR.

The potential for Orion is vast. Zuckerberg envisions it as the next major computing platform, capable of reshaping how we work, play, and interact with others. By leveraging the power of true augmented reality with a groundbreaking neural interface, Orion positions itself as more than just a wearable gadget—it’s an entirely new way of interfacing with the digital and physical worlds. For now, it’s an exciting glimpse into what the future might hold. The Orion glasses may not be in your hands today, but their arrival could redefine the entire AR landscape in the years to come.


Meta’s new ‘Affordable’ Quest 3s Headset leaks online, hinting at strong Spatial rivalry with Apple

With multiple rumors floating around that Apple is dead set on building an affordable version of its Vision Pro headsets (probably named the Vision Air), it seems like Meta is doubling down on the affordable headset space too, with the upcoming Meta Quest 3s – a budget alternative to the Quest 3 from just last year.

Images of the Quest 3S leaked around March this year, but new details are finally emerging as Meta gets ready to launch the affordable headset, both to pre-empt Apple as well as ByteDance (the TikTok parent company), which is also rumored to be debuting a headset as soon as August 20th.

Designer: Meta

The Quest 3S will reportedly house the same Snapdragon XR2 Gen 2 processor found in its predecessor, ensuring it maintains robust performance capabilities. This processor is specifically designed for XR devices, providing the necessary computational power to handle complex VR and AR applications seamlessly. The inclusion of this processor suggests that Meta isn’t compromising on core performance, which is crucial for maintaining the immersive experience users expect from their devices.

The Quest 3S will feature 1832 x 1920 fast-switching LCD panels. While this might not be as high-end as some OLED displays, it still offers a refresh rate of 90/120 Hz, which should be more than adequate for most users. This choice helps keep costs down while still providing clear, fluid visuals. For users who might be new to VR, the slightly reduced specs in the display won’t be a dealbreaker, especially when considering the price.

The headset will come equipped with Fresnel lenses, which are known for being lightweight while offering a wide field of view. This design helps make the Quest 3S comfortable to wear, even during extended sessions. Additionally, the headset will feature a three-position inter-pupillary distance (IPD) adjustment, so users can adjust the lens spacing to get the sharpest possible view based on their eye spacing. These kinds of thoughtful features show that Meta is keeping the user experience front and center, even with a more budget-friendly model.

The design of the Quest 3S has also been a topic of conversation, particularly due to its unique triangular camera clusters that have surfaced in leaked images. These clusters are expected to house two 4 MP RGB passthrough cameras, four infrared (IR) tracking cameras, and two IR illuminators for depth sensing. This array of sensors is designed to ensure that the headset can accurately track movements and provide a realistic sense of depth, essential for an immersive experience. There’s also an action button, which is rumored to be customizable, allowing users to tweak the functionality to suit their preferences.

Meta’s decision to maintain the Quest Touch Plus controllers in the 3S suggests a commitment to a consistent user experience across its XR ecosystem. These controllers have been praised for their ergonomic design and precision, making them a valuable asset for both VR newcomers and veterans. The use of these familiar controllers will also likely reduce production costs, allowing Meta to pass savings on to consumers.

As for pricing, although nothing has been officially confirmed, it’s expected that the Quest 3S will come in at under $300. This makes it a highly competitive option in the XR market, especially as other companies like ByteDance prepare to launch their own budget-friendly headsets. With the XR space getting more crowded, Meta’s move to introduce a more affordable yet capable device could be a game-changer, opening up mixed reality to a much wider audience. The Quest 3S seems poised to offer a well-rounded experience without breaking the bank, making it a promising choice for those looking to dip their toes into the world of VR and AR.


Logitech MX Ink stylus for Meta Quest gives creators a new tool for mixed reality

Mixed reality platforms, or spatial computing as Apple calls it, try to seamlessly blend digital objects into the real world, but that illusion quickly breaks down when it comes to manipulating those virtual pieces directly. Yes, tapping on buttons in thin air or pinching the corner of floating windows might feel somewhat natural, but creating content, especially 2D and 3D objects, is less believable when all you have is a “wand” in each hand. For decades, the stylus has been the tool of choice for digital artists and designers because of its precision and familiarity, almost like holding a pencil or paintbrush. It was really only a matter of time before the same device came to mixed reality, which is exactly what the Logitech MX Ink tries to bring to the virtual table.

Designer: Logitech

The Logitech MX Ink is essentially a stylus designed to work in virtual 3D space, and while that description is simplistic, its implications are rather world-changing. It means that creators no longer need to wave around a thick wand that makes them feel more like they’re playing a game than painting or modeling. Artists, designers, and sculptors can now use a more convenient and intuitive tool when moving around in mixed reality, bolstering not only their productivity but also the quality of their work. Admittedly, the MX Ink is bulkier and heavier than most styluses, closer to a 3D printing pen than an Apple Pencil, and drawing on air is still going to feel unnatural at first, but it’s significantly better than drawing with your finger.

What makes Logitech’s implementation a bit more special is that it works in both 3D and 2D spaces. The latter means that you can still draw on a flat surface and feel the same haptics and pressure sensitivity as a Wacom stylus, for example. This means you can easily trace over a sketch or blueprint on paper and bring that up to a 3D space for fleshing out. Or you can paint artistic masterpieces on a physical canvas without actually leaving any mark on the paper.

The MX Ink is a standalone product, but Logitech is also offering optional accessories to further reduce the friction of working in mixed reality. The MX Mat offers a low-friction surface for drawing with the stylus in 2D, though the MX Ink can actually work on most flat surfaces anyway. The MX Inkwell is a stand and wireless charging station for the device, letting you simply lift it from the dock to start drawing and then put it back without having to worry it won’t be charged and ready for your next work session. Without the MX Inkwell, the stylus will have to charge via a USB-C connection, and Logitech doesn’t even ship a cable with it.

As promising as this new creativity tool might sound, its use is limited to the Meta Quest 2 and Quest 3 headsets, ironically leaving the Quest Pro out of the party. Logitech boasts that this is the first time Quest headsets support more than two paired controllers at the same time, which means you can connect the MX Ink and simply switch between it and the regular Quest controllers without having to reconfigure anything every time. The Logitech MX Ink goes on sale in September with a starting price of $129.99.


Meta Quest 3S images leak online, hinting at an even more affordable VR headset


The Meta Quest 3 was supposed to be the cheaper alternative to the Meta Quest Pro… but now leaked photos from an internal presentation show a new device called the Meta Quest 3S, a ‘lite’ version of the already wildly popular VR headset. Posted by user u/LuffySanKira on Reddit, screenshots supposedly from a Meta user research session offer a glimpse of the potential Quest 3S. The images showcase the rumored headset alongside the standard Quest 3, revealing some key specifications.

Designer: Meta

The Quest 3s is expected to be a more affordable version of its pricier counterpart. According to the leaks, it will feature a display resolution of 1920 x 1832 with 20 pixels per degree (PPD). This falls short of the Quest 3’s rumored 2208 x 2064 resolution and 25.5 PPD. Storage capacity is also speculated to be lower at 256GB compared to the Quest 3’s 512GB.
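Pixels per degree is just panel resolution divided by the field of view the lens spreads it across, so the leaked figures can be sanity-checked with one-line arithmetic. The sketch below treats the 1920-pixel dimension as horizontal for illustration (the leak doesn’t specify orientation), and the back-calculated FOV values are implied by the leak’s own numbers, not official specs; real headsets also vary in pixel density across the lens, so these are averages at best.

```python
def pixels_per_degree(horizontal_pixels, horizontal_fov_deg):
    """Average angular pixel density across the visible field."""
    return horizontal_pixels / horizontal_fov_deg

def implied_fov(horizontal_pixels, ppd):
    """Back out the field of view a PPD figure assumes for a panel."""
    return horizontal_pixels / ppd

# Quest 3S leak: 1920-wide panel at 20 PPD  -> implies ~96 degrees
# Quest 3 leak: 2208-wide panel at 25.5 PPD -> implies ~86.6 degrees
fov_3s = implied_fov(1920, 20)
fov_3 = implied_fov(2208, 25.5)
```

Interestingly, the arithmetic suggests the Quest 3 leak assumes a narrower field of view than the 3S figure, a reminder that PPD numbers from different sources aren’t always measured the same way.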

The leaked images provide a visual comparison as well. The Quest 3s appears slightly smaller overall, with the most noticeable difference being the front sensors. The Quest 3 has three oval cutouts, while the Quest 3s sports a configuration of six stacked cutouts, three on either side. These leaks are yet to be confirmed by Meta. However, they offer an exciting possibility for VR fans seeking a more accessible entry point into the world of virtual reality.


Microsoft Mesh lets you hold virtual meetings around virtual bonfires

The hype around the so-called Metaverse seems to have died down a bit. Even Facebook, which changed its name to Meta to emphasize its new mission, has been rather silent on that front, especially with AI being the hottest thing in tech these days. With the launch of the Apple Vision Pro, however, interest in mixed reality, as well as AR and VR, is once again on the rise. As such, now seems like the best time for Microsoft to make its own virtual meeting platform, Microsoft Mesh, widely available. It encourages a new approach to hybrid work that has attendees “sitting” around digital bonfires or in posh virtual rooms, all in the name of making people feel more connected even when they’re each just sitting in their own homes.

Designer: Microsoft

In order to shake off the image of something only for games and entertainment, platform developers like Meta and Microsoft try to make mixed reality technologies something that’s actually useful for serious business as well. These usually involve providing virtual spaces for meetings, creating avatars that represent employees, and holding more interactive and livelier gatherings that would otherwise be a boring experience of watching people’s faces in a grid of boxes. In other words, they try to recreate the feelings and emotions of meeting in person when they physically can’t.

Microsoft Mesh is Redmond’s solution to this problem. Think of it as a VR Microsoft Teams; it is, in fact, integrated into Microsoft’s collaboration platform. With just a few clicks, you can turn a flat, literally and figuratively, meeting into a 3D virtual experience, complete with bars, chairs, fires, and, of course, a screen inside a screen for showing presentations to your team. You’ll have to create your own personalized avatar, preferably something close to your real-world appearance, and you can decorate your spaces the way you want, including company logos, of course.


Microsoft is leaning heavily on its no-code tools to make Mesh more enticing, in addition to having it tied to Microsoft Teams in the first place. Designing the area is a simple process of dragging and dropping assets as you would in a 3D game editor, thanks to a collaboration with Unity. But if that is already too complex, Microsoft Copilot offers an easier method that uses AI to translate your prompts into captivating virtual interiors, or at least the semblance of one. Whether it’s a simple stand-up meeting that needs everyone on their toes, a brainstorming session that requires a bit more creativity, or a presentation that needs to keep people attentive, a virtual meeting space is probably going to help spice things up a bit.

Mesh comes at an interesting time when businesses are actually pushing for their workers to return to the office completely. For many companies, however, hybrid has become an unavoidable and permanent reality, with both the benefits and drawbacks it carries, particularly when it comes to the indirect interaction between humans. Microsoft Mesh is being positioned as the next best thing to support those social connections even when actual physical cues are absent. It’s now being made available for Windows PCs, but those who want a more immersive and convincing experience can enjoy it using their Meta Quest headset. That said, you’ll need a Microsoft subscription as well, so it’s not exactly something that everyone can experience.
