Meta Is Turning Its Smart Glasses Into A Mass Surveillance Tool… And You Can’t Stop It

If not Palantir, why Palantir-shaped??

Palantir builds spy tech for the CIA, DHS, and ICE. It aggregates data, maps your life, and tells governments who to watch. Meta is building something with the same bones. It’s called Name Tag, a facial recognition feature coming to Ray-Ban smart glasses that lets a wearer look at a stranger in public and have an AI identify them in real time, pulling their name and profile directly from Facebook and Instagram. The surveillance hardware is a $300 fashion accessory, the database was built by 3 billion people tagging photos for free, and the targets are anyone, anywhere, who never agreed to any of it.

A leaked internal memo from May 2025, obtained by The New York Times, laid out the full scope: the feature is planned for every pair of Meta’s glasses, from Ray-Bans to the Oakley Meta HSTN sports line. Meta’s official response was a practiced non-denial: “we’re still thinking through options and will take a thoughtful approach if and before we roll anything out.” Companies that aren’t building something just say they’re not building it. Meta is not saying that.

The Database Was Being Built Before the Glasses Existed

Facebook turned on automatic photo tagging in 2010 with zero opt-in, and for eleven years, every time you tagged a friend’s face in a photo, you were feeding their facial recognition model. When Meta “deleted” over a billion faceprints in 2021 under lawsuit pressure, they kept the photos. They kept the social graph. They kept the engineers who built the whole thing. Name Tag isn’t a new product concept; it’s a previously mothballed capability getting a second run, this time with a camera on your face instead of a server in Menlo Park.

Anyone with a public Instagram account is immediately a potential target (it’s not like making your account private makes you any safer), which covers hundreds of millions of people who signed up to share photos, not to be enrolled in a real-world biometric identification system. Remember Portal, Meta’s smart home display with a face-tracking camera? It launched in 2018 right in the middle of the Cambridge Analytica fallout, and consumers collectively declined to put a Facebook camera in their living room. Meta discontinued it by 2022. The lesson they apparently took wasn’t “don’t build surveillance hardware.” It was “make sure the camera comes in wearing someone else’s face.”

They Know Exactly How We’ll React

“We will launch during a dynamic political environment where many civil society groups that we would expect to attack us would have their resources focused on other concerns.” That’s a sentence from an internal planning document from Meta’s Reality Labs, dated May 2025 and reviewed by The New York Times. The company was explicitly planning to exploit civic chaos as a launch window, timing the rollout of a mass surveillance feature to land while some other crisis had everyone’s attention. Sleight of hand, with a dash of corporate evil. There’s no ethical framework in which that sentence represents good-faith product development.

Their original rollout plan was to debut Name Tag at a conference for the blind, wrapping a mass-surveillance tool in the language of accessibility before expanding it to the general public. That plan was eventually shelved, but the thinking behind it is the more revealing part. The accessibility framing was a softening mechanism, a way to generate human-interest coverage before the obvious misuse cases took over the conversation. Privacy advocates, abuse charities, and civil liberties groups were going to come for this feature regardless. The strategy was never to address their concerns. It was to buy a news cycle of goodwill first.

Your Face Is Being Reviewed in a Nairobi Office Park Right Now

Swedish newspapers Svenska Dagbladet and Göteborgs-Posten tracked Meta’s data pipeline from Ray-Ban glasses worn in Western homes to a company called Sama, operating out of an office park in Nairobi, Kenya. Workers there are paid to watch footage captured by glasses users and label what they see, teaching Meta’s AI to understand and interpret the visual world. The footage includes people on the toilet, naked bodies, couples in bed, bank card details accidentally filmed, and intimate conversations between people who had no idea they were being recorded, let alone reviewed by a contractor on another continent.

Meta’s defense was to point at a clause buried in their terms of service permitting “manual (human)” review of AI interactions, which is technically accurate and practically worthless as a justification, because no person buying a pair of fashion-forward smart glasses understands that clause to mean workers in Kenya are watching them undress. The April 2025 privacy policy update for the glasses silently expanded Meta’s right to use all captured photos, videos, and audio for AI training, with no prominent notification to existing owners. A class action lawsuit filed in San Francisco federal court in March 2026 argues this constitutes consumer fraud, given that Meta’s own marketing described the glasses as “designed for privacy, controlled by you.” The UK’s Information Commissioner’s Office wrote to Meta characterizing the situation as “concerning,” which in British regulatory language lands somewhere between “deeply troubled” and “genuinely alarmed.”

$2.1 Billion in Fines and Still Going

The fine history reads like a repeat offender’s rap sheet. Meta paid $650 million to settle an Illinois class action over collecting facial geometry without consent through Facebook’s “Tag Suggestions” feature. They paid another $68.5 million for the same BIPA violation in 2023. In 2024, Texas extracted $1.4 billion from them for capturing biometric data on millions of Texans “for commercial purposes” without informed consent, with the lawsuit specifically alleging Meta was disclosing that data for profit. That’s over $2.1 billion in biometric privacy penalties across four years, all for variations of the same violation, against the same company, building the same technology.

None of it changed the product roadmap. The Texas settlement of $1.4 billion represents roughly one percent of Meta’s $134 billion in 2023 revenue. The Electronic Privacy Information Center has filed complaints with the FTC calling Name Tag a direct facilitator of “stalking, harassment, doxxing and worse.” The EU’s AI Act classifies real-time remote biometric identification in public spaces as high-risk AI and prohibits it for most commercial applications. The fines and the regulatory pressure are clearly baked into Meta’s planning rather than functioning as deterrents. They paid $2.1 billion to establish what a decade of biometric data collection actually costs, looked at that number next to their revenue, and decided it wasn’t a fine. It was an investment.

The Glasses Are Just the Beginning

Name Tag as currently designed still requires the wearer to deliberately trigger an identification query. The next product removes even that minimal friction. Internal documents describe “super sensing” glasses with always-on cameras and microphones that record continuously for the entire duration they’re worn, feeding an unbroken stream to an AI assistant that builds a fully searchable log of the wearer’s day. The surveillance model shifts from opt-in query to permanent ambient default. Every person who passes within the glasses’ field of view gets their face processed, regardless of whether they’ve opted out, regardless of whether they even know the technology exists.

The threat model was demonstrated in 2024 by two Harvard students, AnhPhu Nguyen and Caine Ardayfio, using nothing but current, available hardware. They connected Ray-Ban Meta Gen 2 glasses to PimEyes, a commercial facial recognition engine, alongside LLM data extraction tools, FastPeopleSearch, and Cloaked.com for social security lookups. Streaming the feed to Instagram Live, they identified strangers on the Boston subway and pulled names, home addresses, phone numbers, and social security numbers in seconds. They approached a woman on the street, told her they’d met at a Cambridge Community Foundation event, and she believed them. They told a female student her Atlanta home address and her parents’ names; she confirmed they were right. Name Tag doesn’t make this possible. It already is possible. Name Tag just makes it Meta’s official product.

What “Opt-Out” Actually Means

Meta’s proposed safeguards rely on limiting identification to connected contacts or public accounts, and offering an opt-out toggle buried in Instagram settings. The connected-contacts restriction doesn’t address the most statistically common danger. Stalkers, abusers, and harassers overwhelmingly target people they already know. Limiting the feature to existing connections doesn’t reduce the risk to the most vulnerable users; it focuses it on them. Domestic abuse charities in the UK raised this point directly, noting that abusers could use Name Tag to locate survivors who have relocated, changed their appearance, or created entirely new digital identities to stay safe.

The opt-out toggle is available to Instagram’s roughly 2 billion monthly active users, almost none of whom will encounter it organically. Privacy protections that require the potential victim to proactively locate and activate a setting are not privacy protections. They are liability documentation. Abuse survivors, journalists, political dissidents, undocumented individuals, people in witness protection: these are the people with the highest stakes, and also the people with the least bandwidth to hunt through app settings on the off chance that facial recognition has been added to a device they don’t even own. The toggle protects Meta in a courtroom. It protects its users in no meaningful sense at all.

We Were Free Labor All Along

Twenty years of tagging photos, liking posts, following accounts, and uploading selfies. Every interaction trained the model. Every tagged face sharpened the database. Meta framed all of it as self-expression and social connection, and it was, but it was also free labor on the world’s largest biometric mapping project. The glasses are the hardware layer that connects that digital registry to the physical world. The data collection phase is largely complete. The deployment phase is now.

Reddit ran the same playbook with text and nobody stopped them either. In early 2024, Reddit signed a $60 million-per-year deal with Google to license user-generated content for AI training, then struck a separate deal with OpenAI estimated at $70 million annually. Two decades of forum posts, niche expertise, personal advice, and community-built knowledge that users created for each other got packaged and sold to the highest bidder. Users built the database. Reddit sold it. The users got nothing except the knowledge that their words now live inside a model they don’t control. Meta’s version is identical in structure and more intimate in substance, because the asset being extracted isn’t something you typed. It’s your face, your home, and the faces of everyone in your immediate vicinity.

While all of this unfolds on the hardware and data side, Meta is simultaneously stripping privacy from the software side. End-to-end encryption for Instagram DMs dies on May 8, 2026. Meta’s stated justification is that “very few people” were using it, which is a direct consequence of never making it the default and never promoting it. After May 8, Meta retains full technical access to message content, which means any contractor, government request, or legal process with sufficient leverage can access it too. The feature was specifically extended to users in Ukraine and Russia during the war as a safety measure for people in genuine danger. Those users are now being told to download their chats before the cutoff. The facial recognition is the front door. The unencrypted message access is the unlocked safe. At some point the question stops being “is Meta building a surveillance company?” and starts being “why are we still acting like it isn’t one?”

The post Meta Is Turning Its Smart Glasses Into A Mass Surveillance Tool… And You Can’t Stop It first appeared on Yanko Design.

Meta Wants to Put an AI Health Tracker on Your Wrist in 2026. What Could Go Wrong??

Meta is building a smartwatch, and it wants to know your heart rate, your sleep patterns, your activity levels, and whatever else it can pull from a sensor pressed against your skin all day. The device is codenamed Malibu 2, it’s targeting a 2026 launch, and by most accounts it sounds like a perfectly competent health wearable. The problem isn’t the hardware. The problem is the company attached to it.

This is the same Meta that just faced congressional scrutiny over social media addiction. The same Meta whose smart glasses are reportedly inching toward facial recognition. The same Meta that filed a patent for Project Lazarus, a system designed to generate posthumous content from deceased users, because apparently your data doesn’t stop being useful to them just because you do. Handing your most intimate biometric information to that company is a case study in misplaced trust waiting to be written.

Designer: Meta

To be fair, the product itself has a coherent logic behind it. Meta’s Ray-Ban Display glasses have been received surprisingly well by the press, and the neural wristband that ships with them, which uses electromyography to read muscle signals and translate them into gestures, only works with those glasses. That’s a real limitation. A smartwatch that absorbs that gesture-control functionality while adding health tracking and a persistent AI assistant would close a gap that currently makes the whole setup feel incomplete. From a pure product strategy standpoint, Malibu 2 makes sense.

The hardware ambitions have also matured since Meta’s first attempt at a smartwatch, which was scrapped in 2022 after accumulating plans for detachable cameras and metaverse tie-ins that never quite added up to a coherent device. Malibu 2 is reportedly focused on health tracking and Meta AI integration, which is a much cleaner pitch. The company already has a working partnership with Garmin, visible in the Oakley Vanguard sports glasses and a neural band demo at CES 2026 inside a Garmin-powered car concept. If there’s a natural manufacturing and platform partner for this watch, Garmin is the obvious candidate.

Meta is also reportedly developing the watch to sit alongside updated Ray-Ban Display glasses, internally called Hypernova 2, with both devices likely to be unveiled at Meta Connect in September. The Phoenix mixed reality glasses, meanwhile, have been pushed to 2027 partly because Meta’s executives were concerned about releasing too many devices at once and confusing consumers. That’s a reasonable concern. It’s also a little rich coming from a company whose current product lineup already includes smart glasses with a separate neural band that only controls one device.

The wearables market is genuinely ready for a credible third competitor alongside Apple and Samsung, and Meta has the AI infrastructure and the existing glasses ecosystem to make Malibu 2 compelling from launch. But compelling and trustworthy are different things, and Meta has spent twenty years demonstrating which one it prioritizes. Your Apple Watch data sits in Apple’s ecosystem, behind a company that has made privacy a marketing pillar and a legal battleground. Your Malibu 2 data sits with a company that patented a way to keep monetizing you after you die.

The post Meta Wants to Put an AI Health Tracker on Your Wrist in 2026. What Could Go Wrong?? first appeared on Yanko Design.

Meta Quest 3 feature turns any flat surface into functional Surface Keyboard with trackpad

Meta may have discontinued the Quest for Business program intended for its Quest 3, but Horizon OS v85 is set to introduce some niche features to the VR headset, starting with the Navigator UI replacing Horizon Feed as the default. Beyond that, if the new Horizon OS v85 Public Test Channel (PTC) is any indication, the headset will be able to turn any flat surface – a table or a desk – into a virtual keyboard you can type on like a physical one.

The PTC for Horizon OS v85 has started rolling out, and initial YouTube hands-on videos and forum discussions reveal the virtual keyboard is available as an experimental feature exclusively on the Quest 3. The Quest 3S appears to have been left out, for reasons not apparent at the time of writing. On the Quest 3, the keyboard appears on a table as if by magic, trackpad included.

Designer: Meta

The feature is called Surface Keyboard, and it overlays a keyboard on top of any surface you want. With a tap on the handheld controllers, you can switch between the virtual keyboard and the controllers seamlessly. If mixed reality and hand tracking have always excited you, the Quest 3 is going to take that experience to a new level with v85 of its operating system.

To truly live this fiction – where you place your hands on a table for a couple of seconds, a keyboard appears out of nowhere right where your hands were, and you can start typing with no buttons and no configuration, just your hands and a virtual keyboard – you will need to opt in to the Horizon OS PTC and receive the pre-release version to tinker with.

If we remember correctly, Meta has been working on a virtual keyboard of this kind for the better part of a decade. In fact, it was in 2023 that Mark Zuckerberg demoed one and claimed he could reach 100 words per minute on it. Going by the videos and reviews floating around online, the keyboard will take some getting used to. That said, the setup is easy and straightforward.

Once you have opted in to the PTC, go to Movement Tracking and enable hand and body tracking, along with the double-tap-controllers gesture so you can switch between controllers and hand tracking. Next, enable Surface Keyboard under the Experimental and Unstable heading. Finally, go to Devices, select Keyboard, and run the setup, and you’re set. Place your hands flat on a surface, and in seconds, a keyboard will appear where your hands are.

The post Meta Quest 3 feature turns any flat surface into functional Surface Keyboard with trackpad first appeared on Yanko Design.

Meta Misread the Future Twice. Now They’re Sitting on a Golden Egg, But Don’t Know It

Mark Zuckerberg changed his company’s name to Meta in October 2021 because he believed the future was virtual. Not just sort-of virtual, like Instagram filters or Zoom calls, but capital-V Virtual: immersive 3D worlds where you’d work, socialize, and live a parallel digital life through a VR headset. Four years and roughly $70 billion in cumulative Reality Labs losses later, Meta is quietly dismantling that vision. In January 2026, the company laid off around 1,500 people from its metaverse division, shut down multiple VR game studios, killed its VR meeting app Workrooms, and effectively admitted that the grand bet on virtual reality had failed. Investors barely blinked. The stock went up.

The official line now is that Meta is pivoting to AI and wearables. Zuckerberg spent much of 2025 building what he calls a “superintelligence” lab, hiring top-tier AI talent with eye-watering compensation packages that are now one of the largest drivers of Meta’s 2026 expense growth. The company released Llama models that benchmark decently against OpenAI and Google, embedded chatbots into WhatsApp and Instagram, and talks constantly about “AI agents” and “new media formats.” But from a product and profit perspective, Meta’s AI strategy looks suspiciously like its metaverse strategy: lots of spending, vague promises, and no breakout consumer experience that people actually love. Meanwhile, the thing that is quietly working, the thing people are buying and using in the real world, is a pair of $300 smart glasses that Meta still treats as a side bet rather than the main story. If this sounds like a pattern, that’s because it is. Meta has now misread the future twice in a row, and both times the answer was hiding in plain sight.

The Metaverse Was a $70 Billion Fantasy

Reality Labs has been hemorrhaging money since late 2020. As of early 2026, cumulative operating losses sit somewhere between $70 and $80 billion, depending on how you slice the quarters. In the third quarter of 2025 alone, Reality Labs posted a $4.4 billion loss on $470 million in revenue. For 2025 as a whole, the division lost more than $19 billion. These are not rounding errors or R&D investments that will pay off next year. These are structural losses tied to a product category, VR headsets and metaverse platforms, that the market simply does not want at the scale Meta imagined.

The vision sounded compelling in a keynote. You would strap on a Quest headset, meet your coworkers in a virtual conference room with floating whiteboards, then hop over to Horizon Worlds to hang out with friends as legless avatars. The problem was that almost no one wanted to do any of that for more than a demo. VR remained a niche gaming platform with occasional fitness and entertainment use cases, not the next paradigm shift in human interaction. Zuckerberg kept insisting the breakthrough was just around the corner. He was wrong, and the January 2026 layoffs and studio closures were the formal acknowledgment that Reality Labs as originally conceived was dead.

The irony is that Meta actually had a potential killer app inside Reality Labs, and it murdered it. Supernatural, a VR fitness game that Meta acquired for $400 million in 2023, was one of the few pieces of Quest software that generated genuine user loyalty and recurring revenue. People who used Supernatural regularly described it as the most effective home workout they had ever done, combining rhythm-based gameplay with full-body movement in a way that treadmills and Peloton bikes could not replicate. It had a subscription model, a dedicated community, and real retention. In January 2026, Meta moved Supernatural into “maintenance mode,” which is corporate speak for “we fired almost everyone and it will get no new content.” If you are trying to prove that VR has mainstream utility beyond gaming, fitness is one of the most obvious wedges. Meta had that wedge, and it chose to kill it in the same round of cuts that shuttered studios working on Batman VR games and other prestige titles. The message was clear: Zuckerberg had lost interest in Quest, even the parts that worked.

The AI Bet That Looks Like the ‘Metaverse Bust’ 2.0

After spending years insisting the future was virtual worlds, Meta pivoted hard to AI in 2023 and 2024. Zuckerberg now talks about AI the way he used to talk about the metaverse: with sweeping language about paradigm shifts and transformative platforms. The company stood up an AI division focused on building what it calls “superintelligence,” hired aggressively from OpenAI and Anthropic, and made technical talent compensation the second-largest contributor to Meta’s 2026 expense growth behind infrastructure. This is not a side project. Meta is spending billions on AI research, training, and deployment, and Zuckerberg expects losses to remain near 2025 levels in 2026 before they start to taper.

From a technical standpoint, Meta’s AI work is solid. The Llama family of models is legitimately competitive with GPT-4 class systems and has found real adoption among developers who want open-source alternatives to OpenAI and Google. Meta’s internal AI is also driving real business value in ad targeting, content ranking, and moderation. Those systems work, and they contribute directly to Meta’s core revenue. But from a consumer product perspective, Meta’s AI feels scattered and often unnecessary. The company has embedded “Meta AI” chatbots into WhatsApp, Instagram, Messenger, and Facebook, none of which feel like natural places for a chatbot. Instagram’s feed is increasingly stuffed with AI-generated images and engagement bait that users actively complain about. Meta has launched character-based AI bots tied to influencers and celebrities, and approximately no one uses them. The gap between “we have impressive models” and “we have a product people love” is enormous, and it is the exact same gap that sank the metaverse.

What Meta is missing, again, is product intuition. OpenAI built ChatGPT and made it feel like the future because the interface was simple, the use cases were obvious, and it delivered consistent value. Google integrated Gemini into Search and productivity tools where users were already working. Meta, by contrast, seems to be throwing AI at every surface it controls and hoping something sticks. Zuckerberg talks about “an explosion of new media formats” and “more interactive feeds,” which in practice means more algorithmic slop and fewer posts from people you actually know. Analysts are starting to notice. One Bernstein note from early 2026 argued that the “winner” criteria in AI is shifting from model quality to product usage, which is a polite way of saying that having a great model does not matter if your product is annoying. Meta has a great model. Its products are annoying.

The financial picture is also murkier than Meta would like to admit. Reality Labs is still losing close to $20 billion a year, and while AI is not a separate reporting segment, the talent and infrastructure costs are clearly rising. Meta’s overall revenue growth is strong, driven by advertising, but the company is not yet showing a clear path to AI profitability outside of ‘ad optimization’. That puts Meta in the awkward position of having pivoted from one unprofitable moonshot (metaverse) to another potentially unprofitable moonshot (consumer AI products) while the actual profitable parts of the business, social ads and engagement, keep the lights on. This is a pattern, and it is not a good one.

The Smart Glasses Lead That Meta Is Poised to Lose

Meta talks about the Ray-Ban smart glasses constantly. Zuckerberg calls them the “ultimate incarnation” of the company’s AI vision, and the pitch is relentless: sales more than tripled in 2025, the glasses represent the future of ambient computing, this is the post-smartphone platform. The problem is not that Meta is ignoring the glasses. The problem is that Meta is about to squander a massive early lead, and the competition is closing in fast. 2026 is shaping up to be a blockbuster year for smart glasses. Samsung confirmed its AR glasses are launching this year. Google is releasing its first pair of smart glasses since 2013, an audio-only pair similar to the Ray-Ban Meta glasses. Apple is reportedly pursuing its own smart glasses and shelved plans for a cheaper Vision Pro to prioritize the project. Meta dominated VR because it was early, cheap, and had no real competition. In smart glasses, that window is closing fast, and the field is getting crowded with all kinds of names, from smaller players like Looktech and Xgimi’s Memomind to mid-sized brands like Xreal, to even larger ones like Google, TCL, and Xiaomi.

The Ray-Ban Meta glasses work because they are simple and focused. They take photos and videos, play music, make calls, and provide real-time answers through an AI assistant. Parents use them to record their kids hands-free. Travelers use them for translation. The form factor, actual Ray-Ban Wayfarers that cost around $300, means they do not scream “I am wearing a computer on my face.” This is the rare Meta hardware product that feels intuitive rather than forced, and it is selling because it solves boring, everyday problems without requiring users to change their behavior.

Then Meta made a critical mistake. To use the glasses, you have to route everything through the Meta AI app, which means you cannot just power-use the hardware without engaging with Meta’s AI-slop ecosystem. Want to access your photos? Meta AI. Want to tweak settings? Meta AI. The app is the mandatory gateway, and it is stuffed with the same kind of algorithmic recommendations and AI-generated suggestions that clutter Instagram and Facebook. Instead of letting the glasses be a clean, utilitarian tool, Meta is using them as another vector to push its AI products. Google and Samsung are not going to make that mistake. Their glasses will integrate with Android XR and existing ecosystems without forcing users into a single AI app. Apple, if and when it launches, will almost certainly take a similar approach: clean hardware, seamless OS integration, optional AI features. Meta had a head start, Ray-Ban branding, and a product people actually liked. It is on track to waste all of that by prioritizing AI evangelism over product discipline, and the competition is going to eat its lunch.

What Happens When You Chase Narratives Instead of Products

The pattern across metaverse and AI is that Meta keeps betting on big, abstract visions rather than iterating on the things that work. Zuckerberg is a narrative-driven founder. He wants to define the future, not respond to it. That impulse gave us Facebook in 2004, when no one else saw the potential of real-identity social networks, but it has led Meta astray repeatedly in the 2020s. The metaverse was a narrative, not a product. The idea that billions of people would strap on headsets to work and socialize in 3D was always more science fiction than product roadmap, but Zuckerberg committed so hard to it that he renamed the company.

AI feels like the same mistake. The narrative is that foundation models and “agents” will transform every part of computing, and Meta wants to be seen as a leader in that transformation. The actual products, chatbots in WhatsApp and AI-generated feed content, do not meaningfully improve the user experience and in many cases make it worse. Meanwhile, the thing that is working, smart glasses, does not fit cleanly into the AI or metaverse narrative, so it gets less attention and investment than it deserves. Meta’s 2026 strategy, “shifting investment from metaverse to wearables,” is a tacit admission of this, but it is couched in language that still emphasizes AI rather than the hardware itself.

The other pattern is that Meta is willing to kill its own successes if they do not fit the broader narrative. Supernatural, the hit VR fitness game on Quest, was working. It had subscribers, retention, and cultural momentum within the VR fitness community. It was also a relatively small, specific product rather than a platform play, and that made it expendable when Meta decided to scale back Reality Labs. The same logic applies to Quest more broadly. The headset had carved out a niche in gaming and fitness, and with sustained investment in content and ecosystem development, it could have grown into a meaningful adjacent business. Instead, Meta is deprioritizing it because Zuckerberg has decided the future is AI and lightweight wearables. That might turn out to be correct, but the way Meta is executing the pivot, by shuttering studios and putting products in maintenance mode rather than spinning them out or finding partners, suggests a lack of product discipline.

Why Smart Glasses Might Actually Be the Next Facebook

If you step back and ask what Meta is actually good at, the answer is not virtual reality or language models. Meta is good at building social products with massive scale, capturing and distributing content, and monetizing attention through ads. The Ray-Ban Meta glasses fit all of those strengths. They make it easier to capture photos and video, which feeds into Instagram and Facebook. They use AI to provide contextual information, which ties into Meta’s model development. And they are a physical product that people wear in public, which is a form of distribution and branding that Meta has never had before.

The bigger story is that smart glasses as a category are exploding, and Meta happened to be early. It is not just Samsung, Google, and Apple entering the space. Meta itself is expanding the Ray-Ban line with the Ray-Ban Display (which adds a heads-up display) and partnering with Oakley on HSTN, a sportier model aimed at action sports. Google is teaming up with Warby Parker for its glasses, which gives it instant credibility in eyewear design. And then there are the startups: Even Realities, Xiaomi, Looktech, MemoMind, and dozens more, all slated for 2026 releases. This feels exactly like the moment AirPods sparked the true wireless earbud movement. Apple defined the format, then everyone from Samsung to Sony to no-name brands flooded the market, and now you can buy HMD ANC earbuds for $28. Smart glasses are following the same trajectory, which means the form factor itself is validated, and Meta’s early lead matters less than whether it can keep iterating faster than everyone else.

The other underrated piece is that having an instant camera on your face is genuinely useful in ways that VR headsets never were. People are using Ray-Ban Meta glasses as GoPro alternatives while skateboarding, cycling, and doing action sports, because POV capture without holding a phone or mounting a camera is frictionless. Content creators are using them to shoot hands-free B-roll at events like CES. Parents are using them to record their kids playing without the weird “I am holding my phone up at the playground” vibe. Pet owners are capturing spontaneous moments with dogs and cats that would be impossible to get with a phone. These are not sci-fi use cases or metaverse fantasies. They are boring, real-world problems that the glasses solve immediately, and that is why they are selling. Meta has spent a decade chasing grand visions of the future, and it accidentally built a product that people want right now. The challenge is whether it can resist the urge to over-complicate it before Google, Samsung, and Apple catch up.

The Real Lesson Is About Focus

Meta has spent the last five years oscillating between grand visions, metaverse and AI, and neglecting the products that actually work. The Ray-Ban Meta glasses are proof that when Meta focuses on solving real problems with tangible products, it can still build things people want. The metaverse failed because it was a solution in search of a problem, and the AI push is struggling because Meta is shipping features rather than products. Smart glasses, by contrast, are succeeding because they make everyday tasks easier without requiring users to change their behavior or buy into a futuristic narrative.

If Zuckerberg can internalize that lesson, Meta might actually have a shot at owning the next platform. But that requires a level of product discipline and restraint that Meta has not shown in years. It means resisting the urge to turn every product into a platform, admitting when a bet has failed rather than pouring another $10 billion into it, and focusing on iteration over narration. The irony is that Meta already has the right product. It just needs to stop looking past it.

The post Meta Misread the Future Twice. Now They’re Sitting on a Golden Egg, But Don’t Know It first appeared on Yanko Design.

Should you upgrade to the new Ray-Ban Meta Wayfarer Limited-Edition Transparent Model?

Ray-Ban’s Meta Wayfarer glasses have quickly become the intersection of fashion and technology, combining classic style with advanced smart features. Recently, Ray-Ban and Meta unveiled the new Shiny Transparent Wayfarer, featuring exposed internal components and Clear to Sapphire Transitions lenses. While this new model pushes the boundaries of what smart glasses can look like, the big question is: should you upgrade, especially if you already own a pair? Let’s break it down.

Designer: Ray-Ban + Meta

If Money Is No Object, Then Yes—Go for It

If price isn’t a barrier, the decision to upgrade is straightforward. At $429 USD, the Shiny Transparent Wayfarer offers a visually striking design that showcases the internal technology, creating a futuristic look that stands apart from the Matte Black version. The Clear to Sapphire Transitions lenses add another layer of sophistication, adapting to light conditions and giving the glasses a sleek sapphire tint when outdoors. This is an easy yes for those who enjoy staying at the forefront of wearable tech.

If You Want the New Lens Transition, It’s Worth Considering

If your current Ray-Ban Meta Wayfarer comes with standard clear lenses or basic non-adaptive sunglasses, upgrading to the new Transitions lenses could make a big difference in how you use the glasses day-to-day. The Clear to Sapphire Transitions lenses offer a smooth transition between indoor and outdoor settings, making it easier to adapt to different lighting conditions without needing to switch eyewear. When you’re indoors, the lenses remain clear, providing a natural and unobstructed view. However, once you step outside, they automatically darken to a sleek sapphire tint, adding a touch of style and protecting your eyes from harsh sunlight. For anyone who finds themselves frequently moving between environments, this flexibility could be a major convenience.

On the other hand, if you already own a pair with Clear to Green Transitions lenses, the upgrade may not offer enough of a difference to justify the change. Both lenses provide the same adaptive functionality, adjusting to light to enhance your vision while adding a color tint. The real difference lies in the aesthetic—whether you prefer the cooler sapphire tint or the more classic green hue. If you’re satisfied with the current performance and look of your lenses, there may be little reason to make the leap unless the sapphire color truly appeals to you.

If You Want a New Design with Exposed Tech, Then Yes

The most noticeable difference in the new model is the Shiny Transparent frame. This design exposes the inner workings of the glasses, giving them a high-tech look that contrasts with the more traditional Matte Black frame. The transparent frame brings an aesthetic shift, showcasing the cutting-edge technology that powers the glasses in a more visually pronounced way. It’s an intriguing design choice for those who appreciate a bold, futuristic look.

If you’re drawn to a more tech-forward, modern aesthetic, this new design is worth considering. The transparent frame is eye-catching and adds a fresh dimension to the Ray-Ban Meta Wayfarer collection. For those who want their eyewear to make a visual statement, the exposed components are a step forward in wearable tech design. However, if you prefer the more classic and understated look of the Matte Black Wayfarer, you might find that the new frame doesn’t offer enough reason to make the switch.

For Me, It’s a Hard No

For anyone who already owns the Matte Black Wayfarer with Clear to Green Transitions lenses, upgrading to the new Shiny Transparent model may not be necessary. Your current pair offers the same core features—AI-powered assistance, a 12MP camera, open-ear speakers, and a touchpad for easy control. The Clear to Green Transitions lenses provide excellent functionality, and if you’re happy with the design and tech you already have, there’s no pressing need to make the switch.

The Introduction of AI-Powered Features

With the recent updates, Ray-Ban and Meta have significantly improved the AI capabilities of the glasses. Now, you can use voice commands by simply saying “Hey Meta” and follow up with additional commands without repeating the wake word. The glasses can also remember important details like where you parked your car or set reminders for when you land after a flight. The ability to send voice messages via WhatsApp or Messenger while your hands are occupied adds an extra layer of convenience for staying connected on the go.

One of the more impressive AI features is real-time video assistance. Whether you’re exploring a new city or browsing the aisles of a grocery store, Meta AI can offer real-time help by identifying landmarks or suggesting meals based on the ingredients you’re looking at. Additionally, real-time language translation for Spanish, French, and Italian can remove language barriers, and future updates will likely support more languages.

Expanding Partnerships with Major Platforms

The glasses already supported integrations with platforms like Spotify and Amazon Music, and Ray-Ban has expanded these offerings to include Audible and iHeart as well. Now, you can use voice commands to search and play music or audiobooks without touching your phone. This makes the listening experience even more seamless, allowing you to ask questions like “What album is this from?” while on the move. These expanded partnerships deepen the glasses’ role in day-to-day media consumption.

The collaboration with Be My Eyes is another significant step in making the glasses more accessible. This app, designed for individuals who are blind or have low vision, pairs users with sighted volunteers who provide real-time assistance. The glasses’ camera allows the volunteer to see what the wearer sees, enabling them to help with tasks like reading mail or navigating new environments.

Are You Going for It?

Ultimately, the decision to upgrade comes down to personal preference and how much you value the new design and lens options. If money isn’t an issue or you’re drawn to the transparent frame and sapphire lenses, the upgrade makes sense. However, if you’re content with your current Matte Black Wayfarer with Clear to Green Transitions lenses, there’s no pressing reason to switch. The new features and design are exciting, but your existing pair still holds up as a stylish, highly functional piece of wearable tech.

The post Should you upgrade to the new Ray-Ban Meta Wayfarer Limited-Edition Transparent Model? first appeared on Yanko Design.

Meta’s futuristic Orion AR Glasses have Holographic Displays and Neural Control. Apple should take notes

At the Meta Connect 2024 keynote, not only did Mark Zuckerberg debut actual Augmented Reality with holographic displays and neural control, he did so in a device that’s smaller, lighter, and, one could argue, more socially acceptable (aka stylish) than Apple’s Vision Pro. Dubbed the Orion, it’s simply a developer prototype for now, but Meta hopes to refine the design, improve the displays, and actually sell it at an affordable price to consumers.

Designer: Meta

Orion is not a bulky headset—it’s a sleek, spectacle-like device that weighs under 100 grams, making it comfortable for extended use. This is an impressive feat considering the amount of technology packed into such a small form factor. While Meta Quest Pro and Apple’s Vision Pro are capable of mixed reality, Orion’s fully transparent, holographic display takes things to a different level. Instead of the passthrough experiences that blend digital elements on top of a live camera feed, Orion projects 3D objects directly into the real world using innovative waveguide technology. The frames are made from magnesium, a super-light metal known for its strength and ability to dissipate heat (something even NASA’s relied on for its space hardware).

The core of this magic is a set of tiny projectors embedded within the arms of the glasses. These projectors beam light into lenses that have nanoscale 3D structures, creating stunningly sharp holographic displays. Zuckerberg emphasized that you could go about your day—whether you’re working in a coffee shop or flying on a plane—while interacting with immersive AR elements like a cinema-sized virtual screen or multiple work monitors.

But it’s not just about visuals. The glasses also facilitate natural social interaction: you can maintain eye contact with others through the transparent lenses, and digital elements seamlessly overlay onto the real world. Need to send a message? Instead of fumbling for your phone, a hologram will appear before your eyes, letting you reply with a quick, subtle gesture. This fluid integration of the digital and physical worlds could set Orion apart from its competitors.

When it comes to control, the Orion glasses offer several interaction modes—voice, hand, and eye tracking—but the star of the show is the neural wristband. In contrast to the Vision Pro, which relies on hand gestures, eye-tracking, and voice commands, Orion takes the next step by reading neural signals at your wrist to control the device. This neural interface allows for discreet control. Imagine being in a meeting or walking down the street—gesturing in mid-air or speaking aloud commands isn’t always convenient. The wristband picks up the subtle electrical signals your brain sends to the muscles in your wrist, a technique called electromyography (EMG), and translates them into actions, like tapping your fingers to summon a holographic card game or message a friend. This introduces a new level of human-computer interaction, far more intimate and nuanced than what’s currently available on the market.
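
Meta hasn’t published how the wristband’s signal chain actually works, but basic EMG gesture detection is a well-understood technique. The Python sketch below is a minimal, hypothetical illustration of the general idea, with every function name, number, and threshold invented for the example: rectify the raw muscle signal, smooth it into an amplitude envelope, and treat a threshold crossing as a finger tap.

```python
import numpy as np

def emg_envelope(samples: np.ndarray, fs: float, window_s: float = 0.05) -> np.ndarray:
    """Full-wave rectify a raw EMG trace and smooth it into an amplitude envelope."""
    rectified = np.abs(samples - samples.mean())   # remove DC offset, then rectify
    win = max(1, int(window_s * fs))               # moving-average window in samples
    return np.convolve(rectified, np.ones(win) / win, mode="same")

def detect_taps(envelope: np.ndarray, threshold: float) -> list[int]:
    """Return indices where the envelope rises through the threshold (one per burst)."""
    above = envelope > threshold
    return (np.flatnonzero(~above[:-1] & above[1:]) + 1).tolist()

# Demo: one second of sensor noise with a simulated muscle burst at the 0.5 s mark.
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
signal = 0.02 * np.random.randn(t.size)
signal[500:560] += 0.5 * np.sin(2 * np.pi * 120 * t[500:560])   # the "finger tap"

print(detect_taps(emg_envelope(signal, fs), threshold=0.1))     # one onset near sample 500
```

A shipping product would replace the fixed threshold with a classifier trained across multiple electrode channels, which is presumably much closer to what Meta’s wristband does; the point here is only how muscle electricity becomes a discrete input event.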

While Apple’s Vision Pro and Meta’s previous Quest Pro have been praised for their intuitive interaction systems, Orion’s neural control represents a massive leap forward. It reduces the friction of interacting with digital elements by cutting down on the physical and vocal gestures required, creating a more seamless experience.

One of the key differentiators for Orion is its display technology. Unlike the Vision Pro or Meta Quest Pro, which rely on cameras to pass a live feed of the outside world onto a screen, Orion offers true augmented reality. The glasses project digital holograms directly into your field of view, blending with your surroundings. This isn’t just a camera feed of your environment with digital elements superimposed—it’s real-world AR with transparent lenses that you can see through as you would normal glasses. The holograms are bright enough to stand out even in varied lighting conditions and sharp enough to allow users to perceive fine details in their digital overlays.

Zuckerberg illustrated this with examples: receiving a message as a floating hologram or “teleporting” a distant friend’s avatar into your living room. The display architecture is entirely new, made possible by custom silicon chips and sensors integrated into the glasses, offering a level of immersion that’s more subtle yet more profound than the pass-through systems we’ve seen so far. In a private demo, he even played a metaverse version of Pong with key industry experts like Nvidia CEO Jensen Huang, and investors like Gary Vaynerchuk and Daymond John of Shark Tank.

For all its innovation, Orion is still in the development phase. Zuckerberg was candid that Orion is not yet ready for consumers. Instead, it will serve as a development kit for Meta’s internal teams and a select group of external partners. This will help refine both the hardware and software, as well as grow the ecosystem of apps and experiences that will make Orion valuable when it eventually hits the consumer market. There’s also the matter of affordability—Zuckerberg mentioned the team is working to improve manufacturing processes to bring the cost down. As it stands, this isn’t a device you’ll see in stores next week, but it’s a crucial step in realizing Meta’s vision for the future of AR.

The potential for Orion is vast. Zuckerberg envisions it as the next major computing platform, capable of reshaping how we work, play, and interact with others. By leveraging the power of true augmented reality with a groundbreaking neural interface, Orion positions itself as more than just a wearable gadget—it’s an entirely new way of interfacing with the digital and physical worlds. For now, it’s an exciting glimpse into what the future might hold. The Orion glasses may not be in your hands today, but their arrival could redefine the entire AR landscape in the years to come.

The post Meta’s futuristic Orion AR Glasses have Holographic Displays and Neural Control. Apple should take notes first appeared on Yanko Design.

Meta’s new ‘Affordable’ Quest 3s Headset leaks online, hinting at strong Spatial rivalry with Apple

With multiple rumors floating around that Apple is dead set on building an affordable version of its Vision Pro headsets (probably named the Vision Air), it seems like Meta is doubling down on the affordable headset space too, with the upcoming Meta Quest 3s – a budget alternative to the Quest 3 from just last year.

Images of the Quest 3S leaked around March this year, but new details are finally emerging as Meta gets ready to launch the affordable headset, both to pre-empt Apple and ByteDance (the TikTok company), which is also rumored to be debuting a headset as soon as August 20th.

Designer: Meta

The Quest 3S will reportedly house the same Snapdragon XR2 Gen 2 processor found in the Quest 3, ensuring it maintains robust performance capabilities. This processor is specifically designed for XR devices, providing the necessary computational power to handle complex VR and AR applications seamlessly. The inclusion of this processor suggests that Meta isn’t compromising on core performance, which is crucial for maintaining the immersive experience users expect from their devices.

The Quest 3S will feature 1832 x 1920 fast-switching LCD panels. While this might not be as high-end as some OLED displays, it still offers a refresh rate of 90/120 Hz, which should be more than adequate for most users. This choice helps keep costs down while still providing clear, fluid visuals. For users who might be new to VR, the slightly reduced specs in the display won’t be a dealbreaker, especially when considering the price.

The headset will come equipped with Fresnel lenses, which are known for being lightweight while offering a wide field of view. This design helps make the Quest 3S comfortable to wear, even during extended sessions. Additionally, the headset will feature a three-position inter-pupillary distance (IPD) adjustment, so users can adjust the lens spacing to get the sharpest possible view based on their eye spacing. These kinds of thoughtful features show that Meta is keeping the user experience front and center, even with a more budget-friendly model.

The design of the Quest 3S has also been a topic of conversation, particularly due to its unique triangular camera clusters that have surfaced in leaked images. These clusters are expected to house two 4 MP RGB passthrough cameras, four infrared (IR) tracking cameras, and two IR illuminators for depth sensing. This array of sensors is designed to ensure that the headset can accurately track movements and provide a realistic sense of depth, essential for an immersive experience. There’s also an action button, which is rumored to be customizable, allowing users to tweak the functionality to suit their preferences.

Meta’s decision to maintain the Quest Touch Plus controllers in the 3S suggests a commitment to a consistent user experience across its XR ecosystem. These controllers have been praised for their ergonomic design and precision, making them a valuable asset for both VR newcomers and veterans. The use of these familiar controllers will also likely reduce production costs, allowing Meta to pass savings on to consumers.

As for pricing, although nothing has been officially confirmed, it’s expected that the Quest 3S will come in at under $300. This makes it a highly competitive option in the XR market, especially as other companies like ByteDance prepare to launch their own budget-friendly headsets. With the XR space getting more crowded, Meta’s move to introduce a more affordable yet capable device could be a game-changer, opening up mixed reality to a much wider audience. The Quest 3S seems poised to offer a well-rounded experience without breaking the bank, making it a promising choice for those looking to dip their toes into the world of VR and AR.

The post Meta’s new ‘Affordable’ Quest 3s Headset leaks online, hinting at strong Spatial rivalry with Apple first appeared on Yanko Design.

Logitech MX Ink stylus for Meta Quest gives creators a new tool for mixed reality

Mixed reality platforms, or spatial computing as Apple calls it, try to seamlessly blend digital objects into the real world, but that illusion quickly breaks down when it comes to manipulating those virtual pieces directly. Yes, tapping on buttons in thin air or pinching the corner of floating windows might feel a little natural, but creating content, especially 2D and 3D objects, is less believable when all you have are two “wands” in each hand. For decades, the stylus has been the tool of choice of digital artists and designers because of its precision and familiarity, almost like holding a pencil or paintbrush. It was really only a matter of time before the same device came to mixed reality, which is exactly what the Logitech MX Ink tries to bring to the virtual table.

Designer: Logitech

The Logitech MX Ink is practically a stylus designed to work in virtual 3D space, but while that description is simplistic, its implications are rather world-changing. It means that creators no longer need to feel awkward about waving around a thick wand, making them feel like they’re playing games more than painting or modeling. Artists, designers, and sculptors can now use a more convenient and intuitive tool when moving around in mixed reality, bolstering not only their productivity but also the quality of their work. Admittedly, the MX Ink is bulkier and heavier than most styluses, closer to a 3D printing pen than an Apple Pencil, and drawing on air is still going to feel unnatural at first, but it’s significantly better than even drawing with your finger.

What makes Logitech’s implementation a bit more special is that it works in both 3D and 2D spaces. The latter means that you can still draw on a flat surface and feel the same haptics and pressure sensitivity as a Wacom stylus, for example. This means you can easily trace over a sketch or blueprint on paper and bring that up to a 3D space for fleshing out. Or you can paint artistic masterpieces on a physical canvas without actually leaving any mark on the paper.
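
Logitech hasn’t publicly documented the MX Ink’s data model, so here is a sketch of the general technique a 3D drawing app would use: turn pressure-tagged stylus samples into a variable-width stroke. Everything below, from the sample structure to the width range, is a hypothetical illustration in Python, not Logitech’s API.

```python
from dataclasses import dataclass

@dataclass
class StylusSample:
    x: float          # stylus tip position in meters
    y: float
    z: float
    pressure: float   # normalized: 0.0 = hovering, 1.0 = fully pressed

def stroke_width(pressure: float, min_w: float = 0.001, max_w: float = 0.010) -> float:
    """Map normalized pen pressure to a brush width in meters (simple linear ramp)."""
    p = min(max(pressure, 0.0), 1.0)
    return min_w + p * (max_w - min_w)

def build_stroke(samples: list[StylusSample], press_threshold: float = 0.05):
    """Drop hover samples and pair each pressed point with a width, yielding a
    variable-width 3D polyline that a renderer could extrude into a ribbon or tube."""
    return [((s.x, s.y, s.z), stroke_width(s.pressure))
            for s in samples if s.pressure >= press_threshold]

# Demo: a short press-and-lift gesture moving along the x axis.
gesture = [StylusSample(i * 0.01, 0.0, 0.0, p)
           for i, p in enumerate([0.0, 0.2, 0.8, 0.6, 0.0])]
for point, width in build_stroke(gesture):
    print(point, round(width, 4))
```

The same stroke logic works whether the tip positions come from drawing on the MX Mat in 2D or from waving the stylus through open air, which is exactly why a pressure-sensitive stylus translates so naturally between the two modes.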

The MX Ink is a standalone product, but Logitech is also offering optional accessories to further reduce the friction of working in mixed reality. The MX Mat offers a low-friction surface for drawing with the stylus in 2D, though the MX Ink can actually work on most flat surfaces anyway. The MX Inkwell is a stand and wireless charging station for the device, letting you simply lift it from the dock to start drawing and then put it back without having to worry it won’t be charged and ready for your next work session. Without the MX Inkwell, the stylus will have to charge via a USB-C connection, and Logitech doesn’t even ship a cable with it.

As promising as this new creativity tool might sound, its use is limited to the Meta Quest 2 and Quest 3 headsets, ironically leaving the Quest Pro out of the party. Logitech boasts that this is the first time Quest headsets support more than two paired controllers at the same time, which means you can connect the MX Ink and simply switch between it and the regular Quest controllers without having to reconfigure anything every time. The Logitech MX Ink goes on sale in September with a starting price of $129.99.

The post Logitech MX Ink stylus for Meta Quest gives creators a new tool for mixed reality first appeared on Yanko Design.

Meta Quest 3S images leak online, hinting at an even more affordable VR headset

The Meta Quest 3 was supposed to be the cheaper alternative to the Meta Quest Pro… but now leaked photos from an internal presentation show a new device called the Meta Quest 3S, a ‘lite’ version of the already wildly popular VR headset. Shared by Reddit user u/LuffySanKira, screenshots supposedly from a Meta user research session offer a glimpse of the potential Quest 3S. The images showcase the rumored headset alongside the standard Quest 3, revealing some key specifications.

Designer: Meta

The Quest 3S is expected to be a more affordable version of its pricier counterpart. According to the leaks, it will feature a display resolution of 1920 x 1832 with 20 pixels per degree (PPD). This falls short of the Quest 3’s 2208 x 2064 resolution and 25.5 PPD. Storage capacity is also speculated to be lower at 256GB compared to the Quest 3’s 512GB.
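
For context on what that PPD figure means: pixels per degree is roughly the per-eye horizontal pixel count spread across the horizontal field of view. Taking 1832 as the horizontal count (the same per-eye width as the Quest 2’s panels) and assuming a roughly 96-degree horizontal FOV, which is our assumption rather than a leaked spec, the numbers line up, as the back-of-envelope Python check below shows.

```python
# Back-of-envelope sanity check on the leaked PPD figure.
# PPD ~= horizontal pixels / horizontal field of view, assuming pixels are spread
# evenly across the view (real lenses concentrate detail at the center, so the
# number a vendor quotes usually runs a bit higher than this crude estimate).
h_pixels = 1832      # per-eye horizontal pixel count from the leak
h_fov_deg = 96       # ASSUMED horizontal FOV; the leak does not state the optics

print(round(h_pixels / h_fov_deg, 1))   # -> 19.1, in the ballpark of the leaked 20 PPD
```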

The leaked images provide a visual comparison as well. The Quest 3S appears slightly smaller overall, with the most noticeable difference being the front sensors. The Quest 3 has three oval cutouts, while the Quest 3S sports a configuration of six stacked cutouts, three on either side. These leaks are yet to be confirmed by Meta. However, they offer an exciting possibility for VR fans seeking a more accessible entry point into the world of virtual reality.

The post Meta Quest 3S images leak online, hinting at an even more affordable VR headset first appeared on Yanko Design.