Leion Hey2 Brings First AR Glasses Built for Translation to CES 2026

Cross-language conversations create a familiar kind of friction. You hold a phone over menus, miss half a sentence while an app catches up, or watch a partner speak fast in a meeting while your translation lags behind. Even people who travel or work globally still juggle apps, hand-held translators, and guesswork just to keep up with what is being said in the room, which pulls attention away from the actual conversation.

Leion Hey2 is translation that lives where your eyes already are, in a pair of glasses that quietly turns speech into subtitles without asking you to look down or pass a device back and forth. The glasses were built for translation first, not as an afterthought on top of entertainment or social features, and they are meant to last through full days of meetings or classes instead of dying halfway through, when you need them most.

Designer: LLVision

Glasses That Care About Conversation, Not Spectacle

Leion Hey2 is a pair of professional AR translation glasses from LLVision, a company that has spent more than a decade deploying AR and AI in industrial and public-sector settings. Hey2 is not trying to be an all-in-one headset; it is engineered from the ground up for real-time translation and captioning, supporting more than 100 languages and dialects with bidirectional translation and latency under 500 ms in typical conditions, plus 6–8 hours of continuous translation on a single charge.

Hey2 is designed to wear like everyday eyewear rather than a gadget. The classic browline frame, 49g weight, magnesium-lithium alloy structure, and adjustable titanium nose pads are all chosen to make it feel like a normal pair of glasses you forget you are wearing. A stepless spring hinge adapts to different faces, and the camera-free, microphone-only design follows GDPR-aligned privacy principles, backed by a secure cloud infrastructure built on Microsoft Azure, helping keep both wearers and bystanders more comfortable in sensitive environments.

Subtitles in Your Line of Sight

Hey2 uses waveguide optics and a micro-LED engine to project crisp, green subtitles into both eyes, with a 25-degree field of view and more than 90% passthrough so the real world stays bright. The optical engine is tuned to reduce rainbow artifacts by up to 98%, keeping text stable and readable in different lighting conditions, while three levels of subtitle size and position let you decide how prominently captions sit in your forward field of view.

The audio side relies on a four-microphone array that performs 360-degree spatial detection to identify who is speaking, while face-to-face directional pickup prioritizes the person within roughly a 60-degree cone in front of you. A neural noise-reduction algorithm uses beamforming and multi-channel processing to isolate the main voice, which helps in noisy restaurants, busy trade-show floors, or classrooms where questions come from different directions, without forcing you to constantly adjust settings.
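
To make the beamforming idea concrete, here is a minimal delay-and-sum sketch in Python: the signal from each microphone is time-aligned toward a chosen direction and the channels are averaged, so speech from that direction reinforces while off-axis noise partially cancels. This is a generic textbook illustration, not LLVision's actual algorithm; the array geometry and sample rate below are assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second

def delay_and_sum(signals, mic_positions, direction, sample_rate=16_000):
    """Steer a microphone array toward `direction` (a unit vector) by
    delaying each channel so the target wavefront lines up, then averaging.

    signals:        (n_mics, n_samples) array of synchronized recordings
    mic_positions:  (n_mics, 3) microphone coordinates in metres
    direction:      (3,) vector pointing from the array toward the talker
    """
    direction = np.asarray(direction, dtype=float)
    direction /= np.linalg.norm(direction)

    # Arrival-time difference of the target wavefront at each microphone.
    delays = mic_positions @ direction / SPEED_OF_SOUND            # seconds
    shifts = np.round((delays - delays.min()) * sample_rate).astype(int)

    n_mics, n_samples = signals.shape
    out = np.zeros(n_samples)
    for channel, shift in zip(signals, shifts):
        # Advance each channel so speech from the target direction aligns.
        out[: n_samples - shift] += channel[shift:]
    return out / n_mics

# Example: four mics a few centimetres apart, listening straight ahead.
rng = np.random.default_rng(0)
mics = np.array([[-0.03, 0, 0], [-0.01, 0, 0], [0.01, 0, 0], [0.03, 0, 0]])
noisy = rng.standard_normal((4, 16_000))   # stand-in for real recordings
enhanced = delay_and_sum(noisy, mics, direction=[0.0, 1.0, 0.0])
```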

Modes That Support Work, Learning, and Accessibility

In translation and Free Talk modes, foreign speech is converted into your language as subtitles in your line of sight, so you can mix languages freely and still follow long-form speech without constantly checking a screen. In Free Talk, Hey2 provides subtitles for what you hear and spoken translation for what you say, turning a two-language conversation into something that feels more like a normal chat than a tech demo. The charging case extends total use to 96 hours across 12 recharges.
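
Under the hood, Free Talk amounts to two translation paths running in opposite directions at once. The sketch below is only a conceptual illustration of that routing; the `translate` stub, language codes, and action names are hypothetical placeholders, not Hey2's actual API.

```python
def translate(text: str, source: str, target: str) -> str:
    # Placeholder: a real system would call a machine-translation service here.
    return f"[{source}->{target}] {text}"

def free_talk_route(text: str, from_wearer: bool,
                    wearer_lang: str = "en", partner_lang: str = "ja") -> dict:
    """Route one utterance the way a Free Talk-style mode might:
    spoken audio for what the wearer says, subtitles for what they hear."""
    if from_wearer:
        # The wearer spoke: translate into the partner's language and voice it.
        return {"action": "speak_aloud",
                "text": translate(text, wearer_lang, partner_lang)}
    # The partner spoke: translate into the wearer's language and show subtitles.
    return {"action": "show_subtitles",
            "text": translate(text, partner_lang, wearer_lang)}

print(free_talk_route("Where is the station?", from_wearer=True))
print(free_talk_route("駅はあちらです", from_wearer=False))
```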

Teleprompter mode scrolls your script in your line of sight and advances it automatically as you speak, useful for lectures, pitches, or keynotes where you want to keep eye contact without glancing at notes. AI Q&A, triggered by a temple tap, taps into ChatGPT-powered answers for discreet look-ups, while Captions mode turns fast speech into clean text, helping students, professionals, and Deaf or hard-of-hearing users stay on top of what is being said, even in noisy environments where handheld devices struggle.
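
For a sense of how voice-paced scrolling can work in principle (Hey2's actual logic isn't public), a teleprompter can keep a cursor into the script and advance it whenever newly recognized words match the next few words of the text, as in this hypothetical sketch:

```python
def advance_cursor(script_words, cursor, recognized_words, lookahead=4):
    """Move the teleprompter cursor forward when recognized speech matches
    upcoming script words, tolerating small slips via a short lookahead."""
    for word in recognized_words:
        window = script_words[cursor:cursor + lookahead]
        matches = [i for i, w in enumerate(window) if w.lower() == word.lower()]
        if matches:
            cursor += matches[0] + 1   # jump past the matched word
    return cursor

script = "thank you all for joining today we are announcing".split()
cursor = advance_cursor(script, 0, ["thank", "you", "for"])
print(script[cursor:])   # words still to be spoken -> scrolls the display here
```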

A Different Kind of AR Story

When Leion Hey2 steps onto the CES 2026 stage, it represents a quieter kind of AR story. Instead of chasing spectacle, it narrows the brief to something very human, helping people speak, listen, and be understood across languages and hearing abilities. For a show that often celebrates what technology can do, Hey2 is a reminder that sometimes the most interesting innovation is the one that simply lets you keep your head up and stay in the conversation.

Even Realities G2 Just Solved the Biggest Problem With Smart Glasses… Using A Ring

Even Realities launched their first smart glasses last year with a pitch that felt almost countercultural: what if your eyewear didn’t record everything around you, didn’t pipe audio into your ears, and didn’t make everyone nearby wonder if you were filming them? Instead of packing their frames with cameras and speakers, they focused on a single function: a clean, effective heads-up display. The G1 glasses were a minimalist take on wearables, offering monochrome green text in your line of sight for notifications and AI assistance, all without the privacy concerns of outward-facing cameras. This focused approach found its niche, landing the G1 in 350 luxury eyewear shops globally and proving there’s a real appetite for smart glasses that prioritize subtlety and practical assistance.

The G2 glasses themselves improve on last year’s G1 in predictable but welcome ways. Bigger display, better optics, lighter frame, longer battery life. They still avoid cameras and speakers entirely, sticking with Even’s “Quiet Tech” philosophy of providing information without creating privacy concerns. But pair them with the new R1 ring and you get something more interesting than incremental hardware improvements. The ring lets you control the glasses with thumb gestures against your index finger, turning navigation into something closer to using a trackpad than fumbling with voice commands or head taps. Whether that’s actually more natural in practice than the alternatives depends partly on how well the gesture recognition works and partly on whether you’re the kind of person who wants to wear a ring in the first place.

Designer: Even Realities

The display improvements are significant enough to matter in daily use. Even calls their new system HAO 2.0, which stands for Holistic Adaptive Optics, and the practical result is that information appears in layers rather than as flat text plastered across your vision. Quick notifications and AI prompts sit closer in your field of view, while longer content like navigation directions or notes recedes slightly into the background. It's still monochrome green, the same matrix-style aesthetic from the G1, but sharper and easier to read in motion or bright light. The frame itself weighs just 36 grams and carries an IP67 rating for water and dust resistance, so you can wear them in the rain without worrying about killing a $599 investment. Battery life stretches past two days now, and the prescription range goes from -12 to +12 diopters, covering most people who need corrective lenses.

What made the G1 frustrating for some users was the interaction model. You could talk to the glasses, but that meant either looking weird in public or finding a quiet spot. You could tap the touch-sensitive nubs on the temples, but they were finicky and required you to constantly reach up to your face. While the G2 improves the reliability of those touchpads significantly, Even Realities’ R1 smart ring practically revolutionizes how you interact with the smart display. Worn on your index finger, the ring lets you swipe up and down with your thumb or tap to select options, essentially turning your hand into a trackpad for your face. The ring is made from zirconia ceramic and stainless steel, costs $249 separately, and connects to the glasses through what Even calls their TriSync ecosystem, linking the glasses, ring, and phone into one synchronized unit.
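
As a rough sketch of what trackpad-style ring input boils down to in software, gesture events can simply be mapped onto cursor moves and selections over whatever menu is on the display. The mapping below is hypothetical and has nothing to do with Even Realities' TriSync protocol, which isn't public:

```python
def handle_gesture(gesture: str, index: int, items: list[str]):
    """Map ring gestures onto a simple list menu.
    Returns (new_index, selected_item_or_None)."""
    if gesture == "swipe_up":
        return max(index - 1, 0), None
    if gesture == "swipe_down":
        return min(index + 1, len(items) - 1), None
    if gesture == "tap":
        return index, items[index]
    return index, None          # unknown gestures are ignored

menu = ["Notifications", "Navigation", "Teleprompter", "Translate"]
index, selected = 0, None
for g in ["swipe_down", "swipe_down", "tap"]:
    index, selected = handle_gesture(g, index, menu)
print(selected)   # -> "Teleprompter"
```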

The gesture controls take some getting used to, based on early reviews. Accidental swipes are common at first, and the learning curve means you might fumble through menus for the first few days. But when it works smoothly, navigating with the ring is more subtle than any of the alternatives. You can check a notification, dismiss it, and move on without anyone noticing you’ve interacted with your glasses at all. That subtlety matters more than it sounds like it would, especially if you’re using features like the built-in teleprompter for presentations or the real-time translation during conversations. The glasses still support the old interaction methods too, so you’re not locked into one way of controlling them.

The AI side of things has been upgraded as well, with Even introducing what they call the Conversate assistant. It handles the usual smart glasses tasks like showing notifications, reading messages, and providing contextual information, but it’s designed to be less intrusive about it. You talk to it and get text responses on the display rather than audio, which keeps conversations private and avoids the awkwardness of having your glasses talk back to you in a quiet room. The system pulls from your phone’s connectivity, so there’s no separate data plan or complex setup required. The AI integration feels thoughtful rather than forced, providing information when you need it without constantly demanding attention.

One detail worth noting: the R1 ring is not compatible with the original G1 glasses. If you bought the first generation and want the ring’s functionality, you’ll need to upgrade to the G2 entirely. Even is offering a launch promotion where buying the G2 gets you the ring and other accessories at 50 percent off, which brings the combined price to $724 instead of $848. For context, Meta’s Ray-Ban smart glasses with their Neural Band controller and full-color display cost $799, though those come with cameras and all the privacy considerations that entails. The G2 and R1 combo sits in an interesting middle ground, offering more focused functionality at a similar price point.

The combination of display-only glasses and a gesture-controlled ring represents a particular vision of what smart eyewear could be. It’s not trying to replace your phone or capture every moment of your life. Instead, it extends your phone’s functionality into your field of view while giving you a discreet way to interact with that information. For people who give frequent presentations, the teleprompter feature alone could justify the cost. For travelers, having real-time translation floating in your vision during conversations is genuinely useful. And for anyone tired of constantly pulling out their phone to check notifications, the G2 offers a less disruptive alternative. Even Realities is refining an approach that feels increasingly relevant as smart glasses move from novelty to practical tool, and the G2 with R1 suggests they’re learning the right lessons from their first attempt.

Apple is allegedly working on an Affordable, Consumer-grade Spatial Headset

Apple showed us what a mixed reality headset could be capable of with the debut of the Vision Pro at WWDC in 2023. It had all the bells and whistles required of an AR and VR headset from Apple, but didn't find many takers. Perhaps it was the steep price tag, or maybe no one was ready for a headset that dropped them into spatial computing just yet.

For me, it was the price, the bulkiness, and the small market size for a standalone device in the smart glasses category. Apple soon realized this after sales fell significantly short of projections, which is why rumors of the company mulling a more affordable, non-Pro mixed reality headset started doing the rounds.

Designer: Apple

Such a device would be made possible by trimming down the features and functionality of the Vision Pro, but the Cupertino company has thought otherwise (at least for now). According to new reports from Bloomberg's Mark Gurman, Apple is instead planning a pair of smart glasses targeted at the masses – like Meta's Ray-Bans – that would fit better into the Apple ecosystem than the Vision Pro or its stripped-down sibling.

The latest information suggests that the budget-friendly Vision model may have been postponed until after 2027, while an internal study codenamed project "Atlas" is running within Apple to gauge where the company's own employees stand on smart glasses. Based on that internal feedback, Apple is reportedly considering smart glasses aimed at roughly the same consumer segment that Meta's Orion augmented reality glasses are intended for.

The Orion glasses are still a prototype themselves, so it wouldn't be fair to compare two devices that don't yet exist on the same footing. But the basic idea is that Apple could have a pair of smart glasses that look like regular glasses, combining a slick design with useful features, while a connected iPhone does most of the computing.

At the time of writing, it is not known whether Apple has started building such a product. Still, we learn that feasibility studies are happening within the company to deliver eyewear that addresses the issues of convenience, weight, and battery life. Irrespective of what direction Apple takes with the idea of smart glasses, it will likely take a few years for anything to reach the market. If you're in a hurry, get your hands on the Meta options!

Should you upgrade to the new Ray-Ban Meta Wayfarer Limited-Edition Transparent Model?

Ray-Ban’s Meta Wayfarer glasses have quickly become the intersection of fashion and technology, combining classic style with advanced smart features. Recently, Ray-Ban and Meta unveiled the new Shiny Transparent Wayfarer, featuring exposed internal components and Clear to Sapphire Transitions lenses. While this new model pushes the boundaries of what smart glasses can look like, the big question is: should you upgrade, especially if you already own a pair? Let’s break it down.

Designer: Ray-Ban + Meta

If Money Is No Object, Then Yes—Go for It

If price isn’t a barrier, the decision to upgrade is straightforward. At $429 USD, the Shiny Transparent Wayfarer offers a visually striking design that showcases the internal technology, creating a futuristic look that stands apart from the Matte Black version. The Clear to Sapphire Transitions lenses add another layer of sophistication, adapting to light conditions and giving the glasses a sleek sapphire tint when outdoors. This is an easy yes for those who enjoy staying at the forefront of wearable tech.

If You Want the New Lens Transition, It’s Worth Considering

If your current Ray-Ban Meta Wayfarer comes with standard clear lenses or basic non-adaptive sunglasses, upgrading to the new Transitions lenses could make a big difference in how you use the glasses day-to-day. The Clear to Sapphire Transitions lenses offer a smooth transition between indoor and outdoor settings, making it easier to adapt to different lighting conditions without needing to switch eyewear. When you’re indoors, the lenses remain clear, providing a natural and unobstructed view. However, once you step outside, they automatically darken to a sleek sapphire tint, adding a touch of style and protecting your eyes from harsh sunlight. For anyone who finds themselves frequently moving between environments, this flexibility could be a major convenience.

On the other hand, if you already own a pair with Clear to Green Transitions lenses, the upgrade may not offer enough of a difference to justify the change. Both lenses provide the same adaptive functionality, adjusting to light to enhance your vision while adding a color tint. The real difference lies in the aesthetic—whether you prefer the cooler sapphire tint or the more classic green hue. If you’re satisfied with the current performance and look of your lenses, there may be little reason to make the leap unless the sapphire color truly appeals to you.

If You Want a New Design with Exposed Tech, Then Yes

The most noticeable difference in the new model is the Shiny Transparent frame. This design exposes the inner workings of the glasses, giving them a high-tech look that contrasts with the more traditional Matte Black frame. The transparent frame brings an aesthetic shift, showcasing the cutting-edge technology that powers the glasses in a more visually pronounced way. It’s an intriguing design choice for those who appreciate a bold, futuristic look.

If you’re drawn to a more tech-forward, modern aesthetic, this new design is worth considering. The transparent frame is eye-catching and adds a fresh dimension to the Ray-Ban Meta Wayfarer collection. For those who want their eyewear to make a visual statement, the exposed components are a step forward in wearable tech design. However, if you prefer the more classic and understated look of the Matte Black Wayfarer, you might find that the new frame doesn’t offer enough reason to make the switch.

For Me, It’s a Hard No

For anyone who already owns the Matte Black Wayfarer with Clear to Green Transitions lenses, upgrading to the new Shiny Transparent model may not be necessary. Your current pair offers the same core features—AI-powered assistance, a 12MP camera, open-ear speakers, and a touchpad for easy control. The Clear to Green Transitions lenses provide excellent functionality, and if you’re happy with the design and tech you already have, there’s no pressing need to make the switch.

The Introduction of AI-Powered Features

With the recent updates, Ray-Ban and Meta have significantly improved the AI capabilities of the glasses. Now, you can use voice commands by simply saying “Hey Meta” and follow up with additional commands without repeating the wake word. The glasses can also remember important details like where you parked your car or set reminders for when you land after a flight. The ability to send voice messages via WhatsApp or Messenger while your hands are occupied adds an extra layer of convenience for staying connected on the go.
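
The "no need to repeat the wake word" behavior is easiest to picture as a short follow-up window: once "Hey Meta" opens a session, further commands are accepted until some idle timeout expires. The sketch below is a generic illustration of that pattern, not Meta's implementation, and the 8-second timeout is an assumption:

```python
import time

class WakeWordSession:
    """Accept follow-up commands for a while after the wake word is heard."""
    def __init__(self, timeout_s: float = 8.0):
        self.timeout_s = timeout_s
        self.last_activity = None

    def hear(self, phrase: str, now: float) -> bool:
        """Return True if the phrase should be treated as a command."""
        if phrase.lower().startswith("hey meta"):
            self.last_activity = now
            return True
        if self.last_activity is not None and now - self.last_activity < self.timeout_s:
            self.last_activity = now     # follow-ups keep the session alive
            return True
        return False

session = WakeWordSession()
t0 = time.time()
print(session.hear("hey meta, where did I park?", t0))   # True (wake word)
print(session.hear("remind me when I land", t0 + 3))     # True (follow-up)
print(session.hear("what's the weather", t0 + 30))       # False (session expired)
```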

One of the more impressive AI features is real-time video assistance. Whether you’re exploring a new city or browsing the aisles of a grocery store, Meta AI can offer real-time help by identifying landmarks or suggesting meals based on the ingredients you’re looking at. Additionally, real-time language translation for Spanish, French, and Italian can remove language barriers, and future updates will likely support more languages.

Expanding Partnerships with Major Platforms

The glasses already supported integrations with platforms like Spotify and Amazon Music, and Ray-Ban has now expanded these offerings to include Audible and iHeart as well. You can use voice commands to search and play music or audiobooks without touching your phone. This makes the listening experience even more seamless, allowing you to ask questions like “What album is this from?” while on the move. These expanded partnerships deepen the glasses’ role in day-to-day media consumption.

The collaboration with Be My Eyes is another significant step in making the glasses more accessible. This app, designed for individuals who are blind or have low vision, pairs users with sighted volunteers who provide real-time assistance. The glasses’ camera allows the volunteer to see what the wearer sees, enabling them to help with tasks like reading mail or navigating new environments.

Are You Going for It?

Ultimately, the decision to upgrade comes down to personal preference and how much you value the new design and lens options. If money isn’t an issue or you’re drawn to the transparent frame and sapphire lenses, the upgrade makes sense. However, if you’re content with your current Matte Black Wayfarer with Clear to Green Transitions lenses, there’s no pressing reason to switch. The new features and design are exciting, but your existing pair still holds up as a stylish, highly functional piece of wearable tech.

Spatial video camera concept lets you capture photos and videos hands-free

The way we capture and view videos has been constantly changing, and mobile devices have been evolving along with it. With the introduction of spatial videos, we see brands like Apple trying to pioneer this new format. Basically, a spatial video lets you experience a moment as if you were there, instead of looking at it from a single fixed view. It’s a more immersive form of 3D, since you’re able to turn your head and see different perspectives.

Designer: Suosi Design

As more people get into spatial videos, we’ll see all kinds of tools that will be able to capture and view videos like these. One concept tool is called VISOO, a spatial video camera that you can use to take videos that can later be viewed using devices like Apple’s iPhone 15 Pro. It is not a bulky camera at all; ease of carrying will probably be one of the main considerations when developing tools for this kind of video.

Based on the product renders, VISOO is a pretty light device that you can either carry around or attach to the accompanying glasses. For the handheld option, the cameras are stored in the battery case as you shoot your photos and videos. There also appears to be a tripod you can attach the case to for times when you need it placed on something a little more stable. For the glasses option, the cameras attach to the hinges so you can move around easily while capturing moments.

Since this is still in the conceptual stage, there’s no information about the quality of photos and videos it will capture. And with spatial video still in its early days, there’s a lot left to discover and explore for brands developing tools to capture and view it.

Solos AirGo Vision are the world’s first ChatGPT-4o powered smart glasses for interactive input

Smart glasses are the most practical wearables when it comes to making a daily style statement. Compared to current-generation AR headsets, which are bulky, smart glasses are far more feasible; the only limit is how much tech can be fitted inside the compact frames while keeping the overall weight down and the form factor indistinguishable from a regular pair.

The idea has gone mainstream in the last year or so, with Ray-Ban Meta and OPPO jumping on the bandwagon. The latter brought the AI-capable Air Glass 3 to the industry in the first half of this year, and Solos has now upped the ante, revealing a smarter pair of wearables that are the world’s first to carry OpenAI’s latest generative AI.

Designer: Solos

This is the AirGo Vision, the next iteration of the AirGo 3 glasses. Powered by OpenAI’s latest ChatGPT-4o, the wearable is also compatible with Google Gemini and Anthropic Claude models. You can expect hands-free operation, aided by voice commands, to answer queries or provide real-time input for things like reading foreign-language hoardings or getting help with city directions. That assistance comes courtesy of the latest version of GPT-4o and generative AI applications, a first for any smart glasses. In essence, the glasses will be a visual extension of search information and bring interactive features that make them genuinely desirable.

The wearer can capture pictures hands-free and instantly ask about what they’re seeing. Solos is also giving the option of swapping the main frame with a secondary one that doesn’t have a front camera. Yes, unlike other smart glasses, this one has the camera lens embedded in the arms instead of the frame. The swap comes in handy in places where a more formal style is expected. To keep the wearer aware of important notifications or calls, there are flashing LED indicators on the frame.
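
To make the capture-and-ask workflow concrete, here is a minimal sketch of how a companion app could send a saved frame plus a question to GPT-4o using OpenAI’s Python SDK. It is a generic illustration with assumed file names and prompts, not Solos’ actual integration:

```python
import base64
from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask_about_photo(image_path: str, question: str) -> str:
    """Send a captured frame plus a question to GPT-4o and return the answer."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content

# Hypothetical usage: a frame captured by the glasses, saved by a companion app.
# print(ask_about_photo("capture.jpg", "What does this sign say in English?"))
```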

Solos hasn’t yet revealed details about the specific release date or the price, but we can safely assume they’ll cost more than $250.
