Lumus brought a massively wider FOV to smartglasses at CES 2026

Lumus got a major boost in brand recognition when one of its waveguides was selected for use in the Meta Ray-Ban Display glasses. But that already feels like old tech, because at CES 2026 the company brought some of its latest components to the show, and based on what I saw, they seem poised to seriously elevate the optical quality of the next wave of high-end smartglasses.

When the Meta Ray-Ban Display glasses came out, they wowed users as they were (and still are) one of a handful of smartglasses to feature a full-color in-lens display with at least a 20-degree field of view. But going by the specs on Lumus’ newest waveguides, we’re set for a major upgrade in future capabilities.

If you look closely, you can see where light from the waveguide propagates into one of the smartglasses' lenses.
Sam Rutherford for Engadget

The first model I tried featured Lumus’ optimized Z-30 waveguides, which not only offer a much wider 30-degree FOV, but are also 30 percent lighter and 40 percent thinner than previous generations. On top of that, Lumus says they are more power efficient, with the waveguides capable of hitting more than 8,000 nits per watt. This is a big deal because smartglasses are currently quite limited by the size of the batteries they can use, especially if you want to make them small and light enough to wear all day. When I tried them on, I was dazzled by both the brightness and sharpness of the Z-30s, despite them being limited to 720 x 720 resolution. Not only did the increase in FOV feel much larger than 10 degrees, but colors were also very rich, including white, which is often one of the most difficult shades to reproduce properly.

I had to take a photo of one of Lumus' non-functioning smartglasses with the company's 70-degree FOV waveguide, because two out of three of the working ones had already broken and the last one I used was being held together by tape.
Sam Rutherford for Engadget

However, even after seeing how good that first model was, I was totally unprepared for Lumus’ 70-degree FOV waveguides. I was able to view some videos and a handful of test images, and I was completely blown away by how much area they covered. It was basically the entire center portion of the lens, with only small unused areas around the corners. And while I did notice some pincushion distortion along the sides of the waveguide’s display, a Lumus representative told me that it will be possible to correct for that in final retail units. But make no mistake, these waveguides undoubtedly produced some of the sharpest, brightest and best-looking optics I’ve seen from any smartglasses, retail models and prototypes alike. It almost made me question how much wider a FOV these types of gadgets really need, though to be clear, I don’t think we’ve hit the point of diminishing returns yet.

This is one of Lumus' thinnest waveguides measuring in at just 0.8mm.
Sam Rutherford for Engadget

Other advantages of Lumus’ geometric reflective waveguides include better overall efficiency than their refractive counterparts, along with the ability to optically bond the displays to smartglasses lenses. That means, unlike a lot of rivals, Lumus’ waveguides can be paired with transition lenses instead of resorting to clip-on sunglass attachments when you go outside. Lumus also claims its design simplifies the manufacturing process, resulting in thinner waveguides (as thin as 0.8mm) and generally higher yields.

Unfortunately, taking high-quality photos of content from smartglasses displays is incredibly challenging, especially when you’re using extremely delicate prototypes, so you’ll just have to take my word for it for now. But with Lumus in the process of ramping up production of its new waveguides with help from partners including Quanta and SCHOTT, it feels like there will be a ton of smartglasses makers clamoring for these components as momentum continues to build around the industry’s pick for the next “big” thing.


This article originally appeared on Engadget at https://www.engadget.com/wearables/lumus-brought-a-massively-wider-fov-to-smartglasses-at-ces-2026-233245949.html?src=rss

An’An at CES 2026: Biomimetic Wool Panda That Responds to Your Hugs

Loneliness quietly settles into homes where older adults live alone, where families are spread across cities, and where evenings can stretch out with no one to talk to. Technology has tried to fill that gap with video calls and smart speakers, but those tools are still built around tasks and commands, around asking for something rather than simply being there when someone needs company or a gentle reminder that they are not forgotten.

An’An is a robot less interested in showing off and more interested in listening, remembering, and responding gently over months and years. It is a panda-shaped companion designed from the fur inward to offer long-term, stigma-free emotional support for people who might never ask for help directly, treating emotional care as something that can happen quietly through touch, voice, and the kind of daily rituals that build trust without demanding much in return.

Designer: Mind With Heart Robotics

A Panda Built for Feelings, Not Tricks

An’An is a biomimetic panda cub companion built around the simple idea that people relax more easily around animals than around machines. Its body is handcrafted from Australian wool and sheepskin for natural tactile comfort, inviting stroking, hugging, and lap-holding in a way that cold plastic or silicone never could. The form factor is intentionally soft and low-key, closer to a plush toy than a science fiction robot.

An’An is not designed to juggle, dance, or navigate obstacle courses. Its job is to be present, to respond to touch with gentle, lifelike behavior, and to make it feel safe to express emotion without judgment. The panda shape, the weight, and the way it settles into a lap are all tuned to trigger nurturing instincts, especially for older adults who may miss the feeling of holding a pet or a grandchild who has moved to another state.

This focus on emotional comfort extends to how An’An fits into a home. It can rest on a sofa, bed, or desk without looking like medical equipment, which matters when someone is already sensitive about needing support. The goal is to make companionship feel natural and dignified, not clinical, so people will actually reach for the robot when they feel low rather than hiding it in a drawer or treating it like another gadget they were supposed to use but never really warmed to.

Emotional Intelligence Under the Fur

Under the fur, An’An is a dense network of sensors and affective AI. A full-body tactile sensing system with more than 10 sensor suites recognizes how and where it is being touched, distinguishing between a gentle stroke, a firm squeeze, or being picked up. That information feeds into an emotional AI engine that also listens to voice patterns and tracks interaction habits over time, building a model of who you are and how you prefer to communicate.

An’An’s long-term memory allows it to personalize responses as it learns. Over weeks and months, it can adapt to a user’s routines, noticing when someone tends to be quiet, when they like to talk, and what kinds of interactions seem to lift their mood. The hybrid offline-online architecture, with four to five hours of continuous battery life and USB-C charging, keeps core functions running even when connectivity is limited or when someone prefers not to share everything with the cloud.

This combination of multimodal sensing and memory means An’An can move beyond scripted novelty. Instead of repeating the same phrases, it can vary its behavior, initiate interaction during long periods of inactivity, and gradually build a relationship that feels more like a familiar presence than a toy. Preliminary studies suggest that this sustained, personalized engagement can measurably improve mood, which is the metric that matters most when the goal is helping someone feel less alone.

From Living Rooms to Care Facilities

In a private home, An’An can simply be a companion that is always available. It can offer gentle conversation, respond to touch, and provide a sense of being seen and heard without the stigma some people feel around therapy or medication. For older adults who may not want to bother their families with every worry, having something that listens without judgment can make a surprising difference to how a day feels, especially during the long stretches between calls or visits.

In eldercare settings, An’An takes on an additional role. A clinical version can capture objective interaction data, such as touch patterns, conversation cues, and changes in engagement, and surface those trends to authorized clinicians through secure dashboards. That gives caregivers another lens on cognitive and emotional status, helping them notice when someone is withdrawing, agitated, or unusually quiet without relying solely on brief check-ins or self-reported surveys that people might downplay.

Because An’An delivers clinical-grade capabilities at roughly one-fifth the cost of traditional therapeutic robots, it becomes more realistic for care homes and institutions to deploy multiple units rather than a single shared device. The unified affective AI platform, backed by more than 30 patent filings and 18 granted patents, is designed to scale across different environments while keeping the core promise the same: emotionally meaningful companionship over time.

A Different Kind of Robot

When it appears at CES 2026 as an Innovation Awards Honoree in the Artificial Intelligence category, An’An represents a quiet shift in what people expect from robots. Instead of another on-stage demonstration of agility or speed, it offers a case study in emotionally intelligent, human-centered design, showing how biomimetic form, tactile materials, and affective AI can come together to support people who need comfort more than spectacle, and companionship more than commands, at a scale and cost that makes it a viable part of everyday care rather than a research curiosity.

The post An’An at CES 2026: Biomimetic Wool Panda That Responds to Your Hugs first appeared on Yanko Design.

Nuon Medical: Why the Future of Skincare Isn’t Another Serum

The beauty industry has spent decades perfecting what goes inside the bottle. Formulas have become more sophisticated, actives more potent, ingredient lists more transparent. Yet the objects that deliver those formulas have stayed mostly the same. Glass jars, plastic tubes, pump bottles, they’re passive containers designed to hold product, not enhance it. Meanwhile, beauty gadgets promised professional results at home but ended up in drawers, forgotten.

The real opportunity isn’t another breakthrough ingredient or another device. It’s that split second where formula actually meets skin. That’s the insight behind Nuon Medical, a company founded by Alain Dijkstra with roots in medical devices rather than traditional cosmetics. While everyone else obsessed over formulas, Nuon started looking at the packaging itself. We interview Senior Consultant Benny Calderone to get deeper insights into the company’s origins and perspectives.

Designer: Nuon Medical

From Chemical to Physical Innovation

According to Nuon Medical, “For decades, the beauty industry focused on ‘Chemical Innovation’—the juice inside the bottle. Nuon is pioneering the era of ‘Physical Innovation.’ In short: we are solving the ‘Last Inch’ of skincare.” It’s a shift that treats packaging as a performance-critical interface, one that determines whether those actives reach their biological targets or just sit on your skin doing nothing.

Nuon’s journey started with standalone light therapy devices used for hair growth and wrinkle reduction. After nearly two decades making those tools, founder Alain Dijkstra noticed they had a retention problem. They added steps to already crowded routines and mostly ended up unused. For clinical tech to work at scale, it had to disappear into something people already do every day.

By embedding tech into packaging, Nuon eliminated the compliance gap. “We realized that for clinical tech to scale, it must be invisible. By turning the packaging itself into the ‘treatment engine,’ we eliminate the compliance gap.” You’re not being asked to do something new. You’re just upgrading what you already do. From a business angle, this transforms a throwaway bottle into something worth keeping.

Frictionless Intelligence at the Point of Contact

Nuon doesn’t start with sketches or aesthetics. They start with human-factor engineering, figuring out how an applicator should guide contact, path, and speed to deliver the formula correctly every time. The result is what they call frictionless intelligence. “We design for ‘frictionless intelligence.’ If a user has to read a manual, the design has failed,” the company states.

Intelligence gets built into the haptics, the ergonomics, the physical interaction logic itself. The tool quietly guides your motion without you realizing it. The applicator steers where you press, how fast you move, the path you follow, all without instructions. Light therapy, microcurrent, thermal elements, vibration, they’re woven into the interaction, supporting the formula instead of distracting from it.

There’s a tricky balance here. Clinical devices can feel intimidating. Beauty objects need to feel inviting, something you want to pick up every morning. Nuon often prioritizes consistency over intensity. “A tool used daily with proper motion and interaction is far more effective than a high-intensity device left in a drawer,” they note. A lower setting used correctly beats a powerful tool that stays in the drawer.

The Hidden Operating System of Beauty

Nuon isn’t a consumer brand. They’re a B2B partner working behind the scenes with global beauty companies. Their modular tech stack works like an operating system, offering a validated foundation that brands dress up with their own materials. Luxury labels use glass and heavy metals. Mass brands use lighter plastics. The intelligence underneath stays the same.

“Nuon is the ‘Innovation Engine’ behind the world’s leading brands. Our philosophy lives in the UX Framework, not the visual skin,” Calderone explains. Brands can apply their aesthetic identity without messing with the validated technology underneath. “We provide the ‘Intelligence’; they provide the ‘Identity,'” he adds. It’s systems thinking applied to beauty packaging.

Data, Sustainability, and the Death of the Dumb Bottle

Once packaging gets smarter, it starts collecting data. Nuon’s applicators can measure skin hydration, texture, UV exposure, and more. But the company is deliberate about how that information gets used. “Data should be a concierge, not a surveillance tool,” according to their philosophy. Diagnostics should inform care, not flood you with vanity metrics. Nuon provides privacy-by-design infrastructure where consumers stay in control.

Then there’s sustainability, where Nuon takes a blunt stance. “Sustainability only scales if it improves the user experience. ‘Green theater’ is asking consumers to settle for less; True sustainability is ‘Assetization,'” the company states. They design the high-tech applicator as something durable that you want to keep. The formula becomes a refill that plugs into that base, separating Durable Intelligence from the Circular Consumable.

It’s not sustainability through guilt. It’s sustainable because the design makes refills the logical choice. You’ve invested in the smart hardware, so of course you’re going to buy the refill. Nuon’s vision is bold. “We are witnessing the death of the ‘dumb bottle.’ In a decade, a passive plastic cap will feel as obsolete as a rotary phone.”

Packaging will become responsive tools that sense conditions and guide your hand in one motion. “Scalp and hair care is the next great ‘blue ocean.’ It’s a category where wellness meets clinical results, and users can immediately feel the benefits of microcirculation and stimulation,” Nuon notes. The broader idea is that everyday objects on bathroom shelves are about to become quietly intelligent without looking like sci-fi props.

Companies like Nuon are writing that next chapter from behind the scenes, proving that clinically meaningful technology doesn’t need to sacrifice what makes beauty objects appealing. It’s a shift from containers to interfaces, from passive to active. If Nuon’s right, we’ll look back at today’s plain bottles the way we look at rotary phones, functional once but hopelessly outdated now.

The post Nuon Medical: Why the Future of Skincare Isn’t Another Serum first appeared on Yanko Design.

Handwriting is my new favorite way to text with the Meta Ray-Ban Display glasses

When Meta first announced its display-enabled smart glasses last year, it teased a handwriting feature that allows users to send messages by tracing letters with their hands. Now, the company is starting to roll it out, with people enrolled in its early access program getting it first.

I got a chance to try the feature at CES and it made me want to start wearing my Meta Ray-Ban Display glasses more often. When I reviewed the glasses last year, I wrote about how one of my favorite things about the neural band is that it reduced my reliance on voice commands. I've always felt a bit self-conscious about speaking to my glasses in public.

Up to now, replying to messages on the display glasses has still generally required voice dictation or generic preset replies. But handwriting means that you can finally send custom messages and replies somewhat discreetly. 

Sitting at a table wearing the Meta Ray-Ban Display glasses and neural band, I was able to quickly write a message just by drawing the letters on the table in front of me. It wasn't perfect — it misread a capital "I" as an "H" — but it was surprisingly intuitive. I was able to quickly trace out a short sentence and even correct a typo (a swipe from left to right will let you add a space, while a swipe from right to left deletes the last character).
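Those gesture shortcuts amount to a tiny editing state machine: traced letters append to the draft, one swipe inserts a space, the other deletes the last character. Here's a minimal sketch of that logic, assuming a hypothetical event stream — the function names and event shapes are illustrative, not Meta's actual API:

```python
# Illustrative sketch of the gesture-editing behavior described above.
# Event shapes ("letter"/"swipe" tuples) are hypothetical, not Meta's API.

def apply_event(draft: str, event: tuple) -> str:
    """Apply one recognized neural-band event to the message draft."""
    kind, value = event
    if kind == "letter":           # a traced character is appended
        return draft + value
    if kind == "swipe":
        if value == "right":       # left-to-right swipe inserts a space
            return draft + " "
        if value == "left":        # right-to-left swipe deletes last char
            return draft[:-1]
    return draft                   # ignore unrecognized events

def compose(events) -> str:
    """Fold a stream of events into the final message text."""
    draft = ""
    for event in events:
        draft = apply_event(draft, event)
    return draft
```

For example, tracing "H", "i", swiping right, tracing a stray "x" and swiping left would yield the draft "Hi ".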

Alongside handwriting, Meta also announced a new teleprompter feature. Copy and paste a bunch of text — it supports up to 16,000 characters (roughly a half-hour's worth of speech) — and you can beam your text into the glasses' display. 

If you've ever used a teleprompter, Meta's version works a bit differently in that the text doesn't automatically scroll while you speak. Instead, the text is displayed on individual cards you manually swipe through. The company told me it originally tested a scrolling version, but that in early tests, people said they preferred to be in control of when the words appeared in front of them. 

Teleprompter is starting to roll out now, though Meta says it could take some time before everyone is able to access it.

The updates are among the first major additions Meta has made to its display glasses since launching them late last year, and a sign that, like its other smart glasses, the company plans to keep them fresh with new features. Elsewhere at CES, the company announced some interesting new plans for the device's neural band and said it was delaying a planned international rollout of the device.

This article originally appeared on Engadget at https://www.engadget.com/wearables/handwriting-is-my-new-favorite-way-to-text-with-the-meta-ray-ban-display-glasses-213744708.html?src=rss

IXI’s autofocusing lenses are almost ready to replace multifocal glasses

While wave upon wave of smartglasses and face-based wearables crash on the shores of CES, traditional glasses really haven’t changed much over the hundreds of years we’ve been using them. The last innovation, arguably, was progressive multifocals that blended near and farsighted lenses — and that was back in the 1950s. It makes sense that autofocusing glasses maker IXI thinks it’s time to modernize glasses.

After recently announcing a 22-gram (0.7-ounce) prototype frame, the startup is here in Las Vegas to show off working prototypes of its lenses, a key component of its autofocus glasses, which could be a game-changer. 

IXI’s glasses are designed for age-related farsightedness, a condition that affects many, if not most people over 45. They combine cameraless eye tracking with liquid crystal lenses that automatically activate when the glasses detect the user’s focus shifting. This means that, instead of having two separate prescriptions, as in multifocal or bifocal lenses, IXI’s lenses automatically switch between each prescription. Crucially — like most modern smartglasses — the frames themselves are lightweight and look like just another pair of normal glasses.

IXI autofocus lenses
Mat Smith for Engadget

With a row of prototype frames and lenses laid out in front of him, CEO and co-founder Niko Eiden explained the technology, which can be separated into two parts. First, the IXI glasses track the movement of your eyes using a system of LEDs and photodiodes, dotted around the edges of where the lenses sit. The LEDs bounce invisible infrared light off the eyes and then measure the reflection, detecting the subtle movements of your eye and how both eyes converge when focusing on something close.

Using infrared with just a "handful of analog channels" takes far less power than the millions of pixels and 60-times-per-second processing required by camera-based systems. IXI’s system not only tracks eye movements, but also blinking and gaze direction, while consuming only 4 milliwatts of power.
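The convergence-based detection described above can be sketched in a few lines: estimate how far away the wearer is fixating from how much each eye rotates inward, then pick the lens state. This is purely illustrative, assuming a fixed interpupillary distance and a made-up near/far threshold — it is not IXI's actual firmware logic:

```python
import math

# Illustrative sketch (not IXI's firmware) of vergence-based focus
# detection: the closer the fixation point, the more the eyes converge.
IPD_MM = 63.0               # assumed interpupillary distance
NEAR_THRESHOLD_MM = 600.0   # assumed cutoff: closer than ~60cm is "near"

def fixation_distance_mm(left_angle_deg: float, right_angle_deg: float) -> float:
    """Estimate fixation distance from each eye's inward rotation.

    Angles are measured from straight-ahead; positive means the eye is
    rotated toward the nose (converging).
    """
    vergence = math.radians(left_angle_deg + right_angle_deg)
    if vergence <= 0:
        return float("inf")   # eyes parallel or diverging: far focus
    # Small-angle geometry: distance is roughly IPD / vergence angle
    return IPD_MM / vergence

def lens_state(left_angle_deg: float, right_angle_deg: float) -> str:
    """Return which prescription the liquid-crystal lens should show."""
    distance = fixation_distance_mm(left_angle_deg, right_angle_deg)
    return "near" if distance < NEAR_THRESHOLD_MM else "far"
```

With these assumed numbers, eyes each rotated 3.5 degrees inward would put the fixation point around half a meter away and trigger the near prescription, while near-parallel gaze keeps the lenses in their distance state.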

IXI autofocus lenses
Mat Smith for Engadget

Most of the technology, including memory, sensors, driving electronics and eye tracker, is in the front frame of the glasses and part of the arms closest to the hinge. The IXI prototype apparently uses batteries similar in size to those found in AirPods, which gives some sense of the size and weight of the tech being used. The charging port is integrated into the glasses’ left arm hinge. Naturally, this does mean they can’t be worn while charging. IXI says that a single charge should cover a whole day’s usage.

The prototype frames I saw this week appeared to be roughly the same weight as my traditional chunky specs. And while these are early iterations, IXI’s first frames wouldn’t look out of place in a lineup of spectacle options.

The team has also refined the nose pieces and glasses arms to accommodate different face shapes. Apparently, when testing expanded from Finland to the UK, British faces were “...different.” A little harsh when talking to me, a Brit.

Eiden pulled out some prototype lenses, made up of layers of liquid crystal and a transparent ITO (indium tin oxide) conductive layer. This combination is still incredibly thin, and it was amazing to watch the layers switch almost instantly into a prescription lens. It seemed almost magical. As they’re so thin, they can be easily integrated into lenses with existing prescriptions, and they can even provide cylindrical correction for astigmatism.

Autofocus lenses could eliminate the need for multiple pairs of glasses, such as bifocals and progressives. Even if the glasses were to run out of power, they’d still function as a pair of traditional specs with your standard prescription, just lacking the near-vision boost. IXI’s sensor sensitivity can also offer insight into other health conditions: it can detect dry eyes, estimate attentiveness and, by tracking where you’re looking, even monitor posture and neck movement. According to Eiden, blink rate changes with focus, daydreaming and anxiety, and all of that generates data that can be shown in the companion app.

IXI autofocus lenses
Mat Smith for Engadget

Hypothetically, the product could even adapt prescriptions dynamically, going beyond the simple vision correction of Gen 1. For example, it could offer stronger corrections as your eyes get fatigued through the day.

IXI appears to be putting the pieces in place to make these glasses a reality. It still needs to obtain the necessary medical certifications in order to sell its glasses and get all the production pieces in place. It’s already partnered with Swiss lens-maker Optiswiss for manufacturing. Eiden says the final product will be positioned as a high-end luxury glasses option, selling through existing opticians. The company hopes to finally launch its first pair sometime next year.

This article originally appeared on Engadget at https://www.engadget.com/wearables/ixis-autofocusing-lenses-multifocal-glasses-ces-2026-212608427.html?src=rss

Razer put a waifu in a bottle at CES 2026

Last year Razer showed off Project Ava as a digital assistant that lived inside your computer to help adjust settings or provide gaming tips. But now at CES 2026, the company’s AI companion platform has gotten a major glow-up while moving into some new digs. 

Now, in lieu of being constrained entirely to your PC’s screen, Razer has given Project Ava a real home in the form of a small tube that can display a 5.5-inch animated hologram of the AI’s avatar. You’ll still need to connect it to your computer via USB-C to provide Ava with the power and data it needs. However, all of your companion’s other components are built into its abode, including dual far-field mics so you can talk to it, a down-firing full-range speaker so it can talk and an HD camera with an ambient light sensor so the AI can see and react to its surroundings.   

But perhaps the biggest upgrade to the project is that instead of just Ava, who Razer describes as “a calm, reliable source of energy to help you keep things clear, efficient, and always on point,” there are three or four new personas (depending on how we’re counting) joining the roster. Kira looks like a TikTok e-girl decked out in a frilly outfit complete with Razer neon green accents, while Zane is her edgy masculine alternative who kind of reminds me of the Giga Chad meme, but with extra snake tattoos. Then there’s Sao, who appears to be directly inspired by iconic Japanese salarywoman Saori Araki. Finally, there’s an avatar made in the likeness of Faker (Lee Sang-hyeok), the most successful and well-known League of Legends player of all time and one of Razer's sponsored esports athletes.

The new peripheral for Project Ava is a cylinder that can display a 5.5-inch hologram of an AI companion.
Sam Rutherford for Engadget

The idea now is that instead of being trapped inside your computer, Ava or one of Razer’s other personas can sit on your desk and be your companion for everything. They can remind you of upcoming events, respond to questions or even comment on your outfit using Razer’s built-in camera. That said, if you need some privacy, the device’s mics can be muted, and the company says it’s planning to put a physical camera shutter on final retail models. Of course, Ava or any of the other avatars can still hang out while you game and give you advice. During my demo, Kira helped pick out a loadout in Battlefield 6 based on user criteria and even provided pros and cons for some of the game’s other equipment options.

Project Ava's expanded roster of AI companions
Razer

Unfortunately, while I did get to see Kira and Zane talk, dance and sway in their little bottles, Sao and Faker weren’t quite ready to make their holographic debuts. But according to Razer, that’s sort of by design, as Project Ava is very much a work in progress. Currently, the avatars’ responses are generated by xAI’s Grok (yikes!), but the platform was created as a sort of open-source project that will support other models like Gemini or ChatGPT.

Down the line, Razer is hoping to add the ability for users to create their own unique avatars and companions based on their input or inspiration from real-world objects. Meanwhile, for avatars like Faker’s, because he’s also an actual person, Razer wants additional time to make the AI companion helpful with topics like real-time League of Legends coaching.

Say hello to Giga Chad, I mean Zane.
Sam Rutherford for Engadget

That said, while some folks might find Project Ava a bit weird or unnerving, it actually feels pretty tame (almost cute even) in an era where people are already marrying their AI partners. And if you’re the kind of person who prefers digital companions over flesh-and-blood alternatives (you know, people), I guess it’s kind of nice to have a more tangible representation of your electronic waifus and husbandos.

Faker's avatar was only viewable in this nearly life-size mock up.
Sam Rutherford for Engadget

Sadly, Razer has not provided full pricing for Project Ava’s holographic peripheral, though a representative said that it will be in the same ballpark as the company’s other peripherals. I’m estimating a final cost of around $200. Reservations for Project Ava are currently live with a $20 deposit before official shipments begin sometime in the second half of 2026.

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/razer-put-a-waifu-in-a-bottle-at-ces-2026-205315908.html?src=rss

YouTube will let you exclude Shorts from search results

YouTube introduced some new filters to its advanced search tools today. Possibly the most exciting change is that Shorts are now listed as a content type, so the three-minute-or-less videos can be excluded from your search results.

This is a welcome update for any of us who have been on the hunt for a long-form explainer only to wade through dozens of ten-second clips before finding anything close to our goal. Especially with the addition of even more AI slop last year thanks to the Google Veo 3 engine, an option to exclude Shorts may look even more appealing.

The other updates include a pair of renamed features within advanced search. The "Sort By" menu will now be called "Prioritize." Likewise, the "View Count" option has been renamed to "Popularity;" this will allow YouTube's algorithms to account for other metrics such as watch time to gauge how much other users are engaging with a particular video. A pair of former filter options have also been removed; there will no longer be choices to search for "Upload Date - Last Hour" and "Sort by Rating."

This article originally appeared on Engadget at https://www.engadget.com/entertainment/youtube/youtube-will-let-you-exclude-shorts-from-search-results-204500097.html?src=rss

Hands-on with Fender Audio’s headphones and speakers at CES 2026

Fender Audio may have announced its new headphones and speakers right before CES, but Las Vegas afforded us the first opportunity to see the brand’s new lineup in person. Fender Audio is a brand from Riffsound, which is designing and making new devices after licensing the Fender name. It’s been a while since the guitar and amplifier company made any general-use speakers of its own, and this new arrangement is similar to what Zound was doing with Marshall for a spell.

Logistics out of the way, let’s get down to what the Mix and Ellie are like in the flesh. First, the Mix headphones offer a modular construction that allows you to replace nearly every piece as needed. The ear cups detach from the headband and the ear pads are replaceable. You can also swap out the battery, thanks to an easy-to-access slot behind one ear pad. And on the other side, a USB-C dongle for wireless lossless audio is stowed for safe keeping (wired lossless audio over USB-C is also available).

Fender Audio Mix headphones
Billy Steele for Engadget

Fender Audio kept the controls simple on the Mix, opting for a single joystick for volume and playback changes. The joystick also serves as the power and pairing control, while the only other button cycles through active noise cancellation (ANC) modes. In terms of sound, the Mix will satisfy listeners who crave deep bass, and vocals cut through clearly. In my brief demo, I would've liked more mid-range, but I'll wait until I get a review unit for a full assessment there. The other standout feature is battery life: the Mix will offer up to 52 hours of use with ANC enabled (up to 100 hours with it off).

Then there are the Elie speakers. Both offer a similar set of features, including two wireless inputs for microphones (the company is working on its own model) and a combination XLR and 1/4-inch input for instruments. The Elie 06 is the smaller unit, housing a tweeter, full-range driver and subwoofer with 60 watts of output. The larger Elie 12 doubles all of that, serving as a more robust but still very portable option.

Fender Audio Elie speakers
Billy Steele for Engadget

Both Elie units can be used in a single configuration or as a stereo pair. You can also connect up to 100 of the speakers via a Multi mode. Fender Audio has done a nice job here of checking all of the usual Bluetooth speaker boxes while offering something unique in terms of all those inputs. It's like the company combined "regular" portable speakers with larger party boxes, offering something for customers who don't want a massive device or any of the flashing lights.

Of course, none of these specs matter if the company doesn't nail the sound quality. While I'll wait until I can spend several hours with review units before making any final judgment, I can tell you that both Elie speakers made a great first impression. There's ample bass in the tuning of both, though the larger Elie 12 obviously offers more thump. Both units also provide excellent vocal clarity and nice detail in the highs; I made sure to select test tracks with lots of subtle sounds, like Béla Fleck's banjo tunes.

The back of the Fender Audio Elie 06 speaker
Billy Steele for Engadget

Fender Audio says the arrival of the entire new lineup is imminent. The headphones and the Elie 06 will each cost $299, while the Elie 12 is $399.

This article originally appeared on Engadget at https://www.engadget.com/audio/hands-on-with-fender-audios-headphones-and-speakers-at-ces-2026-203104561.html?src=rss

Dell debuts world’s first 52-inch curved monitor to replace multimonitor setups

Multimonitor setups have taken over professional and creative spheres in a big way, boosting productivity like never before. Dell has upped the ante at CES 2026 with the world's first 52-inch ultrawide curved monitor, designed for data professionals who demand maximum screen real estate. The 6K IPS Black display acts as a command center, with connectivity options that leave nothing to chance.

The Dell UltraSharp 52 Thunderbolt Hub Monitor essentially combines a 43-inch 4K display and two vertical 27-inch QHD monitors into a single screen. It eliminates the need for multimonitor setups, along with the accompanying organizational hassles and cable clutter.

Designer: Dell

The numbers on this 52-inch beast are staggering in every respect. It has an ultrawide 21:9 aspect ratio, compared to the 16:9 used on most monitors, and its 6,144 x 2,560 resolution (129 pixels per inch) and 120Hz variable refresh rate ensure it displays any kind of content with maximum precision. Gaming is theoretically possible here, but you'll need to pair it with an equally beastly PC. The IPS Black panel may not match an OLED, but Dell says it still delivers deeper blacks than conventional IPS, a 2,000:1 contrast ratio and professional-grade color accuracy.

Staring at such a big screen for long hours can take a toll on your eyes, and Dell has that covered with eye-comfort features that cut blue light by 80 percent. An ambient light sensor further reduces strain by adjusting the display settings to your surroundings. Best of all, the monitor can connect to four PCs or Macs simultaneously via its two HDMI 2.1 ports, two DisplayPort 1.4 ports and a Thunderbolt 4 port with support for Power Delivery up to 140W. In addition, it features three USB-C 10Gbps upstream ports, four 10Gbps USB-A ports and an RJ45 Ethernet port. For quick access, there are two USB-C ports and a USB-A port on the front, all three of which support 10Gbps transfer speeds.

When connected to multiple systems, the wide screen can be partitioned into two sections, and the KVM (keyboard, video and mouse) feature lets users control the connected computers with a single keyboard and mouse. The monitor offers up to 90mm of height adjustment, along with tilt, swivel and slant positioning for maximum flexibility. It carries a price tag of $2,800, and if you want the stand, that'll cost an extra $100. This certainly isn't a curved monitor for everyone, but it could be worth every penny for those who have needed something like this all along.

If this huge monitor is a bit too much, Dell also announced a 32-inch UltraSharp display with 4K resolution and a QD-OLED panel. The 120Hz display supports True Black 500 HDR and Dolby Vision. The Dell UltraSharp 32 4K QD-OLED (U3226Q) is expected to launch in February 2026 for $2,599.

The post Dell debuts world’s first 52-inch curved monitor to replace multimonitor setups first appeared on Yanko Design.

Emerson Smart brings offline voice control to lamps and fans

Perhaps you like the idea of controlling your home appliances with your voice, but aren't super keen on a data center processing recordings of you. Fair enough. The trade-off for most smart home conveniences is relinquishing at least some of your privacy. Today at CES, I saw a line of home appliances from Emerson Smart that adjust power and settings via voice commands. But those commands are recognized on the devices themselves, not carried over Wi-Fi and processed elsewhere.

The huge array of smart plugs, fans, heaters and even air fryers requires no app for setup and no access to Wi-Fi. Instead, I said, "hey Emerson, lights on" or "hey fan, turn on low," and the devices in the demo space acted accordingly. A few of the devices pair the mic with a speaker and can respond when a command is received.

A bit of built-in programming on the air fryers allows them to understand commands for 100 cooking presets, so saying things like "reheat this pizza" or "cook these frozen french fries" will set the correct mode, time and temperature. Of course, you can also just say "cook at 350 degrees for 10 minutes" and it'll comply. Most of the commands for the other items are pretty simple but let you do things like set a timer, turn on oscillation and set intensity levels.

Selection of available commands for Emerson Smart devices.
Amy Skorheim for Engadget

Some of the devices allow for a small amount of programmability. Pressing and holding the button on the smart plug, for example, changes the wake word to “plug two,” doing it again swaps it to “plug three,” and so on. That way, if you have more than one plug in a room, you can operate them individually.

The small demo space in which I talked to the devices (and which did its best to shut out the thrumming noise of the CES show floor) had at least six models active and listening for my words. When I said "Hey Emerson," both an air fryer and one of the heaters responded. That was one limitation I could see: if you outfit your entire home in Emerson Smart gear, it might take some time to name and position everything so it all works coherently. The lack of an app means programmability is limited, too. That's the trade-off privacy-conscious or app-averse users will have to make if they want to turn on the lamp and crank up the fan just by talking when they walk into a room.

Since this is the only offline, non-DIY voice-controlled appliance lineup out there, you're stuck with whatever designs Emerson Smart thinks look good. Thankfully, the overall design is clean and modern, if a little basic. The upcoming air purifier and fan models were decidedly more attractive.

A new Emerson Smart air purifier and fan combo.
Amy Skorheim for Engadget

A handful of devices are available for sale now, but new Emerson Smart products will go into production later this year.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/emerson-smart-brings-offline-voice-control-to-lamps-and-fans-201500078.html?src=rss