Switchbot came to CES with a laundry robot you might actually be able to buy

CES 2026 isn't the first year we've seen a wave of interesting robots or even useful robots crop up in Las Vegas. But it's the first year I can remember when there have been so many humanoid and humanoid-like robots performing actually useful tasks. Of those, Switchbot's Onero H1 has been one of the most intriguing robot helpers I've seen on the show floor, especially because the company says that it will actually go on sale later this year (though it won't come cheap). 

Up to now, Chinese company Switchbot has been known for its robot vacuums and smart home devices. Much of that expertise is evident in Onero. The unexpectedly cute robot has a wheeled base that looks similar to the company's robot vacuums, but is also equipped with a set of articulated arms that can help it perform common household tasks. 

I was able to see some of its abilities at Switchbot's CES booth, where Onero dutifully picked up individual articles of clothing from a couch, rolled over to a washing machine, opened the door, placed the items inside and closed the door. The robot moved a bit slowly; it took nearly two minutes to grab one piece of clothing and deposit it inside the appliance, which was only a few feet away. 

I'm not sure if its slowness was a quirk of the poor CES Wi-Fi, a demo designed to maximize conference-goers' attention or a genuine limitation of the robot. But I'm not sure it matters all that much. The whole appeal of a chore robot is that it can take care of things when you're not around; if you come home to a load of laundry that's done, it's not that concerning if the robot took longer to complete the task than you would have. The laundry is done and you don't have to do it. That's the dream.  

Under the hood, Onero is powered by RealSense cameras and other sensors that help it learn its surroundings, as well as on-device AI models.

The demo of course only offered a very limited glimpse of Onero's potential capabilities. In a promotional video shared by Switchbot, the company suggests the robot can do much, much more: serve food and drinks, put dishes away, wash windows, fold clothes and complete a bunch of other — frankly, impressive — tasks. The Onero in the video also has an articulated hand with five fingers that gives it more dexterity than the claw-handed one I saw at CES. A Switchbot rep told me, though, that the company plans to offer both versions when the robot does go on sale. 

Which brings me to the most exciting part about watching Onero: the company is actually planning on selling it this year. A Switchbot rep confirmed to me it will be available to buy sometime in 2026, though it will likely be closer to the end of the year. The company hasn't settled on a final price, but I was told it will be "less than $10,000." 

While we don't know how much less, it's safe to say Onero won't come cheap. It also seems fair to say that this will be a very niche device compared to many of Switchbot's other products. But, if it can competently handle everything the company claims it can, there are probably a lot of people and businesses that would be willing to pay.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/switchbot-came-to-ces-with-a-laundry-robot-you-might-actually-be-able-to-buy-153000025.html?src=rss

Dreame’s latest robot vacuum concept has slightly terrifying legs that can climb full-size stairs

Robot vacuum companies are once again out in full force at CES 2026, giving their devices a new set of intriguing — and sometimes unsettling — capabilities. This year, Chinese appliance maker Dreame is showing off a vacuum prototype with giant legs that can climb up and down an entire flight of stairs.

The concept, called the Cyber X, was previewed last year at IFA in Berlin. The vacuum sports a somewhat terrifying set of legs with rubber treads that allow it to autonomously navigate multi-story environments. While Dreame has previously shown off vacuums that can move up smaller steps, it says the Cyber X can climb stairs up to 25cm (9.8 inches) high and slopes up to 42 degrees. It can manage both straight and curved staircases, and can climb a flight of steps in 27 seconds, according to the company.

We got a chance to see the Cyber X and its stair-climbing abilities at Dreame’s CES booth, and the device was able to deftly crawl up and down a flight of stairs. The Cyber X didn’t use its “legs” to walk up the steps, though. Instead, it used the treads horizontally, moving a bit like a miniature cleaning tank.

Interestingly, the actual vacuum is separate from the climbing apparatus. As you can see in the image below, the larger device with legs has an opening where the actual robot vacuum can dock inside and sit while the Cyber X climbs stairs.

That likely means the Cyber X isn’t able to clean the stairs themselves, though it does cleverly solve the problem of transporting the vacuum throughout multi-story environments.

In addition to its legs, the Cyber X’s vacuum also has a built-in water tank to support mopping abilities, and a laser-powered navigation system to help it maneuver up stairs and around other obstacles. It also has a braking system that allows it to stay stable on floors and stairs, even if the battery dies.

Dreame's Cyber X. (Karissa Bell for Engadget)

For now, Dreame says the Cyber X is just a research prototype and hasn't indicated if it plans to make it, or a robo vac like it, more widely available at some point in the future. But Dreame has a history of showing off innovative features at CES ahead of an actual release. Last year, the company had a prototype vacuum with a mechanical arm at its CES booth. This year, it announced a new vacuum with very similar abilities.

The company also announced the Dreame X60 Max Ultra, its latest flagship robot vacuum that can roll up smaller steps. The X60 Max Ultra, which costs $1,699, can move over stairs up to 8.8cm (about 3.4 inches), a small improvement over last year's X50, which could clear heights of 6cm (about 2.4 inches). That's not enough to manage a full-size stair, which is typically around 7 inches, but it should make the X60 flexible enough to navigate threshold steps and other small obstacles.

Update, January 6, 2026, 5:07PM PT: This post was updated with new photos and video and to add additional information about the Cyber X after seeing a live demonstration at Dreame’s CES booth.

This article originally appeared on Engadget at https://www.engadget.com/home/dreames-latest-robot-vacuum-concept-has-slightly-terrifying-legs-that-can-climb-full-size-stairs-210000399.html?src=rss

Dreame’s robot vacuum with an arm is back at CES 2026 and it can do more than pick up shoes

Last year at CES, Dreame showed off a robot vacuum prototype with a mechanical arm. But while we were able to see the arm extend and retract, we never saw the device actually grab anything, which was a bit disappointing.

This year, though, the company has made its arm-enabled vacuum a reality with the Cyber 10 Ultra. Dreame previewed it recently at IFA in Berlin, but has now confirmed it will be on sale later this year. 

The vacuum has an extendable arm that looks pretty similar to the prototype version we saw last year. It extends from the top of the vacuum and has a claw-like device at the end for scooping up objects. According to Dreame, it can pick up items that weigh up to 500 grams (about 1 pound), so it should be able to grab a wider variety of stuff than the Roborock vac we saw last year, which had a 300-gram weight limit for its arm. 

The arm can also do more than pick up stuff from the floor. It supports its own cleaning accessories, and can grab vacuum nozzles and brush attachments from its base station. This allows the arm to act as an extension of the vacuum itself so it can be used similarly to how you might use hose attachments to reach hard-to-get areas with a traditional vacuum. 

We were able to see a brief demo of the Cyber 10 arm in action on the CES show floor. It was able to pick up balls and place them in a basket. Unfortunately, we didn’t get to see it lift anything heavier or grab its cleaning attachments, but we were able to get a good look at the base station and the small cubbies where they will be stored.

The base station that holds the attachments for the vacuum's arm. (Karissa Bell for Engadget)

And, like Dreame's other robot vacuums, the Cyber 10 Ultra also has mopping abilities and can climb small steps up to 6cm (about 2.4 inches) high. That's not quite as impressive as the tank-like, stair-climbing Cyber X prototype the company also brought to CES, but it should help the Cyber 10 reach a few extra places in the house. 

The company hasn't announced an exact release date, but says it's targeting August of this year and currently expects the Cyber 10 Ultra to cost around €1799 (about $2,100).

Update, January 6, 2026, 4:17PM PT: This story was updated with new photos, a video and information about the Cyber 10 Ultra.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/dreames-robot-vacuum-with-an-arm-is-back-at-ces-2026-and-it-can-do-more-than-pick-up-shoes-210000020.html?src=rss

Meta’s EMG wristband is moving beyond its AR glasses

Meta has been experimenting with EMG technology for years. In 2025, the company commercialized it for the first time in its Meta Ray-Ban Display glasses, which users control via a dedicated neural band that is able to interpret subtle muscle movements in the wrist.

Now, at CES 2026, the company is offering its first look at how its neural band could be used to control devices outside of its smart glasses lineup. Meta has teamed up with Garmin, as well as a handful of research partners, to explore some intriguing use cases for its wrist-based controller.

The social media company has previously worked with Garmin on fitness integrations for its glasses. But at CES, the companies were showing off a very early demo of how Meta's neural band can be used inside a car to control the built-in infotainment system. 

The experience is part of Garmin's "Unified Cabin" concept, which explores a bunch of AI-centric in-car experiences. The demo I tried was fairly limited: while wearing a neural band, I was able to navigate two apps on a touchscreen display in Garmin's cockpit setup. In one, I used pinch and swipe gestures to manipulate an onscreen model of a car, much like how I would use the band to zoom in and out of an image while wearing the display glasses. The second demo, somewhat bizarrely, was a game of 2048. I used the same swipe gestures to move the tiles around. 

Neither of those are the kinds of experiences you immediately think of when you imagine "in-car entertainment," but Garmin, which works with a number of major car brands on infotainment systems, seems to be thinking about some more practical use cases too. The company told me that it will explore using the neural band to control vehicle functions like rolling down windows or unlocking doors. 

Elsewhere, Meta also announced a research collaboration with the University of Utah that will explore how its EMG tech can be used to help people who have ALS, muscular dystrophy and other conditions that affect the use of their hands.

Researchers will work with Meta to test gestures that could enable people to control smart speakers, blinds, thermostats, locks and other household devices using the neural band. "Meta Neural Band is sensitive enough to detect subtle muscle activity in the wrist — even for people who can’t move their hands," the company explains in a blog post. Researchers will also look at using the band for mobility use cases, like the University of Utah's TetraSki program, which currently uses a joystick or mouth-based controller to help participants ski.

Update, Tuesday, January 6, 2026, 3:40PM PT: Added a video from Garmin’s demo.

This article originally appeared on Engadget at https://www.engadget.com/wearables/metas-emg-wristband-is-moving-beyond-its-ar-glasses-120000503.html?src=rss

Agibot’s humanoid robots can give directions and learn your TikTok dances

For better or worse, CES 2026 is already shaping up to be a big year for humanoid robots. Chinese company Agibot showed up with two: the roughly human-sized A2 and the slightly smaller X2, both of which were displaying their surprisingly impressive dancing abilities. 

We watched both robots walk around, wave at passersby and show off their best moves. The larger A2 mostly kept its legs still and danced mainly with its arms. The smaller X2, on the other hand, is a bit more nimble — it has a larger set of "feet" to give it more stability — and those abilities were on full display.

At the time we saw them, the robots were controlled partially by an Agibot rep using a dedicated controller, but the company told me the robots are able to move autonomously once they've used their onboard sensors to map out their environment. Agibot, which has already shipped several thousand robots in China and plans to make them available in the United States this year, says both the A2 and X2 are intended to provide a flexible platform so people can interact with the robots in a variety of situations.

Agibot envisions the larger A2 as a kind of hospitality helper robot that can greet visitors at museums or conferences (like CES), provide directions or even walk alongside its human guests. 

The smaller X2, meanwhile, could be suited for educational purposes or other scenarios where you might want a robot with slightly more human-like movements. It could even be a good TikTok companion, as Agibot's head of communications, Yuheng Feng, explained to me: "Take a TikTok video, for example. You can use that video to train the robot, [so] it can also dance exactly like you did in the video." 

The company hasn't given details on when its robots might show up in the US or how much they might cost. Feng told me a lot will depend on how companies want to use them, because the hardware can be customized for each use case. For now, though, we'll just soak in the dance moves.

This article originally appeared on Engadget at https://www.engadget.com/ai/agibots-humanoid-robots-can-give-directions-and-learn-your-tiktok-dances-045049798.html?src=rss

Japanese startup Ludens AI brought two very adorable robots to CES 2026

CES 2026 is already shaping up to be an interesting year for robots. But while some companies are chasing humanoids that can help you do stuff, there are also a surprising number of robots whose main job is to be cute and keep you company.

Japanese startup Ludens AI is showing off two extremely adorable robot companions at CES. Cocomo is an autonomous robot pet that can follow you around the house and respond to voice and touch. It has a fuzzy, egg-shaped body, but the version we saw at CES was wearing an orange suit with ears that made it look a bit like a teddy bear. It was moving around on a wheeled base, but it also has tiny legs if you prefer to carry it around and hold it. 

Cocomo's exterior is meant to stay close to human body temperature at 98.6 degrees Fahrenheit, and the company says it can rise to 102 degrees in "high contact" situations like hugging. And while Cocomo can interact and respond to your actions, it "speaks" with hums and other sounds rather than words.

We didn't get to witness many of its abilities in action due to the loud environment, but Ludens says that Cocomo is designed to bond with its owners over time. "Cocomo engages through spontaneous gestures, imitation, and gentle initiation - learning what makes you laugh, what comforts you, and when to surprise you," the company says. 

Ludens didn't share pricing or availability info for Cocomo, but it has a waitlist where you can sign up for updates on a forthcoming crowdfunding campaign. 

Ludens AI's Inu robot. (Karissa Bell for Engadget)

Ludens also showed off a smaller, but also very adorable, robot called Inu, which it describes as a "desktop alien puppy." Rather than a robot that can move with you from room to room, Inu is meant to sit on your desk and keep you company while you work. It can also interact via audio and movement: it has a little tail that wiggles in response to voice and touch, and its single eye can "blink." 

Ludens plans to launch a crowdfunding campaign for Inu later this year.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/japanese-startup-ludens-ai-brought-two-very-adorable-robots-to-ces-2026-021914130.html?src=rss

LG reveals its laundry-folding robot at CES 2026

LG has unveiled its humanoid robot that can handle household chores. After teasing the CLOiD last week, the company has offered its first look at the AI-powered robot it claims can fold laundry, unload the dishwasher, serve food and help out with other tasks. 

The CLOiD has a surprisingly cute "head unit" that's equipped with a display, speakers, cameras and other sensors. "Collectively, these elements allow the robot to communicate with humans through spoken language and 'facial expressions,' learn the living environments and lifestyle patterns of its users and control connected home appliances based on its learnings," LG says in its press release. 

The robot also has two robotic arms — complete with shoulder, elbow and wrist joints — and hands with fingers that can move independently. The company didn't share images of the CLOiD's base, but it uses wheels and technology similar to what the appliance maker has used for robot vacuums. The company notes that its arms are able to pick up objects that are "knee level" and higher, so it won't be able to pick up things from the floor.

The CLOiD robot unloading a dishwasher. (LG)

LG says it will show off the robot completing common chores in a variety of scenarios, like starting laundry cycles and folding freshly washed clothes. The company also shared images of it taking a croissant out of the oven, unloading plates from a dishwasher and serving a plate of food. Another image shows it standing alongside a woman in the middle of a home workout, though it's not clear how the CLOiD is aiding with that task.

We'll get a closer look at the CLOiD and its laundry-folding abilities once the CES show floor opens later this week, so we should get a better idea of just how capable it is. For now, it sounds like LG intends this to be more of a concept than a product it actually plans to sell. The company says that it will "continue developing home robots with practical functions and forms for housework" and also bring its robotics technology to more of its home appliances, like refrigerators with doors that can automatically open.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/lg-reveals-its-laundry-folding-robot-at-ces-2026-215121021.html?src=rss

Instagram chief: AI is so ubiquitous ‘it will be more practical to fingerprint real media than fake media’

It's no secret that AI-generated content took over our social media feeds in 2025. Now, Instagram's top exec Adam Mosseri has made it clear that he expects AI content to overtake non-AI imagery, a shift with significant implications for the platform's creators and photographers.

Mosseri shared the thoughts in a lengthy post about the broader trends he expects to shape Instagram in 2026. And he offered a notably candid assessment on how AI is upending the platform. "Everything that made creators matter—the ability to be real, to connect, to have a voice that couldn’t be faked—is now suddenly accessible to anyone with the right tools," he wrote. "The feeds are starting to fill up with synthetic everything."

But Mosseri doesn't seem particularly concerned by this shift. He says that there is "a lot of amazing AI content" and that the platform may need to rethink its approach to labeling such imagery by "fingerprinting real media, not just chasing fake."

From Mosseri (emphasis his):

Social media platforms are going to come under increasing pressure to identify and label AI-generated content as such. All the major platforms will do good work identifying AI content, but they will get worse at it over time as AI gets better at imitating reality. There is already a growing number of people who believe, as I do, that it will be more practical to fingerprint real media than fake media. Camera manufacturers could cryptographically sign images at capture, creating a chain of custody.
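
Mosseri doesn't explain how such a scheme would actually work, and real provenance efforts like the C2PA's Content Credentials involve signed manifests, certificate chains and rules for handling edits. But the core mechanism he's describing is straightforward. The minimal Python sketch below is only an illustration of the idea, not Meta's or any camera maker's actual design; the key handling, the function names and signing raw bytes directly are all hypothetical simplifications. It uses the Ed25519 primitives from the third-party cryptography package.

# Hypothetical sketch of "signing at capture": the camera holds a private
# key (ideally in secure hardware) and anyone with the matching public key
# can later confirm the image bytes are unchanged since capture.
# Requires the third-party `cryptography` package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

camera_key = Ed25519PrivateKey.generate()  # stand-in for a per-device key
public_key = camera_key.public_key()       # published or registered by the maker

def sign_at_capture(image_bytes: bytes) -> bytes:
    """Camera-side: sign the raw image bytes at the moment of capture."""
    return camera_key.sign(image_bytes)

def verify_provenance(image_bytes: bytes, signature: bytes) -> bool:
    """Verifier-side (e.g., a platform): check the bytes against the signature."""
    try:
        public_key.verify(signature, image_bytes)
        return True
    except InvalidSignature:
        return False

photo = b"...raw sensor data..."  # placeholder for real image data
sig = sign_at_capture(photo)
print(verify_provenance(photo, sig))         # True: untouched since capture
print(verify_provenance(photo + b"!", sig))  # False: any change breaks the chain

Even this toy version shows why the quote calls it a "chain of custody": any change after capture, including a legitimate edit, invalidates the signature, and handling edits gracefully is where real provenance standards spend most of their complexity.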

On some level, it's easy to understand how this seems like a more practical approach for Meta. As we've previously reported, technologies that are meant to identify AI content, like watermarks, have proved unreliable at best. They are easy to remove and even easier to ignore altogether. Meta's own labels are far from clear and the company, which has spent tens of billions of dollars on AI this year alone, has admitted it can't reliably detect AI-generated or manipulated content on its platform.

That Mosseri is so readily admitting defeat on this issue, though, is telling. AI slop has won. And when it comes to helping Instagram's 3 billion users understand what is real, that should largely be someone else's problem, not Meta's. Camera makers — presumably phone makers and actual camera manufacturers — should come up with their own system that sure sounds a lot like watermarking to "to verify authenticity at capture." Mosseri offers few details about how this would work or be implemented at the scale required to make it feasible.

Mosseri also doesn't really address the fact that this is likely to alienate the many photographers and other Instagram creators who have already grown frustrated with the app. The exec regularly fields complaints from the group, who want to know why Instagram's algorithm doesn't consistently surface their posts to their own followers.

But Mosseri suggests those complaints stem from an outdated vision of what Instagram even is. The feed of "polished" square images, he says, "is dead." Camera companies, in his estimation, are "betting on the wrong aesthetic" by trying to "make everyone look like a professional photographer from the past." Instead, he says that more "raw" and "unflattering" images will be how creators can prove they are real, and not AI. In a world where Instagram has more AI content than not, creators should prioritize images and videos that intentionally make them look bad. 


This article originally appeared on Engadget at https://www.engadget.com/social-media/instagram-chief-ai-is-so-ubiquitous-it-will-be-more-practical-to-fingerprint-real-media-than-fake-media-202620080.html?src=rss

Trump’s TikTok deal is another step closer to finally actually happening

Remember back in September when President Donald Trump signed an executive order that seemingly finalized some of the terms of a deal to spin off TikTok's US business? Three months later, that same deal is apparently one step closer to being official.

According to Bloomberg, TikTok CEO Shou Chew told employees that TikTok and ByteDance had signed off on the agreement for control of TikTok's US business. It sounds like the terms of the deal are roughly the same as what Trump announced earlier this year. A group of US investors, including Oracle, Silver Lake and MGX, will control a majority of the new entity, while ByteDance will keep a smaller stake in the venture. 

According to Chew's memo, the deal is expected to close January 22, 2026. “Upon the closing, the US joint venture, built on the foundation of the current TikTok US Data Security (USDS) organization, will operate as an independent entity with authority over US data protection, algorithm security, content moderation and software assurance,” he wrote, according to Bloomberg. TikTok didn’t immediately respond to a request for comment.

Notably, it's still not clear where Chinese officials stand on the deal. Trump said back in September that China was "fully on board," but subsequent meetings between the two sides have so far produced vague statements. In October, China's Commerce Ministry said it would "work with the U.S. to properly resolve issues related to TikTok." 

If a deal is indeed finalized by next month, it will come almost exactly a year after Trump's first executive order to delay a law that required a sale or ban of the app from taking effect. He has signed off on several other extensions since.

This article originally appeared on Engadget at https://www.engadget.com/social-media/trumps-tiktok-deal-is-another-step-closer-to-finally-actually-happening-001813404.html?src=rss