Dreame’s robot vacuum with an arm is back at CES 2026 and it can do more than pick up shoes

Last year at CES, Dreame showed off a robot vacuum prototype with a mechanical arm. But while we were able to see the arm extend and retract, we never saw the device, described as a prototype at the time, actually grab anything, which was a bit disappointing.

This year, though, the company has made its arm-enabled vacuum a reality with the Cyber 10 Ultra. Dreame previewed it recently at IFA in Berlin, but has now confirmed it will be on sale later this year. 

The vacuum has an extendable arm that looks pretty similar to the prototype version we saw last year. It extends from the top of the vacuum and has a claw-like device at the end for scooping up objects. According to Dreame, it can pick up items that weigh up to 500 grams (about 1 pound), so it should be able to grab a wider variety of stuff than the Roborock vac we saw last year, which had a 300-gram weight limit for its arm.

The arm can also do more than pick up stuff from the floor. It supports its own cleaning accessories, and can grab vacuum nozzles and brush attachments from its base station. This allows the arm to act as an extension of the vacuum itself so it can be used similarly to how you might use hose attachments to reach hard-to-get areas with a traditional vacuum. 

We were able to see a brief demo of the Cyber 10 arm in action on the CES show floor. It was able to pick up balls and place them in a basket. Unfortunately, we didn't get to see it lift anything heavier or grab its cleaning attachments, but we were able to get a good look at the base station and the small cubbies where they will be stored.

The base station that holds the attachments for the vacuum's arm.
Karissa Bell for Engadget

And, like Dreame's other robot vacuums, the Cyber 10 Ultra also has mopping abilities and can climb small steps up to 6cm (about 2.4 inches). That's not quite as impressive as the tank-like stair-climbing Cyber X prototype the company also brought to CES, but it should help the Cyber 10 reach a few extra places in the house.

The company hasn't announced an exact release date, but says it's targeting August of this year and currently expects the Cyber 10 Ultra to cost around €1799 (about $2,100).

Update, January 6, 2026, 4:17PM PT: This story was updated with new photos, a video and information about the Cyber 10 Ultra.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/dreames-robot-vacuum-with-an-arm-is-back-at-ces-2026-and-it-can-do-more-than-pick-up-shoes-210000020.html?src=rss

Meta’s EMG wristband is moving beyond its AR glasses

Meta has been experimenting with EMG technology for years. In 2025, the company commercialized it for the first time in its Meta Ray-Ban Display glasses, which users control via a dedicated neural band that is able to interpret subtle muscle movements in the wrist.

Now, at CES 2026, the company is offering its first look at how its neural band could be used to control devices outside of its smart glasses lineup. Meta has teamed up with Garmin, as well as a handful of research partners, to explore some intriguing use cases for its wrist-based controller.

The social media company has previously worked with Garmin on fitness integrations for its glasses. But at CES, the companies were showing off a very early demo of how Meta's neural band could be used inside a car to control the built-in infotainment system.

The experience is part of Garmin's "Unified Cabin" concept, which explores a bunch of AI-centric in-car experiences. The demo I tried was fairly limited: while wearing a neural band, I was able to navigate two apps on a touchscreen display in Garmin's cockpit setup. In one, I used pinch and swipe gestures to manipulate an onscreen model of a car, much like how I would use the band to zoom in and out of an image while wearing the display glasses. The second demo, somewhat bizarrely, was a game of 2048. I used the same swipe gestures to move the tiles around. 

Neither of those are the kinds of experiences you immediately think of when you imagine "in-car entertainment," but Garmin, which works with a number of major car brands on infotainment systems, seems to be thinking about some more practical use cases too. The company told me that it will explore using the neural band to control vehicle functions like rolling down windows or unlocking doors. 

Elsewhere, Meta also announced a research collaboration with the University of Utah that will explore how its EMG tech can be used to help people who have ALS, muscular dystrophy and other conditions that affect the use of their hands.

Researchers will work with Meta to test gestures that could enable people to control smart speakers, blinds, thermostats, locks and other household devices using the neural band.  "Meta Neural Band is sensitive enough to detect subtle muscle activity in the wrist — even for people who can’t move their hands," the company explains in a blog post. Researchers will also look at using the band for mobility use cases, like the University of Utah's TetraSki program, which currently uses a joystick or mouth-based controller to help participants ski.

Update, Tuesday, January 6, 2026, 3:40PM PT: Added a video from Garmin’s demo.

This article originally appeared on Engadget at https://www.engadget.com/wearables/metas-emg-wristband-is-moving-beyond-its-ar-glasses-120000503.html?src=rss

Agibot’s humanoid robots can give directions and learn your TikTok dances

For better or worse, CES 2026 is already shaping up to be a big year for humanoid robots. Chinese company Agibot showed up with two: the roughly human-sized A2 and the slightly smaller X2, both of which were displaying their surprisingly impressive dancing abilities. 

We watched both robots walk around, wave at passersby and show off their best moves. The larger A2 mostly kept its legs still and danced mainly with its arms. The smaller X2, on the other hand, is a bit more nimble — it has a larger set of "feet" to give it more stability — and those abilities were on full display.

At the time we saw them, the robots were controlled partially by an Agibot rep using a dedicated controller, but the company told me the robots are able to move autonomously once they've used their onboard sensors to map out their environment.

The company, which has already shipped several thousand robots in China and plans to make them available in the United States this year, says both the A2 and X2 are intended to provide a flexible platform so people can interact with the robots in a variety of situations.

Agibot envisions the larger A2 as a kind of hospitality helper robot that can greet visitors at museums or conferences (like CES) and provide directions or even walk alongside their human guests. 

The smaller X2, on the other hand, could be suited for educational purposes or other scenarios where you might want a robot with slightly more human-like movements. It could even be a good TikTok companion, as Agibot's head of communications, Yuheng Feng, explained to me. "Take a TikTok video, for example; you can use that video to train the robot, [so] it can also dance exactly like you did in the video."

The company hasn't given details on when its robots might show up in the US or how much they might cost. Feng told me a lot will depend on how companies want to use them, because their hardware can be customized depending on the use case. For now, though, we'll just soak in the dance moves.

This article originally appeared on Engadget at https://www.engadget.com/ai/agibots-humanoid-robots-can-give-directions-and-learn-your-tiktok-dances-045049798.html?src=rss

Japanese startup Ludens AI brought two very adorable robots to CES 2026

CES 2026 is already shaping up to be an interesting year for robots. But while some companies are chasing humanoids that can help you do stuff, there are also a surprising number of robots whose main job is to be cute and keep you company.

Japanese startup Ludens AI is showing off two extremely adorable robot companions at CES. Cocomo is an autonomous robot pet that can follow you around the house and respond to voice and touch. It has a fuzzy, egg-shaped body, but the version we saw at CES was wearing an orange suit with ears that made it look a bit like a teddy bear. It was moving around on a wheeled base, but it also has tiny legs if you prefer to carry it around and hold it. 

Cocomo's exterior is meant to stay close to human body temperature at 98.6 degrees Fahrenheit, and the company says it can rise to 102 degrees in "high contact" situations like hugging. And while Cocomo can interact and respond to your actions, it "speaks" with hums and other sounds rather than words.

We didn't get to witness many of its abilities in action due to the loud environment, but Ludens says that Cocomo is designed to bond with its owners over time. "Cocomo engages through spontaneous gestures, imitation, and gentle initiation - learning what makes you laugh, what comforts you, and when to surprise you," the company says. 

Ludens didn't share pricing or availability info for Cocomo, but it has a waitlist where you can sign up for updates on a forthcoming crowdfunding campaign.

Ludens AI's Inu robot.
Karissa Bell for Engadget

Ludens also showed off a smaller, but also very adorable, robot called Inu, which it describes as a "desktop alien pupu." Rather than a robot that can move with you from room to room, Inu is meant to sit on your desk and keep you company while you work. It can also interact via audio and movement. It has a little tail that wiggles in response to voice and touch and its single eye can "blink." 

Ludens plans to launch a crowdfunding campaign for Inu later this year.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/japanese-startup-ludens-ai-brought-two-very-adorable-robots-to-ces-2026-021914130.html?src=rss

LG reveals its laundry-folding robot at CES 2026

LG has unveiled its humanoid robot that can handle household chores. After teasing the CLOiD last week, the company has offered a first look at the AI-powered robot it claims can fold laundry, unload the dishwasher, serve food and help out with other tasks.

The CLOiD has a surprisingly cute "head unit" that's equipped with a display, speakers, cameras and other sensors. "Collectively, these elements allow the robot to communicate with humans through spoken language and 'facial expressions,' learn the living environments and lifestyle patterns of its users and control connected home appliances based on its learnings," LG says in its press release.

The robot also has two robotic arms — complete with shoulder, elbow and wrist joints — and hands with fingers that can move independently. The company didn't share images of the CLOiD's base, but it uses wheels and technology similar to what the appliance maker has used for robot vacuums. The company notes that its arms are able to pick up objects that are "knee level" and higher, so it won't be able to pick up things from the floor.

The CLOiD robot unloading a dishwasher.
LG

LG says it will show off the robot completing common chores in a variety of scenarios, like starting laundry cycles and folding freshly washed clothes. The company also shared images of it taking a croissant out of the oven, unloading plates from a dishwasher and serving a plate of food. Another image shows it standing alongside a woman in the middle of a home workout, though it's not clear how the CLOiD is helping with that task.

We'll get a closer look at the CLOiD and its laundry-folding abilities once the CES show floor opens later this week, so we should get a better idea of just how capable it is. For now, it sounds like LG intends this to be more of a concept than a product it actually plans to sell. The company says that it will "continue developing home robots with practical functions and forms for housework" and also bring its robotics technology to more of its home appliances, like refrigerators with doors that can automatically open.

This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/lg-reveals-its-laundry-folding-robot-at-ces-2026-215121021.html?src=rss

Instagram chief: AI is so ubiquitous ‘it will be more practical to fingerprint real media than fake media’

It's no secret that AI-generated content took over our social media feeds in 2025. Now, Instagram's top exec Adam Mosseri has made it clear that he expects AI content to overtake non-AI imagery, a shift with significant implications for the platform's creators and photographers.

Mosseri shared his thoughts in a lengthy post about the broader trends he expects to shape Instagram in 2026. And he offered a notably candid assessment of how AI is upending the platform. "Everything that made creators matter—the ability to be real, to connect, to have a voice that couldn’t be faked—is now suddenly accessible to anyone with the right tools," he wrote. "The feeds are starting to fill up with synthetic everything."

But Mosseri doesn't seem particularly concerned by this shift. He says that there is "a lot of amazing AI content" and that the platform may need to rethink its approach to labeling such imagery by "fingerprinting real media, not just chasing fake."

From Mosseri (emphasis his):

Social media platforms are going to come under increasing pressure to identify and label AI-generated content as such. All the major platforms will do good work identifying AI content, but they will get worse at it over time as AI gets better at imitating reality. There is already a growing number of people who believe, as I do, that it will be more practical to fingerprint real media than fake media. Camera manufacturers could cryptographically sign images at capture, creating a chain of custody.

On some level, it's easy to understand how this seems like a more practical approach for Meta. As we've previously reported, technologies that are meant to identify AI content, like watermarks, have proved unreliable at best. They are easy to remove and even easier to ignore altogether. Meta's own labels are far from clear and the company, which has spent tens of billions of dollars on AI this year alone, has admitted it can't reliably detect AI-generated or manipulated content on its platform.

That Mosseri is so readily admitting defeat on this issue, though, is telling. AI slop has won. And when it comes to helping Instagram's 3 billion users understand what is real, that should largely be someone else's problem, not Meta's. Camera makers — presumably phone makers and actual camera manufacturers — should come up with their own system that sure sounds a lot like watermarking to "to verify authenticity at capture." Mosseri offers few details about how this would work or be implemented at the scale required to make it feasible.

That Mosseri is so readily admitting defeat on this issue, though, is telling. AI slop has won. And when it comes to helping Instagram's 3 billion users understand what is real, that should largely be someone else's problem, not Meta's. Camera makers — presumably phone makers and actual camera manufacturers — should come up with their own system, one that sure sounds a lot like watermarking, "to verify authenticity at capture." Mosseri offers few details about how this would work or be implemented at the scale required to make it feasible.

Mosseri also doesn't really address the fact that this is likely to alienate the many photographers and other Instagram creators who have already grown frustrated with the app. The exec regularly fields complaints from the group, who want to know why Instagram's algorithm doesn't consistently surface their posts to their own followers.

But Mosseri suggests those complaints stem from an outdated vision of what Instagram even is. The feed of "polished" square images, he says, "is dead." Camera companies, in his estimation, are "betting on the wrong aesthetic" by trying to "make everyone look like a professional photographer from the past." Instead, he says that more "raw" and "unflattering" images will be how creators can prove they are real, and not AI. In a world where Instagram has more AI content than not, creators should prioritize images and videos that intentionally make them look bad.


This article originally appeared on Engadget at https://www.engadget.com/social-media/instagram-chief-ai-is-so-ubiquitous-it-will-be-more-practical-to-fingerprint-real-media-than-fake-media-202620080.html?src=rss

Trump’s TikTok deal is another step closer to finally actually happening

Remember back in September when President Donald Trump signed an executive order that seemingly finalized some of the terms of a deal to spin off TikTok's US business? Three months later, that same deal is apparently one step closer to being official.

According to Bloomberg, TikTok CEO Shou Chew told employees that TikTok and ByteDance had signed off on the agreement for control of TikTok's US business. It sounds like the terms of the deal are roughly the same as what Trump announced earlier this year. A group of US investors, including Oracle, Silver Lake and MGX, will control a majority of the new entity, while ByteDance will keep a smaller stake in the venture.

According to Chew's memo, the deal is expected to close January 22, 2026. “Upon the closing, the US joint venture, built on the foundation of the current TikTok US Data Security (USDS) organization, will operate as an independent entity with authority over US data protection, algorithm security, content moderation and software assurance,” he wrote, according to Bloomberg. TikTok didn’t immediately respond to a request for comment.

Notably, it's still not clear where Chinese officials stand on the deal. Trump said back in September that China was "fully on board," but subsequent meetings between the two sides have so far produced vague statements. In October, China's Commerce Ministry said it would "work with the U.S. to properly resolve issues related to TikTok." 

If a deal is indeed finalized next month, it will come almost exactly a year after Trump's first executive order delaying a law that required a sale or ban of the app from taking effect. He has signed off on several other extensions since.

This article originally appeared on Engadget at https://www.engadget.com/social-media/trumps-tiktok-deal-is-another-step-closer-to-finally-actually-happening-001813404.html?src=rss

A Facebook test makes link-sharing a paid feature for creators

Creators and publishers have long worried about Meta's ability to throttle links to outside content. Now, the company is testing out a new scheme that effectively puts link-sharing behind a paywall for creators on Facebook.

Under the test, a Meta Verified subscription will determine how many links a creator can share on their profile per month. According to a screenshot shared by social media consultant Matt Navarra, creators in the test recently received a notification from Meta informing them that "certain Facebook profiles without Meta Verified, including yours, will be limited to sharing links in 2 organic posts per month."

Meta is making link sharing pay to play with a new test.

A spokesperson for Meta confirmed the test to Engadget. The test is currently affecting an unspecified number of creators and pages using "professional mode" on Facebook. Publishers aren't affected for now. "This is a limited test to understand whether the ability to publish an increased volume of posts with links adds additional value for Meta Verified subscribers," the spokesperson said.

While Meta seems to be trying to downplay the significance of the test, it's a notable shift for the company. Many creators and businesses rely on Facebook, and reducing their ability to send traffic to outside websites could be a significant hit. Creators are already frustrated that the company puts its better customer service features behind the Meta Verified subscription, which starts at $14.99/month. Making link-sharing a premium feature as well would be even more unpopular.

Have a tip for Karissa? You can reach her by email, on X, Bluesky, Threads, or send a message to @karissabe.51 to chat confidentially on Signal.

This article originally appeared on Engadget at https://www.engadget.com/social-media/a-facebook-test-makes-link-sharing-a-paid-feature-for-creators-224632957.html?src=rss

Meta is ‘pausing’ third-party VR headsets from ASUS and Lenovo

Last year, Meta announced that it was opening up its VR operating system to other headset makers, starting with ASUS and Lenovo. Now, it seems that Meta is pumping the brakes on the effort and those third-party Horizon OS headsets might never actually launch.

The company has "paused" the program, Road to VR reported. Meta confirmed the move in a statement to Engadget, saying that it's instead focusing on "building the world-class first-party hardware and software needed to advance the VR market." ASUS and Lenovo didn't immediately respond to a request for comment. Both companies have said little about the headsets since they were first announced in 2024. ASUS' was going to be a "performance gaming" headset under its Republic of Gamers (ROG) brand, while Lenovo's was intended to be a mixed reality device focused on "productivity, learning and entertainment" experiences.

The shift isn't entirely surprising. Meta Connect was very light on virtual reality news this year as smart glasses have become a central focus for the company. Earlier this month, Bloomberg reported that Meta was planning significant cuts to the teams working on virtual reality and Horizon Worlds. The company said at the time it was "shifting some of our investment from Metaverse toward AI glasses and wearables given the momentum there."

Still, Meta is seemingly leaving the door open for third-party VR headsets in the future. "We’re committed to this for the long term and will revisit opportunities for 3rd-party device partnerships as the category evolves," the company said.


This article originally appeared on Engadget at https://www.engadget.com/ar-vr/meta-is-pausing-third-party-vr-headsets-from-asus-and-lenovo-193622900.html?src=rss