SwitchBot’s New Onero H1 Robot Finally Does Your Chores

When humanoid robots became the big thing robotics companies were pursuing, there were probably two kinds of people reacting: those who feared robot overlords were just a few years away, and those who were excited to finally have someone to do their chores for them. The former hasn’t happened yet, thank goodness, but it looks like we’re almost there with the latter.

SwitchBot’s Onero H1, currently making waves at CES 2026, may finally deliver the long-promised dream of having our own Rosie (that’s a Jetsons reference for you). SwitchBot calls it its “most accessible AI household robot,” and it’s designed to be the household help we need, one that will not grow tired or complain that it’s being overworked. Hopefully.

Designer: SwitchBot

One key aspect that makes this robot ideal for chores is its impressive flexibility and range of motion, thanks to 22 degrees of freedom. It runs SwitchBot’s OmniSense VLA (vision-language-action) model with AI capabilities built in, so it can learn and adapt even without cloud connectivity, and it understands its environment through visual perception, depth sensing, and tactile feedback.

While it may not look like Rosie or M3GAN (again, thank goodness), this robot is a full-sized humanoid with arms, hands, a head, and yes, even a face. It has a wheeled base so it can navigate easily throughout your space, and its articulated robotic arms, labeled “A1,” can manipulate objects so it can help you or actually do your chores for you.

Some contact-intensive things that the robot can do include grasping and organizing objects, loading the dishwasher, cooking breakfast, preparing your morning and afternoon coffee, doing the dreaded laundry, washing the windows, and even opening and closing doors for you. It can also catch the jacket you throw at it when you come home. Talk about butler service!

Unlike the robot vacuums and single-purpose smart devices we’re used to, the Onero H1 represents something more ambitious. It’s part of SwitchBot’s “Smart Home 2.0” vision, where your home doesn’t just have gadgets but has systems that actually think and act on your behalf. The robot is designed to work seamlessly with SwitchBot’s existing ecosystem of task-specific robots, creating a unified smart home experience that feels less like managing technology and more like having a genuinely helpful presence in your home.

What’s particularly impressive is how it learns. The Onero H1 isn’t rigidly pre-programmed to perform tasks in one specific way. Instead, it adapts to YOUR home layout, YOUR routines, and YOUR preferences. It uses visual perception and tactile feedback to understand not just what objects are, but how they should be handled. This means it can figure out the difference between delicate glassware and sturdy pots, or learn where you prefer certain items to be organized. For those of us who’ve been juggling work, family, and the endless cycle of household chores, this kind of adaptable help could be genuinely life-changing. Imagine reclaiming those hours spent on repetitive tasks and using them for things that actually matter to you, whether that’s pursuing hobbies, spending quality time with loved ones, or simply enjoying a moment of peace.

Now, before you start clearing space in your home and budgeting for your new robot helper, there are a few things to keep in mind. While the Onero H1 will be available for pre-order through SwitchBot’s website, the company hasn’t announced pricing or a specific launch date yet, just that it’s coming “soon.” Multiple tech experts have noted that this is still very much a concept designed to show where the technology is headed, rather than a product ready for immediate mass adoption.

The SwitchBot Onero H1 represents an exciting glimpse into a future where household robots move beyond vacuuming floors to actually helping with the full range of domestic tasks. While we may need to wait a bit longer before Rosie arrives at our doorstep, it’s clear that the era of genuinely helpful household robots is no longer science fiction. It’s just around the corner.

For collectors and tech enthusiasts, the Onero H1 marks a significant milestone in consumer robotics: it could be the moment humanoid household robots begin the transition from ambitious prototypes to accessible reality. Whether you’re excited about finally having help with the dishes or simply fascinated by the technology, one thing is certain: the future of smart homes is looking a lot more hands-on, literally.


Japanese startup Ludens AI brought two very adorable robots to CES 2026

CES 2026 is already shaping up to be an interesting year for robots. But while some companies are chasing humanoids that can help you do stuff, there are also a surprising number of robots whose main job is to be cute and keep you company.

Japanese startup Ludens AI is showing off two extremely adorable robot companions at CES. Cocomo is an autonomous robot pet that can follow you around the house and respond to voice and touch. It has a fuzzy, egg-shaped body, but the version we saw at CES was wearing an orange suit with ears that made it look a bit like a teddy bear. It was moving around on a wheeled base, but it also has tiny legs if you prefer to carry it around and hold it. 

Cocomo's exterior is meant to stay close to human body temperature at 98.6 degrees Fahrenheit, and the company says it can rise to 102 degrees in "high contact" situations like hugging. And while Cocomo can interact and respond to your actions, it "speaks" with hums and other sounds rather than words.

We didn't get to witness many of its abilities in action due to the loud environment, but Ludens says that Cocomo is designed to bond with its owners over time. "Cocomo engages through spontaneous gestures, imitation, and gentle initiation - learning what makes you laugh, what comforts you, and when to surprise you," the company says. 

Ludens didn't share pricing or availability info for Cocomo, but it has a waitlist where you can sign up for updates on a forthcoming crowdfunding campaign.

Ludens AI's Inu robot. (Image: Karissa Bell for Engadget)

Ludens also showed off a smaller, but also very adorable, robot called Inu, which it describes as a "desktop alien pup." Rather than a robot that can move with you from room to room, Inu is meant to sit on your desk and keep you company while you work. It can also interact via audio and movement: it has a little tail that wiggles in response to voice and touch, and its single eye can "blink."

Ludens plans to launch a crowdfunding campaign for Inu later this year.


LG reveals its laundry-folding robot at CES 2026

LG has unveiled its humanoid robot that can handle household chores. After teasing the CLOiD last week, the company has offered its first look at the AI-powered robot it claims can fold laundry, unload the dishwasher, serve food and help out with other tasks. 

The CLOiD has a surprisingly cute "head unit" that's equipped with a display, speakers, cameras and other sensors. "Collectively, these elements allow the robot to communicate with humans through spoken language and 'facial expressions,' learn the living environments and lifestyle patterns of its users and control connected home appliances based on its learnings," LG says in its press release.

The robot also has two robotic arms — complete with shoulder, elbow and wrist joints — and hands with fingers that can move independently. The company didn't share images of the CLOiD's base, but it uses wheels and technology similar to what the appliance maker has used for robot vacuums. The company notes that its arms are able to pick up objects that are "knee level" and higher, so it won't be able to pick up things from the floor.

The CLOiD robot unloading a dishwasher. (Image: LG)

LG says it will show off the robot completing common chores in a variety of scenarios, like starting laundry cycles and folding freshly washed clothes. The company also shared images of it taking a croissant out of the oven, unloading plates from a dishwasher and serving a plate of food. Another image shows it standing alongside a woman in the middle of a home workout, though it's not clear how the CLOiD is aiding with that task.

We'll get a closer look at the CLOiD and its laundry-folding abilities once the CES show floor opens later this week, so we should get a better idea of just how capable it is. For now, LG seems to intend this as a concept rather than a product it actually plans to sell. The company says that it will "continue developing home robots with practical functions and forms for housework" and also bring its robotics technology to more of its home appliances, like refrigerators with doors that can automatically open.


How to watch the Hyundai CES 2026 press conference live

Hyundai's wheeled robot took home an award. (Image: Hyundai)

CES has long felt like a full-on auto show, but the car-centric energy seems somewhat muted at CES 2026. Sure, the Afeela electric vehicle from the Sony-Honda joint venture is returning to the show floor, but with the Trump administration yanking most EV incentives from the market, the industry isn't offering a full-court press of new vehicles in Las Vegas this year.

That includes Hyundai. While the company's Mobis subsidiary will present "more than 30 mobility convergence technologies" during CES week — including its Holographic Windshield Display — we're hearing the Korean auto giant will instead use its press conference to focus on its AI Robotics Strategy. That will apparently include showcasing its new Atlas robot, as well as the wheeled MobED robot line. We'll get into the details below, along with how to watch it today.

Hyundai's presentation takes place today, January 5 at 4PM ET, and you can livestream it on either its HyundaiUSA YouTube channel or its global YouTube channel. (We've embedded the link below.)

Hyundai is putting a huge focus on its AI Robotics Strategy during its presentation today — the theme is "Partnering Human Progress." That'll include its strategies for commercializing AI-enhanced robotics, keeping with the very AI-centric focus of this year's CES.

We'll also get a first-ever look at the company's new Atlas robot. In the teaser image shown in the press release, Atlas looks rather dog-like, which makes sense when you remember that Boston Dynamics was purchased by the Korean multinational back in 2020.

"This next-generation Atlas represents a tangible step toward the commercialization of AI Robotics, highlighting the Group’s commitment to building safe and adaptable robotic co-workers," the company said in the same press release.

But Atlas isn't the only robot the company has up its sleeve. There's also the MobED Droid, a wheeled 'bot that scored a CES 2026 Innovation Award as the show opened this week. 

While on stage, Hyundai says it will "reveal its strategic AI Robotics learning, training and expansion plans," via its Group Value Network and Software-Defined Factory approach. That includes a manufacturing strategy and an advanced smart factory.

We originally thought Hyundai would showcase its Holographic Windshield Display during its press conference, but a Hyundai representative notified us it won't be featured today. It will have a separate CES presence, though not a separate press conference. 

Update, January 5 2026, 2:58PM ET: This story has been updated to include information on the MobED robot line, and to note that the Holographic Windshield Display likely won't be featured at the press conference.


LG will show off a humanoid robot for household chores at CES 2026

LG will present a home robot named CLOiD at CES 2026 in Las Vegas. With humanoid robotics sure to feature heavily at this year's tech conference, LG has teased its home assistant before a full unveiling in January.

The company says CLOiD's two articulated arms with five individually actuated fingers are designed to help with a variety of household tasks. However, LG has not yet given a specific example of a task CLOiD can handle. We're also not sure what it looks like: aside from a couple of very close-up images of CLOiD's hands, LG is keeping the robot's appearance under wraps until the show.

LG said CLOiD is part of the company's “Zero Labor Home, Makes Quality Time” vision and a step toward its goal of "freeing customers from the time-consuming demands of housework."

CLOiD's chipset is housed in its head, which also sports a display, speaker, camera and a bevy of sensors meant to enable expressive communication. LG says its new robot is powered by its "Affectionate Intelligence" technology and is designed to interact in a neutral, user-friendly way. It's also designed to refine its responses through repeated interactions with a user.

CES often plays host to proof-of-concept products that offer a window into the future but may not make it to market. It remains to be seen if CLOiD is simply a booth-side attention-getter or something with real potential. Visitors can see CLOiD handle some real-life scenarios at LG's booth in the Las Vegas Convention Center.


Antigravity A1 Review: Reimagining What a Drone Feels Like to Fly

PROS:


  • Unique immersive experience with vision goggles

  • 8K 360 capture with post-flight reframing

  • Intuitive one-hand grip controller and automated modes lower the skill barrier

CONS:


  • Several pieces to carry and manage: drone, goggles, and controller

  • First-time setup and learning curve can feel overwhelming

  • Visual observer requirements in places like the U.S. limit solo flying

EDITOR'S QUOTE:

Antigravity A1 turns flying a drone into a new point of view, and once you are inside it, the experience feels hard to put a price on.

Antigravity is Insta360’s bold experiment in what happens when a 360‑camera company stops thinking only about the camera and starts redesigning the entire act of flying. It is an independent drone brand, incubated by Insta360, built on the same obsession with immersive imaging and playful storytelling, but free to rethink the aircraft, the controls, and the viewing experience as one coherent object. Instead of asking how to strap a 360 camera onto a drone, Antigravity asks how to make the whole system feel like a natural extension of your point of view.

Antigravity A1 is the first expression of that idea. It is a compact 8K 360 drone that arrives as a complete kit, with Vision goggles and a single‑hand Grip controller that you steer with subtle tilts and gestures. You do not fly it by staring at a phone and juggling twin sticks. You put on the goggles, step into a 360‑degree bubble of imagery, and guide the drone by moving your hand in the direction you want to travel. So what is the Antigravity A1 actually like to fly? We tested it to find out.

Designer: Antigravity

Aesthetics

Antigravity A1 presents itself more as a system than a single object. There is the compact drone with its dual cameras, the Vision goggles, and the one‑hand Grip controller. Visually, the aircraft itself is quite understated. Aside from the two opposing lenses and the leg that shields the lower camera on the ground, it looks like a neat, functional quadcopter. The drama is reserved for what the system does, not how the airframe shouts for attention.

The Vision goggles lean into an almost character-like, even bug-like look, especially when you fold up the black antennas on each side that resemble insect feelers. The front shell is white with two large, dark circular eyes, giving the whole front a slightly cartoonish face. In between and just above those eyes sits an inverted triangle-shaped grille with a subtle Antigravity logo, adding a small technical accent without breaking the simplicity.  The fabric strap and thick face padding sit behind this front mask. Wearing the goggles does look strange at first, but in a strangely cool way.

The Grip motion controller has a white plastic shell with buttons and a dial that use color and icon cues to hint at their functions. On the back, a black trigger-style pull bar sits where your index finger naturally rests, and there are additional buttons on each side. The mix of white body, black accents, and clearly marked controls makes the Grip look approachable rather than intimidating, which suits a controller that is meant to translate simple hand movements into flight.

Overall, the drone, goggles, and controller share a cohesive design language. They all use the same soft white shell, black accents, and gently rounded forms. The whole kit feels like a single, intentional system rather than three unrelated gadgets.

Ergonomics

The Vision goggles are where comfort really matters, and Antigravity has clearly spent time on fit. The goggles weigh 340 grams, yet the padding and strap geometry distribute that weight in a way that avoids obvious pressure points, even during longer sessions. The side that meets your face feels soft and accommodating, so the hardware never feels harsh. Once the 360-degree image appears, the headset fades faster than you might expect, which is exactly what you want from an immersive device. Optional corrective inserts mean many glasses wearers can enjoy a sharp view without wrestling frames under the band, which makes the experience more inclusive and less fussy.

Power for the goggles lives in a separate battery pack that you can wear on a lanyard around your neck. At 175 grams, it is not heavy, but over time, it can feel slightly cumbersome to have it hanging there, especially when you are moving around. Antigravity sells a 1.2 metre (3.9 foot) USB-C to DC power cable that lets you route the battery to a trouser pocket or bag instead, which makes the whole setup feel less dangly and more integrated.

You adjust the head strap with velcro, which works, but it is not perfect. A small buckle or hinge mechanism would make it much easier to put the goggles on or take them off while wearing a hat, without having to readjust the strap length every time. It is a minor detail, yet it shows how close the design already is. You start wishing for refinements, not fixes.

The Grip controller is where Antigravity’s ergonomic thinking really shows. It rests comfortably in one hand, with a form that supports a natural, slightly relaxed grip rather than a tense, clawed hold. For my hand, it is just a tiny bit on the large side, enough to notice but not enough to break the experience. This is very much nitpicking, and it actually underlines how well resolved the controller already is. When you are down to debating a few millimetres of girth, it means the fundamentals of comfort and control are in a very good place.

Performance

My experience with Antigravity A1 actually started at IFA in Berlin in early September. Outside the exhibition halls, I slipped on the Vision goggles while an Antigravity staff member flew the drone. As the A1 lifted and the IFA venue unfolded beneath me in every direction, my legs actually shivered a little, even though I like heights. Being wrapped in a live 360-degree view felt less like watching a screen and more like I was flying. That first taste was magical, which made me both excited and nervous to test the A1 myself later. I had almost crashed a friend’s drone years ago and had not flown since, so my piloting skills were close to none.

That magic comes with a setup phase that feels more like preparing a small system than turning on a single gadget. The first time you connect the drone, pair the Vision goggles, update firmware, and learn the grip controls, it can feel overwhelming. There are menus on the drone, options in the goggles, and status lights to decode, and they all compete for your attention at once. After a few sessions, it settles into a rhythm, but that initial ramp is something you feel before you ever lift off on your own.

Mobile app – Tutorial

Packing the Antigravity A1 means finding room for the drone, the goggles and their separate battery, and the grip controller, often in a dedicated case or carefully arranged backpack. This nudges the whole experience away from “throw it in your bag just in case” and toward “plan a proper flying session.” The result is that the A1 feels more like a deliberate outing than a casual accessory.

On paper, the A1 looks quite sensible. With the standard battery, it weighs 249 g, staying just under the 250 g threshold that works nicely with regulations in many places, and it offers up to about 24 minutes of flight time in ideal conditions. Pop in the high-capacity battery, and the weight goes over 250 g, but Antigravity quotes up to around 39 minutes in the air. In reality, you get a solid single session per pack and will want spares if you plan to film seriously.

Flight behaviour is also adjustable. There are three flight modes, Cinematic, Normal, and Sport, so you can match how the drone responds to the scene you are flying in. Together with Free Motion and FPV, that gives the A1 enough range to feel relaxed and floaty when you want it, or more direct and energetic when the shot calls for it.

Vision goggles menu

On top of those basics, Antigravity adds automated tools like Sky Genie, Deep Track, and Sky Path. Sky Genie runs preprogrammed patterns that give you smooth, cinematic moves with minimal effort. Deep Track follows a chosen subject automatically, so you can focus more on timing than stick precision. Sky Path lets you record waypoints and have the A1 repeat the route on its own, which is handy for repeated takes or for nervous pilots.

Safety and workflow sit quietly in the background, which is exactly where they should be. Obstacle sensors on the top and bottom help protect the drone when you are close to structures or changes in elevation, and one-click Return to Home acts as a psychological parachute. Knowing you can call the drone back with a single command does a lot to calm the nerves, especially if your last memory of drones involves a near crash.

In the United States, FAA rules treat goggle-only flying as beyond visual line of sight, so you are meant to have a visual observer watching the drone while you are wearing the headset. That nudges the A1 away from solo, spur-of-the-moment flights and toward planned sessions with someone beside you acting as spotter.

On the imaging side, the A1 records up to 8K 360-degree video, with lower resolutions unlocking higher frame rates when you want smoother motion. Footage can be stored on internal memory or a microSD card, and you can offload it either by removing the card or plugging in via USB-C, so it slips neatly into most existing editing habits.

Vision goggle screen recording

The real leap, though, comes from the goggles. They are the thing that truly sets A1 apart from almost every other consumer drone. Instead of glancing down at a phone, you step into an immersive 360-degree view that tracks your head and surrounds your vision. The drone feels less like a gadget in the sky and more like the spot your eyes and body are occupying. A double-tap on the side button flips you into passthrough view, so you can check your surroundings without pulling the headset off, and a tiny outer display mirrors a miniature version of the live feed for people nearby.

That small detail turned out to be important in Bali, where a group of local kids noticed the goggles and the moving image, wandered over, and suddenly found themselves taking turns “flying” above their own neighbourhood. Their gasps, laughter, and stunned silence were as memorable as the footage itself.

Mobile app

The magic continues even after you land. Because the A1 captures everything in 360 degrees, you can decide on your framing after the flight, which feels a bit like getting a second chance at every shot. Antigravity provides both mobile and desktop apps for this, so you can scrub through the sphere, mark angles, and carve out regular flat videos without having to nail every move in real time.
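
To make that post-flight reframing less abstract, here is a minimal sketch of the standard technique behind it: reprojecting part of an equirectangular 360 frame into a regular flat view. This is not Antigravity’s actual pipeline, just an illustration in Python with OpenCV; the file names and parameters are hypothetical.

```python
# Minimal sketch of "reframing": carving a flat perspective view out of
# one equirectangular 360 frame. Illustrative only, not Antigravity's code.
import numpy as np
import cv2  # pip install opencv-python

def reframe(equi, yaw_deg, pitch_deg, fov_deg=90, out_w=1280, out_h=720):
    """Extract a flat perspective view from an equirectangular frame."""
    h, w = equi.shape[:2]
    f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2)  # virtual focal length

    # Ray through each pixel of the virtual flat camera.
    x, y = np.meshgrid(np.arange(out_w) - out_w / 2,
                       np.arange(out_h) - out_h / 2)
    rays = np.stack([x, -y, np.full_like(x, f, dtype=np.float64)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the rays toward the chosen viewing direction.
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    Ry = np.array([[np.cos(yaw), 0, np.sin(yaw)],
                   [0, 1, 0],
                   [-np.sin(yaw), 0, np.cos(yaw)]])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(pitch), -np.sin(pitch)],
                   [0, np.sin(pitch), np.cos(pitch)]])
    rays = rays @ (Ry @ Rx).T

    # Convert ray directions to longitude/latitude, then to source pixels.
    lon = np.arctan2(rays[..., 0], rays[..., 2])    # -pi .. pi
    lat = np.arcsin(np.clip(rays[..., 1], -1, 1))   # -pi/2 .. pi/2
    map_x = (((lon / np.pi + 1) / 2 * w) % w).astype(np.float32)
    map_y = np.clip((0.5 - lat / np.pi) * h, 0, h - 1).astype(np.float32)
    return cv2.remap(equi, map_x, map_y, cv2.INTER_LINEAR)

# Pick a different yaw/pitch/FOV per keyframe to "re-shoot" the same flight.
frame = cv2.imread("equirect_frame.jpg")  # hypothetical exported 360 frame
flat = reframe(frame, yaw_deg=30, pitch_deg=-10)
cv2.imwrite("reframed_view.jpg", flat)
```

This is essentially what the apps do behind their keyframes and swipe-to-pan gestures: the sphere already holds every angle, so framing becomes an editing decision rather than a piloting one.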

Desktop app

If you have used the Insta360 app, the Antigravity app will feel instantly familiar, with similar timelines, keyframes, and swipe-to-pan gestures. Even if you have not, it is straightforward to learn, helped by clear icons and responsive previews. There is also an AI auto-edit mode that can assemble quick cuts for you, which is handy when you just want something shareable without sinking an evening into manual reframing.

In the end, A1’s performance is not just about how long it stays in the air or how many modes it offers. Those pieces matter, and they are solid, but what you remember is the feeling of lifting off inside the goggles and the ease with which you can hand that experience to someone else. It still behaves like a well-mannered compact drone on the spec sheet, yet in use it edges closer to a shared flying machine, one that turns a patch of ground into a small, temporary viewing platform in the sky.

Sustainability

Antigravity does not make any big sustainability claims with the A1. There is no mention of recycled materials or lower-impact manufacturing, and the packaging and hardware feel very much in line with a typical consumer drone. This is not a product that sells itself on being green, and the company does not pretend otherwise. 

What you do get is some support for repairing rather than replacing. The A1 ships with spare propellers in the box, which encourages you to swap out damaged blades instead of treating minor knocks as the end of the drone. Antigravity also sells replacement lenses, so a scratched front element does not automatically become a total write-off. It is a small step, but it nudges the A1 slightly toward a longer, more fixable life rather than a purely disposable gadget.

Value

The standard Antigravity A1 bundle starts at $1,599, with Explorer and Infinity bundles stepping up battery count and accessories for longer, more serious flying. It is undeniably an expensive system, especially compared to regular camera drones that only give you a phone view.

At the same time, what you are really paying for is the experience of being inside the flight and reframing your shots after the fact. That sense of presence and flexibility is hard to put a number on, and for me, it nudges the A1 from “costly gadget” toward something closer to a priceless experience machine, if you know you will actually use it.

Verdict

Antigravity A1 is not the simplest drone in terms of equipment. You are managing goggles, a grip controller, multiple batteries, and in some places, you also need a visual observer if regulations require it. On top of that, the price sits firmly in premium territory. In return, you get a very different kind of flying. At first, setup and piloting can feel overwhelming, but it becomes natural surprisingly quickly, and there are plenty of automated features to help you keep the drone under control and capture cool shots. Combined with 360-degree capture and post-flight reframing in the Antigravity app, it feels less like operating hardware and more like stepping into a movable viewpoint.

If you just want straightforward aerial clips, the A1 is probably more than you need. If you care about immersive perspective and shared experiences, the mix of kit, software, and feeling it delivers starts to justify the cost. It is fussy, ambitious, and occasionally awkward, yet when you are inside that live 360-degree view, it really does reimagine what a drone can feel like to fly.


This $7,000 Robot Shapeshifts Into 3 Different Machines

Imagine a robot that can transform like a high-tech LEGO set, swapping out legs for arms or wheels depending on what the day throws at it. That’s exactly what LimX Dynamics has cooked up with their latest creation, the Tron 2, and honestly, it’s making me rethink everything I thought I knew about what robots can do.

The Tron 2 isn’t your typical one-trick-pony robot. This thing is basically the Swiss Army knife of the robotics world. Chinese startup LimX Dynamics just unveiled this modular marvel that can morph between three completely different configurations: a dual-armed humanoid torso, a wheeled-leg explorer, or a bipedal walker that can actually climb stairs without making you nervous. And get this, you can switch between these forms with just a screwdriver. No fancy tools, no complicated procedures. Just some strategic unscrewing and you’ve got a whole new robot.

Designer: LimX Dynamics

The company’s demo video starts with something delightfully surreal: just a pair of robotic legs casually strolling along, completely headless and armless. Then, like watching a transformer come to life in real time, those same leg components get repurposed into arms, complete with a head and torso. Suddenly, you’ve got a full humanoid lifting heavy water bottles and showing off its surprisingly impressive strength.

What makes the Tron 2 particularly fascinating is its intelligence layer. This isn’t just a mechanical chameleon. It’s powered by advanced AI and built on what’s called a vision-language-action platform, which essentially means it can see, understand commands, and actually do something useful with that information. The robot comes with a fully open software development kit that plays nice with both ROS1 and ROS2, making it a dream for researchers and developers who want to experiment without fighting proprietary systems.
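
To make the ROS claim concrete, here is a minimal ROS 2 sketch of the kind of integration an open SDK enables: a small rclpy node streaming velocity commands to a mobile base. The /cmd_vel topic name is a common ROS convention assumed purely for illustration; the article does not document the Tron 2 SDK’s actual topics or message types.

```python
# Minimal ROS 2 (rclpy) node that streams velocity commands to a wheeled
# base. The topic name is a common convention, assumed for illustration.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

class WheelTeleop(Node):
    def __init__(self):
        super().__init__('tron2_wheel_teleop')
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.timer = self.create_timer(0.1, self.tick)  # 10 Hz command stream

    def tick(self):
        msg = Twist()
        msg.linear.x = 0.3   # creep forward at 0.3 m/s
        msg.angular.z = 0.1  # with a gentle left turn
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = WheelTeleop()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == '__main__':
    main()
```

The same publish/subscribe pattern extends to arm control and sensor feeds, which is presumably why an SDK that plays nice with ROS appeals to research labs.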

Performance-wise, the specs are genuinely impressive. Each of its dual arms features seven degrees of freedom with a reach of 70 centimeters and can handle up to 10 kilograms of payload together. The wheeled configuration offers about four hours of runtime and can haul around 30 kilograms of cargo, while the bipedal mode excels at navigating tricky terrain like staircases that would leave most wheeled robots stuck at the bottom. The demo footage shows Tron 2 doing things that feel almost show-offy: playing table tennis, performing cartwheels, rolling around smoothly on wheels, and conquering staircases with the confidence of someone who’s done it a thousand times. It’s the kind of versatility that makes you wonder why we’ve been so committed to single-purpose robots for so long.

And here’s where things get really interesting. LimX is positioning the Tron 2 as ideal for future Mars missions. Think about it: on Mars, you can’t exactly call a repair truck when something breaks or send a specialized robot for every different task. You need something adaptable, something that can switch roles as mission needs evolve. The modular design means you could potentially swap out damaged components or reconfigure for different tasks without needing an entirely new robot shipped from Earth.

For research labs, the Tron 2 offers something that’s been surprisingly rare: a flexible test bed that can support multiple types of projects without requiring a whole fleet of different robots. Whether you’re studying manipulation, locomotion, or AI integration, you can configure the same platform to suit your specific needs. Perhaps most surprisingly, this technological marvel starts at just 49,800 Chinese yuan, which translates to around $7,000 USD. For context, that’s dramatically cheaper than many specialized robots that can only do a fraction of what the Tron 2 offers. Pre-orders are already open, though LimX hasn’t fully disclosed all the pricing details or specified exactly who their target customers are.

The Tron 2 represents something bigger than just another cool robot demo. It’s pointing toward a future where adaptability matters more than specialization, where one well-designed platform can handle whatever challenges come its way. Whether it ends up exploring Mars or revolutionizing warehouse operations here on Earth, this shape-shifting bot is definitely one to watch.


Based on sensors in game controllers, this upper-limb wearable robot will help you with your daily chores

One thing exoskeletons have done right is help with motor rehabilitation. Their size and weight have decreased over time, but most of those available are built for rehabilitation, load-bearing assistance, and similar purposes; they are not designed for daily wear. Rather than concentrating on the lower limb, a saturated market, a duo of budding South Korean designers has targeted the upper limb, creating a wearable robot meant for everyday use.

It’s called the Sleev. For now, it hasn’t moved far beyond the drawing board, but judging by how and what it’s projected to be built for, it’s a damn good solution for the purpose. Sleev is designed as a daily upper-limb exosuit (wearable robot). It supports independent arm movement and is effortless to put on and take off: just one hand, no more!

Designers: Youngha Rho and Sungchan Ko

This isn’t the first robotic assistant for the arm we’ve seen. The market is flooded with bulky, inconvenient wearable robots that pack in plenty of technology and robotic sensors but somehow make the wearer feel like a cyborg. With its sleek and lightweight build, the Sleev is conceptualized to change that: a robotic assistant you would actually like to wear. It straps on like any other elbow brace to assist the arm’s movement. Beyond being a crucial option for people recovering from a stroke or a sports injury, the Sleev, with its attractive design, is also meant to augment daily tasks like lifting and carrying; you’ll appreciate it when holding a baby for a long stretch or hauling bags of groceries back home.

As a wearable robot conceptualized to bring exoskeletons into our daily life, the Sleev is also strong and intelligent enough to support rehabilitation activities. To that end, the design integrates FMG (force myography), a method that detects movement intentions through muscle pressure. Muscle pressure varies from person to person with gender, height, weight, and age, so this information about the user is necessary for data accuracy and for the wearable robot to function correctly; a larger database will ensure better results, the designers believe.

By combining FMG with IMU sensors, the designers suggest, the algorithm can work out where the user intends to move and assist them accordingly. Both sensors are affordable and commonly used in game controllers, so they should not be overly expensive if the Sleev finds its way into mass production. Notably, it modulates its movements based on muscle strength and intention. The Sleev doesn’t need to be worn directly on the skin; users can wear it over a thin inner layer and go about their daily activities.
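
As a rough illustration of how FMG and IMU data could be fused into an assistance decision, here is a minimal sketch in Python. The channel values, threshold, and scaling are illustrative assumptions, not Sleev’s actual algorithm, which per the designers would be calibrated against a per-user profile (gender, height, weight, age).

```python
# Illustrative sketch of FMG + IMU intent detection: how hard the muscles
# are working (FMG) plus which way the joint is moving (IMU) yields a
# direction and an assist level. Thresholds and scaling are made up.
import numpy as np

def detect_intent(fmg_channels, gyro_elbow, user_baseline):
    """
    fmg_channels : pressure readings from FMG bands on the arm
    gyro_elbow   : elbow angular velocity from the IMU (rad/s, + = flexing)
    user_baseline: per-user resting FMG level, calibrated from profile data
    """
    # FMG tells us *how hard* the muscles are working relative to rest.
    effort = np.mean(fmg_channels) - user_baseline
    if effort < 0.15:                # below threshold: no assistance needed
        return "idle", 0.0

    # The IMU tells us *which way* the joint is already starting to move.
    direction = "flex" if gyro_elbow >= 0 else "extend"

    # Assist level scales with effort, capped for safety.
    assist = float(np.clip(effort * 2.0, 0.0, 1.0))
    return direction, assist

# Example: moderate muscle effort while the elbow starts flexing.
intent, level = detect_intent(np.array([0.42, 0.38, 0.45]),
                              gyro_elbow=0.8, user_baseline=0.2)
print(intent, level)  # -> "flex" with a proportional assist level
```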
