LG developed a two-legged AI-powered robot that can watch your pets for you

LG is going to start selling a compact bipedal robot that can roll around your house freely. The AI-powered robot, which will debut at CES 2024 in Las Vegas, has a wide range of capabilities — from notifying you that you left the AC on while you're away to watching your pet while you're at work. Like stationary smart home assistants such as Amazon's Alexa or the Apple HomePod, LG's robot can also tell you the weather and remind you to take your medications on time.

The robot is powered by Qualcomm’s Robotics RB5 Platform, a mix of hardware and software that runs the bot’s AI features, including the ability to recognize faces and voices, process the emotions of those around it and engage in conversation. LG says the bot will be able to greet you at your door, analyze your emotions and play music to either boost your good mood or lull you to sleep. It can even “emote” by changing its posture thanks to its articulated leg joints. It's a cute feature, although it might not have any practical use beyond making the robot more approachable.

The robot is also equipped with a camera in its face, a speaker and various sensors throughout that give it the ability to navigate, speak and listen. It can also measure indoor air quality and temperature; however, it's unclear whether it can actually be linked to a smart home system and control your thermostat. LG has not yet responded to a request for comment on this, and the company has said the price of the robot will be announced at a later time.

Because the bot can move around freely, you can program it to look after your pets while you're gone and send your smartphone notifications “if any unusual activity is detected.” Using the same monitoring tools, the bot can act as a mobile “security guard” and send you notifications if there is movement in the house while you're away. Or more likely, just let you know you left the kitchen lights on.

We're reporting live from CES 2024 in Las Vegas from January 6-12. Keep up with all the latest news from the show here.

This article originally appeared on Engadget at https://www.engadget.com/lg-developed-a-two-legged-ai-powered-robot-that-can-watch-your-pets-for-you-192034931.html?src=rss

Duck Look-Alike Robots Will Revolutionise Waste Management And Help You Keep The Society Clean

In the rapidly evolving landscape of smart cities, the integration of technology into everyday life is becoming increasingly prevalent. One innovative solution to the challenges of urban waste management is “Qua” – a system of smart baskets designed to change the paradigm of garbage collection through a bio-inspired and playful approach to social robotics.

Designer: Luca Fiorentino

The creators of Qua have recognized the importance of seamlessly integrating robots into daily life. Unlike traditional robotic designs that may feel intimidating or too ‘robotic,’ Qua takes a different approach by drawing inspiration from nature, particularly the graceful and iconic single-file line movement of ducks. This design philosophy aims to make these robots a familiar and affable presence in urban environments, fostering acceptance and reducing the perceived intrusion of technology.

At its core, Qua is a system of autonomous waste collection baskets designed to move independently throughout the city. The baskets are equipped with sensors and artificial intelligence that allow them to recognize when a user needs to dispose of waste. Instead of requiring users to seek out a designated waste bin, Qua takes the proactive approach of coming to users when it detects the need for disposal.

The functionality of the robot extends beyond its bio-inspired design. Once it identifies a user ready to dispose of waste, it autonomously navigates towards them, streamlining the waste disposal process. After facilitating the user in discarding their waste, Qua then autonomously returns to a designated charging hub responsible for waste disposal.

One of the key advantages of these robots is their potential to address the issue of littering, particularly among individuals who may be less inclined to walk the extra mile to find a waste bin. By bringing waste collection directly to the user, it offers a convenient and accessible solution to urban waste management. This is particularly beneficial in encouraging responsible waste disposal practices and minimizing littering in public spaces.

However, it’s essential to consider the potential downside of such technology. As noted, Qua could inadvertently cater to the convenience of those who are lazy or unwilling to make the effort to dispose of waste properly. It serves a larger purpose for society, but it raises important questions about the role of technology in shaping behaviors and the need for a balance between convenience and responsibility.

Having said that, Qua represents a groundbreaking approach to waste management in smart cities. By combining bio-inspired design with advanced robotics, Qua aims to redefine the relationship between technology and urban living. As with any technological advancement, it’s crucial to consider the societal implications and strive for a balance that promotes convenience without compromising responsible behavior. The future of waste management may indeed be shaped by innovations like these robots, where technology not only serves a functional purpose but also harmoniously integrates with the natural flow of city life.

The post Duck Look-Alike Robots Will Revolutionise Waste Management And Help You Keep The Society Clean first appeared on Yanko Design.

Intelligent robot can become your child’s protector, teacher, playmate

While we adults (or maybe it’s just me?) may have recurring nightmares about our robot overlords one day rising up against us, most children don’t have such bad thoughts about robots. In fact, they would probably enjoy having one as a companion, especially if they don’t have any playmates yet. A concept for an “intelligent social robot” is more than just a companion for your child and may even replace you one day.

Designer: Igor Jankovic

Okay, that last part is just a joke (or is it?) but the concept for a robot called Sipro is one that will help parents be more productive, according to the designer. From the description itself, this robot will be able to do so many things for your child, from being a protector to a teacher to a playmate. It’s like a smart device, baby monitor, and babysitter all rolled into one robotic entity that will hopefully not take over your household.

The device has four microphones that are able to detect voices, plus various sensors that help it move around the room. It will send a notification to the parent when the baby is crying, or it may actually try to stop the crying by playing music, reading a story, or even playing the recorded voice of the parent if they’re not around. It also serves as an air purifier and has a temperature sensor that can adjust the air conditioner or detect smoke or fire.

On the fun side, it can also project videos and photos on the wall, help your child with speech lessons, and even serve as a baby walker, since its self-balancing system lets a small child ride on it. On paper, this robot can do a lot, so if it is ever translated into an actual product, it could be pretty interesting (and maybe a bit terrifying).

The post Intelligent robot can become your child’s protector, teacher, playmate first appeared on Yanko Design.

The DJI FPV2 ‘hybrid’ drone can race as well as take aerial photos with its Hasselblad camera system

After years of developing some of the world’s leading aerial drones, DJI debuted the Avata last year, its compact ‘FPV’ racing drone… and that got designer Kim Seung-cheol asking – Why must there be separate drones for aerial photography and first-person racing? Why can’t one drone successfully do both? To that end, the DJI FPV2 does the unthinkable by being the world’s first ‘hybrid’ drone capable of FPV racing as well as stabilized aerial photo and videography, thanks to a clever design that borrows the best from both worlds.

Designer: Kim Seung-cheol

The FPV2 drone doesn’t have a radically different design, but rather relies on a few tweaks to its appearance and control system to give it the power of rapid directional flight as well as controlled hovering for stable videography. It uses the slanted propeller format that’s ideal for FPV-style racing drones: the propellers are mounted diagonally, making the drone look like it’s bending forward. This is perfect for letting the drone lunge ahead as it takes off, giving it a significant advantage when racing with other drones or when trying to reach high speeds. For aerial photography and videography, however, the drone simply leans backwards, bringing the propellers parallel to the ground, while the gimbal-mounted camera compensates for any remaining tilt.

As an obvious upgrade to its Avata and Mini lines, the FPV2 has a new dual-lens camera system powered by Hasselblad (a partnership continuing from their collaboration on the Mavic 3). Quite similar to the Air 3 drone, the FPV2 has a dual-lens gimbal-mounted camera that can look in all directions for filming sceneries, focusing on subjects, and racing. This doesn’t include the multiple cameras located around its periphery for tracking its environment, avoiding objects, and navigating routes.

A large, easily replaceable battery powers the FPV2, allowing you to quickly hot-swap modules to keep your FPV2 running without downtime for charging. The battery’s mass and its rear location help it counterbalance the drone’s forward-leaning stance, or rather the inverse. The drone races forward with a raised back, preventing the battery pack from dragging it down or influencing its course.

To accompany the drone, Kim Seung-cheol also designed a new set of MR goggles and a controller handle. The compact goggles come with their own pass-through cameras and sport flip-out antennas for better signal during flight (especially FPV racing). A cushioned headrest with a built-in battery balances the headset while also ensuring you can wear it for longer hours without feeling any strain.

Given the immersive nature of the MR headset, the FPV2 also comes with its own RC Motion 2-inspired handheld control that you can intuitively use to maneuver your drone while in flight. The single handheld controller has a gyroscope that detects when it’s being tilted forward or backward, translating that into instructions for the drone to follow. A trigger lets you accelerate, while a joystick gives you more precise control. A large button on the front marked M lets you alternate between racing and aerial modes.
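
None of the controller's internals are public (this is a concept design), but the tilt-to-command mapping described above is easy to sketch. In the toy version below, the axis names, deadzone, gains and top speed are all invented for illustration:

```python
from dataclasses import dataclass

MAX_SPEED_MS = 27.0   # assumed top speed for the racing mode
DEADZONE_DEG = 3.0    # ignore tiny hand tremors near level

@dataclass
class ControllerState:
    pitch_deg: float  # forward/backward tilt of the handle
    roll_deg: float   # left/right tilt
    trigger: float    # 0.0 (released) to 1.0 (fully squeezed)

def tilt_to_command(ctrl: ControllerState) -> dict:
    """Map tilt angles to normalized pitch/roll commands and the
    trigger to forward speed, with a small deadzone for stability."""
    def axis(angle_deg: float, full_scale_deg: float = 45.0) -> float:
        if abs(angle_deg) < DEADZONE_DEG:
            return 0.0
        return max(-1.0, min(1.0, angle_deg / full_scale_deg))

    return {
        "pitch": axis(ctrl.pitch_deg),
        "roll": axis(ctrl.roll_deg),
        "speed_ms": ctrl.trigger * MAX_SPEED_MS,
    }

cmd = tilt_to_command(ControllerState(pitch_deg=20.0, roll_deg=1.0, trigger=0.5))
print(cmd)  # pitch ~0.44, roll 0.0 (inside deadzone), speed 13.5 m/s
```

The deadzone is the detail that matters most in practice: without it, every tremor of the hand would translate into drift, which is why motion controllers like DJI's RC Motion line pair tilt input with an explicit trigger for acceleration.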

What really gives the FPV2 its edge is the case it comes in, which doubles as a massive battery pack for the drone, controller, and MR headset. Think TWS earbud charging case but bigger and better. Designed to hold your gear when not in use, the carrying case also juices your device batteries while giving you a battery status indicator in the bottom right corner, so you know which particular gizmo needs a recharge.

What the DJI FPV2 proposes isn’t too radical. Some drones are built for racing, others for stabilized content creation… so why not build a drone that can do both? It’s not like the hardware is massively different between the two drone types, and as far as the overall design goes, I’m sure both functions could be achieved within a specially tuned form factor. Maybe DJI is working on something like this; it’s difficult to tell. The company hasn’t yet debuted a Gen-2 of its Avata FPV drone, so we’re due for an updated racing drone from the consumer/professional-grade UAV manufacturer.

The post The DJI FPV2 ‘hybrid’ drone can race as well as take aerial photos with its Hasselblad camera system first appeared on Yanko Design.

AI Powered Robot Completes Marble Labyrinth In Near Record Time

CyberRunner is a Labyrinth marble game modified to be played by autonomous AI, using two motors as hands, a camera for eyes, and a computer for its brain. It does not have a sense of humor though, and clearly became frustrated when I kept replacing its marble with a piece of chewed gum.

After six hours of model-based reinforcement learning, the robot was able to complete the full maze in just 14.69 seconds — within a second of the human world record. That’s impressive… and with only six hours of practice! Imagine what it could do with a full week.

During the robot’s learning, it found several shortcuts it could take to bypass part of the maze, but the humans behind the project stepped in to force the robot to follow the whole path. Me? I always try to jump the ball straight from start to finish in one violent leap.

[via TechEBlog]

Former Toy Designer Constructs a Giant Furby: XL Sized Creepy

Tasked with making a toy for his Makers’ Secret Santa gift exchange recipient, Look Mum No Computer, former toy designer James Bruton decided to construct a giant-sized version of a Furby. The XL Furby features a regular-sized version living inside its chest and has a moving body and eyes that run on a loop, as well as 16 different sound effects, hopefully none of which are, “I’m coming to get you.”

When the Furby’s motion detector is activated, it performs a different series of moves and sound effects, so its actions appear random. That’s great news because the last thing I’d want is a predictable giant Furby in my living room. I like to be kept on my toes.

The fact that people like James have the ability to imagine a giant-size Furby and then actually successfully design and build one never ceases to amaze me. I’m great at imagining things, but turning that idea into an actual physical manifestation is the tricky part. And by tricky, I mean next to impossible, especially if electronics are involved.

Tesla brings (scary) improvements to Gen 2 of Optimus humanoid robot

It’s frightening to think that it would surprise none of us if, one of these days, we woke up to find that our new robot overlords had taken over the planet. The advances we’re seeing in robotics make that seem less and less impossible. Still, we’re far away from robots becoming sentient beings that will enslave us, so for now we can enjoy how these humanoid devices are being created to help us rather than replace us.

Designer: Tesla

It’s also no surprise that Elon Musk and Tesla are at the forefront of trying to make these robots better. The latest version of their humanoid robot, the Optimus Gen 2, brings many improvements over their first one, the Bumblebee from back in 2022, and the Optimus Gen 1 from earlier this year. This version received a lot of hardware upgrades, specifically Tesla-designed actuators and sensors that are more precise and accurate and now have integrated electronics. It also gets articulated toe sections based on human foot geometry, so it can walk a bit more naturally.

It now also has a 2-DoF actuated neck, so it’s able to move its head in a more human way, which can be amazing or terrifying. Its hands now have 11 DoF and tactile sensing in all of their digits, so it can handle eggs and other delicate things without dropping them. It is also 10kg lighter and gets a 30% walk speed boost, so it can move around better than its predecessors, although you can still outrun it if needed. Thanks to these improvements, it has such improved balance and full-body control that it can do things like squats.

The Optimus humanoid robot is envisioned as a helper for humans, taking over some of the monotonous tasks that we would like to escape from. The Gen 2 is still in the prototype phase, though, and there is no news yet on whether Tesla will eventually manufacture and sell it. That gives us time to think about whether we will risk an eventual robot uprising just to take away tedious tasks from our everyday lives.

The post Tesla brings (scary) improvements to Gen 2 of Optimus humanoid robot first appeared on Yanko Design.

Agility’s Digit warehouse robot understands natural language commands thanks to AI smarts

Agility Robotics shared a demo video Wednesday of one of its Digit robots upgraded with AI. Although that may conjure terrifying pop-culture images of sentient sci-fi machines taking over the world, the demo video reveals something much more pedestrian, if not boring. The bipedal warehouse robot ploddingly works to complete a slightly puzzling task without direct human control or detailed guidance. In the clip, it slowly but successfully interprets and executes the command, “Take the box that’s the color of Darth Vader’s lightsaber, and move it to the tallest tower in the front row.”

The company, which added a “head” and “hands” to Digit earlier this year, pitches the demonstration as a glimpse into how large language models (LLMs) can enhance its humanoid machines. It suggests it’s a natural fit, describing Digit as “a physical embodiment of artificial intelligence.”

“We created a demo space with a series of numbered towers of several heights, as well as three boxes with multiple defining characteristics,” Pras Velagapudi, Agility Robotics’ vice president of innovation and chief architect, explained in the clip. “Digit was given information about this environment, but was not given any specific information about its tasks. Just natural language commands of varying complexity to see if it could execute them.”

Still from a demo video of the Digit robot holding a red crate. An overlay on the lower left reveals the robot’s inner monologue, while one on the lower right shows a 3D map the robot uses for navigation.
Agility Robotics

The clip shows Digit’s beat-by-beat inner monologue overlaid on the screen as the robot thinks its way through the task. For example, “I should find out what commands I can use,” “I should first assess the environment to see what items and locations are available,” and “Darth Vader’s light saber is red, and so the red box (box2) is the one I need to move. The tallest tower in the front row is tower4. I will first move to the location of box2.” 

After moving to the location of the red box, Digit continues. “Now that I’m at the location of box2, I should pick it up,” it reasons. It continues, “Now that I have box2, I should move to the location of the tallest tower in the front row, which is tower4” and “Now that I’m at tower4, I should place box2 on it.” Finally, it declares its work finished: “I believe I have completed the task, as I have moved the box that is the color of Darth Vader’s light saber (red, box2) to the tallest tower in the front row (tower4).”
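
Agility hasn't published the code behind the demo, but the overlaid monologue suggests a familiar pattern: the LLM grounds the command against a structured scene description and emits a sequence of primitive actions. Here is a rough sketch of that grounding step, with an invented scene layout and action names (the real system's interfaces are not public):

```python
# Hypothetical scene description of the kind Digit was reportedly given:
# boxes with defining characteristics, towers of several heights.
scene = {
    "boxes": {"box1": "blue", "box2": "red", "box3": "green"},
    "towers": {
        "tower1": {"row": "front", "height": 1},
        "tower4": {"row": "front", "height": 3},
        "tower6": {"row": "back", "height": 4},
    },
}

def plan(target_color: str) -> list[str]:
    """Pick the box matching target_color and place it on the tallest
    front-row tower, returning the primitive action sequence."""
    box = next(b for b, c in scene["boxes"].items() if c == target_color)
    front = {t: v for t, v in scene["towers"].items() if v["row"] == "front"}
    tower = max(front, key=lambda t: front[t]["height"])
    return [f"move_to({box})", f"pick_up({box})",
            f"move_to({tower})", f"place({box}, {tower})"]

actions = plan("red")
print(actions)
# ['move_to(box2)', 'pick_up(box2)', 'move_to(tower4)', 'place(box2, tower4)']
```

The one step the sketch hardcodes is the world-knowledge lookup (mapping "Darth Vader's lightsaber" to red), which is exactly the part the LLM contributes.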

Agility Robotics, which pitches Digit as sparing human workers from the most taxing physical activities, says it created the demo to “show how LLMs could make our robots more versatile and faster to deploy.” The company is building an Oregon factory to produce 10,000 humanoid robots annually. It has also inked a deal with Amazon for the retailer to test Digit in a Seattle-area facility. Fiction-fueled fears aside, the robots are much more likely to hurt humans by stealing their warehouse jobs than by shapeshifting, murdering innocents or reenacting other Hollywood-fueled dystopian nightmares.

This article originally appeared on Engadget at https://www.engadget.com/agilitys-digit-warehouse-robot-understands-natural-language-commands-thanks-to-ai-smarts-214415066.html?src=rss

Tesla’s latest Optimus robot can handle an egg without breaking it

Tesla has offered a look at the latest version of its Optimus robot. In a new video, the second-gen humanoid machine appears to have greater dexterity than its predecessor, though you’ll likely have to wait quite a while longer before you can pick up one of these to help around the house. Milan Kovac, who works on the Optimus project, noted on X that the footage is in real-time and that there was no CGI involved.

While the previous version of Optimus struggled to walk during a live demo, the latest model is able to move with more grace, perhaps thanks to its Tesla-designed actuators and sensors. The machine has an actuated neck with two degrees of freedom and it's said to be 30 percent faster at walking while mimicking the geometry of human feet.

The second-gen Optimus has a sleeker design and Tesla says it has been able to reduce the weight of the robot by 10 kilograms without sacrificing any functionality. The company claims this model has improved balance and full-body control — it's shown squatting and getting back up in the video.

Among the biggest upgrades are to the hands. Tesla says these now have 11 degrees of freedom and they can move more quickly. Optimus is able to handle objects more delicately, as shown by a demo of it picking up and gently placing down an egg. These all seem like marked improvements over the last iteration of Optimus, which we first saw in September last year.

While the robot looks mechanically more impressive than its predecessor, that's only one piece of the puzzle, as Electrek points out. If the robot is to be used in the real world as a "general purpose, bi-pedal, humanoid robot capable of performing tasks that are unsafe, repetitive or boring" (as Tesla is aiming for), it will need to have a robust artificial intelligence that allows it to operate safely and independently. 

That's likely many years away from becoming a reality, particularly when Tesla has had problems with the AI features of its cars. In fact, the company just recalled nearly every car it has shipped in the US to fix issues with the Autopilot system.

This article originally appeared on Engadget at https://www.engadget.com/teslas-latest-optimus-robot-can-handle-an-egg-without-breaking-it-154610781.html?src=rss

Wheeled quadruped robot can stand up to chuck boxes into bins

While the fear that our robot overlords will eventually take over the planet is still real, we’ve seen advances in robotics that are more helpful for humanity. There are tasks that we would much rather have a robot do for us, like carrying heavy things to avoid injuries (although that may be one of the reasons why the revolution will start). We’re seeing experiments on how to train them to do even more advanced skills so they can eventually take over the world, I mean these heavy, menial tasks.

Designer: Swiss Mile

The ANYmal robot is one such experiment: it can get around either as a dog-like quadruped or by standing up on its hind legs to mimic a human. Last year, it learned to squat back down and stand up on its motorized wheels, and now its creators are experimenting with having it do heavier tasks through something called “curiosity-driven learning”. Basically, the robot gets rewarded when it completes the task it’s given by figuring out how to do it by itself.

In the video they posted showing the ANYmal robot completing the task of putting a package into a bin, it actually lifts the box up and then puts it where it’s supposed to go. However, it seems to just throw it into the bin, like how some baggage handlers supposedly do their job when they think no one is looking. The robot is probably thinking, “Hey, they just told me to put the box into the bin, not to do it carefully and precisely.”

For now, the robot is still a robotics research project exploring things like reinforcement learning and random network distillation. But if its creators do decide to manufacture the robots for industrial and commercial use, it would be interesting to see how a wheeled quadruped with a humanoid form can actually reduce heavy grunt work for humans.
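
Random network distillation, one of the techniques named above, is the mechanism that makes "curiosity" concrete: a frozen, randomly initialized network embeds each state, a second network is trained to predict that embedding, and the prediction error becomes an intrinsic reward that fades for states the robot has already seen. A toy illustration (the dimensions and learning rate here are arbitrary, not Swiss-Mile's actual setup):

```python
import numpy as np

rng = np.random.default_rng(0)
STATE_DIM, EMBED_DIM, LR = 4, 8, 0.05

W_target = rng.normal(size=(STATE_DIM, EMBED_DIM))  # frozen random "target" net
W_pred = np.zeros((STATE_DIM, EMBED_DIM))           # trainable predictor

def curiosity_bonus(state, train=True):
    """Prediction error between predictor and frozen target: novel states
    yield a large bonus, already-visited states a small one."""
    err = state @ W_pred - state @ W_target
    bonus = float(np.mean(err ** 2))
    if train:  # one gradient step on the mean squared error
        W_pred[:] = W_pred - LR * (2.0 / EMBED_DIM) * np.outer(state, err)
    return bonus

s = rng.normal(size=STATE_DIM)
novel = curiosity_bonus(s)            # first visit: large bonus
for _ in range(200):                  # revisit the same state repeatedly
    curiosity_bonus(s)
familiar = curiosity_bonus(s, train=False)
print(novel, familiar)                # the bonus shrinks with familiarity
```

The point of the frozen network is that the reward can never be gamed: the only way to drive the bonus down is to actually visit a state often enough for the predictor to learn it, which is what pushes the robot to keep trying new box-handling strategies.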

The post Wheeled quadruped robot can stand up to chuck boxes into bins first appeared on Yanko Design.