Interactive jellyfish robot responds to hand gestures

When I visited Kelly Tarlton’s Aquarium in Auckland last year, one of the most fascinating things I saw was the jellyfish. They were pretty and captivating, and looking at them gave me a certain sense of calm. I still don’t know much about them except for the fact that they don’t have the usual organs we see in animals, like hearts, brains, and even eyes. It would be interesting to know more about these creatures, and this project may just be something that can connect humans more to these “free-swimming marine coelenterata”.

Designer: Adonis Christodoulou

The aim of the project is to establish a sort of communication between humans and jellyfish, even if it’s not the actual animal but an interactive robot driven by machine learning. After going through several prototypes, the designer came up with something that has actuators with reels that are able to wrap around the “tentacles”. The sides also have holes that keep the threads perpendicular to each of the reels. There are four strings attached to a single level of the reels, and the next tentacle is located above the previous one.

The software design is where things get really interesting. The components are connected through communication channels among Wekinator, Processing, and Arduino. They teach the machine hand gestures that are then translated into “emotions” for the jellyfish robot. Raising a hand will induce a calm attitude, while doing the finger heart will make the jellyfish happy. If you want to make it mad, make a fist, and if you want to make it sad, do a thumbs down. Once the robot processes the emotion, it triggers the corresponding movements.
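The gesture-to-emotion-to-movement chain described above can be pictured as a simple lookup pipeline. This is only a hypothetical sketch: the gesture names, emotion labels, and actuation parameters below are illustrative assumptions, not the designer's actual code (in the real project, Wekinator's classification would travel to Processing and then to the Arduino driving the reels and LEDs).

```python
# Illustrative sketch of the described pipeline: a recognized hand gesture
# is mapped to an "emotion", which selects the tentacle motion parameters.
# All names and values here are assumptions for illustration only.

GESTURE_TO_EMOTION = {
    "raised_hand": "calm",
    "finger_heart": "happy",
    "fist": "angry",
    "thumbs_down": "sad",
}

# Hypothetical actuation parameters per emotion: how fast the tentacle
# reels pulse, and which LED color accompanies the mood.
EMOTION_TO_MOTION = {
    "calm":  {"reel_speed": 0.2, "led": "blue"},
    "happy": {"reel_speed": 0.6, "led": "pink"},
    "angry": {"reel_speed": 1.0, "led": "red"},
    "sad":   {"reel_speed": 0.1, "led": "dim_white"},
}

def respond_to_gesture(gesture: str) -> dict:
    """Translate a recognized gesture into motion parameters for the robot."""
    emotion = GESTURE_TO_EMOTION.get(gesture, "calm")  # unknown gestures stay calm
    return {"emotion": emotion, **EMOTION_TO_MOTION[emotion]}
```

In the actual project, the output of a function like this would be serialized over a serial link so the Arduino can drive the servos and LEDs accordingly.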

It’s a bit unclear how this will actually translate to understanding jellyfish more. But through the anthropomorphization of the jellyfish with servos, LEDs, and robotic articulation, they are able to “give life” to this mysterious but interesting species. It’s also cool to see different robotic transmutations, as long as they don’t someday overthrow humans.

The post Interactive jellyfish robot responds to hand gestures first appeared on Yanko Design.

Tesla brings (scary) improvements to Gen 2 of Optimus humanoid robot

It’s frightening to think that it would not surprise any of us if, one of these days, we woke up to find that our new robot overlords had taken over the planet. We’re seeing advances in robotics that make that seem less impossible. We’re still far away from robots becoming sentient beings that will enslave us, though, so for now we can enjoy how these humanoid devices are still being created to help us rather than replace us.

Designer: Tesla

It’s also not a surprise that Elon Musk and Tesla are at the forefront of trying to make these robots better. The latest version of their humanoid robot, the Optimus Gen 2, brings many improvements over their first one, the Bumblebee from back in 2022, and the Optimus Gen 1 from earlier this year. This version received a lot of hardware upgrades, specifically the Tesla-designed actuators and sensors that are now more precise and accurate and have integrated electronics. You also get articulated toe sections based on human foot geometry so it can walk a bit more naturally.

It now also has a 2-DoF actuated neck, so it’s able to move its head in a more human way, which can be amazing or terrifying. Its hands now have 11 DoF and tactile sensing in all of their digits, so it will be able to handle eggs and other delicate things without dropping them. It is also lighter by 10kg and gets a 30% walking speed boost, so it can move around better than its predecessors, although you can still outrun it if needed. Because of these improvements, it has better balance and full-body control, letting it do things like squats.

The Optimus humanoid robot is envisioned to be a helper for humans, taking over some of the monotonous tasks that we would like to escape from. The Gen 2 is still in the prototype phase, though, and there is no news yet on whether Tesla will eventually manufacture and sell it. That gives us time to think about whether we will risk an eventual robot uprising just to take tedious tasks out of our everyday life.

The post Tesla brings (scary) improvements to Gen 2 of Optimus humanoid robot first appeared on Yanko Design.

This robot is an autonomous product designed for enhancing digital interactions like a modern R2-D2!

Imagine if R2-D2 got a 2021 makeover. Well, BEBOP Design did something like that: they took the concept and gave it a sleek makeover to give us all Information Robot! This is an autonomous robot designed specifically for the Korean startup Zetabank, which aims to make human lives safer and healthier with the help of robots.

Zetabank has a range of robots, and this is their second collaboration with BEBOP. The company’s mission is to improve our lives using artificial intelligence. Their Disinfectant Robot, Hospitality Robot, and Untact Robot are all designed keeping in mind how they can maximize utility and bring practicality to make our day-to-day more efficient. Continuing that legacy is Information Robot, which is created as a service platform for digital interactions, building upon the Hospitality Robot’s intelligence. These digital interactions are enhanced by the robot’s autonomous movement in various commercial and residential spaces.

Information Robot not only takes the best from its hospitality counterpart but also includes the best from the Disinfectant Robots. It features tech that enables it to purify the air and it becomes more efficient due to its ability to move around smoothly in large environments so it hits two targets with one arrow! Information Robot maintains a coherent design language while accommodating its unique functional purpose. It presents a friendly and approachable personality while its large display is the focal point that invites people to interact with it. R2-D2 would certainly be envious of the minimal aesthetic and the tech upgrade!

Designer: Soohun Jung, Byungwook Kang, Rich Park, Kikang Kim of BEBOP Design

This smart robot is a playful egg designed to simulate the experience of raising actual pets!

Every time we think of robots, it is a scary visual. When we think of pet robots, it’s usually a faceless dog. But what if I told you that this good egg is actually a pet robot? Well, it is a concept, but a very realistic one at that. As we advance in the world of robotics, designs like Eggo remind us that not all robots are bad, and some can actually be cute!

Eggo’s mission is simple: to give you a robot pet that is always by your side and provides a positive experience. This egg-shaped companion lets you raise a pet online or offline without taking away from the experience. It has a simple design, a minimal interface, and an organic shape that invites interaction. Eggo moves autonomously, grasping the terrain through a camera. The smart pet also automatically goes to charge itself when the battery is low, and I honestly wish my phone did the same thing. Even though it is a robot, designer Hyunjae Tak made sure to include an emotional side, so Eggo can express how it is ‘feeling’ through LED colors, which is extremely important when interacting with children. It uses an inner wheel to move on its own and actually forms a unique personality according to how you take care of it, just as a real-life pet would!
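The behavior described above (roam autonomously, dock when the battery runs low, signal feelings through LED colors) boils down to a small decision loop. The sketch below is purely hypothetical: the thresholds, feeling names, and colors are illustrative assumptions, not anything published about the Eggo concept.

```python
# Illustrative sketch of an Eggo-style behavior loop: self-charge at low
# battery, otherwise roam and show a "feeling" via LED color. The threshold,
# feeling labels, and colors are assumptions made up for this example.

FEELING_TO_LED = {"happy": "yellow", "lonely": "blue"}

def choose_action(battery_pct: float, interactions_today: int) -> dict:
    """Pick the pet's next action from its battery level and recent attention."""
    if battery_pct < 20:
        # Low battery overrides everything: head to the charging dock.
        return {"action": "dock_and_charge", "led": "orange"}
    # A crude stand-in for "personality shaped by how you care for it":
    # more interactions today means a happier pet.
    feeling = "happy" if interactions_today > 3 else "lonely"
    return {"action": "roam", "led": FEELING_TO_LED[feeling]}
```

A real companion robot would accumulate this kind of care history over weeks, which is how a persistent "personality" could emerge from such simple rules.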

You don’t have to always be online to interact with or raise Eggo; it retains everything offline as well, and that helps you build a realistic connection with the product as a pet as opposed to an ‘online game’ feeling (remember Neopets?). Eggo comes with its own smart app, and with the various integrations, you can communicate with it more deeply.

Designer: Hyunjae Tak

Samsung designed public service bots to provide contact-free delivery with a smile!

The world as we know it is changing, and robots are steadily taking over, a sci-fi fantasy that is soon turning into reality. Moving toward a healthier and more hands-off future does mean that technology is taking a lot of the work off our hands. Designers are building robots to take care of public service jobs primarily rooted in collecting, cleaning, and distributing. Recently debuting their public service bot, Samsung’s Bot Public does all of that and even winks.

The designers behind Samsung Bot Public built the public service robot to provide contact-free professional services in public spaces. Samsung Bot Public looks like the public service robot we’ve come to expect based on all the sci-fi movies we’ve seen, capable of shifting into different configurations for different service jobs. The versatile base of Samsung Bot Public works as a universal charging base that can register other mounted domains to complete various tasks throughout the day.

The different domains are configured according to the task at hand: while one domain features a three-tier shelving unit, another features only a display screen and a slim carrying tray. The most sought-after public service bots in circulation today are serving, delivery, and guide robots, and these were the three services prioritized. While the robots are generally programmed and made for free movement, their software is still accessible through external devices and storage systems. And if serving the public weren’t enough on the robot’s plate, each interaction is also met with a wink and a smile.

Designer: Samsung

The Samsung Bot Public’s design hinges on the main base platform, which adapts to each changing domain.

One domain features a large panel display screen, built-in charging port, and small tray table.

Another domain seems to indicate collecting and distribution duties with a three-tier shelving unit and drawers.

Each Samsung Bot Public comes with an emergency stop button and a rear LED display panel.

The original domain’s drawers pull out to create more storage space for collecting and delivering.

With all of this, Samsung Bot Public also ends each interaction with a good ole robot wink.

This fully automated bionic coffee maker is just like a robot straight from The Jetsons!

If you sometimes feel like a robot before your first cup of coffee, you’re in good company. Without even fully opening my eyes, I get my first cup of coffee going for the morning, and while it brews I get myself ready. On good days, I turn my stove off on time, and on other days, I gulp down a burnt cup of coffee. Coffee is a necessary part of the day for a lot of us, and having that perfect cup in the morning might be all we need to get our day off on the right foot. To save us from those ‘other days,’ Beijing’s Orion Star Technology Co. Ltd. recently designed a robotic coffee-making system, the Zhi Ka Master, which was shortlisted at the 2021 iF Design Awards.

Zhi Ka Master is a coffee-making system that employs twin-arm robotics to perform traditional coffee and tea brewing for hand-poured, automated cups of coffee. The entire system comprises a twin-arm, six-axis robot and an accompanying work table. Twin-arm robotic systems are typically chosen for their efficient and automated execution of more involved assembly operations. Through bimanual manipulation, twin-arm robots can perform complicated tasks in a human-like manner. The incorporation of twin-arm robotics and a bionic profile design equips Zhi Ka Master with enough know-how to simulate masterful coffee or tea-making methods with the push of a button. A pre-sized and programmed worktable keeps all the machines and tools necessary to make any drink on a typical coffee menu.

You’re like me if your coffee order comes with some conditions: an extra shot of espresso please and not too much ice. Rest assured, Zhi Ka Master knows how to receive special input for specific coffee orders that veer from the menu. Through integrated software, Zhi Ka Master can make coffee and tea drinks for specific tastes all without human intervention. So maybe, don’t push that button.

Designer: Orion Star Technology Co. Ltd.

Zhi Ka Master is a six-axis, twin-arm robot coffeemaker.

Integrated software adjusts the robot’s mechanical grip to fit whatever item it grasps.

Through a built-in RGB camera, the robot performs duties and responds to feedback in real-time to ensure safe operation.

Equipped with an emergency stop button, Zhi Ka Master prioritizes safety even before coffee.

Zhi Ka Master occupies a total of only three square meters.

Boston Dynamics’ latest robot moves away from biomimicry to design a practical warehouse solution!

Thirty years ago, starting out as a tightknit research company, Boston Dynamics began its quest to create robots that could go where people go, do what people do, and move as people move. Today, a leading engineering and robotics design company, the team behind Boston Dynamics continues to produce and deliver commercial robotics equipped with dynamic control, cutting-edge electronics, and next-generation software. Designed for easy rollout servicing in existing warehouses, Stretch is Boston Dynamics’ latest mobile, automated case-handling robot.

In appearance, Stretch resembles an excavator or backhoe construction truck, with a solid, bottom-heavy base and tensile robotic arm. Filled out with four small wheels for tight turning and lots of movement, Stretch’s mobile base is capable of sliding in every direction and designed to allow the fuller robot to fit anywhere a pallet fits. The long robotic arm provides plenty of reach and height with seven degrees-of-freedom, granting Stretch access to cases and shipping goods throughout any freight space or pallet.

At the end of Stretch’s robotic arm, a smart gripper embedded with sensors and active controls grants Stretch the handling mechanisms to grasp a wide array of package types. Keeping the whole operation going throughout the workday are high-capacity batteries and an advanced perception mast for long-lasting, precise, and stable power. Speaking of how Stretch differs from the currently saturated field of truck unloading robots, palletizing and depalletizing robots, and mobile bases with arms, Kevin Blankespoor, Boston Dynamics’ VP of Product Engineering and chief engineer for both Handle and Stretch, says: “Stretch is built with pieces from Spot and Atlas and that gave us a big head start. For example, if you look at Stretch’s vision system, it’s 2D cameras, depth sensors, and software that allows it to do obstacle detection, box detection, and localization. Those are all the same sensors and software that we’ve been using for years on our legged robots. And if you look closely at Stretch’s wrist joints, they’re actually the same as Spot’s hips. They use the same electric motors, the same gearboxes, the same sensors, and they even have the same closed-loop controller controlling the joints.”

While Stretch is still a prototype, the wheeled robot is the commercial evolution of an earlier Boston Dynamics model called Handle. Stretch currently handles unloading and building applications for trucks and warehouses, with truck loading planned for the future. While the team behind Stretch has yet to name a price, Boston Dynamics is working to make the case-handling robot compatible with other warehouse systems.

Designer: Boston Dynamics

Four wheels fill out Stretch’s mobile base, allowing it to fit anywhere a shipping pallet fits.

Smart gripping technology allows Stretch to reach for and take hold of a multitude of varying package types.

Stretch’s lengthy robotic arm grants the robot access to packages throughout the warehouse and full extension for easy rollout.

The team at Boston Dynamics equipped Stretch with seven degrees-of-freedom, providing plenty of reach and height.

Stretch was designed for warehouse case-handling and truck unloading.

Microsoft’s Azure cloud platform will be the brain for their future autonomous robots!

Autonomous robotic systems are paving the way for the future, sometimes literally. Everywhere, it seems, new autonomous robots are cleaning airport gates and taking our orders at restaurants. Robots are managing real-life situations for the very species that created them, a major technological upheaval that Microsoft, with its Azure cloud computing platform, is helping to move forward. Mohamed Halawany’s Microsoft Azure Robot design was recently recognized on Behance for its intricate render and friendly, futuristic personality.

Depending on what each situation needs, Azure will compute in real time on the robot to deliver help quickly and without latency. Azure is run by artificial intelligence that provides users with a customizable space to work, save, share, and connect through multi-functional, serverless software. Halawany’s robot render for Microsoft Middle East mimics the physical disposition of a human being but operates through AI on a serverless, scalable platform. The main screen on Halawany’s rendering is the classic homepage for Microsoft users: a grid lined with applications and software analytics that organizes everything a user might need to put the functions of an autonomous robot to use.

The potential and capabilities of autonomous robots change with different users’ needs. The robot’s operating system adheres to each individual’s inquiries and predicts how each user might interact with AI technology based on previous use. The cloud program, Azure, offers cross-platform collaboration, real-time speech translation, face-tagging analytics, voice-identification capabilities, and diagnostic monitors that take care of software trouble, among other features. This technological adaptation to distinct human needs is the result of detail-oriented software development that is ever-changing and expansive with consistent use. Additionally, the exterior of Halawany’s robot is customizable according to the user’s preferences. In a world that will soon prioritize the development of AI technology, the creativity in design that brings software and impressive technology into focus will be what bridges emotion with robotics. Check out Mohamed Halawany’s render on Behance and scroll through the photos below.

Designer: Mohamed Halawany x Microsoft

Fuseproject’s new robot Moxie is what happens when AI meets Pixar

With all the kids staying home, parents are running out of ideas to keep them occupied and, most importantly, emotionally healthy during these crazy times. Some are lucky to have siblings or a pet, but there are many kids out there without a companion who isn’t busy working from home. Fuseproject has designed a brilliantly adorable robot, Moxie, who is going to be your child’s new best friend! Trust me, parents are going to love this one (hint: homeschooling feature), and even though I am not a parent or a child, I really would like to have Moxie too so it doesn’t get too lonely in quarantine.

Moxie was born to give children (between the ages of 6 and 9) an emotionally aware pal who also comes equipped with teaching capabilities, a dream come true and even more so given the current times. Technology and design are proving to be the saving grace not only for essential workers but also for everyone staying home. Fuseproject is taking this opportunity to create positive human-machine interaction for the next generation. “Moxie, a revolutionary animate companion designed in partnership with Embodied, Inc. that promotes social, emotional, and cognitive learning for children across the ability spectrum—from neurotypical to neurodivergent—through play-based learning and interaction,” says the team. It is built to be so smart that children can even whisper secrets to Moxie through the teardrop-shaped ears and microphones on either side of its head, one of its most distinctive physical traits and one that encourages a meaningful bond. The Pixar-ish robot is backed by an all-star group of AI investors, including Amazon, Intel, Sony, and Toyota, and Chief Creative Officer Craig Allen previously worked at Jim Henson and Disney.

The $1,500 price tag comes with the skills of experts in child development, engineering, technology, game design, entertainment, and designers who poured in their best to create Moxie. This resulted in immaculate features like facial expression, overall shape, color palette, and exterior materiality to encourage prolonged interest and optimize fluid social interaction. The team focused on balancing the details to inspire engagement and make sure children resonated with the robot. “All its expressive and endearing features—including the ears, head/playful projection tip, speakers, and arm/hand/finger details—come together to tell an otherworldly story about Moxie as a friendly companion from The Global Robotics Laboratory (G.R.L.) sent out as a robot ambassador with a mission to learn what it means to be a good friend to humans,” elaborates the team on why they chose the forms and functions they did for the robot. The expressions truly are so cute it almost looks like a cross between a robot and an elf!

Moxie radiates a playful persona with a positive character, unlike most AI robots we’ve seen so far. Children will learn and safely practice essential life skills such as turn-taking, eye contact, active listening, emotion regulation, empathy, relationship management, and problem-solving. Sure, parents can teach that too – but is a child more likely to listen to parents instructing them or through a smart ‘toy’? Children are always curious about the world around them, now more than ever they have more questions about it. Moxie steps into the picture to help them comprehend the marvels of technology at their level of understanding. Will Moxie be the robot that changes how we feel about their kind?

Samsung’s robot dog concept lacks puppy eyes but still chases balls!

We are in 2020, and so far it has been one crisis after another. What truly gives me hope is the time I spend with my dog. It is a moment away from the chaos, and I often wonder if my dog knows what is happening around us. Do our pets have a sixth sense for disasters as they do for our emotions? Dogs are among the most intelligent domestic companions to have, and our furry friends truly form an inexplicable bond with us. Which leads me to this: what is the future of robot dogs?

Devoid of real feelings but with advancing AI, they will surely be smarter and more efficient. Dog bots will probably be a hybrid of a smart pet and a household assistant; I imagine features like security cameras for eyes while still being sweet enough to bring you your newspaper and wake you up in the morning. Dog bots may have the benefit of being low maintenance: they won’t require mandatory walks on days when you’re sick or make you panic if you forget to leave their food out during emergencies. And as you can guess, they definitely won’t be troublesome during bath time.

So, for the future, it actually sounds like a practical option, because AI will be able to mimic a dog’s behavior closely. But what about our conditioned behavior towards dogs? This conceptual Samsung dog bot replaces the dog’s features with a screen, so instead of a confused head tilt, the face (a.k.a. the screen) of the robot will show you a question mark. If the tech giants are to make a robot dog, using a screen as an interactive interface will save a lot more material than using plastic-like materials to replicate the real build of your pet. With the rapid rate at which AI is growing, and as the conceptual renders show, the dog bot will be able to chase balls and give you a leaping welcome when it senses your arrival. It is interesting how the design is so futuristic, and yet when you look at it, you can tell it was made to resemble a dog. You may not even have noticed that the ‘tail’ is missing, and yet our brains have evolved to associate emotion with robots.

For most of us, our dogs are considered family. The strong bond teaches us a lot about our own emotions: caring for another at all times, communicating without an actual language, and the invincible power of puppy eyes! While technology can make robots so realistic that we start questioning what is real, something like a dog placing his head on your lap can never be replicated by a bot, right? Let’s take a pawse (see what I did there?) and think about what life will be like if Samsung were to make a robot dog like this one.

Designer: Gaetano De Cicco