The COVID-19 pandemic is still far from over, and social distancing remains the health protocol to follow. In the wake of this new normal, and to enhance the showroom experience, Hyundai Motor Group has built DAL-e, an AI-powered robot whose name is an acronym for “Drive you, Assist you, Link with you-experience.” The four-foot-tall robot boasts facial recognition, omnidirectional movement, and an advanced communication system with language comprehension, letting it interact with customers in a way that’s welcoming and intuitive. If you think it looks a lot like the lonely WALL-E, you’re not alone – it does seem inspired by the cute robot, and even the name is eerily similar.
To kick things off, DAL-e debuted yesterday at a Hyundai Motor showroom in southern Seoul for a pilot run. If all goes as planned and the robot keeps learning the tricks of the trade, so to speak, it will be deployed in more Hyundai and Kia showrooms. For now, the cute robot escorts customers to their intended spots, and its facial recognition tech can detect when a customer isn’t wearing a mask and advise them to put one on. The combination of emotive physical features and prompt dialogue delivery makes DAL-e a very welcoming assistant for showroom visitors. As the intelligent robot guides you through the showroom, thanks to its omnidirectional movement capability, you can get more insight about the products from the touchscreen display on top of its head. The robot can also connect wirelessly to the showroom’s large display to show detailed product information, and it can move its arms to emphasize a point or make welcoming gestures. To add to the human element, DAL-e has a few more tricks up its sleeve – it can even ask visitors to take selfies with it.
According to Dong Jin Hyun, Vice President and Head of the Robotics Lab at Hyundai Motor Group, DAL-e will “provide fresh, pleasant experiences to our valued customers in a contact-free environment,” improving over time through software updates and its AI learning algorithm. “Our objective is to enable the DAL-e to engage in a smooth and entertaining communication with customers and present valuable services to them.” To that end, the AI robot looks like a valuable proposition for enhancing the whole experience of buying a car.
Robot dogs have come a long way since Sega Toys’ Poo-Chi hit the scene. I still remember the day I got my Poo-Chi, whose digital bark soon turned into what sounded like a chain-smoking robot’s panic signal. Since its debut in 2000, Poo-Chi, along with many other robotic dogs, has seen major modifications and upgrades. Now, the world’s first decentralized-AI robotic dog has been unveiled at CES 2021 by KODA, Inc. Designed to offer both emotional companionship and practical, physical support, KODA, Inc.’s DAI (decentralized AI) robotic dog “is the perfect combination of function and performance,” as KODA, Inc. CEO Emma Russell puts it.
Unlike the Poo-Chi, which couldn’t even hold its note singing “Ode to Joy,” KODA, Inc.’s robotic dog comes with four 3D cameras, a 13-megapixel front-facing camera, an 11-teraflop processing unit, and an ergonomic body that incorporates realistic dog-like features such as a purely aesthetic tail. Fourteen high-torque motors, including two in the neck, give it full-range mobility for activities like climbing stairs or trudging through snow. Since KODA, Inc. is dedicated to providing technology-based solutions for everyday problems, chronic or otherwise, its secure blockchain network of robot dogs is closely monitored and cross-checked for consistent, effective AI improvements. For instance, a KODA robot dog in Detroit might be the first to slip on a patch of ice, but thanks to a “futureproof” supercomputing network, robot dogs living in warmer climates will know to avoid slipping on ice, even if their home turf never freezes over.
The development of decentralized artificial intelligence is integral to the success of robot-operated emotional and physical support products. Decentralized AI essentially equips the built-in software to solve reasoning, planning, learning, and decision-making problems that a centralized AI cannot handle on its own. By endowing the robotic dog with decentralized AI capabilities, KODA, Inc. provides a smart robotic companion that can offer care and guidance for several purposes, including but not limited to simple companionship, walking guidance for blind users, protective service as a tech-savvy guard dog, and duty as an animalistic personal assistant capable of solving ordinarily complex issues.
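The fleet-wide learning KODA describes – one dog slips on ice, and every dog benefits – resembles what machine-learning practitioners call federated learning. Here is a minimal, purely illustrative sketch of the idea: each robot learns from local experience, then the fleet averages its learned parameters so a lesson travels from Detroit to dogs that have never seen ice. All names and the averaging scheme are my assumptions, not KODA’s actual system.

```python
# Illustrative sketch of fleet-wide (federated-style) learning.
# Every identifier here is hypothetical, not KODA's real software.
from dataclasses import dataclass

@dataclass
class RobotDog:
    name: str
    ice_caution: float = 0.0  # learned weight: higher = more careful on slick surfaces

    def experience_slip(self) -> None:
        """Local learning: slipping raises this robot's own caution."""
        self.ice_caution += 1.0

def federated_average(fleet: list["RobotDog"]) -> None:
    """Share learning: every robot adopts the fleet-wide average,
    so one dog's lesson reaches dogs in climates that never freeze."""
    avg = sum(dog.ice_caution for dog in fleet) / len(fleet)
    for dog in fleet:
        dog.ice_caution = avg

fleet = [RobotDog("detroit"), RobotDog("phoenix"), RobotDog("miami")]
fleet[0].experience_slip()   # only the Detroit dog actually slips
federated_average(fleet)     # the lesson propagates to the whole fleet
print(round(fleet[2].ice_caution, 2))  # the Miami dog is now cautious too
```

Real systems would exchange model updates over a network rather than averaging in one process, but the propagation principle is the same.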
Designer: KODA, Inc.
What you’re looking at is an artificial intelligence-driven furry electronic pet that can express its own emotions via movement and sound. The Moflin’s AI allows its emotions to constantly change based on its environment – much like a real pet’s, except you’ll save a small fortune on pet food. Originally launched as a Kickstarter project, the campaign was a success, and Moflins will be available later this year for around $400. Still, no word if they’ll breed as quickly as Tribbles.
A Moflin constantly scans its surroundings with its sensors and uses its own interactions to determine patterns and respond accordingly, with “an infinite number of movement and sound combinations” available to express its feelings. Not bad. For reference, I’m only capable of grunting and shaking my head no.
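That sense-then-express loop – sensors feed an evolving internal state, which then drives a movement-and-sound response – can be sketched in a few lines. This is purely my illustration of the general pattern; nothing below reflects Moflin’s actual firmware, and all names and thresholds are invented.

```python
# Hypothetical sketch of a sensor-driven emotion loop for a pet robot.
class EmotionModel:
    def __init__(self) -> None:
        self.mood = 0.0  # -1.0 (uneasy) .. +1.0 (content)

    def sense(self, stimulus: float) -> None:
        """Blend a new stimulus (-1..1) into the running mood,
        so the emotional state drifts with the environment."""
        self.mood = 0.8 * self.mood + 0.2 * stimulus
        self.mood = max(-1.0, min(1.0, self.mood))

    def express(self) -> str:
        """Pick a movement/sound pairing based on the current mood."""
        if self.mood > 0.3:
            return "happy-wiggle + chirp"
        if self.mood < -0.3:
            return "curl-up + low hum"
        return "slow-sway + soft purr"

pet = EmotionModel()
for _ in range(10):
    pet.sense(1.0)  # repeated gentle petting
print(pet.express())  # sustained positive input yields a happy response
```

A production device would blend many sensor channels and far richer output combinations, but the feedback structure is the same.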
Pretty cool, but if they really wanted to sell these things they should have scored the Star Wars licensing rights and made them look like Baby Yoda. You wouldn’t be able to keep them on store shelves! They’d be this year and every year’s must-have Christmas gift. Wait – did I just come up with a multi-million dollar idea? We need The Mandalorian’s blessing, STAT. Somebody call him, tell him The Child is in trouble.