Tiny Drone Swarm Navigates Bamboo Forest Autonomously

Because the robotic apocalypse can’t come soon enough for some people, researchers at China’s Zhejiang University have programmed a swarm of small drones to navigate autonomously, avoiding obstacles as they go. In this case, those obstacles are the entirety of a bamboo forest. It’s been real, folks, but there is officially nowhere to run and nowhere to hide.

All ten drones in the army “are equipped with depth cameras, altitude sensors and a small computer, all running a custom algorithm for collision avoidance, coordination, and flight efficiency.” Wow, so not only are they flying around, not crashing into things, but they’re doing it efficiently. The future, ladies and gentlemen! Humanity doesn’t stand a snowball’s chance in the devil’s butt.
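Zhejiang’s actual planner is a full trajectory optimizer and well beyond the scope of this post, but for the technically curious, here’s a toy potential-field sketch of the core collision-avoidance idea: steer toward a goal, get pushed away from anything too close. Every function name and gain here is invented for illustration.

```python
import numpy as np

def avoidance_velocity(position, goal, neighbors, obstacles,
                       safe_dist=1.0, gain_goal=1.0, gain_repel=2.0):
    """Toy potential-field step: attraction toward the goal, plus
    repulsion from any neighbor or obstacle closer than safe_dist."""
    to_goal = goal - position
    velocity = gain_goal * to_goal / (np.linalg.norm(to_goal) + 1e-9)
    for other in list(neighbors) + list(obstacles):
        offset = position - other
        dist = np.linalg.norm(offset)
        if dist < safe_dist:  # push away, harder the closer it gets
            velocity += gain_repel * (safe_dist - dist) * offset / (dist + 1e-9)
    return velocity

# One drone heading for a waypoint, with a swarm-mate and a bamboo stalk nearby
v = avoidance_velocity(np.array([0.0, 0.0, 1.0]),   # drone position
                       np.array([5.0, 0.0, 1.0]),   # goal waypoint
                       neighbors=[np.array([0.5, 0.2, 1.0])],
                       obstacles=[np.array([0.8, -0.3, 1.0])])
print(v)  # desired velocity: toward the goal, nudged away from the neighbor
```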

The drones were allegedly developed for aerial mapping applications, as well as conservation and disaster relief. Maybe they originally were, but all that goes out the window when they become sentient and decide the only disaster that needs relief is the planet wiped clean of humans. Now, if you’ll excuse me, I have a rocket to the moon to build.

[via TechEBlog]

Toyota Patents Dog Walking Robot That Can Pick Up After Your Pet

Because picking up poop is one of the least desirable aspects of dog walking, Toyota recently applied for patents related to a dog-walking robot that can even clean up after a dog takes care of its business. The future, ladies and gentlemen! It finally doesn’t involve me standing around with a plastic bag on my hand, waiting to pick up a turd.

The “guidance vehicle” features a platform that an owner can ride, which moves along a pre-programmed route while constantly monitoring the dog to maintain a safe distance. When your dog pees, it sprays a jet of water to help dilute the urine so it doesn’t kill the grass. And when it poops? It uses a robotic arm to pick up the nuggets so your neighbors don’t yell at you and/or become passive-aggressive.
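Toyota’s patent filing doesn’t include control code, but the “maintain a safe distance” behavior is basically a feedback loop on the leash distance. A minimal sketch, with completely made-up numbers:

```python
def platform_speed(dist_to_dog, target_dist=1.5, max_speed=1.2, gain=0.8):
    """Toy distance-keeping rule: speed up when the dog pulls ahead,
    slow to a stop when it lags, holding roughly target_dist meters.
    All constants are invented; the patent publishes none of this."""
    error = dist_to_dog - target_dist
    return max(0.0, min(max_speed, gain * error))

for d in (0.8, 1.5, 2.5, 4.0):
    print(f"dog at {d} m -> platform speed {platform_speed(d):.2f} m/s")
```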

Will Toyota’s dog walking robot ever see actual production? That’s debatable, although stranger things have happened, including Toyota applying for dog-walking robot patents in the first place. But if it ever does see the light of day, they better call it the Pet Prius.

[via Autoblog]

Skeletonics Kinetic-Energy Exoskeleton: Humans In Disguise

Determined to win the costume contest at this year’s Halloween party? Look no further than the Skeletonics kinetic-energy-powered exoskeleton – perfect for taking your Transformer costume to the next level. The next level being the first-place pedestal at the costume contest, just so we’re clear. I can practically feel that $100 gift certificate to Spirit Halloween in my robotic hands!

Unlike some other exoskeletons, the Skeletonics relies on no outside power source, instead using a wearer’s kinetic energy to mirror their movements on a larger scale – including hand and finger movements like grasping. The whole thing stands approximately 9 feet tall and weighs only 88 pounds, making it easy to strap to the top of your car like you just bagged yourself a Decepticon.

The video demonstration really is impressive, considering the lack of an external power supply. Granted, the Skeletonics exoskeleton might not be capable of picking up a car or battling an alien queen like a Power Loader, but I really don’t want to be battling alien queens anyway – I just want to win a costume contest for once.

[via TechEBlog]

Robotic Hands Taught to Delicately Peel Bananas

What good is a robot servant if it can’t even peel your breakfast banana without smashing it to bits? With that in mind, researchers at the University of Tokyo’s ISI Laboratory have used AI to teach a pair of robotic hands how to delicately peel bananas. What a time to be alive and not a banana!

To achieve banana-peeling success, the researchers first recorded 811 minutes of humans peeling bananas, with the process divided into nine stages, “from grasping the banana to picking it up off the table with one hand, grabbing the tip in the other hand, peeling it, then moving the banana so the rest of the skin can be removed.” The how-to data was then fed to the robots, which can now successfully peel bananas without damaging the fruit a relatively unimpressive 57% of the time. Hopefully, those bananas are going into smoothies, because they certainly wouldn’t pass inspection for banana splits.
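The ISI team’s actual system uses policies learned from that demonstration footage, but the nine-stage structure alone goes a long way toward explaining the 57% figure: chain enough fragile steps together and even a good per-stage success rate crumbles. A back-of-the-envelope simulation, with stage names paraphrased and the per-stage rate invented to make the math come out:

```python
import random

# Nine stages, paraphrased from the paper's description: fumble any one
# of them and the banana is toast (well, smoothie).
STAGES = [
    "grasp banana", "pick up off table", "grab tip with other hand",
    "start the peel", "pull strip one", "pull strip two",
    "reposition banana", "pull strip three", "remove remaining skin",
]

def run_episode(stage_success=0.94):
    """Simulate one attempt: every stage must succeed in sequence."""
    return all(random.random() < stage_success for _ in STAGES)

random.seed(1)
trials = 10_000
wins = sum(run_episode() for _ in range(trials))
print(f"simulated success rate: {wins / trials:.0%}")  # 0.94 ** 9 is about 57%
```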

There was probably a time, just years ago, when scientists thought robots would never be able to peel bananas, but look at us now. Welcome to the future! We may not have hoverboards, but at least we have banana-peeling robots. Just to be clear, I’m rolling my eyes right now in case you couldn’t tell.

[via Laughing Squid]

The Cryptide: A Fully 3D Printed Shoe Inspired by Mythical Beasts

Meet the Cryptide, the brainchild of German designer Stephan Henrich, who set out to design a shoe inspired by cryptids that could be entirely 3D printed. Interesting design parameters. The shoe is 3D printed via selective laser sintering (SLS, in which a high-power laser fuses tiny particles of polymer powder into a solid) using a thermoplastic elastomer (TPE) material, so it isn’t rigid and painful like the entirely-too-small wooden clogs my dad brought me back from a business trip to Holland.

The idea behind the Cryptide is a shoe that can be 3D printed on-demand to fit an individual’s unique feet after taking 3D scans of them. And, I think I speak for everyone here who has two different-sized feet when I say that’s terrific news, because I’m tired of having to buy one pair of 12s and another of 7s just to make a pair that fits.

Stephan says the unique patterns left by the shoe’s soles were inspired by cryptids like Bigfoot and the Loch Ness Monster. The patterns left by my soles? They were inspired by the cheapest pair of shoes I could find on Amazon. I’m just saying, there’s no way they wouldn’t leave marks all over the gymnasium floor and get me kicked out of PE, that’s for sure.

[via TechEBlog]

Liquid-Filled Eyeglasses Automatically Adjust Focus: Bye Bye Bifocals!

Have 20/20 vision? That must be nice. My eyes are awful, and if they were any worse I’d be wearing dual eye patches right now. But here to push the envelope in vision correction are University of Utah electrical engineering professor Carlos Mastrangelo and Ph.D. student Mohit Karkanis, who are developing a pair of “smart” eyeglasses that automatically adjust their focus to the distance of whatever a wearer is looking at.

The lenses consist of a thin window that clear glycerine can be pumped into or out of to change their shape, adjusting focus based on how far an object is from the wearer’s face. That distance is measured by a sensor, and a processor (both housed in the glasses’ thick arms) makes the necessary changes to the volume of glycerine in the lenses. Goodbye, bifocals! Or, in my case, goodbye quadfocals!
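The prototype’s control code isn’t public, but the underlying optics are refreshingly simple: focusing on an object at distance d takes roughly 1/d extra diopters of lens power, and the pump adds or drains glycerine until the membrane gets there. A toy version of the target calculation, with an invented prescription:

```python
def lens_power(object_dist_m, base_power_d=-2.0):
    """Target optical power for an object at the given distance:
    the wearer's distance prescription plus ~1/d diopters of focus.
    The -2.0 D prescription is invented for illustration."""
    return base_power_d + 1.0 / object_dist_m

for d in (0.4, 1.0, 6.0):  # reading distance, desk distance, across the room
    print(f"object at {d} m -> target lens power {lens_power(d):+.2f} D")
```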

This is definitely a fascinating use of technology and all, but I think I speak for everyone here when I ask: where are the x-ray glasses? I mean, I thought this was supposed to be the future. I should be able to spot winning lotto scratchers without actually having to buy and scratch them first.

Qudi LED Emotion Face Masks: Wearing Your Heart on Your Face

Have trouble expressing your emotions? I’m with you; my couples counselor says it’s something I really need to work on. And here to help is the Qudi Mask, a full face mask consisting of translucent ski-style goggles with LED eye rings and a bottom portion with 199 smart LED lights for better expressing yourself. RIGHT NOW, I’M ANGRY. Just kidding, only tired.

The $289 mask is available in black and white and is controlled via a smartphone app that allows you to choose the mode, display, and LED color. In emotion mode, the mask displays the emoticon of your choice (smile, love, shocked, confused, angry, and cat — the most important emotion of all) and animates the mouth to match your speech. The emotions can also be attached to triggers (e.g., nodding yes or shaking your head no) so they can be changed without having to access your phone. That’s a good thing, too, because I can never find mine.
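Qudi hasn’t published its trigger logic, so treat this as a purely hypothetical sketch of how a gesture-to-emoticon table might work, gesture and emoticon names invented:

```python
# Hypothetical gesture-to-emoticon table: head gestures flip between
# presets so you never have to dig out your phone.
TRIGGERS = {"nod": "smile", "shake": "angry", "tilt": "cat"}

def on_gesture(gesture, current="tired"):
    """Return the emoticon to display after a recognized gesture."""
    return TRIGGERS.get(gesture, current)

print(on_gesture("nod"))    # smile
print(on_gesture("blink"))  # unrecognized, so it stays tired
```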

In addition to emotion mode, there are also 25+ preset animations, an equalizer mode that responds to audio, and a text mode that allows you to display any message you want. The Qudi’s battery lasts for between three and four hours of use and takes approximately 1.5 hours to fully charge. How are you going to use yours? I’m going to use mine to improve my relationship with my wife. Mostly by using the cat emoticon and pretending I’m a cat. She loves cats.

[via Man of Many]

Virtual Reality Boots Promise an Even More Immersive Experience

What good is virtual reality if it can still be distinguished from actual reality? So companies are hard at work trying to make the VR experience as immersive as possible. One of those companies is Ekto VR, which is developing the Ekto One, a pair of VR boots that allow a user to actually walk without moving forward. Soon we won’t even have to go outside to experience life!

The boots work via a motion-tracking system that keeps tabs on their movement and location, plus wheels that return the user to their starting position after each step. The Ekto Ones give a user the sensation of walking in virtual reality without the need for a bulky, difficult-to-move omnidirectional treadmill, so you can bring virtual reality with you virtually anywhere.
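Ekto hasn’t published how the boots decide when and how fast to roll, but the core trick is easy to state: measure each step’s displacement, then roll the wearer back by the same amount while the headset keeps the virtual forward motion. A minimal sketch of that bookkeeping, with invented numbers:

```python
import math

def wheel_command(step_dx, step_dy, return_time=0.6):
    """Toy walk-in-place trick: after the tracker measures a step of
    (dx, dy) meters, roll the wearer back over return_time seconds so
    net displacement stays near zero. All numbers invented."""
    speed = math.hypot(step_dx, step_dy) / return_time
    heading = math.atan2(-step_dy, -step_dx)  # roll opposite the step
    return speed, heading

speed, heading = wheel_command(0.7, 0.1)  # a 70 cm step, drifting slightly left
print(f"roll at {speed:.2f} m/s toward {math.degrees(heading):.0f} deg")
```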

Of course, the boots come with their own drawbacks, including taking five minutes to put on and being awkward and weighty enough that early testers have described 20 minutes of use as a workout in itself. Granted, I already consider twenty minutes of regular walking a workout, so I’m really not sure what their point is there.

[via dornob]

Finally, Researchers Teach Goldfish How to Drive a Car and Avoid Obstacles

In long-awaited news, Israeli researchers at Ben-Gurion University have successfully taught goldfish how to steer a vehicle in order to reach a target and receive a treat, using a specially designed FOV (fish-operated vehicle). The future, ladies and gentlemen – we’re finally here.

The FOV is outfitted with a LiDAR (light detection and ranging) system that uses lasers to determine the fish’s location inside the tank and the vehicle’s location on land, with the vehicle moving in the direction in which the fish swims. The researchers say that after just a few days of training, fish were able to consistently navigate the vehicle to the target, regardless of starting point, obstacles such as walls, or the presence of false targets. You know, maybe we haven’t been giving goldfish the intellectual credit they deserve.
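The actual mapping described in the paper is richer (it accounts for the fish’s position and heading near the tank walls), but a crude sketch of the fish-to-wheels idea might look like this, with all thresholds invented:

```python
import math

def fov_command(fish_xy, deadzone=0.05, speed=0.3):
    """Crude FOV rule: when the LiDAR-estimated fish position drifts
    from the tank center, drive the vehicle that way; a fish loitering
    near the center means 'stop'. Thresholds invented."""
    dist = math.hypot(fish_xy[0], fish_xy[1])
    if dist < deadzone:
        return 0.0, 0.0  # no clear intent, vehicle stays put
    heading = math.atan2(fish_xy[1], fish_xy[0])
    return speed, heading

v, h = fov_command((0.12, -0.03))  # fish nosing toward one wall
print(f"drive at {v} m/s toward {math.degrees(h):.0f} deg")
```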

Obviously, the goldfish’s real target is the nearest pond, and there’s no doubt in my mind that once it finally gets its driver’s license, that’s exactly where it’ll be headed. If only it’d been riding shotgun the time I accidentally drove into that pond, I could have saved it the trouble of taking an in-car driving test!

[via The Washington Post]

Lickable Monitor Tastes Like What’s on Screen: Willy Wonka, Here We Come!

Because some people still care about making the future we all dreamed about as kids a reality, professor Homei Miyashita at Meiji University in Japan has developed a monitor that can imitate on-screen flavors, appropriately naming it Taste The TV (TTTV). I just licked my own old television set to test it, but it appears to be a regular TV and not a TTTV. Tastes like static.

Using a carousel of ten different flavor canisters, the TTTV can mix the basic flavor building blocks in different proportions to create a variety of tastes, which it dispenses via spray on a hygienic film overlaying a flatscreen. But do the snozzberries really taste like snozzberries?
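Miyashita hasn’t published the canister lineup or the mixing logic, so consider this a purely hypothetical sketch of the “blend basic flavors in proportion” step, canister names and all:

```python
# Toy flavor mixer: express a target taste as relative intensities of
# the ten canisters, then normalize into spray proportions.
CANISTERS = ["sweet", "sour", "salty", "bitter", "umami",
             "spicy", "fruity", "creamy", "smoky", "minty"]  # names invented

def mix(intensities):
    """intensities: dict of canister -> relative amount. Returns fractions."""
    total = sum(intensities.values())
    return {name: intensities.get(name, 0.0) / total for name in CANISTERS}

chocolate_ish = mix({"sweet": 5, "bitter": 2, "creamy": 3})
print({k: round(v, 2) for k, v in chocolate_ish.items() if v})
```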

Miyashita estimates a retail version would cost around $835 to produce, and I wouldn’t be the least bit surprised to see them in the Hammacher Schlemmer catalog before next Christmas. I only hope they figure out what taste an explosion leaves in your mouth so they can make action movies that much more real.

[via 9gag]