Scientists Remotely Control Cockroaches with a Solar Powered Backpack

An international team of mad scientists at Japan’s RIKEN Cluster for Pioneering Research (CPR) has created cyborg cockroaches capable of being steered remotely by humans. And not only that, but each cockroach is outfitted with a solar-charging backpack and a lithium polymer battery, giving it all the power it needs to keep its steering system running. This will end well. And by well, as usual, I mean badly.

The cyborg cockroaches are controlled via minute electrical impulses to either the left or right side of the abdomen (administered via wireless button press by a human), which causes them to turn in that direction. That’s cool, but don’t even think about steering them in the direction of my kitchen.
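For the curious, the control scheme described above is conceptually simple: a button press maps to a brief pulse on one side of the abdomen. Here's a toy sketch of that idea; the class, function names, and pulse duration are all invented for illustration, not the RIKEN team's actual firmware.

```python
# Toy sketch of the cyborg-roach steering scheme: a wireless button press
# triggers a brief electrical pulse on the left or right side of the
# abdomen, turning the insect that way.

class Stimulator:
    """Records which side of the abdomen was pulsed (stand-in for hardware)."""
    def __init__(self):
        self.log = []

    def pulse(self, side, duration_ms=50):
        self.log.append((side, duration_ms))
        return side

def handle_button(stim, button):
    # "L" steers left, "R" steers right; anything else is ignored.
    if button == "L":
        return stim.pulse("left")
    if button == "R":
        return stim.pulse("right")
    return None
```

So `handle_button(stim, "L")` fires a short pulse on the left side and the roach veers left, at least until it reaches my kitchen.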

The scientists imagine the cyborg roaches being used for worthy causes like search-and-rescue missions, although I have the sneaking suspicion they’ll also be used for unworthy causes, like crawling up my pant leg with one of my friends at the controls.

[via TechEBlog]

Mad Scientists Use Wolf Spider Carcasses as Robotic Grippers

A group of researchers at Texas’s Rice University has developed a method of turning wolf spider carcasses into robotic grippers: the legs open and extend when a small amount of air is pumped into the carcass, and close and grip when the air is drawn back out. The researchers have named their unholy field of experimentation “necrobotics.” Just to be perfectly clear, this is not good news.

In tests, the mad scientists discovered the necrobot spiders could lift more than 130% of their own body weight. They also endured about 1,000 cycles of air application/removal before the spider’s internal tissue began to degrade and, presumably, legs started falling off. They hope that the spiders can last even longer with the addition of a polymer coating, but I hope they abandon the project altogether.

What will they possibly think of next? Honestly, I’m scared to find out. Remember yesterday when you didn’t know anything about necrobotic spider grippers? Those were simpler times, weren’t they? Better times, even. I sure miss those days.

[via NewAtlas]

Researchers Develop Octopus Sucker Glove for Grasping Objects Underwater

Researchers from the Department of Mechanical Engineering at Virginia Tech (my alma mater!), led by Assistant Professor Michael Bartlett, have developed the Octa-Glove, a glove with octopus-like suckers on the fingers designed for firmly grasping objects underwater without requiring grip strength. That’s great news because my grip strength has always been lacking.

The glove features soft sucker-like membranes which, when actuated, attach to objects much like an actual octopus’s suckers, without needing to apply any grip pressure. An array of micro-LIDAR optical proximity sensors detects just how far away an object is, and a microcontroller can activate or release adhesion almost instantly. When reached for comment, Doctor Octopus said he wished he’d thought of this.
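The sense-then-grip loop described above can be sketched in a few lines. This is a minimal sketch, assuming a simple distance threshold per finger; the threshold value and function name are my assumptions, not Virginia Tech's actual control logic.

```python
# Rough sketch of the Octa-Glove control loop: each finger's micro-LIDAR
# proximity reading is compared to a threshold, and that finger's sucker
# adhesion switches on when an object is close enough.

ATTACH_THRESHOLD_MM = 10  # hypothetical engagement distance

def update_adhesion(distances_mm):
    """Return per-finger adhesion state (True = sucker on) from
    LIDAR distance readings in millimeters."""
    return [d <= ATTACH_THRESHOLD_MM for d in distances_mm]
```

Feed it one reading per finger and it tells you which suckers to energize, e.g. `update_adhesion([5, 12, 9])` engages the first and third fingers only.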

The researchers envision the gloves being utilized in future underwater search and rescue missions, presumably rescuing mermaids from the evil grasp of Ursula. But will you be able to fight her organic suckers with robotic ones? Only time will tell, but I imagine Ariel is pretty worried about it.

[via TechEBlog]

Scientists Create Self-Healing Skin For Robots Using Human Cells

Researchers at The University of Tokyo have covered a robotic finger with skin created from actual human skin cells. It’s also capable of repairing itself when a collagen sheet is applied. And repair itself it’s going to need to, because I’m taking that Terminator finger down!

Professor Shoji Takeuchi believes realistic skin is the key to robots becoming human-like enough for society to accept them. I don’t know about you, but I feel like ultra-realistic humanoid robots are the opposite of the direction we should be headed. I think robots should all look like Rosie the Robot from The Jetsons. The key is non-threatening, not lifelike. That’s just creepy.

Takeuchi says that the current silicone skin used for robots just isn’t lifelike enough for humans to foster a kinship with our robotic brethren and plans on adding sweat glands, hair follicles, and fingernails to the robots in the future. And on the day that happens, I’ll be waving goodbye as my rocket blasts off far into space, away from all the hairy, sweaty robots on earth.

[via CNET]

Liquid-Filled Eyeglasses Automatically Adjust Focus: Bye Bye Bifocals!

Have 20/20 vision? That must be nice. My eyes are awful, and if they were any worse I’d be wearing dual eye patches right now. But here to push the envelope in vision correction is University of Utah electrical engineering professor Carlos Mastrangelo and Ph.D. student Mohit Karkanis, who are developing a pair of “smart” eyeglasses that automatically adjust their focus to the distance of whatever a wearer is looking at.

The lenses consist of thin, flexible windows; clear glycerine can be pumped in or out to change their shape and adjust focus based on how far an object is from the wearer’s face. That data is gathered by a distance sensor, and a processor (both housed in the glasses’ thick arms) makes the necessary changes in glycerine volume in the lenses. Goodbye, bifocals! Or, in my case, goodbye, quadfocals!
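The distance-to-focus math behind glasses like these is classic thin-lens optics: the extra optical power needed to focus at a distance is just the reciprocal of that distance in meters. Here's a back-of-the-envelope sketch; the function name and the idea of feeding it straight from the distance sensor are my assumptions, not the Utah team's actual control law.

```python
# Back-of-the-envelope sketch of the adaptive-lens idea: the distance
# sensor reports how far the viewed object is, and the controller picks
# how much optical power (in diopters, where 1 D = 1/meter) the liquid
# lens must add so the image stays in focus.

def required_lens_power(distance_m, base_power_diopters=0.0):
    """Extra diopters needed to focus on an object distance_m away,
    on top of the wearer's baseline prescription."""
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    return base_power_diopters + 1.0 / distance_m
```

Reading a book at half a meter needs 2 diopters of add, while gazing across the room at 4 meters needs only 0.25, which is exactly the range bifocals crudely split into two fixed zones.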

This is definitely a fascinating use of technology and all, but I think I speak for everyone here when I ask: where are the X-ray glasses? I mean, I thought this was supposed to be the future; I should be able to spot winning lotto scratchers without actually having to buy and scratch them first.

The Eyecam: A Webcam That Looks Like a Moving, Blinking Human Eyeball

Because it was inevitable that we’d reach the pinnacle of human achievement at some point, researcher Marc Teyssier has developed the Eyecam, a webcam that resembles a moving, blinking human eyeball. One thing’s for certain: it’s going to be nearly impossible to look away from the camera during Zoom meetings now.

Developed at Saarland University’s Human-Computer Interaction Lab, the Eyecam was designed to make us “speculate on the past, present, and future of technology.” And, I think I speak for everyone when I say if this is the future of technology, maybe 2020 wasn’t as bad as we’re all making it out to be.

The Eyecam uses six servos to replicate the human eye muscles, and the autonomous eye can move both laterally and vertically, with the eyelids closing (and the webcam briefly going dark as a result) and the eyebrow moving. Per Dr. Ian Malcolm in Jurassic Park: “Your scientists were so preoccupied with whether or not they could, that they didn’t stop to think if they should.” Truer words have never been spoken, particularly in the case of human eyeball webcams.

[via The Verge]

Rats Have Learned to Drive Tiny Cars

Apparently, because we don’t already have enough problems with scooters weaving in and out of traffic, scientists are working on a way for rats to drive around in miniature cars now. Yes, they can now literally join the rat race.

Image: Kelly Lambert / University of Richmond

As part of their studies of the cognitive abilities of rats, researchers from the Lambert Behavioral Neuroscience Laboratory at the University of Richmond have built miniature cars that rats can use to drive around. Now, these rats aren’t just driving to work or the shopping mall. They need more incentive than that. Their objective: Froot Loops.

The ratmobiles were built using see-through food containers mounted onto motorized platforms. Each one is rigged up with copper bars that the rats press on to steer the cars left and right, or to drive forward towards the colorful breakfast cereal treat. As scientists placed Froot Loops around the floor, the rats gradually learned to navigate towards their targets.
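The copper-bar controls described above amount to a three-way switch. Here's a toy model of that mapping; the bar names and command strings are invented for illustration, not the Lambert lab's actual rig.

```python
# Toy model of the ratmobile controls: three copper bars act like
# momentary switches, and whichever one the rat presses maps to a drive
# command. No bar pressed means the car stays put.

def drive_command(touched_bar):
    """Translate the bar a rat is touching into a motor command."""
    commands = {
        "left": "turn_left",
        "right": "turn_right",
        "center": "forward",  # straight toward the Froot Loops
    }
    return commands.get(touched_bar, "stop")
```

Press the center bar, drive toward the cereal; let go, and the car stops, which is about as much driver's ed as a rat needs.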

It’s pretty cool that rats can drive cars now, but I won’t be truly impressed until they learn how to drive a stick shift.

[via NewScientist via CNet]

Scientists Develop Butt Scanner Because Fingerprints Aren’t Enough

Our butts: just like our fingerprints, they’re all unique. And now scientists at Stanford University have developed a prototype ‘smart toilet’ that can identify an individual based on their unique, um, analprint. That’s cool, that’s cool, we’re all mature adults here.

The system uses both a traditional fingerprint scanner and an image recognition algorithm that identifies a user’s unique anoderm (the exterior part of the anus), then employs its under-the-seat camera and sensor array to analyze a person’s urine and excrement, evaluating health and flagging potential concerns. You’d think a fingerprint scanner and maybe a voice recognition program or something would have been sufficient to identify a toilet user, but I suppose why not scan the ol’ anoderm for good measure?

The system was developed specifically for being able to identify the different members of a household for separate waste analysis and not as a stand-alone biometric identification system, which is probably for the best since the use of analprint scanners would make identifying yourself for access to an office building significantly more awkward.

[via Vice]

MIT Scientists Working on Tech That Can Manipulate Your Dreams

Dreams are weird enough all on their own. So the idea that technology could be used to influence your dreams seems like it could produce even stranger results. Now, researchers from MIT are developing a system that could do just that.

A team of scientists at MIT Media Lab’s Fluid Interfaces group has come up with a way to monitor one’s sleep cycle and to induce thoughts using a method called Targeted Dream Incubation or TDI. The technique involves the use of a hand-worn sleep tracking device that monitors the subject’s heart rate, skin conductivity, and the position of their fingers to determine when they have entered an early sleep state called hypnagogia. Working in concert with an app, the system delivers audio cues to the subject, then wakes the sleeper with prompts to record what they remembered in a journal.
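The detect-then-cue flow described above can be sketched as a simple state check over the wearable's signals. Everything below is hypothetical: the thresholds, field names, and return values are mine, not the actual logic of the MIT device.

```python
# Loose sketch of the Targeted Dream Incubation (TDI) flow: monitor
# wearable signals, and once they suggest the sleeper has drifted into
# hypnagogia (early sleep onset), deliver the audio incubation cue.

def in_hypnagogia(sample):
    """Crude heuristic: a slowed heart rate, low skin conductance, and
    relaxed (dropped) fingers together suggest early sleep onset."""
    return (sample["heart_rate"] < 60
            and sample["skin_conductance"] < 2.0
            and sample["fingers_relaxed"])

def tdi_step(sample, cue_word="tree"):
    # Deliver the incubation prompt only after hypnagogia is detected;
    # otherwise keep quiet and keep monitoring.
    if in_hypnagogia(sample):
        return f"play audio: think of a {cue_word}"
    return None
```

Run it against each new sensor sample, and the moment the sleeper drifts off, the app whispers "tree" — and a while later, the dream journal gets a tree-shaped car.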

Photo by Andrea Piacquadio

In their study of 25 participants, the researchers found that 67% of their dream reports mentioned some obvious incorporation of the suggestion. Take the example of the word “tree.” The dream journals included descriptions of a tree-shaped car, a shaman sitting beneath a tree, and images of trees splitting into pieces. We’re clearly not talking about the kind of detailed world-building depicted in Christopher Nolan’s Inception, but the technique could be used to help inspire creative brainstorming sessions on a particular topic or to help redirect stressful dreams in a more positive direction.

You can find more detail on the dream manipulation research study and its possibilities over on Live Science and the MIT Media Lab website.

[via adafruit blog]

This Tiny Robot Beetle Runs on Methanol Instead of Electricity

When it comes to robots, most are powered by batteries, which in turn drive servos or other motors. But if you’re trying to build insect-sized robots, it’s tough for them to get around without an external power source. Now, engineers have come up with a bug-sized robot that runs on methanol instead.

Image: Science Robotics

The specific energy stored in fuels like methanol is significantly higher than that of batteries, which means you need less of it to go just as far. With that in mind, Xiufeng Yang, Longlong Chang, and Néstor O. Pérez-Arancibia from the University of Southern California developed a tiny robot that can carry its own fuel, while keeping its size and weight much closer to that of an actual insect. The robot beetle, aka “RoBeetle,” measures just 15mm, weighs just 88mg, and can carry roughly 2.6 times its own weight.

This little dude ambles along using a catalytic artificial muscle that flexes and transmits movement to a leaf spring, which in turn moves its legs. Most of the RoBeetle is made from carbon fiber and polyimide film, both very lightweight materials, and its musculature is made from nitinol, a metal with a sort of “shape memory” depending on whether it’s hot or cold, along with platinum black, which acts as a catalyst for the fuel, causing it to combust and push against the nitinol wire. An additional mechanism controls the methanol vapor on alternating strokes, resetting the system and producing an oscillation between its two states that drives the robot forward. This method of locomotion is extremely efficient, allowing the RoBeetle to walk for hours between refuelings.

There are some limitations to the current design, which only allows the RoBeetle to move forward in a straight line, though you would think creating separate fuel-driven muscles for backward, left, and right motion would be feasible. While this approach makes sense for keeping weight and size down in miniature robots, I wonder if a similar approach could be used for larger robots.

[via Science Robotics via IEEE Spectrum]