ICYMI: Submersible sticky situations and elongating elastomer electrodes

Today on In Case You Missed It: Researchers from Purdue University and the Office of Naval Research teamed up to develop a new kind of glue that even works underwater. The synthetic compound is derived from proteins used by mussels to keep themselves...

Georgia Tech receives $900,000 grant from Office of Naval Research to develop ‘MacGyver’ robot


Robots come in many flavors. There's the subservient kind, the virtual representative, the odd one with an artistic bent, and even robo-cattle. But, typically, they all hit the same roadblock: they can only do what they are programmed to do. Of course, there are those that possess some AI smarts, too, but Georgia Tech wants to take this to the next level and build a 'bot that can interact with its environment on the fly. The project hopes to give machines deployed in disaster situations the ability to find objects in their environment for use as tools, such as placing a chair to reach something high, or building bridges from debris. The idea builds on previous work in which robots learned to move objects out of their way, and on developing an algorithm that lets them identify items and assess their usefulness as tools. This would be backed up by programming that gives the droids a basic understanding of rigid-body mechanics and how to construct motion plans. The Office of Naval Research's interest comes from potential future applications working side-by-side with military personnel out on missions, which, along with the iRobot 110, forms the early foundations for the cyber army of our childhood imaginations.
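The pipeline described above, identifying objects and assessing their usefulness as tools before planning motion, could be sketched roughly as follows. Every name, attribute, and scoring heuristic here is an illustrative assumption for the sake of example, not Georgia Tech's actual algorithm:

```python
# Illustrative sketch: rank nearby objects as makeshift tools for gaining reach.
# The object attributes and scoring heuristic are assumptions, not the
# project's real code.
from dataclasses import dataclass

@dataclass
class EnvObject:
    name: str
    height_m: float      # standing height the object would add
    rigid: bool          # rigid-body check: flexible items make poor platforms
    max_load_kg: float   # estimated load capacity

def tool_score(obj: EnvObject, robot_mass_kg: float, reach_deficit_m: float) -> float:
    """Score an object's usefulness for closing a reach deficit (0 = useless)."""
    if not obj.rigid or obj.max_load_kg < robot_mass_kg:
        return 0.0
    # Reward objects that close as much of the deficit as possible.
    return min(obj.height_m, reach_deficit_m) / reach_deficit_m

def pick_tool(objects, robot_mass_kg, reach_deficit_m):
    scored = [(tool_score(o, robot_mass_kg, reach_deficit_m), o) for o in objects]
    best_score, best = max(scored, key=lambda pair: pair[0])
    return best if best_score > 0 else None

objects = [
    EnvObject("chair", height_m=0.45, rigid=True, max_load_kg=120),
    EnvObject("cardboard box", height_m=0.50, rigid=False, max_load_kg=5),
    EnvObject("crate", height_m=0.30, rigid=True, max_load_kg=200),
]
print(pick_tool(objects, robot_mass_kg=80, reach_deficit_m=0.4).name)  # chair
```

In this toy version, the cardboard box is rejected outright by the rigid-body check, and the chair beats the crate because it closes the full 0.4 m deficit; a real planner would then have to generate a motion plan for fetching and mounting the chosen object.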


Georgia Tech receives $900,000 grant from Office of Naval Research to develop 'MacGyver' robot originally appeared on Engadget on Fri, 12 Oct 2012 10:59:00 EDT. Please see our terms for use of feeds.

Permalink: GizMag | Source: Georgia Tech

MIT’s real-time indoor mapping system uses Kinect, lasers to aid rescue workers


We've seen the Kinect put to use to help you find your groceries, but the sensor's image processing capabilities have some more safety-minded applications as well. The fine minds at MIT combined the Kinect with a laser range finder and a laptop to create a real-time mapping rig for firefighters and other rescue workers. The prototype, called SLAM (for Simultaneous Localization and Mapping), received funding from the US Air Force and the Office of Naval Research, and it stands out among other indoor mapping systems for its focus on human (rather than robot) use and its ability to produce maps without the aid of any outside information, thanks to an on-board processor.
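The mapping half of such a system boils down to ray-tracing each range measurement into a grid of free and occupied cells. A minimal sketch of that idea, with the grid size, sensor model, and function names all being assumptions for illustration rather than MIT's implementation:

```python
# Minimal occupancy-grid sketch of the "mapping" half of SLAM.
# Grid resolution and API are illustrative assumptions, not MIT's code.
import math

GRID = 20          # 20x20 cells (assumed resolution)
occupied = set()   # cells where a range return ended (walls, obstacles)
free = set()       # cells a beam passed through

def integrate_scan(x, y, bearing_rad, range_cells):
    """Ray-trace one range measurement into the grid from pose (x, y)."""
    for step in range(1, range_cells + 1):
        cx = x + round(step * math.cos(bearing_rad))
        cy = y + round(step * math.sin(bearing_rad))
        if not (0 <= cx < GRID and 0 <= cy < GRID):
            return
        if step == range_cells:
            occupied.add((cx, cy))   # beam endpoint: obstacle
        else:
            free.add((cx, cy))       # beam passed through: free space

# A wearer at cell (10, 10) facing a wall four cells to the east:
integrate_scan(10, 10, 0.0, 4)
print((14, 10) in occupied, (12, 10) in free)  # True True
```

The "localization" half, estimating the wearer's own pose as they move, is the hard part that the range finder and on-board processor handle in the real rig; this sketch assumes the pose is already known.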



MIT's real-time indoor mapping system uses Kinect, lasers to aid rescue workers originally appeared on Engadget on Tue, 25 Sep 2012 13:16:00 EDT. Please see our terms for use of feeds.

Permalink: ZDNet UK | Source: MIT Computer Science and Artificial Intelligence Lab