MIT’s real-time indoor mapping system uses Kinect, lasers to aid rescue workers

We've seen the Kinect put to use to help you find your groceries, but the sensor's image processing capabilities have some more safety-minded applications as well. The fine minds at MIT combined the Kinect with a laser range finder and a laptop to create a real-time mapping rig for firefighters and other rescue workers. The prototype, called SLAM (for Simultaneous Localization and Mapping), received funding from the US Air Force and the Office of Naval Research, and it stands out among other indoor mapping systems for its focus on human (rather than robot) use and its ability to produce maps without the aid of any outside information, thanks to an on-board processor.
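SLAM, as the name suggests, interleaves two estimates: where the rig currently is (localization) and what the space around it looks like (mapping). As a toy illustration of the mapping half alone -- assuming the sensor's pose is already known, which the real system does not get to assume -- a single range reading can be folded into an occupancy grid by marking the cells the beam passed through as free and the cell where it stopped as occupied:

```python
# Toy occupancy-grid update from one range reading at a known pose.
# Hypothetical sketch for illustration, not MIT's implementation.
# Cell values: 0 = unknown, -1 = free, 1 = occupied.

def update_grid(grid, x, y, direction, distance):
    """March from (x, y) along an axis-aligned direction, marking cells
    free up to `distance` and the endpoint cell occupied."""
    dx, dy = {"right": (1, 0), "left": (-1, 0),
              "up": (0, -1), "down": (0, 1)}[direction]
    for step in range(1, distance):
        grid[y + dy * step][x + dx * step] = -1   # beam passed through
    grid[y + dy * distance][x + dx * distance] = 1  # beam stopped: obstacle
    return grid

grid = [[0] * 6 for _ in range(6)]
update_grid(grid, 1, 3, "right", 3)  # sensor at (1, 3) sees a wall 3 cells away
```

Repeating that update as readings stream in is what builds the floor plan; the hard part the MIT rig solves on top of this is estimating the moving pose at the same time.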

MIT's real-time indoor mapping system uses Kinect, lasers to aid rescue workers originally appeared on Engadget on Tue, 25 Sep 2012 13:16:00 EDT. Please see our terms for use of feeds.

Via: ZDNet UK  |  Source: MIT Computer Science and Artificial Intelligence Lab

MIT projection system extends video to peripheral vision, samples footage in real-time

Researchers at the MIT Media Lab have developed an ambient lighting system for video that would make Philips' Ambilight tech jealous. Dubbed Infinity-by-Nine, the rig analyzes frames of footage in real-time -- with consumer-grade hardware no less -- and projects rough representations of the video's edges onto a room's walls or ceiling. Synchronized with camera motion, the effect aims to extend the picture into a viewer's peripheral vision. MIT guinea pigs have reported a greater feeling of involvement with video content when Infinity-by-Nine was in action, and some even claimed to feel the heat from on-screen explosions. A five-screen multimedia powerhouse it isn't, but the team suggests that the technology could be used for gaming, security systems, user interface design and other applications. Head past the jump to catch the setup in action.
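The core trick -- summarizing a frame's borders and reusing those colors beyond the screen -- can be sketched in a few lines. Assuming a frame represented as rows of RGB tuples (the details of the actual Infinity-by-Nine pipeline aren't public, so this is purely illustrative), the average color of each edge strip is the kind of signal an ambient projector could display:

```python
# Hedged sketch: average the colors of a frame's four edge strips, the
# sort of per-edge summary an ambient projector might use. Not MIT's code.

def edge_colors(frame, strip=1):
    """Return mean RGB for the left, right, top, and bottom strips."""
    def mean_rgb(pixels):
        n = len(pixels)
        return tuple(sum(p[c] for p in pixels) // n for c in range(3))
    top = [p for row in frame[:strip] for p in row]
    bottom = [p for row in frame[-strip:] for p in row]
    left = [p for row in frame for p in row[:strip]]
    right = [p for row in frame for p in row[-strip:]]
    return {"left": mean_rgb(left), "right": mean_rgb(right),
            "top": mean_rgb(top), "bottom": mean_rgb(bottom)}

# A 2x2 "frame": red on the left column, blue on the right column.
frame = [[(255, 0, 0), (0, 0, 255)],
         [(255, 0, 0), (0, 0, 255)]]
print(edge_colors(frame)["left"])   # (255, 0, 0)
```

The real system goes well beyond a flat average -- it tracks camera motion so the projected halo moves with the shot -- but per-edge color sampling is the starting point.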

MIT projection system extends video to peripheral vision, samples footage in real-time originally appeared on Engadget on Mon, 25 Jun 2012 04:55:00 EDT.

Via: Gizmodo  |  Source: MIT

MIT engineers develop glucose-based fuel cell to be used in neural implants

We've seen fuel cells used in a variety of gadgets -- from cars to portable chargers -- and while medical devices aren't exactly at the top of the list, they're yet another application for these mini power sources. MIT engineers are turning to sugar to make fuel cells for powering brain implants. The scientists developed cells that use platinum to strip electrons from glucose molecules found in a patient's cerebrospinal fluid to create a small electric current. The fuel cells are fabricated on a silicon chip so they can interface with other circuits in a brain implant. The prototype can generate up to hundreds of microwatts, which is enough to power neural implants used to help paralyzed patients move their limbs. Mind you, this technology is years away from making it to market. The next step will be proving that the devices work in animals, which reminds us of one Ricky the rat, who survived a biofuel cell implant back in 2010.
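For a sense of scale, a quick back-of-envelope conversion shows what "hundreds of microwatts" means in current terms. The cell voltage below is an assumed placeholder for illustration, not a figure from the MIT work:

```python
# Rough scale check: current required to deliver 100 microwatts at an
# assumed 0.5 V cell voltage (illustrative number, not from the source).
power_w = 100e-6         # 100 microwatts
voltage_v = 0.5          # assumption for the sake of arithmetic
current_a = power_w / voltage_v
print(current_a * 1e6)   # 200.0 (microamps)
```

Currents on that order are comfortably within what low-power implant electronics are designed around, which is why a sugar-fed trickle is worth pursuing at all.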

MIT engineers develop glucose-based fuel cell to be used in neural implants originally appeared on Engadget on Wed, 13 Jun 2012 20:53:00 EDT.

Via: Extreme Tech  |  Source: MIT

MIT researchers teach computers to recognize your smile, frustration

Wipe that insincere, two-faced grin off your face -- your computer knows you're full of it. Or at least it will once it gets a load of MIT's research on classifying frustration, delight and facial expressions. By teaching a computer how to differentiate between involuntary smiles of frustration and genuine grins of joy, researchers hope to be able to deconstruct the expression into low-level features. What's the use of a disassembled smile? In addition to helping computers suss out your mood, the team hopes the data can be used to help people with autism learn to more accurately decipher expressions. Find out how MIT is making your computer a better people person than you after the break.

[Thanks, Kaustubh]

MIT researchers teach computers to recognize your smile, frustration originally appeared on Engadget on Mon, 28 May 2012 11:06:00 EDT.

Via: Crazy Engineers  |  Source: MIT News