ZeroN slips surly bonds, re-runs your 3D gestures in mid-air

Playback of 3D motion capture with a computer is nothing new, but how about with a solid levitating object? MIT's Media Lab has developed ZeroN, a large magnet and 3D actuator, which can fly an "interaction element" (aka ball bearing) and control its position in space. You can also bump it to and fro yourself, with everything scanned and recorded, and then have real-life, gravity-defying playback showing planetary motion or virtual cameras, for example. It might be impractical right now as a Minority Report-type object-based input device, but check the video after the break to see its awesome potential for 3D visualization.
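
Curious how the record-and-replay trick could work under the hood? Here's a minimal Python sketch of our own devising (not the Lab's actual code): the tracker logs timestamped 3D positions while you nudge the ball around, and playback interpolates along that path so the actuator can retrace the gesture.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    t: float              # seconds since recording began
    x: float
    y: float
    z: float              # position within the magnet's working volume

def record(samples, t, pos):
    # Append one tracked position of the levitated ball
    # (stand-in for the system's hypothetical tracker feed)
    samples.append(Sample(t, *pos))

def playback_position(samples, t):
    # Linearly interpolate the recorded path at time t, so the
    # actuator can drive the ball back through the same gesture
    if t <= samples[0].t:
        s = samples[0]
        return (s.x, s.y, s.z)
    for a, b in zip(samples, samples[1:]):
        if a.t <= t <= b.t:
            f = (t - a.t) / (b.t - a.t)
            return (a.x + f * (b.x - a.x),
                    a.y + f * (b.y - a.y),
                    a.z + f * (b.z - a.z))
    s = samples[-1]
    return (s.x, s.y, s.z)
```

The real system closes a magnetic control loop on top of something like this; the sketch only covers the record/replay bookkeeping.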

ZeroN slips surly bonds, re-runs your 3D gestures in mid-air originally appeared on Engadget on Mon, 14 May 2012 16:07:00 EDT. Please see our terms for use of feeds.

Via: The Verge | Source: Jinha Lee

EyeRing finger-mounted connected cam captures signs and dollar bills, identifies them with OCR (hands-on)

Ready to swap that diamond for a finger-mounted camera with a built-in trigger and Bluetooth connectivity? If it could help identify otherwise indistinguishable objects, you might just consider it. The MIT Media Lab's EyeRing project was designed with an assistive focus in mind: helping visually impaired people read signs or identify currency, for example, and assisting children through the tedious process of learning to read. Instead of hunting for a grownup to translate text into speech, a young student could point EyeRing at words on a page, hit the shutter release, and receive a spoken response from a Bluetooth-connected device, such as a smartphone or tablet. EyeRing could be useful for other people as well, serving as an ever-ready imaging device that lets you capture pictures or documents with ease, transmitting them automatically to a smartphone and on to a media sharing site or server.

We peeked at EyeRing during our visit to the MIT Media Lab this week, and while the device is buggy at best in its current state, we can definitely see how it could fit into the lives of people unable to read posted signs, text on a page or the value of a currency note. We had an opportunity to see several iterations of the device, which has come quite a long way in recent months, as you'll notice in the gallery below. The demo, which like many at the Lab includes a Samsung Epic 4G, transmits images from the ring to the smartphone, where text is highlighted and read aloud using a custom app. When we snapped the word "ring," it took a dozen or so attempts before the rig read it aloud correctly. We've seen far more accurate OCR implementations, though, so it's reasonable to expect a more capable version of the software once the hardware is a bit more polished -- at this stage, EyeRing is more about the device itself, which had some issues of its own maintaining a link to the phone. You can get a feel for how the whole package works in the video after the break, which required quite a few takes before we captured an accurate reading.
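
For the curious, the snap-until-it-sticks loop we experienced might look something like this rough Python sketch -- our own guess at the flow, with `capture`, `ocr` and `speak` as hypothetical stand-ins for the ring's camera, the OCR library and the phone's text-to-speech output:

```python
def read_aloud(capture, ocr, speak, max_attempts=12):
    # Keep snapping frames until OCR returns something plausible,
    # then hand the recognized text to the speech engine.
    for attempt in range(1, max_attempts + 1):
        frame = capture()
        text = ocr(frame).strip()
        if text:                      # crude plausibility check
            speak(text)
            return text, attempt
    return None, max_attempts
```

The retry cap mirrors our dozen-attempt experience with the word "ring"; a production version would presumably gate on OCR confidence rather than a simple non-empty check.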

EyeRing finger-mounted connected cam captures signs and dollar bills, identifies them with OCR (hands-on) originally appeared on Engadget on Wed, 25 Apr 2012 13:53:00 EDT.

Perifoveal Display tracks head positioning, highlights changing data on secondary LCDs (hands-on)

If there's a large display as part of your workstation, you know how difficult it can be to keep track of all of your windows simultaneously, without missing a single update. Now imagine surrounding yourself with three, or four, or five jumbo LCDs, each littered with dozens of windows tracking realtime data -- be it RSS feeds, an inbox or chat. Financial analysts, security guards and transit dispatchers are but a few of the professionals tasked with monitoring such arrays, constantly scanning each monitor to keep abreast of updates. One project from the MIT Media Lab offers a solution, pairing Microsoft Kinect cameras with detection software, then highlighting changes with a new graphical user interface.

Perifoveal Display presents data at normal brightness on the monitor you're facing directly. As you turn your head to a different LCD, that panel brightens, while changes on any display outside your direct gaze (but still within your peripheral vision) -- a rising stock price, or motion on a security camera -- are highlighted with a white square, which slowly fades once you turn to face the new information. During our hands-on demo, everything worked as described, albeit without the instant response times you might expect from such a platform. As with most Media Lab projects, there's no release date in sight, but you can gawk at the prototype in our video just after the break.
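
Here's our back-of-the-napkin take on that highlighting policy, sketched in Python. The Lab hasn't published its code, so the fade rate and hold window below are pure guesswork: the faced display runs bright and its highlight fades out, while peripheral displays flag any fresh change at full opacity.

```python
def update(displays, faced_id, now, hold_s=5.0, fade=0.1):
    # One frame of a Perifoveal-style policy. `displays` maps an id to
    # {"changed_at": time of last unseen change (or None),
    #  "opacity": current white-square highlight opacity}.
    states = {}
    for did, d in displays.items():
        if did == faced_id:
            # User is looking here: fade the highlight away
            d["opacity"] = max(0.0, d["opacity"] - fade)
            if d["opacity"] == 0.0:
                d["changed_at"] = None       # change acknowledged
            states[did] = ("bright", round(d["opacity"], 2))
        else:
            # Peripheral display: flag any recent unseen change
            if d["changed_at"] is not None and now - d["changed_at"] < hold_s:
                d["opacity"] = 1.0
            states[did] = ("dim", round(d["opacity"], 2))
    return states
```

Feed it head-pose output from the Kinect each frame and you get the behavior described above: the square pops on a peripheral screen, then melts away once you turn toward it.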

Perifoveal Display tracks head positioning, highlights changing data on secondary LCDs (hands-on) originally appeared on Engadget on Wed, 25 Apr 2012 13:28:00 EDT.

DIY Cellphone has the footprint of an ice cream sandwich, definitely doesn’t run ICS (hands-on)

Building your own wireless communications device isn't for the faint of heart, or the law-abiding -- the FCC tends to prefer placing its own stamp of approval on devices that use US airwaves, making a homegrown mobile phone an unlikely proposition. That didn't stop a team at the MIT Media Lab from creating such a DIY kit, however. Meet the Do-It-Yourself Cellphone. The wood-based rig is currently in the prototype phase (where it may remain indefinitely), but would eventually ship with a circuit board, a control pad, a fairly beefy antenna and a monochrome LCD. It sounds like something that'd be right at home in a kid's garage workshop in the early '80s, not showcased at an MIT open house. The argument here is that people spend more time with their phone than with any other device, so naturally they'd want to build one to their liking. Nowadays, folks expect a pocketable handset not only to place and receive calls, but also to store phone numbers, offer a rechargeable battery and, in some cases, even send email and surf the web -- little of which, beyond basic calling, is available with such a kit.

The prototype we saw was fully functional. It could place calls. It could receive calls. There was even Caller ID! The phone does indeed feel homemade, with its laser-cut plywood case and a design that lacks some of the most basic gadget essentials, like a rechargeable battery (or at the very least some provision for replacing the 9-volt inside without unscrewing the case). Audio quality sounded fine, and calls went out and came in without a hitch -- there's a SIM card slot inside, letting you bring the nondescript phone to the carrier of your choice. Does it work? Yes. Is it worth dropping $100-150 in parts to build a jumbo-sized phone with a microscopic feature set? No -- there's definitely nothing smart about the DIY Cellphone. If you want to throw together your own handset, however, without anyone questioning the legitimacy of your homemade claim, you might want to keep an eye out for this to come to market. The rest of you will find everything you need in the video just past the break. We're just happy to have walked away without any splinters.

DIY Cellphone has the footprint of an ice cream sandwich, definitely doesn't run ICS (hands-on) originally appeared on Engadget on Wed, 25 Apr 2012 12:22:00 EDT.

OLED Display Blocks pack six 128 x 128 panels, we go hands-on at MIT (video)

How do you develop an OLED display that offers a 360-degree perspective? Toss six 1.25-inch panels into a plastic cube, then turn it as you see fit. That's an overly simplistic explanation of the six-sided display on hand at the MIT Media Lab today, which is quite limited in its current form but could eventually serve an enormous variety of applications. Fluid Interfaces Group research assistant Pol Pla i Conesa presented several such scenarios for his Display Blocks, which consist of 128 x 128-pixel OLED panels. Take, for example, the 2004 film Crash, which tells interweaving stories that could be presented simultaneously on such a display: simply rotate the cube until you land on a narrative you'd like to follow, and the soundtrack adjusts to match. The cubes could also go a long way toward visualizing data, especially in groups. Instead of virtually constructing profiles of applicants for a slot at MIT, for example, or segments of a business that need to be organized by different parameters, you could assign each to a cube, toss the cubes into accepted and rejected piles, and reposition them as necessary.

Imagine having a group of display cubes when it comes time to plan the seating chart for a reception: each cube could represent one guest, with a color-coded background, a name or photo up top and different descriptive elements on each side. The same could apply to products at sprawling companies like Samsung or Sony, where executives who make planning decisions based on product performance could benefit greatly from having all of the necessary information for a single gadget wrapped around each cube. On a larger scale, the cubes could replace walls and floors in a building -- want to change your wallpaper? Just push a new image to the display, and dedicate a portion of the wall to watching television or displaying artwork. You could accomplish this with networked single-sided panels as well, but that wouldn't be nearly as much fun. The Media Lab had a working prototype on display today, which demonstrated the size and basic functionality but didn't have an adjustable picture. Still, it's easy to imagine the potential of such a device -- if, of course, it ever becomes a reality. As always, you'll find our hands-on demo just past the break.

OLED Display Blocks pack six 128 x 128 panels, we go hands-on at MIT (video) originally appeared on Engadget on Tue, 24 Apr 2012 17:44:00 EDT.

Droplet and StackAR bring physical interface to virtual experiences, communicate through light (hands-on)

Light-based communication seems to wind throughout the MIT Media Lab -- it is a universal language, after all, since many devices output light, be it from a dedicated LED or a standard LCD, and have the capacity to view and interpret it. One such device, dubbed Droplet, essentially redirects light from one source to another, while also serving as a physical interface for tablet-based tasks. Rob Hemsley, a research assistant at the Media Lab, was on hand to demonstrate two of his projects. Droplet is a compact, self-contained module with an integrated RGB LED, a photodiode and a CR1216 lithium coin battery, which provides roughly one day of power in the gadget's current early-prototype state. Today's demo used a computer-connected HDTV and a capacitive-touch-enabled tablet. Using the TV to pull up a custom Google Calendar module, Hemsley held the Droplet up to a defined area on the display, which output a series of colors, transmitting data to the module. Placing the Droplet on the tablet then pushed that data across, pulling up the same calendar appointment and providing a physical interface for adjusting the date and time. The appointment is retained both in the cloud and on the module itself, which pulses its light as it counts down to the appointment time.

StackAR, the second project, functions in much the same way, but instead of displaying a countdown indicator, it shows the schematics for a LilyPad Arduino when placed on the tablet, identifying connectors based on a pre-selected program. The capacitive display can recognize orientation, letting you drop the controller anywhere on the surface and get a matching map in return. Like Droplet, StackAR can also recognize light input, even letting you program the Arduino directly from the tablet via flashing light, simplifying the interface-creation process even further. You can also add software controls to the board that work in conjunction with the hardware, bringing universal control interfaces to the otherwise space-limited Arduino. Both projects appear to have incredible potential, but they're clearly not ready for production just yet. For now, you can get a better feel for Droplet and StackAR in our hands-on video just past the break.
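
To give a flavor of how a screen can talk to a photodiode, here's a toy Python encoder/decoder of our own devising: a four-color alphabet carrying two bits per flash. The palette and framing are assumptions on our part -- the actual Droplet protocol hasn't been published.

```python
PALETTE = ["black", "red", "green", "blue"]   # four colors = 2 bits per flash

def encode(data):
    # Turn each byte into four screen colors, most significant bits first
    colors = []
    for b in data:
        for shift in (6, 4, 2, 0):
            colors.append(PALETTE[(b >> shift) & 0b11])
    return colors

def decode(colors):
    # Rebuild the bytes from the photodiode's color readings
    out = bytearray()
    for i in range(0, len(colors), 4):
        b = 0
        for c in colors[i:i + 4]:
            b = (b << 2) | PALETTE.index(c)
        out.append(b)
    return bytes(out)
```

A real implementation would add synchronization and error correction on top, since a handheld module won't sample the screen at exactly the flash rate.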

Droplet and StackAR bring physical interface to virtual experiences, communicate through light (hands-on) originally appeared on Engadget on Tue, 24 Apr 2012 15:03:00 EDT.

MIT gets musical with Arduino-powered DrumTop, uses household objects as a source of sound

Everyone's favorite microcontroller has been a boon for hobbyists and advanced amateurs, but it's also found a home among the brilliant projects at MIT's Media Lab, including a groovy instrument called DrumTop. This modern take on the drum pad delivers Arduino-powered interactivity in its simplest form: hands-on time with ordinary household objects. Simply place a cup, a plastic ball or even a business card on the DrumTop to make your own original music.

The prototype on display today includes eight pads -- effectively repurposed speakers that tap whatever objects are placed on top -- with a force-sensitive resistor (FSR) recognizing physical pressure and turning it into a synchronized beat. There's also a dial in the center that lets you speed up or slow down the taps, providing an adjustable tempo. DrumTop is more education tool than DJ beat machine, serving to teach youngsters about the physical properties of household objects, be it a coffee mug, a CD jewel case or a camera battery. But frankly, it's a lot of fun for folks of every age. There's no word on when you might be able to take one home, so for now you'll need to join us on our MIT visit for a closer look. We make music with all of these objects and more in the video after the break.
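
As a rough illustration of the sequencing logic -- our sketch, not MIT's firmware -- here's how pad pressure readings and the tempo dial might turn into a tap schedule in Python (the 0.2 pressure threshold is an arbitrary stand-in):

```python
def tap_schedule(pads, tempo_bpm, beats=8, threshold=0.2):
    # DrumTop-style sequencer sketch: each armed pad -- one whose FSR
    # reads pressure above the threshold, meaning an object sits on the
    # speaker -- gets tapped once per step; the center dial sets tempo.
    step_s = 60.0 / tempo_bpm          # seconds between taps
    events = []
    for step in range(beats):
        t = round(step * step_s, 3)
        for pad_id, pressure in pads.items():
            if pressure > threshold:   # object detected on this pad
                events.append((t, pad_id))
    return events
```

On the real hardware each event would drive the corresponding speaker coil; turning the dial simply rescales `step_s` on the fly.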

MIT gets musical with Arduino-powered DrumTop, uses household objects as a source of sound originally appeared on Engadget on Tue, 24 Apr 2012 12:35:00 EDT.

NewsFlash uses high-frequency light to transmit data from iPad to smartphone, we go hands-on (video)

MIT's Media Lab is chock-full of cutting-edge projects that researchers create, then often license to manufacturers and developers. One such project, called NewsFlash, uses high-frequency red and green light to transmit data to the built-in camera on a receiving device -- in this case Samsung's Epic 4G. The concept is certainly familiar: it functions in much the same way as a QR code, but generates flashing light that's imperceptible to the human eye instead of a cumbersome 2D square. In the Media Lab's implementation, an iPad displays a static news page with flashing colored bands at the top, occupying just a few vertical pixels on the LCD.

As the device presents the standard touch experience you're already familiar with, it also broadcasts data that can be read by any camera, yet flashes too quickly to be distracting or even noticeable to the naked eye. A NewsFlash app then interprets those flashes and displays a webpage as instructed -- either a mobile version with the same content, or a translation of a foreign-language site. As with most Media Lab projects, NewsFlash is simply a concept at this point, but it could one day make its way to your devices. Jump past the break to see it in action.
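
To illustrate the idea, here's a toy Python decoder of our own: the phone's camera oversamples the flashing band, so we group several samples per bit and majority-vote red versus green. The sampling ratio and the two-color-per-bit alphabet are our assumptions, not the Lab's actual spec.

```python
def decode_bits(samples, samples_per_bit=3):
    # The camera oversamples the flashing band, so group consecutive
    # samples and majority-vote: red ('R') carries a 1, green ('G') a 0.
    bits = []
    for i in range(0, len(samples) - samples_per_bit + 1, samples_per_bit):
        chunk = samples[i:i + samples_per_bit]
        bits.append(1 if chunk.count("R") * 2 > len(chunk) else 0)
    return bits
```

The majority vote is what lets a camera running at an imperfect frame rate still recover the bit stream; a real receiver would also need to find the start of each frame before decoding.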

NewsFlash uses high-frequency light to transmit data from iPad to smartphone, we go hands-on (video) originally appeared on Engadget on Tue, 24 Apr 2012 10:41:00 EDT.
