Intel designs neuromorphic chip concept, our android clones are one step closer

Neuromancer chip

Most neurochip projects have been designed around melding the brain and technology in the most literal sense. Intel's Circuit Research Laboratory, however, is betting that we might get along just fine with neuromorphic (brain-like) computers. By combining spin valves, which switch based on the spin of the electrons passing through them, with memristors that serve as very efficient permanent storage, the researchers believe they have a design that runs on the same discrete spikes of energy our noggins use rather than a non-stop stream. Along with drawing power levels closer to those of our brains, the technique allows for the subtle, massively parallel computations our minds manage every day but which are still difficult to reproduce with traditional PCs. There's still a long path ahead before we're reproducing Prometheus' David (if we want to), but we've at least started walking in the right direction.
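To get a feel for what "spikes of energy rather than a non-stop stream" means, here's a minimal sketch of a leaky integrate-and-fire neuron, the textbook model of spike-based computation. The constants (threshold, leak factor, input drive) are illustrative assumptions, not values from Intel's proposal.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the classic model of
# spike-based computation. All constants here are illustrative guesses,
# not parameters from Intel's design.

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Integrate input over time; emit a spike (1) when the membrane
    potential crosses the threshold, then reset. Returns the spike train."""
    potential = 0.0
    spikes = []
    for i in input_current:
        potential = potential * leak + i   # leaky integration
        if potential >= threshold:
            spikes.append(1)               # fire a spike ...
            potential = 0.0                # ... and reset
        else:
            spikes.append(0)
    return spikes

# A steady drive yields sparse, discrete spikes rather than a
# continuous output -- the key power-saving idea.
print(simulate_lif([0.3] * 10))
```

The point of the sketch: information travels as occasional discrete events, so nothing burns power between spikes, which is why spiking hardware can sit closer to the brain's energy budget than a clocked, always-on pipeline.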

Intel designs neuromorphic chip concept, our android clones are one step closer originally appeared on Engadget on Mon, 18 Jun 2012 16:28:00 EDT. Please see our terms for use of feeds.

Via: MIT Technology Review | Source: Intel proposal (Cornell University)

Sign language translator turns gestures into spoken letters, makes for a better world (video)


By far one of the greatest challenges of sign language has been translating it for everyday folk who wouldn't know where to begin a conversation with the deaf. Cornell University engineering students Ranjay Krishna, Seonwoo Lee and Si Ping Wang -- along with some help from Jonathan Lang -- used their final project time this past semester to close that gap with one of the more practical solutions we've seen to date. Their prototype glove uses accelerometers, contact sensors and flex sensors to translate complex finger gestures from the American Sign Language alphabet into spoken letters. After converting hand positions to digital signals, the test unit both speaks the resulting letters aloud and sends them to a computer, where they can be used for anything from a game (shown in the video below) to, presumably, constructing whole sentences. Along with being accurate, the Cornell work is designed with an eye toward real-world use: the glove and its transmitter are both wireless and powered by 9-volt batteries. We hope the project leads to a real product and an extra bridge between the deaf and the rest of us, but in the meantime, we'll be happy that at least one form of powered glove is being put to the noblest use possible.
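The core of the glove's pipeline -- sensor readings in, a letter out -- can be sketched as a nearest-neighbor match against stored per-letter templates. The sensor layout and template values below are invented for illustration; the Cornell project's actual calibration and letter set differ.

```python
# Hypothetical gesture-to-letter step: match a tuple of sensor readings
# against per-letter templates by squared distance. Sensor channels and
# template values are made up for illustration.

TEMPLATES = {
    # letter -> (thumb_flex, index_flex, middle_flex, contact)
    "A": (0.1, 0.9, 0.9, 1),
    "B": (0.9, 0.1, 0.1, 0),
    "L": (0.1, 0.1, 0.9, 0),
}

def classify(reading):
    """Return the letter whose template is closest to the sensor reading."""
    def dist(template):
        return sum((a - b) ** 2 for a, b in zip(reading, template))
    return min(TEMPLATES, key=lambda letter: dist(TEMPLATES[letter]))

print(classify((0.15, 0.85, 0.95, 1)))  # noisy reading near the "A" template
```

Once a letter is recognized, handing it to a speech synthesizer and a serial link to the PC covers the "speaks it aloud and sends it to a computer" half of the project.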


Sign language translator turns gestures into spoken letters, makes for a better world (video) originally appeared on Engadget on Tue, 15 May 2012 07:45:00 EDT.

Source: Sign Language Translation (Cornell)

Cornell students steer Pong using brain waves, can’t quite play during naps (video)

We here at Engadget are always fans of brain wave experiments, and so we were delighted when two Cornell University electrical engineering students, Chuck Moyes and Mengxiang Jiang, wrapped up a final project using brain waves in the best way possible: playing Pong. Their experiment links a baseball cap full of EEG-scanning electrodes to a computer, letting the cap wearer control a paddle using Alpha or Mu waves. Depending on the waves you use, you can move the paddle either by changing your concentration level or by thinking about moving your feet. You won't rack up a high score while napping (or with a teammate narrating over your shoulder), but with a budget under $75, it's hard to find fault. You can grab the source code below, and check out a video of Jiang and Moyes' handiwork after the break.
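A rough sketch of the control loop: estimate Alpha-band (8-12 Hz) power from a window of EEG samples with an FFT, then nudge the paddle when that power crosses a threshold. The sample rate, threshold, and step size are assumed values, not taken from the students' code.

```python
# Sketch of EEG band-power paddle control. FS, the threshold, and the
# step size are assumptions for illustration only.
import numpy as np

FS = 256  # samples per second (assumed)

def alpha_power(samples):
    """Mean spectral power in the 8-12 Hz Alpha band."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / FS)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].mean()

def paddle_step(samples, threshold=1000.0, step=5):
    """Move the paddle up when Alpha power is high (relaxed state),
    down otherwise -- one simple mapping of mental state to motion."""
    return step if alpha_power(samples) > threshold else -step

# A strong 10 Hz sine wave reads as high Alpha power and moves the paddle up.
t = np.arange(FS) / FS
print(paddle_step(100 * np.sin(2 * np.pi * 10 * t)))
```

The Mu-wave mode described above would use the same band-power machinery over electrodes near the motor cortex, where imagining foot movement suppresses the rhythm.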

[Thanks, Chuck and Mengxiang]


Cornell students steer Pong using brain waves, can't quite play during naps (video) originally appeared on Engadget on Wed, 02 May 2012 15:19:00 EDT.

Source: BCI, source code (GitHub)

Auto-composing keyboard creates tunes tailored to your taste

We love listening to our favorite tunes, as they provide a soundtrack to our otherwise dull and silent blogging existence. But sometimes the lyrical stylings of Jay-Z and glorious jams of Trey Anastasio simply don't meet our musical needs. We need something different, something never before heard by human ears, to get us through the news day. Enter Cornell students Charong Chen and Siyu Zhan, who have constructed an electric keyboard that one-ups Yamaha's singing piano by creating and playing its own compositions. Users simply select between two mood modes -- happy or tender -- to determine the tune's tempo, then play a couple of notes and the piano sets about sating sonic cravings. Another mode lets users play a melody to "train" the keyboard, which then plays permutations of that melody in an automated jam session. In training mode, users can play as long as they like to give the keyboard a better idea of what they're into, letting the algorithm better tailor its audio output. The hardware making the music happen comprises a microcontroller (MCU) running the composing algorithm, a numpad for choosing the operational mode, and a 23-key piano that communicates with the MCU through a trio of encoders. The results are impressive, if not quite concert-hall quality. Hear it for yourself in the video after the break.
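The "train on a melody, then play permutations of it" behavior resembles a first-order Markov chain over notes: learn which note tends to follow which, then random-walk the table to generate variations. This is a guess at the flavor of algorithm, not the students' actual implementation.

```python
# Hypothetical melody-permutation sketch: a first-order Markov chain
# learned from a training melody. Not the Cornell students' actual code.
import random
from collections import defaultdict

def train(melody):
    """Count note-to-note transitions in the training melody."""
    transitions = defaultdict(list)
    for a, b in zip(melody, melody[1:]):
        transitions[a].append(b)
    return transitions

def compose(transitions, start, length):
    """Random-walk the transition table to generate a new melody."""
    note, out = start, [start]
    for _ in range(length - 1):
        choices = transitions.get(note)
        if not choices:          # dead end: restart from the opening note
            note = start
        else:
            note = random.choice(choices)
        out.append(note)
    return out

taught = ["C", "E", "G", "E", "C", "G"]
print(compose(train(taught), "C", 8))
```

Because transitions are sampled with their observed frequencies, longer training sessions bias the output toward the player's habits, which matches the article's point that more training yields better-tailored output.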


Auto-composing keyboard creates tunes tailored to your taste originally appeared on Engadget on Wed, 02 May 2012 00:06:00 EDT.

Via: Hack a Day | Source: Cornell University

GE partners with Livermore Labs to explore efficient aircraft fuel injectors (video)

What would you do with six months of dedicated access to 261.3 teraflops of computational power? As you ponder that question, consider the case of GE Global Research, which has just announced a partnership with the Lawrence Livermore National Laboratory to design more powerful and efficient aircraft engines by way of computer simulation. Specifically, GE will work with researchers from Arizona State University and Cornell University to study the unsteady spray phenomena thought to be ideal for fuel injectors. Through Large Eddy Simulation, GE hopes to discover an ideal spray pattern and fuel injector design, reducing its number of lengthy, real-world optimization trials. While the research is initially aimed at aircraft engines, the knowledge gained from these experiments may work its way into GE's other products, such as locomotive engines and land-based gas turbines. For a glimpse into GE's current research, be sure to hop the break.


GE partners with Livermore Labs to explore efficient aircraft fuel injectors (video) originally appeared on Engadget on Tue, 10 Apr 2012 06:09:00 EDT.
