Motus Motion-based Synthesizer: Shake it Off

A few days ago we talked about Mogees, a tiny digital musical instrument that triggers sound by vibration. The Motus is another unorthodox yet intuitive portable instrument. It lets you create or trigger sounds by motion. Fellow air guitarists, our time has come.

Motus – no relation to music giant MOTU – uses the same sensors found in many mobile devices to gather data about its motion, orientation, speed and more. All of that data is used to dynamically trigger effects, samples or prerecorded sounds. In that way it’s like Korg’s Kaoss system, but in three-dimensional space. You can also use Motus as a MIDI or OSC controller, or even control visuals with it.
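Motus hasn’t published its internals, but the basic idea of turning motion data into a MIDI control stream is easy to sketch. The sketch below is purely hypothetical – it assumes a generic accelerometer reading in g and the standard 3-byte MIDI control-change message format, not any actual Motus API:

```python
def tilt_to_cc(accel_axis_g, lo=-1.0, hi=1.0):
    """Map one accelerometer axis reading (in g) to a 0-127 MIDI CC value."""
    clamped = max(lo, min(hi, accel_axis_g))
    return round((clamped - lo) / (hi - lo) * 127)

def cc_message(channel, controller, value):
    """Build a raw 3-byte MIDI control-change message (status, CC#, value)."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

# Tilting the device fully forward (-1 g) maps to CC value 0,
# holding it flat (0 g) to 64, and fully back (+1 g) to 127.
```

Feeding `cc_message(0, 1, tilt_to_cc(reading))` to a synth each frame would make the mod wheel follow the device’s tilt – the same kind of continuous mapping a Kaoss pad does in two dimensions.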

Motus inventor TZM Creative Lab will release its SDK and API and open an app store so that developers can add more effects and sounds. It’s also offering Arduino-compatible DIY kits.

Here’s just a very small taste of what you can do with Motus:

Motus may not be as precise as other musical instruments, but as you can see it frees up your options and adds new layers to music and performance art. Pledge at least $79 (USD) on TZM’s independent fundraiser to receive a Motus as a reward.

Mogees Turns Any Object Into Any Musical Instrument: Play All the Things!

There are different gadgets that can turn bodies, steering wheels or food into musical instruments. Now you can turn pretty much any rigid solid object into an instrument using just one device: Mogees.

Mogees consists of a very sensitive contact microphone and an iOS app. You stick the mic on an object using the included reusable adhesive. Then you use the app to make taps, strikes, scratches and other gestures trigger notes, samples or sounds. The app will launch with a drum machine, a bass synth and a modeling synth.
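Mogees hasn’t detailed how its app picks taps out of the contact-mic signal, but a toy version of onset detection is simple: watch the amplitude for threshold crossings and ignore re-triggers that arrive too soon after a hit. This is only an illustrative sketch, not the app’s actual algorithm:

```python
def detect_taps(samples, threshold=0.5, refractory=3):
    """Return sample indices where the signal crosses `threshold`,
    skipping re-triggers within `refractory` samples of the last hit
    (so one physical tap doesn't fire several notes)."""
    hits, last = [], -refractory - 1
    for i, s in enumerate(samples):
        if abs(s) >= threshold and i - last > refractory:
            hits.append(i)
            last = i
    return hits
```

A real implementation would work on audio-rate buffers and use an energy envelope rather than raw samples, but the trigger-plus-refractory pattern is the same.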

You don’t have to be a musical outdoorsman either. Mogees has VST and Audio Unit plugins that you can use with audio editing software for Mac and Windows, allowing you to turn objects into MIDI controllers. Finally, you can use the sensor as a standalone microphone.

Pledge about $150 (USD) or more on Kickstarter to receive a Mogees sensor and the app as a reward.

Google’s Project Soli: Gesture All of the Things

Don’t touch. That should be the motto of Google’s new technology that allows people to control many of their favorite devices without ever laying a hand on them.

Dubbed Project Soli, it’s a radar-based system that allows people to do things like rub their fingers together or tap their fingers in order to interact with the machines they love. Ivan Poupyrev, the Soli project manager, describes it this way in a video demonstrating the amazing new advance: “Capturing the possibility of the human hand was one of my passions.” He goes on to explain that he wants to take the incredible flexibility and dexterity of the hand and use it for myriad virtual purposes.

To do so, he and his team developed a tiny radar system to link your hands’ movements to your devices. In Google’s Soli video, you can see some pretty amazing implementations of the technology – resetting a watch by rubbing your fingers together above the timepiece, and using similar hand movements to adjust volume and change songs on your music player. At times it looks as if a symphony conductor is using his hand motions to control and empower a small orchestra. Soli uses a very small chip, and because radar can detect extremely precise movements, the potential applications of Soli seem wide and deep.
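Google hasn’t published Soli’s signal pipeline, but the physics it leans on is the classic radar Doppler shift: a fingertip moving toward or away from the sensor shifts the reflected frequency by f_d = 2·v·f/c. A minimal sketch of that relation, using Soli’s publicly stated 60 GHz band as the carrier:

```python
C = 3.0e8  # speed of light, m/s

def doppler_shift_hz(carrier_hz, radial_velocity_mps):
    """Doppler shift of a radar return off a target moving at
    radial_velocity_mps toward (+) or away from (-) the sensor:
    f_d = 2 * v * f / c."""
    return 2.0 * radial_velocity_mps * carrier_hz / C

# A fingertip rubbing at ~0.1 m/s in front of a 60 GHz radar
# shifts the return by about 40 Hz - tiny, but measurable.
```

That sensitivity to sub-millimeter-per-second motion is why radar can resolve finger rubs that a camera would miss.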

Now only if they could implant a chip in my three-year-old that would allow me to snap my fingers and dial down his wailing when he isn’t getting what he wants.

[via ZME Science]

Pinć Phone VR Headset Comes with Motion Sensing Rings: Don’t VR & Drive

While the world waits for the prophesied Oculus Rift to rise, we’re seeing more and more virtual reality headsets that use smartphones as their brains. However, most of these peripherals are content to provide you with a case and a pair of lenses. With Pinć (“pinch”), Cordon Media wants to give you an ecosystem – a new way to interact with your phone.

Aside from the headset itself, Pinć will come with two motion sensing rings and an app. When combined, all three will let you use your phone in a virtual environment, using head and hand gestures as inputs. The initial applications include a web browser, a YouTube viewer and a maps app.

As for the headset itself, it’s compatible with the iPhone 6/6 Plus and with Android 4.1+ phones of similar size, such as the Galaxy S5/Note 3/Note 4 and the Nexus 5/6. It folds flat and can be used as a protective – albeit bulky – phone case. It will also have interchangeable lenses for different focus levels.

It certainly looks cool, but I can’t tell if it’s a better way of interacting with smartphones, or if it’s just a gimmick. Also, Cordon Media doesn’t mention if the headset works with other VR software outside of its ecosystem, though I wouldn’t be surprised if it does. Pledge at least $99 (USD) on Indiegogo to get a Pinć headset as a reward.

[via Urban Daddy]

Multi Layer Interaction Uses Ultrasound to Detect Midair Gestures: Leap Mobile

One of the Moto X’s convenient features is that you can wave your hand over it to trigger certain actions, such as silencing a call or an alarm, or simply waking the screen. Elliptic Labs’ Multi Layer Interaction technology enables what appears to be a more advanced form of the Moto X’s touchless gesture recognition.

Unlike the Moto X or the Leap Motion, which both depend on infrared LEDs and cameras, Multi Layer Interaction uses ultrasound speakers and microphones to detect gestures. According to Elliptic Labs, its technology has a 180° range and consumes less power than camera-based sensors. This enables gesture detection not just directly over the phone, but also above, below or to either side of the device, from up to 20″ away, giving users more leeway. It also allows for distance-based gestures, as you’ll see in the video below.
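Elliptic Labs hasn’t published its detection pipeline, but ultrasonic ranging in general works from echo time of flight: sound travels to the hand and back, so distance is half the round trip times the speed of sound. A rough sketch of that arithmetic, using the 20-inch range quoted above as the cutoff:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_distance_m(round_trip_s):
    """Distance to a reflecting hand from the echo's round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def is_in_range(round_trip_s, max_m=0.508):
    """True if the hand is within ~20 inches (0.508 m), the working
    range Elliptic Labs quotes for Multi Layer Interaction."""
    return echo_distance_m(round_trip_s) <= max_m

# A 2 ms round trip puts the hand about 34 cm away - in range.
```

Distance-based gestures then fall out naturally: track how `echo_distance_m` changes over successive pings.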

I think touchless gestures could someday eliminate physical buttons on mobile devices, but on the other hand I can’t help but think of my phone’s poor battery.

[via Elliptic Labs (pdf) via Ubergizmo]

Motix Motion Sensor for Mouse and Touch Input: Minority Minority Report

Last week we saw Microsoft Research’s gesture-sensing keyboard prototype that removes the need for a mouse or touchpad. A new keyboard peripheral called Motix achieves the same effect as Microsoft’s keyboard. Its basic function lets you move your finger above the keyboard to control the mouse or simulate a touch input.

Motix is a USB plug-and-play device that works with Windows, OS X and Linux, with tablet support in the works. Along with the Motix sensor itself, which is designed to sit in front of a keyboard, it also comes with a touch-sensitive strip that inventor Brent Safer calls the Position Pad. The Position Pad is meant to be placed just below a keyboard’s space bar.

From what I understand, it behaves like a touchpad. Swiping your thumb across the Position Pad moves the cursor onscreen left or right. I could be wrong, but Brent probably included the Position Pad to make it easier to navigate widescreen displays. It does seem easier to swipe your thumb from its natural position than to move your index finger laterally in midair. Besides that shortcut, the Position Pad can also have other functions, such as quick scrolling, zooming or media playback controls.

Speaking of shortcuts, Motix can also be used to map your fingers to keyboard or mouse commands. For example, in a video game you can map one of your fingers to control movement while another finger fires your weapon.
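Motix’s driver logic isn’t public, so the following is only a guess at the plumbing: scale a thumb swipe on the Position Pad into a horizontal cursor move, and look up per-finger action bindings like the gaming example above. The pad size, screen width and binding names are all made up for illustration:

```python
def pad_to_cursor_dx(swipe_delta_mm, screen_px=1920, pad_mm=100):
    """Scale a thumb swipe on the Position Pad (in mm) into a horizontal
    cursor move in pixels, so a full-pad swipe crosses the whole screen."""
    return round(swipe_delta_mm / pad_mm * screen_px)

# Hypothetical finger-to-action bindings, as in the gaming example.
BINDINGS = {"index": "move", "middle": "fire"}

def action_for(finger):
    """Look up the command mapped to a tracked finger, if any."""
    return BINDINGS.get(finger, "none")
```

A half-pad swipe would then jump the cursor half the screen – the widescreen shortcut the Position Pad seems designed for.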

Pledge at least $80 on Kickstarter to get a Motix sensor as a reward. I prefer the setup of Microsoft’s prototype because it seems more comfortable to use. Then again, Motix is already on its way to production and seems to be more useful. Overall, it sounds an awful lot like the Leap Motion, which you can already buy for $100. I wonder which of the two is more versatile.

[via Gadgetify]

Parrot AR.Drone Controlled with Head Movement Using Oculus Rift: OculusDrone

Last year we saw a drone camera system that streamed live 3D video that could be viewed through the Oculus Rift headset. Diego Araos wrote a program that not only lets you use the Rift to view the feed from a Parrot AR.Drone 2’s camera, but also lets you control the drone through the headset.

Diego’s program OculusDrone taps into the Rift’s head tracking feature to control the AR.Drone 2 remotely. However, you need to use a keyboard command to order the AR.Drone to take off (Enter) and land (Escape).
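The core of any head-tracking control scheme is turning an orientation angle into a bounded flight command. This sketch is not OculusDrone’s actual code (which lives on GitHub), just an illustration of the usual mapping: a deadzone so small head movements don’t drift the drone, clamping, and normalization to [-1, 1]:

```python
def head_yaw_to_command(yaw_deg, deadzone_deg=5.0, max_deg=45.0):
    """Map head yaw (degrees) to a normalized drone yaw-rate command
    in [-1, 1]. Angles inside the deadzone return 0 so resting head
    jitter doesn't move the drone; larger angles clamp at full rate."""
    if abs(yaw_deg) < deadzone_deg:
        return 0.0
    clamped = max(-max_deg, min(max_deg, yaw_deg))
    return round(clamped / max_deg, 3)
```

The same function, with different limits, would handle pitch for forward/backward motion.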

Zip to GitHub to download OculusDrone.

[via BGR via Reddit]

Festo BionicKangaroo: Energizer Joey

After creating a robot bird and dragonfly, automation company Festo shows off another impressive animal replica. Like real kangaroos, Festo’s BionicKangaroo is not only great at jumping and keeping its balance, it can also store the energy generated from landing and use it for the next jump.

BionicKangaroo uses a combination of pneumatic actuators and electric servos to move and keep its balance.

According to Festo, the robot has a rubber elastic spring element that acts like an Achilles tendon: “It is fastened at the back of the foot and parallel to the pneumatic cylinder on the knee joint. The artificial tendon cushions the jump, simultaneously absorbs the kinetic energy and releases it for the next jump.”
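Festo doesn’t publish the tendon’s numbers, but the energy bookkeeping behind that quote is standard spring physics: the tendon stores E = ½·k·x² on landing, and the next jump draws on whatever fraction of that survives plus fresh actuator input. All figures below are hypothetical:

```python
def spring_energy_j(stiffness_n_per_m, compression_m):
    """Elastic energy stored in the tendon on landing: E = 1/2 * k * x**2."""
    return 0.5 * stiffness_n_per_m * compression_m ** 2

def takeoff_energy_j(landing_kinetic_j, recovery_fraction, actuator_input_j):
    """Energy budget for the next jump: the recovered share of the landing
    energy plus fresh work from the pneumatic/electric actuators."""
    return landing_kinetic_j * recovery_fraction + actuator_input_j
```

The higher the recovery fraction, the less the actuators have to contribute per hop – which is the whole point of giving the robot an Achilles tendon.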

To make the robot even fancier, Festo also made it so it can be controlled with gestures. The company uses the Myo armband to make BionicKangaroo move, stay or rotate in place. Watch BionicKangaroo hip hip hop and not stop:

It would’ve been way cooler if they made a BionicTigger instead. Check out Festo’s report (pdf) if you want to learn more about BionicKangaroo.

[via Ubergizmo]

Mi.Mu Gesture Control Music Glove: New Wave

The very talented musician Imogen Heap and her colleagues at Mi.Mu are working on a glove that will allow you to make music by moving your fingers and hands. Think Minority Report, but instead of flipping screens around, your movements create sounds. Air drumming is about to be legit.

Mi.Mu has an input and output board called x-OSC that connects the glove to one or more computers over Wi-Fi. It also has an accelerometer, a gyroscope and a magnetometer. Along with the flex sensors on the glove itself, the system can detect “the orientation of your hand, the ‘flex’ of your fingers, your current hand posture (e.g. fist, open hand, one finger point), the direction (up, down, left, right, forwards, backwards) of your hand [and] sharp movements such as drum hits.”

You can map one or more of these movements to control music software with the help of Mi.Mu’s own application, which converts your movements to OSC or MIDI. This means you can use the glove with any software that speaks either of those two protocols. The video below shows Imogen performing (!) a song using only two Mi.Mu gloves to control the music:

As you may have noticed, the glove allows the wearer to activate multiple tweaks or sounds at once. You can also use gestures to switch between your saved mappings, which should reduce the number of movements you have to memorize for a given performance.
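Mi.Mu’s posture-recognition code isn’t public, but the fist / open hand / one-finger-point postures it lists could in principle be told apart with simple thresholds on the flex sensors. The sketch below is purely illustrative – the reading scale, the finger ordering (index first) and the OSC address are all invented:

```python
def classify_posture(flex):
    """Classify a hand posture from five per-finger flex readings
    (0 = straight, 1 = fully bent), index finger first - in the spirit
    of Mi.Mu's fist / open hand / one-finger-point postures."""
    if all(f > 0.8 for f in flex):
        return "fist"
    if all(f < 0.2 for f in flex):
        return "open"
    if flex[0] < 0.2 and all(f > 0.8 for f in flex[1:]):
        return "point"
    return "other"

def osc_address(posture):
    """Hypothetical OSC address the mapping software might listen on."""
    return "/glove/posture/" + posture
```

A production system would use calibration and probably a trained classifier rather than fixed thresholds, but the shape of the mapping – sensors in, named posture out, OSC/MIDI message sent – is the same.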

Pledge at least £1,200 (~$2,000 USD) on Kickstarter to receive a Mi.Mu glove as a reward. Hopefully in a few years the glove will be affordable enough, so we can wash away all the hate and society can start advancing.

[via Gadgetify]

AllSee Low-power Sensor Uses Ambient Radio Signals to Detect Gestures

Many gesture detection devices, including the Kinect and the Leap Motion, use infrared cameras to sense movement. They also have dedicated chips that process the data from the cameras. These components are power-hungry, especially if they’re turned on at all times. Researchers from the University of Washington have developed a gesture detection device that uses 1,000 to 10,000 times less power than its counterparts.

Bryce Kellogg, Vamsi Talla and their adviser Shyam Gollakota call the device AllSee. Instead of cameras and infrared light, it measures how the user’s hand affects ambient TV signals: “At a high level, we use the insight that motion at a location farther from the receiver results in smaller wireless signal changes than from a close-by location. This is because the reflections from a farther location experience higher attenuation and hence have lower energy at the receiver.”

The signal can also come from a dedicated transmitter such as an RFID reader; future models may even use ambient Wi-Fi signals. The researchers even built prototypes that used TV signals both as a source of data and as a source of power, eliminating the need for a battery or plug.
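The attenuation insight the researchers describe can be sketched numerically. A reflected signal travels out and back, so its received power falls off roughly as 1/d⁴ with distance – which is why a hand near the receiver perturbs the amplitude far more than one farther away. The change detector below is a deliberately crude stand-in for AllSee’s actual low-power analog envelope circuit:

```python
def reflected_power(tx_power, distance_m):
    """Two-way path loss for a reflection: received power scales
    roughly as 1/d**4, so nearby motion causes much larger
    amplitude changes than distant motion."""
    return tx_power / distance_m ** 4

def motion_detected(amplitudes, threshold=0.1):
    """Flag a gesture when successive received-amplitude samples
    change by more than `threshold` (a crude change detector)."""
    return any(abs(b - a) > threshold
               for a, b in zip(amplitudes, amplitudes[1:]))
```

Doubling the hand’s distance cuts the reflected contribution by a factor of 16, which is exactly the cue AllSee uses to separate close gestures from background motion.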

Wave at your browser and go to the AllSee homepage for more on the device.

[via DamnGeeky]