Hydra evolved: Sixense Stem launches on Kickstarter, we go hands-on with a prototype (video)

Sixense might not be a household name, but its electromagnetic motion sensing technology crops up in the darndest places. The 1:1 tracking technology is used in medical rehabilitation and Japanese arcade games, but it's most widely known as the wizardry behind the Razer Hydra motion controller. Now the company is gearing up to release a spiritual successor to the Hydra, the Sixense Stem System.

Like the Hydra, Stem offers six degrees of motion-tracking freedom, albeit without the wires or Razer branding. It isn't necessarily more accurate, but it is more comprehensive -- it's a modular system that offers up to five trackable modules, or "Stems," that attach to game controllers, VR headsets, accessories or even appendages. We caught up with Sixense president and CEO Amir Rubin to learn more about the Stem's Kickstarter launch and the company's first foray into the consumer product space.

Source: Kickstarter

Hisense picks up Hillcrest Labs’ gesture and motion control tech for TVs

Following LG and TCL, Hisense is now the latest TV manufacturer to adopt Hillcrest Labs' Freespace technology. According to the agreement, Hisense, the world's fifth largest smart TV brand (as of Q1 2013, according to NPD DisplaySearch), will be able to add in-air pointing, gesture control and motion control -- all via a remote control -- to its future smart TVs and set-top boxes. This also means TCL now faces a fellow Chinese competitor with the same set of Freespace features. While there's no time frame just yet, we've been told that Hisense will eventually sell these next-gen devices in the US and China later this year, so stay tuned.

Leap Motion controller review

When the Leap Motion controller was revealed to the world, it brought with it the promise of a new and unique computer user experience. And, ever since we first got to see what the Leap Motion controller could do -- grant folks the ability to interact with a computer by waving their fingers and fists -- we've wanted one of our own to test out. Well, our wish was granted: we've gotten to spend several days with the controller and a suite of apps built to work with it. Does the device really usher in a new age of computing? Is it worth $80 of your hard-earned cash? Patience, dear reader, all will be revealed in our review.

Leap Motion controllers now shipping

Would you look at that? Seems Leap Motion's eagerly awaited motion controller has started shipping a few days early -- well, a few days before its delayed July 22nd date, but we'll take it. We've received a couple of confirmations from future Leapers that their devices are on the way. Until they actually arrive, however, why not take a look at some of the apps developers have been working on for the system?

[Thanks to everyone who sent this in]

The New York Times Leap Motion app: for all the news that’s fit for gestures (video)

Few of us reading the morning news enjoy putting our greasy hands on a tablet or newspaper just to flip through articles. With the newly unveiled New York Times app for the Leap Motion Controller, we won't have to. The release lets news hounds navigate stories (and ads) through a unique interface optimized for touch-free gestures. Both Mac and Windows versions of the NYT app will be available in the Airspace store on July 22nd, the same day Leap Motion ships to customers. More importantly, the app will be free -- at least at launch, readers won't run into the usual paywall. If the prospect of contact-free news has you intrigued, there's a video demo available after the break.

Source: New York Times Idea Lab

Leap Motion starts expanded beta, opens dev portal to the public, shows off Airspace app store (hands-on)

Slowly but surely, Leap Motion is making its way toward a commercial release. Today, the company announced that it's moving into the next phase of beta testing and that it will open up its developer portal to the public later in the week. While this still won't get folks a Leap device any faster, it will let them dig into Leap's tools and code base in preparation for when they finally get one. The move marks a shift from the company's previous SDK-focused beta to a consumer-focused one that'll serve to refine the UX in Windows and OS X. Within each operating system, there will be two levels of Leap control: basic, which essentially lets you use Leap in place of a touchscreen, and advanced, which enables fuller 3D controls via Leap's ability to detect the pitch and yaw of hands in space.
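Leap's SDK reports those hand angles directly; purely as an illustration of the underlying geometry (a toy convention of our own, not the actual Leap API), pitch and yaw can be recovered from a unit hand-direction vector like so:

```python
import math

def hand_pitch_yaw(direction):
    """Compute pitch and yaw (in radians) from a unit hand-direction
    vector (x, y, z), assuming the hand points along -z when level
    and facing the sensor."""
    x, y, z = direction
    pitch = math.atan2(y, -z)  # tilt up/down, rotation about the x-axis
    yaw = math.atan2(x, -z)    # turn left/right, rotation about the y-axis
    return pitch, yaw

# A hand tilted 45 degrees upward, pointing straight ahead:
pitch, yaw = hand_pitch_yaw((0.0, 0.7071, -0.7071))  # pitch ≈ π/4, yaw = 0
```

In practice the tracker would feed a fresh direction vector every frame, and an "advanced" app would map those angles onto 3D camera or object rotation.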

CEO Michael Buckwald gave us this good news himself, and also gave us a preview of Airspace, Leap's app store, and a few app demos for good measure. As it turns out, Airspace is a two-pronged affair -- Airspace Store is a showcase for all software utilizing the Leap API, and Airspace Home is a launcher that keeps all the Leap apps you own in one convenient place. There will be 50 apps in Airspace at the start of the beta, ranging from pro tools and utility apps to casual games, and we got to see a few examples.

eyeSight software uses standard cameras to power 3D gesture controls (video)

Turning regular ol' devices into motion-activated wonders is all the rage these days, and a company called eyeSight is determined to stand out from the pack. The brains behind eyeSight claim to have developed a purely software-based solution for equipping PCs, TVs and mobile devices with 3D gesture controls using existing standard cameras. It sounds like a pretty sweet deal, but it all comes down to whether or not eyeSight can deliver on its potential. If it can, then it could be a promising sign that gesture-controlled technology is on its way to becoming more accessible for budget-conscious consumers, since a software setup would negate the need for costly hardware. Currently, the platform is limited to developer SDKs, but you can watch an eyeSight-powered Google Earth demo after the break.

PMD and Infineon to enable tiny integrated 3D depth cameras (hands-on)

After checking out SoftKinetic's embedded 3D depth camera earlier this week, our attention was brought to a similar offering coming from Germany's PMD Technologies and Infineon. In fact, we were lucky enough to be the first publication to check out their CamBoard Pico S, a smaller version of their CamBoard Pico 3D depth camera that was announced in March. Both reference designs are already available in low quantities for manufacturers and middleware developers to tinker with over USB 2.0, so the two companies had some samples up and running at their demo room just outside Computex.

WiSee uses WiFi signals to detect gestures from anywhere in your house (video)

Have you always dreamed of controlling your TV by flailing in the next room? Researchers at the University of Washington have just the system for you: WiSee, a gesture-recognition interface that uses WiFi to control things like sound systems and temperature settings. Since WiFi signals are capable of passing through walls, WiSee can detect gestures made from neighboring rooms, breaking free from the line-of-sight method relied on by devices like Kinect and Leap Motion. Unlike those two, WiSee doesn't require an additional sensor; the software can theoretically be used with any WiFi-connected device and a router with multiple antennae to detect Doppler shifts created by movement. The prototype was tested in both an office environment and a two-bedroom apartment, and the team reported 94 percent accuracy across a set of nine distinct gestures. If you watch the video, embedded after the break, you'll notice that each user performs an identifying motion prior to the control gesture. It's a trick the team picked up from studying Kinect's solution for distinguishing between specific individuals in crowded rooms. Intrigued? Head over to the source link to read the report in full.
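The Doppler shifts in question are tiny relative to the WiFi carrier, which is why detecting them is the hard part. A back-of-the-envelope sketch using the standard round-trip Doppler formula (the hand speed and channel frequency below are illustrative values, not figures from the WiSee paper):

```python
def doppler_shift_hz(speed_mps, carrier_hz=5.0e9, c=3.0e8):
    """Frequency shift of a WiFi signal reflected off a moving body.
    The factor of 2 accounts for the round trip: the moving hand
    effectively receives and re-radiates the wave."""
    return 2.0 * speed_mps * carrier_hz / c

# A hand moving toward the router at ~0.5 m/s on a 5 GHz channel:
shift = doppler_shift_hz(0.5)  # ≈ 17 Hz, against a 5,000,000,000 Hz carrier
```

The sign of the shift distinguishes a push from a pull, and sequences of such shifts across the router's antennas are what get classified into gestures.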

Via: The Verge

Source: University of Washington

SoftKinetic teases embedded 3D depth camera, coming to Intel devices next year (hands-on)

At Intel's Computex keynote earlier today, the chip maker teased that it expects embedded 3D depth cameras to arrive on devices in the second half of 2014. Luckily, we got an exclusive early taste of the technology shortly after the event, courtesy of SoftKinetic. This Belgian company not only licenses its close-range gesture tracking middleware to Intel, but it also manufactures time-of-flight 3D depth cameras -- including Creative's upcoming Senz3D -- in partnership with South Korea-based Namuga. Read on to see how we coped with this futuristic piece of kit, plus we have a video ready for your amusement.
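Time-of-flight cameras like these derive per-pixel depth from how long emitted (typically modulated infrared) light takes to bounce back off the scene. The core relation is simple; here's a minimal illustration of the principle, not SoftKinetic's actual pipeline:

```python
def tof_depth_m(round_trip_s, c=3.0e8):
    """Depth from the round-trip time of a light pulse: the light
    travels out to the object and back, so halve the total path."""
    return c * round_trip_s / 2.0

# Light returning after roughly 6.7 nanoseconds puts the object at ~1 m:
depth = tof_depth_m(6.67e-9)
```

At close range those round trips are fractions of a nanosecond, which is why real sensors measure the phase of a modulated signal rather than timing individual pulses.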
