Alps Electric integrates motion sensors and eye detection into vehicle cockpit of the future (video)

Residing in hall space a fair distance away from the likes of Toyota and Sony, the automotive division of Alps Electric was demonstrating a forward-looking vehicle interface at CEATEC 2012. Combining the company's existing capacitive touch technology with motion sensors and eye-movement cameras, the system centers on the multimodal commander -- that mysterious-looking orb located below the gear stick. You navigate by waving a hand over the device, swiping it or rotating the orb like a dial, moving through weather, music and map programs that are all integrated into the car's touchscreen, while an overhanging motion sensor also detects where your hand is headed. An Alps spokesman said this means the system can try to predict your intentions, adjusting the UI before you reach the controls. We've got a hands-on video from the pretty busy showroom -- and more impressions -- after the break.

Alps Electric integrates motion sensors and eye detection into vehicle cockpit of the future (video) originally appeared on Engadget on Wed, 03 Oct 2012 08:21:00 EDT. Please see our terms for use of feeds.

Tokyo University of Science shows off robotic suit powered by pneumatic artificial muscles (video)

What can one do with a robot suit? Well, it's certainly not limited to lifting sacks of rice, but that was exactly what we got to do at CEATEC courtesy of Koba Lab from Tokyo University of Science. First seen in 2009, the magic behind this 9kg kit is the pair of pneumatic artificial muscles (aka McKibben artificial muscles) on the back, which are made by industrial equipment manufacturer Kanda Tsushin. When pressurized with air using electrical components from KOA Corporation, the lightweight, loosely woven PET tubes contract and consequently support the user's back, shoulders and elbows. As such, our arms were able to easily hold two more sacks of rice (for a total of 50kg) until the demonstrator deflated the muscles. Check out our jolly hands-on video after the break.

Tokyo University of Science shows off robotic suit powered by pneumatic artificial muscles (video) originally appeared on Engadget on Tue, 02 Oct 2012 21:02:00 EDT.

Fujitsu demos ad transmission technology, sends info from TV to handset via smartphone camera (video)

Another easter egg at Fujitsu's CEATEC booth was a system for transmitting coupons, URLs and other digital information from a TV screen to a user's smartphone. We'll back up a bit: the data ends up on-screen in the first place thanks to information embedded in light flashing at varying levels of brightness (the flicker is too quick for the human eye to detect). Theoretically, when viewers are watching a commercial, they'll see a prompt to hold their phone's camera up to the screen, and doing so will bring up a corresponding coupon or website on the handset -- recognition takes about two to three seconds here. The embedded information covers the entire panel, so users don't need to point their device at a particular section of the screen.
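
The broad technique -- keying data into sub-perceptual brightness changes that a camera can sample -- can be sketched in a few lines. This is our own illustrative reconstruction, not Fujitsu's actual (unpublished) encoding; the offset size and framing are invented for the example.

```python
# Illustrative sketch of brightness-modulated data embedding: each bit
# becomes a tiny +/- brightness offset applied to one video frame. At TV
# frame rates the flicker is invisible to the eye, but a phone camera
# sampling frames can recover the bits by thresholding each frame's
# average brightness against the baseline.

DELTA = 2  # per-frame brightness offset in 0-255 units (assumed value)

def encode_bits(data: bytes) -> list[int]:
    """Map each bit of `data` (MSB first) to a per-frame offset."""
    offsets = []
    for byte in data:
        for i in range(7, -1, -1):
            offsets.append(DELTA if (byte >> i) & 1 else -DELTA)
    return offsets

def decode_offsets(offsets: list[int]) -> bytes:
    """Recover the bytes by thresholding each measured frame offset."""
    out = bytearray()
    for i in range(0, len(offsets), 8):
        byte = 0
        for off in offsets[i:i + 8]:
            byte = (byte << 1) | (1 if off > 0 else 0)
        out.append(byte)
    return bytes(out)
```

At 60 frames per second, one bit per frame works out to only 60 bits per second, which is consistent with a short URL or coupon code taking a couple of seconds to recognize.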

In Fujitsu's demo, pointing a smartphone at the TV pulled up a website on the phone. It only took about a second for the URL to pop up on the device, and there was no noticeable flickering on the TV itself (essentially, the picture looks identical to what you'd see on a non-equipped model, since your eye won't notice the code appearing at such a high frequency). The company says this technology works at a distance of up to two or three meters. Head past the break to take a look at the prototype in action.

Fujitsu demos ad transmission technology, sends info from TV to handset via smartphone camera (video) originally appeared on Engadget on Tue, 02 Oct 2012 16:42:00 EDT.

Rohm 5Wh hydrogen fuel cells power up smartphones, ready for the trash after one charge

Rohm's hydrogen fuel cells are meant to power smartphones and other mobile devices, but unlike other juicing-up solutions, the cells are only good for one charge. Rohm says disposable fuel cells can be made smaller and lighter than their multi-use counterparts, and as the only byproduct is hydrogen, the company is touting the cells' eco-friendliness. The system generates electricity by using hydrogen that's created by the reaction of a metal material and water. While the device we saw here at CEATEC is a prototype, Rohm may offer its recharging system as both a smartphone case and a USB-attachable juicepack. Each offers 5Wh and can fully charge a handset once. There's also a 200W power generator, which certainly stretches the meaning of portable but can keep a laptop, LCD TV and a peripheral or two going for three to four hours. Rohm says its fuel cells will see a commercial release some time in 2013; for now you can get a sneak peek in our hands-on gallery below.
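
The capacity claims are easy to sanity-check with back-of-envelope arithmetic (our own numbers, not Rohm's; we assume a nominal 3.7V lithium cell and lossless conversion, so real-world figures would be somewhat lower).

```python
# Rough equivalence between the 5Wh pack and a phone battery, plus the
# implied energy budget of the 200W generator. Assumes a nominal 3.7V
# cell voltage and ignores conversion losses -- both simplifications.

def wh_to_mah(watt_hours: float, volts: float = 3.7) -> float:
    """Convert pack energy to equivalent battery charge at a given voltage."""
    return watt_hours / volts * 1000

# 5Wh is roughly one full charge of a 2012-era handset battery:
phone_equiv_mah = wh_to_mah(5.0)  # about 1,350mAh

# If the attached loads drew the full 200W, three to four hours of runtime
# would imply roughly 600-800Wh of stored energy:
energy_midpoint_wh = 200 * 3.5  # 700Wh at the midpoint
```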

Zach Honig contributed to this report.

Rohm 5Wh hydrogen fuel cells power up smartphones, ready for the trash after one charge originally appeared on Engadget on Tue, 02 Oct 2012 11:22:00 EDT.

NTT DoCoMo translation app converts languages in real time (hands-on video)

Last year at CEATEC, we saw NTT DoCoMo demo its translation app, which made life easier by translating a Japanese menu into English text. This time around, the carrier is showing off its new Hanashite Hon'yaku service for Android devices, which can translate spoken Japanese to English and vice versa (it supports a total of 10 languages, including French, German and Korean). In addition to providing an on-screen translation, the system reads out your speaking partner's words in your language. To use the service, you need an Android (2.2 and higher) device running on either the carrier's sp-mode or mopera U plan. Provided you fit those requirements, you'll simply have to dial the other party, speak into the phone and wait for it to play back your words in a foreign tongue.
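
The flow DoCoMo describes is a classic three-stage pipeline: speech recognition, machine translation, then speech synthesis, all running server-side. The sketch below only shows the shape of that pipeline; the stage implementations, function names and the canned phrase table are placeholders of our own, not DoCoMo's API.

```python
# Toy mock of an ASR -> MT -> TTS pipeline. Each stage is a stand-in:
# a real service would send audio to cloud recognizers and translators.

CANNED_MT = {("ja", "en"): {"出身はどこですか": "Where are you from?"}}

def recognize(audio: str, lang: str) -> str:
    # Placeholder ASR: pretend the "audio" is already transcribed text.
    return audio

def translate(text: str, src: str, dst: str) -> str:
    # Placeholder MT: look up a canned phrase, else pass through.
    return CANNED_MT.get((src, dst), {}).get(text, text)

def speak(text: str, lang: str) -> str:
    # Placeholder TTS: tag the text instead of synthesizing audio.
    return f"[{lang} audio] {text}"

def translate_call(audio: str, src: str, dst: str) -> str:
    """Run one utterance through recognition, translation and read-back."""
    return speak(translate(recognize(audio, src), src, dst), dst)
```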

Of course, you can also use the service in person, which is exactly what we did at DoCoMo's booth. When we gave it a test run with some simple questions ("Where are you from?", "What time is it?"), the app had no trouble spitting back those phrases in Japanese so the DoCoMo rep could respond. When he answered in Japanese, the translation to English was equally seamless, taking just a second or two to communicate that he is from Japan. Though the app is free, you'll have to pay call and data charges (using the service for face-to-face conversation only entails a data fee). The cross-cultural barriers will break down starting November 1st, but you can get a glimpse of the service in action just after the break.

NTT DoCoMo translation app converts languages in real time (hands-on video) originally appeared on Engadget on Tue, 02 Oct 2012 10:29:00 EDT.

NTT DoCoMo’s i beam tablet prototype is driven by your eyes (video)

Another prototype from DoCoMo aimed at Nihon's commuters, the i beam concept tablet forgoes touch altogether, allowing users (once they're at the specified 'sweet spot') to navigate around apps and screens using their eyes. Two sensors along the bottom edge of the tablet track both of your eyes, and after a slightly laborious calibration setup, we were able to tour around the prototype slab's features without laying a finger on it. The navigational dot was a little erratic, but we'll put that down to prototype nerves; the tablet was otherwise able to follow our eye-line and do what we wanted it to.

Returning to the home screen by targeting the kill box in the top right corner proved to be the most difficult thing -- we soon resorted to tapping at the screen for that. DoCoMo showcased an eye-controlled game, alongside picture galleries, a web browser and a reader app. The e-book client seemed to be the most heavily involved, with the ability to look up words with a hard-stare, and flip pages by eyeing the two lower corners. The Japanese carrier isn't planning a consumer launch any time soon -- and the hardware comes with a pretty pronounced chin at the moment, but if you like staring at someone staring at a tablet, our eyes-on is after the break.

NTT DoCoMo's i beam tablet prototype is driven by your eyes (video) originally appeared on Engadget on Tue, 02 Oct 2012 09:32:00 EDT.

NTT DoCoMo hands-free videophone prototype replaces that off-center webcam stare with your digital doppelganger (video)

In a sort of reverse-Project Glass, one of DoCoMo's latest prototypes flips its cameras back at the wearer. This hands-free videophone headset ties together seven separate cameras, each recording 720p video through wide-angle lenses. Aside from the single camera pointing behind the user (and capturing the background image), the rest point at the user's face, each recording a different quadrant. These are composited together to create a three-dimensional avatar of the user, which is then broadcast to the other caller. The model nods, blinks and moves -- all based on the camera footage, and all in real time.

In its current guise, the bottom half of the face is still composed from high-resolution stills captured beforehand, but the program is able to animate the mouth based on the words and tones that the built-in mic picks up. NTT DoCoMo had some lighter, slightly less clunky future prototypes on show, and suggested that the headset could have medical applications, embedding further sensors that could gauge blood pressure, pulse and temperature, and possibly broadcast this data during a call to your future physician. Work is currently underway to utilize smaller, higher-quality sensors. We take a closer look at CEATEC after the break.

NTT DoCoMo hands-free videophone prototype replaces that off-center webcam stare with your digital doppelganger (video) originally appeared on Engadget on Tue, 02 Oct 2012 09:23:00 EDT.

NEC Medias Tab UL runs Android 4.0, weighs just over half a pound (hands-on)

Folded in between DoCoMo R&D prototypes and One Piece-themed smartphones unlikely to make it across the Pacific, NEC's new Android tablet caught our eye. The 7-inch NEC Medias Tab UL is one very svelte slab. Measuring in at just 7.9mm (0.3 inches) thick and weighing a mere 250g (0.55 pounds), the tablet still manages to house a 3,100mAh battery and a Snapdragon MSM8960 1.5GHz dual-core processor. Compared with the Nexus 7, Google's own effort looks a little weighty and thick against this white-finished tablet. Performance from the dual-core chip is also suitably impressive, despite the curious DoCoMo-decked Android skin coating the Ice Cream Sandwich OS.

In true Japanese style, there's a TV aerial embedded in the side, and while it won't broadcast the crisp high-definition delights of NOTTV, there's plenty of terrestrial viewing available -- if you stay in the Land of the Rising Sun. We were pleasantly surprised by its crisp WXGA screen, which looks to be TFT. The 1,280 x 800 display meant videos and websites looked sharp, while there was barely any color degradation at wider angles. On DoCoMo's network, users can expect to see download speeds of up to 75Mbps and upload speeds of up to 25Mbps. The tablet is now on sale across Japan, but there's still no word yet on it launching elsewhere.

NEC Medias Tab UL runs Android 4.0, weighs just over half a pound (hands-on) originally appeared on Engadget on Tue, 02 Oct 2012 07:43:00 EDT.

Fujitsu eye-tracking tech uses built-in motion sensor, infrared LED for hands-free computing (video)

Eye-tracking technology looks to be one of the major tropes at CEATEC this year. One of many companies demoing a gaze-following setup is Fujitsu, which is showing off a prototype desktop PC with a built-in sensor and infrared LED. This configuration should be cheaper than many other eye-controlled solutions out there, as the components are integrated directly into the computer and no external hardware is needed. It's sweet and simple: the camera captures the reflection of light on the user's eye, and image processing technology then calculates the user's viewing angle to allow for hands-free navigation on-screen.
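
The camera-plus-IR-LED arrangement Fujitsu describes matches the standard pupil-center/corneal-reflection (PCCR) approach: the offset between the pupil center and the LED's glint on the cornea varies roughly linearly with gaze angle, so a short calibration can map it to screen coordinates. The sketch below is a generic, simplified version of that idea (a per-axis linear fit); the function names and numbers are our own, not Fujitsu's.

```python
# Simplified PCCR gaze mapping: fit screen = a * offset + b per axis from
# calibration samples, then map a measured pupil-glint offset (in camera
# pixels) to a screen coordinate.

def calibrate(samples):
    """Least-squares line fit over (offset, screen_coord) pairs for one axis."""
    n = len(samples)
    mean_o = sum(o for o, _ in samples) / n
    mean_s = sum(s for _, s in samples) / n
    var = sum((o - mean_o) ** 2 for o, _ in samples)
    cov = sum((o - mean_o) * (s - mean_s) for o, s in samples)
    a = cov / var
    return a, mean_s - a * mean_o

def gaze_to_screen(offset_x, offset_y, cal_x, cal_y):
    """Map a pupil-glint offset to an on-screen point using both axis fits."""
    ax, bx = cal_x
    ay, by = cal_y
    return ax * offset_x + bx, ay * offset_y + by
```

A production system also has to compensate for head movement and the cornea's nonlinear geometry, which is where most of the real engineering effort goes.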

We got a brief eyes-on with Fujitsu's demo, which shows off the eye-controlled tech working with a map application. Even without any detectable calibration, the system did a respectable job of navigating around Tokyo based on how we moved our eyes. Panning from right to left works especially seamlessly, but moving up and down required a bit more effort -- we caught ourselves moving our whole head a few times. This is an early demonstration, of course, though Fujitsu has already enumerated several applications for this technology, from assisting disabled users to simply eliminating the need to look down at the mouse and keyboard. See the gaze detection in action in our hands-on video past the break.

Fujitsu eye-tracking tech uses built-in motion sensor, infrared LED for hands-free computing (video) originally appeared on Engadget on Tue, 02 Oct 2012 06:46:00 EDT.

Toyota’s Smart Insect concept EV packs Kinect motion sensor, voice recognition (video)

Toyota is showing off its new Smart Insect prototype at the company's CEATEC booth. The fully electric car charges via a standard 100-volt AC outlet, and it's decked out with gull-wing doors and motion detection courtesy of Microsoft's Kinect. The on-board motion sensors allow the car to recognize its owner based on face and body shape, and it predicts the owner's behavior by analyzing movement and determining when to open the door, for example. (It also allows for the front and rear displays to show a welcome message when the owner approaches the car.) There's also voice recognition for opening the car door and other functions, with a speaker on the hood of the car and dashboard-mounted "dialogue monitors" on the front and back.

The tech carries through to the Insect's interior, which sports a wireless charging pad, a dash-mounted monitor that connects to the driver's handset and a button for dialing up Toyota's virtual agent. As a connected car, the Insect naturally ties in with entertainment and navigation services (in this case, via Toyota's Smart Center). There's also integration with a home energy management system, which allows the owner to adjust air conditioning and lock the front door via a smartphone app. As this is a proof of concept -- and one we couldn't test out, at that -- it's unclear how well these features work, and it's unlikely that we'll ever see the prototype make it to market. Still, it's fun to dream, and you can do that by tuning into our hands-on video just past the break.

Toyota's Smart Insect concept EV packs Kinect motion sensor, voice recognition (video) originally appeared on Engadget on Tue, 02 Oct 2012 05:35:00 EDT.
