The NBA faces a fresh challenge now that it offers all of its player statistics to the public -- how does it keep generating stats that hold the interest of basketball fans? The league's solution is a multi-year agreement to install Stats LLC's SportVU motion-tracking system in every arena (15 teams had already adopted the technology on their own). As of the 2013-14 season, every NBA arena will have a six-camera setup that produces a steady stream of player data covering ball possession, distance traveled, proximity and speed. The NBA's website, NBA Game Time and NBA TV will all use that information to expand game stats beyond what we see today, with heat maps and specific details on each possession. There's no telling how useful the extra knowledge will be, but we won't be shocked if it helps settle a few sports bar arguments.
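The league hasn't published the math behind these stats, but distance and speed figures like SportVU's can be derived from raw (x, y) tracking coordinates. Here's a minimal sketch, assuming court positions in feet sampled at a fixed frame rate (the function name and the 25 fps default are our own illustration, not SportVU's actual pipeline):

```python
import math

def travel_stats(samples, fps=25):
    """Total distance (feet) and average speed (ft/s) for one player,
    given a list of (x, y) court coordinates sampled at `fps` frames
    per second."""
    # Sum the straight-line distance between consecutive samples.
    dist = 0.0
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        dist += math.hypot(x1 - x0, y1 - y0)
    duration = (len(samples) - 1) / fps  # seconds elapsed
    return dist, (dist / duration if duration else 0.0)

# A player moving one foot per frame for one second:
path = [(i, 0) for i in range(26)]
distance, avg_speed = travel_stats(path)  # 25 ft, 25 ft/s
```

Proximity stats would fall out of the same data by comparing two players' coordinates frame by frame.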
Via: AP (Yahoo)
Researchers at the University of Tokyo's Ishikawa Oku Lab have been hard at work on a camera system that can track fast-moving objects incredibly well, and the technology may change the way sports like baseball and soccer are televised. Recently, the team has entered the next phase of testing: taking the system outside to see if it will perform as well as it has in a lab setting. If all goes according to plan, they expect it'll be ready for broadcast use in roughly two years.
Demos of the tech are pretty impressive, as you can see in the video below (warning: not recommended viewing for those easily prone to motion sickness). To get the ping-pong-ball-centric shots, the system uses a group of lenses and two small mirrors that pan, tilt and move so the camera itself doesn't have to. The mirrors rely on a speedy image-tracking system that follows movement rather than predicting it. Swapping the camera out for a projector also has some interesting applications -- it can paint digital pictures on whatever it's tracking. Sounds like the perfect gadget for folks who wish their table tennis balls looked like emoji.
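The lab hasn't published its control code, but the follow-don't-predict idea can be sketched as a simple proportional loop: each frame, nudge the mirrors toward wherever the tracker last saw the object, instead of extrapolating where it will be next. All names and the gain value below are illustrative assumptions:

```python
def steer(pan, tilt, obj_x, obj_y, frame_w, frame_h, gain=0.1):
    """One step of a follow-the-measurement loop: move the pan/tilt
    mirrors toward the object's current pixel position so it drifts
    back to the center of the frame. No motion prediction involved."""
    err_x = obj_x - frame_w / 2  # horizontal offset from frame center
    err_y = obj_y - frame_h / 2  # vertical offset from frame center
    return pan + gain * err_x, tilt + gain * err_y

# Object detected 20 px right of center in a 640x480 frame:
pan, tilt = steer(0.0, 0.0, 340, 240, 640, 480)  # pan nudged by 2.0
```

The real system's trick is doing this at very high frame rates with lightweight mirrors, so a purely reactive loop is fast enough to keep even a smashed ping-pong ball centered.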
Source: Ishikawa Oku Laboratory
While patrolling the halls of the CHI 2013 Human Factors in Computing conference in Paris, we spied a research project from MIT's Media Lab called "Smarter Objects" that turns Minority Report tech on its head. The researchers figured out a way to map software functionality onto tangible objects like a radio, light switch or door lock through an iPad interface and a simple processor / WiFi transceiver in the object. Researcher Valentin Huen explains that "graphical user interfaces are perfect for modifying systems," but operating them on a day-to-day basis is much easier using tangible objects.
To that end, the team developed an iPad app that uses motion-tracking technology to "map" a user interface onto different parts of an object. The example we saw was a simple radio with a pair of dials and a speaker; when the iPad's camera was pointed at it, a circular interface and a menu system popped up that cannily tracked the radio. From there, Huen mapped various songs onto different positions of a knob, allowing him to control his playlist by turning it -- a simple, manual interface for selecting music. He was even able to activate a second speaker by drawing a line to it, then "cutting" the line to shut it off. We're not sure when, or if, this kind of tech will ever make it into your house, but the demo we saw (see the pair of videos after the break) seemed impressively ready to go.
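The researchers haven't released the app's code, but the knob-to-playlist mapping can be sketched by dividing the dial's rotation into equal sectors, one per song; the function and sector scheme below are our own illustration of the idea, not the Media Lab's implementation:

```python
def song_for_angle(angle, songs):
    """Map a knob angle in degrees to one of N songs by splitting the
    full 360-degree rotation into equal sectors."""
    sector = 360 / len(songs)          # degrees of rotation per song
    return songs[int(angle % 360 // sector)]

playlist = ["Song A", "Song B", "Song C", "Song D"]
song_for_angle(100, playlist)  # second quarter-turn -> "Song B"
```

The iPad only needs to read the knob's orientation from the camera and look up the matching track, which is why the physical radio itself can stay dumb apart from its WiFi transceiver.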