Tag Archives: UniversityOfToronto
Researchers teach a computer to compose sonnets like Shakespeare
Handheld skin printer could help heal burn victims
ICYMI: Genetically-based cancer meds, taste’s base and more
Doctors grow tumors that roll up like toilet paper
Mandatory South Korean parental control app is a security nightmare
AeroVelo’s human-powered helicopter bags $250,000 Sikorsky Prize
We're sure AeroVelo team members think every sleepless night and pedal push is worth it now that they can add the prestigious $250,000 Sikorsky Prize to their pile of bragging rights. They completely demolished all the requirements needed to win the human-powered helicopter competition during one of their recent attempts. Atlas, their flying contraption, stayed in the air for 64.11 seconds, reached a max altitude of 3.3 meters (10.8 feet) and never meandered beyond the designated 10 x 10 meter (33 x 33 feet) area. The University of Toronto's creation was locked in a head-to-head battle with the University of Maryland's Gamera chopper for quite some time, but it's finally bagged the prize that had remained unclaimed for 33 long years. That's a tremendous accomplishment for anyone, especially for a project with humble beginnings, and if Leonardo da Vinci were still alive, he'd extend a big congratulazioni.
Filed under: Misc
Via: Popular Mechanics
Source: AeroVelo
University of Toronto student tech shoots HDR video in real-time (eyes-on)
Sure, you love the HDR pictures coming from your point-and-shoot, smartphone or perhaps even your Glass. But what if you want to Hangout in HDR? An enterprising grad student from the University of Toronto named Tao Ai -- under the tutelage of Steve Mann -- has figured out how to shoot HDR video in real-time. The trick was accomplished using a Canon 60D DSLR running Magic Lantern firmware and an off-the-shelf video processing board with a field programmable gate array (FPGA), plus some custom software to process the video coming from the camera. It works by taking in a raw feed of alternately under- and overexposed video and storing it in a buffer, then merging the exposures on their way to a screen. What results is the virtually latency-free 480p HDR video at 60 frames per second seen in our video after the break.
When we asked whether higher resolution and faster frame rate output is possible, we were told that the current limitations are the speed of the imaging chip on the board and the bandwidth of the memory buffer. The setup we saw utilized a relatively cheap $200 Digilent board with a Xilinx chip, but a 1080p version is in the works using a more expensive board and DDR3 memory. Of course, the current system is for research purposes only, but the technology could be applied in consumer devices -- as long as they have an FPGA and offer open source firmware. So, should OEMs get with the program, we could have HDR moving pictures to go with our stationary ones.
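For a feel of what merging alternating exposures involves, here's a minimal sketch in Python/NumPy. This is purely illustrative -- the actual system runs a custom pipeline on an FPGA, and the weighting and tonemapping below are generic textbook choices, not the Mann lab's algorithm. The `ev_gap` parameter (stops between the two exposures) is an assumption.

```python
import numpy as np

def merge_exposures(under, over, ev_gap=2.0):
    """Fuse an underexposed and an overexposed frame (float arrays in
    [0, 1]) into one displayable frame via a per-pixel weighted blend.
    Illustrative only; not the actual FPGA pipeline."""
    # Weight each frame by how well-exposed each pixel is: values near
    # mid-gray (0.5) get weight ~1, clipped values near 0 or 1 get ~0.
    w_under = 1.0 - np.abs(under - 0.5) * 2.0
    w_over = 1.0 - np.abs(over - 0.5) * 2.0
    # Scale the dark frame up to the bright frame's exposure level
    # before blending (ev_gap = stops between the two exposures).
    lin_under = under * (2.0 ** ev_gap)
    total = w_under + w_over + 1e-6  # avoid divide-by-zero
    hdr = (w_under * lin_under + w_over * over) / total
    # Crude global tonemap to compress the result back into [0, 1].
    return hdr / (1.0 + hdr)
```

In the real system this merge has to happen per frame pair at 60 fps, which is exactly the kind of fixed, data-parallel arithmetic an FPGA handles well.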
Google acquires neural network startup that may help it hone speech recognition and more
Mountain View has just picked up some experts on deep neural networks with its acquisition of DNNresearch, which was founded last year by University of Toronto professor Geoffrey Hinton and graduate students Alex Krizhevsky and Ilya Sutskever. The group is being brought into the fold after developing a solution that vastly improves object recognition. As a whole, advances in neural nets could lead to the development of improved computer vision, language understanding and speech recognition systems. We reckon that Page and Co. have a few projects in mind that would benefit from such things. Both students will be transitioning to Google, while Hinton will split his attention between teaching and working with the search giant.
Filed under: Misc, Internet, Google
Via: TechCrunch
Source: University of Toronto
Autodesk researchers develop ‘magic finger’ that reads gestures from any surface (video)
By combining a camera that detects surfaces with one that perceives motion, Canadian university researchers and Autodesk have made a sensor that reads finger gestures based on which part of your body you swipe. The first camera can detect pre-programmed materials like clothing, which would allow finger movements made across your pants or shirt to activate commands that call specific people or compose an email, for instance. Autodesk sees this type of input as a possible complement to smartphones or Google Glass (which lacks a useful input device), though it says the motion detection camera isn't accurate enough yet to replace a mouse. Anyway, if you wanted that kind of device for your digits, it already exists -- in spades.
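The interesting part of the idea is that the same gesture means different things on different surfaces. A toy sketch of that dispatch logic might look like the following -- the surface names, gesture names and command table here are all invented for illustration, not taken from Autodesk's prototype:

```python
# Hypothetical table mapping (recognized surface, detected gesture)
# pairs to commands. In the real prototype the surface comes from a
# texture-classifying camera and the gesture from a motion camera.
SURFACE_COMMANDS = {
    ("pants", "swipe_right"): "call_contact:home",
    ("shirt", "swipe_up"): "compose_email",
    ("desk", "double_tap"): "play_pause",
}

def dispatch(surface: str, gesture: str) -> str:
    """Return the command bound to this surface/gesture pair,
    or a no-op when the combination isn't mapped."""
    return SURFACE_COMMANDS.get((surface, gesture), "no_op")
```

The lookup itself is trivial; the hard problems the researchers describe are upstream, in classifying the surface material and tracking the finger accurately enough.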
Filed under: Displays, Wearables
Autodesk researchers develop 'magic finger' that reads gestures from any surface (video) originally appeared on Engadget on Mon, 22 Oct 2012 08:49:00 EDT.