Eye-tracking Mario demo offers glimpse at the next generation

News By Graham Templeton Nov. 17, 2013 10:01 am
3D printing has come a long way since the early days, back when it seemed its only utility to the everyday citizen was making little custom brackets or maybe an SD card holder. Now we know that 3D printing has all sorts of other, more exciting possible applications, but it’s surprising how a small, custom-made piece of plastic can still be the lynchpin allowing some truly amazing accomplishments. The latest example? A $25, 3D-printed bracket that holds a webcam onto the stock Google Glass unit — but the camera is pointed in, not out.
Eye-tracking, or in this case pupil-tracking, makes sense as an interface for a device that sits directly over your eyes. The problem is that eye movement has historically been a rather crude means of control, since entering a command means looking away from the very thing you’re trying to control. That’s especially problematic for something like Glass, which already puts the interface in a tiny portion of the user’s overall frame of vision. Yet when a researcher named Brandyn White had the idea for a pupil-tracking algorithm, he knew it could be useful for at least one application.
http://www.youtube.com/watch?v=QSn6s3DPTSg


As a proof of concept for the pupil-tracking system, White produced a video of it being used to control a game of Mario. Probably the most recognizable video game of all time, Super Mario Bros. uses an input logic familiar enough to immediately communicate the abilities of this eye-tracking rig. It doesn’t seem to take too violent a movement to provoke a response from Mario; in particular, jumping doesn’t seem to require the player to look too far away from the action.
Beyond the stripped-down webcam and the specially printed bracket, what makes this system work is the software. Special image filters bring the pupil into sharp relief, letting the camera easily keep tabs on both its size and position. The process begins with a calibration routine that takes note of where the user’s pupils point when the user looks at a particular segment of the screen. Thus, different segments of the screen can be associated with different commands.
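The calibration-and-mapping step described above can be sketched in a few lines. This is a minimal illustration, not code from the demo: the function names, the segment labels, and the nearest-centroid mapping are all assumptions made for the sake of the example.

```python
# Hypothetical sketch of the calibration routine: record the pupil's
# (x, y) position while the user looks at each screen segment, average
# the samples into a centroid per segment, then map any new pupil
# reading to the segment with the nearest centroid. Each segment is
# bound to a command. All names here are illustrative.
import math

def calibrate(samples):
    """samples: {segment: [(x, y), ...]} -> {segment: centroid}"""
    centroids = {}
    for segment, points in samples.items():
        n = len(points)
        centroids[segment] = (sum(p[0] for p in points) / n,
                              sum(p[1] for p in points) / n)
    return centroids

def classify(centroids, pupil):
    """Return the segment whose calibration centroid is closest to pupil."""
    return min(centroids, key=lambda s: math.dist(centroids[s], pupil))

# Invented segment -> command bindings for a Mario-style demo.
COMMANDS = {"left": "walk_left", "right": "walk_right", "top": "jump"}

centroids = calibrate({
    "left":  [(0.20, 0.50), (0.22, 0.48)],
    "right": [(0.80, 0.52), (0.78, 0.50)],
    "top":   [(0.50, 0.15), (0.52, 0.17)],
})
print(COMMANDS[classify(centroids, (0.76, 0.49))])  # gaze lands near the right segment
```

In the real system the pupil coordinates would come from the filtered webcam frames; here they are plain numbers so the mapping logic stands on its own.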
The big problem with eye-tracking is, of course, that our eyes are constantly twitching around involuntarily. White pays close attention to the radius of the pupil, perhaps foreshadowing a system that uses rapid pupil dilation unrelated to light levels as a differentiator between real and involuntary commands. He suggests that the new input method could let people use Glass discreetly when it might otherwise be rude to do so. Whether we want computers to invade those few remaining unconnected moments is another matter entirely.
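The idea hinted at here — treating a rapid pupil-radius change as the marker of a deliberate command — could be sketched as a simple threshold over consecutive radius readings. Nothing in the demo confirms this is how White's software works; the function name and threshold below are invented for illustration.

```python
# Hypothetical filter: flag a gaze event as deliberate only when the
# pupil radius changes faster than slow, light-driven drift would
# explain. The 15% threshold is an arbitrary placeholder.
def is_deliberate(radii, threshold=0.15):
    """radii: recent pupil-radius samples in any consistent unit.
    Returns True if the latest sample jumped sharply from the previous one."""
    if len(radii) < 2:
        return False
    prev, cur = radii[-2], radii[-1]
    return abs(cur - prev) / prev > threshold

print(is_deliberate([3.0, 3.1]))  # slow drift
print(is_deliberate([3.0, 3.8]))  # rapid dilation
```

A production version would also need to discount ambient-light changes, which dilate the pupil on their own, before trusting radius as a signal.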
Now read: Google Glass 2 offers minor but important hardware updates


