Muscle and Eye Control

All that Natal stuff is great, but wouldn't it be good if there were a way to control technology without looking like you're either conducting an orchestra or staking a claim as the world's worst body-popper? Luckily, there are some less physically demanding ideas going around. The first is muscle control. A team at Microsoft Research has already created a special sensor arm-band that reads the electrical activity going through your nerves and uses this to work out which fingers are being moved and which hand shapes are being made. In true geek genius style, one of the first applications to benefit is Guitar Hero, which the researchers can be seen playing without the controller for the ultimate in virtual air guitar action. However, more serious demos combine muscle control with Surface, enabling specific fingers to operate specific tools, or allowing the operating system to register a greater range of pinch and throw gestures than would be possible with multi-touch alone. The combination even allows Surface to sense pressure more accurately - ideal for natural drawing and painting applications.

Like, what else would you do with this kind of tech? Rock on, Microsoft Research dudes!
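
The underlying idea is simpler than it sounds: each electrode on the arm-band picks up a voltage trace, and a classifier trained on example recordings maps the pattern of activity across channels to a hand shape. Here's a deliberately minimal sketch of that pipeline in Python - the channel count, window size, gesture labels and synthetic signals are all invented for illustration, and real electromyography (EMG) systems use far richer features and classifiers than this.

```python
# Toy sketch of muscle (EMG) control - not Microsoft Research's actual
# system. Each electrode channel yields a voltage signal; a classifier
# maps per-channel signal energy to a hand shape. All numbers invented.
import numpy as np

CHANNELS = 8          # hypothetical electrodes around the forearm
WINDOW = 256          # samples per classification window
GESTURES = ["rest", "index_press", "pinch", "fist"]

def features(window):
    """Root-mean-square energy per channel - a classic, simple EMG feature."""
    return np.sqrt(np.mean(window ** 2, axis=1))

def train_centroids(labelled_windows):
    """Average the feature vectors recorded for each gesture."""
    return {g: np.mean([features(w) for w in ws], axis=0)
            for g, ws in labelled_windows.items()}

def classify(window, centroids):
    """Pick the gesture whose training centroid is nearest in feature space."""
    f = features(window)
    return min(centroids, key=lambda g: np.linalg.norm(f - centroids[g]))

# Demo with synthetic signals: each gesture activates different channels.
rng = np.random.default_rng(0)
def fake_emg(active):
    gain = np.ones(CHANNELS)
    gain[list(active)] = 5.0  # 'active' channels fire harder for this gesture
    return rng.normal(size=(CHANNELS, WINDOW)) * gain[:, None]

training = {g: [fake_emg(range(i, i + 2)) for _ in range(20)]
            for i, g in enumerate(GESTURES)}
centroids = train_centroids(training)
print(classify(fake_emg(range(2, 4)), centroids))  # -> "pinch"
```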

Next comes eye control, where a sensor tracks the motion of your eyes and uses this data to work out where, and what, you're currently focused on. Eye tracking is already an established technology in assistive computing and in marketing research, where it's used to provide an input method for severely motion-impaired PC users, or to track what people are looking at for branding and usability studies. As the technology becomes cheaper and less obtrusive, however, it may well become a viable control method for desktop PCs, mobile devices and digital cameras.
Interestingly, Canon actually used an early eye-tracking technology to control focus in its 1990s EOS 5/A2e and 7E film SLRs - an idea that hasn't been copied by digital cameras since.

This Canon EOS SLR let you choose the focus point simply by looking at it
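
For the curious, here's roughly how gaze turns into input in those assistive set-ups. There's no mouse button to press, so a "dwell" timer typically stands in for a click: stare at the same spot for long enough and the system treats it as a selection. The sketch below is illustrative only - the pixel tolerance and dwell threshold are invented numbers, not taken from any shipping product.

```python
# Minimal sketch of dwell-based gaze selection, as used in assistive
# input systems. The 100 px tolerance and 800 ms threshold are invented.
DWELL_MS = 800        # how long the gaze must linger to count as a click
RADIUS_PX = 100       # how far the gaze may wander and still be "lingering"

def dwell_clicks(gaze_samples):
    """gaze_samples: iterable of (timestamp_ms, x, y) from an eye tracker.
    Yields (x, y) click events whenever the gaze dwells in one spot."""
    anchor = None                     # (t, x, y) where the current dwell began
    for t, x, y in gaze_samples:
        if anchor and (x - anchor[1]) ** 2 + (y - anchor[2]) ** 2 <= RADIUS_PX ** 2:
            if t - anchor[0] >= DWELL_MS:
                yield (anchor[1], anchor[2])   # fire a click at the dwell point
                anchor = None                  # reset so we don't click repeatedly
        else:
            anchor = (t, x, y)                 # gaze moved on; restart the timer

# 60 Hz samples fixating near (400, 300) produce a single click there.
samples = [(i * 16, 400 + (i % 3), 300) for i in range(80)]
print(list(dwell_clicks(samples)))  # -> [(400, 300)]
```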

In October, researchers at NTT DoCoMo in Japan demonstrated working eye control of a media player on a mobile phone. Instead of an optical system, the controls took their input from a sensor built into a pair of in-ear headphones, which detected the changes in the body's physical state that occur when we move our eyes. Meanwhile, Microsoft's Craig Mundie (him again!) demonstrated another eye-control system during a summer 2009 tour of US colleges. Looking at a transparent screen packed with a grid of web-page thumbnails, Mundie used optical eye tracking to select and zoom into pages, with the software interpreting what he was looking at and doing its best to react in an intuitive way. It's not hard to imagine, say, FPS games or image-editing apps working along similar lines.
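
Mundie's demo boils down to two steps repeated over and over: hit-test the gaze point against the grid to find which thumbnail it falls on, then zoom in once the gaze has held there for a moment. Here's a rough, hypothetical sketch of that loop - the screen size, grid shape and trigger time are all made-up numbers, not details of Microsoft's actual software.

```python
# Rough sketch of the grid-of-thumbnails interaction described above.
# Hit-test the gaze point against a grid of page thumbnails and zoom
# into whichever cell holds the gaze. All constants are invented.
SCREEN_W, SCREEN_H = 1920, 1080
COLS, ROWS = 6, 4                 # 24 web-page thumbnails
ZOOM_AFTER_MS = 1000              # hold your gaze this long to zoom in

def cell_under_gaze(x, y):
    """Map a gaze coordinate to a (col, row) thumbnail index."""
    return (int(x // (SCREEN_W / COLS)), int(y // (SCREEN_H / ROWS)))

def zoom_events(gaze_samples):
    """Yield the (col, row) of each thumbnail the user zooms into."""
    current, since = None, None
    for t, x, y in gaze_samples:
        cell = cell_under_gaze(x, y)
        if cell != current:
            current, since = cell, t       # gaze moved to a new thumbnail
        elif t - since >= ZOOM_AFTER_MS:
            yield cell                     # held long enough: zoom in
            since = float("inf")           # one zoom per visit to this cell

# Gaze drifts across the grid, then settles on the thumbnail at (2, 1).
samples = [(i * 16, 180 * i if i < 5 else 800, 400) for i in range(90)]
print(list(zoom_events(samples)))  # -> [(2, 1)]
```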
