Rather than a new range of chips, Intel presented the RealSense Camera – a 1080p camera with an infrared sensor that will be embedded in tablets, hybrids, all-in-ones and laptops. Intel’s excitement at the RealSense Camera stems from the new interactions users can have with their devices.
Gestures, facial recognition and head tracking can provide new ways of navigating a computer without the need for touch or a mouse/keyboard combo. So far, so Kinect, but how does the RealSense camera really compare to the Xbox One peripheral?
For a start it’s tiny, making it easy to embed in all sorts of products. Secondly, Intel’s Mooly Eden promises that it will be very cheap to implement. Can it really be an effective way of navigating Windows, though?
We tested a prototype of the RealSense camera (made by Creative; for now it’s an external unit) with a number of apps and games. Here’s how we got on.
The RealSense Camera prototype is external
Our first test was with a Lemmings clone called Hoplites. My little Greek soldiers started heading to their doom in a lava pit until I placed my hand in front of the laptop and external RealSense camera. My fingers were easily identified and there was very little lag between my actions and the virtual hand on screen. I used my fingers to create a bridge and save the soldiers. It was fun, but later levels became frustrating due to the imprecise controls, so the novelty wore off quickly.
Hoplites quickly got annoying
My second demo was a 3D Space Invaders-style game. Moving your hand moves the spaceship around the screen, while opening your fist fires lasers at auto-locked targets.
Unlike in Hoplites, depth played a more important role here. By moving your hand towards or away from the screen you can make your spaceship move forward or back, dodging incoming fire. Surprisingly, not being able to see your hand on screen made the gameplay more intuitive: the spaceship followed my hand with precision.
The most interesting demo, however, wasn’t of a game but rather a music-making app called Kagura – a fun and funky 21st century theremin. Created by developer and designer Shunsuke Nakamura, it won the grand prize in Intel’s Perceptual Computing Showcase, and with good reason.
Kagura lets you move virtual instruments around the screen and play them using your hands, head or even anything you’re holding. Move your hands forward and you can tweak settings intuitively.
A thumbs down lowers the tempo, while a pinch grabs an icon and lets you place it anywhere on screen. It’s a great demonstration of how the RealSense camera can be used to navigate a user interface, and it was the most convincing of the demos on offer.
Watch the Kagura demo:
One of the major benefits Intel is pitching is that the RealSense camera will make green screens a thing of the past. By adding depth information to the camera feed, video recording suites and VoIP services (like Skype, which is a partner) can virtually “green-screen” the background, letting you swap in other backgrounds or overlay video. Aimed primarily at video bloggers, the effect still requires refinement: the edges of the person in focus are over-cropped and suffer from an effect similar to cel-shading, which is a little off-putting. It’s good as a first stab, though, and it can only get smoother.
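The idea behind depth-based background removal is conceptually simple: pixels whose depth reading falls beyond a cut-off are treated as background and replaced. Here's a minimal sketch of that idea in Python with numpy – not Intel's actual pipeline; the function name, array layout and threshold value are all illustrative assumptions:

```python
import numpy as np

def virtual_green_screen(frame, depth, background, max_depth_mm=900):
    """Replace pixels farther than max_depth_mm with a substitute background.

    frame:      HxWx3 uint8 colour image from the camera
    depth:      HxW depth map in millimetres (0 = no reading)
    background: HxWx3 uint8 replacement background image
    """
    # Foreground = pixels with a valid depth reading closer than the cut-off.
    foreground = (depth > 0) & (depth < max_depth_mm)
    out = background.copy()
    out[foreground] = frame[foreground]
    return out

# Toy example: a 2x2 frame where only the top-left pixel is "near".
frame = np.full((2, 2, 3), 200, dtype=np.uint8)       # uniform grey subject
background = np.zeros((2, 2, 3), dtype=np.uint8)      # black replacement
depth = np.array([[500, 1500],
                  [0,   2000]])                       # mm; 0 = no reading
composite = virtual_green_screen(frame, depth, background)
```

The ragged, cel-shaded edges described above are what you get when the depth mask is noisy at object boundaries; a real implementation would feather or smooth the mask rather than applying it as a hard cut like this.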
Intel believes voice and gesture interfaces are the future – bigger than touch and better than mouse and keyboard. There’s certainly something intuitive about gesture control in certain respects: basic navigation and some types of gaming in particular. Couple it to an Oculus Rift, for example, and there could be some compelling gaming experiences ahead.
When it comes to productivity and precise actions, though, it’s still some way off replacing touch or the mouse. Still, this could be the first big step towards a new computing interface.
The first products to come with the RealSense Camera will be available to buy in the second half of 2014.