
Context Computing: Intel's Vision Of The Future

Another innovation we were shown was a remote control with a motion sensor built in. By analysing the way the user hefts and interacts with the remote, it can detect who is holding it and change the channel options accordingly.
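For the technically curious, recognising a user from the way they handle a device generally comes down to classifying features extracted from the motion sensor's readings. Intel didn't explain its method, so the Python sketch below is purely illustrative: it assumes three-axis accelerometer samples, a handful of simple statistical features, and a nearest-match comparison against per-user profiles recorded in advance (the names, feature choices and enrolment data are all hypothetical).

```python
import math
from statistics import mean, stdev

def grip_features(samples):
    """Reduce a list of (x, y, z) accelerometer readings to a small
    feature vector: per-axis mean and standard deviation, plus the
    average magnitude of the acceleration vector."""
    xs, ys, zs = zip(*samples)
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    return [mean(xs), stdev(xs), mean(ys), stdev(ys),
            mean(zs), stdev(zs), mean(mags)]

def identify_user(samples, profiles):
    """Return the enrolled user whose stored feature vector is closest
    (Euclidean distance) to the features of the current handling data.
    `profiles` maps user name -> feature vector captured at enrolment."""
    current = grip_features(samples)
    return min(profiles,
               key=lambda user: math.dist(current, profiles[user]))

# Hypothetical enrolment profiles and a burst of live readings from the remote.
profiles = {
    "dad":   [0.1, 0.9, 0.3, 1.2, 9.6, 0.4, 9.8],
    "child": [0.8, 2.5, 1.1, 2.9, 8.9, 1.7, 10.4],
}
live = [(0.7, 1.0, 9.1), (1.2, 1.5, 8.7), (0.3, 0.8, 9.4), (0.9, 1.3, 8.8)]
print(identify_user(live, profiles))  # pick this user's channel line-up
```

A real system would no doubt use far richer features and a proper classifier, but the principle of matching handling patterns to stored profiles is the same.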


Taking a further step into the future, we were also given a glimpse of Carnegie Mellon's ongoing research into understanding the brain. We were shown footage of a machine that could monitor brain waves and recognise what a person was looking at. Justin ventured a rough prediction that full brain function analysis could be available by 2030 - a somewhat scary proposition!



Closing things out, Intel Labs' Director for Interaction & Experience Research, Genevieve Bell, told us how all this research feeds back into Intel's activities today. By studying people's reactions to these research projects, as well as to current tech, she and her team can guide Intel's product teams towards creating devices people actually want, rather than ones that just push the frontiers forward for the sake of it.


Considering all this blue-sky thinking, it was rather comical that Bell used the humble new CE4200 Atom chip, which the company announced last week, as an example of this design approach. It combines an Atom CPU core with video decoders and other AV-centric technologies to make a low-power chip that can power the TVs of, well, next year. Apparently such a product would never have been designed were it not for all this research, which seems like a stretch. Still, we're sure it won't be long before we have Atom chips strapped to every limb and several floating round our innards - bringing a whole new meaning to Intel Inside - telling us to be prepared to take a pee, take more exercise, and eat more healthily... Hmm, can't wait.
