
Intel Research & Development Projects

The Intel Developer Forum (IDF) has just kicked off here at the Moscone Center, San Francisco, so keep your eyes peeled for a plethora of announcements from the company over the next couple of days. As has become traditional, though, 'Day 0' (yesterday) was used to showcase the company's research and development projects outside of the obvious technological advancements in CPU design and manufacture. So over the next few pages we'll give you a brief insight into where Intel thinks the industry is heading and what it's working on to make that vision a reality.

Mobile Augmented Reality

We've seen many augmented reality apps spring up over the last year or two. For instance, the Layar app, available on the Android Market, uses location and orientation information to overlay information about where you are, such as restaurant locations, onto a realtime image from your phone's camera.

The new twist Intel is working on is using image recognition to provide information about the world around you. Specifically, you can take a picture of a famous landmark and very quickly have the Wikipedia page for that landmark pop up to tell you all about it. The idea is that you may not even know the name of the landmark, so you can't look it up yourself.

The system indexes Google Images to find matching images, then adds location and orientation information from the mobile device to help narrow down possible matches; if it still can't identify the landmark automatically, it presents a list of possible matches for you to choose from. The key research challenge lies in managing where the required processing is performed - on a server or on the device - to keep the service responsive and accurate.
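Intel didn't reveal how its matcher actually works, but the flow described above - visual matching, narrowing by device location, then falling back to a candidate list - can be sketched in a few lines of Python. Everything here is invented for illustration: the index entries, the single-number "visual signature", and the thresholds all stand in for much more sophisticated machinery.

```python
import math

# Hypothetical toy index: each entry pairs a landmark name with its
# (latitude, longitude) and a crude visual signature, stood in for
# here by a single number. All data is invented for illustration.
LANDMARK_INDEX = [
    ("Golden Gate Bridge", (37.8199, -122.4783), 0.91),
    ("Bay Bridge",         (37.7983, -122.3778), 0.88),
    ("Coit Tower",         (37.8024, -122.4058), 0.35),
]

def visual_distance(sig_a, sig_b):
    """Stand-in for real image matching: smaller means more similar."""
    return abs(sig_a - sig_b)

def geo_distance_km(a, b):
    """Rough equirectangular distance, enough to rank nearby candidates."""
    lat = math.radians((a[0] + b[0]) / 2)
    dx = math.radians(b[1] - a[1]) * math.cos(lat)
    dy = math.radians(b[0] - a[0])
    return 6371 * math.hypot(dx, dy)

def identify_landmark(photo_sig, phone_pos, max_km=5.0, ambiguity=0.1):
    """Return one confident match, or a list of candidates to choose from."""
    # 1. Visual match against the index.
    scored = [(visual_distance(photo_sig, sig), name, pos)
              for name, pos, sig in LANDMARK_INDEX]
    # 2. Narrow down using the device's location.
    nearby = [(d, name) for d, name, pos in scored
              if geo_distance_km(phone_pos, pos) <= max_km]
    nearby.sort()
    if not nearby:
        return []
    best = nearby[0]
    runner_up = nearby[1] if len(nearby) > 1 else None
    # 3. If the top two scores are too close to call, let the user choose.
    if runner_up and runner_up[0] - best[0] < ambiguity:
        return [name for _, name in nearby]
    return best[1]
```

The interesting design question Intel raised - where each of these steps runs - maps directly onto this sketch: the visual match is the expensive part that might live on a server, while the location filter is cheap enough to run on the phone.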

We were shown a demo of it working and we must admit to being rather impressed. Sadly, as with all these projects, Intel won't push them to market - it's just for science - so we'll have to wait for someone else to pick up the idea and commercialise it. Here's hoping it happens sooner rather than later.

Cloud-based Ray-Tracing For Games

Intel's involvement in developing realtime ray-tracing has become something of an in-joke in technology journalism, as the company has been talking about it and demoing it in various forms for years now without seemingly getting it any closer to market. Larrabee, the codename for a graphics-hardware related project, was expected to finally bring realtime ray-tracing to home computers, but that project still seems to be a long way from commercial availability. Imagine our surprise, then, to see Intel once again talking up the potential of ray-tracing, this time using the power of the cloud to perform the calculations.

Ray-tracing is a technique for creating realistic-looking lighting effects in computerised 3D environments like games. Traditional scanline techniques, used in all 3D games of the last couple of decades, use graphical tricks to make a scene look correctly lit and thus realistic. Ray-tracing, on the other hand, actually plots the path of rays of light within a 3D scene, giving you a much more realistic look. A perfect example of the power of ray-tracing is how it handles transparent and reflective objects like those shown in the included picture and video - traditional lighting methods simply can't recreate these effects. The problem with ray-tracing is that it requires a huge amount of processing power.
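To give a flavour of what "plotting the path of rays" means in practice - this is a textbook illustration of the principle, not anything from Intel's renderer - the core of any ray tracer is firing a ray per pixel and testing it against the scene geometry. The simplest such test is a ray-sphere intersection:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t, which is
    a quadratic in t. `direction` is assumed to be a unit vector.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c          # the quadratic's 'a' is 1 for a unit direction
    if disc < 0:
        return None               # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None   # only count hits in front of the camera
```

A full renderer repeats this test millions of times per frame, spawning extra reflection and refraction rays at each hit, which is exactly why the technique is so hungry for processing power.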

Intel's cloud-based ray-tracing demo showed that by distributing the substantial workload of ray-tracing across a large array of servers 'in the cloud' (in this case an array of Larrabee-based many-core multiprocessors) you can get super-realistic images running at playable speeds on even a modest laptop. Intel was showing a few modified sections of the Wolfenstein game running with ray-tracing enabled, and it did look rather impressive. The company sees this tech potentially being incorporated into online 3D games, the like of which are already available. Considering Intel's track record on ray-tracing, though, we wouldn't be surprised if we never hear of this again.
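Intel didn't detail how it schedules the work across servers, but the general pattern for distributing a frame - split it into tiles, farm the tiles out to workers, and reassemble the results in order - can be sketched as follows. All names here are illustrative, and threads stand in for the remote servers a real system would use.

```python
from concurrent.futures import ThreadPoolExecutor

def shade_tile(tile):
    """Stand-in for the expensive per-tile ray-tracing work.

    Here it just enumerates the pixel coordinates the tile covers;
    a real worker would trace a ray for each one.
    """
    x0, y0, x1, y1 = tile
    return [(x, y) for y in range(y0, y1) for x in range(x0, x1)]

def split_into_tiles(width, height, tile_size):
    """Divide the frame into tiles so workers can proceed independently."""
    return [(x, y, min(x + tile_size, width), min(y + tile_size, height))
            for y in range(0, height, tile_size)
            for x in range(0, width, tile_size)]

def render_frame(width, height, tile_size=16, workers=4):
    """Farm tiles out in parallel and reassemble the frame in tile order."""
    tiles = split_into_tiles(width, height, tile_size)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(shade_tile, tiles))  # map preserves order
    return [px for tile_pixels in results for px in tile_pixels]
```

Tiles make a natural unit of distribution because rays in neighbouring pixels tend to hit the same geometry, so each server can cache just the part of the scene its tiles need; the hard engineering problems - latency, compression, and streaming the finished tiles back to the laptop - are exactly the ones a cloud renderer has to solve.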
