
nVidia Demonstrates GPU-Accelerated Ray-Tracing



As well as affording us an in-depth look at Intel's forthcoming GPU, Larrabee, this year's Special Interest Group on Computer Graphics (SIGGRAPH) conference has seen nVidia showing off a real-time ray-tracing demo to attendees. The aim of the demo, though, seems not so much to have been to prove that ray tracing is possible on the GPU, but rather that it is still a long way from viable.

The demo itself comprised a render of a green Bugatti Veyron at a resolution of 1,920 x 1,080 running at 30fps. In order to run in this way, though, nVidia needs an extremely powerful system: namely an nVidia Quadro Plex 2100 D4 Visual Computing System, which packs in a pair of dual-GPU Quadro FX 4700 X2s. In simple terms: a heck of a lot of GPU processing power.

Despite this huge amount of processing power, nVidia still isn't able to produce a photorealistic image in real time. In order to run at 30fps at 1080p, the demo uses three bounces for each ray (i.e. each beam of light reflects off at most three surfaces before 'hitting' the screen). To create a realistic image, more reflections need to be calculated - but doing so would turn the simulation into a slideshow.
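To make the bounce cap concrete, here's a toy sketch - nothing to do with nVidia's actual demo code; the `shade` function, the `REFLECTIVITY` value and the single-mirror scene are all invented for illustration - of how a recursive ray tracer cuts recursion off at a fixed depth:

```python
REFLECTIVITY = 0.5   # assumed surface reflectance for this toy scene
MAX_BOUNCES = 3      # the cap: three reflections per ray, as in the demo

def shade(surface_brightness, depth=0):
    """Shade a hit point: local brightness plus attenuated reflection.

    A real tracer would fire the reflected ray into the scene to find
    the next surface; here every bounce hits the same mirror, which is
    enough to show how the depth cap bounds the work per ray.
    """
    if depth >= MAX_BOUNCES:
        return 0.0  # cap reached: any deeper light is simply dropped
    reflected = shade(surface_brightness, depth + 1)
    return surface_brightness + REFLECTIVITY * reflected
```

With a reflectivity of 0.5, an uncapped tracer would converge on a brightness of 2.0 (a geometric series: 1 + 0.5 + 0.25 + ...), while the three-bounce cap stops at 1.75. That discarded remainder is the trade-off in a nutshell: each extra bounce adds realism but also adds work for every ray on screen, which is why raising the limit drags the frame rate down.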

nVidia does have the edge over Intel's CPU-based Quake ray-tracing demo, which only ran at 720p, at between 14 and 30fps, and needed a four-socket quad-core system to run. But, as I said, nVidia is pointing out that ray-traced games are still some way off, not that GPUs ray-trace better than CPUs - despite the hype around Larrabee's potential use as a ray-tracing card.


August 15, 2008, 5:16 pm

Is that supposed to look good?

It looks utterly out of place; the car and the surroundings don't match at all.

tbh as other articles have already said, if Larrabee has the drivers behind it, then I think it probably will be a success (or at least the third choice).

However I can't see that happening, simply because decent drivers take time, not just talented devs (which I'm sure Intel has).

So while the drivers mature, nVidia and AMD can shift their hardware designs and respond to any other changes in the market. Intel would then need to hope that any changes can be coped with by the hardware design, and that the driver team can keep up!


August 15, 2008, 10:11 pm

Whatever ray-tracing engine they are using looks awful. Expect games developers to create or use something a lot better (and hopefully we will get results akin to those produced by a decent ray-tracing renderer such as Maxwell Render).


August 18, 2008, 2:09 am

I have to agree that these images are sub-par as far as ray tracing goes, but they are running at 30fps on cards not meant for the job. Intel is going to wipe the floor with nVidia if this is all that they can do with 10k worth of graphics cards.

With 4- and 8-core Nehalems coming and dual-socket boards not far behind, we are going to be looking at 16 physical cores, and with HT enabled it will work more like 32 cores at 3+ GHz apiece. Then in the year following you can upgrade to Larrabee, which will still play old games but will be a major boost for ray tracing.

Now the best part: I can use all 32 virtual cores for more than just gaming. Oh, and it will only cost me 5k max for that system, instead of 12-15k for the nVidia system.
