With nVidia set to launch its next generation graphics chip within the next I'm-not-allowed-to-say-how-many days, I thought now might be a prudent time to look at one of the lesser known features, in the consumer sector at least, of the current and next generation of nVidia graphics cards - CUDA.
Before I continue, a small disclaimer of sorts: I'm afraid this article is inevitably going to sound very pro-nVidia, but that's unavoidable. ATI's rival project may as well, for all intents and purposes, not exist. Close to Metal (CTM) is similar only in that it offers a layer between programmers and the GPU architecture; unlike CUDA, though, it only works with specially designed cards, which means it has neither CUDA's install base nor its developer support. Hopefully for AMD's sake that problem is being worked on, but for now CUDA is the only player in the market.
Compute Unified Device Architecture, as it is fully named, is at its most basic level a way of taking code that would normally be executed on a CPU and running it on the GPU instead. Some of you might be aware that the idea of running programs on a GPU isn't a new one, but previously that software had to be custom coded to take advantage of the GPU's architecture. The CUDA toolkit is free to download and allows programmers to write in C and C++ (probably the most common languages in use today) and run the result on the graphics card. Once optimised for the massively parallel nature of the GPU compared to the CPU, these programs can run faster - much faster, in fact. Some programs can see as much as a 130-times performance jump from moving to the GPU, others maybe even more.
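For the curious, here's a rough idea of what that looks like in practice. This is my own minimal sketch, not code from nVidia - the kernel name, sizes and values are all made up for illustration - but it shows the basic pattern: a loop that a CPU would grind through one element at a time becomes a "kernel" that the GPU runs as hundreds of threads at once.

```cuda
#include <cstdio>

// On a CPU this would be a serial loop:
//   for (int i = 0; i < n; ++i) out[i] = a[i] + b[i];
// In CUDA the loop body becomes a kernel, and the GPU runs one thread per element.
__global__ void vecAdd(const float *a, const float *b, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // each thread works out its own index
    if (i < n)                                      // guard against any surplus threads
        out[i] = a[i] + b[i];
}

int main()
{
    const int n = 1024;
    const size_t bytes = n * sizeof(float);
    float host_a[n], host_b[n], host_out[n];
    for (int i = 0; i < n; ++i) { host_a[i] = (float)i; host_b[i] = 2.0f * i; }

    // Copy the data across to the card, run the kernel, copy the result back.
    float *dev_a, *dev_b, *dev_out;
    cudaMalloc(&dev_a, bytes);
    cudaMalloc(&dev_b, bytes);
    cudaMalloc(&dev_out, bytes);
    cudaMemcpy(dev_a, host_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dev_b, host_b, bytes, cudaMemcpyHostToDevice);

    vecAdd<<<n / 256, 256>>>(dev_a, dev_b, dev_out, n);  // 4 blocks of 256 threads each

    cudaMemcpy(host_out, dev_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[10] = %.1f\n", host_out[10]);  // 10 + 20 = 30.0

    cudaFree(dev_a); cudaFree(dev_b); cudaFree(dev_out);
    return 0;
}
```

The point isn't the addition itself - it's that the code is recognisably C, with a few extra keywords, and that the GPU attacks all 1,024 elements in parallel rather than one after another. That's where those big speed-up figures come from.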
I could go more in depth, but the reality is that even that much explanation isn't important. What we consumers care about isn't technologies, but benefits. So what does CUDA offer to us, the general GeForce-buying public? So far most of the developed software is in the research sector, mainly medical and scientific - Stanford University's Folding@Home being a prime example. The CUDA client isn't available to the public yet, but the preliminary figures are very impressive. As much as it sounds like PR spiel, projects like this really could change the world, helping to find cures for cancers and for diseases such as Huntington's, Parkinson's and many more.
Thinking on less grand scales, there are also many innovative new companies emerging, ready to take advantage of the processing power the GPU can offer. One such company is CoolIris, developer of a nifty program called PicLens. This plugin for all the major browsers (Safari, Internet Explorer, and Firefox 2.0 and 3.0) allows content on compatible sites to be viewed in a 3D environment that has to be seen to be believed (there's a video on the site). It sounds like a gimmick, but once you start using it you honestly won't be able to go back to normal image viewing on sites like Google Images, YouTube and Facebook. While PicLens will run on integrated graphics, the second you use a system with a dedicated GPU the situation improves dramatically.