During CeBIT, while Gordon and I were out in Germany getting frosty toed and weary eyed, NVIDIA was nice enough to launch seven products on the same day. While Benny did an excellent job covering the 7900 GTX, today it's the turn of the 7600 GT - NVIDIA's new mid-range card. As much as we all like to whet our appetites reading about high-end kit, for financial reasons this is the card that most people in the market for a graphics board will be considering more seriously.
In an attempt to spoil the 7600 GT's launch, ATI has simultaneously released the X1800 GTO - so a head-to-head comparison seemed the order of the day. A whole host of benchmarks, including SLI results, can be found at the end of the review.
Physically, the 7600 GT is small, with a tiny copper heatsink and fan. This does a good job of cooling, but is a little on the noisy side when in 3D mode.
You might notice that it doesn't need any external power, which is not surprising considering it only uses between 65 and 70W, making it an ideal candidate for SFF machines. Without a doubt we will see passive editions of this card, and I don't expect that the heatsink will need to be particularly large either.
As with the GTX, the chip is built on a 90 nanometre process, with 12 pixel pipelines, eight pixel output engines and five vertex shaders. On paper, this is close to half the power of a 7900 GTX. Half is the keyword when talking about memory too, with only a 128-bit interface and a 256MB frame buffer. This means that it has limited memory bandwidth, but this only comes into play at the highest resolutions, and few games require a 512MB frame buffer yet. Anyone with a 17in TFT panel won't have a problem.
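To put that 128-bit interface into perspective, peak memory bandwidth is simply bus width multiplied by the effective transfer rate. A quick back-of-the-envelope sketch, using the clocks quoted in this review (700MHz GDDR3, double data rate):

```python
# Rough peak memory-bandwidth estimate for the 7600 GT,
# based on the figures quoted in the review.

BUS_WIDTH_BITS = 128   # memory interface width
MEM_CLOCK_MHZ = 700    # physical memory clock
DDR_MULTIPLIER = 2     # two transfers per clock (1,400MHz effective)

bytes_per_transfer = BUS_WIDTH_BITS / 8                     # 16 bytes per transfer
transfers_per_sec = MEM_CLOCK_MHZ * 1e6 * DDR_MULTIPLIER    # 1.4 billion/sec
bandwidth_gb_s = bytes_per_transfer * transfers_per_sec / 1e9

print(f"{bandwidth_gb_s:.1f} GB/s")  # prints 22.4 GB/s
```

That 22.4GB/s is roughly half what a 256-bit card manages at the same memory clock, which is why the pinch only shows up at high resolutions with FSAA, where frame-buffer traffic dominates.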
By default the core runs at 560MHz, but already I have seen board partners touting higher clock speeds, which indicates good headroom for overclocking. The memory runs at 700MHz (1,400MHz effective) - it is astonishing to see speeds as high as this hit the mid-range.
Comparing the 7600 GT to the 6600 GT, we see very few major changes in specification. It still has Pixel Shader 3.0 support, but unlike newer ATI hardware it can't run FSAA alongside full-precision HDR. On the plus side, it brings transparency FSAA to the mainstream, and there are a host of other internal improvements to increase efficiency. These will speak for themselves in the benchmark results.