Zotac nVidia GeForce GTX 275 Amp! Edition - Counter-Strike: Source

By Edward Chester





What can we say about Counter-Strike: Source that hasn't already been said before? It is quite simply the benchmark for team-based online shooters and, five years after its release, it's still one of the most popular games in its genre. It focuses on small environments and incredibly intensive small-scale battles with one-shot kills the order of the day. If you want to test all elements of your first person shooter skills in one go, this is the game to do it.

We test the 64-bit version of the game using a custom timedemo taken during a game against bots on the cs_militia map. This has a large amount of foliage and is generally one of the most graphically intensive maps available. We find a frame rate of at least 60fps is required for serious gaming as this game relies massively on quick, accurate reactions that simply can't be compromised by dropped frames. All in-game settings are set to their maximum and we test with 0xAA 0xAF, 2xAA 4xAF, and 4xAA 8xAF.

There's not much to say here. All the cards are limited by the CPU (or other factors) throughout all our tests in this game.


April 4, 2009, 4:22 am

Great review - fully appreciate the early hours work. Thank you


April 4, 2009, 5:18 am

absolute dead heat. i'm stunned


April 4, 2009, 6:04 am

Thanks for both reviews Ed. It seems that Nvidia cards always do better in Crysis, though it should make my next Graphics Card purchase interesting.

Anyway, I was wondering which card is quietest when idle and under load (or are they much the same), as I like to run a quiet PC, so would be interested in this aspect of performance?


April 4, 2009, 7:49 am

@ ilovethemonkeyhead - Uhhh...it's not a dead heat, but they are close ;)

@ Pbryanw - According to bit-tech the GTX 275 is the quieter card.

john 17

April 4, 2009, 1:38 pm

Well, according to you guys the 4890 loses against the GTX 275 Amp in Crysis, but at Xbitlabs.com the 4890 beats the GTX 285 in Crysis by almost 13%, and the 4870 1GB gets the exact same frames as the GTX 285 in Crysis, while here the 4870 loses against the GTX 260. I wonder why!


April 4, 2009, 8:26 pm

@john - I've also now read the reviews on Bit-tech and Anandtech and each comes up with slightly different results. I'd assume (though can't be sure) that this is down to testing methodology and the test setup (hardware) used to get the results.

The fact that some reviews are pro-Nvidia and some pro-ATi probably shows how close things are between the 4890 and the GTX 275. So giving the same score to each, like TR have done, seems about right.

The Stick

April 4, 2009, 9:03 pm

@john - having read the review of the 4890 at xbitlabs you seem to refer to, it's apparent they tested Crysis Warhead as opposed to Crysis. Also I would suggest they read their graphs again, as some of their conclusions do not seem to match what the graphs show. That review at xbitlabs reads like a press release for the 4890 to me. Other reviews (such as the one at bit-tech) show the GTX 275 beating the 4890 in Crysis as well, with the GTX 285 beating both.


April 4, 2009, 11:58 pm

Remember, you can also create a profile with your HD 4890 to bring both the GPU and memory clock rates much lower than they are at stock. Thus you are able to reduce power at idle and fan speed.


April 5, 2009, 12:05 am


IMO, that's because Xbit disabled all of Nvidia's control panel optimizations, which can decrease overall performance by up to 20%. They did this by changing Texture Filtering from its default value of Quality to High Quality. I am not sure if they forced xQ, which is true MSAA, instead of using CSAA. But if they did, then this would also explain why the results are so different.

In some games you have to force xQ in order to get true MSAA. {H}'s review made mention of this in their Fear 2 benchmark results.


April 5, 2009, 7:30 am

After reading several reviews I'm still none the wiser as to which is the better card - it's about a 50-50 split between ATI and nVidia. There seem to be a lot more discordant reports and benchmark results for these two cards than any other.


April 5, 2009, 6:52 pm

@smc8788 - the nVidia cards are much quieter than ATI's (particularly when under load) so that could be a decisive factor for many - it would be for me.

john 17

April 5, 2009, 7:04 pm


Well, I think you've got it a little bit wrong here. As they (Xbitlabs) explain, they have disabled these optimizations to get the best quality and to minimize the effects of default software optimizations, which may worsen the overall quality.

And btw, they didn't only change settings for NVIDIA cards; they also did it with ATI cards, like enabling adaptive AA and turning the mipmap level to high quality.

john 17

April 5, 2009, 7:57 pm


To be fair, it depends on the card you're buying. For instance, you've got the Vapor-X from Sapphire, which is an extremely quiet and cool card even at full load.


April 5, 2009, 8:48 pm

smc8788 - As I've said before, you can create a profile within CCC to enable a 2D clock of 240 GPU/450 Mem, thus reducing power consumption. You can also manually control fan speed through CCC to whatever you want. So the claim that one card is quieter than another isn't accurate.

john - Perhaps you misunderstood me. How else do you define getting the best quality if you don't remove all "control panel" optimizations? You said it yourself that "they have disabled these optimizations to get the best quality and to minimize the effects of default software optimizations which may worsen the overall quality." You are basically saying the same thing I said.

As for the control panel changes between the two, I addressed your question as to why their results are different. I thought you already understood that both control panels' settings were changed. All they did to CCC's settings was enable a feature that improves the IQ of alpha textures, etc., called Adaptive AA (set to quality/on for the 4890); for Nvidia it's called Transparency Anti-Aliasing. So what they did was:

-Removed all optimizations from Nvidia's control panel, turned off Vertical Sync and enabled Transparency AA

-Enabled Adaptive AA for CCC (CCC does not have user controlled optimizations) and disabled Vertical Refresh

Nvidia's control panel settings had to be changed from default in order to remove software optimizations and offer the best image quality (IQ). I hope you better understand my post now. Even if we agree to disagree, I believe we still have the same conclusion.


April 5, 2009, 9:17 pm

I wanted to also add that this is the only review I've found so far that actually used a retail GTX 275. The performance of this retail card is nowhere near as high as I found in other reviews using non-retail cards.

john 17

April 6, 2009, 12:24 am


No. I think we both have the same point, directed in different directions.

In case you don't know, adaptive AA is more demanding and effective than multisampling AA. Nvidia's transparency multisampling AA is the same as the original multisampling AA but with a slight difference: transparency MSAA doesn't disable some of the HDR effects that the original multisampling does. The same goes for ATI's adaptive AA.

Now take a look at this chart that compares ATI's adaptive AA to the original MSAA:


You can see how the adaptive AA is more effective than the multisampling AA.

The adaptive AA that ATI uses is more effective than Nvidia's transparency multisampling. The transparency MSAA that Nvidia uses has the same number of texture and color samples per pixel as the original MSAA.

So Xbitlabs used a better quality AA with ATI cards, because the adaptive AA is better than both the multisampling AA and the transparency multisampling AA that was used for Nvidia cards.

Talking about the anisotropic settings that were used: first let me note that AF has a very small effect on performance, unlike AA. In the worst cases you get a 1-2 fps drop with 16x AF.

Now at Xbitlabs, they put texture filtering / mipmap detail quality (the same thing) for both cards on high quality, but they turned off the trilinear optimizations for Nvidia cards, which has nothing to do with the image quality produced, because texture buffering methods should be selected by the game. That's why they didn't use any specific AF rendering method for either ATI or Nvidia cards.


April 6, 2009, 9:43 pm


I believe we are in agreement but pointing out different aspects of the review. Yes, we know that Adaptive AA is better but can take a hit in performance as well. Therefore, there should have been an IQ comparison between the two when those types of settings are enabled. AdAA & TrAA are used to anti-alias polygons that use transparent textures where MSAA cannot. It really complements MSAA more than anything else, IMO. So yes, I do understand how it works.

Now here is where things get interesting (at least for me). They put both texture filtering and mipmap quality on high quality. In Nvidia's control panel it defaults to quality; in CCC it defaults to high quality. How many other reviews do this (or at least tell you what settings are used)? This is why I believe the 4890 won: the IQ was equitable. As for Texture Filtering - Trilinear Optimization: On, it should be common knowledge that this option has a negative effect on IQ.

Did you know there are games that use CSAA inherently? Some of these games are, RaceDriver Grid, Fear 2, Crysis and Crysis Warhead, Company of Heroes, etc. For example, at 8xAA, MSAA uses 1 texture/shader sample, 8 stored color/z samples and 8 coverage samples. At 8xAA, CSAA uses 1 texture/shader sample, 8 stored color/z samples and 4 coverage samples. Unfortunately when someone says they used 8xAA it doesn't necessarily mean they use MSAA. This is why I mentioned the H review earlier as they specifically mention having to force MSAA.


April 7, 2009, 2:10 am


...For example, at 8xAA, MSAA uses 1 texture/shader sample, 8 stored color/z/stencil samples and 8 coverage samples. With CSAA, 8xAA uses 1 texture/shader sample, 4 stored color/z/stencil samples and 8 coverage samples...

jeffrey clucas

November 26, 2009, 5:13 am

Great review, but it would have been nice to put in one of the 9 series to show how much there is to gain by upgrading, because I would rather like to know if the 200 series is just another gimmick like the 9 series or whether the performance gains are worth the cost. Tbh I think I'll wait for the 300 series that are due to release in the first quarter of 2010.
