
Testing under trial



During testing for our recent article on ATI's HD 2900 XT, it occurred to me that our method for testing graphics cards might not be giving the full picture, and that an overhaul was perhaps in order. With so many new technologies appearing in recent years, and with games becoming ever more sophisticated and variable, simply cranking every in-game setting to the maximum and running a timedemo with or without Anti-Aliasing (AA) just doesn't cut the mustard anymore. So, today I want to open a dialogue with you, our readership, to find out what you would like to see in future 3D application testing. First, though, I want to discuss a few of the underlying issues and technologies we need to consider, and provide some context on which to base your comments.

The fundamental point, forgotten all too often, is that the average gamer doesn't care how many fps they get in their games, as long as it's enough to play the game satisfactorily. What counts as satisfactory varies from game to game, but as a general rule of thumb an average above 60fps and a minimum no lower than 30fps is enough for most titles. Competitive online gamers may want a minimum of 100fps, but then they're quite happy playing at 800 x 600 with every distracting detail, like rain effects, turned down so they can gain any competitive advantage. For most of us, though, what really matters is enjoying the experience, which means making the game look, sound, and feel as involving as possible. In other words, performance is not just about how fast you get there but how good things look along the way. After all, what does being able to play Quake 4 at 150fps, as opposed to 100fps, really do for you?
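To make that rule of thumb concrete, here is a minimal Python sketch, purely for illustration and not something we use in our labs, that turns a list of per-frame render times into average and minimum fps figures and checks them against the 60fps/30fps thresholds mentioned above. The frame times are made-up sample data.

```python
# Hypothetical sketch: judging 'playable' from a frame-time log.
# The 60fps-average / 30fps-minimum targets are the rule of thumb
# from the paragraph above; the sample frame times are invented.

def fps_stats(frame_times_ms):
    """Convert per-frame render times (milliseconds) into (average fps, minimum fps)."""
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

def is_playable(frame_times_ms, avg_target=60.0, min_target=30.0):
    """True only if both the average and the worst frame clear the targets."""
    avg_fps, min_fps = fps_stats(frame_times_ms)
    return avg_fps >= avg_target and min_fps >= min_target

# Example: a short, made-up run with one heavy frame (50ms = a 20fps dip).
sample = [12.0, 14.5, 13.2, 50.0, 15.1, 12.8]
print(fps_stats(sample))    # roughly (65.4, 20.0)
print(is_playable(sample))  # False - the 20fps dip breaks the 30fps floor
```

The point the sketch makes is that a healthy average can still hide a dip bad enough to spoil the experience, which is why the minimum matters as much as the average.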


It was this thought process that prompted a few sites to switch to a different way of testing, which became known as the 'best playable settings' method. It involved playing each game through multiple times while varying not just AA and Anisotropic Filtering (AF) but every in-game and driver setting, until the best compromise between performance and image quality was found for each card at each resolution (this is why it is also known as apples-to-oranges testing, as the settings used for two different cards weren't necessarily the same). Then, rather than churning out a load of graphs, a far more subjective opinion was formed and conveyed to the reader. Quite simply, the tester could say with authority which card gave the best experience in any one game.
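For illustration only, the Python sketch below shows what an automated version of that search might look like for one card at one resolution. Everything in it is hypothetical: the preset lists, the fixed sweep order and the assumed benchmark() callable all stand in for what is, in practice, the tester's far more nuanced eye for image quality.

```python
# A minimal sketch of the 'best playable settings' idea, for one card at one
# resolution. benchmark() is assumed to run the game at the given settings and
# return (average fps, minimum fps); it is not a real function.
from itertools import product

AA_LEVELS = [8, 4, 2, 0]                  # anti-aliasing samples, best first
AF_LEVELS = [16, 8, 4, 0]                 # anisotropic filtering, best first
DETAIL    = ["ultra", "high", "medium"]   # in-game detail presets, best first

def best_playable(card, resolution, benchmark, avg_target=60.0, min_target=30.0):
    """Walk from the highest-quality combination downwards and return the
    first one that meets the fps targets on this card, or None if none do."""
    for aa, af, detail in product(AA_LEVELS, AF_LEVELS, DETAIL):
        avg_fps, min_fps = benchmark(card, resolution, aa, af, detail)
        if avg_fps >= avg_target and min_fps >= min_target:
            return {"aa": aa, "af": af, "detail": detail,
                    "avg_fps": avg_fps, "min_fps": min_fps}
    return None

# Because each card is swept independently, two cards can end up being
# compared at different settings - hence 'apples-to-oranges' testing.
```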

While this method was a step forward in terms of the information it gave the reader, it meant a vastly greater amount of work for the person testing the cards, and though some amateur sites have the time for such exhaustive testing, professional publications with ongoing deadlines can't afford to spend three weeks on a single graphics card. Also, interpreting the results requires the reader to have a significant understanding of how all these settings affect image quality. Finally, some readers simply don't want to wade through all the words and prefer to skip straight to the graphs. For these reasons we have not yet dabbled in this method of testing here at TR, and we don't plan on making a change quite that drastic. However, I do think that incorporating some of the lessons of the 'best playable settings' method into our testing would be beneficial.
