Half-Life 2 benchmarks, Valve, ATI marketing and more
  Sep 12, 2003, 08:30am EDT 
 
By: Sander Sassen

The magic word in the graphics industry today seems to be shaders, so it comes as no surprise that Gabe Newell of Valve, the software developer working on Half-Life 2, just held a lengthy presentation at ATI’s Shader Days that talked about shaders, shaders and some more shaders, and all the great things you can do with them. The gist of it is that image quality is becoming the determining factor in tomorrow’s games and in the requirements for videocards. But he also discussed the performance of currently available videocards with Half-Life 2, and unless you missed the coverage of this subject by other websites, it is safe to say that ATI came out on top. Nvidia cards, including the new FX series, seem to take quite a performance hit when using the DX9 code path, with performance about half that of a comparable ATI videocard.
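
To put those code paths in context: a game engine typically queries the videocard’s capabilities at startup and selects a rendering path accordingly. The fragment below is a hypothetical, much-simplified C++ sketch against the Direct3D 9 API; the function and the path labels are our own illustration, not Valve’s actual code.

#include <d3d9.h>

// Hypothetical render-path selector: inspect the pixel shader version
// the driver reports and pick a code path accordingly. The path labels
// are our own, not Valve's.
const char* SelectRenderPath(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    if (FAILED(device->GetDeviceCaps(&caps)))
        return "dx7-fixed-function";          // no caps info, play it safe

    // Pixel shader 2.0 support is the practical dividing line for DX9.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return "dx9";                         // Radeon 9x00, GeForce FX class

    // Pixel shader 1.1 to 1.4 hardware gets the DX8/DX8.1 path.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return "dx8";                         // GeForce 3/4 Ti, Radeon 8500 class

    return "dx7-fixed-function";              // everything older
}

The interesting wrinkle in Half-Life 2’s case is that Valve reportedly steers Nvidia’s FX series toward the mixed DX8/DX9 path by default, precisely because of the performance hit mentioned above.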

But reading the reports from people who have actually been to the event and seen the side-by-side comparisons, we learn that DX9 is mostly used for eye candy in Half-Life 2, not so much to raise the frame rate. The visual difference between the DX8 code path and the mixed DX8/DX9 path Nvidia is using isn’t as big as the benchmarks would have you believe. So we’re back to square one; remember when 3dfx and Nvidia were duking it out over 32-bit color, and 3dfx stated that 16-bit was good enough? Nvidia must’ve sold quite a few extra videocards on the simple fact that they had 32-bit color, never mind that very few games used it, or could use it without suffering a severe performance penalty.

Fig 1. A screenshot from Valve's Half-Life 2, fully utilizing DirectX 9.0.

That automatically brings me to the driver cheats that both Nvidia and, to a lesser degree, ATI have been caught using. These cheats were meant to increase performance by reducing image quality. In practice they didn’t affect visual quality much, otherwise it would have been blatantly obvious at first sight; some needed the utmost scrutiny of the image, or even of the driver code paths, to reveal any optimization at all. Whether this is a bad thing we leave up to you to judge, as in the end it all comes down to what makes a game playable. It is, and always will be, a thin line between calling something a cheat and calling it a performance enhancement. But I think both Nvidia and ATI tackled this problem rather well in their drivers, by offering a performance slider that can be set manually to any desired level: fully optimized for performance, for eye candy, or somewhere in between.
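
As an aside, the tradeoff such a slider controls can be pictured with texture filtering, one of the settings these optimizations typically touched. The fragment below is a hypothetical C++ sketch against the Direct3D 9 sampler-state API; the three-position slider and its mapping are our own illustration, not either vendor’s actual driver logic.

#include <d3d9.h>

// Hypothetical quality/performance slider, mapped to texture filtering.
// 0 = performance, 1 = balanced, 2 = eye candy. The mapping is our own
// illustration of the tradeoff, not actual ATI or Nvidia driver code.
void ApplyFilteringLevel(IDirect3DDevice9* device, int level)
{
    switch (level)
    {
    case 0: // performance: plain bilinear filtering
        device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_POINT);
        break;
    case 1: // balanced: full trilinear filtering
        device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
        break;
    default: // eye candy: anisotropic filtering on top of trilinear
        device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8);
        break;
    }
}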

Fig 2. Another Half-Life 2 screenshot, again utilizing DirectX 9.0.

However, at some point in the near future we’ll see a situation similar to that of modern processors: they’re plenty fast for 99% of tasks, and videocards will then also offer plenty of performance, just as they do for 2D today. The path leading up to that point will focus more and more on image quality, as high frame rates will merely be a matter of clockspeed. Visual effects will be processed by dedicated circuitry inside the graphics processor, much like the shaders and per-pixel processing of today’s videocards.
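
For readers wondering what per-pixel processing boils down to, here is a toy C++ sketch of the single most common per-pixel operation, Lambertian diffuse lighting; the names are our own, and a real videocard evaluates this in its pixel shader hardware rather than in C++.

#include <algorithm>

// Toy illustration of per-pixel lighting: intensity = max(N . L, 0),
// with N the surface normal and L the direction to the light, both
// assumed normalized. Pixel shaders run this kind of math per pixel.
struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

float ShadePixel(const Vec3& normal, const Vec3& lightDir)
{
    return std::max(0.0f, Dot(normal, lightDir));
}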

One thing is for sure though: ATI gambled on the right horse by making image quality their number one feature and working it into the core design of their graphics processors. All they need to do until a new DirectX specification is released (and will do, with the new 9800 and 9600 series) is increase the clockspeed and optimize their algorithms. Nvidia, on the other hand, gambled on raw frame rates and clockspeed; they are still not fully DX9 compatible, or have taken shortcuts in their design that yield DX9 compatibility at the expense of image quality or clockspeed. They will need to work much harder, and maybe even overhaul their basic design, if they want to remain competitive with ATI now that image quality is becoming the determining factor. Funny that only a few years ago Nvidia was in exactly the position ATI is in now; let’s see if they can turn things around again and come out with a product that delivers on all counts.

Sander Sassen.

 
