3dfx Driver Build 1.04.01: Increased Performance and Hidden Surface Removal (HSR) for the Voodoo4 and Voodoo5
by Matthew Witheiler on December 14, 2000 1:55 AM EST
The Test Explained
We were recently asked a few questions by our readers about our testing process. One question posed is why we do not use the more current, T&L-based D3D game Evolva for testing. The reason is Evolva's limited appeal: our benchmarks are there to show how a card performs in the games that readers actually play. Unreal Tournament, our current D3D benchmark, is far more popular than Evolva, allowing users to better judge how the games they play will perform on a given card or driver set. We have also recently switched to Reverend's Thunder demo benchmark, and we have been quite happy with the stress that this benchmark puts on the video card. Until a popular new D3D game with a respectable benchmark comes out, we will continue to use Unreal Tournament to judge D3D performance.
A second question posed is why we use D3D rather than Glide when testing 3dfx cards. First off, we test D3D instead of Glide because the number of games supporting Glide as an API is dwindling. The current crop of video games supports D3D and OpenGL far more than Glide, meaning that testing in D3D is a better representation of how a card will perform in the future. In addition, we pointed out in our GeForce2 GTS review that not only does Glide limit color depth to 16-bit (a quality frowned upon by many), it also runs slower than D3D at higher resolutions. For the full explanation, please look here. Keep in mind that those scores were recorded with a previous benchmark.
In support of this, consider the following quote from Tim Sweeney, taken from the "Ask The Sweeney" interview on VoodooExtreme:
Tim, [Question submitted by Jeff Walker] Is Glide dead yet?
Tim Sweeney's response -- Yes. Glide is dead. Nobody is writing any new code aimed at Glide. There are just some games on the market still taking advantage of it, so it will be a little while before the thing is fully buried. But I can assure you developers are doing their best to shovel dirt on the grave, and if we ever see the deceased try to claw its way out, we will whack it back down.
The final question we were asked was what settings we use to test. This is a very good question, and since it has been a while since we last addressed it, let's take a look at how we test.
First off, one will note the lack of a sound card in our test bed. We have chosen to eliminate the sound card from our test system because we wish to isolate the performance of the video card alone. Including a sound card in our tests would introduce a second potential bottleneck, meaning that the card's performance would no longer be measured accurately.
Now, when it comes to game configurations, we test each card with the same set of settings. In Quake III Arena, we test all cards with the "Normal" settings, modifying only the color depth, texture quality, and video mode (resolution). For example, at 1024x768x32, we set the video mode to 1024x768 and set both the color depth and texture quality to 32-bit.
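For those looking to reproduce this setup, the same settings can be applied from the Quake III console or a config file. The commands below are a minimal sketch of such a run; the cvar names come from the game itself, while the specific values are our 1024x768x32 example rather than something listed in this article:

    // Quake III Arena settings for a 1024x768x32 benchmark run
    r_mode 6            // video mode 6 corresponds to 1024x768
    r_colorbits 32      // 32-bit color depth
    r_texturebits 32    // 32-bit texture quality
    r_swapInterval 0    // V-Sync off, so framerate is not capped to the monitor refresh
    vid_restart         // apply the new video settings
    timedemo 1          // report frames per second instead of playing in real time
    demo demo001        // run the standard demo001.dm3 benchmark demo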
In Unreal Tournament, we alter only the world texture detail setting, the skin detail setting, the minimum desired framerate, the resolution, and the color depth. Both the world texture detail and skin detail settings are set to high, while the minimum desired framerate is set to 0 in order to stress the video card as much as possible. The resolution and color depth are set to whatever we are testing with at the time.
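These Unreal Tournament options are stored in the game's UnrealTournament.ini file. The snippet below is a rough sketch of the relevant [WinDrv.WindowsClient] entries for a 1024x768x32 run; the key names are assumed from the retail game's ini, with the values filled in to match our settings:

    [WinDrv.WindowsClient]
    ; world texture and skin detail at their highest setting
    TextureDetail=High
    SkinDetail=High
    ; 0 prevents the engine from dropping detail to hold a minimum framerate
    MinDesiredFrameRate=0.000000
    ; resolution and color depth under test
    FullscreenViewportX=1024
    FullscreenViewportY=768
    FullscreenColorBits=32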
Our final benchmark, MDK2, is left at its default settings except for enabling hardware T&L and setting the desired resolution. Even on cards that do not support hardware T&L, we enable this feature in order to see how the card performs in a T&L game. We expect to see more and more games support this feature, and we feel it is important to see how a card reacts in a T&L environment.
In all of our benchmarks, V-Sync is disabled. We do this because we wish to measure the performance of the video card, not the refresh rate of the monitor.
With those questions out of the way, let's see what kind of system we are going to test on today.
Windows 98 SE Test System

Hardware

CPU(s): Intel Pentium III 750E
Motherboard(s):
Memory: 128MB PC100 Corsair SDRAM
Hard Drive: IBM Deskstar DPTA-372050 20.5GB 7200 RPM Ultra ATA 66
CDROM: Phillips 48X
Video Card(s): 3dfx Voodoo4 4500 AGP 32MB
Ethernet: Linksys LNE100TX 100Mbit PCI Ethernet Adapter

Software

Operating System: Windows 98 SE
Video Drivers:

Benchmarking Applications

Gaming: idSoftware Quake III Arena demo001.dm3