The video card is the PC part where it is easiest to be deceived, especially if you decide to buy an entry-level board such as a GeForce FX 5200, GeForce FX 5500, GeForce 6200 or GeForce 6600.
Almost all mid-range and high-end video cards follow the same specs: if you buy a GeForce 6600 GT from manufacturer A, it will have the same specs and performance as a GeForce 6600 GT from manufacturer B.
The problem with low-end video cards is that NVIDIA doesn’t set standard clock rates or a standard memory bus width (the number of bits used to access video memory). This problem is particularly common with the chips listed above.
For these chips you can find models accessing memory at 32, 64 or 128 bits, and at different clock rates. On the market you can find GeForce 6600 models with memory clocks of 400 MHz, 500 MHz, 550 MHz and 600 MHz, for example.
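To see why these two numbers matter so much, peak memory bandwidth is simply bytes per transfer times transfers per second. Here is a minimal sketch of that arithmetic (the function name is ours, for illustration only):

```python
def memory_bandwidth_mb_s(bus_width_bits, clock_mhz):
    """Peak memory bandwidth in MB/s: (bus width in bytes) x (transfers per second)."""
    return bus_width_bits / 8 * clock_mhz

# A 128-bit card at 500 MHz moves four times the data of a 64-bit card at 250 MHz:
print(memory_bandwidth_mb_s(128, 500))  # 8000.0 MB/s
print(memory_bandwidth_mb_s(64, 250))   # 2000.0 MB/s
```

This is why two cards sold under the same name can perform very differently: halving the bus width or the memory clock halves the bandwidth available to the GPU.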
What happens: you buy a GeForce 6600 without paying attention to these details, only to learn later that your GeForce 6600 is slower than the GeForce 6600 of your friend, cousin or neighbor, since it uses a different clock or memory configuration. And maybe you decided to buy a GeForce 6600 exactly because you wanted a computer identical to your cousin’s.
The solution? Specify at the store the brand and the exact model you want to buy. “GeForce” is the name of the chip, and NVIDIA manufactures only chips, not boards. Video boards are manufactured by other companies such as XFX, Prolink/Pixelview, Gigabyte, Leadtek, eVGA, MSI and ASUS. At the manufacturer’s website you can obtain the exact model number you want, ask for that specific model at the store, and later check whether your PC really came with it.
Low-end video cards from ATI are easier to identify, since 64-bit models are labeled “SE” (e.g., Radeon 9200 SE) and ATI doesn’t allow board manufacturers to change the clock rates on their products.
To check the clock rates of your video card, run PowerStrip. Video cards have two clocks: the clock used internally by the video processor (GPU) and the clock used by the video processor to access the video memory. You need to check both. In Figure 2, you can see that our GeForce FX 5700 Ultra was running at 500 MHz with its memory running at 1 GHz.
How can you know the correct clock rates for your video card? Check this information in our tutorials NVIDIA Chips Comparison Table and ATI Chips Comparison Table. These tutorials have a complete list of all chips and their clock rates.
Notice that sometimes the clock rate reported by PowerStrip is half the clock published in our tables. This happens because nowadays video cards use memories with DDR technology, where two data chunks are transferred per clock cycle, doubling the performance compared to a system running at the same clock rate but transferring only one data chunk per cycle. Because of that, manufacturers usually announce their memory clocks “doubled.” For example, the GeForce FX 5700 Ultra accesses its video memory at 500 MHz, but since it achieves the same performance as if it were accessing it at 1 GHz (thanks to the DDR technique), the manufacturer says its memory clock is 1 GHz, which is not strictly true. In our tables we publish the “doubled” clock rates.
So, if PowerStrip lists a memory clock that is exactly half the memory clock shown in our tables, the clock rate is correct (this PowerStrip behavior is particularly common when checking video cards based on ATI chips). Notice that this applies only to the memory clock, not to the video processor clock (“core clock”).
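The doubling rule and the half-clock check described above can be sketched as follows (the function names are illustrative, not part of any real tool):

```python
def effective_ddr_clock(real_clock_mhz):
    """DDR transfers two data chunks per cycle, so the advertised
    'effective' clock is double the real clock."""
    return real_clock_mhz * 2

def matches_table(reported_mhz, table_mhz):
    """A reported memory clock matches the 'doubled' table value if it is
    equal to it, or exactly half of it (as PowerStrip often reports)."""
    return reported_mhz == table_mhz or reported_mhz * 2 == table_mhz

# GeForce FX 5700 Ultra: real memory clock of 500 MHz, advertised as 1 GHz.
print(effective_ddr_clock(500))   # 1000
print(matches_table(500, 1000))   # True: PowerStrip reports the real clock
print(matches_table(450, 1000))   # False: this card is underclocked
```

Remember that this half-clock tolerance is valid only for the memory clock; a core clock reading lower than the table value means the GPU really is running slower.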
We’d also like to remind you that if you want to run games on your PC, you should not buy a PC with on-board video (i.e., where the video is produced by the motherboard, also known as integrated graphics).