GeForce 8 Series Architecture


Instead of the graphics chip having separate shader engines for each task – for instance, separate pixel shader and vertex shader units – the GPU now has one big engine that can be programmed on the fly for whatever task is at hand: pixel shading, vertex shading, geometry or physics. This new architecture also makes it easier to add new shader types in the future.

The reason behind the unified architecture is that in some situations the GPU would be using all shader engines of one type (pixel shader engines, for example), even queuing tasks for them, while engines of another type (vertex shader engines, for example) sat idle but could not be used to perform a different task, since they were dedicated to a specific kind of processing.

So Shader Model 4.0 allows any shader engine to be used by any shader process – pixel, vertex, geometry or physics.
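As a rough illustration of what "one big programmable engine" means in practice, below is a minimal CUDA sketch (CUDA was introduced by NVIDIA alongside the G80); the kernel name, buffer size and launch parameters are illustrative assumptions, not from the article. The point is only that the same streaming processors can run any per-element computation, whether it resembles a vertex transform or a pixel shading calculation.

```c
// Minimal CUDA sketch: the unified streaming processors execute any
// data-parallel workload, not a fixed pixel or vertex pipeline stage.
// Names and sizes below are illustrative, not taken from the article.
#include <cuda_runtime.h>
#include <stdio.h>

// A generic per-element computation; it could just as well be a
// per-vertex transform or a per-pixel shading calculation.
__global__ void scale_and_offset(float *data, float scale, float offset, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] = data[i] * scale + offset;
}

int main(void)
{
    const int n = 1 << 20;                  // one million elements
    float *d_data;
    cudaMalloc(&d_data, n * sizeof(float));
    cudaMemset(d_data, 0, n * sizeof(float));

    // Thousands of threads are scheduled across the streaming processors.
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    scale_and_offset<<<blocks, threads>>>(d_data, 2.0f, 1.0f, n);
    cudaDeviceSynchronize();

    cudaFree(d_data);
    printf("kernel finished\n");
    return 0;
}
```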

In Figure 1, you can see the block diagram of the GeForce 8800 GTX, the highest-end model in the new GeForce 8 series. As you can see, this GPU has eight shader units, and none of them is dedicated to a specific task. Each shader unit has 16 streaming processors (the green boxes labeled SP), eight texture filtering units (the blue boxes labeled TF), four texture address units (not drawn in Figure 1) and one L1 memory cache (the orange box). This GPU also has six memory interface busses, each one 64 bits wide and with its own L2 memory cache. The streaming processors run at a different (higher) clock rate.

Figure 1: GeForce 8800 GTX block diagram.

So the GeForce 8800 GTX has 128 shader engines (i.e., streaming processors: 16 streaming processors x 8 units) and a 384-bit memory interface (64 bits x 6).

The GeForce 8800 GTS, another GeForce 8 model that was released, has six shader units and five 64-bit memory interface busses, so it has 96 shader engines (16 streaming processors x 6 units) and a 320-bit memory interface (64 bits x 5).
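For reference, the arithmetic behind these figures, using only the numbers quoted above, is sketched below (plain C host code, which also builds as CUDA host code with nvcc).

```c
// Quick check of the shader-engine and memory-bus arithmetic quoted above;
// the per-unit and per-bus values are taken directly from the text.
#include <stdio.h>

int main(void)
{
    const int sp_per_unit   = 16;   // streaming processors per shader unit
    const int bus_width_bit = 64;   // width of each memory interface bus

    // GeForce 8800 GTX: 8 shader units, 6 memory busses
    printf("8800 GTX: %d SPs, %d-bit memory interface\n",
           sp_per_unit * 8, bus_width_bit * 6);   // 128 SPs, 384 bits

    // GeForce 8800 GTS: 6 shader units, 5 memory busses
    printf("8800 GTS: %d SPs, %d-bit memory interface\n",
           sp_per_unit * 6, bus_width_bit * 5);   // 96 SPs, 320 bits

    return 0;
}
```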

We will discuss the technical specs for these two chips in a while.

Other features found on GeForce 8 series include:

  • Support for thousands of independent simultaneous threads, a technology NVIDIA calls GigaThread.

  • 16x anti-aliasing and 128-bit floating-point precision HDR (High Dynamic Range) – GeForce 6 and 7 use a 64-bit precision engine. NVIDIA calls these two technologies "Lumenex".

  • NVIDIA calls its physics simulation technology "Quantum Effects". Physics simulation improves the realism of effects like smoke, fire and explosions.

  • Hardware-based features to enhance 2D high-definition video quality, which NVIDIA calls "PureVideo HD". Two new features are provided on the GeForce 8 series: HD noise reduction and HD edge enhancement. The GeForce 8 series also supports HDCP decryption, in order to play HD-DVD and Blu-ray discs that use this technology on PCs. Of course, GeForce 8 supports all other enhancements provided on previous NVIDIA chips, like de-interlacing, high-quality scaling, inverse telecine and bad edit correction.

  • Recommended for resolutions starting at 1600 x 1200.

Author: Gabriel Torres

Gabriel Torres is a Brazilian best-selling ICT expert, with 24 books published. He started his online career in 1996, when he launched Clube do Hardware, which is one of the oldest and largest websites about technology in Brazil. He created Hardware Secrets in 1999 to expand his knowledge outside his home country.
