Unless you follow the video card market almost daily, it is hard to keep track of the differences between the many NVIDIA graphics chips available today. To make those differences easier to understand, we have compiled the table below.
It is important to note that beginning in 2007, both AMD (ATI) and NVIDIA started referring to the memory clock of their video cards by its real clock rate. In the past, manufacturers quoted double (or quadruple) the real clock rate, because DDR and the technologies based on it (DDR2, GDDR3, etc.) allow the memory chip to transfer two data blocks per clock cycle, while GDDR5 memories allow the memory chip to transfer four data blocks per clock cycle. So, a video card with a memory chip running at 500 MHz would be advertised as having a 1 GHz memory. In order to keep our table comparable with older chips, we still use the effective memory clock, not the real memory clock.
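To illustrate this convention, here is a small sketch (the function and table names are ours, purely illustrative) that converts a chip's real memory clock into the effective clock used in the table below:

```python
# Effective (advertised) memory clock = real clock x data blocks per cycle.
# Multipliers follow the convention described above; names are illustrative.
TRANSFERS_PER_CYCLE = {"DDR": 2, "DDR2": 2, "GDDR3": 2, "GDDR5": 4}

def effective_clock_mhz(real_mhz, memory_type):
    """Return the effective memory clock in MHz for a given real clock."""
    return real_mhz * TRANSFERS_PER_CYCLE[memory_type]

print(effective_clock_mhz(500, "DDR"))     # 500 MHz DDR chip -> 1000 (1 GHz)
print(effective_clock_mhz(1750, "GDDR5"))  # 1,750 MHz GDDR5 chip -> 7000 (7 GHz)
```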
GPU | Clock† | Turbo Clock | Memory Clock | Memory Interface |
GeForce4 MX 420 | 250 MHz | – | 166 MHz | 128-bit |
GeForce4 MX 440 SE | 250 MHz | – | 333 MHz | 64-bit |
GeForce4 MX 440 | 270 MHz | – | 400 MHz | 128-bit |
GeForce4 MX 440 AGP 8x | 275 MHz | – | 512 MHz | 128-bit |
GeForce4 MX 460 | 300 MHz | – | 550 MHz | 128-bit |
GeForce MX 4000 | 250 MHz | – | * | 32-bit, 64-bit or 128-bit |
GeForce4 Ti 4200 | 250 MHz | – | 514 MHz (64 MB) or 444 MHz (128 MB) | 128-bit |
GeForce4 Ti 4200 AGP 8x | 250 MHz | – | 500 MHz | 128-bit |
GeForce PCX 4300 | 275 MHz | – | 512 MHz | 128-bit |
GeForce4 Ti 4400 | 275 MHz | – | 550 MHz | 128-bit |
GeForce4 Ti 4600 | 300 MHz | – | 650 MHz | 128-bit |
GeForce4 Ti 4800 SE | 275 MHz | – | 550 MHz | 128-bit |
GeForce4 Ti 4800 | 300 MHz | – | 650 MHz | 128-bit |
GeForce FX 5200 | 250 MHz | – | 400 MHz | 64-bit or 128-bit |
GeForce FX 5200 SE | 250 MHz | – | 333 MHz | 64-bit |
GeForce FX 5200 Ultra | 350 MHz | – | 650 MHz | 128-bit |
GeForce PCX 5300 | 325 MHz | – | 650 MHz | 128-bit |
GeForce FX 5600 | 325 MHz | – | 550 MHz | 128-bit |
GeForce FX 5600 SE | 250 MHz | – | 550 MHz | 64-bit |
GeForce FX 5500 | 270 MHz | – | 400 MHz | 64-bit or 128-bit |
GeForce FX 5600 Ultra | 500 MHz | – | 800 MHz | 128-bit |
GeForce FX 5700 LE | 250 MHz | – | 400 MHz | 128-bit |
GeForce FX 5700 | 425 MHz | – | 600 MHz | 128-bit |
GeForce FX 5700 VE | 300 MHz | – | 500 MHz | 128-bit |
GeForce FX 5700 Ultra | 475 MHz | – | 900 MHz | 128-bit |
GeForce PCX 5750 | 475 MHz | – | 900 MHz | 128-bit |
GeForce FX 5800 | 400 MHz | – | 800 MHz | 128-bit |
GeForce FX 5800 Ultra | 500 MHz | – | 1 GHz | 128-bit |
GeForce FX 5900 XT | 390 MHz | – | 680 MHz | 256-bit |
GeForce FX 5900 | 400 MHz | – | 850 MHz | 256-bit |
GeForce FX 5900 SE | 400 MHz | – | 700 MHz | 256-bit |
GeForce FX 5900 ZT | 325 MHz | – | 700 MHz | 256-bit |
GeForce PCX 5900 | 350 MHz | – | 500 MHz | 256-bit |
GeForce FX 5900 Ultra | 450 MHz | – | 850 MHz | 256-bit |
GeForce PCX 5950 | 475 MHz | – | 950 MHz | 256-bit |
GeForce FX 5950 Ultra | 475 MHz | – | 950 MHz | 256-bit |
GeForce 6200 | 300 MHz | – | 550 MHz | 128-bit |
GeForce 6200 LE | 350 MHz | – | 550 MHz | 64-bit |
GeForce 6200 (TC) | 350 MHz | – | 666 MHz * | 32-bit or 64-bit |
GeForce 6500 (TC) | 400 MHz | – | 666 MHz | 32-bit or 64-bit |
GeForce 6600 | 300 MHz | – | 550 MHz * | 64-bit or 128-bit |
GeForce 6600 DDR2 | 350 MHz | – | 800 MHz * | 128-bit |
GeForce 6600 LE | 300 MHz | – | * | 64-bit or 128-bit |
GeForce 6600 GT | 500 MHz | – | 1 GHz | 128-bit |
GeForce 6600 GT AGP | 500 MHz | – | 900 MHz | 128-bit |
GeForce 6800 LE | 300 MHz | – | 700 MHz | 256-bit |
GeForce 6800 XT | 325 MHz | – | 600 MHz | 256-bit |
GeForce 6800 XT AGP | 325 MHz | – | 700 MHz | 256-bit |
GeForce 6800 | 325 MHz | – | 600 MHz | 256-bit |
GeForce 6800 AGP | 325 MHz | – | 700 MHz | 256-bit |
GeForce 6800 GS | 425 MHz | – | 1 GHz | 256-bit |
GeForce 6800 GS AGP | 350 MHz | – | 1 GHz | 256-bit |
GeForce 6800 GT | 350 MHz | – | 1 GHz | 256-bit |
GeForce 6800 Ultra | 400 MHz | – | 1.1 GHz | 256-bit |
GeForce 6800 Ultra Extreme | 450 MHz | – | 1.1 GHz | 256-bit |
GeForce 7100 GS (TC) | 350 MHz | – | 666 MHz * | 64-bit |
GeForce 7200 GS (TC) | 450 MHz | – | 800 MHz * | 64-bit |
GeForce 7300 SE (TC) | 225 MHz | – | * | 64-bit |
GeForce 7300 LE (TC) | 450 MHz | – | 648 MHz * | 64-bit |
GeForce 7300 GS (TC) | 550 MHz | – | 810 MHz * | 64-bit |
GeForce 7300 GT (TC) | 350 MHz | – | 667 MHz | 128-bit |
GeForce 7500 LE OEM | 550 MHz | – | 800 MHz | 64-bit |
GeForce 7600 GS | 400 MHz | – | 800 MHz | 128-bit |
GeForce 7600 GT | 560 MHz | – | 1.4 GHz | 128-bit |
GeForce 7650 GS OEM | 400 MHz | – | 800 MHz | 128-bit |
GeForce 7800 GS | 375 MHz | – | 1.2 GHz | 256-bit |
GeForce 7800 GT | 400 MHz | – | 1 GHz | 256-bit |
GeForce 7800 GTX | 430 MHz | – | 1.2 GHz | 256-bit |
GeForce 7800 GTX 512 | 550 MHz | – | 1.7 GHz | 256-bit |
GeForce 7900 GS | 450 MHz | – | 1.32 GHz | 256-bit |
GeForce 7900 GT | 450 MHz | – | 1.32 GHz | 256-bit |
GeForce 7900 GTO | 650 MHz | – | 1.32 GHz | 256-bit |
GeForce 7900 GTX | 650 MHz | – | 1.6 GHz | 256-bit |
GeForce 7950 GT | 550 MHz | – | 1.4 GHz | 256-bit |
GeForce 7950 GX2‡ | 500 MHz x2 | – | 1.2 GHz x2 | 256-bit x2 |
GeForce 8400 GS | 450 / 900 MHz | – | 800 MHz | 64-bit |
GeForce 8400 GS (Rev. 2) | 567 / 1,400 MHz | – | 800 MHz | 64-bit |
GeForce 8400 GS (Rev. 3) | 520 / 1,230 MHz | – | 800 MHz | 64-bit |
GeForce 8500 GT | 450 / 900 MHz | – | 666 MHz or 800 MHz | 128-bit |
GeForce 8600 GT DDR2 | 540 / 1.18 GHz | – | 666 MHz or 800 MHz | 128-bit |
GeForce 8600 GT GDDR3 | 540 / 1.18 GHz | – | 1.4 GHz | 128-bit |
GeForce 8600 GTS | 675 / 1.45 GHz | – | 2 GHz | 128-bit |
GeForce 8800 GS | 550 / 1,375 MHz | – | 1.6 GHz | 192-bit |
GeForce 8800 GT | 600 / 1.5 GHz | – | 1.8 GHz | 256-bit |
GeForce 8800 GTS | 500 / 1.2 GHz | – | 1.6 GHz | 320-bit |
GeForce 8800 GTS 512 | 650 / 1,625 MHz | – | 1.94 GHz | 256-bit |
GeForce 8800 GTX | 575 / 1.35 GHz | – | 1.8 GHz | 384-bit |
GeForce 8800 Ultra | 612 / 1.5 GHz | – | 2.16 GHz | 384-bit |
GeForce 9400 GT | 550 / 1.4 GHz | – | 800 MHz | 128-bit |
GeForce 9500 GT DDR2 | 550 / 1,400 MHz | – | 1 GHz | 128-bit |
GeForce 9500 GT GDDR3 | 550 / 1,400 MHz | – | 1.6 GHz | 128-bit |
GeForce 9600 GSO | 550 / 1,350 MHz | – | 1.6 GHz | 192-bit |
GeForce 9600 GSO 512 | 650 / 1,625 MHz | – | 1.8 GHz | 256-bit |
GeForce 9600 GT | 600 / 1,500 MHz or 650 / 1,625 MHz | – | 1.8 GHz | 256-bit |
GeForce 9800 GT | 600 / 1,500 MHz | – | 1.8 GHz | 256-bit |
GeForce 9800 GTX | 675 / 1,688 MHz | – | 2.2 GHz | 256-bit |
GeForce 9800 GTX+ | 738 / 1,836 MHz | – | 2.2 GHz | 256-bit |
GeForce 9800 GX2‡ | 600 / 1,500 MHz x2 | – | 2 GHz x2 | 256-bit x2 |
GeForce G100 | 567 / 1,400 MHz | – | 1 GHz | 64-bit |
GeForce GT 120 | 500 / 1,400 MHz | – | 1 GHz | 128-bit |
GeForce GT 130 | 500 / 1,250 MHz | – | 1 GHz | 192-bit |
GeForce GTS 150 | 738 / 1,836 MHz | – | 2 GHz | 256-bit |
GeForce 205 OEM | 589 / 1,402 MHz | – | 1 GHz | 64-bit |
GeForce 210 | 589 / 1,402 MHz | – | 1 GHz | 64-bit |
GeForce GT 220 | 625 / 1,360 MHz | – | 1.58 GHz | 128-bit |
GeForce GT 240 DDR3 | 550 / 1,340 MHz | – | 1.8 GHz | 128-bit |
GeForce GT 240 GDDR3 | 550 / 1,340 MHz | – | 2 GHz | 128-bit |
GeForce GT 240 GDDR5 | 550 / 1,340 MHz | – | 3.4 GHz | 128-bit |
GeForce GTS 240 OEM | 675 / 1,620 MHz | – | 2.2 GHz | 256-bit |
GeForce GTS 250 512 MiB | 675 / 1,620 MHz | – | 2 GHz | 256-bit |
GeForce GTS 250 1 GiB | 738 / 1,836 MHz | – | 2.2 GHz | 256-bit |
GeForce GTX 260 | 576 / 1,242 MHz | – | 2 GHz | 448-bit |
GeForce GTX 260/216 | 576 / 1,242 MHz | – | 2 GHz | 448-bit |
GeForce GTX 275 | 633 / 1,404 MHz | – | 2.268 GHz | 448-bit |
GeForce GTX 280 | 602 / 1,296 MHz | – | 2.21 GHz | 512-bit |
GeForce GTX 285 | 648 / 1,476 MHz | – | 2.48 GHz | 512-bit |
GeForce GTX 295‡ | 576 / 1,242 MHz x2 | – | 2 GHz x2 | 448-bit x2 |
GeForce 310 OEM | 589 / 1,402 MHz | – | 1 GHz | 64-bit |
GeForce 315 OEM | 475 / 1,100 MHz | – | 1.58 GHz | 64-bit |
GeForce GT 320 OEM | 540 / 1,302 MHz | – | 1.58 GHz | 128-bit |
GeForce GT 330 OEM | 500 / 1,250 MHz or 550 / 1,340 MHz | – | 1 GHz or 1.6 GHz | 128-bit, 192-bit or 256-bit |
GeForce GT 340 OEM | 550 / 1,340 MHz | – | 3.4 GHz | 128-bit |
GeForce GT 405 OEM | 589 / 1,402 MHz | – | 1.58 GHz | 64-bit |
GeForce GT 420 OEM | 700 / 1,400 MHz | – | 1.8 GHz | 128-bit |
GeForce GT 430 | 700 / 1,400 MHz | – | 1.6 GHz or 1.8 GHz | 128-bit |
GeForce GT 440 DDR3 | 810 / 1,620 MHz | – | 1.8 GHz | 128-bit |
GeForce GT 440 GDDR5 | 810 / 1,620 MHz | – | 3.2 GHz | 128-bit |
GeForce GTS 450 | 783 / 1,566 MHz | – | 3.6 GHz | 128-bit |
GeForce GTX 460 SE | 650 / 1,300 MHz | – | 3.4 GHz | 256-bit |
GeForce GTX 460 768 MiB | 675 / 1,350 MHz | – | 3.6 GHz | 192-bit |
GeForce GTX 460 1 GiB | 675 / 1,350 MHz | – | 3.6 GHz | 256-bit |
GeForce GTX 460 v2 1 GiB | 778 / 1,556 MHz | – | 4 GHz | 192-bit |
GeForce GTX 465 | 607 / 1,215 MHz | – | 3,206 MHz | 256-bit |
GeForce GTX 470 | 607 / 1,215 MHz | – | 3,348 MHz | 320-bit |
GeForce GTX 480 | 700 / 1,401 MHz | – | 3,696 MHz | 384-bit |
GeForce GT 510 OEM | 523 / 1,046 MHz | – | 1.8 GHz | 64-bit |
GeForce GT 520 | 810 / 1,620 MHz | – | 1.8 GHz | 64-bit |
GeForce GT 530 OEM | 700 / 1,400 MHz | – | Up to 1,796 MHz | 128-bit |
GeForce GT 545 DDR3 | 720 / 1,440 MHz | – | Up to 1.8 GHz | 192-bit |
GeForce GT 545 GDDR5 OEM | 870 / 1,740 MHz | – | 4 GHz | 128-bit |
GeForce GTX 550 Ti | 900 / 1,800 MHz | – | 4.1 GHz | 192-bit |
GeForce GTX 555 OEM | 776 / 1,553 MHz | – | 3,828 MHz | 192-bit |
GeForce GTX 560 | 810 to 950 / 1,620 to 1,900 MHz | – | 4 to 4.4 GHz | 256-bit |
GeForce GTX 560 OEM | 552 / 1,104 MHz | – | 3,206 MHz | 320-bit |
GeForce GTX 560 SE | 736 / 1,472 MHz | – | 3,828 MHz | 192-bit |
GeForce GTX 560 Ti | 822 / 1,644 MHz | – | 4,008 MHz | 256-bit |
GeForce GTX 560 Ti/448 | 732 / 1,464 MHz | – | 3.8 GHz | 320-bit |
GeForce GTX 560 Ti OEM | 732 / 1,464 MHz | – | 3.8 GHz | 320-bit |
GeForce GTX 570 | 732 / 1,464 MHz | – | 3.8 GHz | 320-bit |
GeForce GTX 580 | 772 / 1,544 MHz | – | 4,008 MHz | 384-bit |
GeForce GTX 590‡ | 607 / 1,215 MHz x2 | – | 3,414 MHz x2 | 384-bit x2 |
GeForce GT 605 OEM | 523 / 1,046 MHz | – | 1,796 MHz | 64-bit |
GeForce GT 610 | 810 / 1,620 MHz | – | 1,796 MHz | 64-bit |
GeForce GT 620 | 700 / 1,400 MHz | – | 1.8 GHz | 64-bit |
GeForce GT 620 OEM | 810 / 1,620 MHz | – | 1,796 MHz | 64-bit |
GeForce GT 630 D3 | 700 MHz / 1.4 GHz | – | 1.6 to 1.8 GHz | 128-bit |
GeForce GT 630 G5 | 810 MHz / 1.62 GHz | – | 3.2 GHz (GDDR5) | 128-bit |
GeForce GT 630 2 GiB | 902 MHz | – | 1.8 GHz | 64-bit |
GeForce GT 630 OEM | 875 MHz | – | 1,782 MHz | 128-bit |
GeForce GT 635 | 967 MHz | – | 2 GHz | 64-bit |
GeForce GT 640 2 GiB DDR3 | 900 MHz | – | 1.8 GHz | 128-bit |
GeForce GT 640 1 GiB GDDR5 | 1,046 MHz | – | 5 GHz | 64-bit |
GeForce GT 640 OEM DDR3 | 797 MHz | – | 1,782 MHz | 128-bit |
GeForce GT 640 OEM DDR3 | 720 / 1,440 MHz | – | 1,782 MHz | 192-bit |
GeForce GT 640 OEM GDDR5 | 950 MHz | – | 5 GHz | 128-bit |
GeForce GT 645 OEM | 823 MHz | – | 4 GHz | 128-bit |
GeForce GTX 650 | 1,058 MHz | – | 5 GHz | 128-bit |
GeForce GTX 650 Ti | 928 MHz | – | 5.4 GHz | 128-bit |
GeForce GTX 650 Ti Boost | 980 MHz | 1,033 MHz | 6 GHz | 192-bit |
GeForce GTX 660 | 980 MHz | 1,033 MHz | 6 GHz | 192-bit |
GeForce GTX 660 Ti | 915 MHz | 980 MHz | 6 GHz | 192-bit |
GeForce GTX 670 | 915 MHz | 980 MHz | 6 GHz | 256-bit |
GeForce GTX 680 | 1,006 MHz | 1,058 MHz | 6,008 MHz | 256-bit |
GeForce GTX 690‡ | 915 MHz x2 | 1,019 MHz x2 | 6,008 MHz x2 | 256-bit x2 |
GeForce GT 720 (DDR3) | 797 MHz | – | 1.8 GHz | 64-bit |
GeForce GT 720 (GDDR5) | 797 MHz | – | 5 GHz | 64-bit |
GeForce GT 730 (DDR3, 64-bit) | 902 MHz | – | 1.8 GHz | 64-bit |
GeForce GT 730 (DDR3, 128-bit) | 700 MHz | – | 1.8 GHz | 128-bit |
GeForce GT 730 (GDDR5) | 902 MHz | – | 5 GHz | 64-bit |
GeForce GT 740 (DDR3) | 993 MHz | – | 1.8 GHz | 128-bit |
GeForce GT 740 (GDDR5) | 993 MHz | – | 5 GHz | 128-bit |
GeForce GTX 750 | 1,020 MHz | 1,085 MHz | 5 GHz | 128-bit |
GeForce GTX 750 Ti | 1,020 MHz | 1,085 MHz | 5.4 GHz | 128-bit |
GeForce GTX 760 | 980 MHz | 1,033 MHz | 6 GHz | 256-bit |
GeForce GTX 770 | 1,046 MHz | 1,085 MHz | 7 GHz | 256-bit |
GeForce GTX 780 | 863 MHz | 900 MHz | 6 GHz | 384-bit |
GeForce GTX 780 Ti | 875 MHz | 928 MHz | 7 GHz | 384-bit |
GeForce GTX TITAN | 837 MHz | 876 MHz | 6 GHz | 384-bit |
GeForce GTX TITAN Black | 889 MHz | 980 MHz | 7 GHz | 384-bit |
GeForce GTX TITAN Z‡ | 705 MHz | 876 MHz | 7 GHz | 384-bit |
GeForce GTX 960 | 1,127 MHz | 1,178 MHz | 7 GHz | 128-bit |
GeForce GTX 970 | 1,050 MHz | 1,178 MHz | 7 GHz | 256-bit |
GeForce GTX 980 | 1,126 MHz | 1,216 MHz | 7 GHz | 256-bit |
GeForce GTX 980 Ti | 1,000 MHz | 1,075 MHz | 7 GHz | 384-bit |
GeForce GTX TITAN X | 1,000 MHz | 1,075 MHz | 7 GHz | 384-bit |
GPU | Memory Bandwidth | Proc. | DirectX | OpenGL | PCIe |
GeForce4 MX 420 | 2.6 GB/s | 1 | 7 | 1.2 | AGP 4x |
GeForce4 MX 440 SE | 2.6 GB/s | 1 | 7 | 1.2 | AGP 4x |
GeForce4 MX 440 | 6.4 GB/s | 1 | 7 | 1.2 | AGP 4x |
GeForce4 MX 440 AGP 8x | 8.1 GB/s | 1 | 7 | 1.2 | AGP 8x |
GeForce4 MX 460 | 8.8 GB/s | 1 | 7 | 1.2 | AGP 4x |
GeForce MX 4000 | * | 1 | 7 | 1.2 | AGP 8x |
GeForce4 Ti 4200 | 8.2 GB/s (64 MB) or 7.1 GB/s (128 MB) | 4 | 8.1 | 1.4 | AGP 4x |
GeForce4 Ti 4200 AGP 8x | 8 GB/s | 4 | 8.1 | 1.4 | AGP 8x |
GeForce PCX 4300 | 8.2 GB/s | 2 | 7 | 1.4 | 1.1 |
GeForce4 Ti 4400 | 8.8 GB/s | 4 | 8.1 | 1.4 | AGP 4x |
GeForce4 Ti 4600 | 10.4 GB/s | 4 | 8.1 | 1.4 | AGP 4x |
GeForce4 Ti 4800 SE | 8.8 GB/s | 4 | 8.1 | 1.4 | AGP 8x |
GeForce4 Ti 4800 | 10.4 GB/s | 4 | 8.1 | 1.4 | AGP 8x |
GeForce FX 5200 | 3.2 GB/s or 6.4 GB/s | 4 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5200 SE | 2.7 GB/s | 4 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5200 Ultra | 10.4 GB/s | 4 | 9.0 | 1.5 | AGP 8x |
GeForce PCX 5300 | 10.4 GB/s | 4 | 9.0 | 1.5 | 1.1 |
GeForce FX 5600 | 8.8 GB/s | 4 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5600 SE | 4.4 GB/s | 4 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5500 | 3.2 GB/s or 6.4 GB/s | 4 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5600 Ultra | 12.8 GB/s | 4 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5700 LE | 6.4 GB/s | 4 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5700 | 9.6 GB/s | 4 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5700 VE | 8 GB/s | 4 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5700 Ultra | 14.4 GB/s | 4 | 9.0 | 1.5 | AGP 8x |
GeForce PCX 5750 | 14.4 GB/s | 4 | 9.0 | 1.5 | 1.1 |
GeForce FX 5800 | 12.8 GB/s | 8 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5800 Ultra | 16 GB/s | 8 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5900 XT | 21.7 GB/s | 8 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5900 | 27.2 GB/s | 8 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5900 SE | 22.4 GB/s | 8 | 9.0 | 1.5 | AGP 8x |
GeForce FX 5900 ZT | 22.4 GB/s | 8 | 9.0 | 1.5 | AGP 8x |
GeForce PCX 5900 | 17.6 GB/s | 8 | 9.0 | 1.5 | 1.1 |
GeForce FX 5900 Ultra | 27.2 GB/s | 8 | 9.0 | 1.5 | AGP 8x |
GeForce PCX 5950 | 30.4 GB/s | 8 | 9.0 | 1.5 | 1.1 |
GeForce FX 5950 Ultra | 30.4 GB/s | 8 | 9.0 | 1.5 | AGP 8x |
GeForce 6200 | 8.8 GB/s | 4 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 6200 LE | 4.4 GB/s | 2 | 9.0c | 2.1 | 1.1 |
GeForce 6200 (TC) | 2.66 GB/s or 5.32 GB/s * | 4 | 9.0c | 2.1 | 1.1 |
GeForce 6500 (TC) | 2.66 GB/s or 5.32 GB/s | 4 | 9.0c | 2.1 | 1.1 |
GeForce 6600 | 4.4 GB/s or 8.8 GB/s * | 8 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 6600 DDR2 | 12.8 GB/s | 8 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 6600 LE | 4 GB/s or 8 GB/s | 4 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 6600 GT | 16 GB/s | 8 | 9.0c | 2.1 | 1.1 |
GeForce 6600 GT AGP | 14.4 GB/s | 8 | 9.0c | 2.1 | AGP 8x |
GeForce 6800 LE | 22.4 GB/s | 8 | 9.0c | 2.1 | AGP 8x |
GeForce 6800 XT | 19.2 GB/s | 8 | 9.0c | 2.1 | 1.1 |
GeForce 6800 XT AGP | 22.4 GB/s | 8 | 9.0c | 2.1 | AGP 8x |
GeForce 6800 | 19.2 GB/s | 12 | 9.0c | 2.1 | 1.1 |
GeForce 6800 AGP | 22.4 GB/s | 12 | 9.0c | 2.1 | AGP 8x |
GeForce 6800 GS | 32 GB/s | 12 | 9.0c | 2.1 | 1.1 |
GeForce 6800 GS AGP | 32 GB/s | 12 | 9.0c | 2.1 | AGP 8x |
GeForce 6800 GT | 32 GB/s | 16 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 6800 Ultra | 35.2 GB/s | 16 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 6800 Ultra Extreme | 35.2 GB/s | 16 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 7100 GS (TC) | 5.32 GB/s * | 4 | 9.0c | 2.1 | 1.1 |
GeForce 7200 GS (TC) | 6.4 GB/s * | 4 | 9.0c | 2.1 | 1.1 |
GeForce 7300 SE (TC) | * | 4 | 9.0c | 2.1 | 1.1 |
GeForce 7300 LE (TC) | 5.2 GB/s * | 4 | 9.0c | 2.1 | 1.1 |
GeForce 7300 GS (TC) | 6.5 GB/s * | 4 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 7300 GT (TC) | 10.6 GB/s | 8 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 7500 LE OEM | 6.4 GB/s | 4 | 9.0c | 2.1 | 1.1 |
GeForce 7600 GS | 12.8 GB/s | 12 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 7600 GT | 22.4 GB/s | 12 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 7650 GS OEM | 12.8 GB/s | 12 | 9.0c | 2.1 | 1.1 |
GeForce 7800 GS | 38.4 GB/s | 16 | 9.0c | 2.1 | AGP 8x |
GeForce 7800 GT | 32 GB/s | 20 | 9.0c | 2.1 | 1.1 |
GeForce 7800 GTX | 38.4 GB/s | 24 | 9.0c | 2.1 | 1.1 |
GeForce 7800 GTX 512 | 54.4 GB/s | 24 | 9.0c | 2.1 | 1.1 |
GeForce 7900 GS | 42.2 GB/s | 20 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 7900 GT | 42.2 GB/s | 24 | 9.0c | 2.1 | 1.1 |
GeForce 7900 GTO | 42.2 GB/s | 24 | 9.0c | 2.1 | 1.1 |
GeForce 7900 GTX | 51.2 GB/s | 24 | 9.0c | 2.1 | 1.1 |
GeForce 7950 GT | 44.8 GB/s | 24 | 9.0c | 2.1 | AGP 8x or 1.1 |
GeForce 7950 GX2‡ | 38.4 GB/s x2 | 24 x2 | 9.0c | 2.1 | 1.1 |
GeForce 8400 GS | 6.4 GB/s | 16 | 10 | 2.1 | 1.1 |
GeForce 8400 GS (Rev. 2) | 6.4 GB/s | 8 | 10 | 3.3 | 2.0 |
GeForce 8400 GS (Rev. 3) | 6.4 GB/s | 16 | 10.1 | 3.3 | 2.0 |
GeForce 8500 GT | 10.6 GB/s or 12.8 GB/s | 16 | 10 | 2.1 | 1.1 |
GeForce 8600 GT DDR2 | 10.6 GB/s or 12.8 GB/s | 32 | 10 | 2.1 | 1.1 |
GeForce 8600 GT GDDR3 | 22.4 GB/s | 32 | 10 | 2.1 | 1.1 |
GeForce 8600 GTS | 32 GB/s | 32 | 10 | 2.1 | 2.0 |
GeForce 8800 GS | 38.4 GB/s | 96 | 10 | 2.1 | 2.0 |
GeForce 8800 GT | 57.6 GB/s | 112 | 10 | 2.1 | 2.0 |
GeForce 8800 GTS | 64 GB/s | 96 | 10 | 2.1 | 1.1 |
GeForce 8800 GTS 512 | 62.08 GB/s | 128 | 10 | 2.1 | 2.0 |
GeForce 8800 GTX | 86.4 GB/s | 128 | 10 | 2.1 | 1.1 |
GeForce 8800 Ultra | 103.6 GB/s | 128 | 10 | 2.1 | 1.1 |
GeForce 9400 GT | 12.8 GB/s | 16 | 10 | 2.1 | 2.0 |
GeForce 9500 GT DDR2 | 16 GB/s | 32 | 10 | 2.1 | 2.0 |
GeForce 9500 GT GDDR3 | 25.6 GB/s | 32 | 10 | 2.1 | 2.0 |
GeForce 9600 GSO | 38.4 GB/s | 96 | 10 | 2.1 | 2.0 |
GeForce 9600 GSO 512 | 57.6 GB/s | 48 | 10 | 2.1 | 2.0 |
GeForce 9600 GT | 57.6 GB/s | 64 | 10 | 2.1 | 2.0 |
GeForce 9800 GT | 57.6 GB/s | 112 | 10 | 2.1 | 2.0 |
GeForce 9800 GTX | 70.4 GB/s | 128 | 10 | 2.1 | 2.0 |
GeForce 9800 GTX+ | 70.4 GB/s | 128 | 10 | 2.1 | 2.0 |
GeForce 9800 GX2‡ | 64 GB/s x2 | 128 x2 | 10 | 2.1 | 2.0 |
GeForce G100 | 8 GB/s | 8 | 10 | 2.1 | 2.0 |
GeForce GT 120 | 16 GB/s | 32 | 10 | 3.0 | 2.0 |
GeForce GT 130 | 24 GB/s | 48 | 10 | 2.1 | 2.0 |
GeForce GTS 150 | 64 GB/s | 128 | 10 | 2.1 | 2.0 |
GeForce 205 OEM | 8 GB/s | 8 | 10.1 | 3.1 | 2.0 |
GeForce 210 | 8 GB/s | 16 | 10.1 | 3.1 | 2.0 |
GeForce GT 220 | 25.28 GB/s | 48 | 10.1 | 3.1 | 2.0 |
GeForce GT 240 DDR3 | 28.8 GB/s | 96 | 10.1 | 3.2 | 2.0 |
GeForce GT 240 GDDR3 | 32 GB/s | 96 | 10.1 | 3.2 | 2.0 |
GeForce GT 240 GDDR5 | 54.4 GB/s | 96 | 10.1 | 3.2 | 2.0 |
GeForce GTS 240 OEM | 70.4 GB/s | 112 | 10 | 3.0 | 2.0 |
GeForce GTS 250 512 MiB | 64 GB/s | 128 | 10 | 3.0 | 2.0 |
GeForce GTS 250 1 GiB | 70.4 GB/s | 128 | 10 | 3.0 | 2.0 |
GeForce GTX 260 | 112 GB/s | 192 | 10 | 2.1 | 2.0 |
GeForce GTX 260/216 | 112 GB/s | 216 | 10 | 2.1 | 2.0 |
GeForce GTX 275 | 127 GB/s | 240 | 10 | 3.0 | 2.0 |
GeForce GTX 280 | 141.7 GB/s | 240 | 10 | 2.1 | 2.0 |
GeForce GTX 285 | 159 GB/s | 240 | 10 | 2.1 | 2.0 |
GeForce GTX 295‡ | 112 GB/s x2 | 240 x2 | 10 | 2.1 | 2.0 |
GeForce 310 OEM | 8 GB/s | 16 | 10.1 | 3.1 | 2.0 |
GeForce 315 OEM | 12.6 GB/s | 48 | 10.1 | 3.2 | 2.0 |
GeForce GT 320 OEM | 25.3 GB/s | 72 | 10.1 | 3.2 | 2.0 |
GeForce GT 330 OEM | (varies) | 96 or 112 | 10 | 3.2 | 2.0 |
GeForce GT 340 OEM | 54.4 GB/s | 96 | 10.1 | 3.2 | 2.0 |
GeForce GT 405 OEM | 12.64 GB/s | 16 | 10.1 | 3.1 | 2.0 |
GeForce GT 420 OEM | 28.8 GB/s | 48 | 11 | 4.1 | 2.0 |
GeForce GT 430 | 25.6 GB/s or 28.8 GB/s | 96 | 11 | 4.2 | 2.0 |
GeForce GT 440 DDR3 | 28.8 GB/s | 96 | 11 | 4.2 | 2.0 |
GeForce GT 440 GDDR5 | 51.2 GB/s | 96 | 11 | 4.2 | 2.0 |
GeForce GTS 450 | 57.7 GB/s | 192 | 11 | 4.2 | 2.0 |
GeForce GTX 460 SE | 108.8 GB/s | 288 | 11 | 4.1 | 2.0 |
GeForce GTX 460 768 MiB | 86.4 GB/s | 336 | 11 | 4.1 | 2.0 |
GeForce GTX 460 1 GiB | 115.2 GB/s | 336 | 11 | 4.1 | 2.0 |
GeForce GTX 460 v2 1 GiB | 96.2 GB/s | 336 | 11 | 4.1 | 2.0 |
GeForce GTX 465 | 102.6 GB/s | 352 | 11 | 4.2 | 2.0 |
GeForce GTX 470 | 133.9 GB/s | 448 | 11 | 4.2 | 2.0 |
GeForce GTX 480 | 177.4 GB/s | 480 | 11 | 4.2 | 2.0 |
GeForce GT 510 OEM | 14.4 GB/s | 48 | 11 | 4.1 | 2.0 |
GeForce GT 520 | 14.4 GB/s | 48 | 11 | 4.2 | 2.0 |
GeForce GT 530 OEM | Up to 28.7 GB/s | 96 | 11 | 4.2 | 2.0 |
GeForce GT 545 DDR3 | Up to 43.2 GB/s | 144 | 11 | 4.1 | 2.0 |
GeForce GT 545 GDDR5 OEM | 64 GB/s | 144 | 11 | 4.2 | 2.0 |
GeForce GTX 550 Ti | 98.4 GB/s | 192 | 11 | 4.2 | 2.0 |
GeForce GTX 555 OEM | 91.9 GB/s | 288 | 11 | 4.2 | 2.0 |
GeForce GTX 560 | 128 to 140.8 GB/s | 336 | 11 | 4.1 | 2.0 |
GeForce GTX 560 OEM | 128.2 GB/s | 384 | 11 | 4.2 | 2.0 |
GeForce GTX 560 SE | 92 GB/s | 288 | 11 | 4.1 | 2.0 |
GeForce GTX 560 Ti | 128.3 GB/s | 384 | 11 | 4.1 | 2.0 |
GeForce GTX 560 Ti/448 | 152 GB/s | 448 | 11 | 4.1 | 2.0 |
GeForce GTX 560 Ti OEM | 152 GB/s | 352 | 11 | 4.1 | 2.0 |
GeForce GTX 570 | 152 GB/s | 480 | 11 | 4.2 | 2.0 |
GeForce GTX 580 | 192.4 GB/s | 512 | 11 | 4.2 | 2.0 |
GeForce GTX 590‡ | 163.9 GB/s x2 | 512 x2 | 11 | 4.2 | 2.0 |
GeForce GT 605 OEM | 14.4 GB/s | 48 | 11 | 4.2 | 2.0 |
GeForce GT 610 | 14.4 GB/s | 48 | 11 | 4.2 | 2.0 |
GeForce GT 620 | 14.4 GB/s | 96 | 11 | 4.2 | 2.0 |
GeForce GT 620 OEM | 14.4 GB/s | 48 | 11 | 4.2 | 2.0 |
GeForce GT 630 D3 | 25.6 to 28.8 GB/s | 96 | 11 | 4.2 | 2.0 |
GeForce GT 630 G5 | 51.2 GB/s | 96 | 11 | 4.2 | 2.0 |
GeForce GT 630 2 GiB | 14.4 GB/s | 384 | 11.1 | 4.2 | 2.0 |
GeForce GT 630 OEM | 28.5 GB/s | 192 | 11 | 4.2 | 3.0 |
GeForce GT 635 | 16 GB/s | 384 | 11 | 4.3 | 3.0 |
GeForce GT 640 2 GiB DDR3 | 28.5 GB/s | 384 | 11 | 4.3 | 3.0 |
GeForce GT 640 1 GiB GDDR5 | 40 GB/s | 384 | 11.1 | 4.3 | 3.0 |
GeForce GT 640 OEM DDR3 | 28.5 GB/s | 384 | 11 | 4.2 | 3.0 |
GeForce GT 640 OEM DDR3 | 43 GB/s | 144 | 11 | 4.2 | 2.0 |
GeForce GT 640 OEM GDDR5 | 80 GB/s | 384 | 11 | 4.2 | 3.0 |
GeForce GT 645 OEM | 64 GB/s | 576 | 11 | 4.3 | 3.0 |
GeForce GTX 650 | 80 GB/s | 384 | 11 | 4.3 | 3.0 |
GeForce GTX 650 Ti | 86.4 GB/s | 768 | 11 | 4.3 | 3.0 |
GeForce GTX 650 Ti Boost | 144.2 GB/s | 768 | 11 | 4.3 | 3.0 |
GeForce GTX 660 | 144.2 GB/s | 960 | 11 | 4.3 | 3.0 |
GeForce GTX 660 Ti | 144.2 GB/s | 1,344 | 11 | 4.3 | 3.0 |
GeForce GTX 670 | 192.2 GB/s | 1,344 | 11 | 4.2 | 3.0 |
GeForce GTX 680 | 192.2 GB/s | 1,536 | 11 | 4.2 | 3.0 |
GeForce GTX 690‡ | 192.2 GB/s x2 | 1,536 x2 | 11 | 4.2 | 3.0 |
GeForce GT 720 (DDR3) | 14.4 GB/s | 192 | 11 | 4.4 | 2.0 |
GeForce GT 720 (GDDR5) | 40 GB/s | 192 | 11 | 4.4 | 2.0 |
GeForce GT 730 (DDR3, 64-bit) | 14.4 GB/s | 384 | 11 | 4.4 | 2.0 |
GeForce GT 730 (DDR3, 128-bit) | 28.8 GB/s | 96 | 11 | 4.4 | 2.0 |
GeForce GT 730 (GDDR5) | 40 GB/s | 384 | 11 | 4.4 | 2.0 |
GeForce GT 740 (DDR3) | 28.8 GB/s | 384 | 11 | 4.4 | 3.0 |
GeForce GT 740 (GDDR5) | 80 GB/s | 384 | 11 | 4.4 | 3.0 |
GeForce GTX 750 | 80 GB/s | 512 | 11.2 | 4.4 | 3.0 |
GeForce GTX 750 Ti | 86.4 GB/s | 640 | 11.2 | 4.4 | 3.0 |
GeForce GTX 760 | 192.2 GB/s | 1,152 | 11 | 4.3 | 3.0 |
GeForce GTX 770 | 224.3 GB/s | 1,536 | 11 | 4.3 | 3.0 |
GeForce GTX 780 | 288.4 GB/s | 2,304 | 11 | 4.3 | 3.0 |
GeForce GTX 780 Ti | 336 GB/s | 2,880 | 11 | 4.3 | 3.0 |
GeForce GTX TITAN | 288.4 GB/s | 2,688 | 11 | 4.3 | 3.0 |
GeForce GTX TITAN Black | 336 GB/s | 2,880 | 11.2 | 4.4 | 3.0 |
GeForce GTX TITAN Z‡ | 336 GB/s | 2,880 | 11 | 4.4 | 3.0 |
GeForce GTX 960 | 112 GB/s | 1,024 | 12 | 4.4 | 3.0 |
GeForce GTX 970 | 224 GB/s | 1,664 | 12 | 4.4 | 3.0 |
GeForce GTX 980 | 224 GB/s | 2,048 | 12 | 4.4 | 3.0 |
GeForce GTX 980 Ti | 336 GB/s | 2,816 | 12.1 | 4.5 | 3.0 |
GeForce GTX TITAN X | 336 GB/s | 3,072 | 12.1 | 4.5 | 3.0 |
† Some GPUs use two clock signals: a higher one for the shader processors and a lower one for the rest of the chip.
‡ This video card has two GPUs working in parallel (SLI). The specs published are for just one of the chips.
* The manufacturer can set up a different memory clock rate or interface, so pay attention: not all video cards based on this chip share this spec. The memory transfer rate will depend on the interface and clock rate used.
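As the footnote says, the memory transfer rate follows directly from the memory clock and the interface width. A minimal sketch of that relation (the function name is ours), using the effective clocks listed in the table:

```python
def bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective clock times bytes per transfer."""
    return effective_clock_mhz * (bus_width_bits / 8) / 1000

# Cross-checking two rows of the table:
print(bandwidth_gb_s(400, 128))   # GeForce4 MX 440: 6.4 GB/s
print(bandwidth_gb_s(7000, 256))  # GeForce GTX 980: 224.0 GB/s
```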
(TC) means TurboCache, a technology that allows the video card to simulate more video memory by using part of the main system RAM as video memory. Read our tutorial on this subject for a better understanding of this feature.
As for the DirectX version, check the table below:
DirectX | Shader Model |
7.0 | No |
8.1 | 1.4 |
9.0 | 2.0 |
9.0c | 3.0 |
10 | 4.0 |
10.1 | 4.1 |
11 | 5.0 |
11.1 | 5.1 |
11.2 | 5.2 |
12 | 6.0 |
12.1 | 6.1 |
For a detailed discussion on the subject, read our DirectX tutorial.
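For quick programmatic reference, the DirectX-to-Shader Model mapping above can also be expressed as a lookup table (the dictionary name is ours, purely illustrative):

```python
# Shader Model supported by each DirectX version, per the table above;
# DirectX 7.0 predates programmable shaders, hence None.
SHADER_MODEL = {
    "7.0": None, "8.1": "1.4", "9.0": "2.0", "9.0c": "3.0",
    "10": "4.0", "10.1": "4.1", "11": "5.0", "11.1": "5.1",
    "11.2": "5.2", "12": "6.0", "12.1": "6.1",
}

print(SHADER_MODEL["9.0c"])  # -> 3.0
```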
Temperature and Power Specs
In the table below, we compare the temperature and power specs of NVIDIA chips. Chips are not listed when NVIDIA does not publish their maximum temperature, maximum power, and minimum PSU power specs.
It is important to note that these are all “worst-case scenario” values. Under normal operation, the graphics chip will stay below its maximum temperature and consume less power than the figure published in the table. The same holds true for the minimum required power supply: NVIDIA publishes a “safe” value, taking into consideration that power supplies are usually labeled with their maximum peak power, not their maximum continuous power.
GPU | Max. Temp. | Max. TDP | Minimum power supply |
GeForce 9400 GT | 105° C | 50 W | 300 W |
GeForce 9500 GT | 105° C | 50 W | 350 W |
GeForce 9600 GSO | 105° C | 105 W | 400 W |
GeForce 9600 GSO (512 MB) | 105° C | 90 W | 400 W |
GeForce 9600 GT (600 MHz) | 105° C | 59 W | 300 W |
GeForce 9600 GT (650 MHz) | 105° C | 96 W | 400 W |
GeForce 9800 GT | 105° C | 105 W | 400 W |
GeForce 9800 GTX | 105° C | 140 W | 450 W |
GeForce 9800 GTX+ | 105° C | 141 W | 450 W |
GeForce 9800 GX2 | 105° C | 197 W | 580 W |
GeForce G 100 | 105° C | 35 W | 300 W |
GeForce GT 120 | 105° C | 50 W | 350 W |
GeForce GT 130 | 105° C | 75 W | 350 W |
GeForce GTS 150 | 105° C | 141 W | 450 W |
GeForce 205 OEM | 105° C | 30.5 W | 300 W |
GeForce 210 | 105° C | 30.5 W | 300 W |
GeForce GT 220 | 105° C | 58 W | 350 W |
GeForce GT 240 | 105° C | 69 W | 300 W |
GeForce GTS 240 | 105° C | 120 W | 450 W |
GeForce GTS 250 | 105° C | 150 W | 450 W |
GeForce GTX 260 | 105° C | 182 W | 500 W |
GeForce GTX 275 | 105° C | 219 W | 550 W |
GeForce GTX 280 | 105° C | 236 W | 550 W |
GeForce GTX 285 | 105° C | 204 W | 550 W |
GeForce GTX 295 | 105° C | 289 W | 680 W |
GeForce 310 | 105° C | 30.5 W | 300 W |
GeForce 315 | 105° C | 33 W | N/A |
GeForce GT 320 | 105° C | 43 W | 300 W |
GeForce GT 330 | 105° C | 75 W | 300 W |
GeForce GT 340 | 105° C | 69 W | 300 W |
GeForce GT 420 | 105° C | 50 W | N/A |
GeForce GT 430 | 98° C | 49 W | 300 W |
GeForce GT 440 | 98° C | 65 W | 300 W |
GeForce GTS 450 | 100° C | 106 W | 400 W |
GeForce GTX 460 | 104° C | 160 W | 450 W |
GeForce GTX 465 | 105° C | 200 W | 550 W |
GeForce GTX 470 | 105° C | 215 W | 550 W |
GeForce GTX 480 | 105° C | 250 W | 600 W |
GeForce GT 510 | 102° C | 25 W | N/A |
GeForce GT 520 | 102° C | 29 W | 300 W |
GeForce GT 530 | 105° C | 50 W | N/A |
GeForce GT 545 (DDR3) | 100° C | 70 W | N/A |
GeForce GT 545 GDDR5 | 100° C | 105 W | N/A |
GeForce GTX 550 Ti | 100° C | 116 W | 400 W |
GeForce GTX 555 OEM | 99° C | 150 W | N/A |
GeForce GTX 560 OEM | 97° C | 150 W | 450 W |
GeForce GTX 560 | 99° C | 150 W | 450 W |
GeForce GTX 560 Ti OEM | 97° C | 210 W | N/A |
GeForce GTX 560 Ti | 100° C | 170 W | 500 W |
GeForce GTX 560 Ti/448 | 99° C | 170 W | 500 W |
GeForce GTX 570 | 97° C | 219 W | 550 W |
GeForce GTX 580 | 97° C | 244 W | 600 W |
GeForce GTX 590 | 97° C | 365 W | 700 W |
GeForce GT 605 | 102° C | 25 W | N/A |
GeForce GT 610 | 102° C | 29 W | 300 W |
GeForce GT 620 | 98° C | 49 W | 300 W |
GeForce GT 620 OEM | 102° C | 30 W | N/A |
GeForce GT 630 D3 | 98° C | 49 W | 300 W |
GeForce GT 630 G5 | 98° C | 65 W | 300 W |
GeForce GT 630 2 GiB | 90° C | 25 W | 300 W |
GeForce GT 630 OEM | 102° C | 50 W | N/A |
GeForce GT 640 2 GiB DDR3 | 98° C | 65 W | 350 W |
GeForce GT 640 1 GiB GDDR5 | 95° C | 49 W | 300 W |
GeForce GT 640 OEM (DDR3, 128-bit) | 102° C | 50 W | N/A |
GeForce GT 640 OEM (DDR3, 192-bit) | 102° C | 75 W | N/A |
GeForce GT 640 OEM (GDDR5) | 102° C | 75 W | N/A |
GeForce GT 645 | 102° C | 140 W | N/A |
GeForce GTX 650 | 98° C | 64 W | 400 W |
GeForce GTX 650 Ti | 105° C | 110 W | 400 W |
GeForce GTX 650 Ti Boost | 97° C | 134 W | 450 W |
GeForce GTX 660 | 97° C | 140 W | 450 W |
GeForce GTX 660 Ti | 97° C | 150 W | 450 W |
GeForce GTX 670 | 97° C | 170 W | 500 W |
GeForce GTX 680 | 98° C | 195 W | 550 W |
GeForce GTX 690 | 98° C | 300 W | 650 W |
GeForce GT 720 | 98° C | 19 W | 300 W |
GeForce GT 730 (DDR3, 64-bit) | 98° C | 23 W | 300 W |
GeForce GT 730 (DDR3, 128-bit) | 98° C | 49 W | 300 W |
GeForce GT 730 (GDDR5) | 98° C | 38 W | 300 W |
GeForce GT 740 | 98° C | 64 W | 400 W |
GeForce GTX 750 | 95° C | 55 W | 300 W |
GeForce GTX 750 Ti | 95° C | 60 W | 300 W |
GeForce GTX 760 | 97° C | 170 W | 500 W |
GeForce GTX 770 | 98° C | 230 W | 600 W |
GeForce GTX 780 | 95° C | 250 W | 600 W |
GeForce GTX 780 Ti | 95° C | 250 W | 600 W |
GeForce GTX TITAN | 95° C | 250 W | 600 W |
GeForce GTX TITAN Black | 95° C | 250 W | 600 W |
GeForce GTX TITAN Z | 95° C | 375 W | 700 W |
GeForce GTX 960 | 98° C | 120 W | 400 W |
GeForce GTX 970 | 98° C | 145 W | 500 W |
GeForce GTX 980 | 98° C | 165 W | 500 W |
GeForce GTX 980 Ti | 92° C | 250 W | 600 W |
GeForce GTX TITAN X | 91° C | 250 W | 600 W |