The Future According to NVIDIA
By Gabriel Torres on May 25, 2008 - 5:05 PM


Last week NVIDIA held its Spring 2008 Editor's Day, where it presented its forthcoming series of graphics processing units, to be released next month. While we can't talk about this new chip series yet due to a Non-Disclosure Agreement (NDA), we can talk about some of the ideas NVIDIA sees as "the future of computing": basically, more GPGPU usage (i.e., using the graphics chip to run regular programs) and the co-existence of "competing" technologies such as ray tracing and rasterization.

Figure 1: The future of computing, according to NVIDIA.

Throughout the Editor's Day, NVIDIA repeated ad nauseam how marvelous GPGPU is, showing several examples of applications whose performance increased enormously through the use of this technique. For those unfamiliar with the concept, the idea is to have the video card's GPU process regular programs instead of the CPU. What makes this possible is NVIDIA's CUDA compiler, which can compile any program written in C to run on any NVIDIA GPU from the 8 series on. We have already written an article explaining this technology in more detail.
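To give a concrete flavor of what CUDA-style code looks like, here is a minimal, hypothetical sketch (our own illustration, not code shown at the event): a kernel that scales an array, where the GPU runs one lightweight thread per array element instead of the CPU looping over the data. All names here are ours.

```cuda
#include <cuda_runtime.h>

// Hypothetical example: each GPU thread scales one element of the array.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
    if (i < n)
        data[i] *= factor;
}

// Host side: copy the data to the GPU, launch thousands of threads, copy back.
void scale_on_gpu(float *host_data, float factor, int n)
{
    float *dev_data;
    size_t bytes = n * sizeof(float);
    cudaMalloc(&dev_data, bytes);
    cudaMemcpy(dev_data, host_data, bytes, cudaMemcpyHostToDevice);
    scale<<<(n + 255) / 256, 256>>>(dev_data, factor, n);  // 256 threads/block
    cudaMemcpy(host_data, dev_data, bytes, cudaMemcpyDeviceToHost);
    cudaFree(dev_data);
}
```

The speedups NVIDIA showed come from exactly this pattern: work that a CPU performs one element at a time is spread across hundreds of GPU threads running in parallel.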

Is NVIDIA saying that in the future GPUs will replace CPUs? Not exactly. The computer will still need a CPU, but the way NVIDIA sees it, the role of the CPU will decrease dramatically in the future. In fact, this is already happening. Through its "nTeresting" newsletters, NVIDIA has been hammering Intel over the past month, claiming that, contrary to what Intel wants you to believe, CPUs no longer play an important role in gaming performance, and that the savvy user should buy a cheaper CPU and spend the savings on a better video card for better gaming performance. With GPU performance increasing and more and more tasks that were previously performed on the CPU being transferred to the GPU, this idea makes sense.

With GPGPU, this idea will also hold for regular applications as soon as mainstream products start using the GPU for processing, boosting application performance. During the event, Adobe declared that it will start using GPGPU in its forthcoming products, in particular the next version of Photoshop, to be released around September, which could be the first step in that direction. Even though the presentations made it clear that GPGPU can really boost performance in specific applications, the future is always uncertain, and the performance increase GPGPU brings to the average user will depend entirely on software developers updating their programs to support it.

It is important to remember that programs compiled to use NVIDIA GPUs for processing won't run on ATI GPUs. Of course, final products can detect which video card is installed in your system and load the CUDA-compiled code – which provides the performance increase – only if you have an NVIDIA video card.
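That detection step can be done through the CUDA runtime itself. The sketch below is a hypothetical illustration of how an application might choose a code path at launch; `process_with_cuda` and `process_on_cpu` are placeholder names of ours, but `cudaGetDeviceCount` is the real CUDA runtime call for counting CUDA-capable devices.

```cuda
#include <cuda_runtime.h>

// Hypothetical sketch: pick the GPU path only when an NVIDIA GPU is present.
void process(float *data, int n)
{
    int devices = 0;
    cudaError_t err = cudaGetDeviceCount(&devices);
    if (err == cudaSuccess && devices > 0)
        process_with_cuda(data, n);  // CUDA-capable NVIDIA GPU found
    else
        process_on_cpu(data, n);     // otherwise fall back to plain CPU code
}
```

On a system with an ATI card (or no graphics card at all), the call simply reports zero usable devices and the application falls back to its regular CPU code, so the CUDA path is purely an optimization.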

As for the rasterization vs. ray tracing battle, NVIDIA sees both technologies co-existing in the future, since ray tracing is in fact a better technique for some applications but worse for others. As an example, they cited the film industry, which uses both technologies in movies depending on what needs to be rendered. Nowadays ray tracing isn't used by games. So should we expect future GPUs and games to support ray tracing? Maybe, but this probably won't happen before 2010.

Originally at http://www.hardwaresecrets.com/blog/The-Future-According-to-NVIDIA/94

