NVIDIA announced its third-generation GeForce RTX GPUs, Ada Lovelace (Ada for short), succeeding the 2nd-gen Ampere and 1st-gen Turing cards. The new generation packs up to 76 billion transistors and 18,432 CUDA cores. The new GPUs also feature improved, new-generation ray tracing cores as well as improved Tensor cores. They also come with DLSS 3, which NVIDIA claims is vastly improved over the previous version and can supposedly deliver up to a 4x increase in frame rates over native rendering. We'll have to see it in action before judging those claims.
As for the cards themselves, NVIDIA announced both the RTX 4090 and the RTX 4080. The flagship 4090 comes with 16,384 CUDA cores, a boost clock of 2.52 GHz, and a whopping 24GB of GDDR6X VRAM. NVIDIA says the 4090 is up to two times faster than the 3090 Ti, its previous flagship GPU, in games like Microsoft Flight Simulator, three times faster in the newly announced Portal with RTX, and four times faster in RacerX.
The RTX 4080 comes in two flavors, a 16GB model and a 12GB model, and the differences go well beyond memory. The RTX 4080 16GB is more of a 4080 Ti in some ways, with 9,728 CUDA cores to the RTX 4080 12GB's 7,680. The 12GB model's cores run at a slightly higher boost clock, 2.61 GHz versus the 16GB model's 2.51 GHz, but that still leaves it with roughly 2,000 fewer cores. We'll be curious to see the performance gap between the two GPUs once they're released.
The RTX 4090 will set you back $1,600 and will be available on October 12th. The RTX 4080 will cost $1,200 for the 16GB model and $900 for the 12GB model, with both launching in November. This is, of course, pricing for the Founders Edition cards; cards from third-party partners like MSI and ASUS will cost more. It's a sharp price increase compared to the previous generation, given that the RTX 3080 launched for $700 and the RTX 3090 for $1,500 (scalpers notwithstanding, of course). You can read more about the RTX 4090 and the RTX 4080 on NVIDIA's website.
Source: NVIDIA