Elysian Realm, aka @KittyYYuko on Twitter, has leaked what appears to be a unit of the GeForce RTX 4080 Founders Edition, the high-end model (although not the top one) of NVIDIA's next generation of graphics cards: Ada Lovelace.
The photo posted by @KittyYYuko shows an alleged RTX 4080 Founders Edition inside its protective plastic, which makes some of the product's details hard to verify, although the model name, RTX 4080, is perfectly legible. The type of heatsink it will use remains unknown, but there will apparently be a significant change compared to the RTX 3080 Founders Edition.
The RTX 4080 Founders Edition appears to have a dual-slot design, and its fan, with seven blades, would be larger than those fitted to the RTX 3080 and RTX 3090 Founders Editions, whose fans have nine blades. Fortunately, the Founders Edition line left its reputation for running very hot behind a few generations ago, thanks to NVIDIA's adoption of more capable cooling solutions.
NVIDIA GeForce RTX 4080 Founders Edition.
In terms of specifications, the RTX 4080 is expected to use a cut-down version of the AD103-300 graphics processor, with 9,728 CUDA cores and 48MB of L2 cache, and to be based on the PG136/139-SKU360 PCB. The full version of the processor, by contrast, has 10,240 CUDA cores and 64MB of L2 cache.
Other rumored features of the definitive version of the RTX 4080 are 16GB of GDDR6X memory running at 23Gbps over a 256-bit bus interface. Bandwidth is expected to be 736GB/s, slightly less than the 760GB/s of the RTX 3080, which uses a 320-bit bus interface and 10GB of VRAM. The TBP is now expected to be 340 watts.
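Those bandwidth figures follow directly from the per-pin data rate and bus width quoted above (memory bandwidth = data rate × bus width ÷ 8 bits per byte). A quick sketch to check the arithmetic, using the rumored RTX 4080 numbers and the RTX 3080's known 19Gbps GDDR6X:

```python
def bandwidth_gbs(rate_gbps: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin rate (Gbps) times bus width (bits), divided by 8."""
    return rate_gbps * bus_bits / 8

# Rumored RTX 4080: 23Gbps GDDR6X on a 256-bit bus
print(bandwidth_gbs(23, 256))  # 736.0 GB/s

# RTX 3080: 19Gbps GDDR6X on a 320-bit bus
print(bandwidth_gbs(19, 320))  # 760.0 GB/s
```

Both results match the article's figures, which is why the RTX 4080 would end up slightly behind the RTX 3080 in raw bandwidth despite its faster memory chips: the narrower bus outweighs the higher per-pin speed.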
Both the RTX 4080 and the RTX 4070 could be among the first Ada Lovelace graphics cards to hit the market, alongside or shortly after the RTX 4090, whose launch is reportedly planned for October 2022. We will see how NVIDIA's next generation is received given the current economic context, although on some fronts the challenge has long been to take full advantage of the hardware rather than to increase its raw power.