Nvidia’s next-gen GPUs will likely require a new power supply
Another day, another specs rumor for Nvidia’s next-gen GPUs. This time the rumor mill is buzzing about the potential power limits of Nvidia’s Lovelace GPUs and whether or not you need to upgrade your power supply. Let’s just say you might need to consider a new PSU for your next build.
According to Moore’s Law Is Dead and Wccftech, Nvidia’s next GPUs will likely hit 600 watts. For comparison, the RTX 3090 peaks at 350W, and the RTX 3090 Ti bumps that up to 450W.
We’ve seen rumors of sky-high power figures before. It was earlier reported that the RTX 4090 and 4090 Ti would require a massive 1,200W power supply. Those outrageous numbers were later walked back, as the leakers involved were unable to confirm the exact TDP.
It looks like Nvidia is trying to push the boundaries of power consumption, which might explain why such big rumors are circulating. The main challenge seems to be ensuring the graphics cards can be properly air-cooled, both in Nvidia’s reference design and in those of its add-in board partners. ExtremeTech notes that Nvidia’s power targets line up with the high-power 12-volt PCIe Gen 5 connector, which supports up to 600W.
Alongside the power consumption rumors, Lovelace GPUs may also feature super-fast GDDR7 memory. The current GDDR6X used in the RTX 3080, 3080 Ti, and 3090 cards maxes out at 19 Gbps. Even on the existing 256-bit and 384-bit memory interfaces, faster memory would deliver a noticeable bandwidth boost.
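To see why faster memory helps even on the same bus, note that peak memory bandwidth is just the per-pin data rate times the bus width. A quick back-of-the-envelope sketch (using the article’s 19 Gbps figure for GDDR6X; GDDR7 speeds are unconfirmed, so no numbers are assumed for it):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps) times bus width, converted to bytes."""
    return data_rate_gbps * bus_width_bits / 8

# GDDR6X at 19 Gbps on the existing interfaces:
print(memory_bandwidth_gbs(19, 256))  # 608.0 GB/s on a 256-bit bus
print(memory_bandwidth_gbs(19, 384))  # 912.0 GB/s on a 384-bit bus
```

Any bump in the per-pin rate scales bandwidth proportionally, which is why a GDDR7 upgrade would pay off without widening the memory interface.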
These two rumors come on top of the massive performance gains expected from the next-gen cards. Nvidia’s flagship GPUs could have up to 75% more CUDA cores than the RTX 3090, along with a huge L2 cache, which would significantly reduce the time and energy spent fetching data from main memory.
All of this news comes just as GPU prices are finally dropping, by up to 25% for some graphics cards. Intel’s entry into the graphics card market with its Arc Alchemist line of GPUs should also help alleviate shortage issues.