NVIDIA GeForce RTX 4090 GPU Estimated Power Consumption Is 450 Watts Alone

The NVIDIA GeForce RTX 4090 is NVIDIA's upcoming flagship gaming graphics card, expected to be built on the Ada Lovelace architecture with 24 GB of GDDR6X memory, which should make it capable of running games at 4K resolution with maxed-out settings.

According to the latest estimates, the GPU chip on the NVIDIA GeForce RTX 4090 could draw 450 watts on its own.


Those of us hoping to power the next generation of flagship graphics cards with our existing PSUs look to be in for a rough time. Igor Wallossek has published his estimates for the GeForce RTX 4090's power consumption, which suggest that the Lovelace-based GPU alone might require 450 watts. That matches the total board power of NVIDIA's newly released GeForce RTX 3090 Ti, which is itself 100 watts higher than the standard RTX 3090. Wallossek's calculations also indicate that the GeForce RTX 40 series will include variants with TGPs of up to 600 watts, as rumors have predicted.

(Image credit: Igor's Lab)

Where the NVIDIA GeForce “RTX 4090” gets its 600 watts from – a GPU and component calculation | EXCLUSIVE (Igor’s Lab)

The total board power (TGP) of the huge Ada card (whether it ends up being called the RTX 4090 or not) is 600 watts this time, which should be considered set. What we want to work out now is how much of that budget remains for the chip itself. This is where it gets interesting, because based on previous generations you can predict voltage-converter losses, the low-voltage rails, and other losses quite accurately. Since I'm not a psychic, I specifically asked around about the memory. According to my research, Micron is within the expected range given the higher clocks, at about 3.4 watts per module. The memory's share of the budget is almost an afterthought: instead of the GeForce RTX 3090's 60 watts, the total power consumption for 24 GB (12 modules) comes to about 40 to 41 watts.

In this basic design, the four memory voltage converters should reach an efficiency of roughly 60 to 70%. Including the MOSFETs, coils, and capacitors involved, I budget about 15 watts of losses for that whole circuit. That is not insignificant, but not exceptional either. I set roughly 10 watts for the low-voltage rails, i.e. the usual 1.2 volts, 1.8 volts, 5 volts and so on for the MCU, the shunts, the rail filtering, etc., and an average of 5 watts for the fans, despite rumors that they will not be counted directly against the limit.

If you extrapolate the power consumption and look at the traces, you can practically double the board losses, since more phases means more conductor traces. If the GPU voltage converters (NVVDD) are the same as those on the RTX 3090 Ti, they should reach an efficiency of around 90%, which translates into roughly 50 watts of losses. That is also about twice the heat the RTX 3090's converters generate, leaving approximately 450 watts for the GPU itself. By comparison, the AD102 would then draw just under twice as much power as its predecessor.
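The tally above can be sketched as simple arithmetic. This is only a back-of-the-envelope check using the figures quoted in the article; the variable names are illustrative, not from Igor's Lab.

```python
# Back-of-the-envelope tally of the rumored 600 W board power budget,
# using only the per-item figures quoted in the article above.
TOTAL_BOARD_POWER = 600                  # rumored TGP of the big Ada card (W)

memory = round(12 * 3.4)                 # 12 GDDR6X modules at ~3.4 W each -> 41 W
memory_vrm_losses = 15                   # MOSFETs, coils, caps on the memory rails
low_voltage_rails = 10                   # 1.2 V / 1.8 V / 5 V, MCU, shunts, filtering
fans = 5                                 # average fan draw
gpu_vrm_losses = 50                      # NVVDD at ~90% efficiency feeding a ~450 W GPU

left_for_gpu = (TOTAL_BOARD_POWER - memory - memory_vrm_losses
                - low_voltage_rails - fans - gpu_vrm_losses)
print(f"left for the GPU before trace losses: ~{left_for_gpu} W")  # ~479 W
```

The remaining gap of roughly 29 watts between this figure and the article's ~450-watt GPU estimate is on the order of the extra conductor-trace and board losses mentioned above.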



Frequently Asked Questions

How much kWh does a RTX 2070 use?

A: The RTX 2070 has a rated power of around 175 watts, so at sustained full load it consumes roughly 0.175 kWh per hour of gaming. Actual consumption depends on the workload and will usually be lower at the desktop.

How many watts does a 2080 Super use?

A: The RTX 2080 Super has a rated board power of 250 watts.
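For readers wondering how such wattage figures translate into energy use, the conversion is simply power times time divided by 1,000. The numbers below reuse figures discussed in this article; real-world draw varies with the workload.

```python
# Energy (kWh) = power (W) x hours / 1000, assuming sustained load.
def kwh(power_watts, hours):
    """Energy in kilowatt-hours for a card drawing power_watts for hours."""
    return power_watts * hours / 1000.0

print(kwh(250, 1))   # an RTX 2080 Super at full load for one hour -> 0.25 kWh
print(kwh(450, 4))   # a 450 W GPU over a four-hour gaming session -> 1.8 kWh
```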
