PC power consumption is on the rise, and the latest generation of processors and graphics cards is the main culprit. Each new hardware cycle delivers serious performance gains, but flagship CPUs and GPUs now draw noticeably more power than their predecessors did only a few years ago. That trend carries real costs for you as a user: bigger (and pricier) power supplies, higher electricity bills, and more heat to manage. Here’s what’s happening and what you can do about it.
The Problem
Big 2022 hardware launches by NVIDIA, AMD, and Intel are absurdly powerful. However, that speed comes at a cost—and it’s the continuation of a problem that’s been slowly developing over the last few years.
The NVIDIA GeForce RTX 4090 graphics card runs circles around its predecessor, the RTX 3090. But it gulps down 450W from your power supply in order to function. That’s 100W more than the RTX 3090 and the same as the RTX 3090 Ti. A prospective RTX 4090 Ti, if it ever sees the light of day, will likely take around 550-600W alone, if not more — the AD102 die used by the GPU has a power limit of 800W.
What about CPUs? Well, AMD’s Ryzen 9 7950X CPU, with 16 cores and 32 threads, has a TDP of 170W, and you should leave room for power peaks of up to 230W, since that’s the power limit of the AM5 socket. That’s a dramatic increase from AM4 chips, where the high-end Ryzen 9 5950X had a TDP of just 105W. Intel’s Core i9-13900K has a TDP of 125W, but Intel CPUs are known for aggressive power spikes: its predecessor, the Core i9-12900K, is known to go up to 250W.
Across the board, everything is gulping down more power, despite the fact that new products are built on more efficient processes. Both the new Ryzen chips and the new RTX graphics cards use TSMC’s 5nm process. Intel is on a 10nm-class process (now branded Intel 7) and was still shipping 14nm chips as recently as 2021, but that’s another story.
Why Is Power Efficiency Important?
The fact that everything is using more power matters for a few reasons, because the consequences of that extra power consumption land directly on you, the user.
For one, you need to buy a more powerful (and more expensive) power supply for your PC’s components. Looking at gaming PC build guides from 2016, you’ll find that a 650W power supply was recommended for a PC equipped with an Intel Core i7-6700K and a GeForce GTX 1080.
Meanwhile, for a high-end gaming PC in 2022, 650W just isn’t enough. With the RTX 4090 graphics card and the AMD Ryzen 9 7950X CPU, you’re looking at 620W between just two components, leaving no headroom for power spikes, or for any other components at all. You need at least an 850W power supply for such a PC, and going for a 1000W unit wouldn’t be entirely unreasonable. You may need even more if you’re planning to overclock.
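To make that arithmetic concrete, here’s a minimal back-of-the-envelope PSU sizing sketch in Python. The 80W allowance for everything else and the 1.5x headroom multiplier are illustrative assumptions, not an official formula; real builds should also account for transient spikes and the PSU’s efficiency curve.

```python
# Rough PSU sizing sketch. The wattages and the headroom multiplier
# below are illustrative assumptions, not official specifications.

# Rated power of the two biggest consumers (watts).
components = {
    "NVIDIA GeForce RTX 4090": 450,  # total board power
    "AMD Ryzen 9 7950X": 170,        # TDP (AM5 allows peaks up to 230W)
}

# Rough allowance for motherboard, RAM, drives, and fans (assumed).
other_parts_w = 80

# Common rule of thumb: ~50% headroom for transient spikes and to
# keep the PSU near the middle of its efficiency curve.
HEADROOM = 1.5

total_w = sum(components.values()) + other_parts_w
recommended_w = total_w * HEADROOM

print(f"Component total: {total_w}W")             # 700W
print(f"Recommended PSU: ~{recommended_w:.0f}W")  # ~1050W
```

Run against these assumed numbers, the rule of thumb lands at roughly a 1000W-class unit, which lines up with the recommendation above.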
While a 1000W power supply would have been considered overkill years ago, it’s now a reasonable option for some PCs. That should speak volumes on its own.
We also need to talk about the issues that come with actually pulling that much power out of your wall. Gaming on a high-end PC draws more power than ever, and that has a twofold effect. You’ll have a higher electricity bill, especially if you tend to run heavy, hours-long gaming sessions. There are also the obvious environmental costs of the increased electricity usage.
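As a rough illustration of the bill side, here’s a quick sketch of what a power-hungry rig can add to your electricity costs. The 600W whole-system draw, the four-hour sessions, and the $0.15/kWh rate are all assumed placeholder numbers; substitute your own hardware’s draw and your local rate.

```python
# Back-of-the-envelope electricity cost for gaming sessions.
# Every input below is an assumed placeholder value.

system_draw_w = 600    # whole-system draw under gaming load (assumed)
hours_per_day = 4      # daily gaming time (assumed)
price_per_kwh = 0.15   # electricity rate in $/kWh (varies by region)

kwh_per_day = system_draw_w / 1000 * hours_per_day  # 2.4 kWh
cost_per_month = kwh_per_day * price_per_kwh * 30   # $10.80

print(f"{kwh_per_day:.1f} kWh/day -> ${cost_per_month:.2f}/month")
```

Small daily numbers add up: at these assumed rates, that’s around $130 a year for the gaming hours alone, before counting idle and desktop use.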
What Can Be Done to Change This?
It’s not like chipmakers don’t know this. The whole point of a die shrink is to fit more transistors on a chip while using less power. But at the same time, the push to make chips faster and faster keeps driving power consumption up. Basically, performance gains are outpacing the power efficiency gains we’re getting. And while innovation is good, more needs to be done to improve performance per watt without necessarily harming that innovation.
One thing you can do to help is stay aware of how much power your PC is consuming. If you’re shopping for a power supply, you can buy one with an 80+ Platinum or 80+ Titanium certification. These are the most efficient tiers, wasting less of the power they draw as heat, which also helps cut down your PC’s idle draw. If you’re willing to get more technical and don’t mind sacrificing some performance, you can also underclock or undervolt several parts of your PC.
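If you want to keep an eye on consumption yourself, here’s a minimal polling sketch that reads the GPU’s reported board power using NVIDIA’s nvidia-smi tool. It assumes an NVIDIA card with the driver’s nvidia-smi utility on your PATH; AMD and Intel GPUs, as well as CPU package power, need different tools.

```python
# Minimal GPU power-draw poller. Assumes an NVIDIA GPU and that the
# driver's nvidia-smi utility is on PATH; other vendors need other tools.
import subprocess
import time

def gpu_power_watts() -> float:
    """Return the current board power draw of the first GPU, in watts."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])

if __name__ == "__main__":
    for _ in range(5):  # take five one-second samples
        print(f"GPU draw: {gpu_power_watts():.1f} W")
        time.sleep(1)
```

Watching these numbers before and after an undervolt is a simple way to check whether your tuning is actually saving power.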
For now, the power consumption problem will keep getting worse, but the PC space is changing rapidly, so that might not be the case forever.