
The effort by Intel and AMD to invest in lower-power products has been weighing on my mind for some time as I look at my own PC and the others in my house, and consider the recent trouble my family has had with our local municipality over electricity prices. Simply put, unless you’re earning a decent five-figure salary or more annually and genuinely need a high-powered rig, it doesn’t make sense to run what amounts to a tiny heater in your lounge, living room or bedroom.

It probably consumes more power than you really need at any given time, wasting your money in the process. It’s all about saving power these days and I’ve been looking at ways to tone down my desktop’s usage. If you follow me over the jump, perhaps I can convince you to do the same.

Firstly, consider that you probably have no idea how much energy your PC actually uses over the course of an hour or a day. Neither do I; at best, my figures are educated guesses. I’m considering buying a digital meter to measure power draw and energy consumption over time, to determine whether any of my plans will actually make financial sense.

It doesn’t help, for example, to replace my monitors, graphics card, motherboard and CPU if all I’m going to save is a measly 30W in the end. Sure, I might see much more performance from the same amount of energy use, but I’m not a framerate junkie, nor do I need the power of a Bitcoin mining rig.

In my situation, any saving over 80W will be substantial enough to be worth considering. Go over 150W and the hardware practically pays for itself over time.
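To put rough numbers to those thresholds, here’s the basic arithmetic in Python. The tariff and daily hours are assumptions for illustration, not my actual figures:

```python
# Rough Rand value of a wattage saving over a year.
# The tariff and usage hours below are assumptions, not measured figures.
TARIFF_R_PER_KWH = 1.50   # assumed municipal rate, Rand per kWh
HOURS_PER_DAY = 8         # assumed time the PC spends switched on daily

def annual_saving_rand(watts_saved):
    """Rand saved per year by drawing `watts_saved` fewer watts."""
    kwh_per_year = watts_saved / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * TARIFF_R_PER_KWH

for watts in (30, 80, 150):
    print(f"{watts}W saved -> about R{annual_saving_rand(watts):.0f} per year")
```

With those guessed inputs, 30W back is only about R131 a year, while 150W is closer to R657. Plug in your own tariff and hours to see where the line sits for you.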

So when I finally get my meter, what am I going to monitor and what will I need to replace? Let’s look at some of the biggest power-suckers in my rig.

My dual-monitor setup

Pictured: Not my actual gaming setup.

I use two monitors for my workflow. I don’t think I could ever go back to a single one, unless it’s an Ultra HD 4K monitor, and I don’t think I could quite handle three for the moment, as I don’t need that much space. But monitors, especially the older CRT variants, use a lot of power, and I know mine aren’t power-sippers either.

My monitors use TN panels and are both lit by CCFL tubes. Not only are they not as bright or vivid as I would like, they’re both power-suckers. My main 22″ 1080p BenQ monitor puts out a lot of heat and is a very wasteful design – then again, I bought it before electricity prices had climbed by almost 135%, so I didn’t care at that point. My Acer 18.5″ monitor is guilty of the same thing, and both are matte displays as well.

Typically, displays with a matte finish need a higher brightness setting to be viewed properly in lighter environments, like the front lounge I’m currently sitting in, which has a lot of light coming in through the front windows – a glossy panel may have reflections, but it will use less energy to achieve the same brightness level.

On the monitor side, though, I’ll only upgrade if I find a slimmer, LED-backlit, semi-glossy 1080p IPS panel with variable VBLANK support for under R2500. Anything less than that won’t cut it if I intend to use it for the next five years.

My graphics card – one power-hungry sucka

A little down the road after I bought my first GPU, a Radeon HD5750 1GB, it failed, and after much wailing and gnashing of teeth from tech support at Club 3D, I was sent a Radeon HD6870. I wasn’t complaining at that point because I’d received a GPU twice as powerful as my old one, but with it came considerably higher power consumption and heat levels. It’s manageable now, but I know that for the majority of the time I’m working, I don’t even tax it.

Then the launches of the Geforce GTX650 Ti and the Radeon HD7790 caught my attention, thanks to three graphs in a Tom’s Hardware review – power consumption, temperatures and average game performance. That was the first clear indicator of where I should be moving in the future.

Pictured: power consumption, temperature and average performance graphs.

The HD7790 had the same amount of memory as my GPU and had to make do with fewer functional units and a smaller memory bus – but this wasn’t to its detriment. I was shocked to see a card that cost half what mine did come out a little better in every metric that mattered to me. It had lower frame latency, consumed less power, produced much less heat and didn’t lose anything in terms of performance.

In the same vein, if you’re still running a GTS250, a GTX480, a Radeon HD5870 or any of the big-hitting GPUs of the last decade, you need to consider how much of the energy they draw is simply wasted. In my case, a single GPU replacement can lower my consumption by over 80W, passing my pre-determined goal. Are you still using a Geforce 8800GT? Ditch it now for a GTX650, which is more than twice as fast and consumes less than half the power.

My motherboard and processor

ASUS M5A97 LE R2.0

Aside from the SATA ports, I use none of the slots below the main PCI-E 16x one where my GPU sits. I don’t need most of the hardware on my motherboard!

Along with the fact that my GPU is idle for most of the day, I never tax my Athlon X3 triple-core during work hours enough to justify its purchase. It is nice to have three cores to spread the load, and it helps a lot in games that are lightly multi-threaded, but it is a 95W processor that is outperformed by an Intel Core i3 using half as much power. I will probably never need more than four threads anyway, so I have quite a bit of leeway in purchase choices under R2000.

My other consideration is replacing it with an AMD APU, perhaps from the socket FM2+ family. With HSA on the way and AMD’s plan to allow discrete GPUs to work in tandem with the APU for more performance overall, I might swing that way. The APU could be the better choice even if I don’t opt for discrete graphics – I really don’t need that much GPU horsepower. I’m perfectly happy playing a game at 720p with medium settings – heck, I’ve done it for seven years on a PS3.

Pretty graphics are secondary to good gameplay for me. Plus, there’s that whole Mantle thing, although I’m bummed that I’d lose out on the completely free Geforce Shadowplay that Nvidia is bundling with its drivers now.

I need speed like I need a speedy Llama

Samsung 840 Evo SSD

Beyond the main concerns, there aren’t many places where I can lower my power usage. I can use components that create less heat, enabling me to buy a smaller chassis with better cooling that uses slower fans, but those are minimal gains and I’m deaf in one ear anyway – fan noise is not going to bug me. I can choose a power supply with a lower rating but higher efficiency, but that can only happen once I’ve figured out how much power I really need. The last place, and probably the best component to improve, is an SSD.

Not only are solid state drives a lot more frugal in terms of power use, they also have no spinning parts and require no manual maintenance. In addition, SSDs considerably reduce the time you spend waiting for your computer to do something. You spend less time booting, less time waiting for programs and games to load, for your recycle bin to empty, for your browser cache to be accessed, for Photoshop to finish saving your projects, or for your scanning application to finish writing multi-megabyte pictures to your desktop.

Although I don’t currently own one (Samsung’s 840 Evo looks inviting), I will have an SSD as my system drive in the future. Not having to wait for things to complete lets me keep working, and lets my system drop to idle speeds much sooner than it would with a regular hard drive. Those power savings may seem small at first, but over the course of a month, even a year, they build up into money back in my pocket.

What’s the damage for all this?

Intel Core i3-4130 @ R1469

ASRock B85M @ R1045

MSI Radeon R7-260X OC 2GB GDDR5 @ R1881

Samsung 840 Evo 250GB SSD @ R2087

Total: R6482

Sure, the upfront cost is high, and there’s not a ton of incentive to upgrade if the machine is going to be idle most of the time. But there’s around 50W in power reductions to be had at idle with this upgrade, and almost certainly more than 150W less while gaming, which is the only time the machine is properly taxed (that, and serving media to my PS3). It will probably take about three years for the upgrades to pay for themselves in power savings, but then again, my machine as it stands is almost four years old.
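The payback sum itself is simple enough to sketch. Every input here is a guess until I’ve actually measured with the meter, which is exactly why I want it first:

```python
# Payback period for an upgrade, given guessed savings and a usage split.
# All inputs are assumptions until they've been measured with a meter.
def payback_years(cost_rand, idle_saving_w, idle_hours_per_day,
                  load_saving_w, load_hours_per_day, tariff_r_per_kwh):
    """Years until the electricity saved covers the upgrade cost."""
    daily_kwh_saved = (idle_saving_w * idle_hours_per_day
                       + load_saving_w * load_hours_per_day) / 1000
    annual_saving_rand = daily_kwh_saved * 365 * tariff_r_per_kwh
    return cost_rand / annual_saving_rand
```

Feed it the R6482 total, your own tariff, and how many hours the machine actually spends idling versus gaming each day. The answer moves linearly with every input, and with tariffs climbing the way they are, the real payback period is a moving target.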

I can easily wait that long. 

There is also the case that many people, particularly in a business setting, can move in the other, more extreme direction: downgrade to a low-power Intel hyper-threaded or quad-core processor, 8GB of RAM and an SSD. This makes a lot of sense for machines and workloads that don’t need much horsepower, and it greatly reduces complexity as well, especially if you can stick it into a VESA-mounted chassis behind your monitor, which is essentially the market for Intel’s NUC.

So to you I ask…

Will you be considering the same thing that I am in future? Are Eskom’s electricity increases giving you enough grief to consider moving to lower-power components to save on your utility bill every month? And will you be more mindful of energy use in the future, particularly if you’re reading this article and have had these thoughts for a while already?

Because I now pay for electricity and I can appreciate where my money is going each month, I think the efficiency bug has finally bitten me. Let me know what you think in the comments below and in the forums.

Discuss this in the forums: Linky