Energy Draw and Cost of Electric Bill


#1 Posted by SparrowPrince (9 posts) -
Hi, so I was wondering how much money a more energy-efficient Intel processor would save you compared to a less efficient AMD processor. Would opting for an Intel drawing 250 watts over an AMD drawing 350 watts save me much? Thanks
#2 Posted by ferret-gamer (17380 posts) -

Depends on how much your electricity costs and how often you run it at loads where a difference that big actually shows up. It's probably going to be a few more bucks a month.

#3 Posted by spittis (1875 posts) -
That should be really easy to calculate; just check what you pay per kilowatt-hour on your bill.
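For example, here is a minimal sketch of that calculation (the 250 W / 350 W figures are the hypothetical ones from the opening post; the $0.12 per kWh rate and 4 hours a day at full load are assumptions for illustration):

```python
# Rough cost comparison for the two hypothetical CPUs from the opening post.
# The electricity rate and usage hours are assumptions, not figures from the thread.

PRICE_PER_KWH = 0.12   # assumed electricity rate in $/kWh
HOURS_PER_DAY = 4      # assumed hours per day at full load

def monthly_cost(watts, hours_per_day=HOURS_PER_DAY, price=PRICE_PER_KWH):
    """Electricity cost over a 30-day month for a device drawing `watts` while in use."""
    kwh_per_month = watts / 1000 * hours_per_day * 30
    return kwh_per_month * price

intel = monthly_cost(250)   # hypothetical 250 W processor
amd = monthly_cost(350)     # hypothetical 350 W processor
print(f"Intel: ${intel:.2f}/month, AMD: ${amd:.2f}/month, gap: ${amd - intel:.2f}/month")
# With these assumptions the gap works out to about $1.44 a month.
```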
#4 Posted by General_X (9119 posts) -
I would guess that overall it would be a fairly negligible difference. If you'd really like to save money on electricity, how much and how often you run your AC and other inefficient appliances makes the biggest difference.
#5 Posted by GummiRaccoon (13650 posts) -

Running a 60 W bulb for 24 hours costs about 10 cents.
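As a quick check of that figure: 60 W for 24 hours is 60 / 1000 × 24 = 1.44 kWh, so "about 10 cents" corresponds to a rate of roughly $0.07 per kWh; at a rate closer to $0.12 per kWh it would be nearer 17 cents.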

#6 Posted by adamosmaki (9825 posts) -
I'd say it will be a negligible difference, considering that for most users, most of the time, CPU usage is only 5-15% (that's where it sits when browsing the internet, listening to music, watching videos, etc.). Both AMD and Intel have quite low idle power usage; Intel of course has the edge, but a 5-10 W gap won't make much difference. Of course, if you are planning to do intensive tasks such as gaming and video processing 5-6 hours a day, every day, then there will be some difference in electricity cost.
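A rough sketch of that weighting (the hours and the idle/load power gaps below are assumptions for illustration, not measured numbers):

```python
# Weight the power gap between the two CPUs by how long the machine spends in each state.
# All figures are illustrative assumptions, not measurements from the thread.

PRICE_PER_KWH = 0.12  # assumed electricity rate in $/kWh

usage = [
    # (hours per day, extra watts drawn by the less efficient CPU in that state)
    (6, 10),   # light use: browsing, music, video -> ~10 W gap
    (2, 80),   # heavy use: gaming, video processing -> ~80 W gap
]

extra_kwh_per_month = sum(hours * watts for hours, watts in usage) / 1000 * 30
extra_cost = extra_kwh_per_month * PRICE_PER_KWH
print(f"Extra energy: {extra_kwh_per_month:.1f} kWh/month, about ${extra_cost:.2f}/month")
# With these assumptions: 6*10 + 2*80 = 220 Wh/day -> 6.6 kWh/month -> roughly $0.79/month.
```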
#7 Posted by Lox_Cropek (3512 posts) -

Depends on the price you pay per kWh, but the difference is barely noticeable.

#8 Posted by Guppy507 (17398 posts) -
I'm just wondering where the hell you're finding a processor that uses 250 watts...
#9 Posted by SparrowPrince (9 posts) -
It was more of a hypothetical processor
#10 Posted by XaosII (16699 posts) -

"It was more of a hypothetical processor" (SparrowPrince)

There are way too many factors to give an exact answer.

A high-end CPU uses about 100 watts, and that's assuming 100% load. Unless that machine ends up being some kind of 24-hour video encoding box, it's not going to sit at a constant 100% load.

If anything, the GPU will likely consume a greater chunk of electricity than the CPU, so this hypothetical comparison works better, at least for the typical person, if you compare GPUs.

However, I did build a file/media server box. Because it was intended to stay on 24 hours a day, I opted for a very low-wattage CPU, an AMD E-350, which uses about 9 watts. It's not very powerful, but it allows me to run it fanless. Along with a fanless 200-watt PSU and six 2 TB Western Digital Green hard drives, it draws a steady 40 watts or so.

It's generally only under exceptional circumstances that energy consumption should be any kind of consideration.
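For scale, using the same arithmetic as the sketch earlier in the thread (with an assumed rate of $0.12 per kWh): a steady 40 W draw around the clock is 40 / 1000 × 24 × 30 ≈ 28.8 kWh a month, or roughly $3.50 a month.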

#11 Posted by SparrowPrince (9 posts) -
okay
#12 Posted by ydnarrewop (2149 posts) -
Interesting post :)
#13 Posted by imprezawrx500 (19187 posts) -
Your computer is never going to use enough power for you to really notice it on the power bill. Turn off your hot water and suddenly 70+% of your power bill is gone.