This topic is locked from further discussion.
SparrowPrince: "It was more of a hypothetical processor."
There are way too many factors to get a definite answer.
A high-end CPU uses about 100 watts, and that's assuming 100% load. Unless the machine ends up being some kind of 24-hour video-encoding box, it's not going to sit at a constant 100% load.
If anything, the GPU will likely consume a larger chunk of the electricity than the CPU, so this hypothetical argument works better, at least for the typical person, if you compare GPUs instead.
However, I decided to build a file/media server box. Because it was intended to stay on 24 hours a day, I opted for a very low-wattage CPU, an AMD E-350, which uses about 9 watts. It's not very powerful, but it lets me run the box fanless. Along with a fanless 200-watt PSU and six 2 TB Western Digital Green hard drives, it draws about 40 watts steady.
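To put those wattage figures in perspective, here is a minimal sketch of the arithmetic for an always-on machine. The electricity price of $0.12/kWh is an assumption for illustration, not from the post; plug in your local rate.

```python
def annual_cost(watts, price_per_kwh=0.12, hours_per_day=24):
    """Estimated yearly electricity cost (dollars) for a constant draw.

    price_per_kwh is a hypothetical rate; substitute your own.
    """
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# The ~40 W server described above, running 24/7:
print(round(annual_cost(40), 2))    # roughly $42/year at the assumed rate

# A 100 W CPU pegged at full load 24/7, for comparison:
print(round(annual_cost(100), 2))   # roughly $105/year at the assumed rate
```

Even the worst case here is on the order of ten dollars a month, which is why the load factor (how often the machine is actually busy) matters more than the peak wattage on the spec sheet.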
It's generally only under exceptional circumstances that energy consumption should be any kind of consideration.