So, I was reading a friend's blog, where he bought a rather nifty-sounding device called a Watts up?.
Basically, it’s a power meter you plug in between an electrical device and a socket, and it measures how much power the device uses. Simple, but clever. Apparently one version can even send info out over a serial port. Nifty.
So Chip measured the power usage of various computers of his doing different tasks. The interesting thing is the dramatically higher power usage when heavily using the CPU. Now, this isn’t really news, but it got me thinking about how much energy is wasted because of inefficient code.
I never really thought of optimizing code as an environmentally friendly thing to do. But at least for widely used software, it’s probably worth it just for the energy saved. A couple watts here and there times a million boxes, and it’s probably more than worth the programmer time to look at additional optimizations.
I wonder how many kilowatt-hours are consumed playing World of Warcraft? Let’s do a little dubious math:
Number of WoW players * Average Hours They Play * Average Power Usage = ??
mmogchart.com puts the number of players at 2,000,000. Chip’s numbers seem to indicate ~180 watts for a gaming system, but we’ll be conservative and call it 100 watts.
I have no idea how long people play WoW each day, aside from “too much”, but let’s say 4 hours.
2,000,000 players * 100 watts * 4 hours = 800,000,000 watt-hours, or 800,000 kWh a day.
Okay, so not all that much in the grand scheme of things, considering Shearon Harris generates 860 MWe; 800,000 kWh is less than a single hour of that one plant’s output.
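If you want to fiddle with the numbers yourself, here’s a quick back-of-the-envelope sketch in Python. The player count, wattage, and hours-per-day are just the guesses from above, so swap in your own.

```python
# Dubious back-of-the-envelope math: daily energy burned playing WoW.
# All inputs are rough guesses from the post; tweak them as you like.

players = 2_000_000        # mmogchart.com estimate
watts_per_system = 100     # conservative guess (Chip measured ~180 W while gaming)
hours_per_day = 4          # pure speculation ("too much")

wow_kwh_per_day = players * watts_per_system * hours_per_day / 1000
print(f"WoW energy use: {wow_kwh_per_day:,.0f} kWh/day")

# Compare with the Shearon Harris nuclear plant (~860 MWe).
plant_kw = 860_000
hours_of_plant_output = wow_kwh_per_day / plant_kw
print(f"That's about {hours_of_plant_output:.1f} hours of Shearon Harris output")
```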
Now if we figure out all the CPU cycles wasted decoding images of cats posted on the internet…
Blizzard puts the community size of WoW at 7 million in their latest packaging (and it looks fairly honest, since you can see they made boxes with both 5 and 6 million badges before, if you visit a stocked Microcenter ;-)