Well, I'm looking into the real cost. The $0.07 an hour was a figure given to me when I called the electric company to find out why my bill had doubled, but when I do the real math ([(watts * hours used) / 1000] * cost per kWh) things don't add up. Hmm, you guys have got me thinking the dude I talked to at the electric company might be high. (Quick sanity-check numbers at the bottom of this post.)

JACK SH!T said:
How do you determine what it costs to run the PC? I mean, how do you know it costs $0.07 and not $0.02 or $0.10 per hour? I don't think you can tell from your bill, because all kinds of other things come into play there: lights, TV, VCR/DVD, stereo (AM/FM vs. CD vs. tape), microwave, stove top, oven, fridge, heat/AC fans, clocks, etc.
'Bout the only way I can see is to run nothing at all but one computer and monitor for one month. That way you could break it down, kinda. The cost varies according to supply and demand. In July it might max out at $0.07, but the rest of the year it might never go above $0.009.
If it averages $0.02 an hour most of the year, that's $0.48 per day running 24/7, or about $175.20 per year.
I don't know... I'm thinking too much, I guess.
For sh!tz I ran the numbers on $0.009: that works out to about $78.84 per year.
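Since the thread keeps circling back to the same arithmetic, here's a minimal Python sketch of that (watts * hours used / 1000) * price-per-kWh formula, plus the annualized totals for the per-hour rates people have tossed around. The 400 W draw and $0.10/kWh rate in the example are assumptions of mine, not numbers from the thread.

```python
# Minimal sketch (my example numbers, not from the thread): the formula quoted above,
#   cost per hour = (watts * hours used / 1000) * price per kWh,
# plus annualized totals for the per-hour running costs mentioned in the thread.

def hourly_cost(watts: float, price_per_kwh: float) -> float:
    """Cost of running a load for one hour: (watts / 1000) * price per kWh."""
    return (watts / 1000.0) * price_per_kwh

def yearly_cost(cost_per_hour: float, hours_per_day: float = 24.0) -> float:
    """Annualize an hourly running cost, assuming it runs every day of the year."""
    return cost_per_hour * hours_per_day * 365

# Example with assumed figures: a 400 W PC + monitor at $0.10/kWh.
print(f"400 W at $0.10/kWh: ${hourly_cost(400, 0.10):.3f} per hour")  # $0.040

# Annualizing the per-hour figures from the thread, running 24/7:
for rate in (0.07, 0.02, 0.009):
    print(f"${rate}/hr -> ${yearly_cost(rate):.2f}/yr")
# $0.07/hr -> $613.20/yr, $0.02/hr -> $175.20/yr, $0.009/hr -> $78.84/yr
```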