Some people pay more for electricity than in the United States. Where I live the price after taxes is around $0.12 per kWh, without any off-peak reductions, while wages are over 10x lower.
A $20 saving in electricity here, adjusted for buying power, would be over $200 there. In other words, $20 is worth to me what $200 is worth to you. I actually have the computer on 12 hours every day, with both heavy and light use, so if you do the math it comes to more than $20 per year in my case, but I gave that as a typical scenario.
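To make that math concrete, here is a rough sketch of the calculation; the 40 W average extra draw while the machine is in use is an assumed figure purely for illustration, while the rate, hours, and wage ratio come from the post:

```python
# Rough sketch of the cost math above. The 40 W average extra draw while
# in use is an assumed figure; the rate, hours, and 10x wage gap are from the post.
RATE_PER_KWH = 0.12      # $/kWh after taxes
HOURS_PER_DAY = 12       # computer on 12 hrs/day
EXTRA_WATTS = 40         # assumed average extra draw of the hungrier CPU while in use
WAGE_RATIO = 10          # local wages ~10x lower than US wages

extra_kwh_per_year = EXTRA_WATTS / 1000 * HOURS_PER_DAY * 365
extra_cost = extra_kwh_per_year * RATE_PER_KWH
print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year")
print(f"Extra cost: ${extra_cost:.2f}/year")
print(f"Adjusted for buying power: feels like ${extra_cost * WAGE_RATIO:.0f}/year to a US buyer")
```

With those assumed numbers it comes out to about $21 per year, or the equivalent of roughly $210 once adjusted for buying power.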
It may sound like peanuts to you, but I am upgrading to an i3-3220 and I love the 55 W power consumption.
You don't understand how little the difference is.
A good rule of thumb for US energy costs:
100W running continuously for an entire year costs about $100 extra on your utility bills.
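That rule of thumb is easy to sanity-check; the ~$0.115/kWh average US residential rate used below is an approximation:

```python
# Sanity check on the $100-per-100W rule of thumb.
US_RATE_PER_KWH = 0.115                   # approximate average US residential rate
watts = 100
kwh_per_year = watts / 1000 * 24 * 365    # running continuously all year
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * US_RATE_PER_KWH:.0f}/year")
```

100 W continuous works out to 876 kWh per year, which lands right around $100.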
However, when you are looking at things like CPU power usage, you need to realize you don't run it at peak for a whole year. You probably don't even run it at peak for one minute per day, unless you regularly use multi-threaded encoding apps or run a distributed computing project that loads all cores. Of course, if you are effectively paying 10X the US rate and make very little money, either of those activities is probably a dumb thing to do.
In reality, you probably have your computer idle, near-idle, or powered off for 23.5 hours a day. For the remaining 30 minutes you might run it at full load on 1-2 cores, but almost never on all 8 cores.
For a usage pattern like this, you will find the extra power draw of a Bulldozer CPU averages out to about 2 watts, or $2 extra per year, or about $0.16 extra on your monthly utility bill.
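Here is one way that ~2 W figure could shake out; the idle and load deltas below are assumed numbers, not measurements, chosen only to show how a big load-time gap shrinks once you weight it by how little time the CPU actually spends loaded:

```python
# Duty-cycle-weighted average of the extra power draw.
# The per-state deltas are assumed values for illustration.
IDLE_DELTA_W = 1      # assumed extra watts at idle / near-idle
LOAD_DELTA_W = 50     # assumed extra watts under a 1-2 core load
IDLE_HOURS = 23.5
LOAD_HOURS = 0.5
US_RATE_PER_KWH = 0.115

avg_delta_w = (IDLE_DELTA_W * IDLE_HOURS + LOAD_DELTA_W * LOAD_HOURS) / 24
extra_kwh = avg_delta_w / 1000 * 24 * 365
print(f"Average extra draw: {avg_delta_w:.1f} W")
print(f"Extra cost: ${extra_kwh * US_RATE_PER_KWH:.2f}/year "
      f"(${extra_kwh * US_RATE_PER_KWH / 12:.2f}/month)")
```

With those assumptions the average lands at about 2 W, roughly $2 per year at US rates.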
For those of you in foreign countries paying outrageous electricity prices, this is an insane extra $1.60 per month for using a power-hungry FX-8150 CPU. Of course, you probably saved $200 or more by buying an AMD CPU and board in the first place, so it will take a mere 10 years for the extra electricity cost to make the 8150 the worse deal.
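And the payback math, taking the assumed $200 up-front saving and the buying-power-adjusted $1.60/month figure at face value:

```python
# Break-even time for the assumed $200 platform saving against the
# buying-power-adjusted $1.60/month extra electricity cost.
upfront_saving = 200.0     # assumed saving on an AMD CPU + board
extra_per_month = 1.60     # 10x the ~$0.16/month US figure
months = upfront_saving / extra_per_month
print(f"Break-even after {months:.0f} months (~{months / 12:.1f} years)")
```

That comes out to about 125 months, a little over 10 years.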