Thursday, March 30, 2006

Server Cost

The other day I ran across an interesting statement in a technical article: the author asserted that recent power cost hikes had made the cost of power over a server's lifetime greater than the cost of the server itself. The author didn't back that up with any analysis, but if he's correct, it really flips the cost picture of just a few years ago on its head. Intuitively it seemed likely to me, since the cost of servers keeps falling while the price of power keeps climbing.

So I decided to do a little analysis to see if this assertion passes the laugh test. Long story short: it does. The chart at right tells the story. It shows the net present value (NPV) of the lifetime power cost for a server, for any combination of server power consumption and average power cost per kilowatt hour. For this analysis, I assumed a four year server lifetime and an 8% discount rate. Reading off the chart, if you have a 400 watt server and you're paying 20 cents a kilowatt hour, the lifetime power cost is about $4,300, and chances are you paid substantially less than that for the server itself. As I write this, we're paying 32 cents per kilowatt hour at my home, and the last server I personally purchased cost about $2,400.

There's another factor that makes it even worse: almost all IT datacenters (or server closets!) use air conditioning to remove the heat the servers generate, and that adds another 20% to 30% to the power consumption.
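Since the post doesn't spell out the arithmetic behind the chart, here is a minimal sketch of that NPV calculation in Python. The function name, the year-end discounting convention, and the cooling_overhead parameter are my own assumptions for illustration; the chart's exact assumptions may differ, so treat the output as illustrative rather than a reproduction of the chart's numbers.

```python
def lifetime_power_npv(watts, dollars_per_kwh, years=4, discount_rate=0.08,
                       cooling_overhead=0.0):
    """NPV of a server's lifetime power bill.

    Treats each year's power spend as a level payment discounted at
    year-end. cooling_overhead (e.g. 0.30) models the extra power the
    air conditioning burns to remove the server's heat.
    """
    annual_kwh = watts / 1000 * 24 * 365                  # continuous draw
    annual_cost = annual_kwh * dollars_per_kwh * (1 + cooling_overhead)
    return sum(annual_cost / (1 + discount_rate) ** t
               for t in range(1, years + 1))

# The 400 W / 20-cents-per-kWh point discussed above, with and
# without a 30% cooling overhead:
print(f"${lifetime_power_npv(400, 0.20):,.0f}")
print(f"${lifetime_power_npv(400, 0.20, cooling_overhead=0.30):,.0f}")
```

Whether the result lands above or below the price of the box depends heavily on the utilization, billing, and cooling assumptions you plug in.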

Sounds to me like the next frontier in lowering IT hardware costs lies with “green” servers — boxes optimized for lower power consumption per delivered MIPS…
