nanog mailing list archives

Re: WSJ: Big tech firms seeking power


From: Alex Rubenstein <alex () nac net>
Date: Sat, 17 Jun 2006 00:50:04 -0400 (Eastern Daylight Time)




> What is the amount of energy coming out of a server as heat as opposed to what you put in as electricity? My guess would be pretty close to 100%, but is it really so? And I've also been told that you need approx 1/3 of the energy taken out thru cooling to cool it? So that would mean that to sustain a 100W server you really need approx 130-140W of power when cooling is included in the equation. Is this a correct assumption?

Based on my real-world experience, and on talking to a few folks, it's very close to 100%. Most people assume 100% for the purpose of calculating cooling loads.
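The arithmetic in the question can be sketched as follows. Note that the 1/3 cooling-overhead figure is the questioner's assumption, not a measured value:

```python
# Total facility power for a server, assuming (as in the question above)
# that ~100% of electrical input becomes heat, and that cooling consumes
# roughly 1/3 of the heat energy removed. Both figures are assumptions
# from the thread, not measurements.

def total_power(server_watts, cooling_overhead=1.0 / 3.0):
    """Server power plus the power spent removing its heat."""
    heat_watts = server_watts  # assume 100% of input becomes heat
    cooling_watts = heat_watts * cooling_overhead
    return server_watts + cooling_watts

print(total_power(100))  # ~133 W, consistent with the 130-140 W estimate
```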

However, the very scientifically minded will point out that some of the power goes into mechanical work: moving hard drive heads, spinning fans, and so on. True, but irrelevant in practice, because the amount is almost certainly too small to measure.

One could do the exercise of putting a computer in a well-insulated box and measuring power in versus rate of temperature rise. Volunteers? :)
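A hypothetical back-of-the-envelope for that experiment: if all electrical input becomes heat, the air in a sealed box should warm at a rate of dT/dt = P / (m * c). The numbers below are illustrative only, and the calculation ignores the server's own thermal mass, which would slow the rise considerably in practice:

```python
# Expected temperature rise in a sealed, insulated box, assuming all
# electrical input becomes heat. Illustrative constants; ignores the
# thermal mass of the server itself.

AIR_SPECIFIC_HEAT = 1005.0  # J/(kg*K), dry air near room temperature
AIR_DENSITY = 1.2           # kg/m^3, approximate at sea level

def temp_rise_per_minute(power_watts, box_volume_m3):
    """Predicted air temperature rise (K/min) in the insulated box."""
    air_mass = AIR_DENSITY * box_volume_m3
    return power_watts / (air_mass * AIR_SPECIFIC_HEAT) * 60.0

# A 100 W server in a 1 m^3 box: roughly 5 K per minute.
print(round(temp_rise_per_minute(100, 1.0), 1))
```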




--
Alex Rubenstein, AR97, K2AHR, alex () nac net, latency, Al Reuben
Net Access Corporation, 800-NET-ME-36, http://www.nac.net


