nanog mailing list archives

Re: Energy consumption vs % utilization?


From: "Steven M. Bellovin" <smb () research att com>
Date: Tue, 26 Oct 2004 16:01:24 -0400


In message <Pine.WNT.4.61.0410261429110.3340 () vanadium hq nac net>, Alex Rubenstein writes:


Hello,

I've done quite a bit of studying of power usage and such in datacenters 
over the last year or so.

I'm looking for information on energy consumption vs. percent utilization. 
In other words, if your datacenter consumes 720 MWh per month, yet on 
average your servers are 98% underutilized, you are wasting a lot of 
energy (a hot topic these days). Does anyone here have any real data on 
this?

I've never done a study on power used vs. CPU utilization, but my guess is 
that the heat generated by a PC remains fairly constant -- in the grand 
scheme of things -- no matter what your utilization is.


I doubt that very much, or we wouldn't have variable speed fans.  I've 
monitored CPU temperature when doing compilations; it goes up 
significantly.  That suggests that the CPU is drawing more power at 
such times.

Of course, there's another implication -- if the CPU isn't using the 
power, the draw from the power line is less, which means that much less 
electricity is being used.
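The effect Bellovin observes is consistent with the standard CMOS switching-power model, under which a chip's dynamic power scales with its activity factor (roughly, how many gates are toggling). The sketch below is purely illustrative -- the activity factors, effective switched capacitance, voltage, and frequency are assumed round numbers, not measurements of any real CPU:

```python
# Illustrative sketch: dynamic CMOS power P_dyn ~ alpha * C * V^2 * f,
# where alpha is the activity factor (fraction of capacitance switched
# per cycle). All constants below are assumptions chosen for illustration.

def dynamic_power_watts(activity, cap_nf, volts, freq_ghz):
    """Classic CMOS switching-power estimate: alpha * C * V^2 * f."""
    return activity * (cap_nf * 1e-9) * volts**2 * (freq_ghz * 1e9)

# Hypothetical 2 GHz part at 1.2 V with 20 nF effective switched capacitance.
idle = dynamic_power_watts(0.05, 20.0, 1.2, 2.0)  # mostly-idle machine
busy = dynamic_power_watts(0.90, 20.0, 1.2, 2.0)  # compile-style workload

print(f"idle ~ {idle:.1f} W, busy ~ {busy:.1f} W")
```

With these assumed numbers the busy figure comes out roughly 18x the idle figure, which is why variable-speed fans (and, on modern CPUs, dynamic voltage and frequency scaling) exist at all: static leakage aside, power tracks activity rather than staying flat.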

                --Steve Bellovin, http://www.research.att.com/~smb


