Energy consumption vs % utilization?
nils.ketelsen at kuehne-nagel.com
Tue Oct 26 18:09:18 UTC 2004
On Tue, Oct 26, 2004 at 01:52:51PM -0400, Gregory (Grisha) Trubetskoy wrote:
> Sorry, this is somewhat OT.
Also sorry, but I think the question itself is completely flawed.
> I'm looking for information on energy consumption vs percent utilization.
> In other words if your datacenter consumes 720 MWh per month, yet on
> average your servers are 98% underutilized, you are wasting a lot of
> energy (a hot topic these days). Does anyone here have any real data on
What does 98% underutilized mean?
What is the utilization of a machine whose fully built-out RAM is
100% used, while the CPU is only 2% busy?
What is the utilization of a system that uses two percent of the
memory and two percent of the available CPU time, when the policy
of the top-secret organization owning it requires that the
application run on a separate machine?
Sure, many machines might (computing-power-wise) be able to
handle firewalling, routing, web serving, database serving, mail serving and
storing accounting data, but there may still be very good reasons to
separate these onto different machines.
If you take points like policy requirements (see above:
an application might by policy utilize a machine 100%), different types
of resources, failover etc. into account, you might end up
with different numbers than by looking at the CPU alone (and I
have the feeling that is what you did, or were intending to do).
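To make the point concrete, here is a small sketch (not from the original
mail, with invented numbers) of how a CPU-only view, a bottleneck view, and
an averaged view of the same box can disagree wildly:

```python
# Illustrative only: per-resource utilization of a hypothetical server.
# All numbers are made up to show how the chosen metric drives the answer.
resources = {
    "cpu": 0.02,      # 2% CPU busy
    "ram": 1.00,      # RAM fully used
    "disk_io": 0.10,
    "net": 0.05,
}

cpu_only = resources["cpu"]
bottleneck = max(resources.values())  # "full" on its tightest resource
average = sum(resources.values()) / len(resources)

print(f"CPU-only utilization:   {cpu_only:.0%}")    # 2%
print(f"Bottleneck utilization: {bottleneck:.0%}")  # 100%
print(f"Average utilization:    {average:.0%}")     # 29%
```

The same machine is 98% idle, completely full, or about a quarter used,
depending on which number you pick, and none of these captures the policy
or failover reasons for keeping it separate.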
Actually, I think nobody calculates "real" utilization,
as there are too many soft factors to take into account.