WSJ: Big tech firms seeking power

Alex Rubenstein alex at nac.net
Fri Jun 16 21:24:27 UTC 2006




On Fri, 16 Jun 2006, Matthew Crocker wrote:

>
>> I wonder just how much power it takes to cool 450,000 servers.
>
> 450,000 servers * 100 Watts/Server = 45,000,000 watts / 3.413 watts/BTU = 
> 13.1 Million BTU / 12000 BTU/Ton = 1100 Tons of cooling

Error: you MULTIPLY by 3.413 to go from watts to BTU, not divide. It'd be 
more like 154,000,000 BTU, /12000 or 12,798 tons.

Also, at 100 watts you are assuming Celerons with single hard drives. We 
see more like 120 to 240 watts depending on config; 100 would be low.
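If anyone wants to sanity-check the conversion, here it is in a few lines of Python (my assumptions: 3.413 BTU/hr per watt, 12,000 BTU/hr per ton, and the 100 watts/server from the original post):

```python
# Watts -> BTU/hr -> tons of cooling, using the thread's figures.
servers = 450_000
watts_per_server = 100            # low end; 120-240 W is more realistic
total_watts = servers * watts_per_server        # 45,000,000 W
btu_per_hr = total_watts * 3.413                # MULTIPLY to get BTU/hr
tons = btu_per_hr / 12_000                      # 12,000 BTU/hr per ton
print(f"{btu_per_hr:,.0f} BTU/hr -> {tons:,.0f} tons of cooling")
```

Bump `watts_per_server` to 240 and the tonnage more than doubles, which is the real point.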


> A 30 Ton Liebert system runs about 80 amps @ 480 volts or 38400 watts, 
> you'll need at least 40 or them to cool 1100 tons which is 1536 Kw * 24 hours 
> * 7 days * 4.3 weeks = 1,110,000 KwH/month * $0.10/KwH = $111,000 /month in 
> cooling.

80 amps @ 480 is 80 * 480 * 1.73, or 66 kw. However, they don't draw that 
much. A 30 ton unit, worst case (115 degrees outside across the condenser) 
will be about 50 kw, assuming you do not have humidification or reheats 
turned on.
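For the three-phase math, a quick sketch (the 1.73 is sqrt(3); unity power factor assumed, which is generous):

```python
import math

# Three-phase apparent power: P = V * I * sqrt(3).
volts, amps = 480, 80
nameplate_kw = volts * amps * math.sqrt(3) / 1000   # ~66.5 kw breaker rating
actual_kw = 50          # rough worst-case draw for a 30 ton unit, per above
print(f"nameplate {nameplate_kw:.1f} kw vs ~{actual_kw} kw actual draw")
```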

Second issue: you are assuming 100% cooling efficiency, or, in other 
words, that you'd have perfect airflow, perfect air return, etc. Never 
happens, especially when you have customers who are idiots.

Third issue: you are assuming there is no heat loss or gain in the 
structure of the building. This could be very significant. Let's assume 
it's not.

It's likely in an environment like this, you'd have more like 14000 tons. 
14000 / 30 = 466 units, @ 50 kw/unit, 23,300,000 watts, / 1000 * 24 * 
30.4375 (avg days in a month) = 17,020,000 kw-hrs, @ $0.12 (more likely 
with today's fuel prices, unless you are in Kentucky) = $2,042,400/month.
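Same numbers as Python, if you want to plug in your own rate:

```python
units = 14_000 // 30            # 466 30-ton units (truncating, as above)
kw_per_unit = 50                # worst-case draw per unit
hours = 24 * 30.4375            # 730.5 hours in an average month
kwh_per_month = units * kw_per_unit * hours
cost = kwh_per_month * 0.12     # $/KwH; adjust for your utility
print(f"{kwh_per_month:,.0f} KwH/month -> ${cost:,.0f}/month in cooling")
```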

Also, don't forget the original 450,000 servers at 100 watts (45 mw) would 
be $3,944,700/month in power. Also, 450,000 1U servers at 40/rack would be 
11,250 racks, which at 10 sq-ft a rack would be 112,000 sq-ft of 
datacenter floor space (triple or, more likely, quadruple that for space 
for HVAC, generators, switchgear, UPSs, etc). That'd be 500,000 sq-ft at 
minimum.
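The server-power and floor-space figures, same style (40 1U boxes per rack and 10 sq-ft per rack are my round numbers, not gospel):

```python
server_kw = 45_000              # 450,000 servers at 100 W = 45 mw
hours = 24 * 30.4375            # 730.5 hours in an average month
power_cost = server_kw * hours * 0.12    # ~$3.94 mm/month at $0.12/KwH
racks = 450_000 // 40                    # 11,250 racks of 1U servers
raised_floor = racks * 10                # ~112,500 sq-ft (I rounded to 112k)
total_sqft = raised_floor * 4            # HVAC, generators, switchgear, UPSs
print(f"${power_cost:,.0f}/month, {raised_floor:,} sq-ft of raised floor, "
      f"~{total_sqft:,} sq-ft total")
```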

Total is $5,987,000/month, but you haven't ROIed the millions in electrical 
gear (think big: this is about 68 megawatts; $250k each for a 2 mw 
generator (you'd need 40, $10 mm), $50k each for a 500 kva UPS (you'd need 
80, $4 mm), millions in panels, breakers, piping, copper wire (700% 
increase in copper pricing in the last 24 months, people), etc.). Oh, and 
466 Liebert 30-ton HVACs, probably $25 to $40k each installed ($11 
million). Oh, and no one has installed it yet, and you haven't paid rent 
on the facility, which will take 2 years to build and carry hundreds of 
workers' salaries.

Take $6mm/month, divide by 450,000 servers, $13.33/month/server.
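Totals, so nobody has to redo the division (the capital figures are the rough 2006 guesses above, not quotes):

```python
cooling = 2_042_400
server_power = 3_944_700
monthly = cooling + server_power          # ~$5.99 mm/month operating
per_server = 6_000_000 / 450_000          # rounding monthly up to $6 mm
# Generators + UPSs + HVAC at the low end of my guesses:
capex = 40 * 250_000 + 80 * 50_000 + 466 * 25_000
print(f"${monthly:,}/month, ${per_server:.2f}/server/month, "
      f">= ${capex:,} in capital before wire, panels, and labor")
```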

Oh, and 68 megawatts over 112k sq-ft of floor space is 607 watts/sq-ft. 
That's about 6 times what most centers built in the last couple of years 
are designed for.

But wait, there is more. Just a point of comparison -- Oyster Creek 
Nuclear Power generation plant, located here on the Jersey Shore, produces 
636 megawatts. You'd take one-tenth of that capacity -- in a building that 
would sit on a 10 or 20 acre chunk of land. I put this into the 'unlikely' 
category. The substation alone to handle stepping 68 mwatts down from 
transmission to 480v would probably be 4 acres. And 68 megawatts of power 
at 480 volts is 81,888 amps. A typical 200,000 sq-ft multi-tenant office 
building has 1600 amps of service; this would be the equivalent of 50 
buildings.
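If you want the amperage and density math in one place (I use sqrt(3) here instead of 1.73, so it lands a hair under 81,888):

```python
import math

watts = 68_000_000
volts = 480
amps = watts / (volts * math.sqrt(3))   # three-phase; ~81,800 A
buildings = amps / 1_600                # vs a typical 200k sq-ft office feed
density = watts / 112_000               # ~607 watts/sq-ft of raised floor
print(f"{amps:,.0f} A @ 480v, roughly {buildings:.0f} office buildings "
      f"worth of service, at {density:.0f} watts/sq-ft")
```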

Having fun yet?

A 30 ton Liebert takes about 30 sq-ft of floor space; 466 of them would be 
13,980 sq-ft. If you use a drycooler system, they are about 100 sq-ft 
each, and you'd need 233 of them (60 ton DDNT940's), or 23,300 sq-ft of 
roof space. Each of those weighs 2,640 pounds, for a total of 615,000 
pounds, or 308 tons (of weight, not HVAC capacity). I won't even spend the 
CPU cycles figuring out how many gallons of glycol this would be, but a 
good guess would be about 50,000 gallons. That'd be about a quarter-million 
dollars in glycol.
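And the drycooler footprint, for completeness (100 sq-ft and 2,640 lb per unit are the DDNT940-ish numbers I'm assuming above):

```python
drycoolers = 233                 # 60 ton units covering ~14,000 tons
roof_sqft = drycoolers * 100     # ~100 sq-ft of roof each
weight_lb = drycoolers * 2_640
weight_tons = weight_lb / 2_000  # weight, not HVAC capacity
print(f"{roof_sqft:,} sq-ft of roof, {weight_lb:,} lb "
      f"(~{weight_tons:.0f} tons) of drycoolers")
```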

I'm tired now, time to climb back in my hole. In other words, don't get 
me started on the datacenter density issue.


-- 
Alex Rubenstein, AR97, K2AHR, alex at nac.net, latency, Al Reuben
Net Access Corporation, 800-NET-ME-36, http://www.nac.net





More information about the NANOG mailing list