Where You Are in the Cycle
Are you trying to figure out how to keep the heat rejection systems for data
rooms, computer rooms, and telecom spaces running at maximum efficiency and
performance while also helping to reduce operating expenses within your
customer’s budget? That balancing act is just one of several challenges
contractors face today as mission critical environments continue to evolve.
Then and Now
Not too many years ago, it was fairly simple. The manager or operator of a
data center would call you up to install a couple of computer room air
conditioning (CRAC) units against the wall of the center. Then you would connect
the piping, controls, and the electrical service and perform a routine start-up
of the new units. Unless you sold the customer a maintenance agreement, the job
was done.
Back then, engineers and data center operators worked from a single premise:
How many watts per square foot would the information technology (IT) equipment
consume, and therefore reject into the room as waste heat?
The calculations were simple enough; the conversion from watts to Btuh was made,
and the appropriate tonnage CRAC units were ordered and installed. This process
worked pretty well for many years. Even though the CRAC unit placement within
the space was not ideal, there was usually enough tonnage to totally saturate
the room with cold air. The mindset concerning electrical costs was, “They are
what they are.”
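As a rough illustration of that old sizing arithmetic, here is a minimal sketch in Python. The load density and room area are assumptions chosen only for illustration; the only facts it relies on are the standard conversions of 1 W ≈ 3.412 Btuh and 12,000 Btuh per ton of cooling.

```python
# Hypothetical sizing sketch for the old watts-per-square-foot approach.
# Conversion factors: 1 W = 3.412 Btuh; 1 ton of cooling = 12,000 Btuh.

WATTS_PER_SQFT = 75        # assumed IT load density (illustrative only)
ROOM_AREA_SQFT = 2_000     # assumed data room floor area (illustrative only)

it_load_watts = WATTS_PER_SQFT * ROOM_AREA_SQFT   # heat rejected by the IT gear
heat_btuh = it_load_watts * 3.412                 # convert watts to Btuh
cooling_tons = heat_btuh / 12_000                 # convert Btuh to tons

print(f"IT heat load: {it_load_watts:,} W")
print(f"Heat rejection: {heat_btuh:,.0f} Btuh")
print(f"Required cooling: {cooling_tons:.1f} tons (before any safety factor)")
```

With those assumed numbers, the math lands at roughly 43 tons of CRAC capacity, which is about as far as the analysis used to go.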
But times have changed. One of the rules of thumb in IT is Moore’s law,
commonly paraphrased as, “Every 18 months or so, computing capacity, meaning
the work done by a computer, will double.” As that capacity climbs, the amount
of heat given off by the computing equipment rises as well.
That law and other factors drive many decisions in the IT field. As a result,
cooling those same data centers has taken on a whole new complexity. We can no
longer work from watts-per-square-foot calculations; today it is all about
kilowatts (kW) per rack of IT gear. The introduction of larger-capacity, faster,
and therefore hotter computers (especially blade-style servers) has forced many
engineers to analyze the physics of cooling the data room.
The focus now, and going forward, is heat removal and, in particular,
capturing the heat as close to the source as possible. Most of today’s CRAC and
computer room air handler (CRAH) units no longer sit against a wall at the
perimeter of the data center (an approach known as room cooling). Instead, they are strategically
located within the rows of IT equipment, or otherwise positioned to ingest all
of the hot air possible — the higher the return air temperature is, the more
efficient the overall operation is.
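To see why a hotter return matters, here is a hedged sketch built on the common sensible-heat approximation Q ≈ 1.08 × cfm × ΔT. The airflow and temperatures below are assumptions chosen only to make the comparison concrete.

```python
# Sensible heat removed by an air stream: Q (Btuh) ≈ 1.08 * cfm * delta_T (°F).
# Airflow and supply temperature are held constant; only the return air
# temperature changes, to show how a hotter return does more work per cfm.

AIRFLOW_CFM = 10_000       # assumed unit airflow (illustrative only)
SUPPLY_TEMP_F = 65.0       # assumed supply air temperature (illustrative only)

def sensible_btuh(cfm: float, return_temp_f: float, supply_temp_f: float) -> float:
    """Approximate sensible cooling delivered for a given return/supply split."""
    return 1.08 * cfm * (return_temp_f - supply_temp_f)

for return_temp in (75.0, 85.0, 95.0):
    q = sensible_btuh(AIRFLOW_CFM, return_temp, SUPPLY_TEMP_F)
    print(f"Return air {return_temp:.0f}°F -> {q:,.0f} Btuh "
          f"({q / 12_000:.1f} tons) from the same fan airflow")
```

Under those assumptions, raising the return from 75°F to 95°F triples the heat removed by the same airflow, which is exactly why capturing hot air at the source pays off.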
Many of the units themselves do not resemble air conditioners. More often
than not, they look like a rack of IT equipment, with the same dimensions and
the same name brand in many cases. This approach has come to be known as in-row
cooling.
Now the current mindset regarding electrical costs seems to be, “We must be
as efficient as possible in order to pay as little for power as possible.”
Ask the Right Questions
So how do you, as a service, installation, and integration partner for your
IT-savvy clients, help them save money? As always, start by asking questions.
What kind of input is your client seeking from you? Could you possibly save
the client operating dollars by studying the layout of the IT gear and making
placement suggestions based upon heat load?
Many existing data centers have an abundance of heat-removal capacity, read as
tons (or kW) of cooling. The problem is the distribution of the supply air and
the return air (which should be as hot as you can get it). Mixing the supply
and return air of any mechanical system is highly inefficient: you wind up
extracting only a portion of the heat from the room into the airstream, where
it can then be transferred into the chilled water or refrigerant and, finally,
expelled from the room.
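A simple mixing calculation shows how much capacity is lost when cold supply air short-circuits into the return stream instead of passing through the IT load. All temperatures, the airflow, and the bypass fractions below are assumptions for illustration only.

```python
# Effect of supply air bypassing the IT load and mixing into the return stream.
# All values are assumed; the point is the trend, not the specific numbers.

SUPPLY_TEMP_F = 65.0       # assumed supply air temperature
HOT_AISLE_TEMP_F = 95.0    # assumed temperature of air leaving the IT racks
AIRFLOW_CFM = 10_000       # assumed CRAC/CRAH airflow

for bypass_fraction in (0.0, 0.25, 0.50):
    # Return air is a blend of hot-aisle air and short-circuited supply air.
    return_temp = (bypass_fraction * SUPPLY_TEMP_F
                   + (1 - bypass_fraction) * HOT_AISLE_TEMP_F)
    q_btuh = 1.08 * AIRFLOW_CFM * (return_temp - SUPPLY_TEMP_F)
    print(f"{bypass_fraction:.0%} bypass -> return air {return_temp:.1f}°F, "
          f"about {q_btuh / 12_000:.1f} tons actually removed")
```

In this sketch, letting half the supply air short-circuit cuts the heat actually removed from 27 tons to about 13.5 tons, even though the installed capacity never changed.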
Would it make sense to duct or re-duct either the supply or return air in an
existing facility? Is this the only site they have in operation, or is there a
disaster recovery (DR) site elsewhere?
Also, start looking around and checking out what the IT staff is installing
and working on. Are the rows of equipment placed in a hot aisle-cold aisle
design? Do you have to turn the set points down low to keep the cooling on for
longer cycles? Are the CRAC units fighting each other, meaning some are cooling,
some are reheating, and others are fighting the humidity set point? All of these
are signs of inefficiency and waste.
What about containment systems, freezer-style strip curtains, rack hats, end caps, etc.?
All of these devices and products, including fire-proof foam blocks and rack
U-space blanking panels, are an increasingly important part of mission critical
environments because they keep the heat isolated, allowing the heat rejection
systems to do their job to the best of their ability.
What happens if you lose a compressor because poorly placed perforated tiles
are short cycling air back to the unit? Will you drop the critical load? You
need to know what level of service your customer commits to provide for its own
customers, typically specified as a Tier 1, 2, 3, or 4 service level agreement
(SLA). What are the uptime expectations? Is it 99.999 percent of the time? If
so, that still allows about 5¼ minutes of outage a year. Is that OK? Can you
commit to responding to those requirements?
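For reference, the downtime behind those uptime percentages is straightforward arithmetic; the short sketch below simply converts a few common availability targets into allowable minutes of outage per year.

```python
# Allowed downtime per year for common availability ("nines") targets.
MINUTES_PER_YEAR = 365 * 24 * 60   # 525,600 minutes in a non-leap year

for availability in (0.99, 0.999, 0.9999, 0.99999):
    downtime_min = MINUTES_PER_YEAR * (1 - availability)
    print(f"{availability * 100:g}% uptime -> "
          f"{downtime_min:,.1f} minutes of outage allowed per year")
```

Five nines works out to roughly 5.3 minutes a year; four nines is closer to 53 minutes, and three nines is nearly nine hours. Knowing which of those your customer has promised changes how you staff and respond.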
As a good partner, you must also be highly cognizant of any issues associated
with the green movement. Many data centers are connected in some way to a
LEED-certified building. So, what impact may that have on operations, operating
procedures, chemicals for coil cleaning, etc.? Are the CRAC/CRAH units draining
their condensate into a cistern for non-potable irrigation water or cooling
tower make-up water?
These are just a few of the interwoven complexities of operating a green
building and an efficient data center simultaneously. All of this must happen
while striking a balance with the requirements of an ever-growing list of
certification agencies (and, oh, by the way, keeping the equipment energized,
on-line, and removing heat from the critical space).
Be a Resource
When it comes down to it, HVAC is only a slice of the pie that critical
facilities managers are responsible for. As contractors, we need to make it
easy for those managers to communicate their needs to us. If you learn what is
driving their business, maybe you can figure out a way to help them stay
on-line. Then maybe, just maybe, you will be viewed as a resource. No one wants to be just a
vendor or just the heating and air guy.
By showing your willingness to step outside of the normal service provider or
contractor stereotype, you will become an ever-more-important partner to your
customer. After all, isn’t that where we all want to be?
Greg Crumpton can be reached at greg@airtight.co or via cell at 704-807-9877.
See this and more articles at www.airtight.co.