Hot or not? Know your data center's environment

Monitoring environmental conditions in the data center is absolutely critical to avoiding disasters, or surviving them

Whether we're designing, building, or maintaining data centers, we need to pay attention to a lot more than how and where the bits flow. We also need to know about the physical environment. We need to keep all the gear at a reasonable temperature, and we need to protect it from all kinds of possible mayhem -- from unauthorized access to burst pipes.

Environmental monitoring is one element of data center construction and maintenance that tends to fall through the cracks. In many cases, design consultants were brought in for the overall build and then moved on, leaving behind a data center well equipped with fire suppression and cooling systems but lacking any remote telemetry or monitoring. In other cases, the server crew monitors the servers, the network staff monitors the network, and the storage group monitors the storage, but nobody keeps tabs on the room itself.


The first order of business for a new data center (or a retrofit of an existing one) is to implement remote telemetry and environmental monitoring -- not in just one area of the room, but in multiple places around it. APC and others make environmental monitoring systems for exactly this type of distributed deployment, and no data center should be without one.

Taking the temperature

As for what to monitor and how, my rule of thumb is that you can never gather too many statistics and data points. Obviously we need to monitor temperature, but we need to do so at the ceiling and the floor of the room, in multiple locations, and at the rack inlet. Ideally, every few racks will have a temperature sensor placed at the front of the rack, where it can measure the temperature of the air entering the hardware.
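
As a concrete illustration, here's a minimal sketch of what polling those distributed sensors might look like, using the net-snmp snmpget tool from Python. The sensor addresses, community string, and OID are placeholders -- check your monitoring appliance's MIB for the real values.

# Poll distributed temperature sensors over SNMP.
# A minimal sketch: the hosts, community string, and OID below are
# hypothetical placeholders -- consult your appliance's MIB for real values.
# Requires the net-snmp command-line tools (snmpget) to be installed.
import subprocess

COMMUNITY = "public"                      # assumption: read-only community string
TEMP_OID = "1.3.6.1.4.1.99999.1.1.0"      # placeholder OID for a temperature reading

SENSORS = {
    "rack-07-inlet": "10.0.50.17",        # hypothetical sensor addresses
    "rack-12-inlet": "10.0.50.22",
    "ceiling-north": "10.0.50.30",
    "floor-south":   "10.0.50.31",
}

def read_temp(host: str) -> float:
    """Fetch a single temperature value (degrees C) from one sensor."""
    out = subprocess.run(
        ["snmpget", "-v2c", "-c", COMMUNITY, "-Oqv", host, TEMP_OID],
        capture_output=True, text=True, check=True, timeout=5,
    )
    return float(out.stdout.strip())

if __name__ == "__main__":
    for name, host in SENSORS.items():
        try:
            print(f"{name}: {read_temp(host):.1f} C")
        except Exception as exc:
            # A sensor that stops answering is itself worth an alert.
            print(f"{name}: UNREACHABLE ({exc})")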

Ambient temperatures are extremely useful as well, as are temperature readings in a hot aisle, if present. Measuring dew point, humidity, and airflow is also important, and these too should be measured in multiple locations. Door switch sensors should be used on rack doors to note when they're opened.
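
Dew point is often derived rather than measured directly. If your sensors report temperature and relative humidity, the standard Magnus-formula approximation will get you there; this sketch is generic and not tied to any vendor's hardware.

import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (C) from temperature (C) and relative humidity (%)
    using the Magnus formula, accurate to roughly +/-0.35 C over 0-60 C."""
    a, b = 17.62, 243.12  # Magnus coefficients (Sonntag 1990)
    gamma = math.log(rel_humidity_pct / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

# Example: 24 C at 45% RH puts the dew point around 11.3 C -- if any cold
# surface in the room drops below that, condensation becomes a risk.
print(f"{dew_point_c(24.0, 45.0):.1f}")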

I like to see water presence sensors placed near racks, near AC units, and near any potential water source, such as overhead pipes that couldn't be diverted for whatever reason. You can also get rope sensors that run the length of rack aisles. These sensors are simple, triggering whenever they come into contact with water on the floor. If you have a water leak, you need to know about it as soon as possible. Vibration and smoke sensors, while perhaps not as critical as the others, offer further monitoring options.
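
Because a leak demands immediate attention, SNMP traps are a better fit than periodic polling here: the sensor pushes an alert the moment the rope gets wet. One way to catch those traps is net-snmp's snmptrapd with a traphandle script, sketched below; the leak-detection OID is a placeholder for whatever your sensor vendor's MIB actually defines.

#!/usr/bin/env python3
# traphandle script for snmptrapd: called with the trap details on stdin.
# A sketch -- the leak-detection OID below is a placeholder; substitute the
# trap OID from your sensor vendor's MIB. Wire it up in snmptrapd.conf with:
#   authCommunity log,execute public
#   traphandle default /usr/local/bin/leak_alert.py
import sys
import syslog

LEAK_TRAP_OID = "1.3.6.1.4.1.99999.2.0.1"  # placeholder trap OID

lines = sys.stdin.read().splitlines()
hostname, address = lines[0], lines[1]     # snmptrapd sends these two lines first
varbinds = lines[2:]                       # then one "OID value" pair per line

if any(LEAK_TRAP_OID in vb for vb in varbinds):
    syslog.syslog(syslog.LOG_CRIT, f"WATER DETECTED near {hostname} ({address})")
    # Hand off to your paging or escalation system here.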

And, of course, you need cameras. No corner of the data center should be out of sight of at least one camera. Fixed-position and pan-tilt-zoom cameras should be used together, and at least a few should have infrared features to allow for visibility in the dark.

What to do with all that data

With all of these data collection points, we have high visibility into the data center itself, not just into the servers and other hardware in the room. All of this data should be retained, tracked, and trended. Using SNMP and tools such as Cacti, or a vendor's own solution, you should be able to recall and view data from any sensor at any point in time -- to see whether ambient temps have crept up over the months as new gear was added, or to verify exactly when a particular rack door was opened.
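
Cacti keeps its history in round-robin databases through rrdtool, and you can do the same for ad hoc sensors without the full Cacti stack. A minimal sketch, assuming rrdtool is installed: one reading per minute, per-minute detail for a day, and hourly averages for a year.

# Store sensor readings the way Cacti does: in an rrdtool round-robin database.
# A sketch, assuming rrdtool is installed; pair it with the polling loop above.
import os
import subprocess

RRD = "rack07_inlet_temp.rrd"

if not os.path.exists(RRD):
    subprocess.run([
        "rrdtool", "create", RRD, "--step", "60",
        "DS:temp:GAUGE:120:-40:100",      # one gauge, 120s heartbeat, sane bounds
        "RRA:AVERAGE:0.5:1:1440",         # per-minute detail for one day
        "RRA:AVERAGE:0.5:60:8760",        # hourly averages for one year
    ], check=True)

def record(temp_c: float) -> None:
    # "N" timestamps the sample with the current time.
    subprocess.run(["rrdtool", "update", RRD, f"N:{temp_c}"], check=True)

record(23.4)  # call once a minute from your polling loop or cron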

As for alerting, it will take some time to settle on acceptable thresholds for some sensors and to tune the escalation of alert messages delivered by email and text. In a high-traffic data center you may not want to know every time someone enters the room, but in a low-traffic data center, you probably do. Your cameras should capture stills or video whenever motion sensors are triggered, and those images should be shipped to a server for storage, ideally synchronized to an offsite system.
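
As a starting point, here's a minimal two-tier threshold-and-escalation sketch using only Python's standard library. The thresholds, addresses, and SMTP relay are assumptions for illustration; the "text" tier relies on a carrier's email-to-SMS gateway address.

# Threshold alerting with a simple two-tier escalation, standard library only.
# The thresholds, addresses, and SMTP relay below are assumptions -- adjust to
# your environment. Texts are sent via a carrier email-to-SMS gateway address.
import smtplib
from email.message import EmailMessage

SMTP_RELAY = "smtp.example.com"             # hypothetical internal mail relay
WARN_C, CRIT_C = 27.0, 32.0                 # example rack-inlet thresholds

def notify(subject: str, body: str, to: str) -> None:
    msg = EmailMessage()
    msg["From"] = "dc-monitor@example.com"
    msg["To"] = to
    msg["Subject"] = subject
    msg.set_content(body)
    with smtplib.SMTP(SMTP_RELAY) as s:
        s.send_message(msg)

def check(sensor: str, temp_c: float) -> None:
    if temp_c >= CRIT_C:
        # Critical: page the on-call phone via an email-to-SMS gateway.
        notify(f"CRIT {sensor} {temp_c:.1f}C", "Inlet temp critical.",
               "5551234567@txt.example-carrier.com")
    elif temp_c >= WARN_C:
        notify(f"WARN {sensor} {temp_c:.1f}C", "Inlet temp above normal.",
               "dc-team@example.com")

check("rack-07-inlet", 28.2)  # wire this into the polling loop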

It's not a bad idea to make sure you have at least one analog phone line and a modem hooked up to a Linux box somewhere, or a 3G/4G/LTE data connection that can be turned up automatically when needed. In a real emergency situation, that might be the only way you can check on the data center if the data circuits are down.
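
The "turned up automatically" part can be as simple as a cron job that watches the primary path and activates a preconfigured cellular profile when it disappears. A sketch, assuming NetworkManager manages the modem and a connection profile named backup-lte already exists:

#!/usr/bin/env python3
# Bring up an out-of-band LTE link when the primary path dies.
# A sketch: assumes NetworkManager manages the modem and that a connection
# profile named "backup-lte" has already been created (e.g., with nmcli).
import subprocess

def primary_up(probe: str = "198.51.100.1") -> bool:
    """Ping a host reachable only via the primary circuit (the address here
    is an example -- use your upstream gateway)."""
    return subprocess.run(
        ["ping", "-c", "3", "-W", "2", probe],
        stdout=subprocess.DEVNULL,
    ).returncode == 0

if not primary_up():
    # Activate the preconfigured cellular profile; harmless if already up.
    subprocess.run(["nmcli", "connection", "up", "backup-lte"], check=False)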

If you think that's overkill, ask anyone who was responsible for the care and feeding of a data center in the New York and New Jersey area when Hurricane Sandy came through. They'll tell you different. It takes only one significant, unexpected environmental problem to justify the expense and management of data center monitoring systems. I hope you never find that justification, but at the same time, you cannot assume you won't.

This story, "Hot or not? Know your data center's environment," was originally published at InfoWorld.com. Read more of Paul Venezia's The Deep End blog at InfoWorld.com. For the latest business technology news, follow InfoWorld.com on Twitter.
