June 7, 2010

Host Location Series Part 1: Inside the Data Center

By Gail Seymour

Anyone who has ever had the responsibility of maintaining an independent Web server can tell you that keeping even a single Web site up can quickly turn into a nightmare. First, you have to keep the server running 24/7, and it isn’t long before you realize that means having a backup server and an Uninterruptible Power Supply (UPS) for each machine. You need a way to ensure that when the primary server fails, the backup takes over seamlessly and someone is alerted to repair the fault. You also soon discover the limitations of UPS systems: if you’re in an area with frequent, prolonged power outages, that can mean investing in a generator. Before you know it, maintaining all this equipment becomes a major hassle, and that’s before you’ve even started thinking about reliable Internet connections, firewalls, physical security and cooling systems.

Introducing the Data Center

Fortunately, when many small companies found themselves in this position during the 1990s, it didn’t take long for a few bright sparks to spot the opportunity. At first, larger companies with Web sites rented space on their servers to smaller ones, and dedicated Web hosting companies soon followed. These companies built Internet data centers, offering fast Internet access, reliable uptime and security, all managed from a central location on behalf of their clients. Gradually the prohibitively high cost of running a Web site fell. Today anyone can register a domain and host their own site for just a few dollars a month.

These data centers resemble the early computer rooms built to house large mainframe computers, with rack-mounted equipment arranged in rows on raised floors. They have climate controls you might expect to find in a sophisticated commercial greenhouse, security bordering on military grade in larger complexes, and a lot of cables. The cables might be housed overhead in trays, looking like the equipment’s life support lines, or run beneath the flooring like an irrigation system. Little wonder, then, that data centers are often referred to as “server farms.”

Appearance

  • Most of the equipment in a data center is arranged in 19” rack cabinets that look like rows of lockers from the front. Cables connect the components at the back, much as you might hook up a Hi-Fi or home cinema system.
  • As well as providing under-floor cable access in some data centers, raised floors are part of the intricate cooling system employed to prevent the equipment from overheating.

Climate Control

  • Cool air is pumped under the floor at pressure, forcing it up through vents at the front of the cabinets. This air is then drawn through the cabinets from front to back over the equipment before being pushed out the rear. The hot air rises and is drawn into the air conditioning system, where it is cooled and fed back to the under-floor pumps.
  • As well as maintaining a temperature in the range of 16–24 °C (61–75 °F), data centers need to maintain relative humidity of around 40–55%. If it’s too humid, water can condense on the equipment, and if it’s too dry, problems with static electricity can occur; both can damage the servers (a simple sketch of this kind of threshold check follows this list).
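
To make the numbers above concrete, here is a minimal sketch, in Python, of the kind of threshold check an environmental monitoring system might perform. The function name and the sample readings are hypothetical; real facilities rely on dedicated building-management systems rather than anything this simple.

# A toy illustration (not real data-center software) of checking sensor
# readings against the ranges quoted above: 16-24 degrees C and 40-55% humidity.

TEMP_RANGE_C = (16.0, 24.0)        # acceptable temperature, degrees Celsius
HUMIDITY_RANGE_PCT = (40.0, 55.0)  # acceptable relative humidity, percent

def check_environment(temp_c, humidity_pct):
    """Return a list of warnings for readings outside the target ranges."""
    warnings = []
    if not TEMP_RANGE_C[0] <= temp_c <= TEMP_RANGE_C[1]:
        warnings.append("temperature %.1f C is outside the %g-%g C range"
                        % (temp_c, TEMP_RANGE_C[0], TEMP_RANGE_C[1]))
    if humidity_pct < HUMIDITY_RANGE_PCT[0]:
        warnings.append("humidity %.0f%% is too dry (static electricity risk)"
                        % humidity_pct)
    elif humidity_pct > HUMIDITY_RANGE_PCT[1]:
        warnings.append("humidity %.0f%% is too humid (condensation risk)"
                        % humidity_pct)
    return warnings

# Example: a hot, dry aisle would trigger both warnings.
print(check_environment(26.5, 35.0))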

Fire Safety

  • Data centers are often built with fire walls designed to slow the spread of fire from one room to another. They will also have sensitive smoke detectors designed to detect faulty equipment before it starts to burn, as well as manual firefighting equipment. Between these two extremes, they will have a sprinkler system installed, and possibly a gas-based fire suppression system. This multi-pronged approach demonstrates the ‘redundancy’ mindset applied to everything in a data center.

Redundancy

Because the Internet is global, online access to Web sites needs to be continuously available. It’s because of that need for stability that multiple redundant systems are employed to achieve the same goal: if one fails, another takes over. It’s all about preventing single points of failure, where the failure of a single piece of equipment could bring down the whole system.

  • Power supplies may be connected to multiple substations, so that in the event of a power outage in one area, power can still be drawn from an unaffected area. These feeds will be backed up by on-site generators and UPS equipment.
  • Servers will be arranged in clusters to limit the impact of a single server failure, and there will be backup clusters ready to assume the workload in the event of a catastrophic failure, often at a second location (a simplified failover sketch follows this list). The whole system will be backed up in real time, or close to it, and there may well be a duplicate backup system waiting in the wings.
  • Not just the servers, but the switches, routers and firewalls that support them will typically be duplicated, and there may be multiple Internet connections sourced from different service providers.
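
The failover idea described above can be sketched in a few lines of Python. The hostnames below are hypothetical, and real data centers use dedicated load balancers and health-checking systems rather than anything this simple; this is only meant to show the principle of trying a backup when the primary is unreachable.

# A simplified sketch of failover: try the primary server first and fall back
# to a backup if it cannot be reached. Hostnames are hypothetical.
import socket

SERVERS = [
    ("primary.example.com", 80),  # hypothetical primary server
    ("backup.example.com", 80),   # hypothetical backup at a second location
]

def first_reachable(servers, timeout=2.0):
    """Return the first (host, port) that accepts a TCP connection, or None."""
    for host, port in servers:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return host, port
        except OSError:
            continue  # this server is down or unreachable; try the next one
    return None

target = first_reachable(SERVERS)
if target:
    print("Routing traffic to %s:%d" % target)
else:
    print("All servers are unreachable")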

Security

  • Hardware- and software-based security systems are combined to protect user data. As well as the firewalls, spam filters and anti-virus software you might be aware of, data centers will also be running intrusion detection systems, and may have off-site monitoring systems too (a toy illustration of one such check follows this list).
  • Physical access to the data center will also be restricted. Security staff and surveillance equipment will both be present, but they may be only one part of a layered approach to data security that can also involve biometric user recognition.
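
As a toy illustration of one thing an intrusion detection system watches for, the short Python sketch below flags addresses that generate repeated failed logins. The log format, threshold and addresses are all hypothetical; real IDS products are far more sophisticated than this.

# Flag addresses with repeated failed logins (hypothetical log format).
from collections import Counter

log_lines = [
    "2010-06-07 10:01:12 FAILED LOGIN from 203.0.113.7",
    "2010-06-07 10:01:15 FAILED LOGIN from 203.0.113.7",
    "2010-06-07 10:01:19 FAILED LOGIN from 203.0.113.7",
    "2010-06-07 10:02:03 LOGIN OK from 198.51.100.23",
]

def suspicious_addresses(lines, threshold=3):
    """Return addresses with at least `threshold` failed login attempts."""
    failures = Counter(
        line.rsplit(" ", 1)[-1] for line in lines if "FAILED LOGIN" in line
    )
    return [addr for addr, count in failures.items() if count >= threshold]

print(suspicious_addresses(log_lines))  # ['203.0.113.7']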

About the Author

Gail Seymour has been a Web site designer for more than 10 years. During that time she has won three design awards and has provided the content and copy for dozens of Web sites and more than 50,000 Web pages.
