Everything You Ever Wanted to Know About Data Centres

Back in 2015, Apple was planning to build a billion-dollar data centre in Athenry, Ireland. To give you an idea of how much space a billion dollars' worth of data storage takes up, the centre would have spanned the length of about 23 soccer fields, created over 100 jobs, and occupied a location any builder of data centres would dream of.

But as Apple would find out, there can be many roadblocks on the way to building the perfect data centre…

Whether you're looking for personal or business data storage, there are many factors to weigh when choosing where to keep all your precious data. This article will show you what to look for when shopping for data storage.

Data Centre Requirements

Designing a data centre can be easy. Designing one that stays up to date for more than a couple of years is hard.

There are two major types of data centre builds:

1. Enterprise Class. You sacrifice on price and build speed, but in return you get a reliable centre that can scale as your storage needs grow. The centre works around the needs of its customers, though the long lag of construction can mean the technology is a step behind by the time the facility opens.

2. Commodity Class. Faster and cheaper to build than an enterprise facility, but it may not scale as data needs grow. Some businesses outgrow a commodity facility within a couple of years.

These data centres usually fall into one of two categories: managed or colocation. A managed data centre lets you lease your own server rack, and the provider takes care of all the physical upkeep. Managed data centre providers often offer the following services:

  • Setting up your server and installing it at the centre
  • Installing any software you’ve approved for running your operation
  • Keeping the server secure from hackers
  • Customer service
  • Patching holes in the server’s security
  • Backing up your data

Colocation centres are a little different. Think of a colocation centre as a condo with homeowners' association fees. You own, rather than lease, your server rack, but the service provider usually handles all the physical upkeep, just like a managed location.

There are also data centres built exclusively for certain companies, but those companies are in the business of data. Most businesses need the benefits of big data without the hassle of building their own centre.

The great thing about building a data centre is that it can sell for a profit. Verizon recently sold some of its data centres to Equinix so it could focus on its core business. Because your data centre may have been built with another company in mind, let's look at what else a customer should look for in a data centre.

Location/Weather

When thinking of data, we imagine a sort of ambiguous, all-consuming vat of knowledge. But all that data is here in the real world, on servers. Those servers need everything from consistent temperatures to the availability of skilled workers.

There are four major concerns when it comes to natural disasters. These are the events that can bring a data centre down indefinitely:

Storms/Flooding

British mobile provider Vodafone saw one of its data centres in Leeds go down for several days during the winter floods of 2015-16. The backup generators ran out of fuel, causing millions of customers to lose coverage.

Drought

Data centres drink about 626 billion litres of water a year to cool off and generate power. Droughts often coincide with higher-than-average temperatures, which makes water even more crucial for cooling systems.

Lightning

Lightning can knock out a data centre's main power source and its backup generators. It can also start fires, which lead to still more problems.

Wildfires

2018 saw wildfires burn 8.5 million acres across the globe. Beyond the obvious danger of burning, fires can also raise the surrounding temperature, putting extra strain on cooling systems.

Data centres should be in an area with good soil drainage to protect against floods. Even centres not located on a flood plain need good drainage to fare well against heavy rain. You can ask your data centre provider whether they ran a perc (percolation) test before beginning construction: a perc test measures how well the soil on a building site absorbs water.
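A perc test is usually scored in minutes per inch: how long the water level in a test hole takes to drop one inch. Here is a minimal sketch of that arithmetic; the readings are made-up illustration values, not real site data:

```python
# Rough perc-test arithmetic: minutes per inch of water-level drop.
# The readings below are illustration values, not real site data.

def perc_rate(minutes_elapsed: float, drop_inches: float) -> float:
    """Return the percolation rate in minutes per inch."""
    return minutes_elapsed / drop_inches

# Example: the water level fell 2.5 inches over 30 minutes.
rate = perc_rate(minutes_elapsed=30, drop_inches=2.5)
print(f"Percolation rate: {rate:.1f} minutes per inch")
# Lower numbers mean faster drainage; a site that drains quickly
# is better protected against pooling during heavy rain.
```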

Like a celebrity trying to escape the paparazzi, data centres need privacy from the hectic world around them. Banks, for instance, make bad neighbours because of their high risk of burglary.

We need data centres most during natural disasters, so your centre shouldn't be anywhere near high-traffic areas. Workers need adequate access to the centre to maintain it properly, even when the roads around it are at their worst.

In a perfect world, your data centre will sit in an area with monotonous weather, as far from natural-disaster hot spots as possible. The difficult part is finding a spot that meets a centre's privacy needs while still being close enough to connect to a strong fibre-optic network. No site will meet every specification, so pick the one that checks the most boxes.

“How Prepared Is the Data Centre?”

Staff are the most important aspect of a data centre. Competent staff can make or break a centre in times of emergency.

Almost half of the data centres polled cited having workers nearby as a necessity. It's also recommended that centres keep backup staff in areas unaffected by the disaster, ideally people who don't have family in the area to care for first.

Data centres also need to plan their cooling systems to take on ever-greater amounts of heat. The more data a centre stores and processes, the more heat its servers produce. For decades, raised floors were the go-to option for cooling server rooms.

Air in the space just below the server room passes through an air conditioner and is then pushed up through perforated floor tiles. It's like adding an ice cube to hot soup to bring it to the perfect temperature: raised floors add just enough cold air to the hot server room for a comfortable temperature.
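To make the cooling load concrete, here is a minimal sketch of the standard sensible-heat estimate for air cooling. The rack wattage and allowed temperature rise are assumed illustration values:

```python
# Sensible-heat estimate: how much air a rack needs to stay cool.
# Airflow (m^3/s) = Power (W) / (air density * specific heat * temp rise)
# The rack power and allowed temperature rise are example values.

AIR_DENSITY = 1.2         # kg/m^3, at roughly room temperature
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K)

def required_airflow_m3s(rack_watts: float, delta_t_kelvin: float) -> float:
    """Volumetric airflow needed to carry rack_watts away at a given temperature rise."""
    return rack_watts / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_kelvin)

# Example: a 10 kW rack, with air allowed to warm by 12 K as it passes through.
flow = required_airflow_m3s(rack_watts=10_000, delta_t_kelvin=12)
print(f"Required airflow: {flow:.2f} m^3/s")  # roughly 0.69 m^3/s
```

Numbers like these are why dense racks quickly outrun what perforated tiles alone can deliver.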

Other Ways to Stay Cool

Raised floors are still common, but newer technology allows for more efficient cooling. Using liquids rather than air is becoming an increasingly common way to cool computer components, and the quick comparison after this list shows why. There are three ways to cool your servers using liquid:

  • Liquid flows through the server and makes contact with the heat sink. All the heat the computer has put into the heat sink gets carried away with the liquid.
  • Immersion cooling completely submerges server components into a nonconductive liquid. The liquid whisks away heat through a system of pumps.
  • Dielectric fluid, a liquid insulator, is sprayed onto hot server components; it absorbs their heat, which is carried away as the fluid is drawn off.
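The appeal of liquid cooling comes down to heat capacity: per unit volume, water can carry a few thousand times more heat than air for the same temperature rise. A minimal sketch of that comparison, using standard textbook property values:

```python
# Why liquids beat air at moving heat: volumetric heat capacity.
# Property values are textbook figures at roughly room temperature.

AIR = {"density": 1.2, "specific_heat": 1005}    # kg/m^3, J/(kg*K)
WATER = {"density": 998, "specific_heat": 4186}  # kg/m^3, J/(kg*K)

def volumetric_heat_capacity(fluid: dict) -> float:
    """Heat carried per cubic metre of fluid per kelvin of temperature rise, in J/(m^3*K)."""
    return fluid["density"] * fluid["specific_heat"]

ratio = volumetric_heat_capacity(WATER) / volumetric_heat_capacity(AIR)
print(f"Water carries ~{ratio:,.0f}x more heat per unit volume than air")  # ~3,464x
```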

Circulating air is still the most common way to keep computers cool. Along with raised floors, there are a number of ways to increase airflow to servers.

When using air for cooling, the key to efficiency is containment. Rather than letting hot exhaust and cool treated air mix, the air cycles through the facility in one direction. A containment system maintains much more even temperatures, so it can handle cooling higher-density CPUs and other server components.

Where Does All the Hot Air Go?

There are two ways of using heat containment in a server room. The first is cold aisle containment, where the area between two rows of servers holds the incoming air-conditioned air. Hot air leaves the servers, travels through the workspace, and returns to the computer room air conditioning (CRAC) unit.

This has a major downside: all that heat escaping into the room while it waits for treatment makes the workspace very hot, which can make it uncomfortable for workers servicing the machines.

A better way of organizing your server aisles is hot aisle containment: direct all that hot exhaust into the middle of the aisles, reversing the previous setup. The exhaust stays contained between the server rows, and the workspace stays at a cool, air-conditioned temperature.

Security Standards

Better cooling technology doesn't come without its costs. Many modern CRAC systems allow remote monitoring, which makes them potential entry points for hackers.

Target stores in the northeastern U.S. had their networks hacked through credentials stolen from a third-party HVAC provider. The same HVAC service works with multiple big-box stores in the area, such as Trader Joe's.

Stores like these give a third-party service remote access to their power and climate systems to curb energy use. A hacker who intercepts that access can take control of the network. Millions of customers' credit card numbers were vulnerable during the attack.

The fact that these cooling systems weren't designed with security in mind makes them even more vulnerable. Most ship with a default password or none at all, and receive few, if any, security patches.
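One practical defence is simply auditing building-management gear for factory-default logins. Here is a minimal sketch; the device addresses and credential pairs are hypothetical illustration values, and real CRAC units expose different interfaces, so treat this as an outline only:

```python
# Minimal audit sketch: flag building-management devices that still accept
# a factory-default login. The addresses and credentials below are
# hypothetical illustration values, not real vendor defaults.
import requests

DEVICES = ["10.0.40.11", "10.0.40.12"]                    # hypothetical CRAC unit addresses
DEFAULT_LOGINS = [("admin", "admin"), ("admin", "1234")]  # hypothetical default pairs

def still_has_default_login(host: str) -> bool:
    """Return True if the device accepts any known default credential pair."""
    for user, password in DEFAULT_LOGINS:
        try:
            resp = requests.get(f"http://{host}/", auth=(user, password), timeout=5)
        except requests.RequestException:
            continue  # unreachable device; skip this credential pair
        if resp.status_code == 200:  # device accepted the default login
            return True
    return False

for host in DEVICES:
    if still_has_default_login(host):
        print(f"WARNING: {host} still accepts a default login -- change it")
```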

Active/Active or Active/Passive?

Servers can be set up with a couple of different operating methods, both meant to maintain high availability (HA) and reduce the chance of outages. HA is the concept of having at least two of everything in your network in case of failure.

Both of these terms refer to how your servers balance their workload. In an active/passive setup, one server processes all the data while the second monitors the first for delays or outages. When the first server goes down, the second is already up to speed and ready to handle the job.

The downside of an active/passive setup is that you're paying to operate two servers but only getting the functionality of one. Your passive server is like insurance: you pay for it but only use it in emergencies.

An active/active setup has both servers working at similar levels. Rather than one standing by waiting for a failure, both servers actively process data. The downside is that if both routinely run near full capacity, there is no idle backup ready to take over when one fails, as there is in an active/passive setup.

To get the same computing power from an active/passive setup as from an active/active one, you have to buy servers that are twice as powerful. That costs much more than having two cheaper machines share the work of one super server.
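To make the failover mechanics concrete, here is a minimal sketch of an active/passive health-check loop. The hostname, /health endpoint, and promotion step are all hypothetical; real deployments typically use tools like keepalived or a load balancer rather than a hand-rolled script:

```python
# Minimal active/passive failover sketch: poll the active node's health
# endpoint and promote the standby when checks fail repeatedly.
# The hostname and /health endpoint are hypothetical illustration values.
import time
import requests

ACTIVE = "http://server-a.internal/health"  # hypothetical active node
FAILURES_BEFORE_FAILOVER = 3                # tolerate brief blips

def is_healthy(url: str) -> bool:
    """A node counts as healthy if its health endpoint answers 200 within 2 seconds."""
    try:
        return requests.get(url, timeout=2).status_code == 200
    except requests.RequestException:
        return False

def promote_standby() -> None:
    """Placeholder: a real setup would move a virtual IP or update DNS here."""
    print("Active node is down -- promoting standby to active")

failures = 0
while True:
    failures = 0 if is_healthy(ACTIVE) else failures + 1
    if failures >= FAILURES_BEFORE_FAILOVER:
        promote_standby()
        break
    time.sleep(5)  # poll interval
```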

How Much Does Building a Data Centre Cost?

Facebook's Clonee data centre is the perfect way to catch a glimpse of the costs of opening a facility. Built in 2016, the 25,500 sq. ft. building cost a whopping €300 million ($346 million).

Facebook differentiated its Clonee centre by powering it with only renewable resources. One unique earth-friendly feature of the campus is its 50,000 honey bees, which can travel up to 5 km around the Irish landscape pollinating trees, flowers, and shrubs.

It took over 1,100 construction workers to build the massive energy-efficient complex. Those workers used 23 Olympic swimming pools' worth of cement and enough steel to build two Eiffel Towers.

No Apple Data Centre for Ireland

Apple's decision to back out of the deal created some backlash. People in Ireland say the building codes are too restrictive and fear they are hurting the country's competitive edge as a tech hub.

The problems Apple had in Ireland show the extent of data centre concerns. Natural disasters and server capabilities aren't the only problems facing data centre construction.

Data centres have to deal with the same politics and bureaucracy as any other commercial construction. Apple didn't have the luck of the Irish when attempting to build its billion-dollar data centre in Athenry.

Ireland's building process is long and daunting, and it proved too long for Apple's desperate need for more data storage. The good news is that now someone else can do it right. Don't be like Apple: choose the best platform to run your e-commerce website on by reading this article now.