

25 March 2019

How to Build a Data Centre and Keep the Lights on


Data centres have gone from being largely unnoticed to being some of the most important pieces of infrastructure in the global digital economy. They host everything from financial records to Netflix movies.

Precisely because their role is so important, data centres have become a multibillion-dollar industry. Designing, building and supporting them requires strategic planning and careful construction to keep clients’ mission-critical data secure and available 24/7, whatever that data may be.

There are many factors that must be addressed when designing and building a data centre. For starters, it’s all about power - finding it and managing it.

Finding the power

Data centres require an incredible amount of electricity to operate, and supplying it often requires the direct involvement of regional utilities. Energy infrastructure needs to be shifted, power lines need to be run and redundancies need to be established. The most secure data centres have two separate feeds from their utilities, so that if something happens to one of the lines — like an unexpected squirrel attack — the centre doesn’t immediately lose all of its functionality.

Coordinating that takes a lot of effort, and often the clout of a large corporation, to get anywhere. But even the big players need to check the policies of utilities and local governments in any area in which they are planning to build a data centre, to make sure that they will be able to establish those inputs. Without that redundancy, data centres are vulnerable to power outages that could result in not only the loss of critical customer data, but also damage to the brand of the data centre owner.

The price and availability of that power are also incredibly important considerations, because a data centre draws a large load at all times. With a significant amount of power going into computing, and even more going into cooling the computers down, it’s no surprise that data centres consume more than 1.8% of all electricity used in the United States. Again, companies planning data centres need to work with local governments and utilities on subsidies and deals that can make that energy easier to afford.
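The split between computing power and cooling power is usually summarised by a single ratio, Power Usage Effectiveness (PUE): total facility power divided by the power that actually reaches the IT equipment. A minimal sketch, using hypothetical figures chosen only to show the arithmetic:

```python
# Illustrative PUE (Power Usage Effectiveness) calculation.
# PUE = total facility power / IT equipment power.
# The figures below are hypothetical, not from any real facility.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the Power Usage Effectiveness ratio (>= 1.0 in practice)."""
    return total_facility_kw / it_equipment_kw

# A facility drawing 15 MW in total, of which 10 MW reaches the servers:
total_kw = 15_000.0
it_kw = 10_000.0

ratio = pue(total_kw, it_kw)
overhead_kw = total_kw - it_kw  # cooling, UPS losses, lighting, etc.

print(f"PUE: {ratio:.2f}")               # 1.50
print(f"Overhead: {overhead_kw:.0f} kW")  # 5000 kW
```

The closer the ratio gets to 1.0, the less of the electricity bill is being spent on anything other than computing — which is why cooling strategy dominates so much of data centre design.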

Keeping the lights on

Much of the support infrastructure in data centres is focused on making sure that their power cannot be interrupted. Uninterruptible power supplies (UPS) - powerful batteries that can start providing power almost instantaneously - are critical for this effort. 

They ensure that in an emergency, power comes back on in milliseconds, rather than the seconds or minutes that could mean lost data or functionality across thousands of computer systems. But most UPS systems don’t serve as backup power for long. They simply don’t have the storage capacity to power a data centre for more than a matter of minutes. To keep data centres fully running without utility power, operators usually turn to large diesel-powered generators, stocked with 24–48 hours of fuel at all times.
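That 24–48 hour fuel reserve can be sanity-checked with back-of-the-envelope arithmetic: stored fuel divided by the generator’s burn rate at full load. The figures below are hypothetical, purely to illustrate the check:

```python
# Back-of-the-envelope check of on-site diesel reserves, using assumed
# figures: a generator burning roughly 500 litres of diesel per hour at
# full load, fed from 20,000 litres of on-site storage.

def runtime_hours(tank_litres: float, burn_rate_lph: float) -> float:
    """Hours of generator runtime available from on-site fuel tanks."""
    return tank_litres / burn_rate_lph

tank_litres = 20_000.0   # assumed total on-site storage
burn_rate_lph = 500.0    # assumed consumption at full load

hours = runtime_hours(tank_litres, burn_rate_lph)
print(f"Runtime on stored fuel: {hours:.0f} hours")  # 40 hours

# Flag facilities that fall outside the common 24-48 hour planning window.
if not 24 <= hours <= 48:
    print("Warning: fuel reserve outside the usual planning window")
```

In practice, operators pair reserves like this with priority refuelling contracts, so the tanks are topped up long before the stored fuel runs out.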

All of this redundancy is required because of the incredible amount of energy that data centres use. But the other key factor in a data centre’s success is the efficiency with which that energy is used. That starts with the organisational strategy used for cooling. 

Staying cool

Data centres are carefully planned structures. Every square foot needs to contribute to the wider goals of powerful and efficient computing. You can’t just slam server racks together, because their placement needs to fit in with the cooling system used to prevent overheating. 

Data centres run hot, and today’s advances in High-Performance Computing (HPC) mean that they are using as much as five times more energy than they used to. This makes a cooling solution one of the most important decisions that a data centre operator has to make.

By far the most common data centre cooling method involves airflow, using HVAC systems to control and lower the temperature as efficiently as possible. 

Rise of liquid cooling

While liquid cooling has historically been the domain of enterprise mainframes and academic supercomputers, it is being deployed more and more in data centres. More demanding workloads driven by mobile, social media, AI and the IoT are leading to increased power demands, and data centre managers are scrambling to find more efficient alternatives to air-based cooling systems.

The liquid cooling approach can be hundreds of times more efficient and use significantly less power than typical HVAC cooling systems. But the data centre market is still waiting for some missing pieces of the puzzle, including industry standards for liquid-cooling solutions and an easy way for air-cooled data centres to make the transition without having to manage two cooling systems at once. Still, as the growing need for more efficient cooling shows no signs of slowing, liquid cooling will likely become the norm in years to come.
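Much of that efficiency gap comes from basic physics: water carries far more heat per kilogram than air. The mass flow needed to remove a heat load follows Q = ṁ · c_p · ΔT. A minimal sketch comparing the two fluids — the heat load and temperature rise are hypothetical, while the specific heats are standard physical values:

```python
# Coolant mass flow needed to remove a heat load: Q = m_dot * c_p * dT.
# Comparing air and water as the working fluid. The heat load and
# temperature rise are assumed values; the specific heats are standard.

HEAT_LOAD_KW = 100.0   # assumed heat output of one row of racks
DELTA_T = 10.0         # assumed coolant temperature rise, in kelvin

CP_AIR = 1.005         # kJ/(kg*K), specific heat of air
CP_WATER = 4.186       # kJ/(kg*K), specific heat of water

def mass_flow_kg_s(heat_kw: float, cp_kj_kg_k: float, delta_t: float) -> float:
    """Coolant mass flow (kg/s) required to carry away heat_kw."""
    return heat_kw / (cp_kj_kg_k * delta_t)

air_flow = mass_flow_kg_s(HEAT_LOAD_KW, CP_AIR, DELTA_T)
water_flow = mass_flow_kg_s(HEAT_LOAD_KW, CP_WATER, DELTA_T)

print(f"Air:   {air_flow:.1f} kg/s")    # roughly 10 kg/s of air
print(f"Water: {water_flow:.1f} kg/s")  # roughly 2.4 kg/s of water
```

And because air is about 800 times less dense than water, that modest-looking difference in mass flow translates into enormous fan-driven air volumes versus a comparatively small pumped water loop — which is where liquid cooling’s power savings come from.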

Building a data centre is about executing an extremely complex plan, with input from experts in wide-ranging fields. Firms thinking about building their own data centre should consult with experts who have dealt with their specific difficulties before, to make sure that all of these core areas can be built without incident. 

Modern data centres are planned down to the last wire in Building Information Modelling (BIM) applications and similar software, so that the outcome is as guaranteed as possible before the first wall is erected. Data centres are key arteries of the digital economy, funnelling data between consumers, companies, governments and citizens. That takes a lot of energy!
