
Data Centre Design & Architecture

Data is essential to businesses in the digital age and sits at the heart of critical processes, from internal operations to customer services. These vast amounts of data need to be stored somewhere that aligns with both technology and business requirements; classically this was an on-premises IT hub, with servers and equipment onsite. Today, modern data centres are the home of data, essential to keeping our society running at speed and scale. These facilities are designed and built with extensive planning and resources to ensure they can support companies for decades to come.

Choosing the Right Data Centre Design

Designing data centre infrastructure at scale involves the following key considerations:

  • Power
  • Cooling (hot aisle/cold aisle)
  • Connectivity to partners, carriers and exchanges
  • Security
  • Location

These considerations, along with business requirements, feed into the choices that determine data centre architecture. Purpose-built data centres utilise physical space as efficiently as possible while leaving room for growth in the years to come. Rack and server layouts need careful planning, with modular designs that allow for airflow and expansion.
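As a rough illustration of that kind of capacity planning, the Python sketch below estimates how much rack-unit headroom a floor plan leaves for growth; all of the rack counts and occupancy figures are assumptions invented for the example, not figures from any real facility.

```python
# Illustrative capacity-planning sketch: estimate rack-unit headroom for growth.
# All figures below are assumptions for demonstration, not real facility data.

RACK_HEIGHT_U = 42      # standard full-height rack
planned_racks = 120     # racks populated on day one (assumed)
max_racks = 160         # racks the floor plan and power can support (assumed)
used_u_per_rack = 30    # average rack units occupied at launch (assumed)

total_u = max_racks * RACK_HEIGHT_U
used_u = planned_racks * used_u_per_rack
headroom_u = total_u - used_u

print(f"Total capacity: {total_u} U")
print(f"In use at launch: {used_u} U")
print(f"Headroom for growth: {headroom_u} U ({headroom_u / total_u:.0%})")
```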

Data Centre Connectivity

Modular builds are favoured in data centre design: racks and servers can be removed or added without interrupting operations, making scaling up or down seamless. Cabling systems in data centres often outlive other equipment, with some staying in use for years on end, so cable organisation and management within racks is critical; individual units can then be connected or disconnected without disruption. Power and data cables are normally separated within cabling systems to prevent potential interference.

Data Centre Cooling

Data centre cooling systems are core features of any facility, as heat can shorten the lifespan of equipment and even destroy it outright. The density of equipment in one building means the amount of heat generated is significant, but several data centre cooling solutions are available, with the choice of system often depending on where the data centre is located.

Types of data centre cooling systems:

  • Adiabatic cooling (Air)

The most common air-cooling systems use rack-mounted fans to carry warm air away from the servers, arranged in a configuration called hot aisle/cold aisle. On one side of the server, conditioned air travels along a cold aisle and is drawn into the rack. On the opposite side, fans push hot air from the servers into the hot aisle, from where it leaves the room.

  • Liquid Cooling (Water)

Chilled water conducts heat more effectively than air because it is a denser fluid. The water is pumped through coils in a closed system, drawing heat away from the equipment; the heated water is then run through a chilling unit, restarting the cycle.
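To put a number on that heat transfer, the sketch below estimates how much heat an assumed chilled-water loop carries away using the standard relation Q = ṁ × c_p × ΔT; the flow rate and temperature rise are example values, not the specification of any particular cooling plant.

```python
# Rough estimate of heat removed by a chilled-water loop: Q = m_dot * c_p * delta_T.
# The flow rate and temperatures below are illustrative assumptions.

SPECIFIC_HEAT_WATER = 4186   # J/(kg.K), specific heat capacity of water
flow_rate_kg_s = 2.0         # kg/s of chilled water through the coils (assumed)
supply_temp_c = 7.0          # chilled water supply temperature (assumed)
return_temp_c = 14.0         # water temperature after absorbing server heat (assumed)

delta_t = return_temp_c - supply_temp_c
heat_removed_w = flow_rate_kg_s * SPECIFIC_HEAT_WATER * delta_t

print(f"Heat removed: {heat_removed_w / 1000:.1f} kW")  # about 58.6 kW in this example
```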

  • Liquid Immersion Cooling

Often referred to as the future of cooling, this technique replaces water, a known enemy of electricity, with a dielectric liquid coolant that is chemically and electrically inert. Components can be immersed directly in the liquid, removing the need for pumps and coils. Liquid immersion cooling solves the potential problems of water around electrical equipment and overcomes issues with uneven airflow, allowing for the denser, higher-performance data centres of the future.

Data Centre Support Teams

When designing a data centre, accounting for engineers' hands-and-eyes support is essential. Interdisciplinary staff need a space within the data centre to collaborate: a centre of operations with advanced monitoring equipment. Both IT professionals and specialist engineers in cooling, humidity, and physical racks should be onsite, available for quick response to outages and for access during maintenance and emergencies. They can also provide consultative migration into data centre environments for businesses looking to take advantage of colocation and cloud services, and can assist with data centre network design, managing network configurations for complex multi-cloud and hybrid cloud setups.

Data Centre Security

Keeping sensitive data protected is one of the main reasons a business will opt to use the services of a data centre. The data and equipment housed in these facilities require robust security protocols as standard to protect business-critical assets. Entry to data centres is strictly controlled for both security and equipment integrity, since dust and other contaminants should be kept to a minimum. Larger data centres include physical barriers at each stage of the facility: anti-tailgating measures, biometric scanners, and on-site security staff. Within the server room, lockable server racks give businesses extra peace of mind, as only data centre managers control who has access to the servers.

Data Centre Management Tools

Data centres will often employ tools that help both CTOs and IT professionals monitor IT infrastructure and how it is performing (a minimal monitoring sketch follows the list below):

  • Data Centre Infrastructure Management (DCIM): Software with consoles to monitor a company's resources offsite, across all assets, colocation, and public clouds.
  • Software Defined Networking (SDN): The shift of hardware control to software, partitioning resources for most efficient use according to business requirements.
  • Customer portals: Self-service online portals to manage your data and available compute resources.
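As a loose illustration of what a DCIM-style tool aggregates, the sketch below collects power and temperature readings for a few assets and flags anything outside an assumed threshold; the asset names, readings, and limits are invented for the example and do not represent any specific DCIM product.

```python
# Minimal DCIM-style sketch: aggregate asset readings and flag threshold breaches.
# Asset names, readings, and thresholds are illustrative assumptions only.

assets = [
    {"name": "rack-a1", "power_kw": 8.2, "inlet_temp_c": 23.5},
    {"name": "rack-a2", "power_kw": 11.4, "inlet_temp_c": 27.1},
    {"name": "rack-b1", "power_kw": 6.7, "inlet_temp_c": 22.0},
]

MAX_POWER_KW = 10.0       # assumed per-rack power budget
MAX_INLET_TEMP_C = 26.0   # assumed inlet temperature ceiling

for asset in assets:
    alerts = []
    if asset["power_kw"] > MAX_POWER_KW:
        alerts.append("power budget exceeded")
    if asset["inlet_temp_c"] > MAX_INLET_TEMP_C:
        alerts.append("inlet temperature too high")
    status = "; ".join(alerts) if alerts else "ok"
    print(f'{asset["name"]}: {status}')
```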

Airflow Design

Data centre airflow management is just as vital to operations as the cooling systems themselves, working in concert with them to protect sensitive data and equipment. Computer Room Air Conditioning (CRAC) units use chilled water and air to keep the internal environment within set thresholds for temperature, airflow, and humidity. Some equipment, such as magnetic tape (integral to data storage), is especially sensitive to changes in humidity, so it is important to consider provisions for this when thinking about data centre architecture.
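In simplified form, the sketch below shows the kind of threshold check a CRAC control loop performs against temperature and humidity readings; the set points and the sample reading are assumed values rather than the thresholds of any real facility.

```python
# Simplified CRAC-style environmental check: compare sensor readings to set points.
# Set points and readings are illustrative assumptions, not real facility thresholds.

SETPOINTS = {
    "temp_c": (18.0, 27.0),             # acceptable inlet temperature range (assumed)
    "relative_humidity": (40.0, 60.0),  # acceptable humidity range, % (assumed)
}

reading = {"temp_c": 28.3, "relative_humidity": 52.0}  # example sensor reading

for metric, (low, high) in SETPOINTS.items():
    value = reading[metric]
    if value < low:
        print(f"{metric}: {value} below set point range ({low}-{high})")
    elif value > high:
        print(f"{metric}: {value} above set point range ({low}-{high})")
    else:
        print(f"{metric}: {value} within range")
```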

Power Supply

Power supply in modern data centres is focused on density: upwards of 10kW per rack (as in our LON3 centre in London) is becoming the industry standard for emerging services such as AI, VR, and streaming high-definition content on demand. With high power comes the question of efficiency. Power Usage Effectiveness (PUE), an industry-standard metric, measures the power efficiency of a data centre, and many facilities are designed to drive this value as low as possible for greener operations.
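PUE is simply the total power drawn by the facility divided by the power delivered to the IT equipment, so a value of 1.0 would mean every watt goes to computing. The short calculation below uses assumed figures to show how the metric is derived.

```python
# Power Usage Effectiveness: total facility power divided by IT equipment power.
# The power figures below are assumed example values, not measurements from any site.

total_facility_power_kw = 1300.0   # IT load plus cooling, lighting, losses (assumed)
it_equipment_power_kw = 1000.0     # power delivered to servers, storage, network (assumed)

pue = total_facility_power_kw / it_equipment_power_kw
print(f"PUE = {pue:.2f}")  # 1.30 here; values closer to 1.0 indicate greener operation
```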

Data centres require a large supply of power connected to the main grid. This voltage is then transformed and supplied through cabling designed to minimise heat and maximise power distribution. All data centres should have at least N+1 redundancy for power, so that service can continue uninterrupted when an electricity source fails. Facilities account for this by using both fuel-powered generators and an intermediary battery-powered Uninterruptible Power Supply (UPS) system. In the event of a power outage, the UPS takes over with backup batteries, protecting against voltage spikes and powering equipment until the backup generators switch on to take over the load.
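As a rough sketch of the N+1 idea, the example below checks whether a set of generators can still carry the facility load with any single unit out of service; the load and generator ratings are assumptions chosen for illustration.

```python
# N+1 redundancy sketch: can the remaining units carry the load if any one fails?
# The load and generator ratings below are illustrative assumptions.

facility_load_kw = 4000.0
generator_ratings_kw = [1500.0, 1500.0, 1500.0, 1500.0]  # four installed units (assumed)

def meets_n_plus_1(load_kw, ratings_kw):
    """True if the load is covered even with the single largest unit offline."""
    remaining = sum(ratings_kw) - max(ratings_kw)
    return remaining >= load_kw

print(meets_n_plus_1(facility_load_kw, generator_ratings_kw))  # True: 4500 kW remains
```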

Conclusion

Data centres are hubs of technological innovation, housing big data and cutting-edge equipment, and the facilities that hold them are designed for maximum connectivity, efficiency, and performance. Our purpose-built data centre campus in the heart of London leverages the very latest in data centre architecture, with high-density power, five layers of security, and connected communities of interest for digital growth in London. LON3, our colocation centre in Brick Lane, is densely connected with multiple fibre and power redundancies to ensure your business continuity today and in the future.

*This content was originally published by Interxion: A Digital Realty Company.
