Data centres are a fundamental part of the business enterprise, designed to support applications and provide services such as data storage, access management, backup, and recovery, writes Vincent Fogarty.

Data centres also host productivity applications, such as online meeting portals, e-commerce transactions and provisioning for online gaming communities. Recently, big data, machine learning and artificial intelligence have prompted the growth of data centres.

Pooling of data stored

Cloud computing is a primary driver of data centre growth. The cloud relies upon the pooling of stored data, which is then processed using the capabilities provided by the likes of Apple, Microsoft, Amazon, and Google.

Users connect via internet devices, and through the network's tentacles, data centres give users access to the data they need. That data comes in all formats, from audio files and photographs to software.

Data centres are the internet's core, and the cloud is only made possible by high-speed, resilient, and reliable networks. These cloud networks may be public, private, or commercial.

Following the rise of the Internet of Things (IoT) and Industry 4.0, manufacturers depend on big data analytics to enhance their operations' output efficiency and cost-effectiveness.

The IoT usually refers to the instrumented world, where IP(1) addresses are embedded in objects in the environment(2). These 'Things' are devices operated in the home or carried by people. Modern built assets tend to have intelligent doors, lighting, and controls that all interface via IP addresses.

Bluetooth devices, RFID(3) tags, GPS(4) receivers, vehicles and many more 'Things' are connected by the network's tentacles. The potential of a digital twin(5), augmented by virtual reality, offers the possibility of simulating all types of asset design and function scenarios, generating extensive data and compute demand.

Several locations

Many IoT missions may require several locations for IoT data analysis and storage, including endpoint devices with integrated computing and storage; nearby devices that perform local computation; intelligent gateway devices; and on-premises data centres, managed hosting sites, colocation facilities, and network providers' point-of-presence locations. The diversity of edge computing locations reflects the diversity of markets for IoT.

Many IoT deployments may end up storing, integrating and moving data across a combination of public cloud and other commercial facilities, including colocation sites, with both distributed micro-modular edge data centres and enormous centralised core data centres (including those of public cloud providers) playing a role. Even within similar IoT applications, network architectures and data centre types may have various interfaces and data exchange paths, as shown in Figure 1.

Figure 1 – Data Centre Interfaces with the Internet of Things (IoT).

The internet has primarily fuelled this sustained growth of data creation. The smartphone has been a big part of this growth, but the proliferation of IoT devices connected to the internet has generated still more data.

Processing such vast quantities of data demands the internet and cloud computing because stand-alone technology does not have the capacity. Data centres are the pivotal engine of this physical cloud computing infrastructure.

In this age of data, reports(6) indicate that 36 billion IoT devices were installed worldwide by 2021, with 76 billion forecast by 2025. These masses of data generate transactions that must also be captured, transmitted, stored, evaluated, and retrieved. Data centres house the treasuries of this internet age.

The stock market confirms that the 10 biggest global companies by market capitalisation include Alphabet(7), Apple, Amazon, Microsoft, and Meta(8). It may well be obvious how much data those big five produce and how it drives demand for data centres.

It may be less obvious that your local shop and sporting bookmaker also have data centre needs, generally catered for by a colocation data centre provider. However, some businesses with privacy concerns about client data, such as banks, insurance companies, and health providers, continue to operate their own enterprise data centres.

Data demand versus compute efficiency

There are three competing influences around which data centre size continually evolves: chipset efficiency, software efficiency, and rack density.

The first influence is chipset efficiency; the latest generation of server processors delivers more workload than its predecessors. For the past 15 or so years, every new server technology generation has delivered a leap in efficiency across the board.

In this context, it is worth recognising Moore's Law, the term used for the observation made by Gordon Moore(9) in 1965 that the number of transistors in a dense integrated circuit (IC) doubles about every two years. In 2021, Intel claimed(10) that the semiconductor industry would meet or beat Moore's Law(11).
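As a back-of-the-envelope sketch, the compounding implied by that observation works out as follows (the starting count of 1.0 is a relative baseline chosen here purely for illustration):

```python
# Illustrative sketch of Moore's Law: transistor counts doubling
# roughly every two years. Purely arithmetic, not a prediction.

def transistors_after(years: float, start_count: float = 1.0,
                      doubling_period_years: float = 2.0) -> float:
    """Projected relative transistor count after `years`."""
    return start_count * 2 ** (years / doubling_period_years)

# A two-year doubling period compounds to a 32x increase over a decade.
print(transistors_after(10))  # 32.0
```

The exponential form is why even a small change in the doubling period makes a large difference over a decade or two.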

Secondly, some computer scientists point out that the efficiency or performance of software decreases as hardware becomes more powerful(12). Several reasons underlie this condition, the most significant being that the cost of creating software is dramatically increasing while computer hardware is becoming less expensive(13).

In 1995, computer scientist Niklaus Wirth stated, "Software is getting slower more rapidly than hardware becomes faster." This statement later became known as Wirth's Law(14). Because software grows more intricate as hardware progresses, the actual performance of computers may not improve as much as people anticipate.

The term 'software bloat' was coined to describe this phenomenon. Computer scientists have since made similar statements to Wirth's Law. British computer scientist Michael David May stated, "Software efficiency halves every 18 months, compensating Moore's Law"(15).

This declaration became known as May's Law. While Wirth's Law, May's Law and similar laws contend that software inefficiency counteracts the effect of Moore's Law, it is generally accepted that hardware efficiency trumps software inefficiency in productivity gains(16). On the one hand, software is slow and inefficient; on the other, the hardware industry follows Moore's Law, providing overabundant hardware resources.
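Taken at face value, the two laws simply compound against each other. A minimal sketch of that arithmetic (the 18-month figures are those quoted above; which effect 'wins' depends entirely on the periods assumed):

```python
# Sketch of Moore's Law compounding against May's Law. The periods are
# the figures quoted in the text; this is illustrative arithmetic only.

def net_performance(months: float, hw_doubling_months: float = 18.0,
                    sw_halving_months: float = 18.0) -> float:
    """Net useful performance factor: hardware doubles every
    `hw_doubling_months` while software efficiency halves every
    `sw_halving_months` (May's Law)."""
    hardware_gain = 2.0 ** (months / hw_doubling_months)
    software_loss = 0.5 ** (months / sw_halving_months)
    return hardware_gain * software_loss

# With both periods at 18 months the effects cancel exactly, which is
# the sense of May's phrase "compensating Moore's Law".
print(net_performance(36))  # 1.0
```

Any hardware doubling period shorter than the software halving period yields a net gain, which is the article's point that hardware efficiency has, in practice, outpaced software bloat.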

The third is rack density within the data centre space. Racks are a framing system that organises high-density blade servers(17) and network and storage equipment. Each blade has an energy consumption that may be stated in kilowatts (kW).

The total power consumed in a single rack may range from 2kW to 20kW, and sometimes beyond. This number of kW per rack is generally known as the rack density; the more kW per rack, the greater the density.

The rack power density calculation is one of the most fundamental in server room and data centre design. Once the client or developer has decided on the data centre's capacity, the design density of the racks largely determines the floor area of the data centre.

While densities below 10kW per rack remain the norm, deployments at 15kW are typical in hyper-scale facilities, and some are even nearing 25kW. Increased rack density for a given total design load effectively reduces the data centre's footprint.
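The sizing arithmetic described above can be sketched simply; note that the 2.5 m² per-rack floor allowance used here is an assumed planning figure for illustration, not an industry standard:

```python
import math

# Sketch of rack-density sizing: rack count from total IT load and
# density, plus an approximate footprint. The 2.5 m2 per-rack allowance
# (rack plus aisle space) is an assumption for illustration only.

def racks_required(total_it_load_kw: float, rack_density_kw: float) -> int:
    """Racks needed to deliver a total IT load at a given density."""
    return math.ceil(total_it_load_kw / rack_density_kw)

def floor_area_m2(num_racks: int, area_per_rack_m2: float = 2.5) -> float:
    """Approximate white-space area given a per-rack allowance."""
    return num_racks * area_per_rack_m2

# A 1 MW hall at 10 kW/rack needs 100 racks; at 20 kW/rack only 50 -
# doubling the density halves the footprint for the same design load.
print(racks_required(1000, 10), floor_area_m2(racks_required(1000, 10)))
print(racks_required(1000, 20), floor_area_m2(racks_required(1000, 20)))
```

This is the trade-off the text describes: for a fixed design load, footprint scales inversely with rack density.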

The development trend has been towards increased rack density and ever-increasing chipset efficiency, thus delivering more data transactions per unit of data centre floor area.

Squeeze more 'compute'

In addition to the floor area requirement, the power per rack multiplied by the total number of racks in the room provides the basis for capacity planning, sizing, critical power protection and cooling systems. The industry trend is to squeeze more 'compute'(18) out of less footprint and power consumed.
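That capacity-planning relationship is direct multiplication. The sketch below also assumes, as a simplification, that the cooling plant must reject the full IT load, since nearly all electrical power drawn by IT equipment ends up as heat:

```python
# Sketch of the capacity-planning basis from the text: total IT load is
# power per rack times rack count. Treating the cooling duty as equal to
# the IT load is a simplifying assumption for illustration.

def total_it_load_kw(kw_per_rack: float, num_racks: int) -> float:
    """Total design IT load for the room."""
    return kw_per_rack * num_racks

def cooling_duty_kw(it_load_kw: float) -> float:
    """Heat-rejection duty, assuming all IT power becomes heat."""
    return it_load_kw

load = total_it_load_kw(15.0, 200)  # 15 kW/rack across 200 racks
print(load, cooling_duty_kw(load))  # 3000.0 3000.0
```

In practice, critical power and cooling plant are sized with margins, redundancy and diversity factors on top of this base figure.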

It follows, therefore, that chipset efficiency and the power consumed per data transaction form an ever-evolving equation, a counterbalance to the ever-increasing demand for more data transactions.

As network latencies improve, enabling fully immersive virtual worlds accessible to everybody, the compute infrastructure layer remains pivotal to that journey.

This increase in chipset efficiency may lead to more extensive retrofitting of data centres, where new racks with new chipsets replace their older incumbents, reducing the need to build new data centres.

However, cooling requirements also increase as the power load rises to higher densities. In retrofit projects where increased rack density raises the power usage and, therefore, the cooling need, it may prompt a complete redesign of the mechanical and electrical services, with plant and systems replacement.


1) An Internet Protocol address is a numerical label that is connected to a computer network that uses the Internet Protocol for communication.

2) Greengard, S. (2015). The internet of things. Cambridge, Massachusetts: Mit Press. 

3) AB&R (2019). What is RFID and How Does RFID Work? - AB&R®. [online] AB&R. Available at: 

4) (n.d.). Definition of GPS. [online] Available at: 

5) Nath, S.V., Schalkwyk, P. van and Isaacs, D. (2021). Building Industrial Digital Twins Design, Develop, and Deploy Digital Twin Solutions for Real-World Industries Using Azure Digital Twins. Birmingham: Packt Publishing, Limited.

6) Statista (2012). IoT: number of connected devices worldwide 2012-2025 | Statista. [online] Statista. Available at:

7) Alphabet Inc. is an American multinational technology conglomerate holding company headquartered in Mountain View, California. It was created through a restructuring of Google on October 2, 2015,[2] and became the parent company of Google and several former Google subsidiaries.

8) Meta as the overarching company that runs Facebook, WhatsApp, Instagram, and Oculus, among others.

9) Gianfagna, M. (2021). What is Moore’s Law? | Is Moore’s Law Dead? | Synopsys. [online] Available at: 

10) VentureBeat. (2021). Intel promises industry will meet or exceed Moore’s law for a decade. [online] Available at: [Accessed 2 Sep. 2022].

11) VentureBeat. (2021). Intel promises industry will meet or exceed Moore’s law for a decade. [online] Available at:

12) Software Efficiency Or Performance Optimization At The Software And Hardware Architectural Level. (n.d.). [online] Available at:

13) techslang (2021). What is Wirth’s Law? — Definition by Techslang. [online] Techslang — Tech Explained in Simple Terms. Available at: [Accessed 3 Sep. 2022]. 

14) Wikipedia. (2021). David May (computer scientist). [online] Available at: [Accessed 3 Sep. 2022]. 

15) Published, C.M. (2012). 10 laws of tech: the rules that define our world. [online] TechRadar. Available at: [Accessed 3 Sep. 2022]. 

16) SearchDataCentre. (n.d.). What is a Blade Server? [online] Available at: 

17) Blade, cabinets, or server racks 

18) Network routing, computation, and storage. 

Author: Vincent Fogarty, MSc (King's College) in Construction Law & Dispute Resolution, BSc (Hons) in Cost Management in Building Engineering Services. Diploma in Engineering Services, FRICS, FSCSI, MCIArb, ACIBSE, MIEI, I.Eng. He is a dual-qualified engineer and quantity surveyor and started his data centre journey as a mechanical and electrical consultant for the Bank of Ireland data centre in Dublin, which housed reel-to-reel data storage for some of the first ATMs. Many years later, he ventured to Luleå in northern Sweden as commercial manager on Facebook's first data centre built outside the USA. Since then, he has provided commercial advice and dispute resolution services on various commercial matters on many data centre projects in many jurisdictions. Fogarty is also a founding partner of Project Art, a data centre site currently under a planning application process in Ireland.

He has more than 38 years of combined quantity surveying and mechanical and electrical engineering experience within the construction industry. He initially trained as a mechanical and electrical engineer with a firm of specialist consultants and later joined consultants and contractors working on the whole project life cycle from inception to completion and then handover and operation. He has been appointed as an expert in complex mechanical and electrical quantum matters concerning commercial data centre disputes.

Fogarty has also given a quantum opinion on operational costs and loss of revenue in energy generation. Combining the skills of an engineer with those of a quantity surveyor, he has been a Fellow of the RICS since 2014 and maintains membership of the Institute of Engineers as an incorporated engineer. He recently became a member of the Chartered Institute of Arbitrators, having gained an MSc in Construction Law and Dispute Resolution from King's College, London.