5G multiplies the geographical distribution of datacenters

Newsroom – July 14, 2022

With the increasing availability of 5G, datacenter operators must adapt quickly – evolving from containers and virtual servers toward cloud services and edge computing – or risk being left behind.

The low latency promised by 5G requires edge computing sites to become a physical reality. Bringing data processing to the edge of the network makes it easier to achieve speeds of at least 50 megabits per second, with latency roughly 10 times lower than that of 4G networks. Overall, the data processing and delivery speed of 5G is expected to be up to 100 times that of 4G networks (data from the Next Generation Mobile Networks Alliance).

To achieve these goals, operators of all sizes – including increasingly capitalised and consolidated ISPs – and large enterprise users are already looking at how to geographically distribute and grow their datacenters. A veritable explosion of new datacenters is predicted in every region and every vertical: operators will need to upgrade existing infrastructure, develop new architectural approaches for hyper-local edge datacenters and incorporate automation for seamless 5G network management.

According to the Data Center 2025: Closer to the Edge survey conducted in 2019 by Vertiv, the number of datacenters focused on edge computing functions is expected to grow 226% by 2025. The US consultancy Medium predicts that by 2025, 75% of corporate data will be processed in datacenters deployed at the edge of the network, up from around 10% today.

Edge up

For years, the main worldwide trend in the datacenter industry has been driven by economies of scale achieved by ever-larger facilities, the so-called hyperscale datacenters – and for good reason. Operating cost per rack or per server is lower when overhead can be shared across a larger number of servers. Connectivity is easier to acquire for larger datacenters. And it is comparatively easier to build a redundant power infrastructure (2N, 2N+1) for a very large facility.
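A toy model makes that economies-of-scale argument concrete: fixed site overhead spread across more servers drives the per-server cost toward the variable cost. The figures below are invented for illustration and do not come from the article.

```python
# Illustrative only: why cost per server falls as a facility grows.
# Both cost figures are made-up assumptions, not industry data.

FIXED_OVERHEAD = 2_000_000   # hypothetical yearly site costs: land, security, staff (USD)
COST_PER_SERVER = 1_200      # hypothetical yearly per-server cost: power, parts (USD)

def yearly_cost_per_server(n_servers: int) -> float:
    """Total yearly cost divided across all servers in the facility."""
    return FIXED_OVERHEAD / n_servers + COST_PER_SERVER

for n in (100, 1_000, 10_000, 100_000):
    print(f"{n:>7} servers -> ${yearly_cost_per_server(n):>10,.2f} per server/year")
```

At 100 servers the overhead dominates; at 100,000 the per-server cost has nearly collapsed to the variable cost alone, which is the hyperscale advantage in miniature.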

Virtually nonexistent a decade ago, there are now more than 500 giant hyperscale datacenters around the world, and the trend is unlikely to reverse any time soon. Tech giants and cloud providers – Amazon, Apple, Facebook, Google and Microsoft – are seeing robust year-on-year growth as traditional on-premises IT gradually migrates to the cloud, and they need more computing and cooling capacity than ever before. As machine-to-machine data exchange between automated devices grows, it is reasonable to expect the trend towards ever-larger datacenters to continue in the near future.

“At the same time, we are witnessing a parallel trend of micro datacenters in locations closer to the edge, where data is being generated and consumed,” explains Simon Besteman, Managing Director of the Dutch Cloud Community. These edge datacenters are much smaller than hyperscale facilities and complement the existing infrastructure. They solve connectivity problems today and play a decisive role in preparing the IT infrastructure of the future.

The weakness of centralisation is that all data sits in a datacenter at the core of the network, and all users must connect to that central point to access it. This has two consequences: it requires a significant amount of bandwidth, and latency (the time it takes for data to reach the user) cannot be reduced enough to run certain applications efficiently.

Moving data closer to the edge makes it possible to manage data flows more efficiently for the user. Instead of users connecting to a hyperscale facility to stream a series, the content is cached in an edge datacenter closer to them, so it only has to travel the short distance from the edge facility to the end user – relieving the core network and allowing much faster responses.
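A minimal sketch of that caching pattern, assuming hypothetical latency figures and a simple least-recently-used eviction policy (the article does not describe any specific implementation):

```python
# Sketch of edge caching: serve popular content from a nearby edge
# site, falling back to the distant core only on a cache miss.
# Latency figures and function names are hypothetical.

from collections import OrderedDict

CORE_LATENCY_MS = 80   # assumed round trip to a distant hyperscale facility
EDGE_LATENCY_MS = 8    # assumed round trip to a nearby edge site

def fetch_from_core(key: str) -> bytes:
    """Stand-in for a request to the central facility."""
    return f"content for {key}".encode()

class EdgeCache:
    def __init__(self, capacity: int = 1000):
        self.capacity = capacity
        self.store: OrderedDict[str, bytes] = OrderedDict()

    def fetch(self, key: str) -> tuple[bytes, int]:
        """Return (content, latency_ms); serve from the edge when cached."""
        if key in self.store:
            self.store.move_to_end(key)          # mark as recently used
            return self.store[key], EDGE_LATENCY_MS
        content = fetch_from_core(key)           # miss: go to the core datacenter
        self.store[key] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)       # evict least recently used
        return content, CORE_LATENCY_MS + EDGE_LATENCY_MS
```

Under these assumed numbers, every cache hit answers in 8 ms instead of 80 ms and never touches the core network – exactly the two gains the paragraph above describes.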

The second issue addressed by the edge is still developing and will come to maturity in the next 24 to 36 months: the exponential increase in the volume of data generated and exchanged. Its two main drivers are separate but related: IoT and 5G.

“Whether it is a large datacenter, or one implemented in modular processing units – with sizes ranging from a single cabinet to a whole floor – or containerised models, the logic governing this market is the same: the datacenter is a critical environment that has to deliver excellent-quality digital services,” explains Luis Arís, Business Development Manager at Paessler LATAM.

The large amount of data that 5G networks must process, store and distribute could overwhelm datacenters and increase demand for computing capacity and associated infrastructure. Cisco reports that 5G will support more than 10% of the world’s mobile connections by 2023, with an average speed of 575 Mbps. That’s 13 times faster than the average mobile connection today.
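As a quick sanity check on those figures, 575 Mbps at 13 times today's average implies a current average mobile connection of roughly 44 Mbps:

```python
# Checking the Cisco figures quoted above: if 575 Mbps is 13x today's
# average mobile connection, what is the implied current average?

projected_5g_mbps = 575
speedup = 13
print(f"Implied current average: {projected_5g_mbps / speedup:.1f} Mbps")  # ~44.2 Mbps
```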

Downtime, an ever-present concern

Datacenter downtime remains a real risk. It drives down sales for the companies that rely on the datacenter and damages the value of their brands. The datacenter operator, for its part, suffers doubly: with SLAs that promise 100% service availability, any failure in power, temperature or bandwidth management can lead to heavy penalties.
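The arithmetic behind those penalties is unforgiving: each extra "nine" of promised availability shrinks the allowed yearly downtime by an order of magnitude, and a 100% SLA leaves no budget at all. A quick illustration:

```python
# Downtime budget per year implied by common availability tiers.

MINUTES_PER_YEAR = 365 * 24 * 60

for availability in (99.0, 99.9, 99.99, 99.999, 100.0):
    allowed = MINUTES_PER_YEAR * (1 - availability / 100)
    print(f"{availability:>7}% uptime -> {allowed:>8.1f} minutes of downtime/year")
```

Even "four nines" allows less than an hour per year; at 100%, a single outage of any length already breaches the contract.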

“I’ve been hearing complaints from datacenter managers about the pressure they’re under at this time of major expansion of critical digital infrastructure,” says Arís. The complaints sound like these:

  • “24×7 availability is our goal when monitoring a datacenter. It is the heart of the network and has to function in an automated way, even if the system administrator is not on site.”

  • “My role is to provide the right availability – contracted in SLA format – to customers. This requires not only that I check the status of the devices, but also that I am able to analyse the usage and consumption of key parameters such as power and bandwidth. I need to ensure the quality of the user experience and, at the same time, avoid over-investment in infrastructure.”

  • “As an administrator of a physical data centre, I need to have visibility over the entire IT infrastructure of that environment, and go beyond that by gaining control over the surroundings of the data centre as well.”

According to Arís, one strategy for overcoming these challenges may be the adoption of agnostic monitoring platforms, capable of providing a predictive view of every aspect of a datacenter, including classic IT metrics, environmental sensors and security devices. The goal is a single view of the environment as a whole and, whenever recommended, individual dashboards focused on specific aspects of the datacenter.
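As a rough sketch of what "agnostic" monitoring implies – one interface across IT metrics, environmental probes and security devices, aggregated into a single view – consider the following illustration. The sensor names and thresholds are hypothetical, and this is not any vendor's actual API:

```python
# Sketch of sensor-agnostic monitoring: every data source, whether an
# IT metric, an environmental probe, or a security device, exposes the
# same small interface, so one loop can build a single status view.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Sensor:
    name: str
    read: Callable[[], float]   # returns the current measurement
    threshold: float            # value above which we raise an alert

def poll(sensors: list[Sensor]) -> dict[str, str]:
    """One pass over all sensors, producing a single consolidated view."""
    status = {}
    for s in sensors:
        value = s.read()
        status[s.name] = "ALERT" if value > s.threshold else f"ok ({value:g})"
    return status

# Hypothetical sensors mixing environmental, power and network metrics.
sensors = [
    Sensor("rack-12/temperature-C", lambda: 24.5, threshold=30.0),
    Sensor("feed-A/power-kW",       lambda: 180.0, threshold=220.0),
    Sensor("uplink-1/bandwidth-%",  lambda: 91.0, threshold=85.0),
]

print(poll(sensors))   # the saturated uplink shows up as an ALERT
```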

It is possible to use this type of solution to monitor one datacenter, or thousands of datacenters, in an integrated manner and from a single interface. These are “white label” monitoring solutions, says Arís, which deliver hundreds of ready-made sensors measuring the status of each element of the datacenter and network, and can also serve as a development platform for new sensors.