The future of data centers


As the amount of data being generated continues to increase – from 5G to emerging technologies like artificial intelligence and virtual reality – the future of data centers in the IT landscape is evolving dramatically.

Traditional data centers are simply centralized facilities where computing and networking equipment is housed to collect, store, process, distribute, or provide access to large amounts of data.

In an ever-evolving data center market, owners find it very difficult to predict future IT needs. How much equipment space will be needed? What power density should be designed for? How will requirements change over time? These are the questions that start every data center design conversation.

Co-location data center providers offer a unique solution to those questions about the future by allowing the customer to lease space for IT equipment as well as power, cooling, and communications bandwidth.

If a company’s IT needs grow and it requires more data center space, it can lease more. If its IT needs shrink and it no longer requires as much equipment, it can renegotiate at the end of the lease and give space back. This flexible, “future-proof” approach is one of the reasons co-location data centers have become so popular and continue to grow.

Co-location providers generally focus on companies with smaller data center needs, helping them save money on infrastructure.

Additionally, data centers consume large amounts of energy, so placing them next to power sources can make them cheaper and more energy efficient to run. As emissions rise and big tech companies come under scrutiny for powering data centers with non-renewable energy, clean sources are an important consideration.

In Sweden, Facebook has built a mega-data center next to a hydroelectric plant. Data centers have started cropping up across the Nordic region, where renewable resources like hydropower and wind are abundant. In addition to proximity to clean energy sources, companies building data centers may also look for cooler climates. Places in the Nordic region and Arctic Circle can allow data centers to save energy on cooling.

Structure

Traditionally, most network traffic has run through data centers in the network core. But this model is rapidly changing. Many of the functions traditionally performed by centralized data centers are being dispersed across the network and out to the edge, a shift referred to as decentralization.

As the volume and velocity of data increase, it becomes more and more inefficient to send this information to a central data center for processing. Decentralizing the network means that much of this traffic no longer needs to be routed through on-premises data centers.

In recent years, cloud computing has served as a reliable and cost-effective alternative, easing the pressure on traditional data centers by providing virtual infrastructure for off-premises computing. The rapid adoption of cloud data centers makes it clear that the traditional model alone cannot meet rising demand. Cloud data centers offer increased performance, higher capacity, and greater ease of management compared with traditional data centers.

In a fast-changing digital world, hybrid cloud has become an extremely popular option because it gives enterprises the best of both worlds: on-premises infrastructure and the public cloud. This model uses both on-premises and public cloud resources to store data and applications, allowing organizations to take advantage of new opportunities at lower cost. As multiple cloud providers give enterprises more choice, hybrid cloud adoption is expected to increase further.
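As a rough illustration of what such a split can look like in code, the sketch below implements a hypothetical placement policy in which sensitive or latency-critical workloads stay on-premises and everything else runs in the public cloud. The Workload fields and target names are assumptions made for the example, not a description of any particular platform.

```python
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    sensitive: bool         # e.g. regulated data that must stay in-house
    latency_critical: bool  # needs to run close to users or equipment


def place(workload: Workload) -> str:
    """Return where this workload runs under the illustrative policy."""
    # Keep sensitive or latency-critical workloads on-premises;
    # everything else goes to the elastic, pay-as-you-go public cloud.
    if workload.sensitive or workload.latency_critical:
        return "on_premises"
    return "public_cloud"


if __name__ == "__main__":
    for w in (
        Workload("billing-db", sensitive=True, latency_critical=False),
        Workload("nightly-analytics", sensitive=False, latency_critical=False),
        Workload("factory-control", sensitive=False, latency_critical=True),
    ):
        print(f"{w.name} -> {place(w)}")
```

A real hybrid deployment would of course weigh many more factors (cost, data gravity, compliance regions), but the basic idea is the same: a policy decides, per workload, which side of the hybrid boundary it lives on.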

The continuous rise of data has also put a strain on networking bandwidth. Enterprises are working hard to decentralize compute power and place it closer to the point where data is generated: the edge. Analyzing data where it is created, rather than shipping it across the network to a central data center, reduces transfer time and relieves the bottleneck where traffic converges at the core of the network.
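The sketch below illustrates the idea in miniature: a hypothetical edge node summarizes a window of raw sensor readings locally and forwards only the compact summary (plus any urgent alert) to the core data center. The reading format, the threshold, and the forward_to_core stand-in are assumptions for the example.

```python
from statistics import mean

WINDOW_SIZE = 60        # raw readings summarized per batch (assumed)
ALERT_THRESHOLD = 90.0  # hypothetical value that must reach the core immediately


def forward_to_core(payload: dict) -> None:
    """Stand-in for whatever uplink a real deployment uses (MQTT, HTTPS, ...)."""
    print("sending to core data center:", payload)


def process_at_edge(readings: list) -> None:
    """Summarize a window of raw readings locally instead of shipping every sample."""
    summary = {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
    }
    if summary["max"] > ALERT_THRESHOLD:
        summary["alert"] = True  # urgent conditions still cross the network right away
    # Only this compact summary leaves the edge site, which is the
    # bandwidth and latency saving described above.
    forward_to_core(summary)


if __name__ == "__main__":
    simulated_window = [70.0 + i * 0.4 for i in range(WINDOW_SIZE)]
    process_at_edge(simulated_window)
```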

Many edge devices, such as wireless medical devices and sensors, lack the compute capacity to process large streams of complex data directly. As a result, smaller, modular data centers are being deployed to provide storage and processing capacity at the edge. These facilities are placed at the base of cell towers or as close to the point where data originates as possible.

On the other end of the spectrum are mega data centers: facilities with at least one million square feet of data center space. They are large enough to serve the needs of tens of thousands of organizations at once and benefit greatly from economies of scale. While mega data centers are expensive to build, their cost per square foot is far lower than that of an average data center.

One of the largest projects is a 17.4M square foot facility built by Switch Communications, which provides businesses with housing, cooling, power, bandwidth, and physical security for their servers.

One of the most important technologies in the new structure of data centers is artificial intelligence (AI). Implementing AI in the data center can:

  • Improve security - AI-based cybersecurity can analyze incoming and outgoing data, detect malware, and ultimately protect data
  • Conserve energy - AI can learn and set temperature points, evaluate cooling equipment, and fix energy inefficiencies to reduce energy consumption
  • Reduce downtime - AI can monitor server performance and network congestion, and predict outages (a minimal sketch follows this list)
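To make the downtime point concrete, here is a minimal sketch of the kind of baseline monitoring such a system might start from: flagging a server whose latest metric sits far outside its own historical range. The metric, the window, and the z-score threshold are illustrative assumptions, not a description of any vendor's product.

```python
from statistics import mean, stdev

Z_THRESHOLD = 3.0  # hypothetical cut-off for "unusual" behaviour


def is_anomalous(history: list, latest: float) -> bool:
    """Flag a reading that sits far outside the server's own historical baseline."""
    if len(history) < 2:
        return False  # not enough data to establish a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > Z_THRESHOLD


if __name__ == "__main__":
    cpu_history = [35.0, 37.2, 36.1, 34.8, 38.0, 36.5]  # simulated CPU utilisation (%)
    print(is_anomalous(cpu_history, 36.9))  # False: within the normal band
    print(is_anomalous(cpu_history, 92.4))  # True: candidate for a pre-emptive alert
```

In practice, a production system would combine many such signals across servers, cooling, and network gear, and replace the single threshold with learned models, but the goal is the same: catch drift before it becomes an outage.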

The data center will remain a crucial part of the network. Future data centers will inevitably demand enormous amounts of energy and processing capability, so their size and location will have to be chosen to meet growing demand without putting further strain on energy consumption.

Restructuring the data center so that traffic is routed closer to the network edge, and deploying AI within it to help distribute the workload, will be essential for service providers that want to meet the demands of emerging technologies and 5G networks.

As more devices and users join the network and traffic grows, meeting bandwidth demands becomes more difficult. To future-proof data centers, the effective implementation of AI and the utilization of the edge will significantly ease the pressure on traditional data centers.
