The Changing Data Center: The Impact of Network Costs


As data center infrastructure adopts artificial intelligence and software-defined technologies, demand for its computing power will grow enormously, creating more opportunities for hyperscale deployments and for virtual machines that span multiple data centers. The changing data center, combined with lower network costs, will eventually make it possible to solve problems that are intractable today.

Mattias Fridström, chief evangelist at Swedish telecom operator Telia Carrier, said that lower-cost network hardware is forcing data centers and metro networks to fundamentally change the way they do business: "Today, any location with access to fiber can become a data center, which opens up new opportunities for organizations to design, manage, and operate cloud computing and on-demand computing resources."


In the past, network hardware costs were very high, so interconnecting data centers was often a costly undertaking. Companies such as Google, Facebook, Amazon, and Intel have been at the forefront of the software-defined revolution, and they are now pushing it into networking with SDN and SD-WAN. The traditional reliance on expensive dedicated chips in network equipment is becoming a thing of the past. With lower costs and faster speeds, the technology and market dynamics are changing as well. In turn, this reduces the costs associated with data centers and with public, hybrid, and private clouds, making all of them more accessible.

Capacity Is Limited

For many years, the capacity of networks inside data centers was limited by the underlying technology, but new processors and signal-processing techniques have driven costs down while network performance has gone up. Connection speeds that were once limited to 10 Gb/s now commonly reach 100 Gb/s or more. Lower costs and higher performance have become the new norm, which means enterprises can now take advantage of high-capacity WAN connections.
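One practical consequence of these higher speeds is the bandwidth-delay product: the faster the link, the more data must be kept "in flight" to use it fully. The sketch below illustrates the arithmetic; the link speeds and the 20 ms round-trip time are illustrative assumptions, not figures from the article:

```python
# Bandwidth-delay product: the amount of data that must be in flight
# (buffered, unacknowledged) to keep a WAN link fully utilized.

def bdp_bytes(bandwidth_bps: float, rtt_s: float) -> float:
    """Return the bandwidth-delay product in bytes."""
    return bandwidth_bps * rtt_s / 8  # divide by 8: bits -> bytes

# Assumed example: a metro/regional link with a 20 ms RTT.
old = bdp_bytes(10e9, 0.020)   # 10 Gb/s link
new = bdp_bytes(100e9, 0.020)  # 100 Gb/s link

print(f"10 Gb/s  @ 20 ms RTT: {old / 1e6:.0f} MB in flight")   # ~25 MB
print(f"100 Gb/s @ 20 ms RTT: {new / 1e6:.0f} MB in flight")   # ~250 MB
```

A tenfold jump in link speed means a tenfold jump in the data that senders and receivers must buffer, which is one reason faster links alone do not guarantee faster transfers.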

Cost has long been a barrier to widespread adoption, but falling prices, commodity hardware, and open-source software-defined functionality have brought flexibility to organizations of all sizes, changing the dynamics of WANs and opening new possibilities. Even so, latency and its impact must be considered when planning new installations.

Deploy Anywhere

Fridström said fiber now allows data centers to be built and located almost anywhere. Organizations can create global access to data centers, easing location constraints for disaster recovery (DR), and can move computing closer to consumers, or even to the edge. But the propagation speed of light in fiber is finite, and that limit causes problems, making it difficult to move data between data centers. Network latency and packet loss remain issues that can degrade data center performance.

When designing geographically dispersed solutions, many organizations fail to consider the impact of network connection speed. For high-speed transaction platforms, the distance between data centers directly affects the time between transactions. For low-rate transaction data consisting of only a few packets, however, a few milliseconds of latency does not matter.
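The distance effect comes straight from physics: light in fiber travels at roughly two-thirds of its vacuum speed, about 5 microseconds per kilometer one way. Here is a minimal sketch of the resulting round-trip times; the refractive index of ~1.47 and the example distances are illustrative assumptions:

```python
# Round-trip propagation delay over fiber, ignoring routing, queuing,
# and equipment latency (real-world RTTs will be higher).

SPEED_IN_FIBER_KM_S = 299_792 / 1.47  # ~204,000 km/s in glass

def rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a fiber path."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

for km in (100, 1_000, 6_000):          # metro, regional, transatlantic-ish
    print(f"{km:>5} km: {rtt_ms(km):6.2f} ms RTT")
```

Even a perfectly engineered 1,000 km link carries an irreducible ~10 ms round trip, which is why distance between data centers matters for latency-sensitive transactions no matter how fast the hardware gets.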

Data Acceleration

Transaction latency and packet loss are thorny issues when transferring large amounts of data, such as workload migrations or backup as a service. You cannot make the network itself faster, so you have to solve the problem another way. Data acceleration solutions, such as PORTrockIT, can dramatically improve recovery data throughput by applying artificial intelligence and parallelism. Unlike WAN optimization, they also allow encrypted files to be transferred securely between data centers outside an organization's own facilities.

WAN optimization solutions typically cannot handle encrypted data and usually require that data be sent unencrypted to achieve faster transfers. And while WAN optimization and SD-WAN vendors often claim to address latency, their techniques frequently make little difference at today's higher WAN speeds. In contrast, data acceleration solutions use machine learning to mitigate the effects of data and network latency. At the same time, the need to link data centers and disaster recovery sites in different parts of the world has made significantly reducing the impact of latency ever more important.
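Why a single TCP stream struggles on a lossy, high-latency WAN, and why parallelism helps, can be seen from the widely cited Mathis approximation for steady-state TCP throughput. The sketch below is a back-of-the-envelope model with illustrative parameter values, not a description of any vendor's product:

```python
import math

# Mathis et al. approximation for steady-state TCP throughput:
#   rate ≈ (MSS / RTT) * (C / sqrt(loss)),  with C ≈ 1.22.
# N parallel streams give roughly N times the aggregate throughput.

def tcp_throughput_bps(mss_bytes: int, rtt_s: float,
                       loss: float, streams: int = 1) -> float:
    """Approximate aggregate throughput in bits per second."""
    return streams * (mss_bytes * 8 / rtt_s) * (1.22 / math.sqrt(loss))

# Assumed example: 1460-byte MSS, 50 ms RTT, 0.1% packet loss.
one = tcp_throughput_bps(1460, 0.050, 0.001)
ten = tcp_throughput_bps(1460, 0.050, 0.001, streams=10)

print(f"1 stream:   {one / 1e6:.0f} Mb/s")   # ~9 Mb/s
print(f"10 streams: {ten / 1e6:.0f} Mb/s")   # ~90 Mb/s
```

Under these assumptions a single stream manages only about 9 Mb/s, regardless of whether the underlying link is 10 Gb/s or 100 Gb/s; spreading the transfer across many parallel streams is what recovers usable throughput, which is the core idea behind parallelism-based data acceleration.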

New Opportunities

Lower costs also create opportunities for the design, management, and operation of on-demand cloud computing resources. Indeed, as service providers deploy fiber globally, it offers a range of opportunities for organizations of all sizes. Many people still believe the public cloud is the only cloud computing model available. However, large organizations with virtualized, distributed data centers connected by high-speed fiber can create their own cloud infrastructure for storage and compute. This, in turn, raises the debate over whether it is cheaper to outsource to a third-party data center or to own and operate your own.

So, given the ever-changing data center and the continuing importance of data acceleration, here are some essential tips:

  1. Understand the performance and latency requirements that end-user applications place on databases, DRaaS, or BaaS services.
  2. Use data acceleration solutions such as PORTrockIT to relax WAN SLA requirements for latency and packet loss.
  3. Keep in mind that SD-WAN is a good way to manage your WAN, but it does not solve latency and packet loss problems.
  4. Software-defined, open-source networking software can significantly reduce capital and operating costs.

Future Development

It is difficult to predict where technology will be ten years from now, but current trends give a glimpse of the future. First, as data volumes continue to grow, the power and energy consumption of data centers will inevitably rise sharply. That consumption generates a great deal of heat, which data centers will have to dissipate.

Increased fiber coverage and higher-performance networks have made it possible to deploy data centers almost anywhere, but fiber is still unavailable in rural areas of many countries. This means data centers are likely to remain close to cities for now. As investment in network infrastructure improves, however, more data centers can move to lower-cost, less urbanized areas.

It is also worth noting that the network itself is becoming the cloud: all this interconnection makes it possible for anyone, not just large data centers, to offer idle storage and computing power to users in the same way they buy electricity. The evolving data center may therefore face more and more non-traditional competition, giving organizations and consumers more choice.

The changing data center will also be increasingly software-defined, hyperscaled, and virtualized. As its infrastructure adopts artificial intelligence and software-defined technologies, demand for its computing power will keep growing, creating more opportunities for hyperscale deployments and virtual machines spanning multiple data centers. In the end, the changing data center and lower network costs will make it possible to solve problems that are impossible to solve today.

