How edge computing, edge networking, and edge data management work together

Edge computing, edge networking, and edge data management are three elements that make up a successful edge computing strategy. Understanding the differences between each element enables IT teams to unlock the true promise of the edge.

Across the IT landscape, practitioners and providers continue to struggle to define what constitutes "edge" infrastructure, but the end goal is the same: lower latency and more resilient applications.

The latest innovations in the industry point to three distinct elements: edge computing, edge networking, and edge data management, which form the cornerstones of a successful edge computing strategy. Understanding how these elements differ and how they work together will enable IT teams to unlock the true promise of the edge.

What is edge computing?

Edge computing is best described as moving dynamic computing to the edge of the Internet, closer to the users and machines that need processing power. Driven by trends such as the Internet of Things and 5G, telecom operators have led the way because they already have the physical space needed to support this shift. By adding micro data centers and partnering with cloud providers, telecom companies can bring processing power from centralized local or cloud data centers out to the edge.

The push for edge computing is driven by the need to improve application performance and optimize server resources; however, adoption has come with challenges. Building an application that lives within the confines of a highly distributed edge computing environment is different from building an application that runs in one or two data centers. For years, there have been few tools to make this approach scalable and repeatable.

Recently, containers and serverless infrastructure have made edge computing more accessible. Some companies now run their workloads across globally distributed Kubernetes clusters or as serverless functions in service-provider environments. For example, a gaming company can combine different cloud resources or content delivery networks to ensure optimal performance in each region it serves. Alternatively, it can add workloads in a specific region or country to support increased capacity during the launch of a new game.
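The launch scenario above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API: the region names, baseline replica counts, and the doubling multiplier are all assumptions, and the resulting counts would be applied by whatever mechanism the platform provides (for instance, adjusting a Kubernetes Deployment's replica count per cluster).

```python
# Hypothetical sketch: boost per-region replica counts ahead of a game launch.
# Region names and baseline counts below are illustrative assumptions.

BASELINE_REPLICAS = {"us-east": 4, "eu-west": 3, "ap-southeast": 2}

def launch_scale(baseline, hot_regions, multiplier=2):
    """Return new replica counts, multiplying capacity in regions that
    expect launch traffic and leaving the others unchanged."""
    return {
        region: count * multiplier if region in hot_regions else count
        for region, count in baseline.items()
    }

# Expecting the launch to be biggest in Asia-Pacific:
plan = launch_scale(BASELINE_REPLICAS, {"ap-southeast"})
```

Each entry in the resulting plan would then be pushed to the corresponding regional cluster by the team's deployment tooling.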

However, many challenges remain, including data synchronization, distributed fleet management, and global traffic and workload coordination.

What is edge networking?

As employees and end users become more distributed and dynamic in terms of geography and connectivity, edge networking is quickly becoming a mainstream focus for infrastructure investments. While edge computing focuses on moving processing and compute closer to the user, edge networking encompasses all aspects of connecting applications to audiences, with an emphasis on routing data and network traffic across a distributed footprint in a more optimized manner.

What matters is the successful distribution of application workloads, which makes applications more performant and resilient, and the enterprise less dependent on a single data center, cloud, or CDN provider. Teams use application traffic control policies to direct and balance real-time workloads between resources as conditions and demands change across a dynamic, distributed infrastructure footprint.
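A minimal sketch of such a traffic control policy might weight healthy endpoints inversely to their observed latency and route each request accordingly. Everything here is an assumption for illustration: the endpoint names, the health and latency fields, and the inverse-latency weighting are not drawn from any particular product.

```python
import random

def pick_endpoint(endpoints):
    """Choose an endpoint for one request.

    endpoints: {name: {"healthy": bool, "latency_ms": float}}
    Unhealthy endpoints are excluded; among the rest, selection is
    weighted inversely to latency, so faster endpoints get more traffic.
    """
    healthy = {n: e for n, e in endpoints.items() if e["healthy"]}
    if not healthy:
        raise RuntimeError("no healthy endpoints available")
    weights = {n: 1.0 / e["latency_ms"] for n, e in healthy.items()}
    r = random.uniform(0, sum(weights.values()))
    for name, w in weights.items():
        r -= w
        if r <= 0:
            return name
    return name  # floating-point fallback: return the last endpoint

# Example: a CDN edge and a regional cloud, with the cloud currently degraded.
choice = pick_endpoint({
    "cdn-edge":  {"healthy": True,  "latency_ms": 12.0},
    "cloud-east": {"healthy": False, "latency_ms": 45.0},
})
```

As conditions change, a real control plane would refresh the health and latency inputs continuously rather than hard-coding them.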

Edge networking can also be implemented across distributed physical infrastructure, including branch offices and campuses. Retail is a good example: these companies want to optimize the equipment footprint in stores to improve availability while minimizing the need for additional infrastructure. The primary mechanism is the same: scalable, distributed network management and optimization that efficiently connects application resources across the fleet.

What is edge data management?

Despite significant progress in the edge space, one challenge remains: the importance, and the mobility, of data.

For a company that runs a large number of databases and code in hundreds of locations around the world, optimizing data mobility is critical. Teams often struggle to move the most relevant parts of a database to the right location at the right time to minimize latency, but data cannot be in constant transit. They must determine the minimum set of locations where a data set can live to maximize performance while minimizing the overhead of distributing data around the world.
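The "minimum set of locations" problem described above resembles a set-cover problem, which can be approximated greedily. The sketch below is an illustration under stated assumptions: the location and region names and the latency figures are invented, and real placement decisions would also weigh replication cost, consistency, and data-residency rules.

```python
# Hypothetical greedy sketch: pick the fewest edge locations such that
# every user region can reach a copy of the data within a latency budget.

def min_locations(latency, budget_ms):
    """latency: {location: {region: round-trip ms}}.
    Returns a list of locations covering all regions within budget_ms."""
    regions = {r for cols in latency.values() for r in cols}
    uncovered, chosen = set(regions), []
    while uncovered:
        # Greedy step: pick the location covering the most uncovered regions.
        best = max(
            latency,
            key=lambda loc: sum(
                1 for r, ms in latency[loc].items()
                if ms <= budget_ms and r in uncovered
            ),
        )
        covered = {r for r, ms in latency[best].items()
                   if ms <= budget_ms and r in uncovered}
        if not covered:
            raise ValueError("latency budget unreachable for some regions")
        chosen.append(best)
        uncovered -= covered
    return chosen

# Example: two candidate locations, two user regions, 50 ms budget.
placement = min_locations(
    {"fra": {"eu": 20, "us": 120}, "iad": {"us": 15, "eu": 110}},
    budget_ms=50,
)
```

Greedy set cover is a well-known approximation; it will not always find the true minimum, but it illustrates the trade-off between coverage and distribution overhead.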

Solving this huge challenge will increasingly become the focus for unlocking scalable innovation at the edge.

Integration of edge components yields the greatest success

IT leaders should clearly understand each edge building block as new trends, such as supporting a distributed workforce and rising demand for online services, prompt enterprises to deepen their edge investments.
