Edge computing/fog computing and what it means for CDN providers?

A CDN is a large, geographically distributed system of servers deployed in data centers across multiple regions of the Internet. Its main purpose is to deliver content to end users and to speed up that delivery. Content owners choose CDN technology for many reasons, including high performance and high availability for end users. A CDN can deliver bandwidth-intensive content such as graphics, real-time streams, online file storage, and social network data to users anytime, anywhere, in a flexible, on-demand manner.

The most common services currently provided by CDN providers include:

Web acceleration: This set of technologies makes the delivery of dynamic and static website content more efficient, including caching of large static content, dynamic cache control mechanisms, TCP acceleration, and data compression.

Load balancing: Because CDN servers sit at the edge, they have a deeper and clearer view of inbound traffic and of the status of the origin servers. This lets a CDN apply application-layer load balancing, improving traffic distribution by accurately measuring the actual load on each origin server.

Security: A CDN abstracts the origin server by hiding its identity, protecting it from attacks aimed directly at its IP address and making traffic easier to manage. Many CDN providers offer security solutions against DDoS, web application attacks, and spam/bot attacks.

Content storage: Many CDN providers let customers store content on edge servers, whether large media files, database files, or scripts, which can then be intelligently distributed across edge servers as needed.
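The application-layer load balancing described above can be sketched minimally: an edge node that knows each origin's measured load routes the next request to the least-loaded one. The hostnames and load values below are illustrative assumptions, not any real CDN provider's API.

```python
# Hypothetical sketch: an edge node picks an origin server using
# application-layer load balancing based on measured origin load.
origins = {
    "origin-a.example.com": 0.72,  # measured load (0.0 = idle, 1.0 = saturated)
    "origin-b.example.com": 0.35,
    "origin-c.example.com": 0.90,
}

def pick_origin(load_by_origin):
    """Return the origin server with the lowest measured load."""
    return min(load_by_origin, key=load_by_origin.get)

print(pick_origin(origins))  # origin-b.example.com has the lowest load
```

A production balancer would also weight by capacity, health-check results, and geographic proximity; this only shows the least-load selection step.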

How is the CDN market changing?

1. The rise of the Internet of Things

The rise of the Internet of Things has changed the Internet landscape. As more entities are connected to the Internet, bandwidth requirements, data volumes, and the need for computing to be close to the devices are among the biggest challenges to be addressed.

It is estimated that by 2020 more than 20 billion IoT devices will be connected to the Internet. As more devices come online, data will be more dynamic than ever. With limited bandwidth, network acceleration that only optimizes the middle mile will not be enough to distribute such a large amount of dynamic content. IoT requires true edge computing/fog computing to meet these challenges.

The IoT is demanding. Even with the increased computing power of modern devices, managing the devices where the business logic runs is often tricky.

Most IoT devices need fast computation over the data they collect, but because the business logic is independent of the device and demands more computing power, it is usually better placed at the edge than on the device itself. This requires a highly available feedback loop: the IoT device asks the edge to perform certain computations and uses the results to speed up its own processing.

Computing logic at the edge significantly shortens the middle mile and enables near real-time responses, thereby improving performance.
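The device-to-edge feedback loop above can be sketched as two cooperating functions. The function names and the moving-average "business logic" are illustrative assumptions, not a specific provider's API; in practice the call between device and edge would be a network round trip.

```python
# Hedged sketch of the device -> edge feedback loop: the device offloads
# a computation to edge-hosted business logic and acts on the result.

def edge_compute(readings):
    """Business logic hosted at the edge: smooth raw sensor readings
    with a 3-sample moving average."""
    window = 3
    smoothed = []
    for i in range(len(readings)):
        chunk = readings[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

def device_loop(raw_readings):
    """The device requests the edge computation and uses the result."""
    result = edge_compute(raw_readings)  # near real-time round trip
    return result[-1]                    # device acts on the latest value

print(device_loop([10.0, 12.0, 11.0, 13.0]))  # 12.0 = mean of last 3 readings
```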

At the same time, in a B2B model, industrial or smart-home IoT data is extremely valuable to data analytics farms and can be used to shape business strategy. Although the sheer volume of data collected from IoT devices may overwhelm a data cluster, aggregation logic can be placed at the edge to make the data more compact and better structured before it is sent to the cloud.

Hadoop-style batch processing jobs can run at the edge, triggered to aggregate and compress data before it is sent to the cloud. This greatly reduces the overhead of data collection while improving the performance of both source servers and IoT devices.

2. More dynamic content and ownership distribution

Dynamic content has grown dramatically, and the share of cacheable content will shrink dramatically. As more people use the Internet, the variety of data will increase, and static websites will become a very small part of the overall consumer ecosystem. Content will become more distributed across the Internet, blurring the concept of a single true origin server. More client-side code will decide which content to fetch from which origin.

At the same time, web servers are becoming more business-logic-centric as concerns are distributed. For example, an OAuth authentication model might authenticate the user entirely on the client side and then secure the connection between the origin server and the client. This keeps the origin server independent and secure, with authentication handled by a separate OAuth server.

As the Serverless pattern becomes better known and more important, it will soon make logic even more distributed. Edge computing will play an important role in hosting small pieces of specific logic that need to run close to the customer, providing a reliable experience while reducing the load on the origin server. For example, a large image or video uploaded by a customer can be resized to a lower resolution at the edge before being sent to the cloud.
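The "resize at the edge" example can be sketched with a nearest-neighbor downsample over a grayscale pixel grid, kept dependency-free for illustration; a real edge function would use an actual image library and the full upload pipeline, both assumptions not shown here.

```python
# Minimal sketch of edge-side image downscaling before cloud upload:
# keep every `factor`-th pixel in each dimension (nearest-neighbor).

def downsample(pixels, factor):
    """Shrink a 2D grid of pixel values by an integer factor."""
    return [row[::factor] for row in pixels[::factor]]

# A 4x4 "image" shrunk to 2x2 at the edge before the hypothetical upload.
image = [[ 0,  1,  2,  3],
         [ 4,  5,  6,  7],
         [ 8,  9, 10, 11],
         [12, 13, 14, 15]]
print(downsample(image, 2))  # [[0, 2], [8, 10]]
```

The payload sent onward is a quarter of the original size, which is the whole point of doing the work close to the customer.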

3. Edge computing/fog computing: what does it mean for CDN providers?

There are different proposals for edge computing; in fact, the term "edge" is more abstract than it used to be. But the overall goal is the same: bring computing closer to the device, significantly improving performance by avoiding the middle mile.

CDNs offer a working definition of the edge: servers that cache content close to the consumer. The same foundation applies if IoT devices are treated as data sources. Although the business model differs greatly from traditional CDN logic, CDN providers could offer a platform that runs business logic close to IoT devices instead of at the origin server. CDNs already bring several relevant building blocks, such as edge-to-edge connectivity, content hosting, and storage, which make CDN providers a natural foundation for building fog computing models. This would open new doors for customers from the IoT ecosystem and next-generation web applications.

4. Edge computing does not mean running origin servers at the edge

It is impractical for CDN providers to run heavy-duty application servers at the edge, since that would require substantial computing power and infrastructure. Hosting a fully functional, always-on application server is also something a CDN is simply not designed for. However, CDN providers can place small amounts of lightweight business logic at the edge, which greatly improves client reliability and IoT device performance.

The Evolution of CDNs

Given the limits on total computing power and the potential commercial market, CDN providers should turn to Serverless platforms: a Function as a Service (FaaS) model that hosts dynamically managed, event-driven, lightweight business logic, easily deployed to improve the end-user and device experience.

1. Why use Serverless?

Serverless computing is a cloud computing execution model in which cloud service providers dynamically manage the allocation of machine resources. Prices are based on the actual amount of resources consumed by the application, rather than purchasing a certain amount of capacity in advance. It is a form of utility computing.

2. Fast deployment and fast update

Serverless functions are stateless; with proper separation and encapsulation, different versions can easily be rolled in and out. A Docker-based ecosystem allows rapid deployment and rollout of changes.

3. Scalability is the core

A function is not a full web application but a small piece of encapsulated business logic. This allows an individual function to be scaled out massively on demand, rather than scaling the entire application.

4. Event-driven

What makes the Serverless platform really different is that functions hosted on it are not always running; they are started and scaled based on events. This makes the pricing model completely different from traditional web hosting, where a certain amount of capacity is purchased in advance. The price is based on the number of requests and the resources consumed: if no requests arrive, no function runs and nothing is billed.
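That pricing model can be made concrete with a back-of-the-envelope calculation. The per-request and per-GB-second rates below are made-up assumptions for illustration, not any provider's actual prices; the point is only that cost scales linearly with usage and is zero at zero requests.

```python
# Back-of-the-envelope sketch of event-driven FaaS pricing: cost scales
# with requests and resource use, so zero requests cost zero.
PRICE_PER_MILLION_REQUESTS = 0.20  # USD, hypothetical rate
PRICE_PER_GB_SECOND = 0.0000166    # USD, hypothetical rate

def serverless_cost(requests, gb_seconds):
    """Monthly cost under a pay-per-use model (no reserved capacity)."""
    return (requests / 1_000_000 * PRICE_PER_MILLION_REQUESTS
            + gb_seconds * PRICE_PER_GB_SECOND)

print(serverless_cost(0, 0))                          # 0.0 -- no requests, no cost
print(round(serverless_cost(2_000_000, 50_000), 2))   # 1.23
```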

