Edge computing redefines enterprise infrastructure: new three-tier architecture is more flexible

[51CTO.com Quick Translation] With the industry's mass adoption of public cloud, many enterprises have clearly crossed the cloud chasm. Executives in the financial and public sectors who were once the most skeptical are now convinced of the value of the cloud. As cloud computing becomes mainstream, what is the next stop for enterprise infrastructure?


The next big trend in enterprise IT is edge computing, which reduces the amount of data shuttled back and forth between data centers and public clouds, cutting the latency introduced by round trips to public cloud platforms. More importantly, edge computing will allow IT departments to keep sensitive data on-premises while still taking full advantage of the elasticity of the public cloud.

There is a misconception that edge computing is designed only for the Internet of Things (IoT). In fact, while edge computing is ideal for IoT solutions, it also delivers tremendous value for departmental applications and traditional business applications.

The edge computing layer runs close to the data source. Each edge computing unit has its own set of compute, storage, and network resources. These units are configured for specific functions such as network switching, routing, load balancing, security, and audit trails, and are also responsible for running data processing pipelines. Data routinely collected by the enterprise is analyzed by a complex event processing (CEP) engine, which decides whether to process it locally within the enterprise or send it to the public cloud for further processing. Generally, "hot data" that is critical to the operation of local infrastructure is analyzed, stored, and acted on immediately by the edge computing layer, while "cold data" useful for long-term analysis is moved to the public cloud for batch processing.
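
To make the hot/cold split concrete, here is a minimal Python sketch of an edge-side router, not tied to any specific CEP product. The latency budget, the priority field, and the helper functions are illustrative assumptions:

```python
# Minimal sketch of an edge-layer router that splits "hot" and "cold" data.
# The threshold, field names, and helper functions are illustrative
# assumptions, not part of any specific CEP engine.

import json
import queue
import time

HOT_LATENCY_BUDGET_SECONDS = 1.0  # assumed freshness window for "hot" data

cold_batch = queue.Queue()  # staged for periodic batch upload to the cloud

def process_locally(event: dict) -> None:
    """Handle time-critical events at the edge (alerting, actuation)."""
    print(f"[edge] acting on {event['sensor_id']}: {event['value']}")

def stage_for_cloud(event: dict) -> None:
    """Queue events for batch upload and long-term analysis in the cloud."""
    cold_batch.put(json.dumps(event))

def route(event: dict) -> None:
    """A toy CEP-style rule: recency and priority decide hot vs. cold."""
    age = time.time() - event["timestamp"]
    if age <= HOT_LATENCY_BUDGET_SECONDS and event.get("priority") == "high":
        process_locally(event)  # hot path: analyze and act immediately
    stage_for_cloud(event)      # everything is also kept for batch analytics

if __name__ == "__main__":
    route({"sensor_id": "pump-7", "value": 98.6,
           "priority": "high", "timestamp": time.time()})
```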

Future applications built for edge computing will be based on a three-tier architecture that is completely different from the three-tier architecture of the 1990s. During the shift from client/server to distributed computing, Microsoft, Sun, IBM, and Oracle promoted a model in which the user interface, business logic, and database run in separate tiers: the traditional three-tier architecture familiar to many J2EE architects. The emerging three-tier architecture for future applications, however, bears no resemblance to those design patterns. It is an entirely new model, built around cloud computing, machine learning, and fast data.

The emerging three-tier architecture will consist of the following logical layers:

Data sources: Computing is increasingly data-driven. Everything from TVs and smartphones to industrial equipment, customer relationship management (CRM), supply chain management (SCM), and enterprise resource planning (ERP) systems is a data source. As computing and storage become affordable, it is easier to acquire and store data from a wide range of sources, and integrating and correlating these data sets helps uncover new insights. This layer includes anything that can generate data: machine logs, clickstreams, social media content, relational databases (RDBMS), and both structured and unstructured data. In the new three-tier architecture, the data sources form the first tier.
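
As an illustration (not from the original article), here is a small Python sketch that normalizes two heterogeneous sources into one event shape so they can be correlated on a shared key; the `Event` fields and the parser helpers are assumptions made for this example:

```python
# Illustrative sketch: normalizing heterogeneous sources (machine logs,
# clickstreams) into one event shape so downstream tiers can correlate them.
# The Event structure and field names are assumptions for this example.

from dataclasses import dataclass
import time

@dataclass
class Event:
    source: str      # e.g. "machine_log", "clickstream"
    key: str         # correlation key (device id, user id, account id, ...)
    payload: dict
    timestamp: float

def from_machine_log(line: str) -> Event:
    device_id, message = line.split(" ", 1)
    return Event("machine_log", device_id, {"message": message}, time.time())

def from_clickstream(click: dict) -> Event:
    return Event("clickstream", click["user_id"], click, time.time())

# Correlating two different sources on a shared key:
events = [from_machine_log("dev-42 temperature spike"),
          from_clickstream({"user_id": "dev-42", "page": "/status"})]
by_key: dict[str, list[str]] = {}
for e in events:
    by_key.setdefault(e.key, []).append(e.source)
print(by_key)  # {'dev-42': ['machine_log', 'clickstream']}
```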

Intelligence layer: Machine learning (ML) is becoming an integral part of the user experience, and Microsoft, Google, Amazon, and IBM are all working to embed ML into phones, apps, platforms, and the cloud. In the new three-tier architecture, ML spans the edge computing layer and the cloud platform to provide intelligence. Data scientists harness the raw computing power of the public cloud to create machine learning models; with innovations in GPUs, FPGAs, and custom chips, they can train models on large data sets with complex algorithms. These models are tested in the public cloud and then moved to the edge to score real-time data. Whenever a new model must be created, or an existing one optimized, the work returns to the public cloud. The public cloud thus handles the heavy lifting, while the edge layer runs the trained models against production data. This intelligence layer, spanning the edge and the public cloud, is the second tier of the new architecture.
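
The cloud-train / edge-infer loop described above can be sketched in a few lines of Python. The "model" here is a deliberately trivial stand-in (a mean/stddev threshold); a real deployment would ship a trained artifact such as an ONNX model over a device-management channel:

```python
# A minimal sketch of the cloud-train / edge-infer loop. The model, the
# drift handling, and the transport are stand-ins, not a real ML pipeline.

import random
import statistics

def train_in_cloud(history: list[float]) -> dict:
    """'Training' here is just fitting a mean/stddev threshold model."""
    return {"mean": statistics.mean(history),
            "stdev": statistics.pstdev(history)}

def score_at_edge(model: dict, reading: float) -> bool:
    """Real-time inference at the edge: flag readings far from normal."""
    return abs(reading - model["mean"]) > 3 * model["stdev"]

# 1. Heavy lifting in the public cloud: build the model from historical data.
history = [random.gauss(50, 2) for _ in range(10_000)]
model = train_in_cloud(history)

# 2. Ship the (small) model to the edge and score live data there.
for reading in (50.3, 49.1, 71.8):
    if score_at_edge(model, reading):
        print(f"anomaly detected at the edge: {reading}")

# 3. When accuracy degrades, accumulated data returns to the cloud and
#    train_in_cloud() is rerun on the enlarged history.
```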

Operational, actionable insights: This layer acts on the information provided by the intelligence layer. Business decision makers will get accurate insights based on the analysis the intelligence layer provides, speeding up executive decision making. The layer can also be authorized to act on behalf of users: for example, when a rule engine evaluates a condition as true, components in this layer can control a machine or a piece of equipment. In short, this is where users get rich dashboards with KPIs.
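
A toy rule engine for this tier might look like the following sketch. The rule condition, the threshold, and the actuator call are illustrative assumptions; a real system would issue commands through a PLC, MQTT, or a vendor API:

```python
# A toy rule engine for the actionable-insights tier. Conditions, thresholds,
# and the shut_down_pump actuator are illustrative assumptions.

from typing import Callable

Rule = tuple[Callable[[dict], bool], Callable[[dict], None]]

def shut_down_pump(reading: dict) -> None:
    # Stand-in for a real actuation call (PLC, MQTT command, vendor API).
    print(f"shutting down {reading['device']} (value={reading['value']})")

RULES: list[Rule] = [
    (lambda r: r["metric"] == "pressure" and r["value"] > 120.0,
     shut_down_pump),
]

def evaluate(reading: dict) -> None:
    """When a condition evaluates to true, act on the user's behalf."""
    for condition, action in RULES:
        if condition(reading):
            action(reading)

evaluate({"device": "pump-7", "metric": "pressure", "value": 131.5})
```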

It is foreseeable that affordable computing and storage, coupled with the rise of machine learning, will together drive the adoption of edge computing in the near future. Not just the Internet of Things but even traditional enterprise applications will begin to take full advantage of this architecture.

Original title: Edge Computing -- Redefining the Enterprise Infrastructure

Author: Janakiram MSV

[Translated by 51CTO. Please indicate the original translator and source as 51CTO.com when reprinting on partner sites]
