How to deal with the four major challenges of edge computing

Edge computing use cases are broad and its early deployments are highly customized. Infrastructure and operations leaders need to develop a multi-year edge computing strategy that addresses the challenges of diversity, location, protection, and data.

Key findings

  • The variety of use cases and requirements can cause best-of-breed edge computing deployments to proliferate without creating synergies, complicating efforts to secure and manage these deployments.
  • The scale of distributed computing and storage required for edge computing, combined with the fact that they are often deployed in locations without IT staff, creates new management challenges.
  • With processing and storage placed outside of traditional information security visibility and control, edge computing introduces new security challenges that need to be addressed in depth.
  • Edge computing creates a large data footprint in a distributed architecture that needs to be managed, integrated, and processed.

Recommendations

Infrastructure and operations leaders building a cloud-edge computing strategy should:

  • Create a dynamic strategic plan, approach, and framework for edge computing that balances the various needs within manageable guidelines.
  • Ensure that the proof-of-concept deployment can handle realistic scale for management, connectivity, security, compute, and storage.
  • Minimize the attack surface by ensuring edge computing hardware, software, applications, data, and networks have built-in security and self-protection.
  • Invest in technologies that automate edge data management and governance whenever possible.

Strategic Planning Assumptions

By 2022, 50% of edge computing solutions implemented as proof-of-concepts (POCs) will not scale to production use.

By 2022, more than 50% of enterprise-generated data will be created and processed outside of the data center or cloud.

Analysis

Enterprises pursuing edge computing solutions encounter four unique challenges that need to be overcome (see Figure 1). These four challenges can be used to measure the efficiency of edge computing solutions.

Figure 1. Four edge computing challenges

However, as enterprises expand from a single edge computing use case to multiple, infrastructure and operations (I&O) leaders will need a comprehensive edge computing strategy to address each challenge over the long term and increase the efficiency and agility of edge computing. Solutions must evolve from customization and consulting to more common operating models, leveraged skills, standards, and mature, shareable technologies.

1. Diversity

Four distinct sets of requirements drive edge computing solutions:

  • Latency/determinism – favors lightweight, real-time solutions
  • Data/bandwidth – requires more processing power to handle large data volumes
  • Limited self-management – requires a more general subset of data center or cloud capabilities
  • Privacy/security – determines where processing and storage take place, and protects data collected at the edge

The diversity of these requirements and use cases (i.e., interactions between people, businesses, and things) is a primary and unique challenge for edge computing. There are requirements for technology, topologies, environmental conditions, power availability, connected things and/or people, heavy data processing vs. light data processing, data storage vs. not, data governance constraints, analytical styles, latency requirements, and so on. Generally speaking, the closer edge computing is to the endpoint, the more special purpose, customized, and targeted it will be. With this diversity, standards will take years to evolve.

Enterprises will deploy and adopt many different use cases for edge computing, and the challenge will be to achieve customization where it is needed while finding collaboration across investments, skills, processes, technologies and partners.

There will be a tug-of-war between "perfection" and "pragmatism".

Enterprises need to strike the right balance between purpose-built edge computing devices and topologies tailored to specific use cases (and their associated management) and general-purpose edge computing solutions that serve many use cases but may be less efficient for any single one.

Selecting solution providers will also be a challenge as vendors struggle to balance high-volume standard solutions that can sustain a business model against best-of-breed solutions with smaller markets but potentially higher margins. In the early days of edge computing, most deployments were unique and often led by consulting firms. They produced highly customized solutions that carry significant sustainability risks and reduce long-term flexibility. It will take years for the market to stabilize and consolidate into a limited number of competitive edge computing offerings. Until then, companies will need to plan for an unstable edge computing market, vendors that change products and strategies, and vendors that fail or are acquired.

To effectively navigate the diversity and ensure more efficient and flexible edge computing deployments, enterprises need a strategic plan for edge computing, or at least a strategic approach.

Recommendations:

  • Create a dynamic strategic plan, approach, and framework for edge computing that balances the various needs within manageable guidelines.
  • Include vendor/technology viability in edge computing risk and ROI decisions.
  • When selecting technologies, partners, or processes, evaluate them based on the ability to leverage them for other future edge computing needs.

2. Location

IT organizations typically know how to manage and leverage a limited set of data centers (e.g., their own and those of hosting and cloud providers), and they typically know how to manage large numbers of end-user devices (laptops, mobile phones, etc.). Edge computing combines these needs into a unique new problem: managing many (dozens, hundreds, or thousands of) unfamiliar pseudo-data centers in a low-touch or no-touch manner, often with no on-site personnel and few site visits. Some edge computing nodes will be located in traditional data centers. Most, however, will not; they will have different power supplies and environmental conditions (outdoors, in a home, office, or store, on a factory floor, etc.). Given the sheer scale, traditional data center management processes will no longer apply.

Currently, many POC deployments work on a small scale but are less successful at large-scale remote management.

Edge computing nodes will vary depending on the use case. Enterprises will need to remotely manage a variety of edge computing technologies and topologies, including hardware, software platforms, software applications, and data (production data, configuration data, analytical models, etc.), often with low-touch or no-touch operation. Hardware needs to be easy to deploy and replace, and software needs to be easy to deploy and update. These locations rarely have technicians on-site, so operational simplicity and automation will be key.
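The low-touch update model described above can be sketched as a desired-state reconciliation step, in which a node diffs what it runs against a manifest pulled from the management plane and applies only the difference. All component names, version strings, and the `plan_updates` helper below are illustrative, not any product's API:

```python
# Sketch of desired-state reconciliation for low-touch edge nodes: the node
# periodically pulls a desired manifest from the management plane, compares
# it against what is installed, and applies only the difference. Component
# names and versions are illustrative.

def plan_updates(installed: dict[str, str], desired: dict[str, str]) -> dict[str, str]:
    """Return {component: target_version} for everything missing or out of date."""
    return {name: ver for name, ver in desired.items()
            if installed.get(name) != ver}

installed = {"agent": "1.2", "analytics": "0.9"}
desired = {"agent": "1.3", "analytics": "0.9", "vpn": "2.0"}
print(plan_updates(installed, desired))  # {'agent': '1.3', 'vpn': '2.0'}
```

Keeping the node's job this small (pull, diff, apply) is what makes no-touch operation at scale plausible; all policy lives centrally in the manifest.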

Some edge computing nodes will handle a fixed set of static endpoints, but there is also a need to support dynamic, scalable discovery of and changes to endpoints. In addition, by definition, an edge computing solution will be part of a distributed processing topology that starts at the endpoint and ends in a back-end data center or cloud. Edge computing can be layered, including embedded processing, smart gateways, edge servers, and/or aggregated processing. Edge schedulers are important for placing work at the right processing location (for example, based on storage/compliance, latency, and compute power requirements). All of this needs to be managed.

Edge computing nodes may need to be resilient to being disconnected from the internet. In some cases, edge computing nodes themselves may need to be architected for resilience (leveraging other nodes) or multipath connectivity. To ensure simplicity and low touch, edge computing hardware will tend toward ruggedized, appliance-like designs. The traditional general-purpose, fully scalable data center model does not make sense for edge computing outside of the data center. Some designs will evolve from existing solutions to get closer to the edge, such as Wi-Fi routers gaining storage and processing capabilities. Others will evolve from data center solutions, such as edge servers gaining connectivity capabilities and becoming more ruggedized. Edge computing requires a programmable software platform on the edge computing nodes, including the following options:

  • Bare-metal firmware
  • Containers
  • Hypervisors and virtual machines (VMs) – for example, KubeVirt
  • Cloud solutions – for example, Amazon Web Services (AWS) Outposts

Recommendations:

  • Ensure the POC deployment can handle realistic scale for management, connectivity, security, compute, and storage.
  • Choose a software platform that supports location heterogeneity, remote management, and autonomy at scale; supports developers; and integrates well with core processing (in the cloud or data center).
  • Deploy general-purpose edge computing solutions from the data center or cloud outward toward the edge, becoming more specialized only where justified by costs, benefits, or existing infrastructure at the edge.

3. Protection

Edge computing significantly expands the enterprise's attack surface (through edge computing nodes and devices), placing processing and storage beyond traditional data center physical security and information security visibility and control. Edge computing security combines the requirements of data center and cloud computing security (protecting configurations and workloads; see "How to Make the Cloud More Secure Than Your Own Data Center") with the scale and location diversity of heterogeneous mobile and Internet of Things (IoT) computing security. As with securing mobile devices, enterprises need defense in depth and must manage an edge computing stack (software and data) that has to be assumed compromised. Unlike mobile devices, however, edge computing nodes are more heterogeneous and complex: more like small data centers, performing a variety of jobs and connecting to a variety of endpoints, each of which can also be compromised.

However, edge computing has some key differences from on-premises and cloud-based data centers. First, edge computing locations must be assumed to be uncontrolled and subject to physical tampering and theft. Second, network connectivity cannot be assumed to be constant; security controls must continue to provide protection even when an intermittent or changing network disconnects them from their management console. Third, compute power on the nodes being protected will be limited in some situations, so security controls must adopt a low-overhead, minimum-viable-protection strategy. These differences will require adjustments to products.

When evaluating products, encryption of data at rest must be considered mandatory, with hardware-based protection of keys. Boot-time integrity checks are mandatory, with strong controls for software updates. Each edge computing device must have an associated identity that is set up and managed. Zero trust network access (ZTNA, also known as software-defined perimeter) should be used to secure communication paths.

An edge computing protection strategy must use a defense-in-depth strategy in four main areas:

  • Securing network communications to and from the edge
  • Anti-tampering, anti-theft, and secure software updates for edge computing platforms
  • Protecting data analyzed and stored at the edge, including privacy and compliance
  • Acting as a control point for edge device authentication and trust assurance

Network communications should use an identity-based access approach known as ZTNA. Secure connectivity to edge computing locations can be delivered through what Gartner calls the secure access service edge (SASE), in which security and network services are embedded into the network fabric used to establish access. Examples include ZTNA, traffic prioritization, encryption, firewalls, network inspection, and session monitoring.

The most important challenge will be securing the edge computing platforms themselves. They must be designed with the assumption that they will be subject to physical attack and compromise. Edge computing security relies on defense in depth across hardened hardware and software stacks, as well as hardware-based attestation of system integrity during the boot process. Systems must be automatically and remotely updatable, and only from trusted software update sources. Edge computing platforms must be able to monitor their own system behavior, using agents, sidecar containers, or network traffic analysis, to detect attacks or anomalies.
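The trusted-update requirement above can be illustrated with a minimal sketch: verify the update manifest's authenticity, then verify the artifact against the authenticated manifest. Real deployments would use asymmetric signatures (and update frameworks such as TUF) rather than the pre-shared-key HMAC used here purely to keep the example self-contained; every name below is an assumption:

```python
import hashlib
import hmac

# Illustrative two-step update verification for an edge node:
#   1. authenticate the manifest (here via HMAC with a provisioning key;
#      real systems would verify an asymmetric signature instead),
#   2. check the artifact's digest against the authenticated manifest.

def manifest_mac(manifest: bytes, key: bytes) -> str:
    return hmac.new(key, manifest, hashlib.sha256).hexdigest()

def verify_update(artifact: bytes, manifest: bytes, mac: str, key: bytes) -> bool:
    # Step 1: authenticate the manifest itself (constant-time comparison).
    if not hmac.compare_digest(manifest_mac(manifest, key), mac):
        return False
    # Step 2: the manifest (here just a hex digest) must match the artifact.
    expected = manifest.decode().strip()
    return hashlib.sha256(artifact).hexdigest() == expected

key = b"pre-shared-update-key"           # hypothetical provisioning secret
artifact = b"firmware v2 payload"
manifest = hashlib.sha256(artifact).hexdigest().encode()
mac = manifest_mac(manifest, key)

print(verify_update(artifact, manifest, mac, key))     # True
print(verify_update(b"tampered", manifest, mac, key))  # False
```

The point is the two-step structure: a tampered artifact fails step 2, and a forged manifest fails step 1, so only updates traceable to the trusted source are applied.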

Edge computing nodes will also increasingly receive sensitive corporate, government, device, and personal data. Data protection will rely primarily on encryption of data at rest to prevent physical theft. However, this requires that the encryption keys used to decrypt the data cannot be stored on the drive with the data—for example, using a local Trusted Platform Module (TPM) chip or similar chip that protects confidentiality in hardware. If the data collected is personally identifiable, then privacy regulations may apply to the storage of the data and the rights of individuals to correct or destroy their data.
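The key-separation principle above (ciphertext on the drive, key in hardware) can be sketched as follows. The HMAC-derived keystream is purely illustrative and is not a production cipher; a real system would use AES-GCM from a vetted library, with the key sealed in an actual TPM. The `SimulatedTPM` class is a stand-in invented for this example:

```python
import hashlib
import hmac
from itertools import count

# Sketch of encryption at rest with the key held off-drive. The keystream
# construction below is for illustration only, NOT real cryptography.

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    out = b""
    for counter in count():
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        if len(out) >= length:
            return out[:length]

def xor_encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR with a keystream; applying it twice with the same key/nonce decrypts.
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

class SimulatedTPM:
    """Stands in for hardware that releases the key only to trusted code."""
    def __init__(self, key: bytes):
        self._key = key
    def unseal(self) -> bytes:
        return self._key

tpm = SimulatedTPM(b"\x01" * 32)   # key lives in "hardware", never on disk
nonce = b"unique-per-file!"
ciphertext = xor_encrypt(tpm.unseal(), nonce, b"sensor readings")
# Only the ciphertext is written to local storage, so physical theft of
# the drive alone does not expose the data.
plaintext = xor_encrypt(tpm.unseal(), nonce, ciphertext)
print(plaintext)  # b'sensor readings'
```

The design choice being illustrated is that decryption requires cooperation from the hardware key store, which is exactly what the article means by keeping keys off the drive that holds the data.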

Regulatory compliance will need to be managed and will vary by region and the sensitivity of the data being collected. More generally, as data becomes more intimate, businesses and people will further self-regulate — managing data sovereignty, deciding what data goes where, what data can be transferred beyond the edge (e.g., faces on video), and what needs to be destroyed after use.

Finally, edge computing platforms typically act as an aggregation point for collecting telemetry data from edge devices. Authentication of these edge devices will involve an adaptive form of network access control to guarantee that the device is what it claims to be (e.g., through the use of digital certificates). Ideally, the edge computing platform will also be able to monitor and baseline the behavior of edge devices to determine if the device is damaged or malfunctioning.
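A minimal sketch of the certificate-based admission check described above, assuming the aggregation point holds an allowlist of certificate fingerprints provisioned from the management plane. The registry class, device IDs, and certificate bytes are all hypothetical:

```python
import hashlib

# Illustrative network-access-control check at an edge aggregation point:
# a device is admitted only if the SHA-256 fingerprint of the certificate
# it presents (as DER bytes) is on a centrally provisioned allowlist.

def fingerprint(cert_der: bytes) -> str:
    return hashlib.sha256(cert_der).hexdigest()

class DeviceRegistry:
    def __init__(self, allowed: set[str]):
        self._allowed = allowed

    def admit(self, device_id: str, cert_der: bytes) -> bool:
        ok = fingerprint(cert_der) in self._allowed
        if not ok:
            print(f"rejecting {device_id}: unknown certificate")
        return ok

known_cert = b"-----fake DER bytes for illustration-----"
registry = DeviceRegistry({fingerprint(known_cert)})
print(registry.admit("pump-07", known_cert))     # True
print(registry.admit("pump-99", b"other cert"))  # False
```

A production system would validate the full certificate chain and revocation status rather than a bare fingerprint, and could layer the behavioral baselining mentioned above on top of this admission decision.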

In addition to regulatory compliance, privacy, customer trust and ethical considerations will become key edge computing challenges.

Recommendations:

  • Choose an edge computing security solution that is centrally managed (preferably cloud-based) and provides tightly controlled administrative access and updates.
  • Require encryption of all data at rest and ensure that keys are stored separately from the data they protect.
  • Assume the network is hostile and intermittent. Products must continue to provide protection even when network connectivity is intermittent or compromised, and access to edge platforms should be restricted using ZTNA products.
  • Ensure edge computing hardware, software, applications, and networks are hardened and as small as possible to reduce the attack surface. Support systems that use TPM or similar hardware-based mechanisms to store secrets. Edge protection policies should verify integrity at boot time and verify/control which executables are allowed to run using application controls.
  • Monitor the behavior of edge nodes directly using agents or through network monitoring. Use machine learning (ML) to detect changes in edge node behavior.

4. Data

The amount of data at the edge will grow rapidly. By 2022, more than half of enterprise-generated data will be created and processed outside of the data center or cloud; however, that data will be different. On average, a byte of data at the edge will be worth less than a typical byte of data in today’s data center. In many edge use cases, especially IoT scenarios involving asset monitoring, much of the data collected does not reflect useful or interesting changes in the environment or state of the monitored endpoints. For example, there may not be any significant changes in a video stream, or an asset may continue to report status within the expected allowable range for a long period of time.

Data that can be determined to have no value should be considered for disposal. Unlike in other types of use cases, the approach to data retention should focus on which data can be discarded, as discardable data will usually be the majority.
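The discard-first retention approach described above can be illustrated with a simple deadband filter: readings that stay within a tolerance of the last kept value are dropped, so an asset reporting steadily within its expected range produces almost no stored data. The threshold and sample values are illustrative:

```python
# Deadband filter sketch: keep a reading only when it has moved more than
# `threshold` away from the last reading that was kept. Steady-state
# telemetry is discarded at the edge instead of being stored or shipped.

def deadband(readings, threshold):
    last_kept = None
    for value in readings:
        if last_kept is None or abs(value - last_kept) > threshold:
            last_kept = value
            yield value

stream = [20.0, 20.1, 19.9, 20.0, 25.3, 25.2, 20.1]
print(list(deadband(stream, threshold=1.0)))  # [20.0, 25.3, 20.1]
```

Of seven readings, only three survive: the initial value and the two genuine state changes, which matches the article's point that most edge data reflects no interesting change.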

On average, a byte of data at the edge also has a shorter half-life—it may not be truly valuable until the event occurs (or hundreds of milliseconds later) and may be less valuable for anything other than historical analysis. On average, a byte of data at the edge tends to be more valuable locally (to local things and people) than it is non-locally in a data center or cloud-based data store. While data also provides value when collected centrally (for example, to perform performance analytics across a group of edge environments or assets), the primary value may come from taking action on data that represents local events that only need to be processed locally and with lower latency.

Rather than collecting data centrally (e.g., in data lakes and data warehouses), edge computing creates potentially massive distributed data stores, with bits of data everywhere. Additionally, data integration is critical to ensuring data is received, transformed, distributed (potentially to aggregation points or the cloud), and synchronized across edge environments, and appropriate local governance controls must be established to monitor and ensure data quality and privacy, while developing appropriate retention and disposal policies. In a highly distributed edge computing architecture, decisions about whether, where, and how data is persisted and structured determine cost and efficiency, and can also present governance challenges.

Finally, more and more analytics capabilities will need to be deployed in edge environments to deliver value directly and quickly when needed locally. Analytics can be done effectively in real-time with event stream processing, or through deeper, higher latency approaches (including aggregating data for the development of more complex models, perhaps solved using ML techniques). AI-based approaches will increasingly be applied at the edge - and the development of ML models may also occur at the edge.
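As an illustration of lightweight real-time analytics running at the edge, the sketch below flags values that deviate sharply from a sliding-window baseline, a common building block of event stream processing. The window size and deviation threshold are assumptions, not recommendations:

```python
from collections import deque
from statistics import mean, stdev

# Sliding-window anomaly check, a minimal example of real-time analytics
# that can run on a constrained edge node: flag a value that deviates
# from the recent window mean by more than k sample standard deviations.

class WindowAnomalyDetector:
    def __init__(self, window: int = 20, k: float = 3.0):
        self.buf = deque(maxlen=window)
        self.k = k

    def observe(self, value: float) -> bool:
        anomalous = False
        if len(self.buf) >= 5:  # require a minimal baseline first
            m, s = mean(self.buf), stdev(self.buf)
            anomalous = s > 0 and abs(value - m) > self.k * s
        self.buf.append(value)
        return anomalous

det = WindowAnomalyDetector()
normal = [10.0, 10.1, 9.9, 10.2, 9.8, 10.0]
flags = [det.observe(v) for v in normal] + [det.observe(50.0)]
print(flags[-1])  # True
```

Steady readings around 10.0 pass silently, while the jump to 50.0 is flagged locally and can trigger an immediate action, with only the flagged events (or aggregates) forwarded centrally for the deeper, higher-latency analysis the article describes.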

In other words, value doesn’t have to be determined in advance—it can be discovered as you go along.

Recommendations:

  • Invest in data management, integration, analytics, and governance capabilities in edge environments – As more data is generated, stored, and applied in edge environments, traditional data center-centric capabilities will lose value.
  • Leverage existing efforts (policies, formalized roles, governance processes) to manage traditional data types by applying them to the management of edge data. Requirements will need to expand, but established principles and policy types (quality, security, privacy, and retention/disposal) will remain relevant.
  • Advance your skills in data science and ML, and add event stream processing techniques to extract the appropriate value from data at the edge.
  • Evaluate existing and potential data management vendors by examining their capabilities to handle distributed data. Evaluate vendors’ capabilities for specific edge computing needs—for example, the ability to run on or interoperate with edge operating systems and gateways.
