Edge computing use cases are broad, and early deployments are highly customized. Infrastructure and operations leaders need to develop a multiyear edge computing strategy that addresses the challenges of diversity, location, protection, and data.

Key Findings
Recommendations

Infrastructure and operations leaders building a cloud-edge computing strategy should:
Strategic Planning Assumptions

By 2022, 50% of edge computing solutions implemented as proofs of concept (POCs) will not scale to production use.

By 2022, more than 50% of enterprise-generated data will be created and processed outside of the data center or cloud.

Analysis

Enterprises pursuing edge computing solutions encounter four unique challenges that need to be overcome (see Figure 1). These four challenges can also be used to gauge the efficiency of edge computing solutions.

Figure 1. Four edge computing challenges

As enterprises expand from a single edge computing use case to multiple use cases, infrastructure and operations (I&O) leaders will need a comprehensive edge computing strategy that addresses each challenge over the long term and increases the efficiency and agility of edge computing. Solutions must evolve from customization and consulting toward common operating models, leveraged skills, standards, and mature, shareable technologies.

1. Diversity

Four different needs drive edge computing solutions:
The diversity of these requirements and use cases (i.e., interactions between people, businesses, and things) is a primary and unique challenge for edge computing. Requirements differ in technology, topology, environmental conditions, power availability, connected things and/or people, heavy versus light data processing, whether data is stored, data governance constraints, analytical styles, latency, and so on. Generally speaking, the closer edge computing is to the endpoint, the more special-purpose, customized, and targeted it will be. Given this diversity, standards will take years to evolve.

Enterprises will deploy and adopt many different edge computing use cases, and the challenge will be to customize where needed while sharing investments, skills, processes, technologies, and partners across use cases. There will be a tug-of-war between "perfection" and "pragmatism." Enterprises need to strike the right balance between purpose-built, unique edge computing devices and topologies that focus on a single use case (and its associated management), and general-purpose edge computing solutions that serve many use cases but may be less efficient for any one of them.

Selecting solution providers will also be a challenge, as vendors struggle to balance high-volume standard solutions that can sustain a business model against best-of-breed solutions that have smaller markets but potentially higher margins. In the early days of edge computing, most deployments were unique and often led by consulting firms. They produced highly customized solutions that created significant long-term viability risks and reduced flexibility. It will take years for the market to consolidate around a smaller number of competitive edge computing providers. Until then, companies will need to plan for an unstable edge computing market: vendors that change products and strategies, and vendors that fail or are acquired.
To navigate this diversity effectively and ensure more efficient, flexible edge computing deployments, enterprises need a strategic plan for edge computing, or at least a strategic approach.

Recommendations:
2. Location

IT organizations typically know how to manage and leverage a limited set of data centers (e.g., their own and those of hosting and cloud providers), and they typically know how to manage large numbers of end-user devices (laptops, mobile phones, etc.). Edge computing combines these needs into a new problem: managing many (dozens, hundreds, thousands of) unfamiliar pseudo-data centers in a low-touch or no-touch manner, often with no on-site personnel and few visits. Some edge computing nodes will be located in traditional data centers. Most will not; they will have different power supplies and environmental conditions (outdoors, in a home, office, or store, on a factory floor, etc.). Given the sheer scale, traditional data center management processes will no longer apply. Many current POC deployments work at small scale but are less successful at large-scale remote management.

Edge computing nodes will vary by use case. Enterprises will need to remotely manage a variety of edge computing technologies and topologies, including hardware, software platforms, software applications, and data (production data, configuration data, analytical models, etc.), often with low or no touch. Hardware needs to be easy to deploy and replace, and software needs to be easy to deploy and update. These locations rarely have technicians on hand, so operational simplicity and automation will be key.

Some edge computing nodes will handle a fixed set of static endpoints, but there is also a need to support dynamic, scalable discovery of and changes to endpoints. In addition, by definition, an edge computing solution is part of a distributed processing topology that starts at the endpoint and ends in a back-end data center or cloud. Edge computing can be done in layers, including embedded processing, smart gateways, edge servers, and/or aggregated processing.
Edge schedulers are important for placing work at the right processing location (for example, based on storage/compliance, latency, and compute power requirements). All of this needs to be managed. Edge computing nodes may need to be resilient to being disconnected from the internet; in some cases, the nodes themselves may need to be architected for resilience (leveraging other nodes) or for multipath connectivity.

To keep operations simple and low touch, edge computing hardware will tend toward ruggedized, appliance-like designs. The traditional general-purpose, fully scalable data center model does not make sense for edge computing outside the data center. Some designs will evolve from existing solutions moving closer to the edge, such as Wi-Fi routers gaining storage and processing capabilities. Others will evolve from data center solutions, such as edge servers gaining connectivity capabilities and becoming more ruggedized. Edge computing also requires a programmable software platform on the edge computing nodes, spanning several aspects.
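The placement decision an edge scheduler makes, as described above, can be sketched as a hard-constraint filter followed by a latency-minimizing choice. This is a minimal illustration only; the tier names, field names, and numbers below are made-up assumptions, not taken from any particular product.

```python
# Candidate processing tiers, from embedded gateway up to the cloud.
# Values are illustrative placeholders.
CANDIDATES = [
    {"name": "gateway",     "latency_ms": 5,   "free_vcpus": 2,   "offsite": False},
    {"name": "edge-server", "latency_ms": 20,  "free_vcpus": 16,  "offsite": False},
    {"name": "cloud",       "latency_ms": 120, "free_vcpus": 512, "offsite": True},
]

def place(workload, candidates):
    """Return the lowest-latency tier that satisfies every hard constraint
    (latency budget, compute capacity, data-residency), or None."""
    feasible = [
        c for c in candidates
        if c["latency_ms"] <= workload["max_latency_ms"]
        and c["free_vcpus"] >= workload["vcpus"]
        and (workload["data_may_leave_site"] or not c["offsite"])
    ]
    if not feasible:
        return None
    return min(feasible, key=lambda c: c["latency_ms"])

# A video-analytics job that must stay on-site: the gateway is too small,
# the cloud is off-site, so the edge server wins.
job = {"max_latency_ms": 50, "vcpus": 8, "data_may_leave_site": False}
print(place(job, CANDIDATES)["name"])  # edge-server
```

Real schedulers weigh many more dimensions (storage, bandwidth, energy, affinity), but the pattern of filtering by hard constraints and then optimizing a soft objective is the same.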
Recommendations:
3. Protection

Edge computing significantly expands the enterprise's attack surface (through edge computing nodes and devices), extending beyond the traditional data center security perimeter and reducing information security visibility and control. Edge computing security combines the requirements of data center and cloud computing security (protecting configurations and workloads; see "How to Make the Cloud More Secure Than Your Own Data Center") with the scale and location diversity of heterogeneous mobile and Internet of Things (IoT) computing security. As with securing mobile devices, enterprises need defense in depth and must manage an edge computing stack (software and data) that has to be assumed compromised. Unlike mobile devices, however, edge computing nodes are more heterogeneous and complex; they are more like small data centers, performing a variety of jobs and connecting to a variety of endpoints, each of which can also be compromised.

Edge computing also differs in key ways from on-premises and cloud data centers. First, edge computing locations must be assumed to be uncontrolled and subject to physical tampering and theft. Second, network connectivity cannot be assumed to be constant; security controls must continue to provide protection even when an intermittent or changing network disconnects them from their management console. Third, the computing power available to run security controls will be limited in some situations, so a low-overhead, minimum-viable-protection strategy must be chosen. These differences will require adjustments to products. When evaluating products, encryption of data at rest must be considered mandatory, with hardware-based protection of keys. Boot-time integrity checks are mandatory, with strong controls over software updates. Each edge computing device must have an associated identity that is established and managed.
Zero trust network access (ZTNA, also known as a software-defined perimeter) can secure these communication patterns. An edge computing protection strategy must apply defense in depth across four main areas:
Network communications should use ZTNA, an identity-based approach to access assurance. Secure communication with edge computing locations can be delivered through what Gartner calls the secure access service edge (SASE): security and network services embedded into the network fabric used to establish access. Examples include ZTNA, traffic prioritization, encryption, firewalls, network inspection, and session monitoring.

The most important challenge will be securing the edge computing platforms themselves. They must be designed on the assumption that they will be subject to physical attack and compromise. Their security relies on defense in depth across hardened hardware and software stacks, with hardware-based proof of system integrity during the boot process. Systems must be automatically and remotely updatable, and only from trusted software update sources. Edge computing platforms must also be able to monitor their own system behavior (using agents, sidecar containers, or network traffic analysis) to detect attacks or anomalies.

Edge computing nodes will increasingly hold sensitive corporate, government, device, and personal data. Data protection will rely primarily on encryption of data at rest to prevent physical theft. This requires that the encryption keys used to decrypt the data not be stored on the same drive as the data; for example, a local Trusted Platform Module (TPM) or similar chip can protect key confidentiality in hardware. If the data collected is personally identifiable, privacy regulations may govern its storage and individuals' rights to correct or destroy it. Regulatory compliance will need to be managed and will vary by region and by the sensitivity of the data collected.
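The boot-time integrity check described above can be sketched as verifying each software artifact against an authenticated manifest. This is a minimal Python illustration under stated assumptions: the artifact names and contents are hypothetical, and on real hardware the manifest key would be held and used inside a TPM rather than appearing as a variable.

```python
import hashlib
import hmac

def manifest_mac(manifest, key):
    """MAC over a canonical form of the manifest (artifact name -> SHA-256 hex)."""
    canonical = "\n".join(f"{n}:{d}" for n, d in sorted(manifest.items()))
    return hmac.new(key, canonical.encode(), hashlib.sha256).hexdigest()

def verify_boot(artifacts, manifest, mac, key):
    """Pass only if the manifest is authentic AND every artifact matches it."""
    if not hmac.compare_digest(manifest_mac(manifest, key), mac):
        return False  # the manifest itself was tampered with
    return all(
        hashlib.sha256(artifacts.get(name, b"")).hexdigest() == digest
        for name, digest in manifest.items()
    )

# Demo with a hypothetical agent binary.
KEY = b"tpm-held-key"  # stand-in for a hardware-protected key
good = {"edge-agent.bin": b"\x7fELF...agent code..."}
manifest = {n: hashlib.sha256(b).hexdigest() for n, b in good.items()}
mac = manifest_mac(manifest, KEY)

tampered = {"edge-agent.bin": b"\x7fELF...backdoored..."}
print(verify_boot(good, manifest, mac, KEY))      # True: clean boot proceeds
print(verify_boot(tampered, manifest, mac, KEY))  # False: boot is blocked
```

Production boot integrity (e.g., measured or secure boot) anchors this chain in firmware and hardware; the sketch only shows the verify-before-trust logic.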
More generally, as data becomes more intimate, businesses and people will further self-regulate: managing data sovereignty, deciding what data goes where, what data can be transferred beyond the edge (e.g., faces in video), and what must be destroyed after use.

Finally, edge computing platforms typically act as aggregation points for collecting telemetry from edge devices. Authenticating these devices will require an adaptive form of network access control to guarantee that each device is what it claims to be (e.g., through digital certificates). Ideally, the edge computing platform will also monitor and baseline the behavior of edge devices to determine whether a device is compromised or malfunctioning. Beyond regulatory compliance, privacy, customer trust, and ethical considerations will become key edge computing challenges.

Recommendations:
4. Data

The amount of data at the edge will grow rapidly. By 2022, more than half of enterprise-generated data will be created and processed outside the data center or cloud; however, that data will be different. On average, a byte of data at the edge will be worth less than a typical byte in today's data center. In many edge use cases, especially IoT scenarios involving asset monitoring, much of the collected data does not reflect useful or interesting changes in the environment or in the state of the monitored endpoints. For example, a video stream may show no significant changes, or an asset may keep reporting status within its expected range for long periods. Data that can be determined to have no value should be considered for disposal. Unlike in other types of use cases, the data retention approach should focus on which data can be discarded, since that is usually the majority of the data.

On average, a byte of data at the edge also has a shorter half-life: it may not be truly valuable until an event occurs (or for hundreds of milliseconds afterward) and may then be worth little for anything other than historical analysis. A byte of data at the edge also tends to be more valuable locally (to local things and people) than it is in a remote data center or cloud-based store. Data still provides value when collected centrally (for example, for performance analytics across a group of edge environments or assets), but the primary value may come from acting on data that represents local events and only needs to be processed locally, with lower latency. Rather than centralizing data (e.g., in data lakes and data warehouses), edge computing creates potentially massive numbers of distributed data stores: little bits of data everywhere.
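The "discard data that reflects no interesting change" approach above is often implemented as a deadband filter: keep a reading only when it has moved meaningfully since the last kept value. A minimal sketch, with made-up sensor readings and threshold:

```python
def deadband_filter(readings, threshold):
    """Yield only readings that differ from the last kept value
    by more than `threshold`; discard everything in between."""
    last_kept = None
    for value in readings:
        if last_kept is None or abs(value - last_kept) > threshold:
            last_kept = value
            yield value

# Temperature readings (hypothetical): only the first value and
# significant moves survive; steady-state noise is dropped at the edge.
kept = list(deadband_filter([20.0, 20.1, 19.9, 22.5, 22.4, 18.0], 1.0))
print(kept)  # [20.0, 22.5, 18.0]
```

The same idea generalizes to frame differencing on video or change detection on status reports: the edge node forwards only the minority of data that carries information.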
Additionally, data integration is critical to ensuring data is received, transformed, distributed (potentially to aggregation points or the cloud), and synchronized across edge environments. Appropriate local governance controls must be established to monitor and ensure data quality and privacy, along with suitable retention and disposal policies. In a highly distributed edge computing architecture, decisions about whether, where, and how data is persisted and structured determine cost and efficiency, and can also present governance challenges.

Finally, more and more analytics capabilities will need to be deployed in edge environments to deliver value directly and quickly where it is needed locally. Analytics can run in real time through event stream processing, or through deeper, higher-latency approaches (including aggregating data to develop more complex models, perhaps built with machine learning techniques). AI-based approaches will increasingly be applied at the edge, and the development of ML models may also occur there. In other words, value does not have to be determined in advance; it can be discovered along the way.

Recommendations: