Edge computing is easy to sell but hard to define. More a philosophy than any single architecture, it sits on a spectrum with the cloud: the current cloud service model often depends on in-browser processing, and even the edgiest deployments rely on central infrastructure.
The philosophy of edge, as most Reg readers no doubt know, is to push as much processing and compute as close as possible to the points of collection and utilisation.
If biology is any guide, edge computing is a good evolutionary strategy. The octopus has a central brain, but each tentacle has the ability to analyse its environment, make decisions and react to events. The human gut looks after itself, with roughly the same processing power as a deer, while both eyes and ears do local processing before passing data back. All these natural systems confer efficiency, robustness and flexibility: attributes that IT edge deployments should also deliver.
But those natural analogies also illustrate another of edge's most important aspects – its diversity. 5G is often cited as the quintessential edge application. It owes most of its potential to being designed around edge principles, moving the decision-making about setting up and managing connections into distributed control systems. The combination of high bandwidth, low latency and prioritised traffic management, all across moving targets, simply can't work unless as much processing as possible takes place close to the radios (and thus the users).
But another high-profile edge application, transport, needs a very different approach. An aircraft can generate a terabyte of performance and diagnostic data on a single flight, which outstrips the capabilities of in-flight datacomms.
Spread that across a fleet in constant global flux, and central control isn't an option. Autonomous processing onboard, prioritisation of immediate safety information such as moment-to-moment engine parameters for the available real-time links, and efficient retrieval of bulk data when possible all lead to design decisions far removed from 5G engineering.
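That onboard triage can be sketched in a few lines. Everything below is hypothetical – the class, message names and link budget are invented for illustration – but it shows the core decision: urgent telemetry competes for the narrow real-time link in priority order, while bulk diagnostics wait for retrieval on the ground.

```python
import heapq

REALTIME_BUDGET = 3   # messages the in-flight link can carry this cycle

class OnboardBuffer:
    """Toy triage buffer: urgent data goes over the air, bulk data waits."""
    def __init__(self):
        self._realtime = []       # min-heap of (priority, seq, payload)
        self._bulk = []           # held for efficient retrieval at the gate
        self._seq = 0             # tie-breaker preserving arrival order

    def record(self, payload, priority=None):
        # Lower priority number means more urgent; no priority means bulk.
        if priority is not None:
            heapq.heappush(self._realtime, (priority, self._seq, payload))
        else:
            self._bulk.append(payload)
        self._seq += 1

    def drain_realtime(self, budget=REALTIME_BUDGET):
        # Send the most urgent readings first, up to the link budget.
        sent = []
        while self._realtime and len(sent) < budget:
            sent.append(heapq.heappop(self._realtime)[2])
        return sent

    def offload_bulk(self):
        # Everything else is dumped in one go when a fat pipe is available.
        dumped, self._bulk = self._bulk, []
        return dumped

buf = OnboardBuffer()
buf.record("egt_exceedance", priority=0)
buf.record("cabin_pressure", priority=1)
buf.record("vibration_trace_raw")          # bulk: waits for the ground link
buf.record("fuel_flow", priority=2)
print(buf.drain_realtime())   # ['egt_exceedance', 'cabin_pressure', 'fuel_flow']
print(len(buf.offload_bulk()))  # 1
```

The same skeleton – a priority queue in front of a constrained uplink, with a side channel for bulk offload – recurs across edge transport designs, even though the payloads and budgets differ wildly.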
Each sector touted as a natural fit for edge – IoT, digital health, manufacturing, energy, logistics – defies the idea of edge as a single discipline. As the OpenStack project says in its "Edge Computing: Next Steps in Architecture, Design and Testing" paper: "While there is no question that there is continuing interest in edge computing, there is little consensus on a standard edge definition, solution or architecture."
Yet by concentrating on the common features, their benefits and challenges, we can see where it might be headed.
Edge computing needs scalable, flexible networking. Even if a particular deployment is stable in size and resource requirements over a long period, to be economic it must be built from general-purpose tools and techniques that can cope with a wide variety of demands. To that end, software defined networking (SDN) has become a focus for future edge developments, although a range of recent research has identified areas where it doesn't yet quite match up to the job.
SDN's characteristic approach is to split networking into two tasks: control and data transfer. A control plane manages a data plane, dynamically reconfiguring it based on a combination of rules and monitoring. This looks like a good match for edge computing, but SDN typically has a centralised control plane that expects a global view of all network activity. As Imperial College London researchers point out in a recent paper [PDF], this is neither scalable nor robust – two key edge requirements. Various approaches – multiple control planes, increased intelligence in edge switch hardware, dynamic network partitioning on demand, geography and flow control – are under investigation, as are the interactions between security and SDN in edge management.
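The split is easy to see in miniature. The sketch below is purely illustrative (class and route names are invented): the data plane forwards by table lookup alone, and any miss is punted up to a controller holding the global view – the very centralisation the Imperial researchers flag as a scaling and robustness risk.

```python
class Switch:
    """Data plane: forwards packets purely by flow-table lookup."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}          # (src, dst) -> out_port

    def install_rule(self, match, out_port):
        self.flow_table[match] = out_port

    def forward(self, src, dst):
        # None means a table miss: the packet is punted to the controller.
        return self.flow_table.get((src, dst))

class Controller:
    """Control plane: global view, reconfigures switches dynamically."""
    def __init__(self):
        self.switches = {}
        self.topology = {}            # dst -> (switch_name, out_port)

    def register(self, switch):
        self.switches[switch.name] = switch

    def learn_route(self, dst, switch_name, out_port):
        self.topology[dst] = (switch_name, out_port)

    def handle_miss(self, switch, src, dst):
        # Consult the global view, then install a rule so that future
        # packets for this flow stay entirely in the data plane.
        sw_name, port = self.topology[dst]
        self.switches[sw_name].install_rule((src, dst), port)
        return port

ctl = Controller()
edge_sw = Switch("edge-1")
ctl.register(edge_sw)
ctl.learn_route("10.0.0.7", "edge-1", out_port=3)

port = edge_sw.forward("10.0.0.1", "10.0.0.7")
if port is None:                      # first packet of the flow: miss
    port = ctl.handle_miss(edge_sw, "10.0.0.1", "10.0.0.7")
print(port)                           # 3 – subsequent packets hit the rule
```

Note that every first packet of every flow crosses the single controller: exactly the bottleneck that multiple control planes and smarter edge switches aim to relieve.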
The conclusion, here as elsewhere, is that this is an area of very active research: the potential is not yet realised, but these techniques look set to become the basis of efficient edge networking.
The reason for that last conclusion leads on to another aspect of edge development: what rules to apply in general to developing and managing infrastructure, services and apps.
Development and management at the edge
Because edge architectures are still evolving, extending DevOps principles into the infrastructure pays off in several ways: more visibility into how things are working, the adoption of common open-source components and approaches as they prove themselves, and the practical advantages of rapid reconfiguration and deployment.
With the "everything as code" approach, which SDN and container management/deployment tools like Kubernetes exemplify, the whole variety of edge architectures from heavily centralised to highly distributed can be managed with the same tools, an important consideration as the technologies mature and take their place in the market.
Kubernetes provides a common layer of abstraction on top of physical resources like compute, storage and networking, allowing deployment in a standard way anywhere, including to heterogeneous edge devices across varied infrastructures. This meshes well with the increased performance of cross-platform development tools, allowing a device-agnostic approach that fits well with the economics of edge and its need to cultivate diverse ecosystems.
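As a concrete (and entirely hypothetical) illustration of that abstraction, the manifest below targets edge nodes simply by selecting on a node label. The image, names and label convention are invented; the point is that only the `nodeSelector` and the tight resource limits distinguish it from a run-of-the-mill cloud Deployment – the spec itself deploys the same way anywhere.

```yaml
# Hypothetical example: the same Deployment spec works unchanged on a
# central cluster or an edge cluster; only the node labels differ.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: telemetry-aggregator            # name is illustrative
spec:
  replicas: 2
  selector:
    matchLabels:
      app: telemetry-aggregator
  template:
    metadata:
      labels:
        app: telemetry-aggregator
    spec:
      nodeSelector:
        node-role.example.com/edge: "true"       # assumed label convention
      containers:
      - name: aggregator
        image: registry.example.com/aggregator:1.4   # placeholder image
        resources:
          limits:
            memory: "128Mi"             # edge nodes are resource-constrained
            cpu: "250m"
```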
With all this, monitoring and testing need to follow. One approach to building maintainable edge deployments is artifact review, where anything created to be part of an overall system is documented well enough to be tested and built upon by others, with reproducible results.
In general, all of the DevOps best practice ideas – communication between teams, standardisation of practice, automation wherever possible, instrumentation – have to be amped up to cope with the new scale, the constantly changing makeup, and the varied business demands that edge brings.
The problem of managing edge deployments whose end nodes, as in much of IoT, are diverse in age, capability and technology quickly leads to vastly complex permutations of configurations. Mobile app developers know this all too well, with constant decisions about what minimum configuration to support, how to deal with different geographic areas, and how to support clients that diverge from the norm. They are a good, if unwitting, test bed for some of the realities of edge.
Edge standards are being developed to bring this under control. ETSI, the European telecoms standards body, and the 3GPP mobile standards group have been working together [PDF] to integrate cloud services and mobile networks at the edge, including how to handle the discovery of edge services by applications. Internet-based systems like DNS have underlying assumptions that the entities they expose stay where they are; edge, especially mobile edge, does not work that way.
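A toy registry shows the difference in a few lines. This is a sketch, not the ETSI/3GPP mechanism – the class, service names and TTL policy are invented – but the core idea it illustrates, entries that expire unless the moving endpoint keeps re-registering, is what separates edge service discovery from classic DNS's assumption that things stay put.

```python
import time

class EdgeRegistry:
    """Toy edge service registry: entries expire unless refreshed."""
    def __init__(self, ttl=5.0):
        self.ttl = ttl
        self.entries = {}             # service -> (endpoint, registered_at)

    def register(self, service, endpoint, now=None):
        # Moving workloads re-register from each new location (heartbeat).
        now = time.monotonic() if now is None else now
        self.entries[service] = (endpoint, now)

    def discover(self, service, now=None):
        now = time.monotonic() if now is None else now
        entry = self.entries.get(service)
        if entry is None:
            return None
        endpoint, registered_at = entry
        if now - registered_at > self.ttl:   # stale: instance moved or died
            del self.entries[service]
            return None
        return endpoint

reg = EdgeRegistry(ttl=5.0)
reg.register("video-analytics", "edge-site-a:8080", now=0.0)
print(reg.discover("video-analytics", now=1.0))    # edge-site-a:8080

# The workload migrates to a closer site and re-registers...
reg.register("video-analytics", "edge-site-b:8080", now=4.0)
print(reg.discover("video-analytics", now=6.0))    # edge-site-b:8080

# ...and without a heartbeat, the entry quietly expires.
print(reg.discover("video-analytics", now=20.0))   # None
```

A DNS record with a long TTL would happily keep pointing clients at edge-site-a; the registry's short-lived, refresh-or-die entries are what let the answer track a moving target.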
Another important nexus of activity is LF Edge, the Linux Foundation's edge group, which has just released EdgeX 2.0 Ireland, a major update to its nascent standards package. This includes secure APIs to connect devices and networks and manage data channels, and the Open Retail Reference Architecture (ORRA), a common deployment platform to manage apps, devices and services.
Although the EdgeX standard has been in some flux, the intent is to use this as the basis for a Long Term Support (LTS) release later in 2021. The standards package is available in a Docker container, emphasising the consensus that edge will have to be built on DevOps lines to be viable.
Edge's hidden vices
For edge to make a good business case, it must be the most efficient way to solve a problem. For the poster children – 5G, transportation, IoT – it's often the only feasible solution. But in more general-purpose cases, it has to be more efficient than the cloud-first, device-second model. Here, the big cloud providers are stiff competition, with their hyper-efficient internal management systems and economies of scale.
In a review [PDF] of the technological, economic and industrial future of edge computing across the European Union, it's noted that Google claims each of its admins monitors 10,000 servers, compared to one admin per hundred servers in standard enterprise-class data centres, and that Amazon's data centres are three-and-a-half times more energy efficient in a similar comparison. If your edge deployment is to take a lot of data processing away from the cloud, these are economies of scale it may have to fight. These are the raw economics that make clouds so dominant, and they're not changing.
Security is also highly challenging. Moving data centre workloads to the edge removes physical protection against theft and vandalism, and managing the security credentials for thousands or hundreds of thousands of nodes, when connectivity or power may be intermittent for some, is not a solved problem. With care, though, edge can be more secure than standard approaches: many IoT sensors have little spare resource for strong encryption, but a local control node can encrypt their data before sending it on.
But an edge deployment increases the surface area of a system, so monitoring and active log scanning for signs of trouble need to scale up to match.
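That gateway pattern – a control node adding protection the sensors themselves can't afford – can be sketched as follows. To keep the example self-contained it uses a toy SHA-256-based stream cipher plus an HMAC tag built from the standard library; a production gateway would instead use a vetted AEAD such as AES-GCM or ChaCha20-Poly1305, with separate keys for encryption and authentication. All names here are illustrative.

```python
import hashlib, hmac, secrets

# Toy construction for illustration only: constrained sensors send
# plaintext over the local link, and the gateway seals each reading
# before it leaves the site.

def _keystream(key, nonce, length):
    # Derive a keystream by hashing key||nonce||counter (toy cipher).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def gateway_wrap(key, reading: bytes) -> bytes:
    # Encrypt with a fresh nonce, then append an integrity tag.
    nonce = secrets.token_bytes(16)
    ks = _keystream(key, nonce, len(reading))
    ct = bytes(a ^ b for a, b in zip(reading, ks))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def gateway_unwrap(key, blob: bytes) -> bytes:
    # Verify the tag in constant time before decrypting.
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("tampered payload")
    ks = _keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

key = secrets.token_bytes(32)
sealed = gateway_wrap(key, b"temp=21.5C node=7")
print(gateway_unwrap(key, sealed))    # b'temp=21.5C node=7'
```

The sensor's job stays trivially cheap; all the cryptographic work, nonce management and key storage concentrate at the gateway, which is exactly where the credential-management headache described above lives.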
The future of edge computing, more than most technologies growing in importance, depends on everyone in the business. From high-level academic researchers to the DevOps engine room, all layers of the industry need to be aware of what the others are doing. For edge to work, a whole mesh of existing ideas in infrastructure, management, development, monitoring, security and architectural understanding have to explore the options together.
No one organisation, not even the tech giants, can drive it where it doesn't fit, and no one organisation can hold it back when it locks into workable innovation. Even without the hype, life at the edge is going to be interesting. ®