Technology-enabled solutions are becoming ever more critical to the day-to-day operations of many enterprises. Among the most impactful technologies are AI (Artificial Intelligence), IoT (Internet of Things), cloud computing, and next generation communication technologies such as 5G. Edge computing may garner fewer headlines, but it is a key enabler for many solutions that leverage the emerging technologies listed above.
Fundamentally, edge computing makes processing and storage resources available in close proximity to edge devices or sensors, complementing centralised cloud resources and allowing for analytics close to those end devices. This results in a number of benefits that can be very relevant in an enterprise context, including:

- reduced latency, since data does not need to make a round trip to cloud infrastructure before a decision is made;
- reduced data transmission volumes, since data can be filtered or aggregated before onward transmission;
- autonomous operation, since local processing can continue even if connectivity to the cloud is interrupted.
The simplest applications of edge technology are probably in building control and facilities management, including HVAC control, security monitoring, access control and alarm systems. Clearly, such scenarios benefit not only from autonomous operation at the edge, but also from locating analytics at the (local) enterprise edge, which allows good oversight of all parameters associated with a particular facility. This kind of application is clearly applicable in almost any vertical sector.
Another potential application is production monitoring and control in an industrial context. For example, an edge-enabled system can be used to monitor a range of production lines and machinery in a manufacturing location, pre-emptively identifying any maintenance that is required and dynamically re-scheduling production given anticipated downtime, availability of parts, and the profile of orders that need to be fulfilled. Edge-enabled, AI-based quality control can be applied to video streams of manufactured products on a production line to check for any defects. Information can also be streamed to Augmented Reality (AR) interfaces such as tablet computers, or video goggles, with minimal latency. Such applications are currently a reality in production manufacturing environments, but may also find application in smart cities (for example, to control traffic) and in healthcare environments (for example, to manage patient flows).
The application of AI and Machine Learning (ML) to CCTV feeds is a key opportunity for edge computing, particularly given the proportion of data traffic that can be redacted (filtered out) by applying analytic rules at the edge. AI-enabled cameras can simply transmit an ‘alert’ that a certain decision rule has been triggered, averting the need to transmit a full moving-image feed to cloud infrastructure for analysis. Such solutions are commonly productised in the form of facial recognition or security cameras, and clearly have application in almost all vertical sectors.
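To make the idea concrete, the following is a minimal sketch (not from the report) of a decision rule applied at the camera edge: only a compact alert leaves the device, rather than the video stream itself. The field names and the `process_at_edge` function are illustrative assumptions.

```python
# Illustrative sketch: an edge decision rule that transmits a small alert
# instead of streaming full video to the cloud. Names such as
# `person_detected` and `process_at_edge` are hypothetical.

from dataclasses import dataclass

@dataclass
class Frame:
    camera_id: str
    person_detected: bool  # output of an on-camera AI model (assumed)

def process_at_edge(frames):
    """Apply the decision rule locally; emit only compact alerts."""
    alerts = []
    for frame in frames:
        if frame.person_detected:
            # Only a small alert record leaves the edge, not the video frame.
            alerts.append({"camera": frame.camera_id, "event": "person_detected"})
    return alerts

frames = [Frame("cam-1", False), Frame("cam-1", True), Frame("cam-2", False)]
print(process_at_edge(frames))  # one small alert instead of three video frames
```

Here three frames of video reduce to a single alert message for onward transmission, which is the redaction effect described above.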
Another application particularly suited to edge computing is the management of an estate of devices, for instance at a solar or wind farm, in a range of agricultural and aquacultural applications, in transportation management (particularly rail and public transport), or in oil and gas extraction and mining operations.
In all, there are many contexts in digital transformation in which edge computing can be significantly beneficial.
There are many definitions of ‘Edge’ and some complexity around how it might work, so it’s worth unpicking some of that complexity.
Thus far, we’ve referred to ‘the edge’ in quite general terms as being characterised by deploying compute power closer to edge devices or sensors. However, there are many different kinds of edge location as illustrated in the chart below.
The most local kind of edge computing is where compute power is installed on an actual end device (‘Device Edge’ in the figure above), for example an industrial robot. Next most local would be an Edge Gateway, located close to an end device; often this would be an industrial computer deployed to connect an OT (Operational Technology) asset to IT (Information Technology) systems.
The Enterprise Edge includes servers and compute power that are local to an enterprise, or on-site at an industrial facility, and sit at the interface between that local network and associated cloud infrastructure.
The Network Edge (specifically Multi-Access Edge Computing, or MEC, originally known as Mobile Edge Computing) is an evolution of the enterprise edge scenario, where edge processing is provided at the ‘edge’ of a communications network. In practice, however, in most currently planned deployments of MEC, mobile network operators intend to deploy edge resources either at locations that could be better characterised as the ‘Data Centre Edge’ (see below) or, in the case of Mobile Private Networks and particularly in association with 5G, at the Enterprise Edge. In time, compute power can be expected to migrate ‘closer’ to the actual (radio access) network edge in public networks.
Beyond these quintessentially ‘local’ versions of edge is another definition of edge that has been adopted by providers of cloud infrastructure and which includes the provision of hosting capacity in secondary and tertiary locations: closer to the end devices, from the perspective of a cloud provider. We mention this for completeness only since, from the perspective of a solution designer, such locations are for most intents and purposes still ‘cloud’ locations. This is a rapidly developing area though, since the public cloud providers have been partnering with communications service providers to co-locate the cloud edge and network edge (MEC) together.
In terms of the processing of data, data lakes and large databases of information tend to reside ‘in the cloud’, whilst data streams can potentially be processed at any of the edge locations described above.
The next aspect of edge to consider is the dynamics of information (data) as it flows over edge assets.
Currently, organisations often establish their analytic approaches and frameworks at central (cloud) locations. However, as described above, there can be significant benefits to undertaking certain analyses closer to the end device, and the first dynamic associated with the advent of edge computing is therefore a trend to push decision-making closer to the edge.
Conversely, raw information originating from a device can be redacted (i.e. filtered or aggregated) as it flows over edge infrastructure, ensuring that only necessary and meaningful information remains for onward transmission.
Lastly, information flowing from edge devices can be augmented as it flows across various edge assets by associating contextual and other information, often including information from other local devices.
In general then, volumes of data tend to reduce as they travel ‘away’ from an end asset, but the information that remains is generally richer and more meaningful.
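These two dynamics, redaction and augmentation, can be sketched as a small pipeline. This is an illustrative example only: the field names, the windowed-average rule, and the contextual fields are assumptions, not from the report.

```python
# Hedged sketch of the data dynamics described above: raw readings are
# redacted (aggregated) at the edge and augmented with local context.
# Field names and the averaging rule are illustrative assumptions.

def redact(readings, window=3):
    """Aggregate raw readings into windowed averages (fewer records)."""
    return [sum(readings[i:i + window]) / window
            for i in range(0, len(readings) - window + 1, window)]

def augment(values, context):
    """Attach contextual information (e.g. from other local devices)."""
    return [{"value": v, **context} for v in values]

raw = [20.1, 20.3, 20.2, 21.0, 21.2, 21.1]  # six raw sensor samples
summary = augment(redact(raw), {"site": "plant-A", "line": 3})
print(summary)  # two enriched records transmitted onward instead of six raw samples
```

Six raw samples become two context-bearing records: less data, but richer and more meaningful, exactly the trade described above.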
These dynamics of information in the context of edge assets are illustrated in the figure below.
DataOps is a fast-emerging concept in the edge domain, allowing for the optimisation of data flows (and location of application deployment) within a local campus environment, including deployment to (and federated support of) devices with limited on-board processing. DataOps at the edge might include configurable application-specific synchronisation policies between cloud and edge (and within the on-campus edge), and data partitioning so that only the business data needed at a specific edge location is distributed and synchronised. This approach effectively renders all available compute-processing assets within a campus location as a single, managed, distributed computing environment and has the potential to more effectively support a range of edge-type applications.
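The data-partitioning idea can be illustrated with a toy routing policy: each edge location receives only the business data its policy requires. The record fields, node names, and policy structure below are hypothetical.

```python
# Illustrative sketch of edge data partitioning: only the business data
# needed at a specific edge location is distributed and synchronised there.
# Record fields, node names, and the policy mapping are assumptions.

records = [
    {"line": "press-1", "metric": "vibration", "value": 0.7},
    {"line": "press-2", "metric": "vibration", "value": 0.4},
    {"line": "press-1", "metric": "temperature", "value": 68},
]

# Per-location synchronisation policy: which lines' data each edge node needs.
policy = {"edge-node-A": {"press-1"}, "edge-node-B": {"press-2"}}

def partition(records, policy):
    """Route each record only to the edge locations whose policy requires it."""
    return {node: [r for r in records if r["line"] in lines]
            for node, lines in policy.items()}

print(partition(records, policy))
```

A real DataOps platform would apply such policies continuously and bidirectionally between cloud and edge; this sketch only shows the partitioning principle.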
The general principle underlying this kind of distribution of applications within a campus edge environment is that application functionality that is mission-critical, time-critical, or reliant on high data volumes should be deployed as close to the relevant source(s) of data as possible. At the same time, there is potential to increase the scope of an application (in terms of the variety of data it can ingest from different sources) as it moves further ‘away’ from an edge data source.
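A toy encoding of this placement principle follows: the more critical or data-heavy a function, the closer to the data source it is deployed. The tier names and the simple scoring rule are illustrative assumptions, not a prescribed algorithm.

```python
# Toy placement heuristic for the principle above: mission-critical,
# time-critical, or data-heavy functions deploy closer to the data source.
# Tier names and the scoring rule are illustrative assumptions.

TIERS = ["device", "gateway", "enterprise", "cloud"]  # nearest to farthest

def place(mission_critical, time_critical, data_volume_high):
    """Pick a deployment tier: more critical/data-heavy -> closer to source."""
    closeness = sum([mission_critical, time_critical, data_volume_high])
    # closeness 3 -> device edge; closeness 0 -> central cloud
    return TIERS[3 - closeness]

print(place(True, True, True))     # "device"
print(place(False, True, False))   # "enterprise"
print(place(False, False, False))  # "cloud"
```

A production orchestrator would of course weigh many more factors (available compute, network cost, data-ingest scope), but the direction of the trade-off is the same.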
Whilst the advent of DataOps at the edge can bring significant benefits, there are also challenges inherent in deploying such a flexible system. One such challenge is edge orchestration: the automated configuration, coordination, and management of computer systems and software.
What is needed is a container orchestration system that automates software application deployment, scaling, and management, and that can provide a platform for deploying software containers (which function like individual computers from the perspective of the software programmes running inside them) in an edge environment. Open-source Kubernetes is one of the main such solutions.
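As a minimal, generic illustration (not drawn from the report), a containerised edge application might be described to Kubernetes with a Deployment manifest such as the following; the application name, image, and resource limits are placeholders:

```yaml
# Minimal illustrative Kubernetes Deployment for an edge application.
# The name, image, and resource limits are hypothetical placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-analytics
spec:
  replicas: 1
  selector:
    matchLabels:
      app: edge-analytics
  template:
    metadata:
      labels:
        app: edge-analytics
    spec:
      containers:
        - name: analytics
          image: example.org/edge-analytics:1.0
          resources:
            limits:
              cpu: "500m"      # constrain footprint on resource-limited edge nodes
              memory: 256Mi
```

Applied with `kubectl apply -f`, such a manifest lets the platform handle scheduling, restarts, and scaling; the resource limits matter particularly at the edge, where compute capacity is constrained.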
But such application platforms are not the full story. Up to this point, this report has focussed on edge capabilities in an abstract sense. In a real-life deployment, it is likely that edge applications provided by different vendors, and intended to support different use cases, will need to be deployed alongside each other on the same infrastructure. A Kubernetes-type approach can help with the actual deployment of applications but may not ensure that one application cannot interfere with another. Clearly there is a need for some level of security in the edge environment, extending to measures that ensure trustworthy operation.
Any industrial facility is likely to include machinery from a range of OEMs. This immediately highlights two potentially conflicting dynamics: the desire of the facility owner to deploy edge applications that support their overall smart factory vision, and the desire of OEM equipment manufacturers to deploy applications to support ‘their’ machinery.
However, with flexible deployment of applications to edge assets, supported by a Kubernetes-based platform and enabled by appropriate security measures, these kinds of conflict can be dealt with, and both stakeholders (the facility operators and the OEM) can potentially deploy their own applications alongside each other on the same edge assets.
There’s no reason why the story should stop there though. Both the facility owners and OEM could maintain their own ecosystems of applications that can potentially be installed onto edge hardware. These ecosystems could extend to include third parties such as finance and insurance providers, and so on.