State of the Edge

IoT, AI & Networking at the Edge


Written by LF Edge member Mike Capuano, Chief Marketing Officer for Pluribus Networks

This blog originally ran on the State of the Edge website. 

5G is the first upgrade to the cellular network justified not only by higher speeds and new consumer capabilities, such as low-latency gaming, but also by its ability to support enterprise applications. The Internet of Things (IoT) will become the essential fuel for this revolution as it transforms almost every business, government, and educational institution. By installing sensors, video cameras and other devices in buildings, factories, stadiums and other locations, such as in vehicles, enterprises can collect and act on data to make themselves more efficient and more competitive. This digital transformation will create a better and safer environment for employees and deliver the best possible user experience to end customers. In this emerging world of 5G-enabled IoT, edge computing will play a critical role.

IoT will leverage public and private 5G, AI, and edge compute. In many cases, analysis of the IoT data will be highly complex, requiring the correlation of multiple data input streams fed into an AI inference model—often in real time. Use cases include factory automation and safety, energy production, smart cities, traffic management, large venue crowd management, and many more. Because the data streams will be large and will often require immediate decision-making, they will benefit from edge compute infrastructure that is in close proximity to the data in order to reduce latency and data transit costs, as well as ensure autonomy if the connection to a central data center is cut.
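As a rough illustration of the latency argument, the sketch below compares round-trip times to a distant regional data center versus a nearby edge site. All of the numbers (fiber distances, per-hop delay, hop counts) are hypothetical, chosen only to show the order-of-magnitude difference that proximity buys:

```python
# Light travels through fiber at roughly 2/3 the speed of light in a vacuum,
# i.e. about 200 km per millisecond (illustrative approximation).
FIBER_KM_PER_MS = 200

def round_trip_ms(distance_km: float, hop_delay_ms: float = 0.5, hops: int = 4) -> float:
    """Rough one-way propagation plus per-hop queuing delay, doubled for the round trip."""
    one_way = distance_km / FIBER_KM_PER_MS + hops * hop_delay_ms
    return 2 * one_way

# Hypothetical distances: a regional cloud ~1,500 km away vs. a metro edge site ~15 km away.
central_dc = round_trip_ms(1500)
edge_site = round_trip_ms(15)
print(f"central: {central_dc:.1f} ms, edge: {edge_site:.1f} ms")
```

Even with these generous assumptions, the distant data center adds well over 10 ms of round-trip delay before any processing happens, which is why latency-sensitive inference tends to be placed close to the data source.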

Owing to these requirements, AI stacks will be deployed in multiple edge locations, including on premises, in 5G base station aggregation sites, in telco central offices and at many more “edges”. We are rapidly moving from a centralized data center model to a highly distributed compute architecture. Developers will place workloads in edge locations where applications can deliver their services at the highest performance with the lowest cost.

Critical to all of this will be the networking that will connect all of these edge locations and their interrelated compute clusters. Network capabilities must now scale to a highly distributed model, providing automation capabilities that include Software Defined Networking (SDN) of both the physical and virtual networks.

Network virtualization, like the virtualization of the compute layer, is a flexible, software-based representation of the network built on top of its physical properties. The physical network is obviously required for basic connectivity; we must move bits across the wire, after all. SDN automates this physical "underlay," but the underlay remains rigid because it has physical boundaries. Network virtualization, by contrast, is a complete abstraction of the physical network. It consists of dynamically constructed VXLAN tunnels supported by virtual routers, virtual switches and virtual firewalls, all defined and instantiated in software. These constructs can be manipulated in seconds, much faster than reconfiguring the physical network (even with SDN).
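To make the overlay mechanism concrete, the sketch below builds and parses the 8-byte VXLAN header defined in RFC 7348: a flags byte with the VNI-present bit set, a 24-bit VXLAN Network Identifier (VNI), and reserved fields. In a real deployment this encapsulation is performed by virtual switches in the data path, not by application code; this is purely illustrative:

```python
import struct

VXLAN_UDP_PORT = 4789  # IANA-assigned destination port for VXLAN (RFC 7348)

def vxlan_encap(vni: int, inner_frame: bytes) -> bytes:
    """Prepend the 8-byte VXLAN header to an inner Ethernet frame.

    Header layout (RFC 7348): 1 flags byte (0x08 = VNI present),
    3 reserved bytes, a 3-byte VNI, and 1 reserved byte.
    """
    if not 0 <= vni < 1 << 24:
        raise ValueError("VNI must fit in 24 bits")
    header = struct.pack("!II", 0x08 << 24, vni << 8)
    return header + inner_frame

def vxlan_decap(packet: bytes) -> tuple[int, bytes]:
    """Strip the VXLAN header, returning (vni, inner_frame)."""
    word0, word1 = struct.unpack("!II", packet[:8])
    assert (word0 >> 24) & 0x08, "VNI-present flag not set"
    return word1 >> 8, packet[8:]

# Round-trip a dummy 14-byte Ethernet frame through tunnel segment 42.
vni, frame = vxlan_decap(vxlan_encap(42, b"\x00" * 14))
print(vni, len(frame))  # 42 14
```

The 24-bit VNI is what makes the overlay scale: it allows roughly 16 million isolated virtual segments over one shared physical underlay, versus the 4,094 usable IDs of traditional VLANs.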

To satisfy the requirements of a real-time, edge-driven IoT environment, we must innovate to deliver cost-effective, simple, and unified software-defined networking across both the physical and virtual networks that support the edge. The traditional model for SDN was not built for these requirements. It was built for large hyperscale and enterprise data centers, where it relies on multiple servers and software licenses that incur costs and consume space and power. This approach also requires complex integrations to deploy and orchestrate the physical underlay and the virtual overlay, along with storage and compute.

Deploying traditional SDN into edge environments is not an attractive solution. Often there will not be space to deploy multiple servers for management and SDN control of the physical and virtual networks. Furthermore, all the SDN controllers need to be orchestrated by a higher layer “controller of controllers” to synchronize network state, which adds unnecessary latency, cost and complexity.

In some cases, companies also deploy SmartNICs (network interface controllers that perform networking functions to offload processing from the CPU). SmartNICs handle the packet processing associated with network virtualization without burdening the primary compute, which is better utilized supporting other workloads. In addition, hardware-based taps, probes and packet brokers are being deployed to support network telemetry and analytics.

Applying the network automation model we built for large centralized data centers will be expensive and cumbersome, as well as space and power inefficient, in edge environments. The industry needs to rethink the approach to edge network automation and deliver a solution designed from the ground up for distributed environments. Ideally this solution does not require additional hardware and software but can leverage the switches and compute that are already being deployed to resource constrained edge environments.

The good news is that a number of companies are developing new approaches that deliver highly distributed SDN control, unifying the physical underlay and virtual overlay while also providing network analytics, all with no additional external hardware or software. These new technologies can utilize, for example, the fairly powerful and underutilized CPU and memory in "white box" Top of Rack (TOR) switches to deliver SDN, network virtualization, network analytics and a Data Center Gateway (DCGW) router function. In other words, these solutions have been designed with the edge in mind and deliver powerful automation with no extra hardware or additional software licenses, supporting the edge with a cost-effective solution that also saves space and power.

Pluribus Networks delivers open networking solutions based on a unique next-gen SDN fabric for data centers and distributed cloud edge compute environments.

The State of the Edge 2020 report is available to read now for free. We encourage anyone who is interested in edge computing to give it a read and to send feedback to State of the Edge.

State of the Edge 2020: Democratizing Edge Computing Research


Written by Matt Trifiro, Open Glossary of Edge Computing TSC Chair, Co-Chair at State of the Edge and CMO at Vapor IO

State of the Edge 2020, a vendor-neutral report supported by The Linux Foundation's LF Edge, contains unique and in-depth research on edge computing, covering the major trends, drivers and impacts of the technology. The report provides authoritative market forecasting and trend analysis from independent contributors, bringing credible research on edge computing to everyone.

Edge Computing and LF Edge

Many believe edge computing will be one of the most transformative technologies of the next decade: by positioning dense compute, storage and network resources at the edge of the network, new classes of applications and services will be enabled, supporting use cases from life safety to entertainment.

The Linux Foundation’s LF Edge has greatly contributed to the growth of edge computing in the industry, both in terms of technical projects and a deep shared understanding of the concepts and terminology underpinning this new area of technology.


One of the key projects within LF Edge is the Open Glossary of Edge Computing. This project seeks to harmonize the terminology used across the industry when discussing edge computing and has been adopted by a number of projects and contributors in the community. These community members recognize that without a common and accurate definition of key terms and concepts, it is much harder to collaborate on challenges.

State of the Edge

The Open Glossary of Edge Computing was originally born as part of the inaugural State of the Edge report in 2018, where an initial version was published and included as part of the report. Shortly after this, the Open Glossary of Edge Computing was adopted as an LF Edge project.

State of the Edge is itself an open and collaborative community of organizations and individuals who are passionate about the future of edge computing. The project looks to advance edge computing within the industry through consensus building, ecosystem development and effective communication. To that end, State of the Edge reports are written and published using contributions from a diverse community of writers and analysts. By including many voices, State of the Edge publications avoid the often incomplete, skewed and overly vendor-driven material and research typically available.

Multiple reports have been published to date, and more are planned for release during 2020, including coverage of topics highly relevant to edge computing, such as 5G networks. In addition, the State of the Edge 2020 report contains the latest version of the Open Glossary of Edge Computing, which reached version 2.0 during 2019.


Democratizing Edge Computing Research

The first State of the Edge report in 2018 focused on establishing a baseline of knowledge from across the edge computing industry. This made it possible for readers to accurately assess what edge computing meant for them, their customers and their unique use cases. This first report covered what were many new and often misunderstood concepts, tying them together in a way that enabled more people than ever before to appreciate and understand the edge.

When it came to the State of the Edge 2020 report, following extensive feedback and surveys, the collaborative team decided that market forecasting on edge computing was hard to come by, and in high demand. Though forecast models on edge computing exist, they are often proprietary and are not built transparently. Moreover, they are typically locked behind expensive paywalls that limit the number of people that can benefit from them.

By drawing on the expertise of professional researchers and well-regarded contributors, State of the Edge has released its first market forecast, along with a comprehensive narrative that discusses the new trends in edge computing.

The State of the Edge is run like an open source project and publishes all of its reports under a Creative Commons license, making them freely available to anyone who is interested. This approach allows the community to benefit from shared knowledge and valuable research on edge computing, without limiting access to those with money to spend.

Available to Read Now 

The State of the Edge 2020 report is available to read now for free. We encourage anyone who is interested in edge computing to give it a read and to send any feedback to State of the Edge.