Pushing AI to the Edge (Part One): Key Considerations for AI at the Edge

By Blog, LF Edge, Project EVE, State of the Edge, Trend

Q&A with Jason Shepherd, LF Edge Governing Board member and VP of Ecosystem at ZEDEDA

This content originally ran on the ZEDEDA Medium Blog – visit their website for more content like this.

This two-part blog provides insights into what’s becoming a hot topic in the AI market: the edge. To dig deeper into this budding space, we sat down with our Vice President of Ecosystem Development, Jason Shepherd, to get his thoughts on the potential for AI at the edge, key considerations for broad adoption, examples of edge AI in practice, and some trends for the future.


Chart defining the categories within the edge, as defined by LF Edge

Image courtesy of LF Edge

LF Edge Member Spotlight: HPE

By Akraino, Akraino Edge Stack, Blog, Member Spotlight

The LF Edge community comprises a diverse set of member companies and people that represent the IoT, Enterprise, Cloud and Telco Edge. The Member Spotlight blog series highlights these members and how they are contributing to and leveraging open source edge solutions. Today, we sat down with Rohit Arora, Enterprise Architect at Hewlett Packard Enterprise (HPE), to discuss the importance of open source, leading Multi-Access Edge Computing (MEC) initiatives, participating in the Technical Advisory Committee (TAC) and collaborating with the LF Edge ecosystem.

Can you tell us a little about your organization?

HPE is a global, edge-to-cloud Platform-as-a-Service company. HPE solutions connect, protect, analyze, and act on data and applications wherever they live, from edge to cloud, so insights can be turned into outcomes at the speed required to thrive in today’s complex world.

Why is your organization adopting an open source approach?

We at HPE believe in innovation, and open source encourages innovation by bringing communities together to build a common platform. HPE has been involved in various open source projects.

Why did you join LF Edge and what sort of impact do you think LF Edge has on the edge, networking, and IoT industries?

We joined LF Edge because it aligns with HPE’s edge-to-cloud direction. Edge computing is creating a major transformation in most industries, and we believe the initiatives driven by LF Edge are critical for this digital transformation.

What do you see as the top benefits of being part of the LF Edge community?

There are many benefits to being part of LF Edge, but we believe the biggest is being part of a community that is driving innovation for next-generation networks at the edge.

What sort of contributions has your team made to the community and ecosystem through LF Edge participation?

HPE contributes to the LF Edge Governing Board and TAC, and has also contributed to the infrastructure requirements for LF Edge. HPE is actively involved in LF Edge projects such as Akraino and in process adoption.

What do you think sets LF Edge apart from other industry alliances?

There are two main reasons LF Edge is different from other industry alliances:

  1. A wide set of community members: LF Edge brings together a wide variety of members, from telco service providers and NEPs to chip manufacturers. This brings different viewpoints and the right level of expertise to the table.
  2. Project execution: The community really believes in executing, and we have seen projects go from idea to development and testing at a very fast pace.

How will LF Edge help your business?

HPE is a leading infrastructure provider with a wide variety of solutions for the edge. We are also leading MEC (Multi-Access Edge Computing) initiatives with some major telcos. By being part of LF Edge, we get access to the latest innovations and resources in edge computing, which helps us build our solutions to fit industry needs.

What advice would you give to someone considering joining LF Edge?

LF Edge is driving so many projects that the best place to start is to pick one that aligns with your company’s direction and see how your contributions can drive innovation for that project. There are many resources available, and community members are always happy to provide any information you need.

To find out more about LF Edge members or how to join, click here.

Additionally, if you have questions or comments, visit the LF Edge Slack to share your thoughts and engage with community members.

 

Finalists for the 2020 Edge Woman of the Year Award!

By Blog, State of the Edge

Written by Candice Digby, Partner and Events Manager at Vapor IO, an LF Edge member and active community member in the State of the Edge Project

Last year’s Edge Woman of the Year winner Farah Papaioannou is ready to pass the torch.

“I was honored to have been chosen as Edge Woman of the Year 2019 and to be recognized alongside many inspiring and innovative women across the industry,” said Farah Papaioannou, Co-Founder and President of Edgeworx, Inc. “I am thrilled to pay that recognition forward and participate in announcing this year’s Edge Woman of the Year 2020 finalist categories; together we have a lot to accomplish.”

(left to right) Matt Trifiro, Farah Papaioannou, Gavin Whitechurch

With more nominations in the 2nd annual competition, it was difficult for State of the Edge and Edge Computing World to select only ten top finalists. The Edge Woman of the Year 2020 nominees represent industry leaders in roles that are impacting the direction of their organization’s strategy, technology or communications around edge computing, edge software, edge infrastructure or edge systems.

The Edge Woman of the Year Award represents a long-term industry commitment to highlighting the growing importance of the contributions and accomplishments made by women in edge computing. The award is presented at the annual Edge Computing World event, which gathers the whole edge computing ecosystem, from network to cloud and application to infrastructure, end users and developers alike, to share edge best practices.

The annual Edge Woman of the Year Award is presented to female and non-binary professionals in edge computing for outstanding performance in roles that elevate the edge. The 2020 award committee selected the following ten finalists for their excellent work in the named categories:

  • Leadership in Edge Startups
    • Kathy Do, VP, Finance and Operations at MemVerge
  • Leadership in Edge Open Source Contributions
    • Malini Bhandaru, Open Source Lead for IoT & Edge at VMware
  • Leadership in Edge at a Large Organization
    • Jenn Didoni, Head of Cloud Portfolio at Vodafone Group Business
  • Leadership in Edge Security
    • Ramya Ravichandar, VP of Product Management at FogHorn
  • Leadership in Edge Innovation and Research
    • Kathleen Kallot, Director, AI Ecosystem, Arm
  • Leadership in Edge Industry and Technology
    • Fay Arjomandi, Founder and CEO, mimik technology, Inc.
  • Leadership in Edge Best Practices
    • Nurit Sprecher, Head of Management & Virtualization Standards, Nokia
  • Leadership in Edge Infrastructure
    • Meredith Schuler, Financial & Strategic Operations Manager, SBA Edge
  • Overall Edge Industry Leadership
    • Nancy Shemwell, Chief Operating Officer, Trilogy Networks, Inc.
  • Leadership in Executing Edge Strategy
    • Angie McMillin, Vice President and General Manager, IT Systems, Vertiv

The “Top Ten Women in Edge” finalists are selected from nominations submitted by experts in edge from around the world. The final winner will be chosen by a panel of industry judges. The winner of the Edge Woman of the Year 2020 will be announced during this year’s Edge Computing World, being held virtually October 12-15, 2020.

For more information on the Women in Edge Award visit: https://www.lfedge.org/2020/08/25/state-of-the-edge-and-edge-computing-world-announce-finalists-for-the-2020-edge-woman-of-the-year-award/

 

Akraino’s AI Edge-School/Education Video Security Monitoring Blueprint

By Akraino, Akraino Edge Stack, Blog, Use Cases

Written by Hechun Zhang, Staff Systems Engineer, Baidu; Akraino TSC member, and PTL of the AI Edge Blueprint; and Tina Tsou, Enterprise Architect, Arm and Akraino TSC Co-Chair

In order to support end-to-end edge solutions from the Akraino community, Akraino uses blueprint concepts to address specific edge use cases. A blueprint is a declarative configuration of the entire stack, i.e., an edge platform that can support edge workloads and edge APIs. To address specific use cases, the community develops a reference architecture.

The School/Education Video Security Monitoring Blueprint belongs to the AI Edge Blueprint family. It focuses on establishing an open source MEC platform combined with AI capabilities at the edge. In this blueprint, the latest technologies and frameworks, such as microservice frameworks, Kata Containers, 5G acceleration, and open APIs, have been integrated to build an industry-leading edge cloud architecture that provides comprehensive computing acceleration at the edge. With this MEC platform, Baidu has expanded AI implementation across products and services to improve safety and engagement in places such as factories, industrial parks, catering services, and classrooms that rely on AI-assisted surveillance.

Value Proposition

  • Establish an open-source edge infrastructure on which each member company can develop its own AI applications, e.g. video security monitoring.
  • Contribute use cases which help customers adopt video security monitoring, AI city, 5G V2X, and Industrial Internet applications.
  • Collaborate with members who can work together to figure out the next big thing for the industry.

Use cases

Improved Student-Teacher Engagement

 

By training deep learning models on video data from classrooms, school management can evaluate class engagement and analyze individual students’ concentration levels to improve teaching in real time.

Enhanced Factory Safety and Protection

Real-time monitoring helps detect factory workers who may have forgotten safety gear, such as helmets and safety gloves, to prevent hazardous accidents in the workplace. Companies can monitor safety in a comprehensive and timely way and use the findings as a reference for strengthening safety management.

Reinforced Hygiene and Safety in Catering

Through monitoring staff behavior in the kitchen, such as smoking breaks and cell phone use, this solution ensures the safety and hygiene of the food production process.

Advanced Fire Detection and Prevention

Linked and networked smoke detectors in densely populated places, such as industrial parks and community properties, can help quickly detect and alert authorities to fire hazards and accidents.

Network Architecture

OTE-Stack is an edge computing platform for 5G and AI. Through virtualization it abstracts heterogeneous hardware characteristics and provides unified access to the cloud edge, mobile edge, and private edge. For AI it provides low-latency, high-reliability, and cost-optimized computing support at the edge through the management and intelligent scheduling of multi-tier clusters. At the same time, OTE-Stack makes device-edge-cloud collaborative computing possible.

Baidu implemented the video security monitoring blueprint on Arm infrastructure, including cloud-edge servers, hardware accelerators, and custom CPUs designed for world-class performance. Arm and Baidu are members of the Akraino project and use an edge cloud reference stack of networking platforms and cloud-edge servers built on Arm Neoverse. The Arm Neoverse architecture supports a vast ecosystem of cloud-native applications and, combined with the AI Edge blueprint, provides an open source mobile edge computing (MEC) platform optimized for sectors such as safety, security, and surveillance.

“Open source has now become one of the most important cultures and strategies embraced by the global IT and Internet industries. As one of the world’s top Internet companies, Baidu has always maintained an enthusiastic attitude toward open source, actively contributing cutting-edge products and technologies to the Linux Foundation. Looking towards the future, Baidu will continue to adhere to the core strategy of open source and cooperate with partners to build a more open and improved ecosystem.” — Ning Liu, Director of AI Cloud Group, Baidu

In the 5G era, OTE-Stack has obvious advantages in the field of edge computing:

  • Large-scale and hierarchical cluster management
  • Support for third-party clusters
  • Lightweight cluster controller
  • Cluster autonomy
  • Automatic disaster recovery
  • Global scheduling
  • Support for multiple runtimes
  • Kubernetes-native support

For more information about this Akraino Blueprint, click here. For general information about Akraino Blueprints, click here.

Breaking Down the Edge Continuum

By Blog, State of the Edge, Trend, Use Cases

Written by Kurt Rinehart, Director of Data Science at Section. This blog originally ran on the Section website. For more content like this, please click here.

There are many definitions of “the edge” out there. Sometimes it can seem as if everyone has their own version.

LF Edge, an umbrella organization that brings together industry leaders to build “an open source framework for the edge,” has a number of edge projects under its remit, each of which seeks to unify the industry around common principles and thereby accelerate open source edge computing development. Part of its mission is to define what the edge is, an invaluable resource for the edge community to coalesce around.

Latest LF Edge White Paper: Sharpening the Edge

In 2018, State of the Edge (which recently became an official project of LF Edge) put out its inaugural report, defining the edge using four criteria:

  • “The edge is a location not a thing;
  • There are lots of edges, but the edge we care about today is the edge of the last mile network;
  • This edge has two sides: an infrastructure edge and a device edge;
  • Compute will exist on both sides, working in coordination with the centralized cloud.”

Since that inaugural report, much has evolved within the edge ecosystem. The latest white paper from LF Edge, Sharpening the Edge: Overview of the LF Edge Taxonomy and Framework, expands on these definitions and moves on from simply defining two sides (the infrastructure edge and the device edge) to using the concept of an edge continuum.

The Edge Continuum

The concept of the edge continuum describes the distribution of resources and software stacks between centralized data centers and deployed nodes in the field as “a path, on both the service provider and user sides of the last mile network.”

In almost the same breath, LF Edge also describes edge computing as essentially “distributed cloud computing, comprising multiple application components interconnected by a network.”

We typically think of “the edge” or “the edges” in terms of the physical devices or infrastructure where application elements run. However, the idea of a path between the centralized cloud (also referred to as “the cloud edge” or “Internet edge”) and the device edge instead allows for the conceptualization of multiple steps along the way.

The latest white paper concentrates on two main edge categories within the edge continuum: the Service Provider Edge and the User Edge (each of which is broken down into further subcategories).

edge continuum diagram
Image source: LF Edge

The Service Provider Edge and the User Edge

LF Edge positions devices at one extreme of the edge continuum and the cloud at the other.

Next along the line of the continuum after the cloud, also described as “the first main edge tier”, is the Service Provider (SP) Edge. Similarly to the public cloud, the infrastructure that runs at the SP Edge (compute, storage and networking) is usually consumed as a service. In addition to the public cloud, there are also cellular-based solutions at the SP Edge, which are typically more secure and private than the public cloud, as a result of the differences between the Internet and cellular systems. The SP Edge leverages substantial investments by Communications Service Providers (CSPs) into the network edge, including hundreds of thousands of servers at Points of Presence (PoPs). Infrastructure at this edge tier is largely more standardized than compute at the User Edge.

The second top-level edge tier is the User Edge, which is on the other side of the last mile network. It represents a wider mix of resources in comparison to the SP Edge, and “as a general rule, the closer the edge compute resources get to the physical world, the more constrained and specialized they become.” In comparison to the SP Edge and the cloud where resources are owned by these entities and shared across multiple users, resources at the User Edge tend to be customer-owned and operated.

Moving from the Cloud to the Edge

What do we mean when we talk about moving from the cloud to the edge? Each stage along the edge continuum takes you progressively closer to the end user. You have high latency and more compute in the centralized cloud versus low latency and less compute as you get closer to the User Edge. When we talk about moving from the cloud to the edge, it means we want to leverage the whole stack rather than focusing solely on the centralized cloud.

Let’s look at the most obvious use case: content delivery networks (CDNs). In the 1990s, Akamai created content delivery networks to allow localized websites to serve a global audience. A website based in New York could leverage Akamai’s distributed network of proxy servers and data centers around the world to store its static assets globally, including HTML, CSS, JavaScript, video, and images. By caching these in Akamai’s distributed global points of presence (PoPs), the website’s end users worldwide were guaranteed high availability and consistent performance.
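To make the caching mechanics concrete, here is a minimal sketch (in Python, purely for illustration) of what a single PoP essentially does: serve an asset from its local cache when it holds a fresh copy, and fetch it from the origin otherwise. The origin URL, port, and TTL are illustrative assumptions, not Akamai’s implementation.

```python
# Minimal sketch of the core idea behind a caching PoP: serve a local copy if it
# is still fresh, otherwise fetch the asset from the origin and remember it.
# Origin URL, port, and TTL are illustrative assumptions, not a real CDN policy.
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

ORIGIN = "https://example.com"   # hypothetical origin site (e.g. the New York website)
TTL_SECONDS = 300                # how long a cached asset is considered fresh
cache = {}                       # path -> (fetched_at, body, content_type)

class PopCacheHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        entry = cache.get(self.path)
        if entry is None or time.time() - entry[0] > TTL_SECONDS:
            # Cache miss (or stale copy): fetch from the origin and store it locally.
            with urllib.request.urlopen(ORIGIN + self.path) as resp:
                entry = (time.time(), resp.read(),
                         resp.headers.get("Content-Type", "application/octet-stream"))
            cache[self.path] = entry
        _, body, content_type = entry
        self.send_response(200)
        self.send_header("Content-Type", content_type)
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Every PoP runs the same code; users are routed to the nearest one (e.g. via DNS).
    ThreadingHTTPServer(("", 8080), PopCacheHandler).serve_forever()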

These days, CDNs are considered to be only one layer in a highly complex Internet ecosystem. Content owners such as media companies and e-commerce vendors continue to pay CDN operators to deliver their content to end users. In turn, a CDN pays ISPs, carriers, and network operators for hosting its servers in their data centers. That’s the Service Provider Edge we’re talking about.

An edge compute platform is still a geographically distributed network, but instead of simply providing proxy servers and data centers, an edge compute platform also offers compute. How do we define this? Compute can be defined as many things, but essentially, it boils down to the ability to run workloads wherever you need to run them. Compute still gives you high availability and performance, but it also adds the capability to run packaged and custom workloads positioned close to users.

An edge compute platform leverages all available compute between the cloud provider and the end user, together with DevOps practices, to deliver traditional CDN and custom workloads.

Applying Lessons from the Cloud to the Edge

We can take the lessons we’ve learned in the cloud and apply them to the edge. These include:

  • Flexibility – At Section, we describe this as wanting to be able to run “any workload, anywhere”, including packaged and customized workloads;
  • Taking a multi-provider approach to deployments – This offers the opportunity to create a higher layer of abstraction. Infrastructure as Code (IaC) is the process of managing and provisioning computer data centers through machine-readable definition files as opposed to physical hardware configuration or interactive configuration tools. At Section, we have 6-7 different providers, from cloud providers to boutique providers to bare metal providers.
  • Applying DevOps practices – In order to provide the capabilities that the cloud has at the infrastructure edge, we need to enable developers to get insight and to run things at the edge at speed, just as they did in the cloud. This is DevOps. It’s important to be able to apply DevOps practices here since, “if you build it, you own it”. You want to make things open, customizable, and API-driven with integrations, so that developers can leverage and build on top of them.
  • Leveraging containerized workloads – Deploying containers at the edge involves multiple challenges, particularly around connectivity, distribution and synchronization, but it can be done, and doing so allows you to leverage this architecture to deploy your own logic, not just pre-packaged workloads. Containerization also offers:
    • Security
    • Standardization
    • Isolation; and
    • A lightweight footprint.
  • Insights and Visibility – We need to give developers deep, robust insight into what’s happening at the edge, just as we do in the cloud. The three pillars of observability are logs, metrics and tracing. An ELK stack can provide this, giving developers the invaluable ability to understand what is happening when things inevitably go wrong; the sketch below illustrates the logging pillar.
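As a small illustration of that logging pillar, the following sketch emits JSON-structured log lines of the kind an ELK stack (or any log shipper) can index without extra parsing. The field names and the edge_node attribute are illustrative assumptions, not a required schema.

```python
# Minimal sketch: JSON-structured logging for an edge workload, so an ELK stack
# (or any log shipper) can index fields directly. Field names are illustrative.
import json
import logging
from datetime import datetime, timezone

class JsonFormatter(logging.Formatter):
    def format(self, record):
        return json.dumps({
            "ts": datetime.fromtimestamp(record.created, tz=timezone.utc).isoformat(),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            # Extra context attached via the logger's `extra=` argument; default if absent.
            "edge_node": getattr(record, "edge_node", "unknown"),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("edge-app")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Example: one request served at a hypothetical PoP.
logger.info("served cached asset /index.html", extra={"edge_node": "ams-pop-03"})
```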

Edge Computing Use Cases in the Wild

There are many examples of use cases already operating at the Edge. A few of the many interesting ones out there include:

  • Facebook Live – When you see a live stream in your feed and click on it, you are requesting the manifest. If the manifest isn’t already on your local PoP, the request travels to the data center to get the manifest, and then fetches the media files in 1 second clips. ML algorithms operate on the 1 second clips to optimize them in real time to deliver the best, fastest experience for users.
  • Cloudflare Workers – These are Service Worker API implementations for the Cloudflare platform. They deploy a server-side approach to running JavaScript workloads on Cloudflare’s global network.
  • Chick-fil-A – A surprising one. Chick-fil-A has been pushing into the device edge over the last couple of years. Each of its roughly 2,000 restaurants runs a Kubernetes cluster on site. The goal: “low latency, Internet-independent applications that can reliably run our business”, in addition to high availability for these applications, a platform that enables rapid innovation, and the ability to horizontally scale.

We’re Not Throwing Away the Cloud

One last thing to make clear: we’re not talking about throwing away the cloud. The cloud is going nowhere. We will be working alongside it, using it. What we’re talking about is moving the boundary of our applications out of the cloud closer to the end user, into the compute that is available there. And, as we’ve seen, we don’t need to throw away the lessons we’ve learned in the cloud; we can still use the tools that we’re used to, plus gain all the advantages that the edge continuum has to offer.

You can download the LF Edge taxonomy white paper here. You can also watch the LF Edge Taxonomy Webinar, which shares insight from the white paper, on our YouTube channel. Click here to watch it now.

MicroMEC now available with the Akraino R3 Release!

By Akraino, Akraino Edge Stack, Blog

Written by Tapio Tallgren, Technical Leader at Nokia Mobile Networks and Community Sub-Committee Chair of the Akraino TSC; Ferenc Szekely, Program Manager at SUSE and committer on the Akraino MicroMEC blueprint; and Tina Tsou, Enterprise Architect at Arm and Akraino TSC Co-Chair

The MicroMEC platform started life as a platform for running applications at the very edge of the network, such as in a light pole. We joined LF Edge’s Akraino project from the very beginning.

To find out what the use cases would be, we first participated in the IoThon hackathon in 2019, where we built a miniature city with sensors, cameras and small servers, also known as Raspberry Pis. Our plan was to provide APIs that let developers access the sensors, cameras, or other independent hardware devices attached to our small servers, i.e. the MicroMEC nodes. It was clear that we wanted to deploy all the APIs as well as the apps in containers, and we needed a tool like Kubernetes to help us build and manage the MicroMEC cluster. Because we targeted “small” devices, with at most 4 GB of RAM at that time and low power consumption, we looked into alternatives to k8s. That is how we picked k3s.

By the autumn of 2019 we had our lab running Raspberry Pi 3B+ and 4B nodes with k3s. We had a successful hackathon, Junction 2019, in Finland, where the teams presented solutions utilizing the MicroMEC cluster. We also added OpenFaaS Cloud (OFC) into the mix and a developer UI to the platform. This allowed developers to write serverless applications for the MicroMEC cluster and deploy them with ease. They could concentrate on their core business of developing apps, while MicroMEC with OFC took away the burden of cluster management, deployment, and so on.
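To give a feel for that developer experience, here is a minimal sketch of what such a serverless function can look like, following the handle(req) convention of OpenFaaS’s classic Python template. The node name and the sensor-reading logic are made up for illustration; they are not a real MicroMEC API.

```python
# handler.py -- minimal sketch of a serverless function for a MicroMEC/OpenFaaS
# setup, following the handle(req) convention of the classic OpenFaaS Python
# template. The sensor read below is a stand-in, not a real MicroMEC API.
import json
import random

def read_temperature_sensor():
    # Stand-in for reading a sensor attached to the MicroMEC node (e.g. over I2C/GPIO).
    return round(20.0 + random.random() * 5.0, 2)

def handle(req):
    """Entry point invoked by OpenFaaS; req is the raw request body."""
    return json.dumps({
        "node": "micromec-rpi4-01",            # hypothetical node name
        "temperature_c": read_temperature_sensor(),
    })
```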

Right after Junction, we were at the Akraino 5G MEC Hackathon in the USA. For this event MicroMEC had to become more “MEC”. This meant implementing the MEC-11 interfaces and a UI to manage the apps that our MEC-11 implementation made discoverable for customers near the MicroMEC cluster. The MEC cluster runs on Arm architecture-based hardware.

With all this activity, we missed the first two Akraino releases, but now we are very happy to join the Akraino R3 release! For this, we had to figure out the easiest way to install the stack on a device with an MMC card. The easiest way is to not install anything on the fragile card at all, but to boot the stack from a network server. Eventually we made all MicroMEC nodes boot from a network server using PXE, with each node’s storage attached via iSCSI. This requires a fast enough LAN, but thankfully cheap gigabit switches are widely available these days.

Learn more about Akraino here.

 

The Over the Edge Podcast

By Blog, State of the Edge

If you ask 100 people to define edge, you might get 112 different answers, but we do know this much: Edge computing represents a long-term transformation of the Internet that could take decades to fully materialize.

Over The Edge is a podcast about edge computing and those in the industry who are creating the future of the internet. On the show we talk to corporate leaders, open-source experts, technologists, journalists, analysts, and the community at large, to discuss technological innovations, trends, practical applications, business models, and the occasional far-flung theory. Over the Edge is brought to you by the sponsorship of Catchpoint, NetFoundry, Ori Industries, Packet, Seagate, Vapor IO, and Zenlayer.

Listen to the podcast here: OverTheEdgePodcast.com

Check out some of the LF Edge member interviews:

July 29 – Matt Trifiro, Vapor IO

July 29 – Galeal Zino, NetFoundry

July 29 – Jacob Smith, Packet

August 5 – Joe Zhu, Zenlayer

August 19 – Malini Bhandaru, VMware

August 26 – Jason Shepherd, ZEDEDA

 

EdgeX Foundry Welcomes New Contributors for Q2

By Blog, EdgeX Foundry

Written by Aaron Williams, LF Edge Developer Advocate

The second quarter has been really busy for the EdgeX community. We released Geneva and are working hard on Hanoi, our fall release. This release was made possible through the hard work of 52 community members contributing code in GitHub over the past three months. Over the past three years, EdgeX has enjoyed 117 unique contributors, and the community is continuously growing. We want to welcome and recognize our four first-time contributors from Q2.

We encourage our new contributors to keep up the great work and we look forward to their next contribution.  You are helping to improve and grow EdgeX and our community.

Q2 New Contributors’ Usernames:

nbfhscl

bill-mahoney

charles-knox-intel

wogsland

You can find these contributors on GitHub and see what other projects they are working on.

We would be remiss if we didn’t thank our other contributors who posted code, helped with documentation, or answered questions on our Slack workspace in Q2. We had over 80k lines of code committed by 50 unique developers (66 YTD) making 665 commits (1.3k YTD). And here are our top ten committers for the second quarter:

lenny-intel, tonyespy, ernestojeda, rsdmike, cherrycl, difince, lranjbar, iain-anderson, hahattan, jamesrgregg

You can find most of them on our Slack workspace (edgexfoundry.slack.com), where we have had over 2,000 messages from 101 members! On our Slack channels, you can ask questions and get help, or you can follow our working groups’ channels.

Do you want to get involved with EdgeX Foundry, the world’s first plug-and-play, ecosystem-enabled open platform for the IoT edge, or just learn more about the project and how to get started? Either way, visit our Getting Started page and you will find everything that you need to get going. We don’t just need developers; we could also use tech writers, translators, and many other disciplines.

EdgeX Foundry is an open source project hosted by LF Edge that is building a common open platform for IoT Edge computing. The interoperable platform enables an ecosystem of plug-and-play components that unifies the marketplace and accelerates the deployment of IoT solutions across a wide variety of industrial and enterprise use cases.

EdgeX is unique in its scope, broad industry support, credibility, investment, vendor-neutrality, and Apache 2.0 open source licensing model. As such, EdgeX is a key enabler of digital transformation for IoT Use Cases and businesses across many different vertical markets.

EdgeX offers all interested developers or companies the opportunity to collaborate on IoT solutions built using existing connectivity standards combined with their own proprietary innovations.
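For a feel of what working with the platform looks like, here is a minimal sketch that pings the core microservices of a local EdgeX deployment. It assumes a Geneva-era instance with the default v1 API ports; adjust ports and paths for your own deployment.

```python
# Minimal sketch: check that a local EdgeX deployment is up by pinging its core
# microservices. Assumes a Geneva-era instance with the default v1 ports; adjust
# for your own deployment.
import urllib.request

CORE_SERVICES = {
    "core-data": 48080,
    "core-metadata": 48081,
    "core-command": 48082,
}

for name, port in CORE_SERVICES.items():
    url = f"http://localhost:{port}/api/v1/ping"
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            print(f"{name}: {resp.read().decode().strip()}")   # expected reply: "pong"
    except OSError as exc:
        print(f"{name}: unreachable ({exc})")
```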

Visit the EdgeX Foundry website for more information or join our Slack to ask questions and engage with community members. If you are not already a member of our community, it is really easy to join. Simply visit our wiki page and/or check out our GitHub and help us get to the next 6 million downloads and beyond!


OpenAirInterface End User Case Study – Running 5G on Akraino’s KNI Provider Access Edge Blueprint

By Akraino Edge Stack, Blog, Use Cases

Written by Ricardo Noriega, Project Team Lead, Akraino Kubernetes Native Infrastructure Blueprint Family, and Raymond Knopp, Executive Committee member of the OpenAirInterface Alliance

Overview

Blueprints in the Akraino Kubernetes-Native Infrastructure (KNI) Blueprint Family leverage the best practices and tools from the Kubernetes community to declaratively manage edge computing stacks at scale and with a consistent, uniform user experience from the infrastructure up to the services and from developer environments to production environments, on bare metal or on public cloud.

One of the many use cases that the KNI blueprint family covers is the Provider Access Edge (PAE). The need to deploy mobile applications at the edge has been growing lately, and a platform is needed that can support deploying them on Kubernetes, based on Kubernetes tooling and declarative configuration from end to end.

The OpenAirInterface project fosters a community of industrial as well as research contributors for software and hardware development for the core network (EPC) and the access network and user equipment (EUTRAN) of 3GPP cellular networks. The OpenAirInterface Alliance has chosen the Akraino KNI PAE blueprint as the reference platform to develop, test and deploy its 4G and 5G open source mobile networks.

Key features on the Provider Access Edge blueprint

Telco/5G network functions are among the most demanding Kubernetes workloads, but they are not unique: customers in high-performance computing, high-frequency trading, industrial control, and other fields are asking for much the same capabilities.

This blueprint targets small footprint deployments able to host NFV (in particular vRAN) and MEC (e.g. AR/VR, machine learning, etc.) workloads. Its key features are:

  • Lightweight, self-managing clusters based on CoreOS and Kubernetes (OKD distro).
  • Support for VMs (via KubeVirt) and containers on a common infrastructure.
  • Application lifecycle management using the Operator Framework.
  • Support for multiple networks using Multus.
  • Support for high throughput interfaces using the SRIOV operator.
  • Support for real-time workloads.
  • Support for Stream Control Transmission Protocol (SCTP), as illustrated in the sketch below.
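SCTP support matters for vRAN because 3GPP control-plane interfaces such as S1AP and NGAP run over SCTP rather than TCP. As a minimal sketch, assuming a Linux node with kernel SCTP support, a workload on the cluster could open a one-to-one SCTP association like this (the peer address is hypothetical; 38412 is the standard NGAP port):

```python
# Minimal sketch: open a one-to-one SCTP association from a workload, the kind of
# connection 3GPP control-plane interfaces (S1AP/NGAP) rely on. Assumes a Linux
# node with kernel SCTP support, which is why the blueprint lists SCTP as a feature.
import socket

AMF_ADDR = ("192.0.2.10", 38412)   # hypothetical AMF endpoint; 38412 is the NGAP port

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM, socket.IPPROTO_SCTP)
try:
    sock.connect(AMF_ADDR)
    print("SCTP association established with", AMF_ADDR)
finally:
    sock.close()
```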

OpenAirInterface network deployment

The OpenAirInterface Alliance has made a great effort to move all the components that form a 4G/5G mobile network to the Kubernetes world. Building all the container images and writing the corresponding manifests to match a specific deployment model has been tremendous work.

To support the 5G network in a production-like deployment, we configured the OpenShift-based KNI PAE blueprint to segregate real-time and non-real-time compute workloads, as well as management, control, and data plane traffic, according to a dedicated logical deployment architecture.

Conclusion

5G is designed to bring high-throughput, low-latency connectivity to the enterprise world as well as to the regular consumer, enabling the use cases of the future such as IoT, autonomous cars, and many other applications deployed at the edge of the network. The Akraino Kubernetes Native Infrastructure blueprint family allows these very demanding workloads to run on top of it, and OpenAirInterface has chosen us as the reference platform.

References

https://www.openairinterface.org/docs/workshop/8_Fall2019Workshop-Beijing/Talks/2019-12-05-DEFOSSEUX.pdf

Learn more about OpenAirInterface here. Learn more about Akraino here.

LF Edge Member Spotlight: Mocana

By Blog, EdgeX Foundry, LF Edge, Member Spotlight

The LF Edge community comprises a diverse set of member companies and people that represent the IoT, Enterprise, Cloud and Telco Edge. The Member Spotlight blog series highlights these members and how they are contributing to and leveraging open source edge solutions. Today, we sat down with Dave Smith, President of Mocana, to discuss the importance of open source, collaborating with industry leaders in edge computing and security, how they leverage the EdgeX Foundry framework, and the impact of being a part of the LF Edge ecosystem.

Can you tell us a little about your organization?

Mocana revolutionizes OT and IoT with cyber protection as a service for trustworthy systems. The company helps device operators bridge the adoption challenge between vendors and service providers, and delivers key cybersecurity benefits to the emerging 5G network, edge computing applications, and SD-WAN enterprise networks. Mocana protects the content delivery supply chain and device lifecycle for tamper-resistance from manufacture to end of life, with root-of-trust and chain-of-trust anchors. Mocana measures devices for sustained integrity and the trustworthiness of operations and data to power artificial intelligence/machine learning analytics. The Mocana team of security professionals works with semiconductor vendors and certificate authorities to integrate with emerging technologies to comply with data privacy and protection standards. The goal of cyber protection as a service is to eliminate the initial cost of modernization for device vendors and empower service providers to offer subscription-based services for the effective and efficient expansion of corporate and industrial digital transformation strategies.

Mocana’s core technology protects more than 100 million devices today, and is trusted by more than 200 of the largest energy, government, healthcare, manufacturing, IoT, telecommunications & networking, and transportation companies globally.

Why is your organization adopting an open-source approach?

Mocana is eager to support the global body of customers adopting the EdgeX Foundry open source solution. OpenSSL is by far the most broadly integrated and implemented open source security stack. It is freely available and is distributed as part of the LF Edge distributions. However, in recent years OpenSSL has come under scrutiny because of critical security vulnerabilities and the resulting issuance of CVEs. The Heartbleed vulnerability from 2014 was a notable exploit, and several other recent CVEs have generated concern in the information security community. The strategy of taking a defensive position through ongoing patching of vulnerabilities continues to challenge efforts to protect mission-critical OT environments.
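As a concrete aside, a common first step when assessing exposure to OpenSSL CVEs on an edge node is simply to check which OpenSSL build the runtime actually links against and which TLS versions it will negotiate. A minimal sketch in Python (whose ssl module reports the OpenSSL build it was compiled against):

```python
# Minimal sketch: report the OpenSSL build the local Python runtime links against
# and the TLS versions its default client context will negotiate -- a quick first
# check when assessing exposure to OpenSSL CVEs on an edge node.
import ssl

print("Linked OpenSSL:", ssl.OPENSSL_VERSION)

ctx = ssl.create_default_context()
print("Minimum TLS version:", ctx.minimum_version.name)
print("Maximum TLS version:", ctx.maximum_version.name)
```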

Since the founding of the LF Edge projects, the goal has been to pull together a body of code to standardize microservices delivery and orchestration for edge computing systems and devices. The projects continue to advance commercial third-party solutions that address key functional areas, especially for mission-critical and vertical industry applications. Mocana’s solution is based upon a commercially supported, NIST FIPS 140-2 certified cryptographic module. Many of the company’s Fortune 500 customers have realized significant benefits from the ability to quickly migrate from default products integrated with OpenSSL to Mocana’s offering, leveraging its OpenSSL connector.

Why did you join LF Edge, and what sort of impact do you think LF Edge has on edge computing, networking, and IoT industries?

Developing, deploying, operating, and managing IoT and edge computing requires a community of key, forward-looking technology innovators. The IoT-edge ecosystem spans a wide supply chain from first silicon to the cloud, and includes system integrators, end-user operators and asset owners. Mocana was one of the first 50 founding members of EdgeX Foundry in 2017. Early on, the company took a leadership position by driving industry adoption through off-the-shelf solutions developed through stakeholder collaboration. This approach addressed a variety of common use cases delivered by new edge computing technologies and applications, and required much more than a reference architecture. Mocana recognized the need for the user community and developing ecosystem to leverage community-developed code (e.g., on GitHub) to reduce feature and software code duplication and enable the broadest possible market adoption. This benefits customers by reducing the implementation risk of such new technologies and accelerating community stakeholders’ time to market.

What do you see as the top benefits of being part of the LF Edge community?

Mocana values LF Edge’s ecosystem breadth and depth of community members and stakeholders, which includes chip companies, device ODMs, OEMs, carrier service providers, and asset owner/operators. Each contributes key use case challenges that have been invaluable for ensuring that LF Edge can support key technology developments and marketplace challenges.

What sort of contributions has your team made to the community and ecosystem through LF Edge participation?

As a key contributor to the community, Mocana worked with the EdgeX Foundry Security Working Group and offered insights and guidance on vital security use cases. The company ensured there was always a path to address developing cybersecurity mandates and best practices from the NIST Cybersecurity Framework and ISA/IEC 62443. As a result, the community has delivered a number of key security functions: it added a reverse proxy, provided a method to secure the key store with the ability to manage it, and integrated session-based security into the microservices.

Perhaps most important, Mocana has enabled the community to incorporate a scalable, robust, and commercially supported cybersecurity offering for EdgeX Foundry production development and deployments.

Mocana developed its OpenSSL connector to ease migration from default project configurations with OpenSSL to Mocana’s TrustCenter and TrustPoint offerings. This solution aligns well with the project’s objectives to accelerate adoption and deployments of standardized implementations addressing key edge computing use cases with microservices.

What do you think sets LF Edge apart from other industry alliances?

Delivering actual code that organizations can download, compile, run, and then operate is a tremendous benefit compared to most other industry alliances. It is a major differentiator in comparison to groups that only suggest frameworks and prescriptions of possible features, implementations, and suggested “best practices.”

How will LF Edge help your business?

Demand is growing for edge computing solutions. Hitting 5 million downloads of the EdgeX Foundry SDK in May is proof of that. Mocana also is beginning to see initial commercial success and adoption in the innovation and R&D centers of key community members. The company’s ability to enable its fully integrated TrustCenter and TrustPoint solutions, leveraging an OpenSSL connector, provides a clear and rapid path to EdgeX device security lifecycle management and supply chain provenance. Plus, it will increase adoption of Mocana’s latest edge device offerings by the community.

What advice would you give to someone considering joining LF Edge?

Find your niche in the one of LF Edge’s nine collaborative projects where your offering can deliver the most value, and contribute. There has never been a better time to participate in this open source community, which is looking for complementary solutions and ways to deepen the ecosystem.

To learn more about EdgeX Foundry, click here. To find out more about our members or how to join LF Edge, click here.

Additionally, if you have questions or comments, visit the LF Edge Slack or the EdgeX Foundry Slack to share your thoughts and engage with community members.