
LF Edge Member Spotlight: OSIsoft


The LF Edge community comprises a diverse set of member companies and people that represent the IoT, Enterprise, Cloud and Telco Edge. The Member Spotlight blog series highlights these members and how they are contributing to and leveraging open source edge solutions. Today, we sit down with Daniel Lazaro, Senior Technical Program Manager at OSIsoft, to discuss the importance of open source, collaborating with industry leaders in edge computing, their contributions to Fledge and the impact of being a part of the LF Edge ecosystem.

Can you tell us a little about your organization?

Since 1980, OSIsoft has focused on a single goal: getting operations data into the hands of industrial professionals so that they have the information they need to reduce costs, increase productivity and transform business. Decades ago, before the modern internet, big data, or AI arrived on the scene, the company’s flagship PI System software became known for breaking ground as a historian: a database used by engineers in an operational environment that captures streaming, time-series data that reflects the state of the physical equipment (assets).

Over time, the PI System has expanded to meet modern industrial needs, allowing not only operations staff but also executives, software developers, data scientists and others to understand, share, and make decisions based on highly curated operations data. With its addition of edge and cloud-based capabilities, the PI System now makes this essential data accessible, usable and valuable to people, systems and communities of customers, suppliers and partners across and beyond the enterprise.

Why is your organization adopting an open source approach?

Open source enables collaboration and integration of heterogeneous technologies across organizational boundaries. Moreover, it provides a platform for innovation to create solutions designed to address technical and business challenges such as those at the edge. Our CEO and founder Pat Kennedy saw the opportunity for an open source approach to address such challenges at the edge and started Dianomic. We believe that open source is the fast track to innovation. Industrial systems are unique in the number of protocols, data mappings and overall diversity. Open source can uniquely address these edge computing challenges by collaborating on code that all participants can access, modify and expand upon.

Why did you join LF Edge and what sort of impact do you think LF Edge has on the edge, networking, and IoT industries?

ZEDEDA introduced Dianomic to LF Edge before its inception. As a result, Dianomic and OSIsoft joined as founding members. The original idea was, and remains, to build a thriving open source industrial community. This is a challenge for the Linux Foundation in that industrial companies have not been traditional open source users. The operations (OT) side of the industrial market tends not to be software/compute experts; they are machine, manufacturing and process experts.

LF Edge curates several open source projects and a community around them that addresses the challenges of edge computing in a wide range of vertical markets at the edge of the network. This framework provides a collaboration platform for organizations to build non-differentiating infrastructure for solutions at the edge driven by inherent tradeoffs between the benefits of centralization and decentralization.

LF Edge plays a critical role in helping accelerate deployments of Industrial IoT, enabling and expanding visibility into previously untapped aspects of operations.

What do you see as the top benefits of being part of the LF Edge community?

In the LF Edge community, we see a group of like-minded organizations willing to work together. By collaborating through open source, we join forces to build the framework and ecosystem for the future of edge computing. Each project targets different pieces of the puzzle or building blocks to assemble in order to address the complexities encountered at the edge. Divide and conquer, focus, specialize and thrive. The community ecosystem provides learning and growing opportunities and a better together experience. At the same time, it enables exciting new revenue opportunities for new types of services and customers.

What sort of contributions has your team made to the community and ecosystem through LF Edge participation?

As a Premier member of LF Edge, OSIsoft actively influences the strategic and technical direction of LF Edge as a voting member of the Governing Board and Technical Advisory Council. OSIsoft brings its industrial perspective and expertise to LF Edge and contributes its vision by working with different committees and through public speaking at various LF Edge related events. Our contributions play an important role in nurturing and growing a community of end users across various industry verticals within LF Edge. The end user vertical solution group kicked off in October 2020 with presentations by industrial companies showcasing valuable use cases implementing solutions using LF Edge projects.

We believe that the industrial edge has a different set of requirements that are better addressed with a specialized approach tailored to its specific needs, namely Fledge. The thriving and growing Fledge community of industrials has contributed back code to the project already deployed in production environments today. This adds to the previous contributions by service providers, system integrators, OEMs such as Dianomic, OSIsoft and Google to name a few. Fledge started when Dianomic contributed the entire FogLAMP stack in winter of 2019. At that time, the code was in its 8th release and had been commercially deployed in energy, manufacturing and mining operations.

Fledge works closely with other LF Edge projects such as EVE and Akraino. EVE provides system and orchestration services and a container runtime for Fledge applications and services. Together, industrial operators can build, manage, secure and support both Supervisory Control and Data Acquisition (SCADA) and non-SCADA connected machines, IIoT and sensors as they scale. Fledge is also integrated with Akraino, as both projects support the rollout of 5G and private LTE networks.

What do you think sets LF Edge apart from other industry alliances?

Traditionally, alliances have focused on delivering recommendations, guidelines or standards. Instead, LF Edge focuses on delivering reference implementations in the form of quality open source software ready for adopters to integrate in their solutions. The strong communities of end users and developers around the software that customize, integrate, implement and contribute back to the projects set LF Edge apart.

How will LF Edge help your business?

The LF Edge framework lowers the barrier to adoption of edge computing solutions, translating into increased industrial implementations that enable new use cases that were not technically possible or cost effective before. This allows OSIsoft customers to rapidly expand their real-time data infrastructure to new systems and devices in industry and operations for greater visibility into operations and business, faster decisions and higher value.

Moreover, LF Edge enables the expansion to new market opportunities through technical solutions as well as its communities of end users and vertical solutions. LF Edge governance provides customers with confidence that the projects within the framework are developed with broad industry support and openness without vendor lock-in.

Finally, access to a large developer community and marketing efforts are opportunities to share resources and drive down costs.

What advice would you give to someone considering joining LF Edge?

Get familiar with the framework and ecosystem of projects. You can start by checking the website and reading the various resources available, white papers and documentation provided by the community. Identify the projects, groups and communities that align with your organization’s goals. Join the relevant groups and communities, mailing lists and calls; listen in and learn, and when you are ready, participate and contribute. If you identify gaps or have solutions that can enrich the current ecosystem, bring them on. Contributions come in many shapes, not just code, and they are the means to drive direction and influence within LF Edge.

To find out more about LF Edge members or how to join, click here. To learn more about Fledge, click here. To see use cases for Fledge, check out these videos. Additionally, if you have questions or comments, visit the LF Edge Slack Channel and share your thoughts in the #fledge, #fledge-help or #fledge-tsc channels.

Open Networking & Edge Summit: Distributed but always Connected


This year, hundreds of professionals spanning the telecom, IoT and edge industries came together at Open Networking and Edge Summit, which took place virtually September 28-30.

Hosted by the Linux Foundation, LF Edge and LF Networking, there were 1,778 registrations (1,322 for the live event and an additional 456 to date for post-event platform access to content and technical showcase). Those attendees that joined live hailed from 523 organizations in 71 countries around the globe with 54% from the United States, Canada, and Mexico. 

51% of attendees spent 10 or more hours on the virtual event platform, and over 68% of attendees spent 4 or more hours on the platform. 69% were first-time Open Networking & Edge Summit attendees. To learn more about the event, check out the 2020 post-event report here.

If you missed it, you can find the entire Open Networking & Edge Summit playlist here, which includes keynote presentations, lightning talks, in-depth tutorials, panel discussions and project and use case sessions.  

Arpit Joshipura, General Manager of Networking, IoT and Edge at the Linux Foundation, kicked off the event with a keynote that highlighted the five hard questions answered by LF Edge and LF Networking. 

  1. Why Open Source?
  2. Standards or Open Source?
  3. Why Contribute?
  4. POC to Production?
  5. Money?

Keynotes Day 1

LF Edge projects were featured in sessions:  

Your Path to Edge Computing with Akraino: https://youtu.be/_UCaQzachuM

Akraino TSC Co-Chairs Kandan Kathirvel (AT&T) and Tina Tsou (Arm) shared details about the Akraino R2 blueprints and R3 community goals, how to engage and contribute and demos of certain blueprints. 

How Akraino is Used (Panel Discussion): https://youtu.be/4kecTzrUdsI

Akraino TSC Co-Chair Tina Tsou (Arm), Sha Rahman (Facebook), Changming Bai (Alibaba), Mark Shan (Tencent) and Yongsu Zhang and Hechun Zhang (Bytedance) shared end user stories and opinions on how Akraino is used in 5G, the AI Edge, Connected Vehicle, mixed reality AR/VR, and Private LTE/5G.

Serverless in Akraino: https://youtu.be/7Bosql0T5K8

Tapio Tallgreen (Nokia) used the Akraino uMEC project at Junction, the biggest hackathon in Europe. The concept was a scaled-down version of a smart city, with sensors and servers running in light poles. All servers were in the same k3s cluster and connected to the Internet. He wanted to make it possible for developers to create applications that can run on the cluster, and to do it in 48 hours or less! This presentation details the journey to leveraging OpenFaaS Cloud for user management and what the team learned.

Self Checkout Theft Detection Showcase Using EdgeX Foundry: https://youtu.be/EQyQRFRsz0o

EdgeX Foundry Vertical Solutions Chair Henry Lau (HP) gave a presentation about how HP teamed up with other LF Edge members to demonstrate solving complex retail problems with an EdgeX Foundry powered HP Retail IoT Gateway. Using a self-checkout theft detection use case, the demo showcased the ability to bring together multiple sensor data streams using the EdgeX industry-leading open framework, which is cloud-agnostic and sensor-agnostic.

Building the Android for the IoT Edge with Project EVE: https://youtu.be/0lchg7slk1k

ZEDEDA’s Roman Shaposhnik and LF Edge Governing Board member Jason Shepherd highlighted the key challenges of edge computing, the unique requirements of the IoT edge and why EVE is critical for IoT scale by serving as an open, standardized edge computing engine. They also hosted a brief demo and talked about what’s next, including integrating EVE with Kubernetes to extend the benefits from the data center to the IoT edge.

Living on the Edge to Meet Today’s Demands (Panel Discussion): https://www.youtube.com/watch?v=rWWjZ4nEMfs&list=PLbzoR-pLrL6psbdaoF_E1pE-2dRhroc_T&index=72

In this panel, LF Edge Outreach Chair Balaji Ethirajulu (Ericsson), EdgeX Foundry Chair of the Security Committee Malini Bhandaru (VMware), Roman Shaposhnik (ZEDEDA) and Akraino TSC Co-Chair Tina Tsou (Arm) discussed:

  • Edge use cases being addressed to satisfy the industry need
  • Collaboration between the LF Edge Projects and scope of each project
  • How to engage and contribute to each project

How LF Edge Powers the IoT Vertical Across the Stack (Panel Discussion): https://youtu.be/nZSNYDwK3Xc 

In this panel, LF Edge Board member Thomas Nadeau (Red Hat), EdgeX Foundry Chair of the Security Committee Malini Bhandaru (VMware), LF Edge Governing Board member Jason Shepherd (ZEDEDA), LF Edge Governing Board member Daniel Lazaro (OSIsoft) and LF Edge Governing Board member Vikram Siwach (MobiledgeX) discussed contributions and cross-project synergies across Akraino, Baetyl, EdgeX Foundry, Fledge, Glossary, and Home Edge.

LF Edge/LF Networking Pavilion

As co-hosts, LF Edge and sister umbrella project LF Networking teamed up to present a pavilion at Open Networking and Edge Summit that showcased different technologies and use cases. Between 100 and 150 attendees visited the booths to download materials and watch demo videos. You can find the LF Edge demo videos on our YouTube channel.

Feedback from attendees was positive, with an overall average satisfaction rating of over 95%. 97% said they plan to attend a future Open Networking & Edge Summit, and 94% said they would recommend it to a friend or colleague. We’re in the process of planning next year’s event…stay tuned for more details!

Edge Excitement! Innovation & Collaboration (Q4 2020)


Written by Ryan Anderson, Member and Contributor of LF Edge’s Open Horizon Project and IBM Architect in Residence, CTO Group for IBM Edge

This article originally ran on Ryan’s LinkedIn page

Rapid innovation in edge computing

It is an exciting time for the edge computing community! Since my first post in April 2019, we are seeing a rapid acceleration of innovation driven by the convergence of multiple factors:

  • a convergence towards shared “mental models” in the edge solution space;
  • the increasing power of edge devices – pure CPU, as well as CPU plus GPU/VPU;
  • enormous investments in 5G infrastructures by network and communications stakeholders;
  • new collaborations across several IT/OT/IOT/Edge ecosystems;
  • increasing participation in, and support for, open source foundations such as LF Edge, by major players; and
  • widespread adoption of Kubernetes and Docker containers as a core layer of the edge.

With this convergence, innovation and accelerating adoption, Gartner’s prediction that 75% of enterprise-generated data will be created and processed at the edge appears prescient.

Edge nodes – from datacenters to devices 

Much like “AI” and “IT” – edge computing is a broad and nebulous term that means different things to different stakeholders. In the diagram below, we consider four points of view for edge:

  1. Industrial Edge
  2. Enterprise Network Cloud Edge
  3. 5G / Telco Edge
  4. Consumer and Retail Edge

This model illustrates a few key ideas:

  • Some edge use cases fall squarely within one quadrant – whereas others span two, or sometimes three.
  • Solution mapping will help shape architecture discussions and may inform which stakeholders should be involved in conversations.
  • Edge can mean very different things to different people; and consequently, value propositions (and ROI/KPI) will also vary dramatically.

Technology tools for next generation edge computing must be flexible enough to work across different edge quadrants and work across different types of edge nodes.

Terminology. And what is edge computing?  

At IBM our edge computing definition is “act on insights closer to where data is created.”

We define edge node generically as any edge device, edge cluster, or edge server/gateway on which computing (workload, applications, analytics) can be performed, outside of public or private cloud infrastructure, or the central IT data center.

An edge device is a special-purpose piece of equipment that also has compute capacity integrated into that device on which interesting work can be performed. An assembly machine on the factory floor, an ATM, an intelligent camera or a next-generation automobile are all examples of edge devices. It is common to find edge devices with ARM or x86 class CPUs with one or two cores, 128MB of memory, and perhaps 1 GB of local persistent storage.

Sometimes edge devices include GPUs (graphics processing units) and VPUs (vision processing units) – optimized chips that are very good for running AI models and inferencing on edge devices.

Fixed-function IoT equipment that lacks general open compute is not typically considered an edge node, but rather an IoT sensor. IoT sensors often interoperate with edge devices – but are not the same thing, as we see on the left side of this diagram.

 


An edge cluster is a general-purpose IT computer located in remote premises, such as a factory, retail store, hotel, distribution center or bank, for example – and typically used to run enterprise application workloads and shared services.

Edge nodes can also live within network facilities such as central offices, regional data centers and hub locations operated by a network provider, or a metro facility operated by a colocation provider.

An edge cluster is typically an industrial PC, or racked server, or an IT appliance.

Often, edge clusters include GPU/VPU hardware.

Tools for devices to data centers

IBM Edge Application Manager (IEAM) and Red Hat have created reference architectures and tools to manage workloads across CPU and GPU/VPU compute resources.

Customers want simplicity. IEAM can provide simplicity with a single pane of glass to manage and orchestrate workloads from core to edge, across multiple clouds.

For edge clusters running Red Hat OpenShift Container Platform (OCP), a Kubernetes-based GPU/VPU Operator solves the problem of needing unique operating system (OS) images for GPU and CPU nodes; instead, the GPU Operator bundles everything you need to support the GPU (the driver, container runtime, device plug-in, and monitoring), with deployment by a Helm chart. Now, a single gold master image covers both CPU and GPU nodes.

Caution: Avoid fragmentation and friction with open source

This is indeed an exciting time for the edge computing community, as seen by the acceleration of innovation and emerging use cases and architectures.

However, there is an area of concern as it relates to fragmentation and friction in this emerging space.

Because the emerging edge market is enormous, there is a risk that some incumbents or niche players may be tempted to “go it alone,” trying to secure and defend a small corner (fragment) of a large space with a proprietary solution. If too many stakeholders do this – edge computing may fail to reach its potential.

This approach can be dangerous for companies for three reasons:

(1)  While an isolated, walled-garden (defensive) approach may work in the short term, over time isolated technology stacks may get left behind.

(2)  Customers are increasingly wary of attempts at vendor lock-in and will source more flexible solutions.

(3)  Innovation is a team sport (e.g. Linux, Python).

Historically, emergent technologies can also encounter friction when key industry participants or standards organizations are not working closely enough together (GSM/CDMA; VHS/Beta or HD-DVD/Blu-ray; Industrial IOT; Digital Twins).

So, what can we do to encourage collaboration?

The answer is open source.

Open source to reduce friction and increase collaboration

The IBM Edge team believes working with and through the open source community is the right approach to help edge computing evolve and reach its potential in the coming years.

IBM has a long history and strong commitment to open source. IBM was one of the earliest champions of communities like Linux, Apache, and Eclipse, pushing for open licenses, open governance, and open standards.

IBM engineers began contributing to Linux and helped to establish the Linux Foundation in 2000. In 1999, we helped to create the Apache Software Foundation (ASF) and supported the creation of the Eclipse Foundation in 2004 – providing open source developers a neutral place to collaborate and innovate in the open.

Continuing our tradition of support for open source collaboration, IBM and Red Hat are active members of the Linux Foundation’s LF Edge:

  • LF Edge is an umbrella organization for several projects that aims to establish an open, interoperable framework for edge computing independent of hardware, silicon, cloud, or operating system.
  • By bringing together industry leaders, LF Edge will create a common framework for hardware and software standards and best practices critical to sustaining current and future generations of IoT and edge devices.
  • LF Edge fosters collaboration and innovation across multiple industries, including industrial manufacturing, cities and government, energy, transportation, retail, home and building automation, automotive, logistics and health care.

IBM is an active contributor to Open Horizon – one of the LF Edge projects – and the core of IBM Edge Application Manager; LF Edge’s Open Horizon is an open source platform for managing the service software lifecycle of containerized workloads and related machine learning assets. It enables autonomous management of applications deployed to distributed webscale fleets of edge computing nodes – clusters and devices based on Kubernetes and Docker – all from a central management hub.

Open Horizon is already working with several other LF Edge projects including EdgeX Foundry, Fledge and SDO (Secure Device Onboard).

SDO makes it easy and secure to configure edge devices and associate them with an edge management hub. Devices built with SDO can be added as an Edge Node by simply importing their associated ownership vouchers and then powering on the edge devices.

Additional Resources for Open Horizon

Open-Horizon documentation: https://open-horizon.github.io

Open-Horizon GitHub (source code): https://github.com/open-horizon

Example programs for Open-Horizon: https://github.com/open-horizon/examples

Open-Horizon Playlist on YouTube: https://bit.ly/34Xf0Ge

LF Edge Member Spotlight: Zenlayer


The LF Edge community comprises a diverse set of member companies and people that represent the IoT, Enterprise, Cloud and Telco Edge. The Member Spotlight blog series highlights these members and how they are contributing to and leveraging open source edge solutions. Today, we sit down with Jim Xu, Principal Engineer at Zenlayer, to discuss the importance of open source, collaborating with industry leaders in edge computing, their contributions to Akraino and the impact of being a part of the LF Edge ecosystem.

Can you tell us a little about your organization?

Zenlayer is an edge cloud service provider and global company headquartered in Los Angeles, Shanghai, Singapore, and Mumbai. Businesses utilize Zenlayer’s platform to deploy applications closer to the users and improve their digital experience. Zenlayer offers edge compute, software defined networking, and application delivery services in more than 180 locations on six continents.

Why is your organization adopting an open source approach?

Zenlayer has always relied on open source solutions. We strongly believe that open source is the right ecosystem for the edge cloud industry to grow. We connect infrastructure all over the globe. If each data center and platform integrates open-source software, it is much easier to integrate networks and make literal connections compared to a milieu of proprietary systems. Some of the open source projects we benefit from and support are Akraino Blueprints, ODL, Kubernetes, OpenNESS, DPDK, Linux, MySQL, and more.

Why did you join LF Edge and what sort of impact do you think LF Edge has on the edge, networking, and IoT industries?

We are a startup company in the edge cloud industry. LF Edge is one of the best open-source organizations both advocating for and building open edge platforms. The edge cloud space is developing rapidly, with continuous improvements in cloud technology, edge infrastructure, disaggregated compute, and storage options. Both impact and complexity go far beyond just cloud service providers, device vendors, or even a single traditional industry. LF Edge has helped build a community of people and companies from across industries, advocating for an open climate to make the edge accessible to as many users as possible.

What do you see as the top benefits of being part of the LF Edge community?

Our company has been a member of the LF Edge community for over a year now. Despite the difficulties presented by COVID-19, we have been able to enjoy being part of the Edge community. We interacted with people from a broad spectrum of industries and technology areas and learned some exciting use cases from the LF Edge community. This has helped us build a solid foundation for Zenlayer’s edge cloud services. 

What sort of contributions has your team made to the community and ecosystem through LF Edge participation?

We are proud to be part of the Edge Cloud community. Zenlayer is leading the Upstream subcommittee within Akraino and has invited multiple external communities such as ONF CORD, CNTT, TIP OCN, O-RAN and TARS to share our common interest in building the edge. We also contributed to the upstream requirements and reviews for Akraino releases.

What do you think sets LF Edge apart from other industry alliances?

LF Edge has a clear focus on edge cloud and a very healthy and strong governing board to ensure unbiased technological drive toward open systems. 

How will LF Edge help your business?

We hope LF Edge will continue to empower rapid customer innovation during the drive to edge cloud for video streaming, gaming, enterprise applications, IoT, and more. As a member of a fast-growing community, we also look forward to more interactions via conferences and social events (digital or in person as is safe) so we can continue to get to know and better understand each other’s needs and how we can help solve them. 

What advice would you give to someone considering joining LF Edge?

LF Edge is a unique community pushing for the best future for edge cloud. The group brings together driven people, a collaborative culture, and fast momentum. What you put in you receive back tenfold. Anyone interested in the future of the edge should consider joining, even if they do not yet know much about open source and its benefits. The community will value their inputs and be happy to teach or collaborate in return.

To find out more about LF Edge members or how to join, click here. To learn more about Akraino, click here. Additionally, if you have questions or comments, visit the LF Edge Slack Channel and share your thoughts in the #community or #akraino-tsc channels.

Pushing AI to the Edge (Part Two): Edge AI in Practice and What’s Next


Q&A with Jason Shepherd, LF Edge Governing Board member, Project EVE leader and VP of Ecosystem at ZEDEDA


This content originally ran on the ZEDEDA Medium Blog – visit their website for more content like this.

In Part One of this two-part Q&A series we highlighted the new LF Edge taxonomy that publishes next week and some key considerations for edge AI deployments. In this installment, our questions turn to emerging use cases and key trends for the future.

To discuss more on this budding space, we sat down with our Vice President of ecosystem development, Jason Shepherd, to get his thoughts on the potential for AI at the edge, key considerations for broad adoption, examples of edge AI in practice and some trends for the future.

What do you see as the most promising use cases for edge AI?

As highlighted in Part One, the reasons for deploying AI at the edge include balancing needs across the vectors of scalability, latency, bandwidth, autonomy, security and privacy. In a perfect world all processing would be centralized; however, this breaks down in practice, and the need for AI (and ML) at the edge will only continue to grow with the explosion of devices and data.

Hands down, computer vision is the killer app for edge AI today due to the bandwidth associated with streaming video. The ability to apply AI to “see” events in the physical world enables immense opportunity for innovation in areas such as object recognition, safety and security, quality control, predictive maintenance and compliance monitoring.

Consider retail: computer vision solutions will usher in a new wave of personalized services in brick and mortar stores that provide associates with real-time insights on current customers in addition to better informing longer term marketing decisions. Due to privacy concerns, the initial focus will be primarily around assessing shopper demographics (e.g., age, gender) and location, but increasingly we’ll see personalized shopping experiences based on individual identity with proper opt-in (often triggered through customer loyalty programs). This includes a trend for new “experiential” shopping centers, for which customers expect to give up some privacy when they walk in the door in exchange for a better experience.

While Amazon Go stores have led the trend for autonomous shopping environments, the use of computer-vision enabled self-service kiosks for grab-and-go checkout is growing rapidly overall. Given the recent health concerns with COVID-19, providers are rapidly shifting to making these solutions contactless by leveraging gesture control, instead of requiring interaction with a keypad or touch screen.

Computer vision use cases will often leverage sensor fusion, for example with barcode scans or radio-frequency identification (RFID) technology providing additional context for decision making in retail inventory management and point of sale (POS) systems. A camera can tell the difference between a T-shirt and a TV, but not the difference between large and medium sizes of the same shirt design. Still, perhaps eventually an AI model will be able to tell you if you have bad taste in clothing!
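To make the sensor fusion idea concrete, here is a minimal Python sketch of joining a camera detection with RFID reads from the same shelf zone; every field name here (`zone`, `item_class`, `sku`) is an illustrative assumption, not part of any real retail API.

```python
def fuse(detection, rfid_reads):
    """Join a camera detection with RFID reads from the same shelf zone.

    The camera supplies the item class ("a T-shirt"); the matching RFID
    tag supplies the attributes vision cannot distinguish, such as the
    exact SKU and size. All field names are hypothetical.
    """
    for tag in rfid_reads:
        if (tag["zone"] == detection["zone"]
                and tag["item_class"] == detection["item_class"]):
            return {**detection, "sku": tag["sku"], "size": tag["size"]}
    return detection  # no matching tag: fall back to vision-only data

camera = {"zone": "A3", "item_class": "t-shirt"}
tags = [{"zone": "A3", "item_class": "t-shirt", "sku": "TS-001-M", "size": "M"}]
print(fuse(camera, tags))
```

A real inventory or POS system would of course match on tag read timestamps and confidence scores rather than a simple equality join, but the shape of the fusion step is the same.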

Another key vertical that will benefit from computer vision at the edge is healthcare. At ZEDEDA, we’ve worked with a global provider that leverages AI models in their medical imaging machines, located within hospitals and provided as a managed service. In this instance, the service provider doesn’t own the network on which their machines are deployed so they need a zero-trust security model in addition to the right tools to orchestrate their hardware and software updates.

Another example where bandwidth drives a need for deploying AI at the IoT Edge is vibration analysis as part of a use case like predictive maintenance. Here, sampling rates of at least 1 kHz are common and can increase to 8–10 kHz and beyond, because these higher resolutions improve visibility into impending machine failures. This represents a significant amount of continuously streaming data that is cost-prohibitive to send directly to a centralized data center for analysis. Instead, inferencing models will commonly be deployed on compute hardware proximal to machines to analyze the vibration data in real time, backhauling only events that highlight an impending failure.
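As a rough illustration of that filtering step, the sketch below scans a vibration stream in fixed windows and backhauls only the windows whose RMS amplitude crosses a threshold. The window size and threshold are made-up values, and a production system would run a trained inferencing model rather than a simple RMS test.

```python
import math

def rms(window):
    """Root-mean-square amplitude of one window of samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def events_to_backhaul(samples, window_size, threshold):
    """Scan a vibration stream in fixed windows and keep only windows
    whose RMS exceeds the alert threshold, so that only a small
    fraction of the raw stream ever leaves the edge node."""
    events = []
    for start in range(0, len(samples) - window_size + 1, window_size):
        level = rms(samples[start:start + window_size])
        if level > threshold:
            events.append({"offset": start, "rms": round(level, 3)})
    return events

# Simulated 1 kHz stream: mostly quiet, with one second of strong vibration
quiet, burst = [0.01] * 3000, [0.5, -0.5] * 500
print(events_to_backhaul(quiet + burst + quiet, window_size=1000, threshold=0.1))
```

Only the one anomalous window is reported, which is the bandwidth saving the paragraph above describes: kilohertz-rate raw samples stay local, and a handful of event records go upstream.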

Analysis for predictive maintenance will also commonly leverage sensor fusion by combining this vibration data with measurements for temperature and power (voltage and current). Computer vision is also increasingly being used for this use case: for example, the subtle wobble of a spinning motor shaft can be detected with sufficient camera resolution, and heat can be measured with thermal imaging sensors. Meanwhile, last I checked voltage and current can’t be checked with a camera!

An example of edge AI served up by the Service Provider Edge is cellular vehicle-to-everything (C-V2X) use cases. While latency-critical workloads such as controlling steering and braking will always run inside a vehicle, service providers will leverage AI models deployed on compute proximal to small cells in a 5G network within public infrastructure to serve up infotainment, augmented reality for vehicle heads-up displays, and traffic coordination. For the latter, these AI models can warn two cars that they are approaching a potentially dangerous situation at an intersection and even alert nearby pedestrians via their smartphones. As we continue to collaborate on foundational frameworks that support interoperability, it will open up possibilities to leverage more and more sensor fusion that bridges intelligence across different edge nodes, driving even more informed decisions.

We’re also turning to processing at the edge to minimize data movement and preserve privacy. When AI inferencing models are shrunk for use in constrained connected products or healthcare wearables, local models can redact PII before data is sent to centralized locations for deeper analysis.
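As an illustration of the pipeline shape only (a toy regex stand-in, not an actual edge inferencing model), an edge-side redaction step might look like:

```python
import re

# Illustrative patterns; a real deployment would use a trained model and a
# much broader set of PII detectors.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(record: str) -> str:
    """Replace recognizable PII with placeholder tags before the record
    leaves the edge device for centralized analysis."""
    for label, pattern in PATTERNS.items():
        record = pattern.sub(f"[{label}]", record)
    return record

print(redact("Patient jane.doe@example.com, SSN 123-45-6789, HR 72 bpm"))
```

The key design point is that the raw record never leaves the device; only the scrubbed version is forwarded upstream.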

Who are the different stakeholders involved in accelerating adoption of edge AI?

Edge AI requires the efforts of a number of different industry players coming together. We need hardware OEMs and silicon providers for processing; cloud scalers to provide tools and datasets; telcos to manage the connectivity piece; software vendors to help productize frameworks and AI models; domain-expert system integrators to develop industry-specific models; and security providers to ensure the process is secure.

In addition to having the right stakeholders, it’s about building an ecosystem based on common, open frameworks for interoperability, with investment focused on the value added on top. Today there is a confusing plethora of choices for platforms and AI tool sets, but this reflects the state of the market more than necessity. A key point of efforts like LF Edge is to work in the open source community to build more open, interoperable, consistent and trusted infrastructure and application frameworks so developers and end users can focus on the surrounding value add. Throughout the history of technology, open interoperability has always won out over proprietary strategies when it comes to scale.

In the long run, the most successful digital transformation efforts will be led by organizations that have the best domain knowledge, algorithms, applications and services, not those that reinvent foundational plumbing. This is why open source software has become such a critical enabler across enterprises of all sizes — facilitating the creation of de-facto standards and minimizing “undifferentiated heavy lifting” through a shared technology investment. It also drives interoperability which is key for realizing maximum business potential in the long term through interconnecting ecosystems… but that’s another blog for the near future!

How do people differentiate with AI in the long term?

Over time, AI software frameworks will become more standardized as part of foundational infrastructure, and the algorithms, domain knowledge and services on top will be where developers continue to meaningfully differentiate. We’ll see AI models for common tasks — for example assessing the demographics of people in a room, detecting license plate numbers, recognizing common objects like people, trees, bicycles and water bottles — become commodities over time. Meanwhile, programming to specific industry contexts (e.g. a specific part geometry for manufacturing quality control) will be where value is continually added. Domain knowledge will always be one of the most important aspects of any provider’s offering.

What are some additional prerequisites for making edge AI viable at scale?

In addition to having the right ecosystem including domain experts that can pull solutions together, a key factor for edge AI success is having a consistent delivery or orchestration mechanism for both compute and AI tools. The reality is that to date many edge AI solutions have been lab experiments or limited field trials, not yet deployed and tested at scale. PoC, party of one, your table is ready!

Meanwhile, as organizations start to scale their solutions in the field they quickly realize the challenges. From our experience at ZEDEDA, we consistently see that manual deployment of edge computing using brute-force scripting and command-line interface (CLI) interaction becomes cost-prohibitive for customers at around 50 distributed nodes. In order to scale, enterprises need to build on an orchestration solution that takes into account the unique needs of the distributed IoT edge in terms of diversity, resource constraints and security, and helps admins, developers and data scientists alike keep tabs on their deployments in the field. This includes having visibility into any potential issues that could lead to inaccurate analyses or total failure. Further, it’s important that this foundation is based on an open model to maximize potential in the long run.

Where is edge AI headed?

To date, much of the exploration involving AI at the edge has been focused on inferencing models — deployed after these algorithms have been trained with the scalable compute of the cloud. (P.S. for those of you who enjoy a good sports reference, think of training vs. inference as analogous to coaching vs. playing).

Meanwhile, we’re starting to see training and even federated learning selectively moving to the Service Provider and User Edges. Federated learning is an evolving space that seeks to balance the benefits of decentralization for reasons of privacy, autonomy, data sovereignty and bandwidth savings, while centralizing results from distributed data zones to eliminate regional bias.

The industry is also increasingly developing purpose-built silicon that can increase efficiencies amid power and thermal constraints in small devices, and even support either training or inference; this corresponds with the shift toward pushing more and more AI workloads onto edge devices. Because of this, it’s important to leverage device and application orchestration tools that are completely silicon-agnostic, as opposed to offerings from silicon makers that have a vested interest in locking you into their ecosystem.

Finally, we’ll see the lower boundary for edge AI increasingly extend into the Constrained Device Edge with the rise of “Tiny ML” — the practice of deploying small inferencing models optimized for highly constrained, microcontroller-based devices. An example of this is the “Hey Alexa” of an Amazon Echo that is recognized locally and subsequently opens the pipe to the cloud-based servers for a session. These Tiny ML algorithms will increasingly be used for localized analysis of simple voice and gesture commands, common sounds such as a gunshot or a baby crying, assessing location and orientation, environmental conditions, vital signs, and so forth.

To manage all of this complexity at scale, we’ll lean heavily on industry standardization, which will help us focus on value on top of common building blocks. Open source AI interoperability projects, such as ONNX, show great promise in helping the industry coalesce around a format so that others can focus on developing and moving models across frameworks and from cloud to edge. The Linux Foundation’s Trust over IP effort and emerging Project Alvarium will also help ease the process of transporting trusted data from devices to applications. This notion of pervasive data trust will lead to what I call the “Holy Grail of Digital” — selling and/or sharing data resources and services to/with people you don’t even know. Now this is scale!

In Closing

As the edge AI space develops, it’s important to avoid being locked into a particular tool set, instead opting to build a future-proofed infrastructure that accommodates a rapidly changing technology landscape and that can scale as you interconnect your business with other ecosystems. Here at ZEDEDA, our mission is to provide enterprises with an optimal solution for deploying workloads at the IoT Edge where traditional data center solutions aren’t applicable, and we’re doing it based on an open, vendor-neutral model that provides freedom of choice for hardware, AI framework, apps and clouds. We’re even integrating with major cloud platforms such as Microsoft Azure to augment their data services.

Reach out if you’re interested in learning more about how ZEDEDA’s orchestration solution can help you deploy AI at the IoT Edge today while keeping your options open for the future. We also welcome you to join us in contributing to Project EVE within LF Edge which is the open source foundation for our commercial cloud offering. The goal of the EVE community is to build the “Android of the IoT Edge” that can serve as a universal abstraction layer for IoT Edge computing — the only foundation you need to securely deploy any workload on distributed compute resources. To this end, a key next step for Project EVE is to extend Kubernetes to the IoT Edge, while taking into account the unique needs of compute resources deployed outside of secure data centers.

The success of AI overall — and especially edge AI — will require our concerted collaboration and alignment to move the industry forward while protecting us from potential misuse along the way. The future of technology is about open collaboration on undifferentiated plumbing so we can focus on value and build increasingly interconnected ecosystems that drive new outcomes and revenue streams. As one political figure famously said — “it takes a village!”

If you have questions or would like to chat with leaders in the project, join us on the LF Edge Slack (#eve or #eve-help) or subscribe to the email list. You can check out the documentation here.

On the “Edge” of Something Great

By Akraino, Announcement, Baetyl, Blog, EdgeX Foundry, Fledge, Home Edge, LF Edge, Open Horizon, Project EVE, Secure Device Onboard, State of the Edge

As we kick off Open Networking and Edge Summit today, we are celebrating the edge by sharing the results of our first-ever LF Edge Member Survey and insight into what our focuses are next year.

LF Edge, which will celebrate its 2nd birthday in January 2021, sent the survey to our more than 75 member companies and liaisons. The survey featured about 15 questions that collected details about open source and edge computing, how members of the LF Edge community are using edge computing and what project resources are most valuable. 

Why did you choose to participate in LF Edge?

The Results Are In

The Top 3 reasons to participate in LF Edge are market creation and adoption acceleration, collaboration with peers and industry influence. 

  • More than 71% joined LF Edge for market creation and adoption acceleration
  • More than 57% indicated they joined LF Edge for business development
  • More than 62% have either deployed products or services based on LF Edge Projects or have them planned for later this year, next year or within the next 3-5 years

Have you deployed products or services based on LF Edge Projects?

This feedback corresponds with what we’re seeing in some of the LF Edge projects. For example, our Stage 3 Projects Akraino and EdgeX Foundry are already being deployed. Earlier this summer, Akraino launched its Release 3 (R3) that delivers a fully functional open source edge stack that enables a diversity of edge platforms across the globe. With R3, Akraino brings deployments and PoCs from a swath of global organizations including Aarna Networks, China Mobile, Equinix, Futurewei, Huawei, Intel, Juniper, Nokia, NVIDIA, Tencent, WeBank, WiPro, and more. 

Additionally, EdgeX Foundry surpassed 7 million container downloads last month, and its global ecosystem of complementary products and services continues to grow. As a result, EdgeX Foundry is seeing more end-user case studies from big companies like Accenture, ThunderSoft and Jiangxing Intelligence.

Have you gained insight into end user requirements through open collaboration?


Collaboration with peers

The edge today is a solution-specific story. Equipment and architectures are purpose-built for specific use cases, such as 5G and network function virtualization, next-generation CDNs and cloud, and streaming games. That is why collaboration is key, and why more than 70% of respondents said they joined LF Edge to collaborate with peers. Here are a few activities at ONES that showcase cross-project and member collaboration.

Additionally, LF Edge created an LF Edge Vertical Solutions Group that is working to enable easily customized deployments based on market/vertical requirements. In fact, we are hosting an LF Edge End User Community Event on October 1 that provides a platform for discussing the use of LF Edge Projects in real-world applications. The goal of these sessions is to educate the LF Edge community (both new and existing) to make sure we appropriately tailor the output of our project collaborations to meet end user needs. Learn more.

Industry Influence

More than 85% of members indicated they have gained insights into end user requirements through open collaboration. A common definition of the edge is gaining momentum. Community efforts such as LF Edge and State of the Edge’s assets, the Open Glossary of Edge Computing, and the Edge Computing Landscape are providing cohesion and unifying the industry. In fact, LF Edge members across all nine projects collaborated to create an industry roadmap that is being supported by global tech giants and start-ups alike.


Where do we go from here? 

When asked, LF Edge members didn’t hold back. They want more. They want to see more of everything – cross-project collaboration, end user events and communication, use cases, and open source collaboration with other liaisons. As we head into 2021, LF Edge will continue to lay the groundwork in markets like cloud native, 5G, and edge for more open deployments and collaboration.


LF Edge Member Spotlight: NetFoundry

By Blog, EdgeX Foundry, LF Edge, Member Spotlight

The LF Edge community comprises a diverse set of member companies and people that represent the IoT, Enterprise, Cloud and Telco Edge. The Member Spotlight blog series highlights these members and how they are contributing to and leveraging open source edge solutions. Today, we sit down with Jim Clardy, Co-Founder and Global Cloud Partners and Alliances at NetFoundry, to discuss the importance of open source, collaborating with industry leaders in edge computing and the impact of being a part of the LF Edge ecosystem.

Please tell us a little about your organization.

NetFoundry provides the leading zero trust networking platform, offered as Network-as-a-Service (NaaS), to connect distributed applications, users, devices and locations through an optimized global fabric. This enables solutions and applications, ranging from edge to cloud, to easily embed zero trust networking. Developers can embed secure, programmable, private, application-specific networking into their apps using the open source Ziti software (Ziti.dev), which NetFoundry built and is the leading contributor to.


Why is your organization adopting an open source approach?

NetFoundry is built on open source Ziti. The next paradigm in networking is “networking as code” and zero trust. With the open source Ziti SDKs, developers can embed private networking into apps with a few lines of code. Ziti enables a new networking paradigm that greatly reduces the cost and complexity of networking and implements zero-trust, application-embedded connectivity. Ziti is the leading open source platform for creating zero trust network connectivity over the Internet.

Why did you join LF Edge and what sort of impact do you think it has on the industry?

We believe open source communities have the power to shape technologies and markets. In addition to LF Edge, we are members of the Linux Foundation, EdgeX Foundry, and CNCF communities.

What do you see as the top benefits of being part of the LF Edge community?

Accelerating the next paradigm in networking where networking as code and zero trust become ubiquitous. We believe networking will be transformed with cloud-orchestrated interoperability fueled by open source communities like LF Edge.

What contributions has your team made (or plans to make) to the community/ecosystem through LF Edge participation?

NetFoundry built and is the leading contributor to open source Ziti software, and we are excited to build the open Ziti community. NetFoundry is contributing code to open Ziti regularly.

What do you think sets LF Edge apart from other industry alliances?

You are able to draw on the Linux Foundation and related ecosystem of communities and contributors – there is a massive and unstoppable network effect created by LF Edge.

How might LF Edge help your business?

Accelerate the development of the Ziti project and community.


What advice would you give to someone considering joining the LF Edge community?

Don’t wait – do it today.

Learn more about NetFoundry here.

Learn more about open Ziti here.

Get started with Ziti on GitHub.

To find out more about our members or how to join LF Edge, click here. Additionally, if you have questions or comments, visit the LF Edge Slack to share your thoughts and engage with community members.


LF Edge Demos at Open Networking & Edge Summit

By Blog, EdgeX Foundry, Event, Fledge, LF Edge, Open Horizon, Project EVE, Secure Device Onboard

Open Networking & Edge Summit, which takes place virtually on September 28-30, is co-sponsored by LF Edge, the Linux Foundation and LF Networking. With thousands expected to attend, ONES will be the epicenter of edge, networking, cloud and IoT. If you aren’t registered yet – it takes two minutes to register for US$50 – click here.

Several LF Edge members will be at the conference leading discussions about trends, presenting use cases and sharing best practices. For a list of LF Edge-focused sessions, click here and add them to your schedule. LF Edge will also host a pavilion – in partnership with our sister organization LF Networking – that will showcase demos, including the debut of two new ones featuring collaborations between Project EVE and Fledge, and between Open Horizon and Secure Device Onboard. Check out a sneak peek of the demos below:

Managing Industrial IoT Data Using LF Edge (Fledge, EVE)

Presented by FLIR, Dianomic, OSIsoft and ZEDEDA, and making its debut at ONES, this demo showcases the strength of Project EVE and Fledge. The demo will show how the two open source projects work together to securely manage, connect, aggregate, process, buffer and forward any sensor, machine or PLC’s data to existing OT systems and any cloud. Specifically, it will show a FLIR IR camera’s video and data feeds being managed as described.


Real-Time Sensor Fusion for Loss Detection (EdgeX Foundry):

Presented by LF Edge members HP, Intel and IOTech, this demo showcases the strength of the Open Retail Initiative and EdgeX Foundry. Learn how different sensor devices can use LF Edge’s EdgeX Foundry open middleware framework to optimize retail operations and detect loss at checkout. The sensor fusion is implemented using a modular approach, combining point-of-sale, computer vision, RFID and scale data into a POC for loss prevention.

This demo was featured at the National Retail Federation Show in January. More details about the demo can be found in HP’s blog and Intel’s blog.


Low-touch automated onboarding and application delivery with Open Horizon and Secure Device Onboard

Presented by IBM and Intel, this demo features two of the newest projects accepted into the LF Edge ecosystem – Secure Device Onboard was announced in July while Open Horizon was announced in April.

An OEM or ODM can generate a voucher with SDO utilities that is tied to a specific device and, upon purchase, send the voucher to the purchaser. With LF Edge’s Open Horizon and Secure Device Onboard integration, an administrator can load the voucher into Open Horizon and pre-register the device. Once the device is powered on and connected to the network, it will automatically authenticate, download and install the Open Horizon agent, and begin negotiating to receive and run relevant workloads.

For more information about ONES, visit the main website: https://events.linuxfoundation.org/open-networking-edge-summit-north-america/. 

Pushing AI to the Edge (Part One): Key Considerations for AI at the Edge

By Blog, LF Edge, Project EVE, State of the Edge, Trend

Q&A with Jason Shepherd, LF Edge Governing Board member and VP of Ecosystem at ZEDEDA

This content originally ran on the ZEDEDA Medium Blog – visit their website for more content like this.

This two-part blog provides more insights into what’s becoming a hot topic in the AI market — the edge. To discuss more on this budding space, we sat down with our Vice President of ecosystem development, Jason Shepherd, to get his thoughts on the potential for AI at the edge, key considerations for broad adoption, examples of edge AI in practice and some trends for the future.


Chart defining the categories within the edge, as defined by LF Edge

Image courtesy of LF Edge

LF Edge’s Akraino Project Release 3 Now Available, Unifying Open Source Blueprints Across MEC, AI, Cloud and Telecom Edge

By Akraino, Announcement, LF Edge

    • 6 new R3 blueprints (total of 20) covering use cases across Telco, Enterprise, IoT, Cloud and more
    • Akraino blueprints cover areas including MEC, AI/ML, Cloud, Connected Vehicle, AR/VR, Android Cloud Native, SmartNICs, and Telco Core & Open-RAN, with ongoing support for R1-R2 blueprints and more
    • Community delivers open edge API specifications – to standardize across devices, applications (cloud native), orchestration, and multi-cloud – via a new white paper

SAN FRANCISCO – August 12, 2020 – LF Edge, an umbrella organization within the Linux Foundation that aims to establish an open, interoperable framework for edge computing independent of hardware, silicon, cloud, or operating system, today announced the availability of Akraino Release 3 (“Akraino R3”). Akraino’s third and most mature release to date delivers fully functional edge solutions – implemented across global organizations – to enable a diversity of edge deployments across the globe. New blueprints include a focus on MEC, AI/ML, and Cloud edge. In addition, the community authored the first iteration of a new white paper to bring common open edge API standards to align the industry.

Launched in 2018, and now a Stage 3 (or “Impact” stage) project under the LF Edge umbrella, Akraino Edge Stack delivers an open source software stack that supports a high-availability cloud stack optimized for edge computing systems and applications. Designed to improve the state of carrier edge networks, edge cloud infrastructure for enterprise edge, and over-the-top (OTT) edge, it enables flexibility to scale edge cloud services quickly, maximize applications and functions supported at the edge, and to improve the reliability of systems that must be up at all times. 

“Akraino has evolved to unify edge blueprints across use cases,” said Arpit Joshipura, general manager, Networking, Automation, Edge and IoT, the Linux Foundation. “With a growing set of blueprints that enable more and more use cases, we are seeing the power of open source impact every aspect of the edge and how the world accesses and consumes information.”  

About Akraino R3

Akraino Release 3 (R3) delivers a fully functional open source edge stack that enables a diversity of edge platforms across the globe. With R3, Akraino brings deployments and PoCs from a swath of global organizations including Aarna Networks, China Mobile, Equinix, Futurewei, Huawei, Intel, Juniper, Nokia, NVIDIA, Tencent, WeBank, WiPro, and more.

Akraino enables innovative support for new levels of flexibility that scale 5G, industrial IoT, telco, and enterprise edge cloud services quickly, by delivering community-vetted and tested edge cloud blueprints to deploy edge services.  New use cases and new and existing blueprints provide an edge stack for Connected Vehicle, AR/VR, AI at the Edge, Android Cloud Native, SmartNICs, Telco Core and Open-RAN, NFV, IOT, SD-WAN, SDN, MEC, and more. 

Akraino R3 includes six new blueprints for a total of 20, all tested and validated in real hardware labs supported by users and community members. The Akraino community has established full-stack, automated testing with strict community standards to ensure high-quality blueprints.

The 20 “ready and proven” blueprints include both updates and long-term support to existing R1 & R2 blueprints, and the introduction of six new blueprints:

      • The AI Edge – School/Education Video Security Monitoring
      • 5G MEC/Slice System – Supports Cloud Gaming, HD Video, and Live Broadcasting
      • Enterprise Applications on Lightweight 5G Telco Edge (EALTEdge)
      • Micro-MEC (Multi-access Edge Computing) for Smart City Use Cases
      • IEC Type 3: Android Cloud Native Applications on Arm®-based Servers on the Edge
      • IEC Type 5: SmartNIC – Edge Hardware Acceleration

More information on Akraino R3, including links to documentation, code, installation docs for all Akraino Blueprints from R1-R3, can be found here. For details on how to get involved with LF Edge and its projects, visit https://www.lfedge.org/

API White Paper

The Akraino community published the first iteration of a new white paper to bring common open edge API standards to the industry. The white paper makes available, for the first time, generic edge APIs for developers to standardize across devices, applications (cloud native), orchestration, and multi-cloud. The paper serves as a stepping stone toward broad industry alignment on edge definitions, use cases, and APIs. Download the paper here: https://www.lfedge.org/wp-content/uploads/2020/06/Akraino_Whitepaper.pdf

Looking Ahead

The community is already planning R4, which will include more implementation of open edge API guidelines, more automation of testing, increased alliance with upstream and downstream communities, and development of public cloud standard edge interfaces. Additionally, the community is expecting new blueprints as well as additional enhancements to existing blueprints. 

Don’t miss the Open Networking and Edge Summit (ONES) virtual event happening September 28-29, where Akraino and other LF Edge communities will collaborate on the latest open source edge developments. Registration is now open!

Ecosystem Support for Akraino R3

Arm
“The demands on compute, networking, and storage infrastructure are changing significantly as we connect billions of intelligent devices, many of which live at the edge of the 5G network,” said Kevin Ryan, senior director of software ecosystem development, Infrastructure Line of Business, Arm. “By working closely with the Akraino community on the release of Akraino R3, and through our efforts with Project Cassini for seamless cloud-native deployments, Arm remains committed to providing our partners with full edge solutions primed to take on the 5G era.”

AT&T 
Mazin Gilbert, VP of Technology and Innovation, AT&T, said: “As a founding member of the Akraino platform, AT&T has seen first-hand the remarkable progress as a result of openness and industry collaboration. AI and edge computing are essential when it comes to creating an intelligent, autonomous 5G network, and we’re proud to work together with the community to deliver the best possible solutions for our customers.”

Baidu
“In the 5G era, AI + Edge Computing is not only an important guarantee for updating the consumer and industrial Internet experience (such as video consumption re-upgrading, scene-based AI capabilities, etc.), but also a necessary infrastructure for the development of the Internet industry,” said Ning Liu, Director of AI Cloud Group, Baidu. “Providing users with AI-capable edge computing platforms, products and services is one of Baidu’s core strategies. Looking towards the future, Baidu will continue to adhere to the core strategy of open source and cooperate with partners to build a more open and improved ecosystem.”

China Unicom
“Commercial 5G is going live around the world. Edge computing will play an important role for large bandwidth and low delay services in the 5G era. The key to the success of edge computing is to provide integrated ICT PaaS capabilities, which is beneficial for the collaboration between networks and services, maximizing the value of 5G,” said Xiongyan Tang, Chief Scientist and CTO of the Network Technology Research Institute of China Unicom. “The PCEI Blueprint will define a set of open and common APIs, to promote the deep cooperation between operators and OTTs, and help to build a unified network edge ecosystem.”  

Huawei 
“High bandwidth, low latency, and massive connections are typical 5G features. Based on MEC’s edge computing and open capabilities, a 5G network can build the connection, computing, and capabilities required by vertical industries and enable many applications. In the future, 5G MEC will be an open system that provides an application platform with rich atomic capabilities,” said Bill Ren, Huawei Chief Open Source Liaison Officer. “Managing a large number of applications and devices on the MEC brings great challenges and increases learning costs for developers. We hope to make 5G available through open source, so that more industry partners and developers can easily develop and invoke 5G capabilities, and to build a common foundation for carriers’ MEC through open source that ensures the consistency of open interfaces and models. Only in this way can 5G MEC bring tangible benefits to developers and users.”

Juniper Networks
“Juniper Networks is proud to have been an early member of the Akraino community and supportive of this important work. We congratulate this community for introducing new blueprints to expand the use cases for managed edge cloud with this successful third release,” said Raj Yavatkar, Chief Technology Officer at Juniper Networks. “Juniper is actively involved in the integration of multiple blueprints and we look forward to applying these solutions to evolve edge cloud and 5G private networks to spur new service innovations – from content streaming to autonomous vehicles.”

Tencent
“The new generation of networks is coming, and IoT and Edge Computing are developing rapidly. At the same time, this brings great challenges to technological innovation. A high-performance, low-latency, highly scalable, large-scale architecture is a must for all applications. TARS has released its latest version to meet the demands of 5G and Edge Computing. Massive numbers of devices can easily use the TARS Microservice Architecture to realize innovation in edge applications. The Connected Vehicle Blueprint and AR/VR Blueprint in Akraino both use the TARS Architecture,” said Mark Shan, Chairman of Tencent Open Source Alliance, Chairman of TARS Foundation, and Akraino TSC Member. “The blueprints on the TARS Architecture solve the problem of high throughput and low latency. TARS is a neutral project in the Linux Foundation, which can be easily used by anyone from the open-source community.”

Zenlayer
“We are proud to be part of the Edge Cloud community. Zenlayer is actively exploring edge solutions and integrating them into our bare metal product. We hope these edge products will empower rapid customer innovation in video streaming, gaming, enterprise applications and more,” said Jim Xu, chief engineering architect of Zenlayer.

About the Linux Foundation
Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, and more.  The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

###

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.