How SUSE Is Bringing Multi-Cluster Kubernetes to the Edge
Kubernetes, which launched as an open source project in 2014, is not exactly new. What remains pretty new, however, is the practice of using it to manage edge workloads. In its early days, Kubernetes gained popularity mostly as a way to host apps in data centers, not to run edge workloads like containerized network functions or medical IoT devices.
But edge computing is fast becoming a key use case for Kubernetes, and SUSE, a major player in the open source space since the early 1990s, is working to advance it through initiatives that simplify the task of deploying and managing Kubernetes clusters in edge locations.
Here's a look at what SUSE is doing in the realm of edge computing today and what's coming next as the company works to solidify open source's presence at the edge.
Kubernetes and the edge: A primer
Traditionally, a typical use case for Kubernetes looked like this: Engineers set up a handful of clusters – one for dev/test workloads, for example, and another for production – and hosted them in a conventional data center. In this context, Kubernetes offered the advantage of making it easy to manage workloads distributed across clusters of servers, but that was about it.
Today, however, businesses have a growing number of workloads that they need to deploy on edge infrastructure, meaning outside of traditional data centers. Telcos want to deploy networking apps and services close to their end-users to minimize latency, for example. Retailers want to process payments inside stores. Healthcare providers want to operate IoT devices in hospital rooms, or sometimes even inside their patients.
You don't strictly need Kubernetes to enable use cases like these; you can manage edge apps individually, without Kubernetes-based orchestration. But when it comes to edge computing, Kubernetes offers a compelling capability: the ability to centralize management of apps and devices scattered across a wide geographic area. By setting up local clusters at each edge location and managing them through a central Kubernetes-based control plane, engineers get the operational convenience and efficiency of hosting workloads in a single data center, while retaining the flexibility to deploy apps and services very close to end users.
SUSE's vision for Kubernetes at the edge
While deploying Kubernetes at the edge offers many benefits, it also presents some steep challenges. You need a lightweight operating system that can run on local edge devices, which may lack the CPU, memory and storage capacity of traditional servers. You also need to be able to optimize your network to minimize latency for traffic flowing between the edge and your control plane, and you must find a way to tame the complexity of managing hundreds, or possibly even thousands, of local clusters through a single Kubernetes deployment.
SUSE is addressing these challenges through several products. One is K3s, a lightweight Kubernetes distribution. By combining K3s with Rancher, SUSE's open source platform for managing Kubernetes clusters, businesses can deploy edge Kubernetes clusters without the heavy resource overhead of traditional Kubernetes. (Rancher can also manage Kubernetes clusters based on any other distribution, not just K3s, but the latter's light footprint makes it ideal for edge deployments.)
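To give a sense of what a lightweight edge deployment looks like in practice, here is a minimal sketch of a K3s node configuration. It follows K3s's standard `/etc/rancher/k3s/config.yaml` convention, where command-line flags can be expressed as YAML keys; the server address, token and label below are hypothetical placeholders, not values from any real deployment.

```yaml
# /etc/rancher/k3s/config.yaml on a hypothetical edge node.
# Any K3s command-line flag (e.g. --server, --token) can be
# written here as a YAML key instead.
server: https://k3s-control-plane.example.com:6443  # placeholder address
token: "<cluster-join-token>"                       # placeholder token
node-label:
  - "location=store-042"  # example label for a retail edge site
```

With a file like this in place, the single K3s binary can be installed and started with the official script (`curl -sfL https://get.k3s.io | sh -`); once the node joins, the cluster can be imported into Rancher and managed centrally alongside the rest of the fleet.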
In addition, SUSE Linux Enterprise Micro (SLE Micro) is a lightweight variant of SUSE's Linux-based operating system, designed to play nicely with low-resource edge devices.
On top of building edge computing products, SUSE is also pioneering best practices for deploying Kubernetes at the edge. "Pushing cloud-native and Kubernetes at the edge is brand new to everyone," says Keith Basil, Vice President of Product, Cloud Native Infrastructure at SUSE. "Everybody's looking for guidance and best practices," such as how best to operate control planes that manage hundreds of geographically distributed clusters or how to address the physical security challenges that arise in some edge computing contexts.
SUSE brings that guidance to its customers by offering a full suite of implementation, support and management services for edge workloads.
Getting the most from the bare-metal edge
SUSE's lightweight OS and Kubernetes distributions, combined with the management practices and processes it offers when supporting its products, make it possible to operate high-performing clusters on any infrastructure.
That's especially true when businesses run the SUSE stack on top of bare-metal hardware, which enables capabilities – such as GPU acceleration and TCP/IP offloading – that would not work on virtual servers.
To that end, SUSE partners with Equinix Metal, which delivers bare-metal infrastructure for hosting the Kubernetes control planes that manage edge workloads. The partnership provides three key benefits.
One is cost. Running bare-metal instances in a public cloud environment is expensive, and the total cost of ownership of Equinix Metal instances is lower, Basil says.
Performance is another key consideration. Bare-metal hardware, combined with interconnect services available in Equinix data centers, supercharges the performance of edge workloads distributed across a wide geographic area. "It's all about passing packets as fast as possible," Basil says.
Finally, Equinix's wide selection of data center locations around the world is an advantage for SUSE customers who want to operate Kubernetes edge control planes close to their users. Public cloud providers offer a more limited set of data center locations.
Next steps for open source cloud-native edge computing
The solutions SUSE offers today for edge computing are only the beginning, according to Basil. Over the coming year, the company will double down on its edge strategy by, for example, providing offerings to support validated designs (carefully planned and tested configurations) for edge workloads like medical IoT devices, where failures could have fatal consequences. Supporting additional protocols for IoT network connectivity is another key focus, one that will allow SUSE to power even more edge workloads.
Whatever the future may hold for SUSE and edge computing, it's clear today that the company – and the open source space in general – has come a long way since the early 1990s, when SUSE got its start by selling a Linux distribution that arrived as a set of 40 floppy disks. Today, SUSE is helping to ensure that edge environments remain open – and that they are as secure, high-performing, cost-effective and flexible as possible.