Portworx sponsored this post.

Murli Thirumale, co-founder and CEO of Portworx, previously served as co-founder and CEO of Ocarina Networks, Inc. He also served as vice president and general manager of the Advanced Solutions Group at Citrix Systems, Inc. Thirumale holds an M.B.A. from Northwestern’s Kellogg Graduate School of Management, which he earned as an F.C. Austin Distinguished Scholar.

Kubernetes emerged quickly as the de facto way to manage apps in containers. Its use is now expanding to manage the underlying infrastructure as well. CIOs should pay attention to this development because Kubernetes can simplify IT operations, reduce costs and increase agility for the entire data center — not just the small portion of apps running in containers. It’s also critical to getting the full value of modern application development.

Using Kubernetes to manage infrastructure isn’t just a nice-to-have; it’s essential to fully realizing the benefits of cloud native computing. Application development has evolved quickly in recent years, and the underlying infrastructure needs to evolve with it to support the rapid scaling, automation and agility that Kubernetes brings. If it doesn’t, the infrastructure becomes a bottleneck to IT modernization.

Kubernetes can align your infrastructure with what your applications are now capable of. Using Kubernetes extensions, you can scale compute, networking and storage in step with your applications and manage infrastructure in a unified way across multiple public and private clouds. Put simply, modern applications use Kubernetes to orchestrate and manage IT infrastructure as well as containers, so that the whole stack, end to end and top to bottom, gets the benefits of agility and cost reduction.
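To make the application-driven side of that scaling concrete, here is a minimal sketch using a standard HorizontalPodAutoscaler. The Deployment name (web-frontend) and the CPU target are illustrative assumptions, not taken from the article, and scaling the underlying node pool would additionally require a cluster autoscaler or similar infrastructure extension.

```yaml
# Hypothetical autoscaling policy: Kubernetes adds or removes replicas of the
# "web-frontend" Deployment (an illustrative name) based on observed CPU usage.
# Paired with a cluster autoscaler, this demand also drives node-level scaling.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web-frontend
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web-frontend
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70      # scale out when average CPU exceeds 70%
```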


What does this look like in reality today? The Kubernetes community has developed a raft of interfaces, plugins and add-ons that extend Kubernetes to provide a whole range of infrastructure capabilities and services. These include the Container Storage Interface (CSI), which allows storage volumes to be provisioned and deleted automatically as needed (drivers already exist for numerous vendors), and the Container Network Interface (CNI), which provides a similar function for network connectivity. There are also Kubernetes add-ons for monitoring, tracing, logging, load balancing and more.
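As a sketch of how CSI-backed provisioning looks in practice, consider the manifests below. The provisioner name csi.example.com and its parameters are placeholders for whichever vendor driver is installed; the point is that an application team requests storage declaratively and the driver creates a matching volume automatically.

```yaml
# Illustrative StorageClass: the provisioner (csi.example.com) and its
# parameters are placeholders for a real vendor CSI driver.
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: fast-replicated
provisioner: csi.example.com
parameters:
  replication: "2"                 # driver-specific; varies by vendor
reclaimPolicy: Delete              # the volume is removed when the claim is deleted
allowVolumeExpansion: true
---
# The application team declares what it needs; the CSI driver provisions
# a 20Gi volume on demand, with no manual ticket to a storage admin.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: fast-replicated
  resources:
    requests:
      storage: 20Gi
```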

Kubernetes extensions bring the benefits of cloud native to virtualized environments as well. KubeVirt, for example, provides cloud native virtualization, making it possible to run VMs efficiently in a Kubernetes-managed environment. It combines the best of mature virtualization management with the agile application orchestration of Kubernetes.
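A minimal sketch of what this looks like, assuming the KubeVirt operator is installed in the cluster: the VirtualMachine object below is declared and scheduled through the Kubernetes API like any other resource. The VM name, guest image and resource sizing are illustrative.

```yaml
# Hypothetical KubeVirt VirtualMachine: a VM declared and managed through the
# Kubernetes API alongside containers. Name, image and sizing are examples only.
apiVersion: kubevirt.io/v1
kind: VirtualMachine
metadata:
  name: legacy-app-vm
spec:
  running: true                    # KubeVirt starts the VM immediately
  template:
    metadata:
      labels:
        kubevirt.io/domain: legacy-app-vm
    spec:
      domain:
        devices:
          disks:
            - name: rootdisk
              disk:
                bus: virtio
        resources:
          requests:
            memory: 2Gi
            cpu: "1"
      volumes:
        - name: rootdisk
          containerDisk:
            image: quay.io/containerdisks/fedora:latest   # illustrative guest image
```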

Like the other Kubernetes extensions, KubeVirt also provides an open source, standards-based alternative to proprietary offerings such as VMware’s Project Pacific. This allows Kubernetes to manage infrastructure agnostically across any type of storage and compute equipment, from any vendor and across any public or private cloud.

Importantly, Kubernetes is not restricted to managing containerized environments. It can also manage infrastructure for legacy, non-containerized applications running in VMs. So even if only 20% of your enterprise apps are containerized, Kubernetes provides a single environment for managing hardware and network resources for all applications, meaning teams don’t need to deal with multiple management frameworks. Using these Kubernetes extensions, vendors are now providing storage and networking overlays that virtualize the underlying storage or networking and bring it under Kubernetes management. This enables an enterprise to preserve its capital investments in storage while managing them with Kubernetes.

The overarching idea here is to create an infrastructure that’s defined by your application needs, not the needs of the hardware underneath. In other words, the focus needs to shift from manual provisioning performed by network and storage admins to an environment where the infrastructure is application-aware and provisioned automatically according to the needs defined by end users. This is what Kubernetes enables.
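A short sketch of what “defined by application needs” means in practice, reusing the hypothetical objects above: the application manifest states what it requires, and Kubernetes, together with the relevant extensions, provisions and attaches the infrastructure to satisfy it. The Deployment name and image are placeholders.

```yaml
# Hypothetical application Deployment: the pod spec declares its compute needs
# and mounts the "app-data" claim from the earlier example; Kubernetes and the
# CSI driver supply the underlying resources automatically.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-api
spec:
  replicas: 1                      # ReadWriteOnce volume, so a single replica
  selector:
    matchLabels:
      app: orders-api
  template:
    metadata:
      labels:
        app: orders-api
    spec:
      containers:
        - name: orders-api
          image: registry.example.com/orders-api:1.4.2   # placeholder image
          resources:
            requests:
              cpu: 500m
              memory: 512Mi
          volumeMounts:
            - name: data
              mountPath: /var/lib/orders
      volumes:
        - name: data
          persistentVolumeClaim:
            claimName: app-data
```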


In short, using Kubernetes to control infrastructure takes the benefits that attracted CIOs to containers in the first place — intelligent automation, rapid scaling, reduced cost and agile deployment — and brings them to the underlying infrastructure. This is critical to prevent infrastructure from becoming a bottleneck to the rapid improvements in the modern applications layer.

Feature image via Pixabay.

VMware is a sponsor of InApps Technology.