November 13, 2023 By Stephanie Susnjara 5 min read

Kubernetes, the world’s most popular open-source container orchestration platform, is considered a major milestone in the history of cloud-native technologies. Developed internally at Google and released to the public in 2014, Kubernetes has enabled organizations to move away from traditional IT infrastructure and toward automating the operational tasks tied to deploying, scaling and managing containerized applications (or microservices). While Kubernetes has become the de facto standard for container management, many companies also use the technology for a broader range of use cases.

Overview of Kubernetes

Containers—lightweight units of software that package code and all its dependencies to run in any environment—form the foundation of Kubernetes and are mission-critical for modern microservices, cloud-native software and DevOps workflows.

Docker was the first open-source software tool to popularize building, deploying and managing containerized applications. But Docker lacked an automated “orchestration” tool, which made it time-consuming and complex for development teams to scale applications. Kubernetes, also referred to as K8s, was created specifically to address these challenges by automating the management of containerized applications.

In broad strokes, Kubernetes orchestrates containers through pods and nodes. A pod runs one or more Linux containers and can be replicated for scaling and failure resistance. Nodes run the pods and are usually grouped in a Kubernetes cluster, which abstracts the underlying physical hardware resources.

Kubernetes’s declarative, API-driven infrastructure has helped free up DevOps and other teams from manually driven processes so they can work more independently and efficiently to achieve their goals. In 2015, Google donated Kubernetes as a seed technology to the Cloud Native Computing Foundation (CNCF) (link resides outside ibm.com), the open-source, vendor-neutral hub of cloud-native computing.
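To make the declarative model concrete, here is a minimal sketch using the official Kubernetes Python client (installable with pip install kubernetes). The deployment name, container image and namespace are illustrative choices, not prescriptions; the point is that you declare a desired state and the control plane reconciles toward it:

```python
from kubernetes import client, config

# Load credentials from the local kubeconfig file (e.g., ~/.kube/config).
config.load_kube_config()

# Declare the desired state: three replicas of a containerized web server.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "web", "labels": {"app": "web"}},
    "spec": {
        "replicas": 3,  # Kubernetes keeps three pods running at all times
        "selector": {"matchLabels": {"app": "web"}},
        "template": {  # pod template: each pod runs one nginx container
            "metadata": {"labels": {"app": "web"}},
            "spec": {
                "containers": [{
                    "name": "web",
                    "image": "nginx:1.25",
                    "ports": [{"containerPort": 80}],
                }]
            },
        },
    },
}

# Submit the declaration; the control plane reconciles actual state to match,
# scheduling pods onto healthy nodes in the cluster as needed.
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```

If a node fails or a pod is deleted, the controller notices the drift from the declared three replicas and starts replacements automatically.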

Read about the history of Kubernetes

Today, Kubernetes is widely used in production to manage Docker and essentially any other type of container runtime. While Docker includes its own orchestration tool, called Docker Swarm, most developers choose Kubernetes container orchestration instead.

As an open-source system, Kubernetes services are supported by all the leading public cloud providers, including IBM, Amazon Web Services (AWS), Microsoft Azure and Google. Kubernetes can also run on bare metal servers and virtual machines (VMs) in private cloud, hybrid cloud and edge settings, provided the host OS is a version of Linux or Windows.

Six top Kubernetes use cases

Here’s a rundown of six top Kubernetes use cases that reveal how Kubernetes is transforming IT infrastructure.

1. Large-scale app deployment

Heavily trafficked websites and cloud computing applications receive millions of user requests each day. A key advantage of using Kubernetes for large-scale cloud app deployment is autoscaling, which allows applications to adjust to demand changes automatically, with speed, efficiency and minimal downtime. For instance, when demand fluctuates, Kubernetes enables applications to run continuously and respond to changes in web traffic patterns. This helps maintain the right amount of workload resources, without over- or under-provisioning.

Kubernetes employs horizontal pod autoscaling (HPA) (link resides outside ibm.com) to automatically adjust the number of pod replicas (clones that facilitate self-healing) for a specific deployment, based on metrics such as CPU usage or custom metrics. This helps mitigate potential issues like traffic surges, hardware problems or network disruptions.

Note: HPA is not to be confused with Kubernetes vertical pod autoscaling (VPA), which assigns additional resources, such as memory or CPU, to the pods that are already running for the workload.
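Here is a minimal sketch of an HPA definition using the same Python client as above; the target deployment name ("web") and the thresholds are illustrative assumptions:

```python
from kubernetes import client, config

config.load_kube_config()

hpa = {
    "apiVersion": "autoscaling/v1",
    "kind": "HorizontalPodAutoscaler",
    "metadata": {"name": "web-hpa"},
    "spec": {
        # Scale the "web" deployment between 3 and 20 replicas...
        "scaleTargetRef": {
            "apiVersion": "apps/v1",
            "kind": "Deployment",
            "name": "web",
        },
        "minReplicas": 3,
        "maxReplicas": 20,
        # ...aiming to keep average CPU utilization across the pods near 70%.
        "targetCPUUtilizationPercentage": 70,
    },
}

client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa
)
```

During a traffic surge, CPU utilization rises above the target and the controller adds replicas; when traffic subsides, it scales back down toward the minimum.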

2. High-performance computing

Industries including government, science, finance and engineering rely heavily on high-performance computing (HPC), the technology that processes big data to perform complex calculations. HPC uses powerful processors at extremely high speeds to make instantaneous data-driven decisions. Real-world uses of HPC include automating stock trading, weather prediction, DNA sequencing and aircraft flight simulation.

HPC-heavy industries use Kubernetes to manage the distribution of HPC calculations across hybrid and multicloud environments. Kubernetes can also serve as a flexible tool to support batch job processing involved in high-performance computing workloads, which enhances data and code portability.
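A Kubernetes Job is the natural primitive for this kind of run-to-completion batch work. The sketch below, again using the Python client, submits a hypothetical ten-task batch with limited parallelism; the image and command are placeholders for a real simulation or analysis step:

```python
from kubernetes import client, config

config.load_kube_config()

job = {
    "apiVersion": "batch/v1",
    "kind": "Job",
    "metadata": {"name": "simulation-batch"},
    "spec": {
        "completions": 10,  # run ten tasks to completion in total
        "parallelism": 4,   # schedule at most four worker pods at a time
        "template": {
            "spec": {
                "containers": [{
                    "name": "worker",
                    "image": "python:3.11",
                    "command": ["python", "-c", "print('one unit of work')"],
                }],
                # Don't restart containers in place; let the Job controller
                # create replacement pods for failed tasks instead.
                "restartPolicy": "Never",
            }
        },
    },
}

client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```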

3. AI and machine learning

Building and deploying artificial intelligence (AI) and machine learning (ML) systems requires huge volumes of data and complex processes like high-performance computing and big data analysis. Deploying machine learning on Kubernetes makes it easier for organizations to automate the management and scaling of ML lifecycles and reduces the need for manual intervention.

For example, the Kubernetes container orchestration platform can automate portions of AI and ML predictive maintenance workflows, including health checks and resource planning. And Kubernetes can scale ML workloads up or down to meet user demands, adjust resource usage and control costs.
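Scaling up or down can be as simple as patching the replica count on a deployment. In this sketch, the "ml-inference" deployment name is hypothetical:

```python
from kubernetes import client, config

config.load_kube_config()

# Scale the (assumed) ML inference service to eight replicas for peak demand;
# the same call with a smaller number scales it back down to control costs.
client.AppsV1Api().patch_namespaced_deployment(
    name="ml-inference",
    namespace="default",
    body={"spec": {"replicas": 8}},
)
```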

Large language models (LLMs) perform high-level natural language processing (NLP) tasks like text classification, sentiment analysis and machine translation, and Kubernetes helps speed the deployment of these models to automate the NLP process. As more and more organizations turn to generative AI capabilities, they are using Kubernetes to run and scale generative AI models with high availability and fault tolerance.

Overall, Kubernetes provides the flexibility, portability and scalability needed to train, test, schedule and deploy ML and generative AI models.

4. Microservices management

Microservices (or microservices architecture) offer a modern cloud-native approach in which each application is composed of numerous loosely coupled and independently deployable smaller components, or services. For instance, large retail e-commerce websites consist of many microservices, typically including an order service, payment service, shipping service and customer service. Each service has its own REST API, which the other services use to communicate with it.
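In Kubernetes, each such microservice is typically fronted by a Service object, which gives the pods behind it a stable DNS name and port so the other services can call its REST API without tracking individual pods. A minimal sketch, with an assumed "orders" service listening on port 8080:

```python
from kubernetes import client, config

config.load_kube_config()

service = {
    "apiVersion": "v1",
    "kind": "Service",
    "metadata": {"name": "orders"},
    "spec": {
        "selector": {"app": "orders"},  # route to pods carrying this label
        # Other microservices call http://orders:80/..., and Kubernetes
        # forwards each request to a healthy pod's port 8080.
        "ports": [{"port": 80, "targetPort": 8080}],
    },
}

client.CoreV1Api().create_namespaced_service(namespace="default", body=service)
```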

Kubernetes was designed to handle the complexity involved in managing all the independent components running simultaneously within a microservices architecture. For instance, Kubernetes’ built-in high availability (HA) feature ensures continuous operations even in the event of failure. And the Kubernetes self-healing feature kicks in if a containerized app or an application component goes down, instantly redeploying it to match the desired state and helping to maintain uptime and reliability.
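Much of that self-healing is driven by health checks declared on the container. The fragment below shows a liveness probe; the /healthz endpoint, port and image are assumptions about the app, and the container dict slots into the pod template of a deployment like the one sketched earlier:

```python
# Restart the container if it stops answering its (assumed) health endpoint.
liveness_probe = {
    "httpGet": {"path": "/healthz", "port": 8080},
    "initialDelaySeconds": 10,  # give the app time to start up
    "periodSeconds": 5,         # probe every five seconds
    "failureThreshold": 3,      # restart after three consecutive failures
}

container = {
    "name": "orders",
    "image": "example.com/orders:1.0",  # placeholder image
    "livenessProbe": liveness_probe,
}
```

If the probe fails repeatedly, the kubelet restarts the container; if the whole pod is lost, the deployment controller creates a replacement to match the desired state.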

5. Hybrid and multicloud deployments

Kubernetes is built to be used anywhere, making it easier for organizations to migrate applications from on-premises to hybrid cloud and multicloud environments. Kubernetes standardizes migration by providing software developers with built-in commands for effective app deployment. Kubernetes can also roll out changes to apps and scale them up and down depending on environment requirements.

Kubernetes offers portability across on-premises and cloud environments since it abstracts away infrastructure details from applications. This eliminates the need for platform-specific app dependencies and makes it easy to move applications between different cloud providers or data centers with minimal effort.
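One way to see that portability: because the manifest describes only the application, the identical spec can be applied to clusters in different environments simply by switching kubeconfig contexts. In this sketch the context names are hypothetical:

```python
from kubernetes import client, config

# The same declarative spec, unchanged, for every environment.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "web"},
    "spec": {
        "replicas": 2,
        "selector": {"matchLabels": {"app": "web"}},
        "template": {
            "metadata": {"labels": {"app": "web"}},
            "spec": {"containers": [{"name": "web", "image": "nginx:1.25"}]},
        },
    },
}

# Apply it to an on-premises cluster and a public cloud cluster by pointing
# the client at different kubeconfig contexts.
for context in ["onprem-cluster", "cloud-cluster"]:
    api_client = config.new_client_from_config(context=context)
    client.AppsV1Api(api_client=api_client).create_namespaced_deployment(
        namespace="default", body=deployment
    )
```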

6. Enterprise DevOps

For enterprise DevOps teams, being able to update and deploy applications rapidly is critical for business success. Kubernetes supports teams across both software system development and maintenance, improving overall agility. And the Kubernetes API interface allows software developers and other DevOps stakeholders to easily view, access, deploy, update and optimize their container ecosystems.

CI/CD—which stands for continuous integration (CI) and continuous delivery (CD)—has become a key aspect of software development. In DevOps, CI/CD streamlines application coding, testing and deployment by giving teams a single repository for storing work and automation tools to consistently combine and test the code and ensure it works. Kubernetes plays an important role in cloud-native CI/CD pipelines by automating container deployment across cloud infrastructure environments and ensuring efficient use of resources.
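The deployment stage of such a pipeline often amounts to pointing a deployment at the image that CI just built and letting Kubernetes perform a rolling update. A sketch, with hypothetical names and an assumed image tag produced by the CI stage:

```python
from kubernetes import client, config

config.load_kube_config()

new_image = "example.com/web:build-1234"  # tag published by the CI stage

# Update the image; the rolling-update strategy replaces pods gradually,
# so the old version keeps serving traffic until the new one is ready.
client.AppsV1Api().patch_namespaced_deployment(
    name="web",
    namespace="default",
    body={"spec": {"template": {"spec": {"containers": [
        {"name": "web", "image": new_image}
    ]}}}},
)
```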

The future of Kubernetes

Kubernetes plays a critical IT infrastructure role, as can be seen in its many value-driven use cases that go beyond container orchestration. This is why so many businesses continue to implement Kubernetes. In a 2021 Cloud Native Survey (link resides outside ibm.com) conducted by the CNCF, Kubernetes usage was shown to have reached its highest point ever, with 96% of organizations using or evaluating the platform. According to the same study, Kubernetes usage continues to rise in emerging technology regions such as Africa, where 73% of survey respondents are using Kubernetes in production.

IBM and Kubernetes

Kubernetes schedules and automates tasks integral to managing container-based architectures, spanning container deployment, updates, service discovery, storage provisioning, load balancing, health monitoring and more. At IBM we are helping clients modernize their applications and optimize their IT infrastructure with Kubernetes and other cloud-native solutions.

Deploy secure, highly available clusters in a native Kubernetes experience with IBM Cloud® Kubernetes Service.

Explore IBM Cloud Kubernetes Service

Deploy and manage containerized applications on Kubernetes clusters using Red Hat® OpenShift® on IBM Cloud.

Explore Red Hat OpenShift on IBM Cloud