Course Overview

A developer introduction to building and managing containers with Podman for deploying applications on Red Hat OpenShift Container Platform.

Red Hat OpenShift Development I: Introduction to Containers with Podman (DO188) introduces students to building, running, and managing containers with Podman and Red Hat OpenShift Container Platform. This course helps students build the core skills for developing containerized applications through hands-on experience.

This course is based on Red Hat® Enterprise Linux® 8.6 and OpenShift® Container Platform 4.10.

Course Objectives

  • Describe what containers are and how they facilitate application development
  • Run containers with Podman
  • Build custom container images
  • Manage container images
  • Debug containers remotely
  • Describe basic container networking
  • Persist data with containers
  • Run multi-container applications
  • Troubleshoot container deployments
  • Orchestrate containers with OpenShift and Kubernetes

Course Content

Introduction and overview of containers
Describe how containers facilitate application development.

Podman basics
Manage and run containers with Podman.

Container images
Navigate container registries to find and manage container images.

Custom container images
Build custom container images to containerize applications.
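As a concrete flavor of the image builds this chapter covers, a minimal Containerfile for a small Python application might look like the following. This is an illustrative sketch, not course material; the application file name is an assumption.

```dockerfile
# Containerfile — hypothetical minimal image build
# Red Hat UBI Python base image; app.py is a stand-in name.
FROM registry.access.redhat.com/ubi8/python-39

# Copy the application source into the image's source directory
COPY app.py /opt/app-root/src/

# Document the port the application listens on
EXPOSE 8080

# Default command when a container starts from this image
CMD ["python", "app.py"]
```

You would build and run it with `podman build -t myapp .` followed by `podman run -p 8080:8080 myapp`.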

Persisting data
Build persistent databases.

Container networking
Describe basic container networking and how to access containerized services.

Troubleshooting containers
Analyze container logs and configure a remote debugger.

Multi-container applications with compose
Run multi-container applications using Compose.
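To illustrate the Compose topic, a two-container application can be described declaratively in a single file. A hypothetical compose.yaml (the service names and the frontend image are invented for the example):

```yaml
# compose.yaml — hypothetical two-container application
services:
  web:
    image: quay.io/example/frontend:latest   # invented image name
    ports:
      - "8080:8080"       # expose the web frontend on the host
    depends_on:
      - db                # start the database first
  db:
    image: docker.io/library/postgres:15
    environment:
      POSTGRES_PASSWORD: example   # demo value only; use secrets in practice
```

Running `podman-compose up` (or `docker compose up`) starts both containers on a shared network.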

Container orchestration with Kubernetes and OpenShift
Orchestrate containerized applications with Kubernetes and OpenShift.

Course Overview

Design, build, and deploy containerized applications on Red Hat OpenShift

Red Hat OpenShift Development II: Containerizing Applications with exam (DO289) teaches you how to design, build, and deploy containerized software applications on an OpenShift cluster.

Whether you are migrating existing applications or writing container-native applications, you will learn how to boost developer productivity powered by Red Hat® OpenShift Container Platform, a containerized application platform that allows enterprises to manage container deployments and scale their applications using Kubernetes.

The skills you learn in this course can be applied using all versions of Red Hat OpenShift, including Red Hat OpenShift on AWS (ROSA), Azure Red Hat OpenShift (ARO), and Red Hat OpenShift Container Platform.

This course is based on Red Hat OpenShift 4.12. The Red Hat Certified OpenShift Application Developer Exam (EX288) is included in this offering.

Course Objectives

  • Use developer-focused features of the Red Hat OpenShift web console
  • Build and publish container images for Red Hat OpenShift
  • Manage container deployments on Red Hat OpenShift
  • Create and deploy multi-container applications on Red Hat OpenShift
  • Deploy multi-container applications by using Helm charts and Kustomize
  • Create health checks to monitor and improve application reliability
  • Create CI/CD workflows by using Red Hat OpenShift Pipelines

Course Content

Red Hat OpenShift Container Platform for Developers

Define the Red Hat OpenShift architecture, concepts and terminology, and set up the developer environment.

Deploying Simple Applications

Deploy simple applications by using the Red Hat OpenShift web console and command-line tools.

Building and Publishing Container Images

Build, deploy, and manage the lifecycle of container images by using a container registry.

Managing Red Hat OpenShift Builds

Describe the Red Hat OpenShift build process and build container images.

Managing Red Hat OpenShift Deployments

Describe the different Red Hat OpenShift deployment strategies and how to monitor the health of applications.
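As a sketch of the application health monitoring mentioned here, a Deployment can declare readiness and liveness probes that the platform uses to gate traffic and restart unhealthy containers. The image name and endpoint paths below are hypothetical:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-app                  # hypothetical application name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: example-app
  template:
    metadata:
      labels:
        app: example-app
    spec:
      containers:
      - name: app
        image: quay.io/example/app:1.0   # invented image reference
        ports:
        - containerPort: 8080
        readinessProbe:              # withhold traffic until the app reports ready
          httpGet:
            path: /ready
            port: 8080
        livenessProbe:               # restart the container if this check fails
          httpGet:
            path: /healthz
            port: 8080
```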

Deploying Multi-container Applications

Deploy multi-container applications by using Red Hat OpenShift templates, Helm charts, and Kustomize.
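For a sense of the Kustomize approach, an environment overlay references base manifests and applies patches without copying them. A minimal, hypothetical kustomization.yaml (all paths and names are invented):

```yaml
# kustomization.yaml — hypothetical production overlay
resources:
  - ../../base                 # base Deployment and Service manifests
patches:
  - path: replica-patch.yaml   # e.g. raise the replica count for production
images:
  - name: quay.io/example/app
    newTag: "1.1"              # pin the image tag for this environment
```

Applying it with `oc apply -k .` renders and submits the combined manifests.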

Continuous Deployment using Red Hat OpenShift Pipelines

Implement CI/CD workflows by using Red Hat OpenShift Pipelines.
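Red Hat OpenShift Pipelines is built on Tekton, where a CI/CD workflow is expressed as a Pipeline of ordered Tasks. A skeletal sketch of the build-then-deploy shape (task and parameter names are illustrative, and a real pipeline would also declare workspaces):

```yaml
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: build-and-deploy        # hypothetical pipeline name
spec:
  params:
    - name: git-url
      type: string              # repository to build, supplied per run
  tasks:
    - name: build
      taskRef:
        name: buildah           # image-build task; exact task name may vary
      params:
        - name: IMAGE
          value: image-registry.openshift-image-registry.svc:5000/demo/app
    - name: deploy
      runAfter:
        - build                 # deploy only after the image build succeeds
      taskRef:
        name: openshift-client  # task that runs oc commands; name illustrative
```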

Note: Course outline is subject to change with technology advances and as the nature of the underlying job evolves.

Course Overview

Course description

Plan, implement, and manage OpenShift clusters at scale

Red Hat OpenShift Administration III: Scaling Kubernetes Deployments in the Enterprise (DO380) expands upon the skills required to plan, implement, and manage OpenShift® clusters in the enterprise. You will learn how to support a growing number of stakeholders, applications, and users to achieve large-scale deployments.

This course is based on Red Hat® OpenShift Container Platform 4.10.

Note: This course is five days. Durations may vary based on the delivery. For full course details, scheduling, and pricing, select your location and then select “Get started” in the right-hand menu.

Course summary

– Manage OpenShift cluster operators and install additional operators.

– Automate OpenShift management tasks using Ansible® playbooks.

– Create and schedule cluster administration jobs.

– Implement GitOps workflows using Jenkins.

– Integrate OpenShift with enterprise authentication.

– Query and visualize cluster-wide logs, metrics, and alerts.

– Manage both shared, file-based storage and non-shared, block-based storage.

– Manage machine pools and machine configurations.

Course Objectives

This course builds upon the essential skills required to configure and manage an OpenShift 4.x cluster, teaching the enhanced skills needed to operate production environments at scale, including:

  • Automating Day 2 tasks to establish production clusters with higher performance and availability.
  • Integrating OpenShift with enterprise authentication, storage, CI/CD, and GitOps systems to improve the productivity of IT operations and compliance with your organization’s standards.
  • Applying troubleshooting techniques to identify issues with cluster operators and compute capacity.

Course Content

Move from Kubernetes to OpenShift

Demonstrate that OpenShift is Kubernetes by deploying Kubernetes-native applications on OpenShift.


Introduce automation on OpenShift

Automate OpenShift administration tasks using bash scripts and Ansible playbooks.


Manage operators with OpenShift

Deploy Kubernetes Operators and configure OpenShift cluster operators.


Implement GitOps with Jenkins

Implement a GitOps workflow using containerized Jenkins to administer an OpenShift cluster.


Configure enterprise authentication

Integrate OpenShift with enterprise identity providers.


Configure trusted TLS certificates

Configure OpenShift with trusted TLS certificates for external access to cluster services and applications.


Configure dedicated node pools

Configure a subset of the cluster nodes for special workloads.


Configure persistent storage

Configure storage providers and storage classes to ensure cluster user access to persistent storage.


Manage cluster monitoring and metrics

Configure and manage the OpenShift monitoring stack.


Provision and inspect cluster logging

Deploy, query, and troubleshoot cluster-wide logging.


Recover failed worker nodes

Inspect, troubleshoot, and remediate worker nodes in a variety of failure scenarios.


Note: Course outline is subject to change with technology advances and as the nature of the underlying job evolves. For questions or confirmation on a specific objective or topic, contact one of our Red Hatters.

Course Overview

Supporting the adoption of container technology through the development of container-native applications

The Container Adoption Boot Camp for Developers (DO720) immerses you in intensive, hands-on development of container-native applications deployed on Red Hat’s implementation of Kubernetes, Red Hat® OpenShift® Container Platform. As part of enrollment, you will receive one year of Red Hat Learning Subscription Standard, which gives you unlimited access to all of our courses online, plus up to five certification exams and two retakes. This boot camp is for those seeking to make a quantum leap in their journey toward digital transformation. Making this shift involves developing software in tight iterations so that business value can be realized sooner. In order to accomplish this goal, this offering can facilitate the adoption of container-native applications, including microservices.

This collection of courses is based on Red Hat OpenShift Container Platform 4.10.

Note: This course is five days. Durations may vary based on the delivery. For full course details, scheduling, and pricing, select your location and then select “Get started” in the right-hand menu.

Course content summary

– Introduction to containers, Kubernetes, and Red Hat OpenShift

– Deploy and manage applications on an OpenShift cluster

– Build and design containerized applications for OpenShift

– Create microservice-based applications with Quarkus

– Deploy microservices to an OpenShift cluster

– Build resilient services with Red Hat OpenShift Service Mesh

– Secure an OpenShift service mesh

Course Objectives

You should be able to demonstrate these skills:

  • Create and manage custom container images.
  • Deploy applications to OpenShift Container Platform.
  • Develop microservices using Quarkus.
  • Design container images to containerize applications.
  • Customize application builds and implement post-commit build hooks.
  • Create a multi-container application template.
  • Implement health checks to improve system reliability.
  • Implement unit and integration tests for microservices.
  • Use the Config specification to inject data into a microservice.
  • Implement fault tolerance in a microservice using OpenShift Service Mesh.
  • Secure an OpenShift Service Mesh.

Course Content

Introduction to container technology

Describe how software can run in containers orchestrated by OpenShift Container Platform.


Create containerized services

Provision a service using container technology.


Manage containers

Modify prebuilt container images to create and manage containerized services.


Manage container images

Manage the life cycle of a container image from creation to deletion.


Create custom container images

Design and code a Dockerfile to build a custom container image.


Deploy containerized applications

Deploy applications on OpenShift Container Platform.


Deploy multi-container applications

Deploy applications that are containerized using multiple container images.


Troubleshoot containerized applications

Troubleshoot a containerized application deployed on OpenShift.


Deploy and manage applications on an OpenShift cluster

Deploy applications using various application packaging methods to an OpenShift cluster and manage their resources.


Design containerized applications for OpenShift

Select a containerization method for an application and create a container to run on an OpenShift cluster.


Publish enterprise container images

Create an enterprise registry and publish container images to it.


Build applications

Describe the OpenShift build process and build triggers, and manage builds.


Create applications from OpenShift templates

Describe the elements of a template and create a multi-container application template.


Manage application deployments

Monitor application health and implement various deployment methods for cloud-native applications.


Implement continuous integration and continuous deployment pipelines in OpenShift

Create and deploy Jenkins pipelines to facilitate continuous integration and deployment with OpenShift.


Describe microservice architectures

Describe components and patterns of microservice-based application architectures.


Implement a microservice with Quarkus

Implement a microservice by using the Quarkus framework.


Test microservices

Implement unit and integration tests for microservices.


Deploy microservice-based applications

Deploy Quarkus microservice applications to an OpenShift cluster.


Build microservice applications with Quarkus

Build a persistent, configurable, and distributed Quarkus microservices application.


Secure microservices

Secure a microservice using OAuth.


Monitor microservices

Monitor the operation of a microservice using metrics, distributed tracing, and log aggregation.


Introduction to Red Hat OpenShift Service Mesh

Describe the basic concepts of microservice architecture and OpenShift Service Mesh.


Observe a service mesh

Trace and visualize an OpenShift Service Mesh with Jaeger and Kiali.


Control service traffic

Manage and route traffic with OpenShift Service Mesh.


Release applications with OpenShift Service Mesh

Release applications with canary and mirroring release strategies.


Test service resilience with chaos testing

Test the resiliency of an OpenShift Service Mesh with chaos testing.


Build resilient services

Use OpenShift Service Mesh strategies to create resilient services.


Secure an OpenShift Service Mesh

Secure and encrypt services in your application with OpenShift Service Mesh.

Course Overview

The Container Adoption Boot Camp (DO700) is for those seeking to make a quantum leap in their journey toward digital transformation. Making this shift involves developing software in tight iterations so that business value can be realized sooner. In order to accomplish this goal, this offering can facilitate the adoption of container-native applications, including microservices.

– Introduction to Containers, Kubernetes, and Red Hat OpenShift

– Configuring a Red Hat OpenShift cluster

– Describing advanced features of Red Hat OpenShift

– Containerizing software applications

– Developing microservices with MicroProfile

– Developing microservices with Red Hat® OpenShift Application Runtimes

Course Objectives

Impact on the organization

Microservices are an alternative approach to designing modern applications, focused on using fewer hardware resources and, therefore, reducing infrastructure costs. Many organizations are struggling with how to move from monolithic applications to applications based on microservices, as well as how to reorganize their development paradigm to reap the benefits of microservice development in a DevOps economy. In particular, many organizations are invested in Java programming frameworks and OpenShift.

This curriculum is intended to develop the skills needed to create microservices architectures using Red Hat OpenShift Container Platform, a cloud solution that leverages the usage of microservices running on containers. The curriculum develops the skills needed to install, configure, and manage OpenShift to deploy containerized applications that are highly available, resilient, and scalable. You will learn to containerize software applications and efficiently deploy them to an OpenShift cluster, allowing you to take advantage of a platform and architecture that fosters DevOps principles in your organization.

Red Hat has created this course in a way intended to benefit our customers, but each company and infrastructure is unique, and actual results or benefits may vary.

Impact on the individual

As a result of attending this course, you should be able to configure and manage a Red Hat OpenShift Container Platform cluster and know how to develop, monitor, test, and deploy microservice-based Java EE applications using WildFly Swarm and OpenShift.

You should be able to demonstrate these skills:

  • Create containerized services using Docker.
  • Manage containers and container images.
  • Create custom container images.
  • Deploy containerized applications on Red Hat OpenShift.
  • Deploy multi-container applications.
  • Install Red Hat OpenShift Container Platform to create a simple cluster.
  • Configure and manage Red Hat OpenShift masters and nodes.
  • Secure Red Hat OpenShift with a simple internal authentication mechanism.
  • Control access to resources on Red Hat OpenShift.
  • Deploy applications on Red Hat OpenShift using the source-to-image facility.
  • Configure and manage Red Hat OpenShift pods, services, routes, secrets, and other resources.
  • Deploy applications to a Red Hat OpenShift cluster and manage them with the command-line client and the web console.
  • Design and build containers for applications for successful deployment to a Red Hat OpenShift cluster.
  • Publish container images to an enterprise registry.
  • Build containerized applications using the source-to-image facility.
  • Create applications using Red Hat OpenShift templates.
  • Extract a service from a monolithic application and deploy it to the cluster as a microservice.
  • Migrate applications to run on a Red Hat OpenShift cluster.
  • Design a microservices-based architecture for an enterprise application.
  • Implement fault tolerance and health checks for microservices.
  • Secure microservices to prevent unauthorized access.

Course Content

Create custom container images

Create containers, manage containers, and manage container images.


Deploy containerized applications

Customize containers and deploy on Red Hat OpenShift.


Troubleshoot containerized applications

Troubleshoot Red Hat OpenShift deployments.


Explore Red Hat OpenShift networking concepts

Describe Red Hat OpenShift networking concepts and troubleshoot with CLI.


Manage Red Hat OpenShift resources

Control access to Red Hat OpenShift resources, implement persistent storage, and manage application deployments.


Containerize applications

Understand deployment methods, container design, and the integrated registry and image streams.


Manage application deployments

Manage advanced application deployments and Red Hat OpenShift templates.


Design a highly available cluster

Design and install a highly available cluster with custom certificates and log aggregation, and gain an understanding of Gluster container-native storage, managing system resources, and configuring advanced networking.


Implement microservice architecture

Describe microservice architectures, deploy microservices, and implement with MicroProfile.


Test microservices

Run microservices, inject configuration data, and perform health checks.


Implement fault tolerance

Apply fault tolerance, develop an API gateway for a series of microservices, and secure with JWT.


Secure microservices with JWT

Use the JSON Web Token specification to secure a microservice.


Create microservices with Red Hat OpenShift Application Runtimes

Receive an introduction to OpenShift Application Runtimes and Fabric8.


Install Red Hat OpenShift Container Platform

Install, monitor, and manage OpenShift Container Platform.


Customize source-to-image builds

Tailor source-to-image builds and migrate applications to Red Hat OpenShift.


Develop and deploy runtimes

Employ the WildFly Swarm, Vert.x, and Spring Boot runtimes to develop and deploy microservices.


Monitor microservices

Track the operation of a microservice using metrics, distributed tracing, and log aggregation.

Course Overview

In this product-focused course, you’ll take a deep dive into all the features of Mirantis Secure Registry and discover how it can enhance the security of your container image production, storage, and distribution, whether as a stand-alone registry or integrated into a continuous integration pipeline. We’ll discuss installing and configuring MSR, managing MSR user permissions, enhancing registry security with content trust and binary security scanning, and registry management strategies such as garbage collection, content caching, and webhook-driven third-party integrations.

Course Content

Mirantis Secure Registry Architecture

  • Production-grade deployment patterns
  • Containerized components of MSR
  • Networking & System requirements for MSR
  • Installing MSR via Launchpad for high availability
  • Integrating external storage into MSR

Access Control in MSR

  • MSR RBAC system

Content Trust

  • Defeating man-in-the-middle attacks with The Update Framework (TUF) and Notary
  • Content Trust usage in MSR

Security Scanning

  • Auditing container images for known vulnerabilities
  • Setting up MSR security scanning
  • Security scan integration in continuous integration

Repository Automation

  • Continuous integration pipeline architecture featuring MSR
  • Promoting and mirroring images through pipelines
  • Integrating MSR with external tooling via webhooks

Image Management

  • Image pruning and garbage collection strategies and automation
  • Registry sizing strategy
  • Content caching for distributed teams

MSR Troubleshooting

  • Correlating MSR symptoms with components
  • Probing and reading MSR state databases
  • Recovering failed MSR replicas
  • MSR backups & restore
  • Disaster recovery in the event of critical MSR failure

Course Overview

In this product-focused course, you’ll take a deep dive into all the features of Mirantis Kubernetes Engine and discover how it simplifies, secures, and accelerates Kubernetes and Swarm cluster management at enterprise scale. We’ll discuss installing and configuring MKE, managing MKE user permissions and orchestrator resources, and using the advanced networking features included in the platform, as well as MKE troubleshooting and support.

Course Content

Mirantis Kubernetes Engine Architecture

  • Production-grade deployment patterns
  • Containerized components of MKE
  • Networking & System requirements for MKE
  • Installing MKE via Launchpad for high availability

Access Control in MKE

  • MKE RBAC systems
  • PKI, client bundle and API authentication
  • Swarm and Kubernetes access control comparison

L7 Networking Features

  • Interlock for Swarm
  • Istio for Kubernetes
  • Sticky sessions, canary or blue/green deployments, and cookie usage for both orchestrators

MKE Support Dumps

  • Generating and understanding MKE support dumps
  • Finding critical information in support dumps for troubleshooting MKE
  • Enabling and exporting API audit logs for disaster post-mortem

MKE Troubleshooting

  • Correlating MKE symptoms with components
  • Probing and reading MKE state databases
  • Recovering failed MKE managers
  • MKE backups & restore
  • Disaster recovery in the event of critical MKE failure

Course Overview

In this intense boot camp, you’ll encounter containers for the first time, learn to compose them into scalable, highly available applications orchestrated by Kubernetes, and finally begin deploying production-grade Kubernetes clusters through Mirantis Container Cloud. This bundle is ideal for students who are just starting out with containerization and want to leverage the full power of Kubernetes across multiple clusters and teams. Students will leave the workshop with a proof-of-concept Mirantis Container Cloud deployment on AWS.

Course Objectives

CN100

  • Containerization motivations and implementation
      – Use cases
      – Comparison to virtual machines
  • Creating, managing and auditing containers
      – Container implementation from the Linux kernel
      – Container lifecycle details
      – Core container creation, auditing and management CLI
  • Best practices in container image design
      – Layered filesystem implementation and performance implications
      – Creating images with Dockerfiles
      – Optimizing image builds with multi-stage builds and image design best practices
  • Single-host container networking
      – Docker native networking model
      – Software defined networks for containers
      – Docker-native single-host service discovery and routing
  • Provisioning external storage
      – Docker volume creation and management
      – Best practices and use cases for container-external storage

CN120

  • Make effective use of pod architecture
  • Deploy workloads as Kubernetes controllers
  • Provision configuration at runtime to Kubernetes workloads
  • Network pods together across a cluster using native services
  • Provision highly available storage to Kubernetes workloads
  • Package an application as a Helm chart
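The last CN120 objective, packaging an application as a Helm chart, revolves around a chart directory: templated manifests plus metadata in Chart.yaml and defaults in values.yaml. A minimal, hypothetical Chart.yaml (all values invented for illustration):

```yaml
# Chart.yaml — minimal chart metadata
apiVersion: v2                  # Helm 3 chart API version
name: example-app
description: A chart for a demo web application
version: 0.1.0                  # version of the chart itself
appVersion: "1.0"               # version of the application it deploys
```

Installing with `helm install example-app ./example-app` renders the chart's templates against values.yaml and applies them to the cluster.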

CN211

  • Mirantis Container Cloud Architecture
      – Management, regional, managed and attached cluster usage and architecture
      – Installation and setup of management and managed clusters
  • MCC User Management
      – Using Keycloak to manage user permissions
      – Integrating LDAP with MCC
      – Managing permissions for multitenancy
  • Cluster Logging & Monitoring
      – Stacklight configuration and cluster integration
      – Using Prometheus and Grafana dashboards
      – Customizing Stacklight configurations & third-party integrations
      – Exploring logs with Kibana

Course Overview

In this rapid introduction to Mirantis Container Cloud, students will learn how to deploy Kubernetes clusters to AWS using MCC, as well as how to manage MCC user permissions, Stacklight-based monitoring and logging tools, and third-party monitoring integrations. Students will leave the workshop with a proof-of-concept MCC deployment bootstrapped on their own AWS account for future exploration and study.

Course Objectives

  • Mirantis Container Cloud Architecture
        – Management, regional, managed and attached cluster usage and architecture
        – Installation and setup of management and managed clusters
  • MCC User Management
        – Using Keycloak to manage user permissions
        – Integrating LDAP with MCC
        – Managing permissions for multitenancy
  • Cluster Logging & Monitoring
        – Stacklight configuration and cluster integration
        – Using Prometheus and Grafana dashboards
        – Customizing Stacklight configurations & third-party integrations
        – Exploring logs with Kibana

Course Overview

In this intense boot camp, you’ll encounter containers for the first time, learn to compose them into scalable, highly available applications orchestrated by Docker Swarm, and finally discover how to enhance the security of your entire software supply chain and production environments using Mirantis Kubernetes Engine and Mirantis Secure Registry. This bundle is ideal for students who are just starting out with containerization and want to leverage the full power of Swarm and the Mirantis orchestration platform as soon as possible.

Course Content

This course combines all topics of CN100, CN110, CN212, and CN213.

Containerization motivations and implementation

  • Use cases
  • Comparison to virtual machines

Creating, managing and auditing containers

  • Container implementation from the Linux kernel
  • Container lifecycle details
  • Core container creation, auditing and management CLI

Best practices in container image design

  • Layered filesystem implementation and performance implications
  • Creating images with Dockerfiles
  • Optimizing image builds with multi-stage builds and image design best practices

Single-host container networking

  • Docker native networking model
  • Software defined networks for containers
  • Docker-native single-host service discovery and routing

Provisioning external storage

  • Docker volume creation and management
  • Best practices and use cases for container-external storage

Setting up and configuring a Swarm

  • Operational priorities of container orchestration
  • Containerized application architecture
  • Swarm scheduling workflow & task model
  • Automatic failure mitigation
  • Swarm installation & advanced customization

Deploying workloads on Swarm

  • Defining workloads as services
  • Scaling workloads
  • Container scheduling control
  • Rolling application updates and rollback
  • Application healthchecks
  • Application troubleshooting
  • Deploying applications as Stacks

Networking Swarm workloads

  • Swarm service discovery and routing implementation
  • Routing strategies for stateful and stateless workloads
  • Swarm ingress traffic

Provisioning dynamic configuration

  • Application configuration design
  • Environment variable management
  • Configuration file management
  • Provisioning sensitive information

Provisioning persistent storage

  • Storage backend architecture patterns
  • NFS backed Swarms

Monitoring Swarm

  • What to monitor in production-grade Swarms
  • Potential Swarm failure modes & mitigations
  • Swarm workload monitoring

Mirantis Kubernetes Engine Architecture

  • Production-grade deployment patterns
  • Containerized components of MKE
  • Networking & System requirements for MKE
  • Installing MKE via Launchpad for high availability

Access Control in MKE

  • MKE RBAC systems
  • PKI, client bundle and API authentication
  • Swarm and Kubernetes access control comparison

L7 Networking Features

  • Interlock for Swarm
  • Istio for Kubernetes
  • Sticky sessions, canary or blue/green deployments, and cookie usage for both orchestrators

MKE Support Dumps

  • Generating and understanding MKE support dumps
  • Finding critical information in support dumps for troubleshooting MKE
  • Enabling and exporting API audit logs for disaster post-mortem

MKE Troubleshooting

  • Correlating MKE symptoms with components
  • Probing and reading MKE state databases
  • Recovering failed MKE managers
  • MKE backups & restore
  • Disaster recovery in the event of critical MKE failure

Mirantis Secure Registry Architecture

  • Production-grade deployment patterns
  • Containerized components of MSR
  • Networking & System requirements for MSR
  • Installing MSR via Launchpad for high availability
  • Integrating external storage into MSR

Access Control in MSR

  • MSR RBAC system

Content Trust

  • Defeating man-in-the-middle attacks with The Update Framework (TUF) and Notary
  • Content Trust usage in MSR

Security Scanning

  • Auditing container images for known vulnerabilities
  • Setting up MSR security scanning
  • Security scan integration in continuous integration

Repository Automation

  • Continuous integration pipeline architecture featuring MSR
  • Promoting and mirroring images through pipelines
  • Integrating MSR with external tooling via webhooks

Image Management

  • Image pruning and garbage collection strategies and automation
  • Registry sizing strategy
  • Content caching for distributed teams

MSR Troubleshooting

  • Correlating MSR symptoms with components
  • Probing and reading MSR state databases
  • Recovering failed MSR replicas
  • MSR backups & restore
  • Disaster recovery in the event of critical MSR failure