
Linux Container Technology Explained (Contributed)

DevOps teams can develop applications more effectively when vital software code is packaged and run within a single environment, regardless of the hosting situation, yielding flexibility, resilience and efficiency.

State and local governments’ IT departments increasingly rely on DevOps practices and agile development methodologies to improve service delivery and to help maintain a culture of constant collaboration, iteration, and flexibility among all stakeholders and teams. 

However, when an IT department adopts agile and DevOps practices and methodologies, traditional IT problems still need to be solved. One long-standing problem is “environmental drift,” in which the code and configurations for applications and their underlying infrastructure diverge across environments. State and local IT teams often lack the tools necessary to mitigate this drift, which can hamper collaboration and agility efforts.

The use of Linux containers can help address this challenge. Linux containers facilitate application code and configuration sharing, while allowing state and local governments to adopt DevOps methodologies without abandoning their investments in legacy infrastructure.

Linux containers are an open source software technology that allows developers to package an application together with its entire runtime environment. The resulting package is easily transportable between development, testing and production, which makes it analogous to a traditional shipping container.

Imagine, for example, the logistical nightmare of managing loose boxes of flat-screen TVs, phones, laptops and other goods, all mixed together in the hull of a ship. Shipping containers bring order to this chaos; Linux containers do the same for application development.

In the IT space, Linux containers are increasingly used to pack everything an application needs to run into a single isolated, logical artifact, regardless of the hosting environment. Unlike a virtual machine, which typically runs several applications co-located on a single virtual host, a Linux container isolates one application along with its frameworks and associated dependencies, and that same container can run on different hosts. Once packaged into a single container image, an application and everything required to run it can be shipped around in a standardized way.
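
To make this concrete, a container image is typically described in a short, declarative build file (a Containerfile or Dockerfile). The sketch below is a minimal, hypothetical example for a small Python web application; the base image, file names and port are illustrative assumptions, not details from any specific project:

```
# Minimal, hypothetical Containerfile for a small Python web app.
FROM python:3.11-slim                  # base image providing the Python runtime
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt    # bake the app's dependencies into the image
COPY . .                               # add the application code itself
EXPOSE 8080                            # port the app listens on
CMD ["python", "app.py"]               # command executed when the container starts
```

Everything named in this file is captured in the resulting image, so the application behaves the same on a laptop, a test server or a production cluster.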

Linux containers are often combined with microservices: small, independent services that together power an application, with each service running in its own container. A microservice can be configured, changed and tested separately within its container, without disrupting how the overall application is run or managed.

Reusability and Shareability

Traditionally, IT operators have had to work through many manual steps to install and upgrade software; for instance, they may once have had to follow a lengthy Word document just to deploy an application. Likewise, rolling back an application to a well-known state has been a challenge, often requiring the reset of many configurations or even an entire environment.

Leveraging containers helps reduce human error and makes applications less brittle by starting with well-known and working states. For instance, if a new version of a container causes something to break, teams can simply deploy a new container with a fix or roll back to a previously known working version. This process can help eliminate environmental drift and often takes seconds to complete instead of hours.
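
With a container engine such as Podman or Docker, a rollback amounts to running the previously tagged image again. In the hedged sketch below, the registry, image name and version tags are all hypothetical:

```
# Deploy version 2.4.1 of a hypothetical application image.
podman run -d --name permits registry.example.gov/permits:2.4.1

# If 2.4.1 misbehaves, remove it and return to the last known-good tag.
podman stop permits && podman rm permits
podman run -d --name permits registry.example.gov/permits:2.4.0
```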

With Linux containers, an application can be easily built on a laptop and then shared across disparate systems. The Linux container image includes everything the application needs to run, regardless of where it is deployed.
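
In practice, that sharing usually flows through a container registry: a developer builds and pushes the image, and any other machine pulls and runs the identical artifact. As before, the registry and image names are illustrative:

```
# On the developer's laptop: build the image and push it to a shared registry.
podman build -t registry.example.gov/permits:2.4.1 .
podman push registry.example.gov/permits:2.4.1

# On any test or production host: pull and run the very same image.
podman run -d -p 8080:8080 registry.example.gov/permits:2.4.1
```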

Since the container images are easily shared and shipped, they can also be easily customized. Developers can add new artifacts to container images. IT operations and InfoSec teams can also easily add appropriate controls and tools to container images as necessary to comply with regulations and policies. 
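
Because images are built in layers, one team can extend another team’s image without modifying its contents. As a minimal sketch, an operations or security team might layer a compliance file and a non-root policy on top of the development team’s hypothetical image:

```
# Hypothetical hardening layer added on top of the development team's image.
FROM registry.example.gov/permits:2.4.1
COPY audit-policy.conf /etc/permits/audit-policy.conf   # illustrative compliance config
USER 1001                                               # run as a non-root user per policy
```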

In fact, any DevOps stakeholder can contribute to an application or service in a standardized way. This helps government agencies make the quick changes needed to satisfy new laws, regulations, and initiatives and then get those changes out to different hosting environments and physical locations.

A Platform for Efficiency

While greater portability improves shareability and efficiency among team members, Linux containers can also help state and local governments make more efficient use of their existing technology resources. Because containers share the host operating system’s kernel rather than each carrying a full guest OS, they make better use of CPU and memory and deliver greater computing density than applications running on VMs or on individual servers and workstations.

Furthermore, resource-constrained DevOps teams can take advantage of other new and emerging open source platforms to efficiently manage container deployments. For example, teams can use the Kubernetes container orchestration system to run many containers in a standardized manner. 
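
For instance, a short Kubernetes Deployment manifest is enough to keep several copies of a container running and to replace them automatically when they fail. The manifest below is a minimal sketch; the names, image and replica count are illustrative:

```
# Minimal, hypothetical Kubernetes Deployment for the container image above.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: permits
spec:
  replicas: 3                    # Kubernetes keeps three copies running at all times
  selector:
    matchLabels:
      app: permits
  template:
    metadata:
      labels:
        app: permits
    spec:
      containers:
      - name: permits
        image: registry.example.gov/permits:2.4.1
        ports:
        - containerPort: 8080
```

Applying the file with kubectl apply -f tells Kubernetes to create the three replicas and continuously reconcile them back to that state.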

Better Damage Control

One unfortunate truism in IT is that apps will always fail. However, unlike an application bug that might bring down an entire virtual machine, a container’s isolated boundaries mean that a failing app will not impact neighboring containers running on the same physical or virtual server. Essentially, the “blast radius” remains limited to just that application and the resources allocated to it.
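
Teams can shrink that blast radius further by capping each container’s share of the host, so a runaway application exhausts only its own allocation. Below is a minimal sketch with Podman (Docker accepts the same flags); the limits and names are illustrative:

```
# Cap the hypothetical app at half a CPU core and 256 MB of memory.
podman run -d --name permits --cpus=0.5 --memory=256m \
    registry.example.gov/permits:2.4.1
```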

In each of these ways, containers have the potential to provide state and local government DevOps teams with an enormous amount of flexibility, efficiency, and resiliency. They are an ideal solution for agencies that want to facilitate better collaboration and accelerated development of citizen-centric applications and services. 

John Osborne is a principal OpenShift architect with Red Hat Public Sector. He has been largely focused on the role of Kubernetes in government IT modernization for more than three years. Before his arrival at Red Hat, he worked at a start-up and then spent seven years with the U.S. Navy, developing high-performance applications and deploying them to several mission-critical areas across the globe. He is also co-author of OpenShift in Action.