What do you understand by application containerization, and what are its purpose and benefits over virtualization for DevOps testing?

(Techno) #1

Earlier we worked on virtualization, using Hyper-V to configure VMs for client requirements. Now we have come across the DevOps-related concept of containerization. Why is containerization recommended over virtualization for DevOps testing?

2 Likes
(Ola) #2

The main benefit of containerization is the management of the runtime environment. With VMs you typically bundle your code and then run that code in different runtimes: different operating systems, different patch levels and, if you are not careful with your dependency management, different versions of your dependencies. One thing that happened to us was running a DLL in a 32-bit environment in development but a 64-bit environment in production. With containers, instead of bundling just the code, you bundle the OS, users, access, dependencies and code together in a container. This means that, for your application, the runtime environment will be the same for local development, test environments and production.
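To make this concrete, here is a minimal Dockerfile sketch; the base image, runtime and file names below are illustrative assumptions, not anything from this thread. It shows how the OS userland, dependencies and code all get baked into one image:

```dockerfile
# Hypothetical example: a small Node.js service. Everything the app needs
# at runtime is declared here, so every environment runs the same stack.
FROM node:18-slim          # pins the OS userland and runtime version
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev      # dependency versions locked into the image
COPY . .
USER node                  # run as a non-root user
EXPOSE 3000
CMD ["node", "server.js"]
```

The one image built from this file is what runs locally, in test, and in production, which is exactly the "same runtime everywhere" property described above.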

Regarding testing specifically, you can be more confident that your test environment's runtime closely matches production.

Another benefit for me has been that it is much easier to set up and maintain many different test environments, or to run a more complete system locally.
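As a sketch of how cheap extra test environments become, here is a hypothetical docker-compose.yml; the service names, images and ports are assumptions for illustration only:

```yaml
services:
  app:
    build: .
    ports:
      - "8080:8080"
    depends_on:
      - db
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: example   # placeholder credential for local use only
```

`docker compose -p test1 up -d` brings up one isolated copy of the whole system; running the same command with a different project name (`-p test2`) gives a second, fully independent environment on the same machine.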

If you want to know more about how and why this may be helpful, see https://12factor.net/

3 Likes
(Vishal Dutt) #3

Application containerization is the present-day evolution of virtualization. A container is an executable package or suite of software, and application containerization is an operating-system-level virtualization technique used to deploy and run applications without launching an entire VM for each individual application.

Containerization: In QA services, containerization makes deployment and configuration easy; it not only saves time but also speeds up the deployment procedure compared with traditional, time-consuming methods. During DevOps testing, with a container platform such as Docker, a single master image can be deployed rapidly whenever required. The primary difference is that containers provide a way to virtualize an OS, so that multiple workloads can run on a single OS instance.
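A hedged sketch of that "single master image" workflow with the Docker CLI; the image name and registry below are placeholders, and a Docker daemon is assumed to be available:

```shell
# Build the master image once...
docker build -t registry.example.com/myapp:1.0 .
docker push registry.example.com/myapp:1.0

# ...then deploy that exact same image wherever it is needed:
docker pull registry.example.com/myapp:1.0
docker run -d --name myapp registry.example.com/myapp:1.0
```

Because every environment pulls the same tagged image, a deployment is a pull and a run rather than a lengthy install-and-configure procedure.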

Virtualization: This is a technique for creating virtual forms of system resources, for example operating systems, memory, or network resources. Virtualization enables software applications to run on virtual hardware by means of virtual machines and hypervisors such as Hyper-V. With VMs, the hardware itself is virtualized in order to run multiple OS instances.

Hope this information is helpful for you.

1 Like
(Darrell) #4

Originally, operations would rack a computer, install the OS, patch it, etc… Software development would deploy their application on the computer and testers would test it. When we started doing automation, there would be tools which would test the application. For enterprise, web-based applications, we’d hit the application using a browser. Requesting an environment might take weeks to set up. Sometimes it might even be months.

There were also tests to ensure the OS was installed correctly, the application deployed, etc… There would also be monitoring and alerting solutions to ensure the environment continued working.

When we created a virtual computer, how we deployed the application to the server and tested it didn't really change much. The whole environment, including the testing, monitoring and alerting tools, wasn't much different from a real computer.

A VM would be completely independent of the host OS, which means the libraries, security patches, etc. could differ between the VM and the host OS; testing the host OS wasn't necessary. The VM would take up as many resources as a real computer. If your host OS required a 30 GB hard drive and 8 GB of RAM, and the VM required the same, you would need a host computer with 16 GB of RAM and a 60 GB hard drive.

With containers, the container still runs an OS userland, but it shares the kernel and resources of the host OS. This means that a container might need only 5 to 10% as much RAM and disk space. A computer which could run 4 to 8 VMs might now run 40 containers.

Testing that the container exposes all the correct ports is now more important. You might also want to make sure that containers can talk to one another. In the old system this was routing, networking, DNS, etc.; now it is part of the containerization. Containers are programmable, and therefore the onus is on the DevOps team to test that things are set up correctly. For me this has become something I now look at as a tester.
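For example, a quick smoke test of port mappings and container-to-container name resolution might look like this; the container, image and network names are illustrative, and a Docker daemon is assumed:

```shell
# Start a throwaway web container with a published port.
docker run -d --name web -p 8080:80 nginx:alpine
docker port web                       # verify the expected 8080->80 mapping

# Verify the published port actually answers from the host.
curl -fsS http://localhost:8080/ > /dev/null && echo "host->container OK"

# Verify container-to-container DNS on a user-defined network.
docker network create testnet
docker network connect testnet web
docker run --rm --network testnet curlimages/curl \
  -fsS http://web/ > /dev/null && echo "container->container OK"
```

Checks like these replace the old network/DNS verification that used to belong to a separate operations team.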

Additionally, there have been little quirks around containers. A system library might be considered reentrant but turn out not to be. The host OS might be using a system library while the container uses that same library; if the library isn't reentrant, you can get failures which are hard to reproduce. Knowing how to spot these can be tricky, and it is something which has fallen on me as a tester. Reproducing these failures can be quite difficult. Keeping current on containers and known issues is now more important.

Another issue I have seen is people buying into the hype that if I use containers I can build once and run it locally, in development environments, in production, etc., and everything will be the same. The problem with this is that how a container runs on a production host can be very different from how it runs on a development computer. So if you are running the containers on Red Hat Linux in production but on CentOS locally and in development, there might be small differences. A failure in one environment might not appear the same way in another environment.

If you are developing on a Windows computer and deploying to a Linux environment in production, things can be very different. It is like testing Excel on Windows is very different from testing Excel on a Mac.

There is no doubt that using a container locally is going to be closer to running the same container in production but don’t assume they are identical.
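One way to narrow (not eliminate) that gap is to pin the base image exactly, so every environment at least builds from the identical OS userland. A sketch, where the digest is a placeholder rather than a real value:

```dockerfile
# Pinning by digest instead of a floating tag such as "latest" means local,
# test and production builds all start from byte-identical base layers.
FROM ubuntu:22.04@sha256:<digest-goes-here>
```

The host kernel and container runtime can still differ between environments, which is the residual risk described above.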

I have also found that the move to DevOps testing means ensuring the environment remains stable. A container might set up the environment, and initial testing might confirm things are working. Every time you deploy a new application, you have tests to ensure old requirements have not regressed. However, you now also need something to ensure the environment hasn't changed. An environment could be altered by an application, or by someone hopping on the box and tweaking it. So you need monitoring solutions to periodically check that nothing in the environment has changed.
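As a sketch of that kind of drift check, Docker itself can report what has changed inside a running container; the container name below is a placeholder, and a Docker daemon is assumed:

```shell
# List files added (A), changed (C) or deleted (D) inside the container
# since it started; unexpected entries suggest someone tweaked the box.
docker diff myapp

# Confirm the container is still running the image you think it is:
# prints the image name it was started from and the image ID in use.
docker inspect --format '{{.Config.Image}} {{.Image}}' myapp
```

Running checks like these on a schedule is a lightweight way to catch environment drift between deployments.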