An Introduction to Deployment

Deployment is the process of moving everything you built on your local machine onto a remote server. The client-server architecture has been in use for over twenty years, and there is no sign that it will change in the near future. At a very high level, then, deployment has not changed since the beginning.

If you look a little deeper, though, the “deployment process” itself has changed a lot. Most of those differences come down to the tools we used, and dividing history by tools alone feels arbitrary. We need a clearer criterion, and it was one of the hardest I have ever had to choose. My criterion is the “Cloud”, or put more precisely, infrastructure management.

Let’s walk through the first stage, where we had to manage our remote server manually.

The first stage of web deployment

Many years ago, people usually chose between a dedicated server, shared hosting, or a VPS, all of which I will simply call a “remote server”. They differ in some respects, but they share one thing: the developer MUST set them up and manage them manually. I assume you already know the difference between a remote server and your local machine.

Within this stage there are smaller stages in which the deployment process improved dramatically. I divided them by the automation techniques we used:

  • The first small stage: no automation at all; we had to do everything by hand.
  • The second small stage: version control, which made rolling back and switching versions much easier.
  • The last small stage: the container era. It is fine if you have not heard of the “container era” yet; it is more of a cultural term than an official technical definition.

The Simplest deployment form

We used to follow these steps to deploy a website into a production environment:

  • Test everything and make sure it works in the local environment.
  • Copy and paste the final code onto the remote server via SSH or FTP (a minimal sketch follows this list).
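To make that copy step concrete, here is a minimal sketch of what it amounts to. The server address, user, and paths are hypothetical, and it assumes scp is available on your machine:

```python
import subprocess

# Hypothetical values: replace with your own server, user, and paths.
REMOTE = "deploy@203.0.113.10"      # the remote server
LOCAL_BUILD_DIR = "./public_html"   # final, tested code on the local machine
REMOTE_WEB_ROOT = "/var/www/html"   # where the web server serves files from

def deploy_by_copy() -> None:
    """Copy the built site straight onto the server over SSH (scp)."""
    subprocess.run(
        ["scp", "-r", LOCAL_BUILD_DIR, f"{REMOTE}:{REMOTE_WEB_ROOT}"],
        check=True,  # stop loudly if the copy fails
    )

if __name__ == "__main__":
    deploy_by_copy()
```

Everything after running a script like this is exactly the “pray” part: there is no guarantee the server environment matches the machine the code was tested on.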

This process is error-prone and hard to control and manage. There are more problems on top of that, such as code conflicts, user management, and so on. But the worst part was that while one person was deploying the website, everyone else had to pray that production would behave exactly like the local machine, when in fact the two environments are totally different.

Version control system joins the playground

The version control system is one of the greatest inventions in software development, with features such as:
– Many people can work together without waiting for one another to finish first, much like the multi-threading or parallelism concept in programming.
– Rolling back and resolving conflicts become much easier.

One of the core concepts of the version control system is still the client-server architecture: there is a “centralized version control server” and many clients. When a client pushes something to that centralized server, other clients can pull it down; that is all.

This gives the deployment process another way to work. Let’s call our remote server X and the “centralized version control server” Y. The steps are:
– Install the version control system on X.
– Push the final code to Y from the local machine.
– On X, pull the desired code down from Y and copy it into the desired place (see the sketch after this list).
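A minimal sketch of those steps, assuming Git and using hypothetical values for the server address, application directory, and branch name (step one, installing Git on X, is assumed to be already done):

```python
import subprocess

# Hypothetical values for illustration only.
REMOTE = "deploy@203.0.113.10"   # remote server "X"
APP_DIR = "/var/www/myapp"       # clone of the repository on X
BRANCH = "main"                  # branch that holds the "final" code

def run(cmd: list[str]) -> None:
    """Run a command and fail loudly if it does not succeed."""
    subprocess.run(cmd, check=True)

def deploy_via_vcs() -> None:
    # Push the final code from the local machine to the centralized server "Y".
    run(["git", "push", "origin", BRANCH])
    # On X, pull that same code down from Y into the application directory.
    run(["ssh", REMOTE, f"cd {APP_DIR} && git pull origin {BRANCH}"])

if __name__ == "__main__":
    deploy_via_vcs()
```

The big win over plain copying is that X now holds the full history, so rolling back is just a matter of checking out an older commit.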

With the help of tools like Git (a version control system) and Jenkins (an automation server), we got a lot of benefits:
– Fewer errors caused by wrong manual actions.
– Jenkins made the dream of an automatic deployment process come true (a rough sketch of the idea follows).
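Jenkins itself is configured through its own jobs and pipelines, but the core idea can be sketched as a plain script: watch the central repository and redeploy whenever new commits appear. The branch, server address, and paths below are hypothetical:

```python
import subprocess
import time

BRANCH = "main"                  # hypothetical branch to track
REMOTE = "deploy@203.0.113.10"   # remote server "X" from the previous sketch
APP_DIR = "/var/www/myapp"

def latest_remote_commit() -> str:
    """Ask the centralized server for the newest commit on the branch."""
    out = subprocess.run(
        ["git", "ls-remote", "origin", BRANCH],
        check=True, capture_output=True, text=True,
    )
    return out.stdout.split()[0]  # first field is the commit hash

def watch_and_deploy() -> None:
    deployed = ""
    while True:
        head = latest_remote_commit()
        if head != deployed:
            # Reuse the manual deployment step: pull the new code on X.
            subprocess.run(
                ["ssh", REMOTE, f"cd {APP_DIR} && git pull origin {BRANCH}"],
                check=True,
            )
            deployed = head
        time.sleep(60)  # poll once a minute

if __name__ == "__main__":
    watch_and_deploy()
```

A real automation server adds build steps, tests, notifications, and triggers (webhooks instead of polling, for example), but the deployment loop is essentially this.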

But there is still a critical problem these tools cannot resolve: different environments, or more precisely, different operating system environments. The next step, containers, helps us solve it.

In short, before version control systems and automation tools, we had to:
– Manage the codebase manually.
– Do the dirty work by hand, like copying the codebase onto the remote server.

Container Era

You have probably heard this term at least once somewhere; nowadays it usually comes up together with Docker.

The container concept solves many problems, but within the scope of this article I only want to talk about the “deploy and pray” problem mentioned before. The point is that a container lets us choose the exact environment we want. No more mismatched environments, no more praying.

In case you do not know what a container is yet, it is an isolated, virtualized space whose contents are not affected by the host OS environment. For example, you can run specific Linux commands inside a container even when the host OS is Windows, as in the small example below. You can think of it as a minimal virtual machine that you can run anywhere, although in reality it is much more complex than that.
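A tiny sketch of that idea, assuming Docker is already installed and running on the host (Windows, macOS, or Linux):

```python
import subprocess

# Start a throwaway Alpine Linux container and run a Linux command inside it.
# The output reports a Linux kernel, no matter what the host operating system is.
subprocess.run(
    ["docker", "run", "--rm", "alpine", "uname", "-a"],
    check=True,
)
```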

Docker is a set of tools built around the container concept. For deployment purposes, it works much like the version control system in the previous stage: there is a “centralized container server” and many clients, and when a client pushes something to that server, other clients can pull it down; that is all. Docker calls such a server a registry, and its default public registry is **Docker Hub**.

So again, our deployment process can be based on this idea. Let’s call our remote server X and the “centralized container server” Y. If we use Docker, Y would be Docker Hub. The steps are:
– Install Docker on X.
– Push the final container image to Y from the local machine.
– On X, pull the desired image down from Y and run it in the desired place (see the sketch after this list).
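A minimal sketch of those steps in the same style as before. The image name, container name, server address, and port are hypothetical, and installing Docker on X (step one) is assumed to be already done:

```python
import subprocess

# Hypothetical values for illustration only.
IMAGE = "myaccount/mywebsite:1.0"   # image stored on Docker Hub ("Y")
REMOTE = "deploy@203.0.113.10"      # remote server "X" with Docker installed

def run(cmd: list[str]) -> None:
    subprocess.run(cmd, check=True)

def deploy_with_docker() -> None:
    # Build the final container image from the local code (expects a Dockerfile here).
    run(["docker", "build", "-t", IMAGE, "."])
    # Push the image to the centralized container server (Docker Hub).
    run(["docker", "push", IMAGE])
    # On X: pull the image down from Y.
    run(["ssh", REMOTE, f"docker pull {IMAGE}"])
    # On X: replace the running container with one created from the new image.
    run(["ssh", REMOTE,
         f"docker rm -f web 2>/dev/null; docker run -d --name web -p 80:80 {IMAGE}"])

if __name__ == "__main__":
    deploy_with_docker()
```

Because the image carries its whole environment with it, what runs on X is the same thing that ran on the local machine, which is exactly what removes the “deploy and pray” problem.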

Sorry for repeating the boring process from the section above, but it works almost the same, except that everything you used to do with the code, you now do with the container instead:
– You now “push” the container image, which has your code inside; that is how Docker works.
– You now move the container itself around instead of the code inside it.

Of course, we do not need to (and really should not) run this process by hand. Use an automation tool like Jenkins instead. There are many tools like Jenkins nowadays, but I do not think we need to spend time on them here.

Let’s summarize the container era from a deployment point of view: no more deploy and pray!

The current stage of deployment with Cloud

Like any other technical solution, this one should be chosen carefully. When I say “the current stage”, it does not mean you should use it everywhere, in every project. Everything has its price.

From the previous section, I hope you can see how much the container concept and automation tools help the deployment process. But one problem remains: we still have to manage the server ourselves. The Cloud solves this problem.

The Cloud is a big part of our future, offering an enormous range of services along with an insane number of containers under the hood. I will not list all the benefits of the Cloud here, but it does come with trade-offs around infrastructure. One of them is that you now MUST use the Cloud provider’s own solutions to solve Cloud problems.

In this model there is no SSHing directly into a server anymore, because we no longer need it.

Now, to work with the Cloud, we use the provider’s prebuilt tools. For example, the three major Cloud providers today are:
– AWS from Amazon.
– Google Cloud Platform from Google.
– Azure from Microsoft.

All of them are backed by large companies. They are very similar, and it is extremely hard to choose between them. If you want my advice: try them and go with the one you like most. Every comparison is relative, and any of them will be enough for your requirements. If not, you are probably experienced enough that you do not need advice from someone like me.

Back to our deployment process. All of the providers above offer tools that serve multiple purposes, support containers, and scale by default. You sign up for an account, follow the instructions, click on the things that make sense, and it just works like a charm. There is no fixed list of steps anymore. Amazing! You now have a reliable, scalable system in your hands without touching any of the dirty work. I believe it is one of the best product-building experiences you can have.

Some more thoughts

Unlike writing application code, the deployment process demands near-perfection: one small issue can take a whole system down. That is why we need more DevOps engineers nowadays. If you think they are just the people showing off how fast they can type into ugly terminals, you are seriously wrong. DevOps engineers have more impact than it may appear, and they play a crucial role in modern web architecture.

Hopefully, I have sketched a useful landscape of the history of software deployment. Thank you for your valuable reading time.


 
