This is Wan from Argano. In this article, I will introduce how to deploy containers using a VPS.
Introduction
In this post, we will discuss how to deploy your website using container-based deployment and a Virtual Private Server (VPS). Once the manual foundation is set, we will dive into automating the entire deployment process using a CI/CD pipeline. By the end of this guide, you will have a nice workflow that updates your site every time you make any changes to your source code.
About Containers
What is a Container?
A container is an isolated environment that allows our software to run consistently.
To create a container, we first prepare a container image, which is a blueprint containing everything needed to run the application—such as the source code, system libraries, and the runtime.
When we run this image on an engine like Docker, it becomes a container that behaves exactly as it did in the local development environment.
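As a sketch of what "a blueprint containing everything" looks like in practice, the following creates a minimal, hypothetical Dockerfile for a Node.js web app. The base image, port, and start command are illustrative assumptions, not requirements:

```shell
# Write a minimal, hypothetical Dockerfile for a Node.js web app.
# The base image, port, and start command are illustrative assumptions.
cat > Dockerfile <<'EOF'
# Runtime: a specific Node.js version, so it is identical everywhere
FROM node:20-alpine
WORKDIR /app
# Dependencies: locked in by the package files
COPY package*.json ./
RUN npm ci
# Source code
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
EOF
```

Running `docker build` against this file produces the image, and the same file builds the same image on any machine.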
Why do we use containers?
It might seem complex, so why not just put the source code on a Virtual Machine (VM) and build it there?
The primary reason we use containers is to solve the infamous “It works on my machine” problem. In the past, it was common for code to work on a developer’s laptop but fail in production due to different Node.js versions, OS discrepancies, missing libraries, or forgotten environment variables like API keys. Without these exact dependencies, the app would crash.
How do containers solve the problem?
The container image solves this by “locking in” everything: the OS, your specific React libraries, and every dependency required for the app to function.
By storing everything in the image, we ensure the same results regardless of where the app is running. This significantly increases production stability and simplifies deployment; instead of configuring a VM from scratch, we can simply deploy the image, knowing it will work exactly as it did during development.
About Deployment
What is Deployment?
Deployment generally means making your website available on the public internet.
This allows people all over the world—not just you on your laptop—to access your web pages and see what you have created. It’s an amazing process!
To deploy a website, you need a computer that is constantly connected to the internet. While you could use your own laptop, it isn’t practical because you need to carry it around, and the connection may drop. You might consider using a dedicated local PC, but today, we are going to discuss the most reliable method: Cloud Deployment.
What is Cloud Deployment?
The “Cloud” is essentially just someone else’s computer. However, what makes cloud providers special is their core mission: ensuring their servers are always running, highly secure, and connected to high-speed internet. This is exactly what is required to host a professional website.
You may have heard of Amazon Web Services (AWS), which is world-renowned and used by many major corporations. However, AWS is often criticized for its steep learning curve, complex permission settings (IAM), and high costs. In this guide, we will explore how to deploy your website efficiently without relying on AWS.
Deploy
Docker Image
Prepare your source code and double-check that it works. Then, use Docker (install it beforehand if you haven't) to create the container image. Next, we need to store the container image in the cloud: create a Docker Hub account and push the image from your local machine to Docker Hub.
cd your-project
docker login
# Create your container image
docker build -t your-username/my-app:latest .
# Push your container image to Docker Hub (the cloud)
docker push your-username/my-app:latest
Server
This time, we will use a Virtual Private Server (VPS) to deploy our website. It provides a Linux environment and a static IP, and that's all: no complicated settings and no hidden costs. To use the service, we first have to sign up with a VPS provider. We are going to use DigitalOcean, a reputable VPS provider, but feel free to use another provider if you have a preference. The beauty of container-based deployment is that the steps remain identical across providers; once your Docker environment is set up, your deployment process is fully portable.
That said, we will use DigitalOcean in this guide for ease of understanding. Create your DigitalOcean account, then choose the Droplet service. "Droplet" is just their marketing name for a Linux virtual machine. Start with the cheapest option at first; we can scale it later according to our needs. Regarding the operating system, we recommend Ubuntu if you don't have a personal preference: with its massive user base, there are far more tutorials than for other operating systems, which is very helpful for anyone new to Linux. For authentication, we will use SSH keys, as they are more secure than a password.
# Create your SSH key
cd your-project
# Use the -t option to choose the algorithm
# Use the -b option to set the key size
# Use the -f option to choose where to save the key
ssh-keygen -t rsa -b 4096 -f ./do_key
# You will see the following prompt:
# Enter passphrase for "./do_key" (empty for no passphrase):
# You may enter a passphrase to add an extra layer of security to the key
# Once you are done, you will get two new files, do_key and do_key.pub
# do_key is your private key, keep it secret!
# Add "do_key" to your .gitignore so it isn't uploaded to GitHub.
# do_key.pub is your public key; give this to the service provider you are registering with (in this case, DigitalOcean)
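As an optional convenience, you can add an entry to your SSH config so you don't have to pass the key path on every connection. This is a sketch: the host alias `do-vps` and the key path are illustrative assumptions, and `your_vps_ip` is a placeholder for your Droplet's IP:

```shell
# Optional: add an SSH config entry so `ssh do-vps` works without extra flags.
# The alias "do-vps" and the key path are illustrative assumptions.
mkdir -p ~/.ssh
cat >> ~/.ssh/config <<'EOF'
Host do-vps
    HostName your_vps_ip
    User root
    IdentityFile ~/your-project/do_key
EOF
```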
When your virtual Linux server is ready, log in and let's get ready to deploy our website. First, we need to get our container image from Docker Hub: install Docker, then pull the image we stored there earlier. After that, run the container image. That's it! By now, your website is visible to everyone in the world with internet access.
Congratulations! You have deployed a website using a VPS! Just type your Droplet’s IP address into your browser’s address bar to see your work!
# Connect to your virtual machine from your local machine via SSH
ssh -i ./do_key root@your_vps_ip
# Install Docker
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
# Pull your container image
docker pull your-username/my-app:latest
# Run your container image
docker run -d --name my-app-container -p 80:3000 --restart always your-username/my-app:latest
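To confirm the deployment worked, you can check the container's status and logs from inside the VPS. These commands are a sketch that assumes the container name used above; they only make sense on the provisioned server:

```shell
# Check that the container is up (STATUS should show "Up ...")
docker ps --filter name=my-app-container
# Inspect recent application logs if something looks wrong
docker logs --tail 50 my-app-container
# Fetch the site from the VPS itself
curl -I http://localhost
```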
Deploy Pipeline
Why do we need a pipeline?
Now our website is on the internet, which is a big milestone! Yet over time, our website will need updates or bug fixes, and it would be time-consuming to repeat all of the steps above just to update it. So let's build a pipeline that updates our website whenever we change the source code.
What is a pipeline?
A pipeline is software infrastructure that updates your deployed website whenever you make changes. It does exactly what we just did by hand: build the container image, push it to Docker Hub, pull it onto the VPS, and run the container. The only difference is that it is automated, so we do not need to handle it every time. Automation also reduces the chance of error, since human involvement is reduced.
How does a pipeline work?
Just like an ordinary pipeline connecting a water tank to your bathroom, a code pipeline connects where you store the code to where you use it. Even better, the pipeline turns the code into a container image along the way. You can add any other steps you like, for example running tests on the code or sending a Slack notification. For now, let's focus on the pipeline's basic function: updating our website whenever changes are made.
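Concretely, the manual steps from earlier can be collected into a single script, which is exactly what the pipeline will automate. This is a sketch: `deploy.sh` is a hypothetical helper, and the image, container, and host names are the placeholders used earlier in this guide:

```shell
# Collect the manual deployment steps into one script.
# deploy.sh is a hypothetical helper; names are placeholders from this guide.
cat > deploy.sh <<'EOF'
#!/bin/sh
set -e
# Build and push the latest image from the local machine
docker build -t your-username/my-app:latest .
docker push your-username/my-app:latest
# On the VPS: pull the new image and replace the running container
ssh root@your_vps_ip '
  docker pull your-username/my-app:latest
  docker stop my-app-container || true
  docker rm my-app-container || true
  docker run -d --name my-app-container -p 80:3000 \
    --restart always your-username/my-app:latest
'
EOF
chmod +x deploy.sh
```

A CI/CD pipeline performs these same steps; the only change is that a push to the repository triggers them instead of a person.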
Setting up the pipeline
Make sure your project is on GitHub, then create a GitHub Actions workflow that runs whenever you push a commit to your main branch.
name: Deploy pipeline
# Run whenever you push anything onto the main branch
on:
  push:
    branches:
      - main
Next, we define what the pipeline does whenever we update the main branch. The first thing we want is to build the latest container image of the website, so we add that to the GitHub Actions YAML. First, let's check out the repository and log in to Docker Hub.
# What you want the GitHub action to do
jobs:
  build-and-push:
    # Use the latest Ubuntu runner provided by GitHub
    runs-on: ubuntu-latest
    steps:
      # Step 1: Check out the repository source code
      - name: Checkout code
        uses: actions/checkout@v4
      # Step 2: Authenticate with Docker Hub
      - name: Login to Docker Hub
        uses: docker/login-action@v3
        with:
          # CAUTION: Never hardcode credentials.
          # Use GitHub Secrets to prevent sensitive information leaks.
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
Next, we automate the workflow of building a Docker image from the source code and pushing it to Docker Hub (a container registry). This ensures that our deployment image always stays aligned with our GitHub repository and is ready to be pulled by our VPS at any time.
      # Step 3: Build the image and push it to the registry
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          # Use the root directory as the build context
          context: .
          push: true
          # Tags allow us to version and track images in the registry
          tags: ${{ secrets.DOCKERHUB_USERNAME }}/my-app:latest
Next, the VPS needs to pull the image from Docker Hub; this gives our Linux environment in the cloud access to our latest Docker image.
      # Step 4: Pull the latest image on the VPS and restart the container
      - name: Deploy to DigitalOcean VPS
        uses: appleboy/ssh-action@v1.0.3
        with:
          host: ${{ secrets.VPS_IP }}
          username: ${{ secrets.VPS_USERNAME }}
          key: ${{ secrets.VPS_SSH_KEY }}
          script: |
            # 1. Pull the latest image
            docker pull ${{ secrets.DOCKERHUB_USERNAME }}/my-app:latest
            # 2. Stop and remove the old container (if it exists)
            docker stop my-app-container || true
            docker rm my-app-container || true
            # 3. Run the new container
            docker run -d \
              --name my-app-container \
              -p 80:3000 \
              --restart always \
              ${{ secrets.DOCKERHUB_USERNAME }}/my-app:latest
At this stage, you have successfully configured a GitHub Action that will automatically re-deploy your application whenever changes are pushed to the main branch.
As a final step, ensure all credentials required by the workflow are stored securely in GitHub Secrets. Without these, the pipeline will fail during the authentication stage. With this setup complete, you now have a fully automated CI/CD pipeline running from your repository to your VPS.
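If you prefer the terminal over the repository's Settings page, the secrets this workflow expects can be registered with the GitHub CLI (`gh`). This assumes `gh` is installed and authenticated against your repository; the values shown are placeholders:

```shell
# Register the secrets this workflow expects (placeholder values).
# Requires the GitHub CLI (`gh`), authenticated against your repository.
gh secret set DOCKERHUB_USERNAME --body "your-username"
gh secret set DOCKERHUB_TOKEN --body "your-dockerhub-access-token"
gh secret set VPS_IP --body "your_vps_ip"
gh secret set VPS_USERNAME --body "root"
# Provide the private key file created earlier via stdin
gh secret set VPS_SSH_KEY < ./do_key
```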
Now, feel free to push any changes to your main branch. You will see the GitHub Action run, and the changes you made will appear on your deployed site within a few minutes.
Conclusion
In this post, we have covered the benefits of container-based deployment, the basics of managing a VPS, and how to set up an automated CI/CD pipeline. Together, these make deployment far easier.
However, this is just the beginning. A pipeline can grow with your project: you can add as many steps as you like, for example automated testing to ensure code quality, or Slack/Discord notifications to alert your team when a new version is live. The possibilities for automation are endless, so feel free to explore and customize the workflow to fit your needs!
