CONTAINERISING A BACK-END WEB APPLICATION USING DOCKER COMPOSE
INTRODUCTION:
In today's digital world, applications like websites, mobile apps, and online services rely on many small pieces working together — such as the code that runs the app, the database that stores the information, and the environment where everything runs smoothly. Managing all of these parts can be difficult, especially when moving from one computer to another or from a developer's laptop to a server in the cloud.
That's where containerization comes in.
In simple words, containerization is like putting your entire application — along with everything it needs to run — into a tightly sealed box (called a container). This box can then run the same way on any computer, anywhere in the world. It removes the "it works on my machine but not on yours" problem.
One of the most popular tools used for containerization is Docker. It allows developers to create and manage containers easily. Another helpful tool, Docker Compose, makes it even more efficient by allowing developers to define and run multiple containers together using a single configuration file.
In this article, we will:
● Build a simple backend service using a popular tool called Express.js (which helps create web servers)
● Connect it to a database called MongoDB (which stores and manages data)
● Use a tool called Docker Compose to run both of these parts together in containers, with just one command
By the end, we will have a working backend application that is packaged, portable, and ready to run on any system — without any manual setup or errors.
This is a practical example of how modern developers build and manage applications efficiently using container technology.
METHODOLOGY:
STEP – 1
Create a folder named my-docker-app (or any name of your choice) at the root of the C: drive on your Windows system. This folder will act as the root directory for the entire project.
Fig 1: Project Directory Structure in File Explorer
Inside the my-docker-app directory:
● Create a subdirectory named server
● Create a file named docker-compose.yml in the root directory.
STEP – 2
Within the server directory, create the following three files:
a) Dockerfile
● This file contains step-by-step instructions that Docker uses to build the backend application image.
● It typically defines the base image, working directory, dependencies to install, and the command to run the application.
● The filename must be Dockerfile (with no extension) by convention, although Docker allows it to be customized if needed.
b) index.js
● This JavaScript file serves as the main entry point for the backend server.
● It uses Express.js to handle API routes and Mongoose to connect to a MongoDB database.
● The filename is not fixed and can be named differently, but it must retain the .js extension.
c) package.json
● This file is essential in every Node.js project. It defines the project metadata, dependencies, scripts, and main entry point.
● The filename must be package.json — this is mandatory for Node.js to recognize and process it.
● In this project, the required dependencies are:
  ● express: a web framework for building server routes and handling HTTP requests.
  ● mongoose: an ODM (Object Data Modeling) library for MongoDB, used to model and interact with the database.
● The .json extension stands for JavaScript Object Notation — a lightweight, human-readable, and machine-parsable data format.
d) docker-compose.yml
● This YAML configuration file (created in the project root in Step 1, not inside server) is used to define and manage multi-container Docker applications.
● It specifies services (such as the backend server and MongoDB), their dependencies, environment variables, networking, and volume configuration.
● With Docker Compose, the entire application stack can be launched using a single command.
● The .yml extension refers to YAML (YAML Ain't Markup Language) — a user-friendly data serialization language used for configuration files.
DIRECTORY STRUCTURE:
my-docker-app
├── server
│   ├── Dockerfile
│   ├── index.js
│   └── package.json
└── docker-compose.yml
Fig 2: Contents of the server Directory
NOTE – While following these steps, it is crucial to use the exact filenames and extensions as specified. This ensures that all components of the Docker Compose setup are recognized by the system and function correctly during execution.
STEP – 3
Open the index.js file in Visual Studio Code, enter the code, and save the file.
CODE:
const express = require("express");
const mongoose = require("mongoose");

const app = express();
const PORT = 3000;

app.use(express.json());

mongoose.connect("mongodb://mongo-server:27017/testdb", {
  useNewUrlParser: true,
  useUnifiedTopology: true,
});

const userSchema = new mongoose.Schema({
  name: String,
});

const User = mongoose.model("User", userSchema);

app.post("/add", async (req, res) => {
  const newUser = new User(req.body);
  await newUser.save();
  res.send("User added!");
});

app.get("/users", async (req, res) => {
  const users = await User.find();
  res.json(users);
});

app.listen(PORT, () => {
  console.log(`Server running at http://localhost:${PORT}`);
});
The index.js file is like the main control room of our backend application. It tells the computer how to start the server, how to connect to the database, and what to do when someone visits the app.
1. Starts a Web Server
The file uses a tool called Express, which helps in creating web servers easily. Think of a web server like a receptionist at a help desk — it waits for people (users or browsers) to send requests and then replies with the right message or data.
This server listens on a specific port number, in this case, port 3000. A port is like a door number — it tells the computer where the server is located so visitors can reach it.
2. Connects to a Database (MongoDB)
The file also uses a tool called Mongoose to connect to MongoDB, which is a type of database that stores data in a flexible, easy-to-use format (like digital notebooks).
But how does it know where the database is? The connection string mongodb://mongo-server:27017/testdb points at the service named mongo-server, which Docker Compose makes reachable by that name on the internal container network. This way the app connects to the correct database regardless of which machine the containers are running on.
Note that the code above does not log the connection result. If you would like a "Connected to MongoDB" message on success (or an error message on failure), handlers can be attached to the connection, as sketched below.
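The snippet below is a minimal, optional sketch (not part of the original code in Step 3) showing one way to log the connection result; it relies on the fact that mongoose.connect returns a promise, so .then/.catch handlers can be attached:
// Optional: replaces the plain mongoose.connect(...) call above
// and logs whether the MongoDB connection succeeded.
mongoose
  .connect("mongodb://mongo-server:27017/testdb", {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  })
  .then(() => console.log("Connected to MongoDB"))
  .catch((err) => console.error("MongoDB connection error:", err));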
3. Sets Up Routes to Add and List Users
Once the server is running and connected to the database, it sets up two routes (small paths that clients can call):
● POST /add: takes the JSON sent in the request body, saves it as a new user in the database, and replies with the message "User added!".
● GET /users: looks up all users stored in MongoDB and returns them as JSON.
So, when someone visits http://localhost:3000/users in their browser, the server replies with the list of saved users. This shows that everything is working fine.
4. Keeps the Server Running
Finally, the file tells the computer to keep the server running, so it can continue to receive and respond to requests.
STEP – 4
Open the package.json file in Visual Studio Code, enter the code, and save the file.
CODE:
{
  "name": "myapp",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "express": "^4.18.2",
    "mongoose": "^6.0.12"
  }
}
The package.json file is one of the most important files in a Node.js project. You can think of it as the "blueprint" or "instruction manual" for your backend application. It tells the computer everything it needs to know to run the project correctly.
It does not contain any actual code, but it contains important details like:
● What the project is called
● What tools (called packages) are needed
● How to start the application
● Which file to run first
● What version the project is
When someone else (or even Docker) tries to run your project, the package.json file helps them automatically install everything needed and know how to start the server.
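One detail worth noting: the package.json shown in Step 4 has no scripts section. The version below is an optional sketch (not part of this project's file) of how a standard start script could be added so the app can also be launched with npm start:
{
  "name": "myapp",
  "version": "1.0.0",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "dependencies": {
    "express": "^4.18.2",
    "mongoose": "^6.0.12"
  }
}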
STEP – 5
Open the Dockerfile in Visual Studio Code, enter the code, and save the file.
CODE:
FROM node:14
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
CMD ["node", "index.js"]
The Dockerfile is like a set of instructions that tells Docker how to prepare and package your backend project into a container — a special sealed environment where your app runs safely and the same way on any computer.
Think of it like making an instant food packet: everything your app needs — its tools, files, and setup — is written as steps inside the Dockerfile. Docker then reads these steps and builds a ready-to-run image.
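In this project the image is built automatically by Docker Compose in Step 8, so the commands below are not required; they are only a sketch of how the same image could be built and run by hand with the standard Docker CLI (the image name my-backend is an arbitrary choice for illustration):
# Build the image from the server directory (run from inside my-docker-app)
docker build -t my-backend ./server
# Run it on its own, publishing port 3000
# (without the MongoDB container running, the app will fail to reach the database)
docker run -p 3000:3000 my-backend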
STEP – 6
Open the docker-compose.yml file in Visual Studio Code, enter the code, and save the file.
CODE:
version: "3"
services:
  express-server:
    build: ./server
    ports:
      - "3000:3000"
    depends_on:
      - mongo-server
  mongo-server:
    image: mongo
    ports:
      - "27017:27017"
Instead of manually starting each part one by one, the docker-compose.yml file allows us to define both services in a single place. Then, with just one command, Docker sets everything up automatically.
Here's how it works overall:
● It tells Docker to build the backend server using the Dockerfile present in the server folder.
● It also tells Docker to pull and run MongoDB from Docker Hub, which is an online registry of ready-made container images.
● The file ensures that both containers can communicate with each other using internal networking. For example, the backend can reach MongoDB just by using its service name (mongo-server) instead of any IP address; this is exactly the hostname used in the connection string in index.js.
● It defines which ports should be exposed so that we can open the app in our browser (e.g., visiting http://localhost:3000).
● It controls the startup order: depends_on makes Docker start the MongoDB container before the backend container (note that this governs start order only, not whether MongoDB is fully ready to accept connections).
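For reference, here are a few companion commands from the standard Docker Compose CLI that are often useful alongside docker-compose up; these are not part of the project files, just a sketch of everyday usage:
docker-compose up -d      # start both containers in the background (detached)
docker-compose logs -f    # follow the combined logs of both services
docker-compose ps         # list the containers belonging to this project
docker-compose down       # stop and remove the containers and their network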
STEP – 7
Open the my-docker-app folder in Visual Studio Code.
Go to Terminal → New Terminal. A new terminal opens at the project root.
STEP – 8
In the terminal, execute the following command to start the application stack:
docker-compose up
This command initializes and runs both the Express.js backend service and the MongoDB database container as defined in the docker-compose.yml file.
Before executing this command, ensure the following prerequisites are met:
1) Docker Desktop is running and the Docker Engine is active.
2) Docker is configured to use Linux containers, as this project is intended to run in a Linux-based container environment.
To check if Docker Desktop is running, open the Docker Desktop app and make sure it shows that the Docker Engine is active.
Fig 3: Docker Desktop running and the Docker Engine is active
Ensure that Docker is running in Linux container mode, as this project uses Linux-based images. To verify, run
docker info
in the command prompt and check the output, as shown in the image below.
Fig 4: Docker running in Linux container mode
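If you prefer a quick command-line check instead of reading the full docker info output, the engine's OS type can be printed directly using the Docker CLI's standard --format option (an optional extra, not part of the original steps):
docker info --format "{{.OSType}}"
# Expected output: linux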
Fig 5: Express and MongoDB Containers Running via Docker Compose in VS Code
STEP – 9 (Testing)
To verify that the backend server is operational, open a web browser and navigate to:
http://localhost:3000/users
If the server is running correctly, the browser displays the users stored in the database as JSON (an empty array [] if no users have been added yet).
Fig 6: Server running
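The same check can also be made from the terminal with curl, if it is installed on your system (an optional alternative, not part of the original walkthrough):
curl http://localhost:3000/users
# Expected response: a JSON array of users, e.g. [] while the database is still empty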
STEP – 10 (Test the Backend API using Postman)
Download and install the Postman application on your desktop.
● Postman is a tool used to test and interact with our backend server built using Express.js and MongoDB. Since this project doesn't include a frontend, Postman acts like a client that sends requests (such as GET and POST) to the server. It helps us check if the server is running properly, the database is connected, and the routes are working as expected. We can also send data in JSON format, view responses, and check for any errors. In short, Postman lets us test the backend easily without needing a user interface.
Open the Postman application to test the API.
Create a New Request
● Click "New" → "Request" or click "+" tab to open a new request tab.
Fig 7: Creating new request in postman
Enter the URL: http://localhost:3000/add
Set the Request Body
● Go to the Body tab.
● Select raw and choose JSON from the dropdown (on the right side).
Fig 8: Enter the URL and set the request body
Set the Request Type and URL
● Select POST from the dropdown.
Use the following JSON to add a user: {"name": "Jyoshika"}.
Fig 9: Inserting user into the database
After submitting the request, the response message "User added!" confirms that the user has been successfully inserted into the database.
Fig 10: Sending the request
Navigate to http://localhost:3000/users in your browser to verify that the user has been successfully added to the database.
Fig 11: User added to the database
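For readers who prefer the command line, the same POST request can be sent with curl instead of Postman (an optional sketch, assuming curl is available; it is not part of the original walkthrough):
curl -X POST http://localhost:3000/add -H "Content-Type: application/json" -d "{\"name\": \"Jyoshika\"}"
# Expected response: User added!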
CONCLUSION:
This project
clearly demonstrates the power and practicality of containerization in modern
software development. By using Docker and Docker Compose, we were able to
build, package, and run a complete backend application—consisting of an
Express.js server and a MongoDB database—inside isolated containers that work
the same on any system.
We started
by setting up a well-organized project structure, created the necessary files
for the server, and defined our dependencies using package.json. We then wrote
backend logic in index.js to handle basic operations like adding and retrieving
users. With the help of Docker, we converted our code into a container image,
and through Docker Compose, we linked our server with a MongoDB container so
they could run together as one connected application.
By running
just one command, we launched both services, without worrying about software
installation, configuration issues, or environment mismatches. This not only
saved time but also made the entire system portable, scalable, and easy to
deploy on any machine—whether it’s a personal laptop, a testing server, or a
cloud platform.
We also used
Postman to test our backend API, verifying that our server was running
correctly, data was being saved to the database, and everything worked smoothly
even without a frontend interface.
In summary, we successfully:
● Built a working backend service using Express.js and MongoDB
● Containerized both parts into independent, reusable units
● Used Docker Compose to manage and run them together effortlessly
● Tested the API functionality using Postman
● Created a development environment that is consistent, reproducible, and ready to deploy anywhere
This
hands-on project not only taught us how containerization simplifies backend
development but also gave us a solid foundation for exploring more advanced
DevOps practices in the future. It reflects how modern developers build, test,
and manage applications more efficiently using tools like Docker in real-world
industry settings.
