Docker for Beginners: Containerize and Deploy Node.js Apps
Learn Docker basics, install it on Linux, and deploy a Node.js app with a Dockerfile, environment variables, and port mapping. A complete beginner guide.

"But It Works on My Machine!" (A Developer's Nightmare)
"But it works on my machine!" I exclaimed to my team lead after my Node.js app crashed in production for the third time that week.
She smiled knowingly. "Let me guess: you're using Node 16 locally, but production has Node 14? And you forgot to install that one npm package on the server?"
I nodded sheepishly. Different Node versions, missing dependencies, conflicting system libraries: every deployment was a gamble. That's when she introduced me to Docker.
"With Docker, your machine IS the production environment," she said.
That conversation changed everything. Within a week, I was containerizing all my applications. Deployments went from hours of debugging to a single command. Today, I'm sharing what I learned, starting from absolute zero.
In this guide, you'll learn what Docker is, how to install it on Linux, and, most importantly, how to deploy your first Node.js application with a proper Dockerfile, environment variables, and port mapping.
What Is Docker (And Why Should You Care)?
Docker is a platform that packages your application and all its dependencies into a standardized unit called a container.
Think of containers like shipping containers for software:
- Your code runs identically everywhere: laptop, server, cloud
- Everything needed is packaged inside: code, runtime, libraries, configs
- Containers are lightweight and start in seconds
- You can run multiple isolated containers on one machine
Docker vs Virtual Machines
I was confused about this at first. Here's the key difference:
Virtual Machines (VMs):
- Each VM has its own complete OS (Windows, Linux)
- Heavy (GBs of disk space)
- Slow to start (minutes)
- Resource-intensive
Docker Containers:
- Share the host OS kernel
- Lightweight (MBs of disk space)
- Start instantly (seconds)
- Much more efficient
Imagine VMs as separate houses, each with its own plumbing and electricity. Containers are apartments in one building sharing utilities: much more efficient!
Real-World Example: How Netflix Uses Docker
Netflix runs thousands of microservices, each in Docker containers. When they need to scale during peak hours, they spin up hundreds of containers in seconds. Try doing that with VMs!
Docker Core Concepts (The Essentials)
Before we install Docker, let's understand the key concepts:
1. Image
A blueprint for your container. It contains:
- Your application code
- Runtime environment (Node.js, Python, etc.)
- System libraries and dependencies
- Default configurations
Think of it like a recipe for your application.
2. Container
A running instance of an image. It's the actual "box" where your app runs, isolated from everything else.
One image can create multiple containers, like baking multiple cakes from one recipe!
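Once Docker is installed (see the installation steps below), you can see this concretely. Here's a sketch using the official nginx image; the container names and host ports are just illustrative:

```shell
# Pull one image, then run two independent containers from it
docker pull nginx:alpine
docker run -d --name web-one -p 8080:80 nginx:alpine
docker run -d --name web-two -p 8081:80 nginx:alpine

# docker ps now lists both containers: same image, isolated from each other
docker ps
```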
3. Dockerfile
A text file with instructions to build an image. You specify:
- Base image (e.g., node:18)
- Copy your code
- Install dependencies
- Set entry point
4. Docker Hub
Like GitHub for Docker images. You can:
- Download official images (Node.js, Python, MySQL)
- Share your custom images
- Browse thousands of pre-built images
Installing Docker on Linux (Step-by-Step)
I'm using Ubuntu/Debian in this guide, but the process is similar for other distributions.
Step 1: Update System Packages
First, update your package index:
sudo apt update
sudo apt upgrade -y
Step 2: Install Prerequisites
Docker needs a few packages to work properly:
sudo apt install -y \
apt-transport-https \
ca-certificates \
curl \
gnupg \
lsb-release
Step 3: Add Docker's Official GPG Key
This ensures you're downloading the authentic Docker packages:
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
Step 4: Set Up Docker Repository
Add Docker's stable repository:
echo \
"deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
Step 5: Install Docker Engine
Now install Docker:
sudo apt update
sudo apt install -y docker-ce docker-ce-cli containerd.io docker-compose-plugin
Step 6: Verify Installation
Check if Docker is running:
sudo docker --version
You should see something like: Docker version 24.0.7, build afdd53b
Test Docker with a simple container:
sudo docker run hello-world
If you see "Hello from Docker!", you're all set!
Step 7: Run Docker Without Sudo (Optional but Recommended)
Typing sudo every time is annoying. Add your user to the docker group:
sudo usermod -aG docker $USER
Important: Log out and log back in for this to take effect!
Now you can run Docker commands without sudo:
docker --version
Essential Docker Commands (Your Daily Toolkit)
Before we build our app, let's learn the commands you'll use constantly:
Working with Images
# List all images on your system
docker images
# Pull an image from Docker Hub
docker pull node:18
# Remove an image
docker rmi image-name
# Search for images on Docker Hub
docker search nginx
Working with Containers
# List running containers
docker ps
# List all containers (including stopped)
docker ps -a
# Start a container
docker start container-id
# Stop a container
docker stop container-id
# Remove a container
docker rm container-id
# View container logs
docker logs container-id
# Execute command inside running container
docker exec -it container-id bash
Quick Cleanup
# Remove all stopped containers
docker container prune
# Remove unused images
docker image prune
# Remove everything unused (be careful!)
docker system prune -a
Pro tip: I run docker system prune weekly to free up disk space!
Building Your First Docker Node.js Application
Now for the exciting part: let's containerize a real Node.js application!
Step 1: Create a Simple Node.js App
Create a new directory and initialize a Node project:
mkdir docker-node-app
cd docker-node-app
npm init -y
Install Express:
npm install express
Create app.js:
const express = require('express');
const app = express();

// Get port from environment variable or default to 3000
const PORT = process.env.PORT || 3000;
const ENV = process.env.NODE_ENV || 'development';

app.get('/', (req, res) => {
  res.json({
    message: 'Hello from Docker!',
    environment: ENV,
    port: PORT,
    timestamp: new Date().toISOString()
  });
});

app.get('/health', (req, res) => {
  res.json({ status: 'healthy' });
});

app.listen(PORT, () => {
  console.log(`Server running on port ${PORT} in ${ENV} mode`);
});
Test it locally:
node app.js
Visit http://localhost:3000 and you should see the JSON response!
Step 2: Create a .dockerignore File
Just like .gitignore, this tells Docker what to exclude:
node_modules
npm-debug.log
.env
.git
.gitignore
README.md
.dockerignore
Why ignore node_modules? We'll install dependencies inside the container rather than copying them from your machine!
Step 3: Create Your Dockerfile
This is where the magic happens! Create a file named Dockerfile:
# Use official Node.js 18 image as base
FROM node:18-alpine
# Set working directory inside container
WORKDIR /app
# Copy package files first (for better caching)
COPY package*.json ./
# Install dependencies
RUN npm install --production
# Copy application code
COPY . .
# Expose port 3000 (documentation purposes)
EXPOSE 3000
# Set environment variables
ENV NODE_ENV=production
ENV PORT=3000
# Command to run the application
CMD ["node", "app.js"]
Let me explain each line:
- FROM node:18-alpine: Start with Node 18 on Alpine Linux (super lightweight!)
- WORKDIR /app: All subsequent commands run inside the /app directory
- COPY package*.json ./: Copy package files first for Docker layer caching
- RUN npm install --production: Install only production dependencies
- COPY . .: Copy our application code
- EXPOSE 3000: Document which port the app uses (doesn't actually open it)
- ENV: Set environment variables
- CMD: Command to run when the container starts
Step 4: Build Your Docker Image
Build the image with a name tag:
docker build -t my-node-app:1.0 .
- -t my-node-app:1.0: Tag the image with a name and version
- .: Use the current directory as the build context
This takes a minute the first time as Docker downloads the Node image and installs dependencies.
Verify your image:
docker images
You should see my-node-app in the list!
Step 5: Run Your Container
Now run a container from your image:
docker run -d \
--name my-app-container \
-p 8080:3000 \
my-node-app:1.0
Breaking down the flags:
- -d: Run in detached mode (background)
- --name my-app-container: Give the container a friendly name
- -p 8080:3000: Port mapping; host port 8080 maps to container port 3000
- my-node-app:1.0: The image to use
Visit http://localhost:8080 and your app is running in Docker! 🎉
Step 6: Working with Environment Variables
Let's pass custom environment variables:
docker run -d \
--name my-app-env \
-p 8081:3000 \
-e NODE_ENV=production \
-e PORT=3000 \
-e API_KEY=secret123 \
my-node-app:1.0
The -e flag sets environment variables inside the container.
For multiple variables, use an env file:
Create .env:
NODE_ENV=production
PORT=3000
API_KEY=my-secret-key
DATABASE_URL=postgres://user:pass@db:5432/mydb
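As a sketch of how application code sees these values, here's how the app.js above could read the variables from the example .env file. The loadConfig helper is hypothetical (not part of the app), and the variable names match the example file:

```javascript
// Hypothetical helper: read container environment variables with fallbacks,
// the same pattern app.js uses for PORT and NODE_ENV.
function loadConfig(env) {
  return {
    nodeEnv: env.NODE_ENV || 'development',
    port: parseInt(env.PORT || '3000', 10),
    apiKey: env.API_KEY,            // undefined when not provided
    databaseUrl: env.DATABASE_URL,  // undefined when not provided
  };
}

// Inside a container started with -e or --env-file, process.env
// contains exactly what you passed in.
const config = loadConfig(process.env);
console.log(config);
```

Note that every environment variable arrives as a string, which is why PORT goes through parseInt.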
Run with env file:
docker run -d \
--name my-app-env-file \
-p 8082:3000 \
--env-file .env \
my-node-app:1.0
Much cleaner!
Advanced Dockerfile Techniques
After using Docker for a while, I learned some optimizations:
Multi-Stage Builds (For Smaller Images)
# Stage 1: Build
FROM node:18 AS builder
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
# Run any build or test steps here (e.g., RUN npm run build)
# Stage 2: Production
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install --production
COPY --from=builder /app/app.js ./
EXPOSE 3000
CMD ["node", "app.js"]
This creates a smaller final image by copying only what's needed!
Using Non-Root User (Security Best Practice)
FROM node:18-alpine
# Create app directory
WORKDIR /app
# Copy package files
COPY package*.json ./
# Install dependencies
RUN npm install --production
# Copy app source
COPY . .
# Create non-root group and user
RUN addgroup -g 1001 -S nodejs && \
    adduser -S nodejs -u 1001 -G nodejs
# Change ownership
RUN chown -R nodejs:nodejs /app
# Switch to non-root user
USER nodejs
EXPOSE 3000
CMD ["node", "app.js"]
Never run containers as root in production!
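A quick way to verify the USER instruction took effect, assuming you rebuilt the image and started it as a container named my-app-container (adjust the name to yours):

```shell
# Should print "nodejs", not "root"
docker exec my-app-container whoami
```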
Debugging Docker Containers
Things don't always work the first time. Here's how I debug:
View Logs
docker logs my-app-container
# Follow logs in real-time
docker logs -f my-app-container
Execute Commands Inside Container
# Open bash shell inside container
docker exec -it my-app-container sh
# Alpine uses 'sh', other images use 'bash'
Inside the container, you can:
- Check files: ls -la
- View environment: env
- Test the app: curl localhost:3000
Inspect Container Details
docker inspect my-app-container
This shows everything: IP address, volumes, environment variables, and more.
Common Docker Mistakes (And How I Fixed Them)
Mistake 1: Forgetting Port Mapping
I built my image, ran the container, but couldn't access it!
Problem: Forgot -p flag for port mapping.
Solution: Always map ports: -p host:container
Mistake 2: Cached Layers Not Updating
Changed my code, rebuilt, but still seeing old version!
Problem: Docker cached layers.
Solution: Rebuild without cache: docker build --no-cache -t my-app .
Mistake 3: Container Exits Immediately
Container starts then stops instantly.
Problem: The CMD process exited or crashed.
Solution: Check logs: docker logs container-name
Mistake 4: Large Image Sizes
My image was 1.2GB for a simple Node app!
Problem: Used node:18 instead of node:18-alpine, copied node_modules from local.
Solution:
- Use Alpine images (much smaller)
- Add node_modules to .dockerignore
- Use multi-stage builds
Docker Compose (Bonus: Running Multiple Containers)
Real apps need databases, Redis, etc. Docker Compose manages multi-container apps.
Create docker-compose.yml:
services:
  app:
    build: .
    ports:
      - "8080:3000"
    environment:
      - NODE_ENV=production
      - PORT=3000
      - DB_HOST=postgres
    depends_on:
      - postgres
  postgres:
    image: postgres:15-alpine
    environment:
      - POSTGRES_USER=myuser
      - POSTGRES_PASSWORD=mypassword
      - POSTGRES_DB=mydb
    volumes:
      - postgres-data:/var/lib/postgresql/data

volumes:
  postgres-data:
Run everything with one command (the docker-compose-plugin we installed earlier provides this as a docker subcommand):
docker compose up -d
Stop everything:
docker compose down
Best Practices I Follow
After containerizing dozens of apps, here are my golden rules:
- Use specific image tags: node:18-alpine, not node:latest
- Keep images small: Use Alpine variants and multi-stage builds
- One process per container: Don't run multiple services in one container
- Use .dockerignore: Keep build context clean
- Don't store secrets in images: Use environment variables or secrets management
- Run as non-root user: Security first!
- Health checks: Add /health endpoints and use Docker health checks
- Tag images with versions: my-app:1.0.0, not just my-app:latest
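For the health-check rule, here's a sketch of a Dockerfile HEALTHCHECK instruction that polls the /health endpoint from our app (the interval, timeout, and retry values are illustrative; busybox wget ships with Alpine images):

```dockerfile
# Mark the container unhealthy if /health fails 3 checks in a row
HEALTHCHECK --interval=30s --timeout=3s --retries=3 \
  CMD wget -qO- http://localhost:3000/health || exit 1
```

With this in place, docker ps shows a (healthy) or (unhealthy) status next to the container.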
What's Next?
You've learned Docker fundamentals, but there's so much more:
- Docker Networking: Connect containers together
- Docker Volumes: Persist data between container restarts
- Docker Swarm: Orchestrate containers across multiple machines
- Kubernetes: Industry-standard container orchestration (more complex but powerful)
- CI/CD Integration: Automate Docker builds in GitHub Actions, GitLab CI
For learning more, I recommend:
- Official Docker Documentation
- Play with Docker - Practice Docker in browser
- Docker Hub - Explore thousands of images
Conclusion: Your Docker Journey Starts Now
Remember my "it works on my machine" problem? Docker solved it completely. Today, I deploy confidently knowing my app runs identically everywhere.
Start small: Containerize one simple app (like the Node.js example above). Then gradually containerize your other projects. Within a few weeks, Docker becomes second nature.
The beauty of Docker is that once you understand the basics (images, containers, Dockerfiles, port mapping), you can containerize any application, whether it's Python, Java, Go, or even a complex multi-service architecture.
Your future self (and your team) will thank you for learning Docker. No more "works on my machine" excuses, just reliable, reproducible deployments!
Happy containerizing! 🐳
If this Docker tutorial helped you deploy your first containerized app, I'd love to hear about it! Share your Docker journey or any questions you have. Connect with me on Twitter or LinkedIn for more DevOps and web development tips.
Support My Work
If this guide helped you understand Docker, install it successfully, or deploy your first containerized Node, I'd really appreciate your support! Creating comprehensive, free content like this takes significant time and effort. Your support helps me continue sharing knowledge and creating more helpful resources for developers.
☕ Buy me a coffee - Every contribution, big or small, means the world to me and keeps me motivated to create more content!