Docker Multilang: One Compose File, Three Runtimes
Most Docker tutorials feel like either abstract theory or a giant production stack you'll never build just to learn the basics. This guide does neither. It builds a small system that actually feels like a real application: an API, a background worker, and a web front end, all running together.
The point is not the code itself. The point is seeing three different runtimes packaged, wired, and launched as one system with a single command. Once you’ve done that once, containerization stops feeling mysterious and starts feeling mechanical.
By the end, you will not just have read about containers. You will have a running multi‑service stack on your machine that you can break, modify, and extend.
What you are building
This lab creates a minimal multi-language stack. Each service has one responsibility so the container boundaries are obvious.
| Service | Runtime | Purpose | Port |
|---|---|---|---|
| API | Node + Express | Exposes /numbers and /sorted endpoints returning random and sorted values | 3001 |
| Worker | Python | Background loop fetching and logging sorted numbers from API | none |
| Web | Nginx | Serves a static page that fetches and displays sorted data | 8080 → 80 |
At the end of the guide, all three will run with a single command.
Prerequisites
Install Docker Desktop and verify it runs before continuing.
docker version
If this prints a version, Docker is ready. Node and Python will be installed inside containers, so you do not need them locally.
Step 1: Create the project structure
Start by creating a clean workspace. Each folder will become a Docker build context.
```shell
mkdir multilang
cd multilang
mkdir api worker web
```
Confirm the structure exists.
ls
Expected output:
api worker web
This separation prevents accidental file leakage between containers and keeps each runtime isolated.
Step 2: Create the Node API service
The API exposes /numbers and /sorted endpoints returning random and sorted values.
Create the API source file.
```shell
cat > api/index.js <<'EOF'
const express = require("express");
const app = express();

function generateNumbers() {
  return Array.from({ length: 15 }, () => Math.floor(Math.random() * 100));
}

app.get("/numbers", (_req, res) => {
  const nums = generateNumbers();
  res.json({ numbers: nums });
});

app.get("/sorted", (_req, res) => {
  const nums = generateNumbers();
  const sorted = [...nums].sort((a, b) => a - b);
  res.json({ original: nums, sorted });
});

app.listen(3001, () => {
  console.log("api running on 3001");
});
EOF
```
Create the package manifest.
```shell
cat > api/package.json <<'EOF'
{
  "name": "api",
  "version": "1.0.0",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "dependencies": {
    "express": "^4.19.2"
  }
}
EOF
```
Install dependencies to generate the lockfile.
```shell
cd api
npm install
cd ..
```
The API code is now complete and ready for containerization.
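One detail in the API code worth calling out: JavaScript's default Array.prototype.sort converts elements to strings and compares them lexicographically, which is why the numeric comparator (a, b) => a - b is required. A quick standalone sketch of the difference:

```javascript
// Default sort compares elements as strings, so numbers
// end up in lexicographic rather than numeric order.
const nums = [34, 7, 88, 101];

const lexicographic = [...nums].sort();          // "101" < "34" < "7" < "88"
const numeric = [...nums].sort((a, b) => a - b); // compares numerically

console.log(lexicographic); // [101, 34, 7, 88]
console.log(numeric);       // [7, 34, 88, 101]
```

The spread (`[...nums]`) also matters: sort mutates its array in place, and copying first lets the /sorted endpoint return both the original and the sorted order.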
Step 3: Create the Python worker service
The worker periodically fetches from the API and logs sorted results.
Create the worker script.
```shell
cat > worker/main.py <<'EOF'
import time

import requests

API_URL = "http://api:3001/sorted"

while True:
    try:
        r = requests.get(API_URL, timeout=5)
        data = r.json()
        print("original:", data["original"])
        print("sorted: ", data["sorted"])
        print("---")
    except Exception as e:
        print("worker error:", e)
    time.sleep(5)
EOF
```
Create the requirements file with requests.
```shell
cat > worker/requirements.txt <<'EOF'
requests
EOF
```
No local install is required. Dependencies will be installed inside the container.
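The worker's fixed 5-second sleep is fine for a demo, but it means that when the API is down the worker keeps retrying at the same rate. A common refinement is exponential backoff. Here is a minimal, hypothetical helper (not part of the lab code) that you could wire into the loop:

```python
def backoff_delay(failures: int, base: float = 5.0, cap: float = 60.0) -> float:
    """Delay before the next request: doubles with each consecutive
    failure, starting at `base` seconds and capped at `cap`."""
    return min(base * (2 ** failures), cap)

# In the loop: reset failures to 0 after a successful fetch; after an
# error, increment failures and sleep for backoff_delay(failures)
# instead of a flat 5 seconds.
```

For example, backoff_delay(0) is 5.0, backoff_delay(3) is 40.0, and anything beyond that is capped at 60.0.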
Step 4: Create the static web service
The web page fetches sorted data from the API and displays it.
```shell
cat > web/index.html <<'EOF'
<!doctype html>
<html>
<head>
  <title>Docker Multilang Sorting Demo</title>
</head>
<body>
  <h1>Sorting Demo</h1>
  <button onclick="loadData()">Generate & Sort</button>
  <pre id="output"></pre>
  <script>
    async function loadData() {
      const res = await fetch('http://localhost:3001/sorted');
      const data = await res.json();
      document.getElementById('output').textContent =
        'Original: ' + JSON.stringify(data.original) + '\n' +
        'Sorted: ' + JSON.stringify(data.sorted);
    }
  </script>
</body>
</html>
EOF
```
This file is enough to verify Nginx container behavior.
Step 5: Create Dockerfiles for each service
Each service now needs a container build definition.
Node API Dockerfile:
```shell
cat > api/Dockerfile <<'EOF'
FROM node:18-alpine
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm install --production
COPY . .
EXPOSE 3001
CMD ["npm", "start"]
EOF
```
Python worker Dockerfile:
```shell
cat > worker/Dockerfile <<'EOF'
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "main.py"]
EOF
```
Web Dockerfile:
```shell
cat > web/Dockerfile <<'EOF'
FROM nginx:alpine
COPY index.html /usr/share/nginx/html/index.html
EOF
```
At this point, each folder contains runnable source code and a Dockerfile that defines how to package it. Note the ordering in the API and worker Dockerfiles: the dependency manifests are copied and installed before the rest of the source, so Docker can reuse the cached dependency layer on rebuilds that only touch application code.
Step 6: Create the Compose file
Docker Compose will build and run all services together.
```shell
cat > docker-compose.yml <<'EOF'
services:
  api:
    build: ./api
    ports:
      - "3001:3001"
  worker:
    build: ./worker
  web:
    build: ./web
    ports:
      - "8080:80"
EOF
```
This file declares three services, two exposed ports, and local build contexts.
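Compose also puts all three services on a shared default network where each one is reachable by its service name; that is why the worker can call http://api:3001. The worker starts polling immediately, and its try/except absorbs any failed requests made before the API is up. If you would rather have Compose sequence the startup, a depends_on entry (an optional refinement, not required for this lab) expresses the ordering:

```yaml
services:
  worker:
    build: ./worker
    depends_on:
      - api
```

Note that plain depends_on only orders container startup; it does not wait for the API to be ready to serve requests (that would need a healthcheck with a service_healthy condition).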
Step 7: Build and run the stack
Run the full stack with one command.
docker compose up --build
You should see logs showing:
- API container printing "api running on 3001"
- Worker container printing original and sorted arrays fetched from the API
- Web container starting Nginx
Leave this running and open a second terminal for verification.
Step 8: Verify each service
Verify the web service in a browser.
Open:
http://localhost:8080
Expected result: the page displays a button. Clicking it fetches and shows original and sorted arrays from the API.
Verify the API service.
curl http://localhost:3001/sorted
Expected response:
{"original":[34,7,88,...],"sorted":[7,34,88,...]}
Verify the worker service.
Look at the Compose logs and confirm repeated output:
```
original: [34, 7, 88, ...]
sorted: [7, 34, 88, ...]
---
```
If all three checks succeed, the multi-runtime container stack is functioning correctly.
Common failure points
| Symptom | Cause | Fix |
|---|---|---|
| API image fails to build | package-lock.json missing, so the Dockerfile's COPY step fails | Run npm install in api/ to generate it, then rebuild |
| Web page returns 404 | Incorrect COPY path in web Dockerfile | Confirm index.html exists in web/ |
| Ports unavailable | Port already in use | Change host ports in compose file |
| Worker prints errors | File name mismatch | Ensure worker/main.py exists |
| Web page shows CORS error | Browser blocks the cross-origin API call | For the demo, launch the browser with CORS checks disabled, or proxy the API through the web container |
These issues account for most failures when assembling small multi-container stacks.
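For the CORS row in particular, the cleaner of the two fixes is to make the browser talk only to the web container and let Nginx forward API requests over the Compose network. A sketch of what that could look like (the file name and paths here are assumptions, not part of the lab; the page would then fetch /api/sorted instead of http://localhost:3001/sorted):

```nginx
# web/default.conf — would be COPY'd to /etc/nginx/conf.d/default.conf
# in the web Dockerfile.
server {
    listen 80;

    location / {
        root /usr/share/nginx/html;
        index index.html;
    }

    location /api/ {
        # "api" resolves through Compose's internal DNS
        proxy_pass http://api:3001/;
    }
}
```

With this in place, the page and the API share an origin (localhost:8080), so the browser no longer treats the fetch as cross-origin.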
What you now have
At this point, your project directory contains:
```
multilang/
  api/
    index.js
    package.json
    package-lock.json
    Dockerfile
  worker/
    main.py
    requirements.txt
    Dockerfile
  web/
    index.html
    Dockerfile
  docker-compose.yml
```
You can stop the stack with Ctrl+C and restart it any time with docker compose up. When you want to remove the containers and the network as well, run docker compose down.
Closing
At this point you have something most Docker introductions never give you: a working multi‑runtime system you can actually play with.
You can change the API response and watch the container rebuild. You can slow the worker loop and see log behavior. You can replace the static page with a real frontend. None of that requires guessing how the pieces connect anymore — you already wired them once.
That is the real value of this lab. Not memorizing commands. Not reading diagrams. Actually building a small system that behaves like a real one.