I used to be the person who installed PostgreSQL, Redis, and Elasticsearch directly on my laptop. Different projects needed different versions. Config files lived in random system directories. Upgrading one thing broke another. Switching between projects meant starting and stopping services manually. It was a mess.
Then I started using Docker Compose for local development and I'm never going back.
The Before Times
My old setup was fragile in ways I didn't appreciate until it broke. I had Postgres 13 installed via Homebrew for one project, but another project needed Postgres 14. Redis was running as a background service all the time, eating memory even when I wasn't using it. And every time I set up a new machine, I'd spend half a day installing and configuring services.
The worst part was onboarding. When a new developer joined a project, the setup instructions were a full page of "install this, configure that, create this database, run these migrations." Inevitably something would go wrong and we'd spend hours debugging environment differences.
The Docker Compose File
Here's a simplified version of what I use for a typical web project:
```yaml
version: '3.8'

services:
  db:
    image: postgres:14-alpine
    environment:
      POSTGRES_DB: myapp_dev
      POSTGRES_USER: dev
      POSTGRES_PASSWORD: devpass
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"

  mailhog:
    image: mailhog/mailhog
    ports:
      - "1025:1025"
      - "8025:8025"

volumes:
  pgdata:
```
That's it. Run `docker compose up -d` and you have Postgres, Redis, and a mail catcher running. Run `docker compose down` when you're done. The data persists in named volumes, so you don't lose your dev database between sessions.
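Day-to-day use is just a handful of standard Compose commands; a typical session against the file above might look like this (the `db` service name comes from that file):

```shell
# Start all services in the background
docker compose up -d

# See what's running and which ports are mapped
docker compose ps

# Tail logs for one service
docker compose logs -f db

# Stop and remove the containers; named volumes survive
docker compose down
```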
Why This Is Better
Isolation. Each project has its own compose file with its own services. Project A can use Postgres 13 while Project B uses Postgres 15. They don't interfere with each other. No port conflicts because you can map to different host ports.
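As a concrete sketch: if two projects both want Postgres on the default port, one of them can simply publish the container's port 5432 on a different host port (5433 here is an arbitrary choice):

```yaml
services:
  db:
    image: postgres:13-alpine
    ports:
      - "5433:5432"   # host port 5433 -> container port 5432
```

The app for that project then connects to localhost:5433, while the other project keeps localhost:5432.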
Reproducibility. The compose file is checked into the repo, so every developer gets the exact same services with the exact same configuration. New team member setup goes from a page of instructions to "clone the repo and run docker compose up." Use the same images in CI and your pipelines get much closer to parity with local development, too. I've seen onboarding time drop from hours to minutes.
Clean machine. Nothing is installed on my host system except Docker itself. When I'm not working on a project, its services aren't running. My laptop's resources go to what I'm actually doing.
Practical Tips I've Learned
Use Alpine images. They're significantly smaller: `postgres:14-alpine` weighs in at roughly 80MB, compared to several hundred megabytes for the full Debian-based image. When you're pulling multiple service images, the size difference adds up.
Always use named volumes for data. Anonymous volumes are harder to manage and easier to accidentally delete. Named volumes like pgdata in my example above survive docker compose down and are easy to back up or remove intentionally.
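Named volumes are also easy to inspect and back up with stock Docker commands. A common pattern for backups is to mount the volume read-only into a throwaway container and tar it; the volume name `myapp_pgdata` below is illustrative (Compose prefixes volume names with the project name by default, so check `docker volume ls` for the real one):

```shell
# List volumes and inspect where the data actually lives
docker volume ls
docker volume inspect myapp_pgdata

# Back up the volume into a tarball in the current directory
docker run --rm \
  -v myapp_pgdata:/data:ro \
  -v "$PWD":/backup \
  alpine tar czf /backup/pgdata-backup.tgz -C /data .
```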
Add a Makefile. I wrap common operations in Make targets so I don't have to remember compose commands. This pairs well with a solid terminal setup where you have aliases and shortcuts for the tools you use most:
```makefile
.PHONY: up down reset-db

up:
	docker compose up -d

down:
	docker compose down

reset-db:
	docker compose down -v
	docker compose up -d db
	sleep 2
	npm run migrate
```

(The `.PHONY` line keeps Make from being confused if a file named `up` or `down` ever appears in the repo.)
Health checks matter. If your app container depends on the database being ready, add a health check to the db service. Without it, your app might start before Postgres is accepting connections and crash on the first database query.
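A sketch of what that looks like for the `db` service above, using Postgres's bundled `pg_isready` and Compose's `service_healthy` condition (the interval and retry values are arbitrary starting points, and the `app` service only applies if you run the app as a container, e.g. in CI):

```yaml
services:
  db:
    image: postgres:14-alpine
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U dev -d myapp_dev"]
      interval: 5s
      timeout: 3s
      retries: 5

  app:
    build: .
    depends_on:
      db:
        condition: service_healthy
```

With this in place, `docker compose up` won't start `app` until the health check has passed.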
Don't containerize your app code for development. I run the actual application on my host machine, not in Docker. The services (database, cache, queue) run in Docker. This gives me fast hot reloading, easy debugging, and native file system performance. I only containerize the app itself for production builds.
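With the app on the host and the services in Docker, the app just connects to the published ports on localhost. A hypothetical `.env` file matching the compose file above might look like:

```shell
# .env for the host-run app (values match the compose file above)
DATABASE_URL=postgres://dev:devpass@localhost:5432/myapp_dev
REDIS_URL=redis://localhost:6379
SMTP_HOST=localhost
SMTP_PORT=1025   # MailHog SMTP; its web UI is on port 8025
```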
When Not to Use It
For simple projects that only need a SQLite database, Docker Compose is overkill. If your project has no external service dependencies, you don't need this. And if you're on a machine with limited RAM (under 8GB), running multiple containers alongside your IDE and browser can get tight.
But for any project with real infrastructure dependencies, Docker Compose is the answer. It's one of those tools where I genuinely wonder how I worked without it. The time investment to learn it is maybe an afternoon. The time it saves over the following months is enormous.