Launching a new project and need Postgres for NestJS development, but don’t want to commit to a production DB provider (yet)? Running a local Postgres instance in Docker is your best friend. Simple. Reliable. No system clutter.

Below, I’ll walk you through my go-to setup for spinning up NestJS and PostgreSQL using Docker: minimal friction, fully scriptable, always reproducible. You’ll get practical configuration, commands for direct container access, and a sample NestJS database config.

## Why This Setup?

Early development is all about moving fast: changing schemas, resetting data, running migrations, sometimes all in the same day. Managed cloud databases (like Neon) are a great final destination, but for local hacking and testing, Docker wins every time. It keeps Postgres off your host machine and avoids “works on my machine” surprises. This is true plug-and-play for local dev.

## Project Structure and Required Files

Here’s what we’ll set up:

- `Dockerfile` for the NestJS app
- `docker-compose.yml` to wire up Node and Postgres
- `.env` file for environment variables
- Sample NestJS config and scripts
- Practical commands for common workflows

## Dockerfile: Simple Node Environment

```dockerfile
FROM node:18

WORKDIR /app

COPY package*.json ./
RUN npm install

COPY . .

EXPOSE 3000

CMD ["npm", "run", "start:dev"]
```

## docker-compose.yml: Node + Postgres Side-by-Side

This is the magic sauce that glues your Node API and a disposable Postgres instance together.

```yaml
version: "3.8"

services:
  db:
    image: postgres:13
    restart: always
    env_file:
      - .env
    ports:
      - "5432:5432"
    volumes:
      - db-data:/var/lib/postgresql/data

  api:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    depends_on:
      - db
    env_file:
      - .env
    command: sh -c "npm run migration:run && npm run start:dev"

volumes:
  db-data:
```

Tip: The named volume (`db-data`) lets your database survive container restarts and rebuilds without losing data.
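One caveat worth knowing: `depends_on` only controls start order, not readiness. Postgres may still be warming up when the api container fires `npm run migration:run`, and the migrations will fail. If you hit that race, a health check is the standard fix. Here’s a minimal sketch showing just the keys you’d add to the two services above (the long-form `depends_on` condition requires a Compose implementation of the Compose Specification, i.e. Docker Compose v2):

```yaml
services:
  db:
    # ...image, env_file, ports, volumes as above...
    healthcheck:
      # $$ defers variable expansion to the shell inside the container
      test: ["CMD-SHELL", "pg_isready -U $${POSTGRES_USER} -d $${POSTGRES_DB}"]
      interval: 5s
      timeout: 5s
      retries: 5

  api:
    # ...build, ports, env_file, command as above...
    depends_on:
      db:
        condition: service_healthy   # wait until pg_isready succeeds
```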
## .env

Create a `.env` file at your project root:

```
POSTGRES_USER=postgres
POSTGRES_PASSWORD=changeme
POSTGRES_DB=app_db
POSTGRES_HOST=db
POSTGRES_PORT=5432
PORT=3000
```

Keep your secrets out of Git! `.env` goes in `.gitignore`.

## Package.json Scripts: Interactive Containers

Why remember container IDs? Add this to your `package.json` scripts for quick access:

```json
"scripts": {
  "db": "docker exec -it $(docker-compose ps -q db) bash",
  "api": "docker exec -it $(docker-compose ps -q api) bash"
}
```

Now, just run `npm run db` for a database container shell, or `npm run api` for the app.

## NestJS: Connecting to Your Dockerized Database

In your main startup file (e.g. `main.ts`):

```typescript
import { NestFactory } from "@nestjs/core";
import { AppModule } from "./app.module";

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(process.env.PORT ?? 3000);
}
bootstrap();
```

### Database Config

Here’s a common config file for TypeORM:

```typescript
import { TypeOrmModuleOptions } from "@nestjs/typeorm";

const config: TypeOrmModuleOptions = {
  type: "postgres",
  host: process.env.POSTGRES_HOST,
  port: parseInt(process.env.POSTGRES_PORT ?? "5432", 10),
  username: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD,
  database: process.env.POSTGRES_DB,
  entities: [__dirname + "/**/*.entity{.ts,.js}"],
  synchronize: false, // safer for non-prod
  migrations: [__dirname + "/migrations/**/*{.ts,.js}"],
  autoLoadEntities: true,
};
```

Pass this object to `TypeOrmModule.forRoot()` in your `AppModule` and you’re wired up.

## Development Workflow: Day-to-Day Commands

- Start everything: `docker-compose up --build` (first time) or just `docker-compose up`
- View logs: `docker-compose logs -f api`
- Tear it down (remove containers): `docker-compose down`
- Hop into the DB shell: `npm run db`
- Hop into the app container: `npm run api`
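You don’t have to go through bash every time, either. Since the `.env` above uses `postgres` and `app_db`, this drops you straight into a psql prompt:

```bash
# Open psql directly inside the running db container
docker-compose exec db psql -U postgres -d app_db
```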
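And when you want a genuinely clean slate (botched migration, polluted test data), remove the volume along with the containers. Careful: the `-v` flag deletes the `db-data` volume, so everything in the database is gone.

```bash
# Tear down containers AND wipe the db-data volume, then rebuild fresh
docker-compose down -v
docker-compose up --build
```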