Launching a new project that requires PostgreSQL for NestJS development without committing to a production database provider can be a challenge. Fortunately, running a local PostgreSQL instance in Docker offers a simple, reliable solution that keeps your system uncluttered. Below is a streamlined approach to setting up NestJS and PostgreSQL using Docker, ensuring minimal friction and full scriptability for reproducibility. This guide includes practical configurations, commands for direct container access, and a sample NestJS database configuration.
Why This Setup?
In the early stages of development, speed is of the essence. Frequent changes to schemas, data resets, and migrations can occur within the same day. While managed cloud databases like Neon serve as excellent final destinations, Docker proves to be the superior choice for local development. It isolates PostgreSQL from your host machine, effectively eliminating the dreaded “works on my machine” surprises. This setup embodies true plug-and-play functionality for local development.
Project Structure and Required Files
To get started, we will establish the following components:
- Dockerfile for the NestJS application
- docker-compose.yml to connect Node and PostgreSQL
- .env file for environment variables
- Sample NestJS configuration and scripts
- Practical commands for common workflows
Dockerfile: Simple Node Environment
```dockerfile
FROM node:18
WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY package*.json ./
RUN npm install

COPY . .
EXPOSE 3000
CMD ["npm", "run", "start:dev"]
```
docker-compose.yml: Node + Postgres Side-by-Side
This configuration serves as the essential link between your Node API and a disposable PostgreSQL instance.
```yaml
version: "3.8"
services:
  db:
    image: postgres:13
    restart: always
    env_file:
      - .env
    ports:
      - "5432:5432"
    volumes:
      - db-data:/var/lib/postgresql/data
  api:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    depends_on:
      - db
    env_file:
      - .env
    command: sh -c "npm run migration:run && npm run start:dev"
volumes:
  db-data:
```
Tip: The named volume `db-data` ensures that your database persists across container restarts and rebuilds; the data is only removed if you delete the volume explicitly (e.g., with `docker-compose down -v`).
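One caveat with the compose file above: `depends_on` only guarantees that the `db` container has *started*, not that Postgres is ready to accept connections, so the `migration:run` step can race the database on a cold start. Below is a minimal sketch of a wait-for-port check using Node's built-in `net` module; `waitForPort` is a hypothetical helper, not part of NestJS or TypeORM.

```typescript
import * as net from "net";

// Hypothetical helper: resolve once a TCP connection to host:port succeeds,
// retrying every 500 ms until the timeout elapses. Run this before
// migrations so they don't race a Postgres instance that is still booting.
function waitForPort(host: string, port: number, timeoutMs = 30000): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  return new Promise((resolve, reject) => {
    const attempt = () => {
      const socket = net.connect(port, host);
      socket.once("connect", () => {
        socket.end();
        resolve();
      });
      socket.once("error", () => {
        socket.destroy();
        if (Date.now() > deadline) {
          reject(new Error(`Timed out waiting for ${host}:${port}`));
        } else {
          setTimeout(attempt, 500);
        }
      });
    };
    attempt();
  });
}
```

You could invoke this from a small script before `npm run migration:run`, or keep the simpler compose `command` and accept an occasional failed first start that `restart: always` will retry.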
.env
Create a `.env` file at the root of your project:
```env
POSTGRES_USER=postgres
POSTGRES_PASSWORD=changeme
POSTGRES_DB=app_db
POSTGRES_HOST=db
POSTGRES_PORT=5432
PORT=3000
```
Important: Keep your secrets out of version control by adding `.env` to your `.gitignore`.
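Since every service reads its connection details from this one file, it is worth failing fast when a variable is missing instead of letting TypeORM surface a confusing error later. Here is a minimal sketch; `requireEnv` and `loadDbEnv` are hypothetical helpers, not part of NestJS.

```typescript
type Env = Record<string, string | undefined>;

// Hypothetical helper: throw immediately if a required variable is
// missing or empty, so misconfiguration is caught at startup.
function requireEnv(env: Env, name: string): string {
  const value = env[name];
  if (value === undefined || value === "") {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Collect the Postgres settings defined in .env into one typed object.
function loadDbEnv(env: Env) {
  return {
    host: requireEnv(env, "POSTGRES_HOST"),
    port: parseInt(requireEnv(env, "POSTGRES_PORT"), 10),
    username: requireEnv(env, "POSTGRES_USER"),
    password: requireEnv(env, "POSTGRES_PASSWORD"),
    database: requireEnv(env, "POSTGRES_DB"),
  };
}
```

Calling `loadDbEnv(process.env)` early in `bootstrap()` turns a typo in `.env` into an immediate, readable crash rather than a failed connection attempt.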
Package.json Scripts: Interactive Containers
Why memorize container IDs? Add shortcuts to your `package.json` scripts for quick shell access:

```json
"scripts": {
  "db": "docker exec -it $(docker-compose ps -q db) bash",
  "api": "docker exec -it $(docker-compose ps -q api) bash"
}
```
Now, simply run `npm run db` to open a shell in the database container, or `npm run api` for the application container.
NestJS: Connecting to Your Dockerized Database
In your main startup file (e.g., `main.ts`), include the following:
```typescript
import { NestFactory } from "@nestjs/core";
import { AppModule } from "./app.module";

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  // Fall back to 3000 if PORT is not set in the environment
  await app.listen(process.env.PORT ?? 3000);
}
bootstrap();
```
Database Configuration: Below is a typical TypeORM configuration for the module setup:

```typescript
const config = {
  type: "postgres",
  host: process.env.POSTGRES_HOST,
  port: parseInt(process.env.POSTGRES_PORT ?? "5432", 10),
  username: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD,
  database: process.env.POSTGRES_DB,
  entities: [__dirname + "/**/*.entity{.ts,.js}"],
  synchronize: false, // safer for non-prod; prefer explicit migrations
  migrations: [__dirname + "/migrations/**/*{.ts,.js}"],
  autoLoadEntities: true, // TypeOrmModule option: picks up entities registered via forFeature()
};
```
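Some tooling (and TypeORM's `url` option) prefers a single connection string over individual fields. The same `.env` values map onto a standard `postgres://` URL; the sketch below uses a hypothetical `buildPostgresUrl` helper.

```typescript
// Hypothetical helper: assemble a standard postgres:// connection URL
// from the individual settings defined in .env.
function buildPostgresUrl(opts: {
  username: string;
  password: string;
  host: string;
  port: number;
  database: string;
}): string {
  const { username, password, host, port, database } = opts;
  // encodeURIComponent guards against special characters in credentials,
  // which would otherwise corrupt the URL.
  return (
    `postgres://${encodeURIComponent(username)}:` +
    `${encodeURIComponent(password)}@${host}:${port}/${database}`
  );
}
```

With the `.env` values from earlier, this yields `postgres://postgres:changeme@db:5432/app_db`, which you can paste directly into `psql` or a GUI client for debugging.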
Development Workflow: Day-to-Day Commands
- Start everything: `docker-compose up --build` (for the first time) or simply `docker-compose up`
- View logs: `docker-compose logs -f api`
- Tear it down (remove containers): `docker-compose down`
- Open the DB shell: `npm run db`
- Open the app container: `npm run api`