
Add Postgres Database to Dockerized Django Project

In this tutorial I’ll show you how to add a PostgreSQL database to a Dockerized Django project.

We’ll be building on top of an existing project that we created in two recent posts: the How to Dockerize a React Project and Add a Backend to a Dockerized React Project tutorials.

The project contains a React frontend and Django backend, all Dockerized.

You can follow those first if you wish, or jump right in with the starting source code provided.

In this guide, we’ll be focusing on adding a PostgreSQL database to the project.

Prerequisites

You’ll need Docker and Docker Compose – I recommend using Docker Desktop or Colima.

Once you have those installed, let’s get started.

Add Database Service

(Full diff on GitHub)

Let’s start by adding a new service to our Docker Compose config.

Open docker-compose.yml and add the following to the bottom of the file:


  db:
    image: postgres:17.1
    volumes:
      - db-data:/var/lib/postgresql/data
    ports:
      - 5432:5432
    environment:
      - POSTGRES_DB=db
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=localdevpw
    healthcheck:
      test: ["CMD", "pg_isready", "-q", "-d", "db", "-U", "user"]
      interval: 5s
      timeout: 5s
      retries: 5

volumes:
  db-data:

This does the following:

  • Adds a new service called “db” that uses the postgres:17.1 image
  • Sets up a named volume called db-data (defined at the bottom of the file) and maps it to /var/lib/postgresql/data inside the container. This persists the data stored in the database, so it will still be there if we stop and re-create our Postgres instance.
  • Maps port 5432 on the host to the same port in the container – this is only required if we want to connect to the database from the host using a tool such as pgAdmin.
  • Sets the environment variables that specify the database name, username and password – these are the credentials we can use to access the database. We’ll configure Django to use them later on.
  • Adds a healthcheck that checks whether the database is ready – this avoids a race condition when starting the services at the same time, caused by Django trying to access the database before Postgres has finished starting up.
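With the service defined, you can verify it on its own before wiring up Django. A quick sketch, assuming the service name and credentials from the compose file above:

```shell
# Start only the database service in the background
docker compose up -d db

# Run the same readiness probe the healthcheck uses
docker compose exec db pg_isready -q -d db -U user && echo "database is ready"

# Optionally, open psql inside the container and list the databases
docker compose exec db psql -U user -d db -c '\l'
```

If pg_isready exits successfully, the healthcheck will pass and dependent services will be allowed to start.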

Install PostgreSQL Driver (psycopg)

(Full diff on GitHub)

Next we need to install the driver which Django can use to connect to Postgres.

The library is called psycopg.

In order to make this change, we’ll also need to modify our Dockerfile to add the appropriate operating system level requirements needed for installing it.

Open requirements.txt and add the following line:

psycopg[c]==3.2.3

This specifies the requirement and pins it to version 3.2.3. The [c] extra tells pip to build psycopg’s optimized C implementation locally, which the official docs recommend for production.

Many people prefer the [binary] extra instead, which installs pre-built binaries and doesn’t require a C compiler or the other build dependencies. However, it is recommended for development use only.
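If you did want the pre-built variant for local development, the equivalent requirements.txt line would be (same version pin assumed):

```
psycopg[binary]==3.2.3
```

With this extra you could skip the build-base and postgresql-dev packages in the Dockerfile below, but we’ll stick with [c] here since this setup targets production-style builds.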

Then, update the Dockerfile to look like this:

FROM python:3.13.0-alpine3.20

ENV PYTHONUNBUFFERED 1

COPY ./requirements.txt /requirements.txt

ENV PATH="/py/bin:$PATH"
RUN python -m venv /py && \
    pip install --upgrade pip && \
    apk add --update --upgrade --no-cache postgresql-client && \
    apk add --update --upgrade --no-cache --virtual .tmp \
        build-base postgresql-dev

RUN pip install -r /requirements.txt && apk del .tmp

COPY ./backend /backend
WORKDIR /backend

CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]

Note: You can find the diff on GitHub.

Let me summarise the changes we made here:

  1. I removed the “env” from the python -m venv command which I added accidentally in the previous tutorials (oops, sorry).
  2. I moved the ENV PATH line above the RUN command – this means we can use pip instead of /py/bin/pip to install things in our virtual environment (another oversight in the previous tutorial – code can always be improved!)
  3. Ran apk add to install postgresql-client, which psycopg needs at runtime.
  4. Ran apk add to install build-base and postgresql-dev, which are needed to compile psycopg – note I added --virtual .tmp to this command, which lets us remove these packages once the pip requirements are installed. This is best practice because these packages are only needed to build psycopg, so there is no reason to keep them in the image after they have been used. This helps keep our image footprint small.
  5. Added a separate RUN statement to install our pip requirements – I separated this into its own RUN statement so that Docker doesn’t re-run all the previous steps if we add more dependencies (a minor caching improvement). It also runs apk del .tmp, which deletes the temporary build dependencies after pip install completes.
  6. Moved the COPY ./backend /backend to below the RUN blocks – this is another cache optimisation – it means Docker won’t need to run all the RUN statements again if we make changes to our code in the backend/ directory.

Now run the following to test our new configuration:

docker compose build

This should complete successfully.

Configure Django

(Full diff on GitHub)

Next we need to configure Django to use our new database.

Open docker-compose.yml and add the following inside the backend service block, at the same indentation level as its existing keys:

    command: >
      sh -c "python manage.py migrate &&
             python manage.py runserver 0.0.0.0:8000"
    environment:
      - DB_HOST=db
      - DB_NAME=db
      - DB_USER=user
      - DB_PASS=localdevpw
    depends_on:
      db:
        condition: service_healthy

Note: Indentation is important here, you can see the full diff on GitHub.

This does the following:

  1. Overrides the default command defined in the Dockerfile to include the migrate command – this will ensure our database migrations (that we’ll add later) are applied every time we start our service.
  2. Adds environment variables for our database – the DB_HOST is the name of the database service because Docker Compose sets up an internal network which allows services to resolve the address of other services using the service name. The DB_NAME, DB_USER and DB_PASS variables should match the credentials defined in our db service that we added previously.
  3. Adds a depends_on block that makes Docker Compose wait for the db service to be healthy before it starts our backend service. This uses the healthcheck we defined in our db service to ensure the database is ready.

Next, open backend/backend/settings.py and make the following changes…

Import the os module at the top of the file:

For the full list of settings and their values, see
https://docs.djangoproject.com/en/5.1/ref/settings/
"""

import os

We’ll need this module to access environment variables set on our project.

Locate the DATABASES block and change it to look like this:

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("DB_NAME"),
        "USER": os.environ.get("DB_USER"),
        "PASSWORD": os.environ.get("DB_PASS"),
        "HOST": os.environ.get("DB_HOST"),
        "PORT": "5432",
    }
}

The full diff for these changes is visible on GitHub.

This updates the database configuration to tell Django to use Postgres.

We specify the ENGINE to the built-in PostgreSQL backend.

Then we pull the NAME, USER, PASSWORD and HOST from our environment variables that we set in our docker-compose.yml file.
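One caveat with os.environ.get is that a missing variable silently becomes None, which surfaces later as a confusing connection error. If you want to fail fast instead, a small helper (hypothetical, not part of the tutorial’s code) could be used in settings.py:

```python
import os


def require_env(name: str) -> str:
    """Return the value of an environment variable, failing loudly if unset."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"Required environment variable {name} is not set")
    return value


# Hypothetical usage in the DATABASES block:
# "NAME": require_env("DB_NAME"),
```

This way a typo in docker-compose.yml shows up immediately at startup with a clear message, rather than as a failed database connection.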

Now let’s test running our app to see if our migrations are applied.

If your services are already running, stop them using CTRL + C.

Then run:

docker compose up

If it works, you should see Django apply the default migrations.

Look for lines like Applying auth.0001_initial... OK in the logs, one per migration being applied.

Create Core Django App

When working with Django, I like to define all my database models in an app called “core”.

This is because I find it easier to keep everything database related in one place.

Also, it’s useful if you ever need to extract the database layer to another library.

Run the following command to create a new core app:

docker compose run --rm backend sh -c "python manage.py startapp core"

This will run our backend service and pass in the command for creating a new app. We use --rm to remove the container after execution is complete.

Then, open backend/backend/settings.py and locate the INSTALLED_APPS line and add 'core' as a new item to the end:

INSTALLED_APPS = [
    # ...    
    'corsheaders',
    'rest_framework',
    'core'
]

This will enable our core app for our Django project.

Add Model

Next we’ll add a very simple database model to test our changes.

Open backend/core/models.py and add the following:

class Recipe(models.Model):
    """Represents a recipe in the system."""
    name = models.CharField(max_length=255)
    steps = models.TextField()

    def __str__(self):
        return self.name

This creates a new database model called Recipe.

When we create and apply the migration for this model, it will be stored in a table called core_recipe in our database (by default, Django prefixes table names with the app label).

We give it two simple fields:

  • name: the name of the recipe
  • steps: a text field where we can add steps for this recipe

We also define the __str__ method to tell Django how we want instances of this model to look when they are converted to a string.

This is useful for when we use the Django admin.
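As a plain-Python sketch of what __str__ gives us (outside Django, just to illustrate the protocol):

```python
class Recipe:
    """Minimal stand-in for the Django model, to illustrate __str__."""

    def __init__(self, name: str, steps: str):
        self.name = name
        self.steps = steps

    def __str__(self) -> str:
        # Any str() call -- including the Django admin's object list --
        # will display the recipe's name instead of a generic object repr
        return self.name


pancakes = Recipe("Pancakes", "Mix the batter, then fry.")
print(pancakes)  # prints "Pancakes"
```

Without __str__, the admin would show something like "Recipe object (1)" for every row, which isn’t very helpful.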

Django uses “migrations” to manage changes to our database (one of the many amazing features of Django).

This means, every time we add/modify/remove a model in our project, we need to create the migration code that will tell Django what changes to make to our underlying database.

Django provides a command to generate these migrations.

Run the following:

docker compose run --rm backend sh -c "python manage.py makemigrations"

This should create a new file inside core/migrations/.
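The generated file (the exact name varies, typically 0001_initial.py) will look roughly like this — shown for illustration only, since Django generates it for you:

```python
from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = []

    operations = [
        migrations.CreateModel(
            name="Recipe",
            fields=[
                # Django adds an auto-incrementing primary key automatically
                ("id", models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name="ID")),
                ("name", models.CharField(max_length=255)),
                ("steps", models.TextField()),
            ],
        ),
    ]
```

Because our compose file runs python manage.py migrate on startup, this migration will be applied the next time the backend service starts.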

We will want to access this model from the Django admin, so open backend/core/admin.py and make it look like this:

from django.contrib import admin
from core import models

admin.site.register(models.Recipe)

This will import our models module and register our model in the Django admin.

Login to Django Admin

Now we are ready to test.

Ensure our app is running with the latest migrations by first stopping the server (if it’s running) using CTRL + C.

Then, run:

docker compose up --build --watch

Once running, navigate to http://127.0.0.1:8000/admin

You should see the Django admin login page:

And you’re probably asking yourself: What’s my username and password?

We can create the credentials to login by running the following command:

docker compose run --rm backend sh -c "python manage.py createsuperuser"

Follow the in-terminal instructions to create a new superuser.

Then, use the superuser to log into the Django admin.

You should see the Recipes model and be able to make changes to it.

And there we go! You should have a Django project with a working database!
