Google Cloud Platform, Docker and Django.

In this guide I’m going to show you how to deploy a Django application to Google App Engine (GAE) using Docker.

Specifically, we’ll be doing this:

  1. Creating a new Django project with Docker and Docker Compose
  2. Creating a placeholder app with an image and CSS to demonstrate handling static files
  3. Setting up a project on Google Cloud
  4. Configuring the project for deployment
  5. Deploying the app

Terminology

Before we dive in, I’ll be using the following acronyms in this post:

  • GAE – Google App Engine
  • GCP – Google Cloud Platform

Why use Docker?

I always deploy to Google App Engine via Docker containers for three reasons.

Reason 1 – Onboarding speed

Time and again I’ve experienced projects where it takes new developers hours (or even days) just to get the project running on their local machine.

This can happen for a number of reasons such as: different version of tools, different configurations, conflicting dependencies and so on.

Using Docker on your project significantly reduces these issues and as a result helps new developers get up and running in minutes instead of days.

If used correctly, Docker can ensure all project dependencies are captured inside the project code which means all that setup (faffing with versions, tedious configs, python interpreters etc…) can be achieved by running a single command: docker-compose up.

Reason 2 – Consistency

Using Docker creates consistency across all your developer environments, meaning you never have to hear “but it works on my machine!” again.

Reason 3 – Version management

Managing versions of various tools on your machine can be a pain.

What happens if you need to maintain an ancient project that needs Cloud SDK v160, but want to start a new project that requires v337?

You could install multiple versions, add custom executables to the PATH, develop some kind of version management system, or uninstall and reinstall each version every time you need it.

Or, you could use Docker to manage the versions and upgrade projects when you’re ready!

Project Setup

We’ll start by creating a new Django project using Docker.

This configuration is purely for our local development environment, and won’t be used by Google App Engine itself.

Create Docker development server

First we need to set up the Docker components.

Create a new directory for storing your files (eg: deploy-django-gae-docker/) and initialise it as a Git repo.

Create a file called requirements.txt in the root of the project, and add the following contents:

Django>=3.2,<3.3
gunicorn>=20.1.0,<20.2

This will install Django with the latest patch version of 3.2 and Gunicorn with the latest patch version of 20.1.
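To make the version ranges concrete, here's a small illustration of how a specifier like Django>=3.2,<3.3 behaves (a simplified sketch using tuple comparison; real pip matching follows PEP 440 and also handles pre-releases and more):

```python
def matches(version, lower, upper):
    """Check a version string against a '>=lower,<upper' style range."""
    parsed = tuple(int(part) for part in version.split("."))
    return lower <= parsed < upper

# Django>=3.2,<3.3 pins the minor version but accepts any patch release:
print(matches("3.2.13", (3, 2), (3, 3)))  # True
print(matches("3.3.0", (3, 2), (3, 3)))   # False
```

Pinning the minor version like this means we get bug-fix releases automatically without risking breaking changes from a new minor version.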

We need Gunicorn because it’s what GAE uses to serve our project.

Then create a new empty directory in the project called app/ which will be used to store our Django code which we will create later.

Next create a file called Dockerfile, and populate it with the following contents:

FROM python:3.9-alpine
LABEL maintainer="londonappdeveloper.com"

COPY ./requirements.txt /requirements.txt
COPY ./app /app
WORKDIR /app

RUN python -m venv /py && \
    /py/bin/pip install --upgrade pip && \
    /py/bin/pip install -r /requirements.txt && \
    adduser --disabled-password --no-create-home django-user

ENV PATH="/py/bin:$PATH"

USER django-user

This is a standard Dockerfile which I use for my Django projects.

It does the following:

  1. Uses the python:3.9-alpine image, which provides the Python interpreter on a very lightweight image.
  2. Sets the maintainer label to londonappdeveloper.com (feel free to change this).
  3. Copies the requirements.txt file from our project to the image.
  4. Copies the app/ directory into the image.
  5. Sets the default working directory to /app so we can run the Django manage.py commands directly in our Docker container.
  6. Creates a new virtual environment for our dependencies, upgrades pip, installs the requirements and creates a user for our app.
  7. Adds the new virtual environment to the image’s path.
  8. Sets the container user to django-user.

Now create a docker-compose.yml file which we’ll use for running our local development server:

version: '3.9'

services:
  app:
    build:
      context: .
    ports:
      - 8000:8000
    volumes:
        - ./app:/app
    command: python manage.py runserver 0.0.0.0:8000

This defines a service we can use for development, which will be based on the Dockerfile we created above.

It maps the /app directory as a volume, so our container can access realtime updates as we’re making changes to our project.

It also runs the Django development server on port 8000, and maps this port to our host machine.


Create Django project

Now we have set up Docker, we can use docker-compose to create a new Django project by running the following command:

docker-compose run --rm app sh -c "django-admin startproject app ."

If everything worked, you should see some Django project files appear in the app/ directory like this:

Django project template files.


Run development server

You can run the development server with the following command:

docker-compose up

The output should look like this:

Running docker-compose up

Then you can access the app by visiting http://127.0.0.1:8000.

Placeholder for Django project

Create Django app

Next we’re going to create an app in our Django project so we can create a sample web page.

Start by creating a new app using the following command:

docker-compose run --rm app sh -c "python manage.py startapp demo"

You should see some new files appear in the project like this:

You can remove the following from the demo/ directory, because we won’t be needing them for this tutorial:

  • migrations/
  • admin.py
  • models.py
  • tests.py

Then open settings.py and add demo to the INSTALLED_APPS list:

INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'demo',
]


We’re going to create a page that contains an image and some CSS, to demonstrate how to handle static files with Django when using GAE.

For this to work, we need to make some configuration changes to our project.

Configuring static files

In Django you place static files in a directory called static/ within the appropriate app.

Since a Django project often consists of multiple apps, we need to gather static files into one location, which we can then serve via GAE.

Django provides a built-in command for doing this called collectstatic (see docs).

The collectstatic command goes through all the apps enabled in our project, collects the contents of the static/ subdirectories, and places them in a central location.

We can define the destination of these collected static files using the STATIC_ROOT option in settings.py.

Go ahead and update settings.py by adding the following to the end of the file:

STATIC_ROOT = '/static'
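To make the collection step concrete, here's a simplified sketch of what collectstatic does under the hood (an illustration only; the real command also supports storage backends, file hashing and conflict detection):

```python
import shutil
from pathlib import Path

def collect_static(app_dirs, static_root):
    """Copy every file under each app's static/ directory into static_root."""
    root = Path(static_root)
    root.mkdir(parents=True, exist_ok=True)
    for app_dir in app_dirs:
        source = Path(app_dir) / "static"
        if not source.is_dir():
            continue
        # Walk the app's static/ tree and mirror it under static_root.
        for path in source.rglob("*"):
            if path.is_file():
                dest = root / path.relative_to(source)
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(path, dest)
```

The key point is that files from every app end up merged into one directory tree, which is exactly what we'll point GAE at later.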

Since we’re using Docker, we need to map a volume for handling these files.

We can do this by adding - ./static:/static to the volumes block in docker-compose.yml, making it look like this:

version: '3.9'

services:
  app:
    build:
      context: .
    ports:
      - 8000:8000
    volumes:
        - ./app:/app
        - ./static:/static
    command: python manage.py runserver 0.0.0.0:8000


Create placeholder page

Next we’ll create a page that we can deploy to GAE.

If you haven’t already, create a directory called static/ inside the demo app (the full path will be app/demo/static/).

Then add a new file at app/demo/static/style.css and populate it with the following:

body {
    font-family: Arial, Helvetica, sans-serif;
    color: #ffffff;
    text-align: center;
    background-color: #3d3d3d;
}

img {
    border-radius: 3px;
    border: 1px solid white;
}

Find an image (any jpeg will do) and also add it to the static directory with the name river-sunrise.jpg.

Feel free to use this one:

river-sunrise.jpg

Then add a template at app/demo/templates/demo/demo.html (you will need to create some of the subdirectories), and populate it with the following:

{% load static %}
<!doctype html>
<html lang="en">
  <head>
    <title>Django GAE Placeholder</title>
    <link rel="stylesheet" href="{% static 'style.css' %}" />
  </head>
  <body>
    <h1>Django GAE Placeholder Page</h1>
    <p>Here's a sunrise...</p>
    <img src="{% static 'river-sunrise.jpg' %}" />
  </body>
</html>

Next we’ll create a view inside app/demo/views.py that looks like this:

from django.shortcuts import render


def demo(request):
    return render(request, 'demo/demo.html')

This will render our new template…

Now wire a URL up for this view, by updating app/app/urls.py to read the following:

from django.contrib import admin
from django.urls import path

from demo import views


urlpatterns = [
    path('admin/', admin.site.urls),
    path('', views.demo),
]


We can now start our dev server by running docker-compose up, and navigate to http://127.0.0.1:8000 where we should find this:

Toggle debug mode

Django projects feature a DEBUG setting which can be set to True to display useful information when the app crashes.

This is great for solving issues, but not something you want enabled on a deployed application due to the security implications of revealing too much to attackers.

We need a way to disable this on GAE while keeping it enabled locally…

To do this, we’ll use environment variables.

In settings.py, add the following to the imports:

import os

Then, locate the DEBUG = True line and replace it with this:

DEBUG = bool(int(os.environ.get('DEBUG', 0)))

This will pull the value from an environment variable called DEBUG. If the value is set to 1, then debug mode will be enabled, otherwise it will not.

(The bool() and int() functions are required because environment variables are always strings.)
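A quick standalone illustration of why both conversions are needed (the dictionary here simply stands in for os.environ):

```python
def debug_from_env(environ):
    # Environment variables are always strings, and bool('0') is True,
    # so we convert to int first and then to bool.
    return bool(int(environ.get('DEBUG', 0)))

print(debug_from_env({'DEBUG': '1'}))  # True
print(debug_from_env({'DEBUG': '0'}))  # False
print(debug_from_env({}))              # False (default)
```

The trap this avoids: a naive bool(os.environ.get('DEBUG')) would treat the string '0' as truthy and enable debug mode by accident.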

When debug mode is disabled, we need to specify ALLOWED_HOSTS, which is a Django security feature to prevent HTTP Host header attacks (see docs).

Typically the value would be set to the hostname you are using for your app.

However, according to Google’s official demo project, GAE comes with built-in header attack prevention, so it’s safe to use the asterisk character to set a wildcard allowing all hosts:

ALLOWED_HOSTS = ['*']

Important: This is only safe for projects deployed to GAE. If you’re deploying your app using another service, you must specify the host(s).

Next, update docker-compose.yml to include the following within the app service:

    environment:
      - DEBUG=1


If your Django development server is already running, stop it by pressing CTRL + C, then start it again with docker-compose up.

Create Google Cloud Project

Now we’ve set up our app, we can create a Google Cloud project.

For these steps you’re going to need a Google Cloud Platform (GCP) account with full administrator access.

If you don’t have one already, sign up at https://cloud.google.com/.

Once registered, you should see something like this:

Note that you can only have one App Engine deployment per project on Google Cloud.

If you already had an account, I suggest creating a new project by clicking on the project dropdown in the top navigation bar and choosing NEW PROJECT.

Project dropdown
New project option

Then, fill out the steps to create a new project.

For this tutorial, I’m going to use My First Project which is automatically created with new GCP accounts.

Once you have the project you wish to deploy to, the next step is to locate and select App Engine on the left navigation:

App Engine option on the left hand navigation

Tip: It can be difficult to locate due to all the options. If you can’t find it, use CMD + F (or CTRL + F on Windows) and type “App Engine” in the search.

Once selected, you should see the Welcome to App Engine page, where you can choose Create Application.

Create Application option

On the next page you will need to select a region to base your app.

It’s best to choose a region that is closest to the majority of your users. If you’re just learning how to use App Engine, I suggest choosing the region closest to where you’re based.

Note: Unfortunately, at the time of writing this post, it’s not possible to change your region once it’s set, so choose carefully.

Selecting a region for App Engine

Once you’ve decided on a region, click Create app.

Next we can choose Python as the language and Standard as the environment and click Next.

App Engine language and environment option

The language is self-explanatory because we’re deploying Python code.

The Environment offers two options:

  • Flexible – This allows you to manage your own hosts, which is recommended in cases where you need access to the operating system that runs your app (advanced).
  • Standard – This is a serverless option, where Google manages the servers for you and you don’t have access to them.

The Standard environment is preferred in most cases because it means you can totally forget about the underlying servers, and just focus on building your app.

Once done, you should see the Next steps screen.

You can select I’LL DO THIS LATER at the bottom (not sure why they need to shout?):

App Engine next steps screen

You may see a URL not found screen on the next page.

This is because we haven’t deployed our app yet…

URL not found

Now our app is created, we can move onto deploying our project.

Configure project for Google App Engine

The next step is to add the Cloud SDK components to our project.

Create a new file in the root of the project called docker-compose-deploy.yml, and add the following contents:

version: '3.9'

services:
  gcloud:
    image: google/cloud-sdk:338.0.0
    volumes:
      - gcp-creds:/creds
      - .:/app
    working_dir: /app
    environment:
      - CLOUDSDK_CONFIG=/creds

volumes:
  gcp-creds:

I’ll break down the contents of the file below:

  • version: '3.9' is the version of the Docker Compose syntax we are using.
  • services defines the new service block.
  • gcloud is the name of our service which we will use when we need the Cloud SDK.
  • image: google/cloud-sdk:338.0.0 references the official cloud-sdk Docker image hosted on Docker Hub. We are pinning the version to 338.0.0, however you can replace this with latest to always use the latest version of the SDK (this may cause issues if they push breaking changes to the SDK).
  • volumes defines the new volumes block
  • - gcp-creds:/creds – this is the volume we will use to store our GCP credentials when working with the project. Because we’re using Docker, any credentials we create are stored in ephemeral containers by default, so it’s useful to map a volume to avoid needing to authenticate every time we run a deployment.
  • - .:/app maps the project directory to the container, so the Cloud SDK can access our project files.
  • working_dir: /app sets the working directory to the volume mapped above so we can run commands directly from this location.
  • environment defines the environment variables we’ll be using
  • - CLOUDSDK_CONFIG=/creds tells the Cloud SDK to store our credentials in the /creds directory, which will be mapped to the volume defined previously.
  • volumes is the block used to define named volumes
  • gcp-creds: is a volume we’re creating on our system to store our credentials. This will persist even after we remove the containers, so we can keep our authentication details saved.

Next we need to create our app.yaml file, which tells GAE how to deploy and run our project.

Create a file called app.yaml in the root of the project, and add the following contents:

runtime: python39
entrypoint: gunicorn -b :$PORT --chdir app/ app.wsgi:application

handlers:
  - url: /static
    static_dir: static/
  - url: /.*
    script: auto

I’ll break down the contents of this file below:

  • runtime: python39 defines the version of Python which will be used for GAE (Python 3.9).
  • entrypoint: gunicorn -b :$PORT --chdir app/ app.wsgi:application tells GAE how to start our project. gunicorn is the command for running Gunicorn. -b :$PORT tells it to bind to the port specified by the PORT environment variable (which is set automatically by GAE). --chdir app/ tells Gunicorn to work from the app/ directory where our project code is stored, and app.wsgi:application points to the wsgi.py file that Django auto-generates in our project.
  • Then we have the handlers: block where we define URL mappings for our project.
  • - url: /static specifies the URL prefix for our static files, which by default should be /static
  • static_dir: static/ tells GAE to serve paths beginning with /static from the static/ directory, which will be uploaded to GAE.
  • - url: /.* is a wildcard path which catches everything else that doesn’t start with /static.
  • script: auto means all requests matching the above URL should be handled by the entrypoint script.

This is a minimal configuration required for Django. There are many other configuration options available which are described in the official docs for app.yaml.
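To make the routing behaviour concrete, here's a rough sketch of how the two handlers above resolve incoming requests (an illustration only; GAE's actual matching semantics are defined in the app.yaml reference):

```python
import re

# The handler patterns from app.yaml, checked in order of definition:
handlers = [
    (re.compile(r"^/static"), "static_dir"),  # served directly from static/
    (re.compile(r"^/.*"), "script"),          # forwarded to the Gunicorn entrypoint
]

def route(path):
    """Return the first handler whose URL pattern matches the path."""
    for pattern, handler in handlers:
        if pattern.match(path):
            return handler

print(route("/static/style.css"))  # static_dir
print(route("/"))                  # script
```

Order matters: because /static is listed first, static assets never reach Django, which keeps our app instances free to handle dynamic requests.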

Now create a new file called .gcloudignore and populate it with the following:

.git
.gitignore
__pycache__/
/setup.cfg
docker-compose.yml
docker-compose-deploy.yml
Dockerfile
Makefile
README.md

This is a list of files that should be excluded from GAE when we run our deployment. It’s best to cover everything except the minimum required files such as app/ (for project code), requirements.txt, static/ and app.yaml.
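Conceptually, the deployment tool checks each candidate file against these patterns and skips any that match, something like this simplified sketch (.gcloudignore actually follows gitignore-style syntax, so this per-component matching is only an approximation):

```python
from fnmatch import fnmatch

# A subset of the ignore patterns above, simplified for illustration:
IGNORED = [".git", ".gitignore", "__pycache__", "docker-compose.yml",
           "docker-compose-deploy.yml", "Dockerfile", "Makefile", "README.md"]

def is_uploaded(path):
    """Return False if any component of the path matches an ignore pattern."""
    return not any(fnmatch(part, pattern)
                   for part in path.split("/")
                   for pattern in IGNORED)

print(is_uploaded("app.yaml"))               # True
print(is_uploaded("Dockerfile"))             # False
print(is_uploaded("app/__pycache__/x.pyc"))  # False
```

Keeping the upload minimal speeds up deployments and avoids leaking development-only files to the server.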


Once done, we’re ready to deploy…

Deploying the application

First we need to authenticate our Cloud SDK with GCP.

Because we’re using docker-compose, each command we run needs to be wrapped with the appropriate syntax for running our container.

To authenticate, run the following:

docker-compose -f docker-compose-deploy.yml run --rm gcloud sh -c "gcloud auth login"

This does the following:

  • docker-compose is the command we are running.
  • -f docker-compose-deploy.yml tells Docker Compose to use our deployment configuration file.
  • run is the docker-compose command for running a container.
  • --rm tells Docker Compose to remove the container after it’s finished running (this is to keep our system clear of old lingering containers).
  • gcloud is the name of the service we want to run.
  • sh -c "..." is the command we’re running on the container, which is going to be a single line shell command.
  • gcloud auth login is the Cloud SDK command we’re running which will initiate the login process.

The first time you run the command, the cloud-sdk image will be downloaded from Docker Hub. This can take a few minutes depending on your internet connection, but won’t be required for subsequent run commands, because the image will be cached on our system.

Once the command finishes running, you’ll be presented with output like this:

You need to copy the URL provided and paste it into the browser, where you can authenticate using the Google Account that has access to your Google Cloud project:

Cloud SDK Auth

Click Allow, and you should be taken to a page with a code like this:

Copy the contents of the code, and paste it into the Enter verification code prompt in your Terminal or Command Prompt window:

Hit enter, and you’ll be logged into the Cloud SDK.

Next, head over to your Google Cloud console (the web interface), click on the project dropdown on the top nav and locate the ID for your selected project:

Copy the ID because we’re going to need it in the next command…

Now we’re ready to deploy our project by running the following two commands, replacing PROJECT_ID with the ID you copied:

docker-compose run --rm app sh -c "python manage.py collectstatic"
docker-compose -f docker-compose-deploy.yml run --rm gcloud sh -c "gcloud app deploy --project PROJECT_ID"

The first command will gather our static files into the static/ directory.

The second command will start the deployment to GAE.

After running them, you should see the following prompt where you can enter Y to deploy:

GAE deployment confirmation

The deployment should now start.

It will take a few minutes and once it’s done you should see something like this:

Deployment in progress

You might see this error appear:

If this happens, open the URL listed in the output and enable the API.

Note: For me, when I went to the page the API was already enabled, so I think this is a glitch with the Cloud SDK. If your API is also enabled, simply run the deploy command again and it should work the second time. I think this could be caused by a delay in enabling the API the first time you run the command, so it shouldn’t happen again.

Once deployment is successful, you will see this:

Unfortunately the gcloud app browse command will not work because we are running the Cloud SDK in Docker.

However, you can copy the URL included in the above output and paste it into your browser manually (it’s probably less work than typing the browse command anyway…)

That’s how you deploy a Django application to Google App Engine!

Managing long commands

I expect a lot of readers will be wondering: Do I really need to remember and type these long, cumbersome commands each time I need to run a deployment?

There are two popular solutions for this.

One is to create a file called Makefile in your project and add the following:

.PHONY: build
build:
    docker-compose run --rm app sh -c "python manage.py collectstatic"

.PHONY: deploy
deploy: build
    docker-compose -f docker-compose-deploy.yml run --rm gcloud sh -c "gcloud app deploy --project PROJECT_ID"

Providing you have Make installed (note that the recipe lines under each target must be indented with a tab character, not spaces), you can use the above file to shorten the commands to:

  • make build – collects static files
  • make deploy – collects static files and deploys app

The second option is to handle the deploying using a CI/CD tool like GitHub Actions.

For this, you will need to modify the commands as follows so they don’t ask for the user prompt:

docker-compose run --rm app sh -c "python manage.py collectstatic --noinput"
docker-compose -f docker-compose-deploy.yml run --rm gcloud sh -c "gcloud app deploy --project PROJECT_ID --quiet"
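As a rough sketch of what the GitHub Actions route might look like (everything here is hypothetical: the workflow name, the trigger and the GCP_PROJECT_ID secret are assumptions, and a real pipeline would also need to authenticate the Cloud SDK, for example with a service account key):

```yaml
# .github/workflows/deploy.yml -- hypothetical sketch, not a tested pipeline
name: Deploy
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # A real workflow would authenticate gcloud here (e.g. using a
      # service account key stored in a repository secret) before deploying.
      - run: docker-compose run --rm app sh -c "python manage.py collectstatic --noinput"
      - run: docker-compose -f docker-compose-deploy.yml run --rm gcloud sh -c "gcloud app deploy --project ${{ secrets.GCP_PROJECT_ID }} --quiet"
```

The --noinput and --quiet flags are what make this viable in CI, since there is no user present to answer prompts.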

Next steps

I hope you found this tutorial useful.

If you have any feedback then please feel free to leave it in the comments section below so we can all learn from each other.

Also, please let me know if there are any other topics you would like covered!

For example let me know if you would like to learn how to use App Engine with Cloud SQL or Datastore to store data for your project.
