Learn Docker with Django and Postgres in 10 Minutes
This is a step-by-step tutorial on how to configure Django to run on Docker along with Postgres.
Dependencies: Docker and Docker Compose, Python 3.7, Pipenv, Django 2.1, and Postgres 10.5.
Let’s set up our project
Assuming Pipenv is installed on your machine, start by creating a new Django project.
$ mkdir django-on-docker && cd django-on-docker
$ mkdir app && cd app
$ pipenv install django==2.1
$ pipenv shell
(django-on-docker)$ django-admin.py startproject hello_django .
(django-on-docker)$ python manage.py migrate
(django-on-docker)$ python manage.py runserver
Visit http://localhost:8000 to see the Django welcome page. Then kill the Django server and exit the Pipenv shell.
Project Directory
└── app
    ├── Pipfile
    ├── Pipfile.lock
    ├── hello_django
    │   ├── __init__.py
    │   ├── settings.py
    │   ├── urls.py
    │   └── wsgi.py
    └── manage.py
Docker
Install Docker if you don't have it, then add a Dockerfile to the app directory:
# pull official base image
FROM python:3.7-alpine
# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# set work directory
WORKDIR /usr/src/app
# install dependencies
RUN pip install --upgrade pip
RUN pip install pipenv
COPY ./Pipfile /usr/src/app/Pipfile
RUN pipenv install --skip-lock --system --dev
# copy project
COPY . /usr/src/app/
So, we start with an Alpine-based Docker image for Python 3.7. We then set two environment variables along with a working directory: PYTHONDONTWRITEBYTECODE stops Python from writing .pyc files into the container, and PYTHONUNBUFFERED sends output straight to the terminal without buffering. Finally, we install Pipenv, copy over the local Pipfile, install the dependencies, and copy over the Django project itself.
Next, add docker-compose.yml or docker-compose.yaml to the project root directory.
version: '3.7'

services:
  web:
    build: ./app
    command: python /usr/src/app/manage.py runserver 0.0.0.0:8000
    volumes:
      - ./app/:/usr/src/app/
    ports:
      - 8000:8000
    environment:
      - SECRET_KEY=please_change_me
Update the SECRET_KEY in settings.py:
SECRET_KEY = os.getenv('SECRET_KEY')
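For context, the relevant lines of settings.py would look roughly like the sketch below; the fallback value is only an illustration for running outside Docker and is not part of the original setup:

# settings.py (excerpt)
import os

# Read the secret key from the environment set in docker-compose.yml;
# the fallback is only a convenience for local runs outside Docker.
SECRET_KEY = os.getenv('SECRET_KEY', 'insecure-local-dev-key')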
Build the image:
$ docker-compose build
Once the image is built, run the container:
$ docker-compose up -d
Navigate to http://localhost:8000/ to again view the welcome screen.
Postgres
To configure Postgres, we’ll need to add a new service to the docker-compose.yml file, update the Django settings, and install Psycopg2.
First, add a new service called db to docker-compose.yml:
version: '3.7'

services:
  web:
    build: ./app
    command: python /usr/src/app/manage.py runserver 0.0.0.0:8000
    volumes:
      - ./app/:/usr/src/app/
    ports:
      - 8000:8000
    environment:
      - SECRET_KEY=please_change_me
      - SQL_ENGINE=django.db.backends.postgresql
      - SQL_DATABASE=postgres
      - SQL_USER=postgres
      - SQL_PASSWORD=postgres
      - SQL_HOST=db
      - SQL_PORT=5432
    depends_on:
      - db
  db:
    image: postgres:10.5-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/

volumes:
  postgres_data:
To persist the data beyond the life of the container, we configure a volume. This config binds postgres_data to the "/var/lib/postgresql/data/" directory in the container.
Update the DATABASES dict in settings.py:
DATABASES = {
    'default': {
        'ENGINE': os.getenv('SQL_ENGINE', 'django.db.backends.sqlite3'),
        'NAME': os.getenv('SQL_DATABASE', os.path.join(BASE_DIR, 'db.sqlite3')),
        'USER': os.getenv('SQL_USER', 'user'),
        'PASSWORD': os.getenv('SQL_PASSWORD', 'password'),
        'HOST': os.getenv('SQL_HOST', 'localhost'),
        'PORT': os.getenv('SQL_PORT', '5432'),
    }
}
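Because every value falls back to a SQLite/localhost default, the project still runs outside Docker; inside the containers, the docker-compose environment takes over. Once the image has been rebuilt with Psycopg2 (next step) and the stack is up, you can confirm which values Django actually uses from a shell inside the web container (docker-compose exec web python manage.py shell); a quick sketch:

from django.db import connection

# These should reflect the docker-compose environment, not the fallbacks.
print(connection.settings_dict['ENGINE'])  # django.db.backends.postgresql
print(connection.settings_dict['HOST'])    # db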
Update the Dockerfile to install the appropriate packages along with Psycopg2:
# pull official base image
FROM python:3.7-alpine

# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# set work directory
WORKDIR /usr/src/app

# install psycopg2
RUN apk update \
    && apk add --virtual build-deps gcc python3-dev musl-dev \
    && apk add postgresql-dev \
    && pip install psycopg2 \
    && apk del build-deps

# install dependencies
RUN pip install --upgrade pip
RUN pip install pipenv
COPY ./Pipfile /usr/src/app/Pipfile
RUN pipenv install --skip-lock --system --dev

# copy project
COPY . /usr/src/app/
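One caveat: depends_on only controls start order, so the web container can come up before Postgres is actually ready to accept connections. If runserver or migrate fails with a connection error on startup, a small wait loop run beforehand can bridge the gap. The following wait_for_db.py is a hypothetical sketch, not part of the original project; it reuses the environment variables defined in docker-compose.yml and could be invoked in the compose command before runserver:

# wait_for_db.py -- hypothetical helper, not part of the original project
import os
import time

import psycopg2

# Keep trying until Postgres accepts a connection, then exit.
while True:
    try:
        psycopg2.connect(
            dbname=os.getenv('SQL_DATABASE', 'postgres'),
            user=os.getenv('SQL_USER', 'postgres'),
            password=os.getenv('SQL_PASSWORD', 'postgres'),
            host=os.getenv('SQL_HOST', 'db'),
            port=os.getenv('SQL_PORT', '5432'),
        ).close()
        break
    except psycopg2.OperationalError:
        print('Postgres is unavailable - sleeping')
        time.sleep(1)

print('Postgres is up')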
Build a new image and spin up the two containers:
$ docker-compose up -d --build
Run Django management commands using docker-compose:
$ docker-compose run web python manage.py createsuperuser
$ docker-compose run web python manage.py makemigrations
$ docker-compose run web python manage.py migrate
Alternatively, apply the migrations inside the running container:
$ docker-compose exec web python manage.py migrate --noinput
Ensure the default Django tables were created:
$ docker-compose exec db psql -U postgres

psql (10.5)
Type "help" for help.

postgres=# \l
                                 List of databases
   Name    |  Owner   | Encoding |  Collate   |   Ctype    |   Access privileges
-----------+----------+----------+------------+------------+-----------------------
 postgres  | postgres | UTF8     | en_US.utf8 | en_US.utf8 |
 template0 | postgres | UTF8     | en_US.utf8 | en_US.utf8 | =c/postgres          +
           |          |          |            |            | postgres=CTc/postgres
 template1 | postgres | UTF8     | en_US.utf8 | en_US.utf8 | =c/postgres          +
           |          |          |            |            | postgres=CTc/postgres
(3 rows)

postgres=# \c postgres
You are now connected to database "postgres" as user "postgres".
postgres=# \dt
                   List of relations
 Schema |            Name            | Type  |  Owner
--------+----------------------------+-------+----------
 public | auth_group                 | table | postgres
 public | auth_group_permissions     | table | postgres
 public | auth_permission            | table | postgres
 public | auth_user                  | table | postgres
 public | auth_user_groups           | table | postgres
 public | auth_user_user_permissions | table | postgres
 public | django_admin_log           | table | postgres
 public | django_content_type        | table | postgres
 public | django_migrations          | table | postgres
 public | django_session             | table | postgres
(10 rows)

postgres=# \q
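As an alternative to psql, you can run the same check with the Django ORM from inside the web container (docker-compose exec web python manage.py shell); a quick sketch:

from django.db import connection

# Lists the tables Django sees in the configured Postgres database;
# it should include 'django_migrations', 'auth_user', and so on.
print(connection.introspection.table_names())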
You can check that the volume was created as well by running:
$ docker volume inspect django-on-docker_postgres_data
You should see something similar to:
[
    {
        "CreatedAt": "2018-11-10T21:27:47Z",
        "Driver": "local",
        "Labels": {
            "com.docker.compose.project": "django-on-docker",
            "com.docker.compose.version": "1.22.0",
            "com.docker.compose.volume": "postgres_data"
        },
        "Mountpoint": "/var/lib/docker/volumes/django-on-docker_postgres_data/_data",
        "Name": "django-on-docker_postgres_data",
        "Options": null,
        "Scope": "local"
    }
]