Tilavarauspalvelu Core

Overview

This repository contains the backend of the new reservation platform for the City of Helsinki. Its main purpose is to serve as the backend for tilavarauspalvelu-ui through the GraphQL API.

For more detailed information, please refer to the Tilavarauspalvelu page in Confluence. This is also where you can find the list of members of the project. The preferred contact method is through Helsinki City Slack.

Tech Stack

Integrations

Setup

With Docker

These instructions will set up the backend for local development using Docker. This is especially recommended for frontend developers, as it requires fewer dependencies.

Before you start, make sure Docker and Make are installed on your system. Then, follow the steps below.

  1. Copy .env.example to .env.
cp .env.example .env
  2. Build and run the backend with Docker.
make run

You should now be able to log into the Django admin panel at localhost:8000/admin/. The GraphQL endpoint is at localhost:8000/graphql/.
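
To quickly verify that the API responds, you can send a minimal introspection query to the GraphQL endpoint, for example with a small Python script (this assumes the requests package is available and that introspection is enabled locally; any GraphQL client works just as well):

import requests

# Minimal introspection query to check that the GraphQL endpoint is up.
query = "{ __schema { queryType { name } } }"

response = requests.post(
    "http://localhost:8000/graphql/",
    json={"query": query},
    timeout=10,
)
response.raise_for_status()
print(response.json())  # e.g. {"data": {"__schema": {"queryType": {"name": "Query"}}}}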

To generate test data, follow the steps below.

  1. Connect to the running container.
make bash
  2. Generate test data.
make generate

Without Docker

These instructions will set up the backend for local development without Docker. This is mainly for backend developers, as it requires more dependencies and setup.

Before you start, you'll need the following dependencies:

Installation instructions for these dependencies depend on your operating system and can change over time, so please refer to each dependency's official documentation for how to set them up correctly.

You can skip the dependencies for Postgres and Redis by running them using Docker. To do this, install Docker and run make services.

Now, follow the steps below.

  1. Copy .env.example to .env.
cp .env.example .env

This file contains environment variables used by the project. You can modify these to suit your local development environment.

  2. Copy local_settings_example.py to local_settings.py.
cp local_settings_example.py local_settings.py

These can be used to modify settings for local development without changing the main settings file.

  3. Create a virtual environment & install dependencies.
poetry install
  4. Add pre-commit hooks.
poetry run make hooks
  5. Run migrations.
poetry run make migrate
  6. Generate test data.
poetry run make generate
  7. Start the server.
poetry run make dev

The backend should now be running at localhost:8000.

Testing

Tests are run with pytest.

Some flags that can save time when running tests:

  • To skip slow-running tests: pytest --skip-slow
  • To retain test database between runs: pytest --reuse-db
  • To skip migration-checks at the start of tests: pytest --no-migrations
  • To run tests in parallel: pytest -n 8 --dist=loadscope (uses 8 worker processes; use -n auto to use all available cores)

You can use a pytest.ini file to set up flags for local development.
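
Most of these flags come from pytest plugins (--reuse-db and --no-migrations from pytest-django, -n and --dist from pytest-xdist), while --skip-slow is defined by the project itself. As a rough illustration of how such a flag is typically wired up in a conftest.py (a generic sketch, not the project's actual implementation):

import pytest


def pytest_addoption(parser):
    # Register the custom --skip-slow command-line flag.
    parser.addoption(
        "--skip-slow",
        action="store_true",
        default=False,
        help="Skip tests marked as slow.",
    )


def pytest_configure(config):
    # Register the marker so pytest doesn't warn about an unknown mark.
    config.addinivalue_line("markers", "slow: marks a test as slow-running")


def pytest_collection_modifyitems(config, items):
    # When --skip-slow is given, skip every test tagged with @pytest.mark.slow.
    if not config.getoption("--skip-slow"):
        return
    skip_marker = pytest.mark.skip(reason="--skip-slow was given")
    for item in items:
        if "slow" in item.keywords:
            item.add_marker(skip_marker)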

Updating dependencies

Dependencies are managed by Poetry. Normally, they are updated automatically by Dependabot without any manual intervention (provided the updates don't fail any automated tests).

However, if you want to update them manually, you can do so by running:

poetry update

This will update all dependencies according to the rules defined in pyproject.toml. To see all outdated dependencies, run:

poetry show --outdated

Note that this will also include any sub-dependencies that are not directly defined in pyproject.toml.

Miscellaneous

Background processing

Scheduled & background tasks are run with Celery.

When developing locally, you can run these tasks in a Celery worker with make celery. This uses the filesystem as the message broker. You'll need to create queue and processed folders according to the CELERY_QUEUE_FOLDER_OUT, CELERY_QUEUE_FOLDER_IN, CELERY_PROCESSED_FOLDER environment variables (see .env.example).

If you want to run background tasks synchronously without Celery, set the environment variable CELERY_ENABLED to False. Scheduled tasks still need the worker in order to run.
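
As a rough sketch of how these pieces fit together (the project's actual Celery configuration lives in its settings module and may differ, and the mapping of the folder variables below is an assumption), the filesystem broker is provided by kombu's filesystem transport, and synchronous execution corresponds to Celery's task_always_eager option:

import os

from celery import Celery

CELERY_ENABLED = os.environ.get("CELERY_ENABLED", "True") == "True"

app = Celery("tilavarauspalvelu")

if CELERY_ENABLED:
    # Use the local filesystem as the message broker; the folders must exist beforehand.
    app.conf.broker_url = "filesystem://"
    app.conf.broker_transport_options = {
        "data_folder_in": os.environ["CELERY_QUEUE_FOLDER_IN"],
        "data_folder_out": os.environ["CELERY_QUEUE_FOLDER_OUT"],
        "processed_folder": os.environ["CELERY_PROCESSED_FOLDER"],
        "store_processed": True,
    }
else:
    # Run tasks synchronously in the calling process instead of sending them to a worker.
    app.conf.task_always_eager = True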

Authentication

Authentication is handled by Tunnistamo using the django-helusers library. You'll need to get the TUNNISTAMO_ADMIN_SECRET from the Azure Pipelines library or from a colleague and set that in your .env file.

Instead of JWTs, authentication is managed via sessions. See this ADR in Confluence for the reasoning behind this decision.

Static files

Static files are served by the Whitenoise package. These are all files that are not uploaded by users through the Django admin.

Media files are served by uWSGI's static file implementation, offloaded to threads. These are all files uploaded by users through the Django admin.

If there are performance issues (e.g. 502 errors from the Application Gateway), it is likely that the process count and/or process scale-up needs to be increased.

Translations

Translations are handled by Django's built-in translation system. GitHub Actions CI will check that all translations are up-to-date during PRs. To update translations, run make translations. This will update the .po files located in the locale directory.
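
For code-level strings, this is Django's standard gettext machinery; a string marked for translation like the one below (a generic example, not project code) will end up in those .po files after running make translations:

from django.utils.translation import gettext_lazy as _

# Strings wrapped in gettext_lazy are collected into the .po files.
ERROR_MESSAGE = _("Reservation could not be created.")  # hypothetical message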

For model field translations, we use django-modeltranslation. The package has integrations in all the relevant parts of the project (serializers, admin, etc.). See code for more details.
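
For reference, registering translatable model fields with django-modeltranslation follows this general pattern (the model and field names below are placeholders; see the project's translation registrations for the real ones):

from modeltranslation.translator import TranslationOptions, register

from .models import ReservationUnit  # hypothetical import


@register(ReservationUnit)
class ReservationUnitTranslationOptions(TranslationOptions):
    # Adds a translated column per configured language (e.g. name_fi, name_en, name_sv).
    fields = ("name", "description")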

Debugging

For debugging during development, the Django Debug Toolbar package can be used. The Django GraphQL Debug Toolbar extension is used for the GraphQL endpoint.

You should add a local_settings.py file at the root of the project and add three classes called LocalMixin, DockerMixin and AutomatedTestMixin to it. These can be used to override settings for local development, Docker-based development, and automated tests, respectively.

Note that in order for development settings to work correctly, you need to set the DJANGO_SETTINGS_ENVIRONMENT environment variable to Local when running the server.
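
A minimal local_settings.py could look roughly like this (a sketch only; what goes into each mixin is up to you, and the DEBUG override is just an example of the kind of setting you might place in LocalMixin):

class LocalMixin:
    # Overrides applied when DJANGO_SETTINGS_ENVIRONMENT=Local,
    # e.g. enabling the debug toolbars mentioned above.
    DEBUG = True


class DockerMixin:
    # Overrides for running the backend inside Docker.
    pass


class AutomatedTestMixin:
    # Overrides for automated test runs.
    pass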

Development environment

It's recommended to set up ruff-lsp to enable Ruff linting and formatting support in your editor.

Image cache

In production, a Varnish cache is used for reservation unit and purpose images. When a new image is uploaded, existing images are removed from the cache using a purge task. For more details about how the purge is done, check the image cache utility.

There are four related settings:

  • IMAGE_CACHE_ENABLED: toggles caching on/off
  • IMAGE_CACHE_VARNISH_HOST: Varnish hostname
  • IMAGE_CACHE_PURGE_KEY: secret key for purge requests
  • IMAGE_CACHE_HOST_HEADER: Host header value in purge requests
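
As a rough sketch of what a purge request might look like (the URL layout and header name below are assumptions; see the image cache utility in the codebase for the real implementation):

import requests

# Placeholders standing in for the settings listed above.
IMAGE_CACHE_VARNISH_HOST = "https://varnish.example.com"
IMAGE_CACHE_PURGE_KEY = "secret-purge-key"
IMAGE_CACHE_HOST_HEADER = "tilavaraus.example.fi"


def purge(image_path: str) -> None:
    # Ask Varnish to drop the cached copy of a single image.
    response = requests.request(
        "PURGE",
        f"{IMAGE_CACHE_VARNISH_HOST}{image_path}",
        headers={
            "X-VC-Purge-Key": IMAGE_CACHE_PURGE_KEY,  # assumed header name
            "Host": IMAGE_CACHE_HOST_HEADER,
        },
        timeout=10,
    )
    response.raise_for_status()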