You have been tasked with building a new event-based application that tracks the movement of a user's mouse cursor in a web browser, calculates the cursor's velocity and the distance it has travelled, and displays the results to the user in near real-time on a chart.
- The application must be built using Python and Django
- Mouse move events must be captured in the browser and sent to the server using a REST API
- Events must be stored in the database for later analysis
- Processing of events must be done asynchronously using Celery tasks
- The results must be displayed to the user in near real-time on a chart (this can be on the same page that captures the mouse movements)
Note
- You are free to use any additional libraries or frameworks you wish
- You may chart the results using any style of chart you see fit
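As a sketch of the core calculation (the function name and sample format are illustrative, not part of the starter repository): total distance is the sum of the straight-line segment lengths between consecutive cursor positions, and average velocity is that distance divided by the elapsed time.

```python
import math
from typing import Sequence, Tuple


def distance_and_velocity(samples: Sequence[Tuple[float, float, float]]) -> Tuple[float, float]:
    """Compute (total_distance, average_velocity) from (timestamp_s, x, y) samples.

    Distance is in pixels; velocity is in pixels per second.
    """
    # Sum the straight-line length of each segment between consecutive samples.
    distance = sum(
        math.hypot(x1 - x0, y1 - y0)
        for (_, x0, y0), (_, x1, y1) in zip(samples, samples[1:])
    )
    elapsed = samples[-1][0] - samples[0][0] if len(samples) > 1 else 0.0
    velocity = distance / elapsed if elapsed > 0 else 0.0
    return distance, velocity
```

For example, `distance_and_velocity([(0.0, 0.0, 0.0), (1.0, 3.0, 4.0)])` returns `(5.0, 5.0)`: a single 3-4-5 segment covered in one second.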
A starter repository has been provided to get you up and running quickly. It includes a basic Django app and is configured to use Celery for running asynchronous tasks. A Docker Compose file (compose.yml) is provided to run the supporting services, such as a PostgreSQL database and a Redis message broker.
Before jumping in you'll need the following installed:
- Docker (with Docker Compose)
- Python
- Poetry
With the prerequisites installed you'll now want to do the following:
Start the database and message broker services
docker compose -p fs_starter up
Install the Python dependencies and shell into the virtual environment
poetry install
poetry shell
Rename the .env.example file to .env and update the values as needed
mv .env.example .env
Run the Django migrations
python manage.py migrate
Start the server
python manage.py runserver
Start the Celery worker
celery -A core worker -l debug
Note
Unlike the Django server, the Celery worker does not auto-reload when changes are made. You will need to restart the worker manually to pick up any changes.
The solution consists of 3 main parts:
- The Analytics app, containing the models, serializers and views for the analytics data (here, PointerPositionEvent).
- The Dashboard app, containing the models, serializers and views for the data related to the dashboard and the browser instance the data is shown in.
- The Worker and periodic task manager, which analyse events and persist the results asynchronously, using the Celery worker and Celery Beat.
Some of the data, such as the analytics events and statuses (calculated results), is time-series in nature, so the Timescale library is used to store and query it.
The API is provided using Django REST Framework (DRF); for simplicity, the views are function-based views.
All analytics and status data is related to a browser instance (a page that loads the front-end application) with on_delete=CASCADE, because they form a composition relationship.
All the setup steps in the "Scenario" description above remain valid, except that the Celery worker command is modified to also run Celery Beat:
Start the Celery Worker and Beat
celery -A core worker -l info -B
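Celery Beat needs a schedule to drive the periodic analysis. A minimal sketch for the Django settings module (the task path and interval are assumptions, not taken from the repository):

```python
from datetime import timedelta

# Hypothetical schedule entry: run the analysis task every second so the
# chart stays near real-time. The task path is illustrative only.
CELERY_BEAT_SCHEDULE = {
    "compute-pointer-stats": {
        "task": "analytics.tasks.compute_pointer_stats",
        "schedule": timedelta(seconds=1),
    },
}
```

When Celery is configured to read its settings from Django (the common `namespace="CELERY"` setup), this setting becomes Beat's `beat_schedule`.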
To run the test cases, run
python manage.py test analytics dashboard
Possible future improvements:
- Using class-based views and implementing a full RESTful API set.
- Adding API documentation and design tools such as Swagger.
- Adding a throttling mechanism.
- Using Django sessions for user and session management.
- Designing and implementing an N-tier architecture instead of Model-View.
- Designing and implementing an exception management mechanism.
- Fully designing the application based on Domain-Driven Design.
- Implementing a fully event-driven microservice architecture (e.g. event sourcing or the transactional outbox pattern).
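For the throttling item above, DRF ships with a settings-based rate-limiting mechanism; one possible configuration (the rates here are arbitrary, not recommendations) would be:

```python
# One possible DRF throttling configuration; the rates are arbitrary.
REST_FRAMEWORK = {
    "DEFAULT_THROTTLE_CLASSES": [
        "rest_framework.throttling.AnonRateThrottle",
        "rest_framework.throttling.UserRateThrottle",
    ],
    "DEFAULT_THROTTLE_RATES": {
        "anon": "100/minute",   # unauthenticated clients (e.g. the event POSTs)
        "user": "1000/minute",  # authenticated users
    },
}
```

Mouse-move events arrive at a high rate, so the anonymous rate would need tuning (or a dedicated scoped throttle) before applying this to the event-capture endpoint.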