This project wraps a model in a container, orchestrated by Airflow, that predicts the 'Rented Bike Count' from bike sharing data. Data and MLflow artifacts are stored on S3 (emulated by LocalStack).
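Because S3 is emulated by LocalStack, you can point any S3 client at the LocalStack endpoint to inspect the stored data and artifacts. A minimal sketch with `boto3`, assuming LocalStack's default edge port 4566 (the actual bucket names depend on your configuration):

```python
# Hypothetical sketch: list the LocalStack S3 buckets that hold the
# data and MLflow artifacts. LocalStack accepts arbitrary credentials,
# so the same placeholder values used below for the containers work here.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:4566",  # LocalStack default edge port
    aws_access_key_id="your_access_key_id",
    aws_secret_access_key="your_secret_access_key",
)

for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```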
Create the directories Airflow expects:

```bash
mkdir -p ./dags ./logs ./plugins ./config
```
Set environment variables with your AWS credentials and the Airflow user ID:

```bash
export AWS_ACCESS_KEY_ID="your_access_key_id" AWS_SECRET_ACCESS_KEY="your_secret_access_key"
export AIRFLOW_UID=$(id -u)
```
Prepare and start the containers:

```bash
docker compose build
docker compose up airflow-init
docker compose up -d
```
Wait about a minute until all services are up (a small readiness-polling sketch follows the list below), then open:
- Airflow: http://localhost:8080/ (user/password: "airflow")
- MLflow: http://localhost:5000/
- Prediction API: http://localhost:8000/
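If you prefer to script the wait instead of watching logs, here is a minimal readiness check in Python, assuming the ports above and the `requests` package (Airflow's webserver exposes a `/health` endpoint; the prediction service's `/health` is documented below):

```python
# Poll each service until it responds with HTTP 200 or a timeout elapses.
import time
import requests

SERVICES = {
    "Airflow": "http://localhost:8080/health",
    "MLflow": "http://localhost:5000/",
    "Prediction API": "http://localhost:8000/health",
}

for name, url in SERVICES.items():
    deadline = time.time() + 120  # give each service up to two minutes
    while time.time() < deadline:
        try:
            if requests.get(url, timeout=2).ok:
                print(f"{name} is up")
                break
        except requests.exceptions.RequestException:
            pass  # not reachable yet
        time.sleep(5)
    else:
        print(f"{name} did not come up within 120s")
```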
The prediction service provides the following endpoints:
- `GET /health`: Check service health
- `POST /predict`: Make predictions using the JSON format shown below
Feature descriptions:

- `hour`: Hour of the day (0-23)
- `temperature`: Temperature in Celsius
- `humidity`: Humidity percentage (0-100)
- `wind_speed`: Wind speed in m/s
- `visibility`: Visibility in km
- `dew_point_temperature`: Dew point temperature in Celsius
- `solar_radiation`: Solar radiation in MJ/m²
- `rainfall`: Rainfall in mm
- `snowfall`: Snowfall in cm
Example curl request:

```bash
curl -X POST "http://localhost:8000/predict" \
  -H "Content-Type: application/json" \
  -d '{
    "hour": 12,
    "temperature": 20.5,
    "humidity": 65,
    "wind_speed": 2.5,
    "visibility": 2000,
    "dew_point_temperature": 15.5,
    "solar_radiation": 1.2,
    "rainfall": 0,
    "snowfall": 0
  }'
```
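The same request from Python, using the `requests` library; the exact response shape depends on the service, so it is simply printed here:

```python
# Python equivalent of the curl example above.
import requests

payload = {
    "hour": 12,
    "temperature": 20.5,
    "humidity": 65,
    "wind_speed": 2.5,
    "visibility": 2000,
    "dew_point_temperature": 15.5,
    "solar_radiation": 1.2,
    "rainfall": 0,
    "snowfall": 0,
}

response = requests.post("http://localhost:8000/predict", json=payload)
response.raise_for_status()  # fail loudly on non-2xx responses
print(response.json())  # the predicted 'Rented Bike Count'
```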
To stop all containers and remove their volumes:

```bash
docker compose down --volumes
```