Commit b453339: docs: add docker instructions

vishalmhjn committed Oct 23, 2024 (1 parent: 9cda06f)

Showing 1 changed file with 21 additions and 13 deletions: README.md

Traffic-Waves is a voluntary project focused on daily traffic predictions in Paris.

The project leverages ML and DL techniques to analyze historical traffic data and predict daily traffic patterns in Paris, providing insights for commuters and city planners alike.

## Pipeline components

The pipeline consists of the following components (a sketch of running them individually follows the list):

**Data**:
- **[Data collection](src/call_data_api.py)**: Call the Open Data Paris API and save the data in batches.
- **[Data processing](src/process_data.py)**: Merge the data and apply preprocessing steps to prepare data for batch predictions.

**Machine learning**:
- **[Model training](src/train.py)**: Import and train the ML model on the historical data.
- **[Predictions](src/predict.py)**: Get one-day-ahead predictions using the trained model and batch data.

**Visualization**:
- **[Dashboard](src/app.py)**: Start a Flask app to display the input data and predictions for all the links.
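
For orientation, here is a rough sketch of how the individual components might be invoked directly. This is an assumption for illustration only: the scripts are normally driven by the Makefile targets shown in the Usage section below, and some may expect extra arguments or configuration.

```bash
# Hypothetical direct invocation of the pipeline scripts, in order;
# the Makefile targets remain the supported entry points.
python src/call_data_api.py   # fetch raw traffic counts from the Open Data Paris API
python src/process_data.py    # merge the batches and preprocess them for prediction
python src/train.py           # train the ML model on historical data
python src/predict.py         # produce one-day-ahead predictions from the batch data
python src/app.py             # serve the Flask dashboard on port 5000
```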

## 1. Download and Install

To install the requirements, run the following commands in the parent directory:

```bash
cd traffic-waves
make install
```

### Usage

Run the data collection, processing, and machine learning pipeline (with default options):
```bash
make app
```
Visit `http://127.0.0.1:5000/` in a web browser to open the visualization dashboard.
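
Once the app is up, a quick way to confirm the dashboard is reachable (a generic check, not part of the project's tooling):

```bash
# Print the HTTP status code returned by the dashboard; expect 200 once the Flask app is serving.
curl -s -o /dev/null -w "%{http_code}\n" http://127.0.0.1:5000/
```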

## 2. Building and running with Docker

With Docker installed, start the application by running `docker compose up --build`.

The application will be available at `http://localhost:5000`.
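
A few common `docker compose` commands for working with the container (standard Docker CLI usage, assuming the repository ships a compose file as the command above implies):

```bash
docker compose up --build      # build the image and start the app in the foreground
docker compose up --build -d   # or start it in the background (detached)
docker compose logs -f         # follow the application logs
docker compose down            # stop and remove the containers
```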


## License
**to be updated**
