This project implements a real-time cryptocurrency data pipeline that collects trade data from the Binance WebSocket API, processes it using Kafka, and stores it in QuestDB for fast time-series querying and analysis.
- **Real-time Data Fetching** from the Binance WebSocket for the BTC/USDT trading pair.
- **Kafka Integration** for decoupling data producer and consumer processes.
- **QuestDB Storage** for storing and querying trade data.
- **Written in Rust** for high performance and low-latency processing.
The pipeline consists of the following components:
- **WebSocket Fetcher (Rust)**:
  - Connects to the Binance WebSocket API and listens for real-time trade updates for the BTC/USDT pair.
  - Sends the fetched trade data to a Kafka topic (the trade message format is sketched after this list).
- **Kafka Producer (Rust)**:
  - Publishes trade data to Kafka after receiving it from the WebSocket fetcher.
- **Kafka Consumer (Rust)**:
  - Consumes the trade data from the Kafka topic.
  - Processes and inserts the data into QuestDB for storage and querying.
- **QuestDB**:
  - A high-performance database optimized for storing and querying time-series data.
  - Stores the incoming trades for fast access via SQL queries.
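Every stage of the pipeline passes around the same JSON trade event. As a point of reference, here is a minimal sketch of how a Binance `btcusdt@trade` payload could be modeled with `serde` (the struct name and the selection of fields are illustrative; the repository may model the message differently):

```rust
use serde::Deserialize;

/// One trade event from the Binance `<symbol>@trade` stream.
/// The renames map Binance's single-letter JSON keys to readable names;
/// unknown keys in the payload are ignored by serde's default behavior.
#[derive(Debug, Deserialize)]
struct TradeEvent {
    #[serde(rename = "e")]
    event_type: String, // "trade"
    #[serde(rename = "E")]
    event_time: u64, // event timestamp, milliseconds
    #[serde(rename = "s")]
    symbol: String, // e.g. "BTCUSDT"
    #[serde(rename = "t")]
    trade_id: u64,
    #[serde(rename = "p")]
    price: String, // Binance sends numeric fields as strings
    #[serde(rename = "q")]
    quantity: String,
    #[serde(rename = "T")]
    trade_time: u64, // trade timestamp, milliseconds
    #[serde(rename = "m")]
    buyer_is_maker: bool,
}

fn main() {
    let raw = r#"{"e":"trade","E":1700000000000,"s":"BTCUSDT","t":1,
                  "p":"43000.10","q":"0.005","T":1700000000000,"m":false}"#;
    let event: TradeEvent = serde_json::from_str(raw).expect("valid trade JSON");
    println!("{event:?}");
}
```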
Follow these steps to get the project running on your local machine.
Before running the project, ensure you have the following installed:
- Docker: Used for running Kafka, Zookeeper, and QuestDB in containers.
- Rust: The programming language used for the fetcher, producer, and consumer components.
- Kafka and Zookeeper: These are used as the messaging system for the data pipeline.
- Clone the repository:

  ```bash
  git clone https://github.com/tylerd0211/inglorious_crypto.git
  cd inglorious_crypto
  ```

- Set up Docker containers:

  Run the following command to start the containers for Kafka, Zookeeper, and QuestDB:

  ```bash
  docker-compose up -d
  ```

  This will bring up the necessary services in the background.
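For orientation, the compose file for a stack like this typically looks something like the sketch below. The image tags, service names, ports, and environment settings here are assumptions for illustration; the repository's own docker-compose.yml is the source of truth:

```yaml
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on: [zookeeper]
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  questdb:
    image: questdb/questdb:latest
    ports:
      - "9000:9000"   # web console / REST
      - "9009:9009"   # InfluxDB line protocol ingestion
```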
The project has three main components that need to run simultaneously: Fetcher, Producer, and Consumer.
- Build the Rust project:

  Build the entire project using Cargo:

  ```bash
  cargo build
  ```

- Run the Fetcher (WebSocket to Kafka):

  This component connects to the Binance WebSocket and streams the data to Kafka (see the first sketch after this list).

  ```bash
  cargo run -p fetcher
  ```

- Run the Producer (Kafka Producer):

  The producer receives the trade data from the WebSocket fetcher, processes it, and publishes it to Kafka.

  ```bash
  cargo run -p producer
  ```

- Run the Consumer (Kafka Consumer to QuestDB):

  The consumer reads from the Kafka topic, processes the trade data, and inserts it into QuestDB (see the second sketch after this list).

  ```bash
  cargo run -p consumer
  ```
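To make the WebSocket-to-Kafka leg concrete, here is a minimal sketch of the core loop. It assumes the `tokio`, `tokio-tungstenite`, `futures-util`, and `rdkafka` crates; the topic name `btc_usdt_trades` and the broker address are assumptions, and the actual code in src/fetcher and src/producer may be structured differently:

```rust
use std::time::Duration;

use futures_util::StreamExt;
use rdkafka::config::ClientConfig;
use rdkafka::producer::{FutureProducer, FutureRecord};
use tokio_tungstenite::{connect_async, tungstenite::Message};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Binance publishes raw trade events for a symbol at <symbol>@trade.
    let url = "wss://stream.binance.com:9443/ws/btcusdt@trade";
    let (mut ws, _response) = connect_async(url).await?;

    // Producer pointed at the broker from docker-compose (assumed localhost:9092).
    let producer: FutureProducer = ClientConfig::new()
        .set("bootstrap.servers", "localhost:9092")
        .set("message.timeout.ms", "5000")
        .create()?;

    while let Some(frame) = ws.next().await {
        if let Message::Text(payload) = frame? {
            // Forward each JSON trade event to Kafka, keyed by symbol so
            // all BTC/USDT trades stay ordered within one partition.
            producer
                .send(
                    FutureRecord::to("btc_usdt_trades")
                        .key("BTCUSDT")
                        .payload(payload.as_str()),
                    Duration::from_secs(5),
                )
                .await
                .map_err(|(err, _msg)| err)?;
        }
    }
    Ok(())
}
```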
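The Kafka-to-QuestDB leg could look roughly like this. Again a sketch rather than the repository's code: it assumes `rdkafka` and `serde_json`, writes InfluxDB line protocol over TCP to QuestDB's default ILP port 9009, and the table and column names are illustrative:

```rust
use std::io::Write;
use std::net::TcpStream;

use rdkafka::config::ClientConfig;
use rdkafka::consumer::{Consumer, StreamConsumer};
use rdkafka::Message;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let consumer: StreamConsumer = ClientConfig::new()
        .set("bootstrap.servers", "localhost:9092")
        .set("group.id", "questdb-writer")
        .set("auto.offset.reset", "earliest")
        .create()?;
    consumer.subscribe(&["btc_usdt_trades"])?;

    // QuestDB listens for InfluxDB line protocol on port 9009 by default.
    let mut questdb = TcpStream::connect("localhost:9009")?;

    loop {
        let msg = consumer.recv().await?;
        let Some(Ok(json)) = msg.payload_view::<str>() else { continue };
        let Ok(trade) = serde_json::from_str::<serde_json::Value>(json) else { continue };

        // Binance sends price/quantity as strings; parse them into numbers.
        let price: f64 = trade["p"].as_str().unwrap_or("0").parse().unwrap_or(0.0);
        let qty: f64 = trade["q"].as_str().unwrap_or("0").parse().unwrap_or(0.0);
        let ts_ns = trade["T"].as_u64().unwrap_or(0) * 1_000_000; // ms -> ns

        // One ILP row: table, tag, fields, designated timestamp in nanoseconds.
        let line = format!("btc_usdt_trades,symbol=BTCUSDT price={price},qty={qty} {ts_ns}\n");
        questdb.write_all(line.as_bytes())?;
    }
}
```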
To verify that the data is being inserted into QuestDB, open the QuestDB web console (with the default Docker setup it is typically at http://localhost:9000) and log in with the default credentials:

- Username: `admin`
- Password: `quest`
Once logged in, run the following query to see the inserted data:
```sql
SELECT * FROM btc_usdt_trades;
```
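Beyond raw rows, QuestDB's time-series SQL extensions make aggregations over the trade stream straightforward. For example, assuming the table has a designated timestamp and a `price` column as in the consumer sketch above, `SAMPLE BY` computes a one-minute average price:

```sql
SELECT timestamp, avg(price) AS avg_price
FROM btc_usdt_trades
SAMPLE BY 1m;
```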
Here’s an overview of the project directory:
- docker-compose.yml: Docker configuration to set up Kafka, Zookeeper, and QuestDB.
- src/fetcher: Rust code to fetch data from Binance WebSocket.
- src/producer: Rust code to publish data to Kafka.
- src/consumer: Rust code to consume data from Kafka and insert it into QuestDB.
- Error Handling: Improve error handling and retries in case of failures.
- Scalability: Scale the pipeline by adding more Kafka consumers or distributing the system across multiple machines.
- Visualization: Integrate with real-time visualization tools like Grafana to monitor and visualize the BTC/USDT data.
- Backtesting: Use the stored data for backtesting cryptocurrency trading strategies.
- Binance WebSocket API: Provides real-time cryptocurrency market data.
- Kafka: Used as a message broker to decouple the data producer and consumer.
- QuestDB: A high-performance time-series database for storing cryptocurrency trade data.
- Rust: Chosen for its high performance and memory safety.