It's a chatbot for Telegram built on the brilliant llama.cpp. Try the live instance here: @telellamabot
llama-telegram-bot is written in Go and uses go-llama.cpp, which is a binding to llama.cpp.
Let's start! Everything is simple!
Parameters are passed as env variables. Currently there are only five params:
MODEL_PATH=/path/to/model
TG_TOKEN=your_telegram_bot_token_here
Q_SIZE=1000 - task queue limit (optional: default 1000)
N_TOKENS=1024 - tokens to predict (optional: default 1024)
N_CPU=4 - number of CPUs to use (optional: default max available)
git clone --recurse-submodules https://github.com/thedmdim/llama-telegram-bot
cp .env.example .env
Edit .env as you need, then:
docker compose up -d
To build from source instead, you need to have Go and CMake installed:
git clone --recurse-submodules https://github.com/thedmdim/llama-telegram-bot
cd llama-telegram-bot && make
go build .
env TG_TOKEN=<your_telegram_bot_token> MODEL_PATH=/path/to/your/model ./llama-telegram-bot
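Once running, incoming requests are throttled by the Q_SIZE task queue mentioned above. A common way to implement such a bounded queue in Go is a buffered channel with a non-blocking send; this is a sketch of that pattern under assumed behavior ("reject when full"), not the bot's actual implementation:

```go
package main

import "fmt"

// enqueue tries to add a task to a bounded queue. A buffered channel
// holds up to its capacity (the stand-in for Q_SIZE); the select with
// a default branch makes the send non-blocking, so a full queue
// rejects the task instead of stalling the caller.
func enqueue(queue chan string, task string) bool {
	select {
	case queue <- task:
		return true // accepted
	default:
		return false // queue full; the bot could ask the user to retry
	}
}

func main() {
	const qSize = 3 // stand-in for Q_SIZE
	queue := make(chan string, qSize)
	for i := 1; i <= 5; i++ {
		ok := enqueue(queue, fmt.Sprintf("prompt %d", i))
		fmt.Printf("task %d accepted: %v\n", i, ok)
	}
	// tasks 1-3 are accepted; 4 and 5 are rejected once the buffer is full
}
```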