The chat app is a Chainlit app that interacts with a user either through a Twilio number or through a Chainlit web interface.
The data ingestion app is a Streamlit app that takes a PDF or DOCX file, chunks and embeds it, and stores the embeddings in a MongoDB Atlas vector store.
Both apps share many common dependencies. The main difference is that the chat app uses Chainlit while the ingestion app uses Streamlit, because file uploads are handled much more cleanly in Streamlit than in Chainlit.
The chat app implements an algorithm introduced in Wang, L., Xu, W., Lan, Y., Hu, Z., Lan, Y., Lee, R. K. W., & Lim, E. P. (2023). Plan-and-solve prompting: Improving zero-shot chain-of-thought reasoning by large language models. arXiv preprint arXiv:2305.04091, to generate answers with high accuracy.
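As a rough sketch of the plan-and-solve idea (not the bot's actual code), the snippet below first asks the model to devise a plan and then carry it out step by step before giving a final answer. It assumes the openai Python SDK and a hypothetical model name.

```python
# Illustrative plan-and-solve prompting, not the bot's actual implementation.
# Assumes the `openai` Python SDK and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def plan_and_solve(question: str, context: str = "") -> str:
    # Plan-and-solve prompt: ask the model to devise a plan first, then execute
    # it step by step, instead of jumping straight to an answer.
    prompt = (
        f"{context}\n\nQuestion: {question}\n"
        "Let's first understand the problem and devise a plan to solve it. "
        "Then, let's carry out the plan and solve the problem step by step, "
        "and end with a short final answer."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```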
The current implementation is set up as follows:
- Python 3.11
- Clone this repo
- Set up Python and install dependencies:
python3.11 -m venv venv
source ./venv/bin/activate
pip install -r ./requirements.txt
- Copy .env.example to .env and replace the placeholder values with real ones (a hypothetical example follows below)
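The variable names below are only a hypothetical illustration of the kinds of values the apps need (MongoDB Atlas, an LLM provider, Twilio); the authoritative list of keys is whatever .env.example contains.

```
# Hypothetical .env contents; check .env.example for the real keys
MONGODB_URI=mongodb+srv://<user>:<password>@<cluster>.mongodb.net
OPENAI_API_KEY=sk-...
TWILIO_ACCOUNT_SID=AC...
TWILIO_AUTH_TOKEN=...
```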
To use the chat app, run the following command from the ./chatbot directory.
chainlit run ./src/bot.py
To reach the SMS endpoint of the API, use /sms:
http://localhost:8000/sms
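Twilio delivers inbound SMS as a form-encoded POST, so the endpoint can be exercised locally with something like the sketch below; which form fields the bot actually reads is an assumption here.

```python
# Simulate a Twilio inbound-SMS webhook call against the local /sms endpoint.
# Twilio posts form-encoded fields such as "Body" and "From"; the fields the
# bot reads, and the shape of its reply, are assumptions.
import requests

resp = requests.post(
    "http://localhost:8000/sms",
    data={"Body": "what time is checkout?", "From": "+15551234567"},
)
print(resp.status_code)
print(resp.text)  # likely TwiML (XML) if the bot replies in Twilio's format
```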
To get an answer back in a GET request, use /answer/?question=YOUR-QUESTION. An example:
http://localhost:8000/answer/?question=what time is checkout?
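The same call can be made programmatically; requests handles URL-encoding the question, and the response format (plain text or JSON) depends on the bot.

```python
# Query the /answer endpoint with a URL-encoded question.
import requests

resp = requests.get(
    "http://localhost:8000/answer/",
    params={"question": "what time is checkout?"},
)
print(resp.text)  # plain text or JSON, depending on how the bot formats answers
```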
To ingest data for the RAG system used by the chatbot, run the following command from the ./ingestion directory.
streamlit run ./app.py
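Under the hood the ingestion app chunks the uploaded document, embeds the chunks, and writes the embeddings to the MongoDB Atlas vector store. The sketch below shows one way to do that with LangChain and OpenAI embeddings; the actual app's loaders, splitter settings, index name, and namespace may differ.

```python
# Illustrative chunk -> embed -> store pipeline. Assumes LangChain with OpenAI
# embeddings; the real app's libraries and settings may differ.
import os

from langchain_community.document_loaders import PyPDFLoader  # a DOCX file would need a different loader
from langchain_mongodb import MongoDBAtlasVectorSearch
from langchain_openai import OpenAIEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

docs = PyPDFLoader("hotel_faq.pdf").load()  # hypothetical input file
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100      # hypothetical chunking settings
).split_documents(docs)

store = MongoDBAtlasVectorSearch.from_connection_string(
    os.environ["MONGODB_URI"],              # hypothetical env var name
    namespace="ragdb.embeddings",           # hypothetical database.collection
    embedding=OpenAIEmbeddings(),
    index_name="vector_index",              # hypothetical Atlas vector index name
)
store.add_documents(chunks)
```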