# Python / LangChain / Ollama RAG Demo

This is the code that we used in the June 28 "Crafting Intelligent Python Apps with Retrieval-Augmented Generation" Open Space. It is a command-line interface (CLI) app, but you can expand it into a web service. It should run on a Windows PC, though these instructions assume that you are using macOS.

## Steps to run this locally

1. Install Ollama on your computer.
2. Clone this project from GitHub.
3. Navigate to the project folder in iTerm2 (or Terminal).
4. Pull down phi3:
   - `ollama pull phi3`
5. Pull down nomic-embed-text:
   - `ollama pull nomic-embed-text`
6. Set up your Python virtual environment:
   - `pip install virtualenv` (only if needed; `venv` ships with Python 3)
   - `python3 -m venv ragdemo`
   - `source ragdemo/bin/activate`
7. Install LangChain, beautifulsoup4, tiktoken, and Chroma DB:
   - `pip install langchain_community`
   - `pip install beautifulsoup4`
   - `pip install tiktoken`
   - `pip install chromadb`
8. Run the app with an actual query:
   - `python3 app.py "What did apple announce?"`
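Under the hood, the retrieval step of a RAG app like this one boils down to: embed the question, rank stored chunks by similarity, and stuff the winners into a prompt for phi3. Here is a minimal sketch of that idea, using a toy bag-of-words embedding as a stand-in for nomic-embed-text — the function names are illustrative, not the repo's actual code:

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding" standing in for nomic-embed-text;
    # the real app asks Ollama for a dense vector instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank stored chunks by similarity to the query, keep the top k
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, context):
    # "Stuff" the retrieved chunks into the prompt sent to the LLM
    return ("Answer using only this context:\n"
            + "\n".join(context)
            + f"\n\nQuestion: {query}")
```

In the demo itself, LangChain, Chroma, and Ollama handle each of these pieces (embedding, vector search, prompt templating) with production-grade implementations, but the shape of the pipeline is the same.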

## Are you having trouble running this?

Feel free to contact me on Mastodon or Signal.

## Can you do this without LangChain?

Yeah, as of last month, you can. I wrote an article about how to do it.
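The short version: Ollama serves a local HTTP API (on port 11434 by default), so the LangChain layer can be replaced with a direct call. A sketch using only the standard library — it assumes Ollama is running locally with phi3 already pulled, and `generate` is an illustrative helper, not part of any library:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

def build_payload(model, prompt):
    # Request body for Ollama's /api/generate endpoint;
    # stream=False asks for a single JSON response instead of a stream
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt):
    # Assumes a local Ollama instance with the model already pulled
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# e.g. generate("phi3", "What did apple announce?")
```

You lose LangChain's loaders, splitters, and vector-store integrations, but for a small demo the trade can be worth it.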

## What if I missed this Open Space?

I am planning to present "The Scoop on Embedding: Teaching Large Language Models the 'Flavor of the Day' at Culvers" on August 23 at DevCon Midwest.