Python / LangChain / Ollama RAG Demo

This is the code that we used in the June 28 "Crafting Intelligent Python Apps with Retrieval-Augmented Generation" Open Space. It is a command-line interface (CLI) app, but you can expand it into a web service. It should also run on a Windows PC, but these instructions assume that you are using macOS.

Steps to run this locally

  1. Install Ollama on your computer
  2. Pull down this project from GitHub
  3. Navigate to the project folder in iTerm2 (or Terminal)
  4. Pull down phi3
    1. ollama pull phi3
  5. Pull down nomic-embed-text
    1. ollama pull nomic-embed-text
  6. Set up your Python virtual environment
    1. pip install virtualenv (if needed)
    2. python3 -m venv ragdemo
    3. source ragdemo/bin/activate
  7. Install LangChain, beautifulsoup4, tiktoken, and Chroma DB
    1. pip install langchain_community
    2. pip install beautifulsoup4
    3. pip install tiktoken
    4. pip install chromadb
  8. Run the app, passing it an actual query (a sketch of what app.py roughly looks like follows this list)
    1. python3 app.py "What did apple announce?"
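
For context on what step 8 is doing: a RAG app like this typically loads a web page, splits it into chunks, embeds the chunks with nomic-embed-text into Chroma, retrieves the chunks most similar to your question, and asks phi3 to answer from them. The sketch below shows that general shape only; the URL, prompt wording, and splitter settings are placeholders, the text-splitter import may need an extra pip install langchain-text-splitters, and the actual app.py in this repository may be organized differently.

    # Minimal LangChain + Ollama RAG sketch (illustrative only; not a copy of app.py)
    import sys

    from langchain_community.document_loaders import WebBaseLoader
    from langchain_community.embeddings import OllamaEmbeddings
    from langchain_community.llms import Ollama
    from langchain_community.vectorstores import Chroma
    from langchain_text_splitters import RecursiveCharacterTextSplitter

    question = sys.argv[1]  # e.g. "What did apple announce?"

    # 1. Load a web page to use as the knowledge base (placeholder URL).
    docs = WebBaseLoader("https://example.com/some-news-page").load()

    # 2. Split the page into overlapping chunks small enough to embed well.
    splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
    chunks = splitter.split_documents(docs)

    # 3. Embed the chunks with nomic-embed-text and store them in Chroma.
    vectorstore = Chroma.from_documents(
        documents=chunks,
        embedding=OllamaEmbeddings(model="nomic-embed-text"),
    )

    # 4. Retrieve the chunks most similar to the question.
    retriever = vectorstore.as_retriever()
    context = "\n\n".join(doc.page_content for doc in retriever.invoke(question))

    # 5. Ask phi3 to answer using only the retrieved context.
    llm = Ollama(model="phi3")
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    print(llm.invoke(prompt))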

Are you having trouble running this?

Feel free to contact me on Mastodon or Signal.

Can you do this without LangChain?

Yeah, as of last month, you can. I wrote an article about how to do it.
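
If you just want the general shape of a LangChain-free version, the sketch below uses the ollama Python client (an extra pip install ollama, not part of the steps above) and a plain cosine-similarity lookup. The chunk texts are placeholders, and this is not the code from that article.

    # RAG without LangChain, using the ollama client directly (illustrative sketch)
    import math

    import ollama

    # Placeholder knowledge base; in practice these chunks would come from a
    # scraped and split web page or document.
    chunks = [
        "Apple announced new features at its developer conference.",
        "Ollama runs large language models locally on your computer.",
    ]

    def embed(text):
        # Get an embedding vector from the nomic-embed-text model.
        return ollama.embeddings(model="nomic-embed-text", prompt=text)["embedding"]

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

    question = "What did apple announce?"
    q_vec = embed(question)

    # Rank the chunks by similarity to the question and keep the best one as context.
    context = max(chunks, key=lambda c: cosine(q_vec, embed(c)))

    response = ollama.chat(
        model="phi3",
        messages=[{
            "role": "user",
            "content": f"Answer using only this context:\n{context}\n\nQuestion: {question}",
        }],
    )
    print(response["message"]["content"])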

What if I missed this Open Space?

I am planning to present "The Scoop on Embedding: Teaching Large Language Models the 'Flavor of the Day' at Culvers" on August 23 at DevCon Midwest.
