Graphlet AI - Chatbot Class

This course covers generative AI, large language models (LLMs), vector search, retrieval-augmented generation (RAG), LLM fine-tuning, and more. It is designed for software engineers, data scientists, and machine learning engineers who want to learn how to build AI-powered chatbots.

Shout Outs to Ivan Reznikov

This course owes a great debt to the LangChain 101 course by Ivan Reznikov. I started with that course, combined it with other content I had created, and shaped the result into a single course covering the topics below.

Course Essentials

Skill Prerequisites

  • Basic knowledge of Python
  • Basic CS and math skills

Course Outline

  • Introduction to Generative AI
  • Introduction to Large Language Models (LLMs)
  • Introduction to Vector Search
  • Introduction to Retrieval-Augmented Generation (RAG)
  • Introduction to LLM Fine-Tuning

Course Projects

  • Build a generative chatbot
  • Build a retrieval-based chatbot
  • Build a generative chatbot with retrieval-augmented generation (RAG)
  • Build a generative chatbot with LLM fine-tuning

Course Materials

Code Environment Setup

I provide a Docker image for this course that uses Jupyter Notebooks. Docker allows you to run the class's code in an environment precisely matching the one in which the code was developed and tested. You can also use the Docker image to run the course code in VSCode or another editor (see below).

In addition to Docker, you can also set up an environment locally using the instructions below.

Install Docker

Install Docker, then check the Get Started page if you aren't familiar with it.

Several Docker containers are used in this course:

  • jupyter: Jupyter Notebook server where we will interactively write and run code.
  • neo4j: Neo4j graph database server where we will store and query graph data for prompt engineering and fine-tuning LLMs.
  • opensearch: OpenSearch server where we will store and query documents for RAG.

Docker Compose

Bring up the course environment with the following command:

docker compose up -d
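
This command reads a docker-compose.yml that declares the three services above. As a rough sketch of what such a file might look like (the image tags, ports, and settings below are assumptions for illustration, not the course's actual configuration — see the repository's docker-compose.yml for the real one):

```yaml
# Hypothetical sketch of the course's docker-compose.yml.
# Image tags and ports are assumptions, not the actual configuration.
services:
  jupyter:
    image: jupyter/minimal-notebook
    container_name: jupyter
    ports:
      - "8888:8888"     # Jupyter Notebook UI
  neo4j:
    image: neo4j:5
    container_name: neo4j
    ports:
      - "7474:7474"     # HTTP browser
      - "7687:7687"     # Bolt protocol
  opensearch:
    image: opensearchproject/opensearch:2
    container_name: opensearch
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"     # REST API
```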

Find the Jupyter Notebook URL via this command:

docker logs jupyter -f --tail 100

Look for the URL containing 127.0.0.1 and open it. You should see the Jupyter Notebook home page.

NOTE: Insert an image of Jupyter home page for this course.

Docker and VSCode

NOTE: add instructions.

Code-Level Environment Setup

We use a Docker image to run the course, but you can also set up the environment so the code works in VSCode or another editor. We provide a development-tools setup using black, flake8, isort, mypy, and pre-commit, which you can modify and use as you see fit.

Install Anaconda Python

We use Anaconda Python with Python 3.10 for this course. You can download Anaconda Python from here. Once you have installed it, create a new environment for this course by running the following command:

conda create -n chatbot-class python=3.10 -y

When you create a new environment or start a new shell, you will need to activate the chatbot-class conda environment with the following command:

conda activate chatbot-class

Now you are running Python 3.10 in the chatbot-class environment. To use this Python in VSCode, hit SHIFT-CMD-P (on Mac) and select Python: Select Interpreter. Then select the chatbot-class environment's Python.

To deactivate this environment, run:

conda deactivate

Other Virtual Environments

Note: I don't support other environments, but any Python 3.10 installation can work if you are willing to make it work. :) You will need to manage your own virtual environments; Python 3's venv module is easy to use.

To create a venv for the project, run:

python3 -m venv chatbot-class

To activate this venv run:

source chatbot-class/bin/activate

To deactivate this environment, run:

deactivate

Install Poetry for Dependency Management

We use Poetry for dependency management, as it makes things fairly painless.

Verify the Poetry installation instructions here so you know the URL https://install.python-poetry.org is legitimate before piping it to python3.

Then install Poetry with the following command:

curl -sSL https://install.python-poetry.org | python3 -

Alternatively, you can install Poetry via pip, although this is less "clean" in terms of environment isolation:

pip install poetry

Install Dependencies via Poetry

Once Poetry is installed, install the course dependencies from the project root:

poetry install
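
For reference, `poetry install` reads the project's dependencies from pyproject.toml. A minimal sketch of such a file follows; the package names and versions here are illustrative assumptions, not the course's actual dependency list:

```toml
# Hypothetical pyproject.toml sketch -- see the repository for the real file.
[tool.poetry]
name = "chatbot-class"
version = "0.1.0"
description = "Course code for the Graphlet AI chatbot class"
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.10"
# Illustrative only:
langchain = "*"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```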

Essential Tools

LangChain is a framework for developing applications powered by language models. It enables applications that:

  • Are context-aware: connect a language model to sources of context (prompt instructions, few-shot examples, content to ground its response in, etc.)
  • Reason: rely on a language model to reason (about how to answer based on provided context, what actions to take, etc.)

The main value props of LangChain are:

  • Components: abstractions for working with language models, along with a collection of implementations for each abstraction. Components are modular and easy to use, whether or not you use the rest of the LangChain framework.
  • Off-the-shelf chains: structured assemblies of components for accomplishing specific higher-level tasks. Off-the-shelf chains make it easy to get started; for complex applications, components make it easy to customize existing chains and build new ones.
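
The "chain of components" idea can be sketched in plain Python. This is a conceptual toy, not LangChain's actual API: each component (prompt template, model, output parser) is a function, and a chain is their composition. The `fake_llm` stand-in replaces a real model call.

```python
# Conceptual sketch of chaining components: a prompt template feeds a model,
# whose output feeds a parser. NOT LangChain's API -- just the pattern.

def prompt_template(question: str) -> str:
    """Format the user's question with grounding instructions."""
    return f"Answer concisely, citing the provided context.\nQuestion: {question}"

def fake_llm(prompt: str) -> str:
    """Stand-in for a real language-model call."""
    return f"MODEL RESPONSE to: {prompt!r}"

def output_parser(raw: str) -> dict:
    """Turn raw model text into a structured result."""
    return {"answer": raw.strip()}

def chain(question: str) -> dict:
    """Compose the components, as an off-the-shelf chain would."""
    return output_parser(fake_llm(prompt_template(question)))

print(chain("What is RAG?")["answer"])
```

Customizing a chain then amounts to swapping one component for another, which is the modularity the bullet points above describe.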

Taking inspiration from the Hugging Face Hub, LangChainHub is a collection of artifacts useful for working with LangChain primitives such as prompts, chains, and agents. The goal of the repository is to be a central resource for sharing and discovering high-quality prompts, chains, and agents that combine to form complex LLM applications.

See example usage here.

LlamaIndex is a "data framework" to help you build LLM apps. It provides the following tools:

  • Offers data connectors to ingest your existing data sources and data formats (APIs, PDFs, docs, SQL, etc.)
  • Provides ways to structure your data (indices, graphs) so that this data can be easily used with LLMs.
  • Provides an advanced retrieval/query interface over your data: Feed in any LLM input prompt, get back retrieved context and knowledge-augmented output.
  • Allows easy integrations with your outer application framework (e.g. with LangChain, Flask, Docker, ChatGPT, anything else).

LlamaIndex provides tools for both beginners and advanced users. Its high-level API lets beginners ingest and query their data in five lines of code, while its lower-level APIs let advanced users customize and extend any module (data connectors, indices, retrievers, query engines, reranking modules) to fit their needs.
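
The retrieval step at the heart of RAG can be sketched in a few lines of plain Python. This toy uses bag-of-words "embeddings" and cosine similarity; a real system (LlamaIndex, OpenSearch) would use a learned embedding model and a vector store instead.

```python
import math

# Toy retrieval: embed documents as bag-of-words vectors, then return the
# document closest (by cosine similarity) to the query.

def embed(text: str) -> dict:
    """Bag-of-words 'embedding': word -> count, punctuation stripped."""
    vec = {}
    for word in text.lower().split():
        word = word.strip(".,?!")
        if word:
            vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b.get(w, 0) for w in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(query: str, documents: list) -> str:
    """Return the document most similar to the query."""
    q = embed(query)
    return max(documents, key=lambda d: cosine(q, embed(d)))

docs = [
    "Neo4j is a graph database.",
    "OpenSearch stores and queries documents.",
    "LLMs generate text from prompts.",
]
print(retrieve("which database stores graph data?", docs))
```

In a full RAG pipeline, the retrieved document would then be inserted into the prompt so the LLM's answer is grounded in it.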
