
Commit

improve example notebook langchain
vemonet committed Jan 17, 2024
1 parent 2675729 commit 84b9288
Showing 2 changed files with 68 additions and 9 deletions.
27 changes: 20 additions & 7 deletions docs/docs/index.md
@@ -70,13 +70,26 @@ All files required for querying the model are stored and accessed locally using

## 🗺️ More mature projects

If you are looking for more mature tools to play with LLMs locally, we recommend looking into these excellent projects.

Web UI for chat:

* [HuggingFace chat-ui](https://github.com/huggingface/chat-ui): a Svelte chat web UI, with multi-conversation history and OIDC login.
* [chatbot-ui](https://github.com/mckaywrigley/chatbot-ui): a React chat web UI, with multi-conversation history; no login.
* [chat-langchain](https://github.com/langchain-ai/chat-langchain): a React chat web UI for LangChain. Integrates well with LangSmith to display traces; no login and no multi-conversation history.
* [oobabooga/text-generation-webui](https://github.com/oobabooga/text-generation-webui): a Gradio web UI for Large Language Models, with panels to configure the LLM parameters, well suited for experimentation.
* [chainlit](https://github.com/Chainlit/chainlit): build LLM apps with your own business logic, with a React web UI.
* [FastChat](https://github.com/lm-sys/FastChat): a platform for training, serving, and evaluating LLMs in an arena, with a Gradio web UI.
* [GPT4All](https://gpt4all.io): open-source LLM chatbots that you can run anywhere, with a web UI.
* [localGPT](https://github.com/PromtEngineer/localGPT): chat with your documents on your local device using GPT models.
* [ChatDocs](https://github.com/marella/chatdocs): UI to chat with your documents offline, by the developer of [ctransformers](https://github.com/marella/ctransformers).

Run LLM inference locally (a minimal client sketch follows this list):

* [LocalAI](https://github.com/mudler/LocalAI): an OpenAI-compatible API. Self-hosted, community-driven, and local-first.
* [vLLM](https://github.com/vllm-project/vllm): a high-throughput and memory-efficient inference and serving engine for LLMs (includes an OpenAI-compatible server; requires a GPU).
* [ollama](https://github.com/jmorganca/ollama): get up and running with Llama 2 and other large language models locally.
* [llm](https://github.com/simonw/llm): a Python library for interacting with Large Language Models, both via remote APIs and models installed and run on your own machine, by Simon Willison (check out his blog, [simonwillison.net](https://simonwillison.net), for many well-written articles about LLMs).
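
LocalAI and vLLM both expose an OpenAI-compatible HTTP API, so they can be queried with the official `openai` Python client. Below is a minimal sketch under assumed settings: the `base_url`, port, and model name are placeholders and depend entirely on how your local server is started.

```python
# Minimal sketch: query a locally running OpenAI-compatible server
# (e.g. LocalAI or vLLM). The base_url, api_key, and model name below
# are assumptions; adjust them to whatever your local server exposes.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local endpoint
    api_key="not-needed",                 # most local servers ignore the key
)

response = client.chat.completions.create(
    model="mixtral-8x7b-instruct",  # hypothetical model identifier served locally
    messages=[{"role": "user", "content": "What is the capital of the Netherlands?"}],
)
print(response.choices[0].message.content)
```
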
50 changes: 48 additions & 2 deletions scripts/langchain.ipynb
@@ -10,14 +10,60 @@
"\n",
"```bash\n",
"wget https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF/resolve/main/mixtral-8x7b-instruct-v0.1.Q2_K.gguf\n",
"```\n",
"\n",
"Make sure to pick a model that is already fine-tuned for chat (it should have `instruct` or `chat` in its name)"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Requirement already satisfied: langchain in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (0.1.0)\n",
"Requirement already satisfied: langchain-community in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (0.0.9)\n",
"Requirement already satisfied: llama-cpp-python in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (0.2.27)\n",
"Requirement already satisfied: PyYAML>=5.3 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (6.0.1)\n",
"Requirement already satisfied: SQLAlchemy<3,>=1.4 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (2.0.25)\n",
"Requirement already satisfied: aiohttp<4.0.0,>=3.8.3 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (3.9.1)\n",
"Requirement already satisfied: async-timeout<5.0.0,>=4.0.0 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (4.0.3)\n",
"Requirement already satisfied: dataclasses-json<0.7,>=0.5.7 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (0.6.3)\n",
"Requirement already satisfied: jsonpatch<2.0,>=1.33 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (1.33)\n",
"Requirement already satisfied: langchain-core<0.2,>=0.1.7 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (0.1.7)\n",
"Requirement already satisfied: langsmith<0.1.0,>=0.0.77 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (0.0.77)\n",
"Requirement already satisfied: numpy<2,>=1 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (1.26.3)\n",
"Requirement already satisfied: pydantic<3,>=1 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (2.4.0)\n",
"Requirement already satisfied: requests<3,>=2 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (2.31.0)\n",
"Requirement already satisfied: tenacity<9.0.0,>=8.1.0 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain) (8.2.3)\n",
"Requirement already satisfied: typing-extensions>=4.5.0 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from llama-cpp-python) (4.9.0)\n",
"Requirement already satisfied: diskcache>=5.6.1 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from llama-cpp-python) (5.6.3)\n",
"Requirement already satisfied: attrs>=17.3.0 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (23.2.0)\n",
"Requirement already satisfied: multidict<7.0,>=4.5 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (6.0.4)\n",
"Requirement already satisfied: yarl<2.0,>=1.0 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.9.4)\n",
"Requirement already satisfied: frozenlist>=1.1.1 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.4.1)\n",
"Requirement already satisfied: aiosignal>=1.1.2 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from aiohttp<4.0.0,>=3.8.3->langchain) (1.3.1)\n",
"Requirement already satisfied: marshmallow<4.0.0,>=3.18.0 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from dataclasses-json<0.7,>=0.5.7->langchain) (3.20.1)\n",
"Requirement already satisfied: typing-inspect<1,>=0.4.0 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from dataclasses-json<0.7,>=0.5.7->langchain) (0.9.0)\n",
"Requirement already satisfied: jsonpointer>=1.9 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from jsonpatch<2.0,>=1.33->langchain) (2.4)\n",
"Requirement already satisfied: anyio<5,>=3 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain-core<0.2,>=0.1.7->langchain) (4.2.0)\n",
"Requirement already satisfied: packaging<24.0,>=23.2 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from langchain-core<0.2,>=0.1.7->langchain) (23.2)\n",
"Requirement already satisfied: annotated-types>=0.4.0 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from pydantic<3,>=1->langchain) (0.6.0)\n",
"Requirement already satisfied: pydantic-core==2.10.0 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from pydantic<3,>=1->langchain) (2.10.0)\n",
"Requirement already satisfied: charset-normalizer<4,>=2 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from requests<3,>=2->langchain) (3.3.2)\n",
"Requirement already satisfied: idna<4,>=2.5 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from requests<3,>=2->langchain) (3.6)\n",
"Requirement already satisfied: urllib3<3,>=1.21.1 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from requests<3,>=2->langchain) (1.26.18)\n",
"Requirement already satisfied: certifi>=2017.4.17 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from requests<3,>=2->langchain) (2023.11.17)\n",
"Requirement already satisfied: greenlet!=0.4.17 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from SQLAlchemy<3,>=1.4->langchain) (3.0.3)\n",
"Requirement already satisfied: sniffio>=1.1 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from anyio<5,>=3->langchain-core<0.2,>=0.1.7->langchain) (1.3.0)\n",
"Requirement already satisfied: exceptiongroup>=1.0.2 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from anyio<5,>=3->langchain-core<0.2,>=0.1.7->langchain) (1.2.0)\n",
"Requirement already satisfied: mypy-extensions>=0.3.0 in /home/vemonet/dev/llm/libre-chat/.venv/libre-chat/lib/python3.10/site-packages (from typing-inspect<1,>=0.4.0->dataclasses-json<0.7,>=0.5.7->langchain) (1.0.0)\n"
]
}
],
"source": [
"import sys\n",
"!{sys.executable} -m pip install langchain langchain-community llama-cpp-python\n",
Expand Down
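
The rest of the notebook is truncated in this diff. For orientation only, here is a minimal sketch of how a GGUF model downloaded as above can be loaded through LangChain's `LlamaCpp` wrapper (from `langchain-community`, backed by `llama-cpp-python`); the parameter values and prompt format are assumptions, not necessarily what the notebook uses.

```python
# Minimal sketch (assumed usage; the notebook's actual cells are not shown above):
# load the downloaded Mixtral GGUF file with LangChain's llama-cpp-python wrapper.
from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="./mixtral-8x7b-instruct-v0.1.Q2_K.gguf",  # file downloaded with wget above
    temperature=0.1,  # assumed sampling settings
    max_tokens=256,
    n_ctx=2048,       # assumed context window size
    verbose=False,
)

# Mixtral Instruct models expect the [INST] ... [/INST] prompt format.
print(llm.invoke("[INST] What is the capital of the Netherlands? [/INST]"))
```
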
