From 1c02693cae0b246c289f1c7de24f34076d55f2f7 Mon Sep 17 00:00:00 2001
From: Vincent Emonet
Date: Tue, 19 Sep 2023 22:56:59 +0200
Subject: [PATCH] docs

---
 .github/workflows/docker.yml |  2 ++
 docs/docs/index.md           | 13 +++++++------
 2 files changed, 9 insertions(+), 6 deletions(-)

diff --git a/.github/workflows/docker.yml b/.github/workflows/docker.yml
index 41c010c..62ff0bd 100644
--- a/.github/workflows/docker.yml
+++ b/.github/workflows/docker.yml
@@ -44,6 +44,8 @@ jobs:
             type=ref,event=branch,suffix=-gpu
             type=semver,pattern={{version}},suffix=-gpu
 
+      - run: df -h
+
       - name: Build and push Docker image
         id: build-and-push
         uses: docker/build-push-action@v4
diff --git a/docs/docs/index.md b/docs/docs/index.md
index 5f36294..283b133 100644
--- a/docs/docs/index.md
+++ b/docs/docs/index.md
@@ -32,7 +32,7 @@ Checkout the demo at [**chat.semanticscience.org**](https://chat.semanticscience
 ![UI screenshot](/libre-chat/assets/screenshot-light.png)
 
 !!! warning "Early stage"
-    Development on this project has just started, use it with caution.
+    Development on this project has just started, use it with caution. If you are looking for more mature projects, check out the bottom of this page.
 
 ## ℹ️ How it works
 
@@ -73,11 +73,12 @@ The web service is deployed using a [**⚡ FastAPI**](https://fastapi.tiangolo.c
 All files required for querying the model are stored and accessed locally using [**🦜🔗 LangChain**](https://python.langchain.com): the main model binary, the embeddings and documents to create the vectors, and the [vectorstore](https://python.langchain.com/docs/modules/data_connection/vectorstores/).
 
-## 🗺️ Other projects
+## 🗺️ More mature projects
 
-If you are looking for more mature tools to use LLMs locally we recommend to look into those really good projects:
+If you are looking for more mature tools to play with LLMs locally, we recommend looking into these excellent projects:
 
-* [ChatDocs](https://github.com/marella/chatdocs): UI to Chat with your documents offline.
-* [text-generation-webui](https://github.com/oobabooga/text-generation-webui): A Gradio web UI for Large Language Models
-* [llm](https://github.com/simonw/llm): Python library for interacting with Large Language Models, both via remote APIs and models that can be installed and run on your own machine
+* [llm](https://github.com/simonw/llm): Python library for interacting with Large Language Models, both via remote APIs and models that can be installed and run on your own machine, by Simon Willison (check out his blog at [simonwillison.net](https://simonwillison.net) for many well-written articles about LLMs)
+* [vLLM](https://github.com/vllm-project/vllm): A high-throughput and memory-efficient inference and serving engine for LLMs (includes an OpenAI-compatible server, requires a GPU)
+* [ChatDocs](https://github.com/marella/chatdocs): UI to chat with your documents offline, by the developer of [ctransformers](https://github.com/marella/ctransformers)
 * [localGPT](https://github.com/PromtEngineer/localGPT): Chat with your documents on your local device using GPT models
+* [text-generation-webui](https://github.com/oobabooga/text-generation-webui): A Gradio web UI for Large Language Models