From cc66dccbe8faba2f2c19ca575bdb9958b68145e8 Mon Sep 17 00:00:00 2001
From: RodriMora
Date: Wed, 11 Dec 2024 19:45:49 +0100
Subject: [PATCH] Update README.md (#2827)

Added instructions to clone the repo and change directory into it. The
following steps include a "make install" step that would fail if people
have not cloned the repo and cd'd into it, which may be confusing for
some users.

Added a Python venv alternative to conda too.
---
 README.md | 16 ++++++++++++++--
 1 file changed, 14 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 631a97a2ddc..6beb8281ee6 100644
--- a/README.md
+++ b/README.md
@@ -196,14 +196,26 @@ Detailed blogpost by Adyen on TGI inner workings: [LLM inference at scale with T
 
 You can also opt to install `text-generation-inference` locally.
 
-First [install Rust](https://rustup.rs/) and create a Python virtual environment with at least
-Python 3.9, e.g. using `conda`:
+First clone the repository and change directory into it:
+
+```shell
+git clone https://github.com/huggingface/text-generation-inference
+cd text-generation-inference
+```
+
+Then [install Rust](https://rustup.rs/) and create a Python virtual environment with at least
+Python 3.9, e.g. using `conda` or Python's built-in `venv`:
 
 ```shell
 curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
 
+# using conda
 conda create -n text-generation-inference python=3.11
 conda activate text-generation-inference
+
+# using python venv
+python3 -m venv .venv
+source .venv/bin/activate
 ```
 
 You may also need to install Protoc.