diff --git a/website/docs/administration/model.md b/website/docs/administration/model.md
index b72a827feaa3..6faea4f552dd 100644
--- a/website/docs/administration/model.md
+++ b/website/docs/administration/model.md
@@ -23,6 +23,7 @@ The `llama.cpp` model can be configured with the following parameters:
 [model.completion.http]
 kind = "llama.cpp/completion"
 api_endpoint = "http://localhost:8888"
+prompt_template = "<PRE> {prefix} <SUF>{suffix} <MID>"  # Example prompt template for CodeLlama model series.
 ```
 
 #### [ollama](https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-completion)
@@ -33,6 +34,7 @@ For setting up the `ollama` model, apply the configuration below:
 [model.completion.http]
 kind = "ollama/completion"
 api_endpoint = "http://localhost:8888"
+prompt_template = "<PRE> {prefix} <SUF>{suffix} <MID>"  # Example prompt template for CodeLlama model series.
 ```
 
 #### [mistral / codestral](https://docs.mistral.ai/api/#operation/createFIMCompletion)