Minor stylistic fix
koppor committed Aug 13, 2024
1 parent 7fb03e7 commit 36e6a3d
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions en/ai/local-llm.md
@@ -17,9 +17,9 @@ After you started your service, you can do this:
 
 Voilà! You can use a local LLM right away in JabRef.
 
-## More detailed tutorial
+## Step-by-step guide for `ollama`
 
-In this section we will explain how to use `ollama` for downloading and running local LLMs.
+The following steps guide you through using `ollama` to download and run local LLMs.
 
 1. Install `ollama` from [their website](https://ollama.com/download)
 2. Select a model that you want to run. `ollama` provides [a large list of models](https://ollama.com/library) to choose from (we recommend trying [`gemma2:2b`](https://ollama.com/library/gemma2:2b), [`mistral:7b`](https://ollama.com/library/mistral), or [`tinyllama`](https://ollama.com/library/tinyllama))
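For reference, the `ollama` steps in the changed guide correspond to commands roughly like the following. This is a minimal sketch, not part of the commit, assuming `ollama` is already installed and the recommended `gemma2:2b` model is chosen:

```bash
# Download (pull) one of the recommended small models
ollama pull gemma2:2b

# Chat with the model locally to verify it runs
ollama run gemma2:2b

# The ollama server listens on http://localhost:11434 by default;
# this is the local endpoint a client such as JabRef would connect to.
```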
