Apply suggestions from code review
Co-authored-by: abbycross <[email protected]>
vabarbosa and abbycross authored Nov 11, 2024
1 parent 577c746 commit d73cfb1
Showing 1 changed file with 14 additions and 14 deletions.
docs/guides/qiskit-code-assistant-local.mdx (28 changes: 14 additions & 14 deletions)
@@ -130,7 +130,7 @@ The Ollama application provides a simple solution to run the GGUF models locally

#### Run the Qiskit Code Assistant model in Ollama

- After the `granite-8b-qiskit` GGUF model has been set up in Ollama, run the following command to launch the model and interact with it in the terminal (in chat mode)
+ After the `granite-8b-qiskit` GGUF model has been set up in Ollama, run the following command to launch the model and interact with it in the terminal (in chat mode).

```
ollama run granite-8b-qiskit
```

@@ -148,8 +148,8 @@ Some useful commands:

An alternative to the Ollama application is the `llama-cpp-python` package. It is a Python binding for `llama.cpp`. It gives you more control and flexibility to run the GGUF model locally. It’s ideal for users who wish to integrate the local model in their workflows and Python applications.

- 1. Install `llama-cpp-python`: https://pypi.org/project/llama-cpp-python/
- 1. Interact with the model from within your application using `llama_cpp` e.g.,
+ 1. Install [`llama-cpp-python`](https://pypi.org/project/llama-cpp-python/)
+ 1. Interact with the model from within your application using `llama_cpp`. For example:

```python
from llama_cpp import Llama
```

@@ -181,24 +181,24 @@ raw_pred = model(input, **generation_kwargs)["choices"][0]["text"]
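As an aside (not part of the diff above), a minimal, self-contained sketch of this `llama_cpp` step might look like the following; the model path, prompt, and generation settings are illustrative assumptions rather than the guide's exact code, but the access pattern matches the `raw_pred` line shown in the hunk context.

```python
from llama_cpp import Llama

# Load the locally downloaded GGUF file (the path below is an assumed example).
model = Llama(
    model_path="path/to/granite-8b-qiskit.gguf",
    n_ctx=4096,        # context window; adjust to your hardware
    n_gpu_layers=-1,   # offload all layers to the GPU if one is available
)

# Illustrative prompt and generation settings.
input = "# Build a two-qubit Bell state circuit with Qiskit\n"
generation_kwargs = {"max_tokens": 256, "temperature": 0.2, "echo": False}

# Same access pattern as the guide's example: take the first choice's text.
raw_pred = model(input, **generation_kwargs)["choices"][0]["text"]
print(raw_pred)
```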

### Use the Qiskit Code Assistant extensions

- The VS Code extension and JupyterLab extension for the Qiskit Code Assistant can be used to prompt the locally deployed `granite-8b-qiskit` GGUF model. Once you have the Ollama application [up and running with the model](#using-the-ollama-application) you can configure the extensions to connect to the local service.
+ Use the VS Code extension and JupyterLab extension for the Qiskit Code Assistant to prompt the locally deployed `granite-8b-qiskit` GGUF model. Once you have the Ollama application [set up with the model](#using-the-ollama-application), you can configure the extensions to connect to the local service.


#### Connect with the Qiskit Code Assistant VS Code extension

- Using the Qiskit Code Assistant VS Code extension allows you to interact with the model and perform code completion while writing your code. This can work well for users looking for assistance writing Qiskit code for their Python applications.
+ With the Qiskit Code Assistant VS Code extension, you can interact with the model and perform code completion while writing your code. This can work well for users looking for assistance writing Qiskit code for their Python applications.

- 1. Install the [Qiskit Code Assistant VS Code extension](/guides/qiskit-code-assistant-vscode)
- 1. In VS Code, go to the **User Settings** and set the **Qiskit Code Assistant: Url** to the URL of your local Ollama deployment (i.e., http://localhost:11434)
- 1. Reload VS Code, by going to **View > Command Pallette...** and selecting **Developer: Reload Window**
+ 1. Install the [Qiskit Code Assistant VS Code extension](/guides/qiskit-code-assistant-vscode).
+ 1. In VS Code, go to the **User Settings** and set the **Qiskit Code Assistant: Url** to the URL of your local Ollama deployment (for example, http://localhost:11434).
+ 1. Reload VS Code by going to **View > Command Palette...** and selecting **Developer: Reload Window**.

- The `granite-8b-qiskit` configured in Ollama should appear in the status bar and ready to use.
+ The `granite-8b-qiskit` configured in Ollama should appear in the status bar and is then ready to use.
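As a side note beyond the diff, one hedged way to confirm that the local service the extension points to is reachable is to query the Ollama API directly; the URL and check below assume the default local deployment described above.

```python
import requests

# Default local Ollama endpoint (the same URL configured in the extension settings).
OLLAMA_URL = "http://localhost:11434"

# /api/tags lists the models currently available to the local Ollama server.
models = requests.get(f"{OLLAMA_URL}/api/tags", timeout=10).json().get("models", [])
names = [m["name"] for m in models]
print("granite-8b-qiskit available:", any("granite-8b-qiskit" in n for n in names))
```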

- #### Connect with Qiskit Code Assistant JupyterLab extension
+ #### Connect with the Qiskit Code Assistant JupyterLab extension

- Using the Qiskit Code Assistant JupyterLab extension allows you to interact with the model and perform code completion directly in your Jupyter Notebook. Users who predominantly work with Jupyter Notebooks can take advantage of this extension to further enhance their experience writing Qiskit code.
+ With the Qiskit Code Assistant JupyterLab extension, you can interact with the model and perform code completion directly in your Jupyter Notebook. Users who predominantly work with Jupyter Notebooks can take advantage of this extension to further enhance their experience writing Qiskit code.

- 1. Install the [Qiskit Code Assistant JupyterLab extension](/guides/qiskit-code-assistant-jupyterlab)
- 1. In JupyterLab, go to the **Settings Editor** and set the **Qiskit Code Assistant Service API** to the URL of your local Ollama deployment (i.e., http://localhost:11434)
+ 1. Install the [Qiskit Code Assistant JupyterLab extension](/guides/qiskit-code-assistant-jupyterlab).
+ 1. In JupyterLab, go to the **Settings Editor** and set the **Qiskit Code Assistant Service API** to the URL of your local Ollama deployment (for example, http://localhost:11434).

- The `granite-8b-qiskit` configured in Ollama should appear in the status bar and ready to use.
+ The `granite-8b-qiskit` configured in Ollama should appear in the status bar and is then ready to use.
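Likewise, and again outside the scope of this commit, a one-off completion request can serve as a smoke test of the local service before relying on either extension; this sketch assumes Ollama's standard generate endpoint and an illustrative prompt.

```python
import requests

# Send a single non-streaming completion request to the local Ollama server.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "granite-8b-qiskit",
        "prompt": "# Create a Qiskit circuit that prepares a Bell state\n",
        "stream": False,
    },
    timeout=120,
)
print(resp.json()["response"])
```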
