Update AI docs for OpenAI support (#578)
* Update AI docs for OpenAI support

* Update cli reference
stwiname authored Dec 19, 2024
1 parent 6f49be6 commit 65114ba
Showing 3 changed files with 25 additions and 8 deletions.
2 changes: 1 addition & 1 deletion docs/ai/build/rag.md
@@ -11,7 +11,7 @@ Defining the RAG data set is largely up to the user. Currently only [L
We provide an off-the-shelf way to create a table from markdown files. This will parse and chunk the content appropriately and use the `nomic-embed-text` model to generate vectors.

```shell
subql-ai embed-mdx -i ./path/to/dir/with/markdown -o ./db --table your-table-name
subql-ai embed-mdx -i ./path/to/dir/with/markdown -o ./db --table your-table-name --model nomic-embed-text
```

## Adding RAG to your app
23 changes: 19 additions & 4 deletions docs/ai/run/cli.md
@@ -1,14 +1,15 @@
# CLI Reference

```
Run an AI app
Run a SubQuery AI app
Commands:
subql-ai Run an AI app [default]
subql-ai Run a SubQuery AI app [default]
subql-ai info Get information on a project
subql-ai embed-mdx Creates a Lance db table with embeddings from MDX files
subql-ai repl Creates a CLI chat with a running app
subql-ai publish Publishes a project to IPFS so it can be easily distributed
subql-ai publish Publishes a project to IPFS so it can be easily
distributed
subql-ai init Create a new project skeleton
Options:
@@ -19,8 +20,16 @@ Options:
[string] [default: "https://unauthipfs.subquery.network/ipfs/api/v0/"]
--ipfsAccessToken A bearer authentication token to be used with the ipfs
endpoint [string]
-h, --host The ollama RPC host
--cacheDir The location to cache data from ipfs. Default is a temp
directory [string]
--debug Enable debug logging [boolean] [default: false]
--logFmt Set the logger format
[string] [choices: "json", "pretty"] [default: "pretty"]
-h, --host The LLM RPC host. If the project model uses an OpenAI
model, then the default value is not used.
[string] [default: "http://localhost:11434"]
--openAiApiKey If the project models use OpenAI models, then this API
key will be passed on to the OpenAI client [string]
-i, --interface The interface to interact with the app
[string] [choices: "cli", "http"] [default: "http"]
--port The port the http service runs on
@@ -29,8 +38,14 @@ Options:
use the cached version [boolean] [default: false]
--toolTimeout Set a limit for how long a tool can take to run, unit
is MS [number] [default: 10000]
--streamKeepAlive The interval in MS to send empty data in stream
responses to keep the connection alive. Only works with
http interface. Use 0 to disable.
[number] [default: 5000]
```

These options can also be specified with environment variables: prefix the flag name with `SUBQL_AI_` and rename it to capitalized snake case, e.g. `SUBQL_AI_CACHE_DIR`.
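As a hedged illustration of that naming rule, the transform from a camelCase flag to its environment variable name can be sketched in shell (the `flag_to_env` helper below is hypothetical, not part of the CLI):

```shell
# Hypothetical helper illustrating the naming rule: a camelCase flag name
# becomes a SUBQL_AI_-prefixed, capitalized snake case variable name.
flag_to_env() {
  echo "SUBQL_AI_$(echo "$1" | sed -E 's/([a-z0-9])([A-Z])/\1_\2/g' | tr '[:lower:]' '[:upper:]')"
}

flag_to_env cacheDir        # prints SUBQL_AI_CACHE_DIR
flag_to_env streamKeepAlive # prints SUBQL_AI_STREAM_KEEP_ALIVE
```

So, for example, setting `SUBQL_AI_CACHE_DIR` in the environment has the same effect as passing `--cacheDir` on the command line.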

### `subql-ai`

Run an AI app.
8 changes: 5 additions & 3 deletions docs/ai/welcome.md
@@ -10,7 +10,7 @@ AI apps are self-contained and easily scalable AI agents that you can use to pow
- **Empower your AI with RAGs:** By integrating [RAG (Retrieval-Augmented Generation) files](./build/rag.md), your AI Apps can leverage domain-specific knowledge efficiently. With initial support for LanceDB and future compatibility with other vector databases, developers can enhance their applications' performance and accuracy. Additionally, publishing to IPFS ensures data integrity and accessibility.
- **Your AI journey starts here:** The SubQuery AI App framework is designed with user-friendliness in mind, providing intuitive wrappers around core features. This lowers the barrier to entry for developers of all skill levels, making it easier to create, run, and deploy AI Apps.
- **Connect, create, and integrate with function tooling:** You can extend your AI Apps with additional [function tooling](./build/function_tools.md), facilitating connections to external systems and tools. This capability enables rich integrations, allowing users to create versatile applications that can interact seamlessly with blockchains and other ecosystems.
- **Choose your model:** By supporting a range of open-source LLM models, starting with Ollama-compatible ones, the SubQuery AI App Framework ensures that users can choose the best model for their applications without being locked into a specific model ecosystem. This flexibility fosters open-source innovation.
- **Choose your model:** By supporting a range of open-source Ollama LLM models as well as OpenAI models, the SubQuery AI App Framework ensures that users can choose the best model for their applications without being locked into a specific model ecosystem. This flexibility fosters open-source innovation.
- **Proven standards for seamless integration:** SubQuery AI Apps expose the industry-standard [OpenAI API](./query/query.md), ensuring compatibility with a wide range of applications and tools. This makes it easier for developers to integrate AI capabilities into their projects while adhering to established standards.

![AI App Framework Features](/assets/img/ai/features.jpg)
@@ -22,7 +22,9 @@ AI apps are self-contained and easily scalable AI agents that you can use to pow
To use the framework, there are a couple of dependencies:

- [Deno](https://deno.land/). The SubQuery AI framework is built on Deno, which is needed to build your app.
- [Ollama](https://ollama.com/). Alternatively an endpoint to an Ollama instance.
- An LLM
  - [Ollama](https://ollama.com/). Alternatively, an endpoint to a running Ollama instance.
  - [OpenAI](https://platform.openai.com). You will need a paid API key.
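A quick, hedged way to confirm the local dependencies are available before installing (the `check_dep` helper is illustrative; a remote Ollama endpoint or OpenAI key removes the need for a local `ollama` binary):

```shell
# Illustrative check that a dependency is available on PATH.
check_dep() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: missing"
  fi
}

check_dep deno    # required to build and run apps
check_dep ollama  # only needed when running models locally
```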

### Install the framework

@@ -38,7 +40,7 @@ You can confirm installation by running `subql-ai --help`.

## Create a new App

You can initialise a new app using `subql-ai init`. It will ask you to provide a name and a Ollama model to use.
You can initialise a new app using `subql-ai init`. It will ask you to provide a name and an LLM model to use.

![Init a new AI App](/assets/img/ai/guide-init.png)

