[Bot] Update API inference documentation #1489

Merged · 26 commits · Nov 18, 2024

Commits
453d825
Add ci workflow to auto-generate api inference doc
hanouticelina Nov 4, 2024
e5453fd
Update token
hanouticelina Nov 4, 2024
d25af56
add permissions section
hanouticelina Nov 15, 2024
ca9c544
change pnpm installation gh action
hanouticelina Nov 15, 2024
c4fa360
fix pnpm installation
hanouticelina Nov 15, 2024
d579522
change working directory
hanouticelina Nov 15, 2024
a4f51f5
revert
hanouticelina Nov 15, 2024
fa5547e
Merge branch 'main' into regenerate-api-inference-docs
Wauplin Nov 15, 2024
db214b4
hopefully good
Wauplin Nov 15, 2024
9dbcd43
trying with other versions
Wauplin Nov 15, 2024
95e7b6d
new tests
Wauplin Nov 15, 2024
7b57ea3
pnpm version
Wauplin Nov 15, 2024
817fdb9
do not update .lock file
Wauplin Nov 15, 2024
77bd2ce
don't update lock file
Wauplin Nov 15, 2024
f9b95ba
fix
Wauplin Nov 15, 2024
5dd804c
explicit package.json
Wauplin Nov 15, 2024
f26cc4b
fix lock file
Wauplin Nov 15, 2024
8a7977b
implcit
Wauplin Nov 15, 2024
d0ccd86
explicit
Wauplin Nov 15, 2024
27dcaf1
update pnpm ?
Wauplin Nov 15, 2024
859db66
daily cron job
Wauplin Nov 15, 2024
c1aef71
update huggingface/tasks before generating docs
hanouticelina Nov 18, 2024
a914215
run workflow on this branch to test
hanouticelina Nov 18, 2024
290314b
Merge branch 'main' into regenerate-api-inference-docs
hanouticelina Nov 18, 2024
202853f
Update API inference documentation (automated)
hanouticelina Nov 18, 2024
4d7d1f0
Merge branch 'main' into update-api-inference-docs-automated-pr
hanouticelina Nov 18, 2024
66 changes: 66 additions & 0 deletions .github/workflows/api_inference_generate_documentation.yml
@@ -0,0 +1,66 @@
name: Update API Inference Documentation

on:
  workflow_dispatch:
  push:
    branches:
      - regenerate-api-inference-docs
  # schedule:
  #   - cron: "0 3 * * *" # Every day at 3am

concurrency:
  group: api_inference_generate_documentation
  cancel-in-progress: true

jobs:
  pull_request:
    runs-on: ubuntu-latest
    steps:
      # Setup
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: "20"
      - name: Install pnpm
        uses: pnpm/action-setup@v2
        with:
          run_install: |
            - recursive: true
              cwd: ./scripts/api-inference
              args: [--frozen-lockfile]
          package_json_file: ./scripts/api-inference/package.json
      - name: Update huggingface/tasks package
        working-directory: ./scripts/api-inference
        run: |
          pnpm update @huggingface/tasks@latest

      # Generate
      - name: Generate API inference documentation
        run: pnpm run generate
        working-directory: ./scripts/api-inference

      # Check changes
      - name: Check changes
        run: git status

      # Create or update Pull Request
      - name: Create Pull Request
        uses: peter-evans/create-pull-request@v7
        with:
          token: ${{ secrets.TOKEN_INFERENCE_SYNC_BOT }}
          commit-message: Update API inference documentation (automated)
          branch: update-api-inference-docs-automated-pr
          delete-branch: true
          title: "[Bot] Update API inference documentation"
          body: |
            This PR automatically regenerates the API inference documentation by running:
            ```sh
            cd scripts/api-inference
            pnpm run generate
            ```

            This PR was automatically created by the [Update API Inference Documentation workflow](https://github.com/huggingface/hub-docs/blob/main/.github/workflows/api_inference_generate_documentation.yml).

            Please review the changes before merging.
          reviewers: |
            Wauplin
            hanouticelina
32 changes: 16 additions & 16 deletions docs/api-inference/tasks/chat-completion.md
@@ -79,7 +79,7 @@ curl 'https://api-inference.huggingface.co/models/google/gemma-2-2b-it/v1/chat/c
</curl>

<python>
With huggingface_hub client:
Using `huggingface_hub`:
```py
from huggingface_hub import InferenceClient

@@ -103,7 +103,7 @@ for chunk in stream:
print(chunk.choices[0].delta.content, end="")
```

With openai client:
Using `openai`:
```py
from openai import OpenAI

@@ -134,11 +134,11 @@ To use the Python client, see `huggingface_hub`'s [package reference](https://hu
</python>

<js>
With huggingface_hub client:
Using `huggingface.js`:
```js
import { HfInference } from "@huggingface/inference"
import { HfInference } from "@huggingface/inference";

const client = new HfInference("hf_***")
const client = new HfInference("hf_***");

let out = "";

@@ -162,14 +162,14 @@ for await (const chunk of stream) {
}
```

With openai client:
Using `openai`:
```js
import { OpenAI } from "openai"
import { OpenAI } from "openai";

const client = new OpenAI({
baseURL: "https://api-inference.huggingface.co/v1/",
apiKey: "hf_***"
})
});

let out = "";

@@ -237,7 +237,7 @@ curl 'https://api-inference.huggingface.co/models/meta-llama/Llama-3.2-11B-Visio
</curl>

<python>
With huggingface_hub client:
Using `huggingface_hub`:
```py
from huggingface_hub import InferenceClient

@@ -272,7 +272,7 @@ for chunk in stream:
print(chunk.choices[0].delta.content, end="")
```

With openai client:
Using `openai`:
```py
from openai import OpenAI

@@ -314,11 +314,11 @@ To use the Python client, see `huggingface_hub`'s [package reference](https://hu
</python>

<js>
With huggingface_hub client:
Using `huggingface.js`:
```js
import { HfInference } from "@huggingface/inference"
import { HfInference } from "@huggingface/inference";

const client = new HfInference("hf_***")
const client = new HfInference("hf_***");

let out = "";

@@ -353,14 +353,14 @@ for await (const chunk of stream) {
}
```

With openai client:
Using `openai`:
```js
import { OpenAI } from "openai"
import { OpenAI } from "openai";

const client = new OpenAI({
baseURL: "https://api-inference.huggingface.co/v1/",
apiKey: "hf_***"
})
});

let out = "";

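For reference, the streamed chat-completion snippets shown only as fragments in the diff above reduce to one short, self-contained call. The sketch below assumes the `google/gemma-2-2b-it` model from the curl example, a placeholder token, and an illustrative prompt; the generated page may word things slightly differently:

```py
from huggingface_hub import InferenceClient

# Placeholder access token; replace with a real Hugging Face token.
client = InferenceClient(api_key="hf_***")

# Stream a chat completion from the serverless Inference API.
stream = client.chat.completions.create(
    model="google/gemma-2-2b-it",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
    max_tokens=500,
    stream=True,
)

for chunk in stream:
    # Each chunk carries an incremental piece of the assistant's reply.
    print(chunk.choices[0].delta.content, end="")
```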
14 changes: 2 additions & 12 deletions docs/api-inference/tasks/image-text-to-text.md
@@ -45,13 +45,8 @@ curl https://api-inference.huggingface.co/models/meta-llama/Llama-3.2-11B-Vision
</curl>

<python>
With huggingface_hub client:
Using `huggingface_hub`:
```py
import requests

API_URL = "https://api-inference.huggingface.co/models/meta-llama/Llama-3.2-11B-Vision-Instruct"
headers = {"Authorization": "Bearer hf_***"}

from huggingface_hub import InferenceClient

client = InferenceClient(api_key="hf_***")
@@ -69,13 +64,8 @@ for chunk in stream:
print(chunk.choices[0].delta.content, end="")
```

With openai client:
Using `openai`:
```py
import requests

API_URL = "https://api-inference.huggingface.co/models/meta-llama/Llama-3.2-11B-Vision-Instruct"
headers = {"Authorization": "Bearer hf_***"}

from openai import OpenAI

client = OpenAI(
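The collapsed parts of the `huggingface_hub` snippet above follow the same streaming chat-completion pattern, with the image passed by URL inside the message content. A minimal sketch, assuming an illustrative image URL and prompt:

```py
from huggingface_hub import InferenceClient

client = InferenceClient(api_key="hf_***")  # placeholder token

# Vision-language chat message: the image is referenced by URL next to the text prompt.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            {"type": "text", "text": "Describe this image in one sentence."},
        ],
    }
]

stream = client.chat.completions.create(
    model="meta-llama/Llama-3.2-11B-Vision-Instruct",
    messages=messages,
    max_tokens=500,
    stream=True,
)

for chunk in stream:
    print(chunk.choices[0].delta.content, end="")
```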
11 changes: 11 additions & 0 deletions docs/api-inference/tasks/text-to-image.md
@@ -45,6 +45,16 @@ curl https://api-inference.huggingface.co/models/black-forest-labs/FLUX.1-dev \
</curl>

<python>
Using `huggingface_hub`:
```py
from huggingface_hub import InferenceClient
client = InferenceClient("black-forest-labs/FLUX.1-dev", token="hf_***")

# output is a PIL.Image object
image = client.text_to_image("Astronaut riding a horse")
```

Using `requests`:
```py
import requests

@@ -57,6 +67,7 @@ def query(payload):
image_bytes = query({
"inputs": "Astronaut riding a horse",
})

# You can access the image with PIL.Image for example
import io
from PIL import Image
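The tail of the `requests` example is collapsed in the diff; presumably it decodes the returned bytes into a `PIL.Image`. A self-contained sketch under that assumption (the output file name is hypothetical):

```py
import io

import requests
from PIL import Image

API_URL = "https://api-inference.huggingface.co/models/black-forest-labs/FLUX.1-dev"
headers = {"Authorization": "Bearer hf_***"}  # placeholder token

def query(payload):
    # The text-to-image endpoint returns raw image bytes.
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.content

image_bytes = query({"inputs": "Astronaut riding a horse"})

# Decode the bytes into a PIL image and save it locally.
image = Image.open(io.BytesIO(image_bytes))
image.save("astronaut.png")
```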
3 changes: 2 additions & 1 deletion scripts/api-inference/package.json
@@ -1,5 +1,6 @@
{
"name": "api-inference-generator",
"packageManager": "[email protected]",
"version": "1.0.0",
"description": "",
"main": "index.js",
@@ -13,7 +14,7 @@
"author": "",
"license": "ISC",
"dependencies": {
"@huggingface/tasks": "^0.12.15",
"@huggingface/tasks": "^0.13.3",
"@types/node": "^22.5.0",
"handlebars": "^4.7.8",
"node": "^20.17.0",
10 changes: 5 additions & 5 deletions scripts/api-inference/pnpm-lock.yaml

Some generated files are not rendered by default.