Commit d391ac9

docs: add --no-webserver for installation flags until we add detailed documentation about setting up the webserver (#2181)
wsxiaoys authored May 19, 2024
1 parent 93039ff commit d391ac9
Show file tree
Hide file tree
Showing 7 changed files with 9 additions and 7 deletions.
2 changes: 1 addition & 1 deletion website/docs/installation/apple.md
````diff
@@ -11,7 +11,7 @@ Thanks to Apple's Accelerate and CoreML frameworks, we can now run Tabby on edge
 brew install tabbyml/tabby/tabby
 
 # Start server with StarCoder-1B
-tabby serve --device metal --model StarCoder-1B
+tabby serve --device metal --model StarCoder-1B --no-webserver
 ```
 
 The compute power of M1/M2 is limited and is likely to be sufficient only for individual usage. If you require a shared instance for a team, we recommend considering Docker hosting with CUDA or ROCm. You can find more information about Docker [here](../docker).
````
4 changes: 2 additions & 2 deletions website/docs/installation/docker-compose.mdx
```diff
@@ -22,7 +22,7 @@ services:
   tabby:
     restart: always
     image: tabbyml/tabby
-    command: serve --model StarCoder-1B --device cuda
+    command: serve --model StarCoder-1B --device cuda --no-webserver
     volumes:
       - "$HOME/.tabby:/data"
     ports:
@@ -47,7 +47,7 @@ services:
     restart: always
     image: tabbyml/tabby
     entrypoint: /opt/tabby/bin/tabby-cpu
-    command: serve --model StarCoder-1B
+    command: serve --model StarCoder-1B --no-webserver
     volumes:
       - "$HOME/.tabby:/data"
     ports:
```
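For context, the changed CUDA service assembles into a Compose file roughly like the following sketch. Only the keys visible in the hunks above come from the diff; the `ports` value is taken from the Docker instructions elsewhere in these docs, and everything else (e.g. any GPU `deploy` stanza) is omitted as unknown:

```yaml
# Hypothetical docker-compose.yml reflecting the change above (sketch, not the full file).
services:
  tabby:
    restart: always
    image: tabbyml/tabby
    command: serve --model StarCoder-1B --device cuda --no-webserver
    volumes:
      - "$HOME/.tabby:/data"
    ports:
      - "8080:8080"
```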
4 changes: 2 additions & 2 deletions website/docs/installation/docker.mdx
````diff
@@ -19,7 +19,7 @@ import TabItem from '@theme/TabItem';
 ```bash title="run.sh"
 docker run -it --gpus all \
   -p 8080:8080 -v $HOME/.tabby:/data \
-  tabbyml/tabby serve --model StarCoder-1B --device cuda
+  tabbyml/tabby serve --model StarCoder-1B --device cuda --no-webserver
 ```
 
 </TabItem>
@@ -28,7 +28,7 @@ import TabItem from '@theme/TabItem';
 ```bash title="run.sh"
 docker run --entrypoint /opt/tabby/bin/tabby-cpu -it \
   -p 8080:8080 -v $HOME/.tabby:/data \
-  tabbyml/tabby serve --model StarCoder-1B
+  tabbyml/tabby serve --model StarCoder-1B --no-webserver
 ```
 
 </TabItem>
````
1 change: 1 addition & 0 deletions website/docs/installation/modal/app.py
```diff
@@ -60,6 +60,7 @@ def app():
             "cuda",
             "--parallelism",
             "4",
+            "--no-webserver",
         ]
     )
 
```
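Put together, the argument list in the Modal app now ends up roughly like the sketch below. The arguments before `--device` are assumptions pieced together from the two Modal hunks in this commit; only the flags visible in the diffs are confirmed:

```python
# Sketch of the argument list the Modal app passes to tabby after this change.
# Leading arguments are assumptions; "--no-webserver" is the flag added by the commit.
args = [
    "serve",
    "--model", "StarCoder-1B",
    "--port", "8000",
    "--device", "cuda",
    "--parallelism", "4",
    "--no-webserver",
]

# Joined form, as it would appear on a command line.
command = " ".join(args)
print(command)
```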
1 change: 1 addition & 0 deletions website/docs/installation/modal/index.md
```diff
@@ -103,6 +103,7 @@ def app():
             "8000",
             "--device",
             "cuda",
+            "--no-webserver",
         ]
     )
 
```
2 changes: 1 addition & 1 deletion website/docs/installation/skypilot/index.md
````diff
@@ -35,7 +35,7 @@ Finally, we define the command line that actually initiates the container job:
 run: |
   docker run --gpus all -p 8080:8080 -v ~/.tabby:/data \
     tabbyml/tabby \
-    serve --model TabbyML/StarCoder-1B --device cuda
+    serve --model TabbyML/StarCoder-1B --device cuda --no-webserver
 ```
 
 ## Launch the service
````
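Assembled into a complete SkyPilot task file, the updated `run` section might look like this. Only the `run` block comes from the diff; the `resources` stanza (including the accelerator type) is an assumption added for illustration:

```yaml
# Hypothetical SkyPilot task.yaml; the resources section is an assumption.
resources:
  accelerators: T4:1

run: |
  docker run --gpus all -p 8080:8080 -v ~/.tabby:/data \
    tabbyml/tabby \
    serve --model TabbyML/StarCoder-1B --device cuda --no-webserver
```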
2 changes: 1 addition & 1 deletion website/docs/installation/windows/index.mdx
````diff
@@ -25,7 +25,7 @@ nvcc --version
 ## 3. Running Tabby locally
 Open a command prompt or PowerShell window in the directory where you downloaded the Tabby executable. Run the following command:
 ```
-.\tabby_x86_64-windows-msvc-cuda117.exe serve --model StarCoder-1B --device cuda
+.\tabby_x86_64-windows-msvc-cuda117.exe serve --model StarCoder-1B --device cuda --no-webserver
 ```
 You should see the following output if the command runs successfully:
 ![Windows running output](./status.png)
````
