[doc] add open-webui example (#16747)

Signed-off-by: reidliu41 <reid201711@gmail.com>
Co-authored-by: reidliu41 <reid201711@gmail.com>
Reid 2025-04-17 18:27:32 +08:00 committed by GitHub
parent 61a44a0b22
commit d8e557b5e5
3 changed files with 30 additions and 0 deletions

Binary file added (image, 68 KiB)


@ -9,6 +9,7 @@ dstack
helm
lws
modal
+open-webui
skypilot
triton
:::


@ -0,0 +1,29 @@
(deployment-open-webui)=
# Open WebUI
1. Install [Docker](https://docs.docker.com/engine/install/).
2. Start the vLLM server with a supported chat-completion model, e.g.:
```console
vllm serve qwen/Qwen1.5-0.5B-Chat
```
3. Start the [Open WebUI](https://github.com/open-webui/open-webui) Docker container (replace the vLLM serve host and port with your own):
```console
docker run -d -p 3000:8080 \
--name open-webui \
-v open-webui:/app/backend/data \
-e OPENAI_API_BASE_URL=http://<vllm serve host>:<vllm serve port>/v1 \
--restart always \
ghcr.io/open-webui/open-webui:main
```
4. Open it in the browser: <http://open-webui-host:3000/>

   At the top of the page, you should see the model `qwen/Qwen1.5-0.5B-Chat`.
:::{image} /assets/deployment/open_webui.png
:::
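Before (or instead of) going through the Open WebUI interface, you can verify the vLLM endpoint directly. A quick sketch, assuming the same `<vllm serve host>`/`<vllm serve port>` placeholders as above (vLLM's default port is 8000):

```console
# List the models the vLLM server exposes on its OpenAI-compatible API
curl http://<vllm serve host>:<vllm serve port>/v1/models

# Send a test chat completion to the served model
curl http://<vllm serve host>:<vllm serve port>/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "qwen/Qwen1.5-0.5B-Chat",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```

If the second request returns a JSON completion, Open WebUI will be able to reach the same endpoint via `OPENAI_API_BASE_URL`.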