From c69bf4ee064741c1fa12ec74d20c97f7cdd67d9f Mon Sep 17 00:00:00 2001
From: Reid <61492567+reidliu41@users.noreply.github.com>
Date: Thu, 17 Apr 2025 19:34:20 +0800
Subject: [PATCH] fix: hyperlink (#16778)

Signed-off-by: reidliu41
Co-authored-by: reidliu41
---
 docs/source/deployment/frameworks/open-webui.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/source/deployment/frameworks/open-webui.md b/docs/source/deployment/frameworks/open-webui.md
index 08ad90ba..83e5303a 100644
--- a/docs/source/deployment/frameworks/open-webui.md
+++ b/docs/source/deployment/frameworks/open-webui.md
@@ -2,7 +2,7 @@

 # Open WebUI

-1. Install the (Docker)[https://docs.docker.com/engine/install/]
+1. Install the [Docker](https://docs.docker.com/engine/install/)

 2. Start the vLLM server with the supported chat completion model, e.g.

@@ -10,7 +10,7 @@
 vllm serve qwen/Qwen1.5-0.5B-Chat
 ```

-1. Start the (Open WebUI)[https://github.com/open-webui/open-webui] docker container (replace the vllm serve host and vllm serve port):
+1. Start the [Open WebUI](https://github.com/open-webui/open-webui) docker container (replace the vllm serve host and vllm serve port):

 ```console
 docker run -d -p 3000:8080 \
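
The bug this patch fixes is the classic swapped Markdown link syntax: `(text)[url]` does not render as a hyperlink, while `[text](url)` does. A minimal sketch of a lint-style rewrite for the inverted form (a hypothetical helper, not part of the patch or the vLLM codebase):

```python
import re

# Matches the inverted form "(text)[url]", which CommonMark renders as
# literal parentheses plus a bracketed span rather than a hyperlink.
SWAPPED_LINK = re.compile(r"\(([^()]+)\)\[([^\[\]]+)\]")

def fix_swapped_links(line: str) -> str:
    """Rewrite "(text)[url]" to "[text](url)"; valid links are untouched."""
    return SWAPPED_LINK.sub(r"[\1](\2)", line)

broken = "1. Install the (Docker)[https://docs.docker.com/engine/install/]"
print(fix_swapped_links(broken))
# 1. Install the [Docker](https://docs.docker.com/engine/install/)
```

A check like this could run in docs CI to catch the inverted form before it lands, though the regex is intentionally simple and would miss links whose text or URL itself contains brackets or parentheses.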