From face83c7eccff05c5cd3ec3b6f114f71b7694e4e Mon Sep 17 00:00:00 2001
From: blueceiling <148506960+blueceiling@users.noreply.github.com>
Date: Mon, 25 Dec 2023 17:37:07 -0700
Subject: [PATCH] [Docs] Add "About" Heading to README.md (#2260)

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 13c654a2..8ea4d029 100644
--- a/README.md
+++ b/README.md
@@ -27,7 +27,7 @@ Easy, fast, and cheap LLM serving for everyone
 - [2023/06] We officially released vLLM! FastChat-vLLM integration has powered [LMSYS Vicuna and Chatbot Arena](https://chat.lmsys.org) since mid-April. Check out our [blog post](https://vllm.ai).
 
 ---
-
+## About
 vLLM is a fast and easy-to-use library for LLM inference and serving.
 
 vLLM is fast with: