
vLLM Documentation

Build the docs

  • Make sure you are in the docs directory:
cd docs
  • Install the dependencies:
pip install -r ../requirements/docs.txt
  • Clean the previous build (optional but recommended):
make clean
  • Generate the HTML documentation:
make html

Open the docs with your browser

  • Serve the documentation locally:
python -m http.server -d build/html/

This will start a local server at http://localhost:8000. You can now open your browser and view the documentation.

If port 8000 is already in use, you can specify a different port, for example:

python -m http.server 3000 -d build/html/
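
The serving step above can also be driven from Python, which is handy if you want the OS to pick a free port automatically instead of guessing one. This is a minimal sketch using only the standard library; the build/html directory is assumed to exist from the build steps above, and the helper name serve_directory is illustrative, not part of vLLM.

```python
import functools
import http.server
import threading


def serve_directory(directory, port=0):
    """Serve `directory` over HTTP in a background thread.

    With port=0, the OS picks a free port; the chosen port is
    available as server.server_address[1].
    """
    # Bind the handler to the given directory (supported since Python 3.7).
    handler = functools.partial(
        http.server.SimpleHTTPRequestHandler, directory=directory
    )
    server = http.server.ThreadingHTTPServer(("localhost", port), handler)
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()
    return server


if __name__ == "__main__":
    # Serve the generated docs on an automatically chosen free port.
    server = serve_directory("build/html")
    print(f"Serving on http://localhost:{server.server_address[1]}")
```

Call server.shutdown() when you are done to stop the background thread cleanly.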