
# vLLM documents

## Build the docs
```bash
# Install dependencies.
pip install -r requirements-docs.txt

# Build the docs.
make clean
make html
```
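If you are editing the docs iteratively, a live-reload setup can save repeated `make html` runs. The sketch below uses `sphinx-autobuild`; the `source` and `build/html` directory names are assumptions based on the default Sphinx layout and may differ in this repository.

```bash
# Optional live-reload workflow (sketch): rebuild and serve the docs on
# every file change. Assumes Sphinx sources in source/ and HTML output in
# build/html/ -- adjust the paths to match the repository layout.
pip install sphinx-autobuild
sphinx-autobuild source build/html
```

By default `sphinx-autobuild` serves the result at http://127.0.0.1:8000 and refreshes the page when a source file is saved.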
## Open the docs with your browser
```bash
python -m http.server -d build/html/
```
Launch your browser and open localhost:8000.
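If port 8000 is already in use, `http.server` accepts an alternative port as a positional argument, for example:

```bash
# Serve the built HTML on port 8080 instead of the default 8000.
python -m http.server 8080 -d build/html/
```

Then open localhost:8080 instead.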