vllm / docs / source / getting_started / installation
Latest commit: eaa92d4437 by TJian, "[ROCm] [Feature] [Doc] [Dockerfile] [BugFix] Support Per-Token-Activation Per-Channel-Weight FP8 Quantization Inferencing" (#12501), 2025-02-07 08:13:43 -08:00

| Name | Last commit | Last updated |
| --- | --- | --- |
| ai_accelerator | [Doc] Improve installation signposting (#12575) | 2025-01-31 15:38:35 -08:00 |
| cpu | [Doc] double quote cmake package in build.inc.md (#12840) | 2025-02-06 09:17:55 -08:00 |
| gpu | [ROCm] [Feature] [Doc] [Dockerfile] [BugFix] Support Per-Token-Activation Per-Channel-Weight FP8 Quantization Inferencing (#12501) | 2025-02-07 08:13:43 -08:00 |
| device.template.md | [Doc] Organise installation documentation into categories and tabs (#11935) | 2025-01-13 12:27:36 +00:00 |
| index.md | [Doc] Improve installation signposting (#12575) | 2025-01-31 15:38:35 -08:00 |
| python_env_setup.inc.md | [Doc] Convert docs to use colon fences (#12471) | 2025-01-29 11:38:29 +08:00 |