vllm / csrc / quantization
Latest commit: 01a5d18a53 by CHU Tianxiang, "Add Support for 2/3/8-bit GPTQ Quantization Models" (#2330), 2024-02-28 21:52:23 -08:00
Directory           Last commit                                                   Date
awq                 Refactor 2 awq gemm kernels into m16nXk32 (#2723)            2024-02-12 11:02:17 -08:00
fp8_e5m2_kvcache    Fix compile error when using rocm (#2648)                     2024-02-01 09:35:09 -08:00
gptq                Add Support for 2/3/8-bit GPTQ Quantization Models (#2330)   2024-02-28 21:52:23 -08:00
squeezellm          Enable CUDA graph for GPTQ & SqueezeLLM (#2318)               2024-01-03 09:52:29 -08:00