(quantization-index)=

# Quantization

Quantization trades off model precision for smaller memory footprint, allowing large models to be run on a wider range of devices.
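As a minimal sketch of how a quantized checkpoint is typically used with the offline `LLM` API, the example below passes the `quantization` argument when constructing the engine; the model name is only an illustrative placeholder, not a recommendation from this guide:

```python
from vllm import LLM, SamplingParams

# Load an AWQ-quantized checkpoint (placeholder model name for illustration).
llm = LLM(model="TheBloke/Llama-2-7B-Chat-AWQ", quantization="awq")

# Run a short generation to verify the quantized model serves requests.
outputs = llm.generate(["Hello, my name is"], SamplingParams(max_tokens=32))
print(outputs[0].outputs[0].text)
```

The pages listed below cover the individual quantization methods and the hardware they are supported on.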

```{toctree}
:caption: Contents
:maxdepth: 1

supported_hardware
auto_awq
bnb
gguf
int8
fp8
fp8_e5m2_kvcache
fp8_e4m3_kvcache
```