20231088/vllm
Commit Graph

4 Commits

Author       SHA1        Message                                                                                       Date
sroy745      b41fb9d3b1  [Encoder Decoder] Update Mllama to run with both FlashAttention and XFormers (#9982)         2024-11-12 10:53:57 -08:00
                         Signed-off-by: Sourashis Roy <sroy@roblox.com>
sroy745      a78dd3303e  [Encoder Decoder] Add flash_attn kernel support for encoder-decoder models (#9559)           2024-11-01 23:22:49 -07:00
wangshuai09  3ddbe25502  [Hardware][CPU] using current_platform.is_cpu (#9536)                                        2024-10-22 00:50:43 -07:00
sroy745      1009e93c5d  [Encoder decoder] Add cuda graph support during decoding for encoder-decoder models (#7631)  2024-09-17 07:35:01 -07:00
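Commit 3ddbe25502 replaces ad-hoc CPU checks with vLLM's platform abstraction. A minimal sketch of that current_platform.is_cpu pattern, assuming a vLLM installation from around these commits; pick_attention_backend is a hypothetical helper for illustration, not vLLM's actual backend-selection code:

```python
# Sketch of the current_platform.is_cpu pattern from commit 3ddbe25502.
# pick_attention_backend is a hypothetical helper, not vLLM's real
# backend-selection logic.
from vllm.platforms import current_platform


def pick_attention_backend() -> str:
    # On CPU hosts fall back to Torch SDPA; elsewhere prefer FlashAttention,
    # in the spirit of the encoder-decoder commits listed above.
    if current_platform.is_cpu():
        return "TORCH_SDPA"
    return "FLASH_ATTN"


if __name__ == "__main__":
    print(pick_attention_backend())
```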