20231088/vllm
133 Commits, 1 Branch, 0 Tags
Commit Graph: 3 Commits

Author                 SHA1        Date                        Message
Woosuk Kwon            0f4b32199e  2023-04-15 09:03:24 -07:00  Support various block sizes & Change default block size to 16 (#38)
Siyuan (Ryans) Zhuang  21b3671bbc  2023-04-04 20:34:46 -07:00  Basic attention kernel that supports cached KV + (multi-)prompts (#24)
Woosuk Kwon            0deacbce6e  2023-03-01 15:02:19 -08:00  Implement single_query_cached_kv_attention kernel (#3)
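The last commit names the engine's decode-time kernel, single_query_cached_kv_attention, which attends one new query token over keys and values cached in fixed-size blocks (default block size 16, per the most recent commit). As a rough NumPy sketch only, not the repository's CUDA implementation, the example below illustrates the computation such a kernel performs; the function signature, argument shapes, and block layout are assumptions made for illustration.

```python
import numpy as np

def single_query_cached_kv_attention(q, key_blocks, value_blocks, num_tokens):
    """Attention for one query token over a KV cache stored in fixed-size blocks.

    Illustrative sketch only; shapes are assumptions, not the real kernel's layout.
    q:            (num_heads, head_dim) query for the token being decoded
    key_blocks:   list of (block_size, num_heads, head_dim) arrays
    value_blocks: list of (block_size, num_heads, head_dim) arrays
    num_tokens:   number of valid cached tokens (the last block may be partially filled)
    """
    head_dim = q.shape[-1]
    # Gather the valid cached keys/values into contiguous (num_tokens, heads, dim) arrays.
    keys = np.concatenate(key_blocks, axis=0)[:num_tokens]
    values = np.concatenate(value_blocks, axis=0)[:num_tokens]
    # Scaled dot-product scores per head: shape (num_heads, num_tokens).
    scores = np.einsum("hd,thd->ht", q, keys) / np.sqrt(head_dim)
    # Numerically stable softmax over the cached tokens.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    # Weighted sum of cached values: shape (num_heads, head_dim).
    return np.einsum("ht,thd->hd", weights, values)

# Tiny usage example: block size 16, two blocks, 20 valid cached tokens (assumed numbers).
block_size, num_heads, head_dim, num_tokens = 16, 2, 8, 20
rng = np.random.default_rng(0)
key_blocks = [rng.standard_normal((block_size, num_heads, head_dim)) for _ in range(2)]
value_blocks = [rng.standard_normal((block_size, num_heads, head_dim)) for _ in range(2)]
q = rng.standard_normal((num_heads, head_dim))
out = single_query_cached_kv_attention(q, key_blocks, value_blocks, num_tokens)
print(out.shape)  # (2, 8)
```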