20231088/vllm
59 Commits

Author        SHA1        Message                                                           Date
Ricardo Lu    8c4b2592fb  fix: enable trust-remote-code in api server & benchmark. (#509)  2023-07-19 17:06:15 -07:00
WRH           cf21a9bd5c  support trust_remote_code in benchmark (#518)                     2023-07-19 17:02:40 -07:00
Woosuk Kwon   4338cc4750  [Tokenizer] Add an option to specify tokenizer (#284)             2023-06-28 09:46:58 -07:00
Woosuk Kwon   0b98ba15c7  Change the name to vLLM (#150)                                    2023-06-17 03:07:40 -07:00
Zhuohan Li    e5464ee484  Rename servers to engines (#152)                                  2023-06-17 17:25:21 +08:00
Woosuk Kwon   bab8f3dd0d  [Minor] Fix benchmark_throughput.py (#151)                        2023-06-16 21:00:52 -07:00
Woosuk Kwon   311490a720  Add script for benchmarking serving throughput (#145)            2023-06-14 19:55:38 -07:00
Woosuk Kwon   8274ca23ac  Add docstrings for LLM (#137)                                     2023-06-04 12:52:41 -07:00
Woosuk Kwon   211318d44a  Add throughput benchmarking script (#133)                         2023-05-28 03:20:05 -07:00