[Bugfix][Docs] Fix offline Whisper (#13274)
This commit is contained in:
parent c9f9d5b397
commit 579d7a63b2
@@ -938,6 +938,26 @@ The following table lists those that are tested in vLLM.
   * ✅︎
 :::
 
+#### Transcription (`--task transcription`)
+
+Speech2Text models trained specifically for Automatic Speech Recognition.
+
+:::{list-table}
+:widths: 25 25 25 5 5
+:header-rows: 1
+
+- * Architecture
+  * Models
+  * Example HF Models
+  * [LoRA](#lora-adapter)
+  * [PP](#distributed-serving)
+- * `Whisper`
+  * Whisper-based
+  * `openai/whisper-large-v3-turbo`
+  * 🚧
+  * 🚧
+:::
+
 _________________
 
 ## Model Support Policy
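For context on what the new docs entry refers to, here is a minimal offline transcription sketch using the documented `openai/whisper-large-v3-turbo` checkpoint. It assumes vLLM's bundled `AudioAsset` test helper (with its `mary_had_lamb` sample) and the standard Whisper decoder prompt `<|startoftranscript|>`; the engine arguments are illustrative, not prescriptive.

```python
# Minimal offline Whisper transcription sketch. Assumes vLLM's bundled
# AudioAsset helper ("mary_had_lamb" sample) and the standard Whisper
# decoder prompt; engine arguments here are illustrative only.
from vllm import LLM, SamplingParams
from vllm.assets.audio import AudioAsset

llm = LLM(
    model="openai/whisper-large-v3-turbo",
    max_model_len=448,                  # Whisper's decoder context is short
    limit_mm_per_prompt={"audio": 1},   # one audio clip per request
)

prompt = {
    "prompt": "<|startoftranscript|>",
    "multi_modal_data": {
        # (waveform, sample rate) tuple consumed by the Whisper encoder
        "audio": AudioAsset("mary_had_lamb").audio_and_sample_rate,
    },
}

outputs = llm.generate(prompt, SamplingParams(temperature=0, max_tokens=200))
print(outputs[0].outputs[0].text)
```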
@@ -421,7 +421,7 @@ class LLM:
             instead pass them via the ``inputs`` parameter.
         """
         runner_type = self.llm_engine.model_config.runner_type
-        if runner_type != "generate":
+        if runner_type not in ["generate", "transcription"]:
             messages = [
                 "LLM.generate() is only supported for (conditional) generation "
                 "models (XForCausalLM, XForConditionalGeneration).",
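The behavioral effect of the one-line change above, sketched in isolation: `LLM.generate()` no longer rejects models whose runner type is `"transcription"`, which is what previously broke offline Whisper. The names below are hypothetical; only the guard condition mirrors the diff.

```python
# Hypothetical standalone sketch of the relaxed guard; only the condition
# mirrors the actual diff. Before the fix, runner_type == "transcription"
# fell into the error branch and offline Whisper could not call generate().
def check_generate_allowed(runner_type: str) -> None:
    if runner_type not in ["generate", "transcription"]:
        raise TypeError(
            "LLM.generate() is only supported for (conditional) generation "
            "models (XForCausalLM, XForConditionalGeneration)."
        )

check_generate_allowed("transcription")  # now passes
check_generate_allowed("generate")       # still passes
```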