Commit ac6a4cf

fix: ensure enough tokens for structured output in vLLM test (#591) (#595)

1 parent: 0e56243

1 file changed, 1 addition, 0 deletions

test/backends/test_openai_vllm.py (1 addition, 0 deletions)

@@ -222,6 +222,7 @@ class Answer(pydantic.BaseModel):
         actions=[CBlock(value=prompt) for prompt in prompts],
         format=Answer,
         ctx=m_session.ctx,
+        model_options={ModelOption.MAX_NEW_TOKENS: 256},
     )

     assert len(results) == len(prompts)
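
The motivation behind the added `MAX_NEW_TOKENS` option can be sketched: when the generation budget is too small, the model's structured (JSON) output gets cut off mid-stream and fails to parse against the expected schema. The following is a minimal, hypothetical illustration using plain `json` rather than the test's actual pydantic `Answer` model; the strings and field names are invented for demonstration only:

```python
import json

# Hypothetical complete model output for a structured-output request.
full = '{"answer": "42", "reasoning": "six times seven"}'

# If the token budget runs out mid-generation, the output is truncated
# and is no longer valid JSON.
truncated = full[:20]  # '{"answer": "42", "re'

def try_parse(text):
    """Return the parsed object, or None if the JSON is malformed."""
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        return None

assert try_parse(full) is not None       # complete output parses
assert try_parse(truncated) is None      # truncated output does not
```

Raising the cap (here to 256 new tokens) gives the model enough room to emit the closing braces and quotes, so schema validation in the test can succeed.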
