Commit c7d2a55

[CI Failure] fix test_default_mm_loras (vllm-project#27795)
Signed-off-by: Huamin Li <[email protected]>
1 parent af826e0 commit c7d2a55

1 file changed (+2 −1 lines)

tests/lora/test_default_mm_loras.py

Lines changed: 2 additions & 1 deletion
@@ -30,7 +30,8 @@
     "enable_lora": "True",
     "max_num_seqs": 2,
     "max_lora_rank": 320,
-    "max_model_len": 12800,
+    # Keep these LoRA tests on short-RoPE for determinism post-LongRoPE change.
+    "max_model_len": 4096,
     "gpu_memory_utilization": 0.8,
     "limit_mm_per_prompt": {"audio": 1},
     "enforce_eager": True,
