fix: accept legacy --llm-base-url alias for LLM base URL #131
Conversation
Adding some more specific reviewer context to make it quick to judge whether this fix is necessary:

Why this is a real bug

Issue #57 used the publicly/intuitively named flag:

`--llm-base-url https://openrouter.ai/api/v1`

but Hercules only recognized:

`--llm-model-base-url`

Because the CLI uses `parse_known_args()`, the unknown flag was dropped silently. That means the configured base URL never reached the runtime config, so the request flow fell back to the default OpenAI path and surfaced a misleading OpenAI API-key error.

Why this fix is intentionally small

This PR does not change model/provider behavior.
Minimal validation

The added regression tests verify both:

So this should be a safe compatibility fix rather than a behavior change.
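The regression check described above can be sketched with plain `argparse`. This is an illustrative stand-in, not Hercules' actual test suite: the `build_parser` helper is hypothetical, while the flag names and `dest` mirror the PR description. Both option strings write to the same destination, so the canonical flag and the legacy alias should be indistinguishable at parse time.

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Hypothetical stand-in for Hercules' CLI parser setup."""
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--llm-model-base-url",  # canonical flag
        "--llm-base-url",        # legacy/public alias (this PR)
        dest="llm_model_base_url",
    )
    return parser


def test_canonical_flag():
    # The canonical flag populates the dest as before.
    args, _ = build_parser().parse_known_args(
        ["--llm-model-base-url", "https://openrouter.ai/api/v1"]
    )
    assert args.llm_model_base_url == "https://openrouter.ai/api/v1"


def test_legacy_alias():
    # The legacy alias now lands in the same dest instead of being ignored.
    args, _ = build_parser().parse_known_args(
        ["--llm-base-url", "https://openrouter.ai/api/v1"]
    )
    assert args.llm_model_base_url == "https://openrouter.ai/api/v1"
```

Since both spellings share one `dest`, no downstream code has to know which flag the user typed.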
Summary
Accept the legacy/publicly used CLI flag `--llm-base-url` as an alias of `--llm-model-base-url`.

Problem
Issue #57 used `--llm-base-url` with OpenRouter + `anthropic/claude-3-haiku`. Hercules only recognized `--llm-model-base-url`, and because argument parsing uses `parse_known_args()`, the unknown flag was silently ignored. That caused the configured base URL to be dropped, so the request path fell back to the default OpenAI flow and surfaced a misleading OpenAI API-key error.
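The silent-drop behavior is easy to reproduce with stock `argparse`. The parser below is a simplified stand-in for Hercules' CLI, not its real source; only the flag names come from the issue. `parse_known_args()` returns unrecognized arguments in a leftover list instead of raising an error, which is exactly how the base URL went missing.

```python
import argparse

# Simplified stand-in for the CLI: only the canonical flag is registered.
parser = argparse.ArgumentParser()
parser.add_argument("--llm-model-base-url", dest="llm_model_base_url")

# The user passes the intuitive (but unregistered) flag from issue #57.
args, unknown = parser.parse_known_args(
    ["--llm-base-url", "https://openrouter.ai/api/v1"]
)

print(args.llm_model_base_url)  # None -> the base URL never reaches config
print(unknown)                  # ['--llm-base-url', 'https://openrouter.ai/api/v1']
```

With `parse_args()` this would have been a hard error ("unrecognized arguments"); `parse_known_args()` trades that safety for extensibility, which is why the misconfiguration surfaced only later as an OpenAI API-key error.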
Fix
`--llm-base-url` as an alias mapped to `llm_model_base_url`.

Validation
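One plausible shape for the alias, sketched with plain `argparse` (an assumption about the implementation, not Hercules' actual source): `add_argument()` accepts several option strings in one call, all writing to the same `dest`, so the legacy flag becomes a zero-behavior-change alias of the canonical one.

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "--llm-model-base-url",  # canonical flag
    "--llm-base-url",        # legacy alias accepted by this PR
    dest="llm_model_base_url",
    help="Base URL for the LLM API (e.g. an OpenRouter endpoint)",
)

# The previously ignored flag now flows into the runtime config value.
args, _ = parser.parse_known_args(
    ["--llm-base-url", "https://openrouter.ai/api/v1"]
)
print(args.llm_model_base_url)  # https://openrouter.ai/api/v1
```

A nice side effect of this approach is that `--help` lists both spellings, documenting the alias for free.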
Closes #57