fix: accept legacy --llm-base-url alias for LLM base URL#131

Open
MackDing wants to merge 1 commit into
test-zeus-ai:mainfrom
MackDing:fix/issue-57-llm-base-url-alias

Conversation

@MackDing

@MackDing MackDing commented May 8, 2026

Summary

Accept the legacy/publicly used CLI flag --llm-base-url as an alias of --llm-model-base-url.

Problem

Issue #57 used --llm-base-url with OpenRouter + anthropic/claude-3-haiku.
Hercules only recognized --llm-model-base-url, and because argument parsing uses parse_known_args(), the unknown flag was silently ignored.

That caused the configured base URL to be dropped, so the request path fell back to the default OpenAI flow and surfaced a misleading OpenAI API-key error.
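The silent-ignore behavior described above can be reproduced with a minimal sketch (this is not Hercules' actual parser, just an illustration of how argparse's parse_known_args() swallows unrecognized flags):

```python
import argparse

# Sketch only: parse_known_args() returns (namespace, unknown) and never
# raises on unrecognized flags, so a legacy or mistyped flag is dropped
# without any error.
parser = argparse.ArgumentParser()
parser.add_argument("--llm-model-base-url", dest="llm_model_base_url")

args, unknown = parser.parse_known_args(
    ["--llm-base-url", "https://openrouter.ai/api/v1"]
)
print(args.llm_model_base_url)  # None -- the URL never reached the config
print(unknown)  # ['--llm-base-url', 'https://openrouter.ai/api/v1']
```

Because the namespace attribute stays None, downstream code sees no base URL configured and falls through to the default OpenAI path.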

Fix

  • add --llm-base-url as an alias mapped to llm_model_base_url
  • add regression tests for both the alias and the canonical flag
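A hypothetical sketch of the alias mapping (argparse accepts multiple option strings for one argument, all writing to the same dest; the real Hercules parser may differ):

```python
import argparse

# Register the legacy spelling as an alias of the canonical flag;
# both option strings write to the same destination attribute.
parser = argparse.ArgumentParser()
parser.add_argument(
    "--llm-model-base-url",  # canonical flag
    "--llm-base-url",        # legacy alias (this PR)
    dest="llm_model_base_url",
)

args, _ = parser.parse_known_args(
    ["--llm-base-url", "https://openrouter.ai/api/v1"]
)
print(args.llm_model_base_url)  # https://openrouter.ai/api/v1
```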

Validation

python3 -m pytest tests/test_llm_cli_aliases.py -q

Closes #57

@chatgpt-codex-connector

Codex usage limits have been reached for code reviews. Please check with the admins of this repo to increase the limits by adding credits.
Credits must be used to enable repository wide code reviews.

@MackDing
Author

MackDing commented May 8, 2026

Adding some more specific reviewer context to make it easier to judge whether this fix is necessary:

Why this is a real bug

Issue #57 used the publicly/intuitively named flag:

--llm-base-url https://openrouter.ai/api/v1

but Hercules only recognized:

--llm-model-base-url

Because the CLI uses parse_known_args(), the unknown --llm-base-url flag was silently ignored instead of failing fast.

That means the configured base URL never reached runtime config, so the request flow fell back to the default OpenAI path and surfaced a misleading OpenAI API-key error.

Why this fix is intentionally small

This PR does not change model/provider behavior.
It only restores compatibility for a legacy/publicly used CLI spelling by mapping:

  • --llm-base-url -> llm_model_base_url

Minimal validation

The added regression tests verify both:

  • the legacy alias --llm-base-url
  • the canonical flag --llm-model-base-url

So this should be a safe compatibility fix rather than a behavior change.
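The regression tests could take roughly this shape (illustrative only; the real tests live in tests/test_llm_cli_aliases.py and exercise the actual Hercules parser):

```python
import argparse

def build_parser():
    # Stand-in for the real CLI parser: canonical flag plus legacy alias,
    # both mapped to the same dest.
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--llm-model-base-url", "--llm-base-url",
        dest="llm_model_base_url",
    )
    return parser

def test_legacy_alias():
    args, _ = build_parser().parse_known_args(
        ["--llm-base-url", "https://openrouter.ai/api/v1"]
    )
    assert args.llm_model_base_url == "https://openrouter.ai/api/v1"

def test_canonical_flag():
    args, _ = build_parser().parse_known_args(
        ["--llm-model-base-url", "https://openrouter.ai/api/v1"]
    )
    assert args.llm_model_base_url == "https://openrouter.ai/api/v1"
```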



Development

Successfully merging this pull request may close these issues.

Misleading error while using anthropic/claude-3-haiku llm model
