Configurable max context #445
iuriguilherme
started this conversation in
Ideas
Replies: 1 comment
This also seems like a reasonable configuration request. The default auto-compaction threshold is fine for many cloud models, but once people start using alternative providers or locally hosted models, a fixed built-in threshold can become either too conservative or too risky. A setting like:

```json
{
  "autoCompactionThreshold": 80
}
```

would be easy to explain and useful for advanced users. The main thing I would want from an implementation is that the setting remains optional, with the current default behavior applying when it is unset.

So yes, I think this is a sensible idea, especially for non-default provider/model setups.
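For illustration, here is a minimal sketch of how a client could read such a setting with a safe fallback. This is a hypothetical implementation in Python, not Claude Code's actual internals: the `DEFAULT_THRESHOLD` value, the validation rules, and the function names are all assumptions; only the `autoCompactionThreshold` key comes from the proposal above.

```python
import json
from pathlib import Path

# Hypothetical built-in default, in percent of the context window.
DEFAULT_THRESHOLD = 80


def load_threshold(settings_path: Path) -> int:
    """Read autoCompactionThreshold from settings.json, falling back to the default.

    A missing file, malformed JSON, or an out-of-range value silently falls
    back rather than failing the session.
    """
    try:
        settings = json.loads(settings_path.read_text())
    except (FileNotFoundError, json.JSONDecodeError):
        return DEFAULT_THRESHOLD
    value = settings.get("autoCompactionThreshold", DEFAULT_THRESHOLD)
    if not isinstance(value, (int, float)) or not (0 < value <= 100):
        return DEFAULT_THRESHOLD
    return int(value)


def should_compact(tokens_used: int, context_window: int, threshold_pct: int) -> bool:
    """Trigger auto-compaction once usage reaches the configured percentage."""
    return tokens_used / context_window * 100 >= threshold_pct
```

The key design point is the fallback behavior: an unset or invalid value reverts to the current default, so existing configurations keep working unchanged.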
claude-code issue: anthropics/claude-code#34925
Original text for reference:
Allow users to configure the context window percentage at which auto-compaction triggers, via a setting in settings.json (user or project level).
If not specified, the current default behavior applies.
Especially when using alternative models managed by other tools such as Ollama and LMStudio, the default maximum context may not be the optimal threshold for automatic compaction. Currently it is only possible to configure this via the API for the current session, not in the settings.json file.