# fix: OAuth logout cache invalidation and ephemeral preservation (#975, #974) #976

## Conversation
This commit fixes two related issues with authentication and provider switching:

**Issue #975: OAuth logout does not invalidate cached tokens**

- Added an `invalidateCache()` method to `AuthPrecedenceResolver` that flushes all known runtime scopes (legacy-singleton, provider-manager-singleton, and the current runtime context)
- Updated `BaseProvider.clearAuthCache()` to call the resolver's `invalidateCache()` instead of being a no-op
- Enhanced `OAuthManager.clearProviderAuthCaches()` to flush all known runtime scopes, not just the current runtime context

**Issue #974: Provider switching improperly clears context**

- Added a `DEFAULT_PRESERVE_EPHEMERALS` constant with context-related settings (`context-limit`, `max_tokens`, `streaming`)
- Modified `switchActiveProvider()` to merge default preserved ephemerals with any caller-specified ones, ensuring user preferences persist across provider switches

Both fixes include comprehensive test coverage to prevent regressions.

closes #975
closes #974
**Walkthrough**

Adds multi-scope OAuth cache invalidation on logout and preserves a default set of ephemeral runtime settings across provider switches to avoid stale tokens and unintended context clearing.

**Sequence diagram:** (collapsed mermaid diagram of the logout flow, beginning `CLI ->> OAuthMgr: auth logout()`)

**Estimated code review effort:** 🎯 3 (Moderate) | ⏱️ ~22 minutes
✅ Pre-merge checks: 4 passed.
This PR addresses two important authentication and provider switching issues. Here's my analysis:

**Issue #975: OAuth logout cache invalidation**

**Problem:** OAuth logout wasn't properly invalidating cached tokens, causing stale credentials to persist until you switched away from the provider and back.

**Root Cause:** Multiple caching layers (legacy-singleton, provider-manager-singleton, and runtime-specific scopes) existed, but only the current runtime's cache was flushed during logout.

**Solution:**
**Key Implementation:**

```typescript
// precedence.ts - added invalidateCache method
invalidateCache(): void {
  // Flush all known runtime scopes
  this.flushKnownRuntimeScopes();
}
```

```typescript
// BaseProvider.ts - updated clearAuthCache
clearAuthCache(): void {
  this.authResolver.invalidateCache?.();
}
```

```typescript
// oauth-manager.ts - enhanced cache clearing
const runtimeId = runtimeContext?.runtimeId;
if (runtimeId) {
  flushRuntimeAuthScope(runtimeId);
}
```

**Issue #974: Provider switching context preservation**

**Problem:** When switching providers, context-related ephemeral settings like `context-limit`, `max_tokens`, and `streaming` were being cleared.

**Root Cause:** `switchActiveProvider()` cleared all ephemeral settings except `activeProvider`, without considering which settings should persist across switches.

**Solution:**
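The multi-scope flush described above can be sketched in isolation. This is an illustrative model, not the actual llxprt API: `KNOWN_SCOPES`, the `Map`-backed cache, and `invalidateAll` are hypothetical stand-ins for the real runtime-scope registry.

```typescript
// Illustrative sketch: tokens are cached per runtime scope, so logout must
// flush every known scope, not only the currently active one.
type Token = { value: string };

const KNOWN_SCOPES = ['legacy-singleton', 'provider-manager-singleton'];

class ScopedTokenCache {
  private caches = new Map<string, Map<string, Token>>();

  set(scope: string, provider: string, token: Token): void {
    if (!this.caches.has(scope)) this.caches.set(scope, new Map());
    this.caches.get(scope)!.set(provider, token);
  }

  get(scope: string, provider: string): Token | undefined {
    return this.caches.get(scope)?.get(provider);
  }

  // Flush every known scope plus the current runtime's own scope.
  invalidateAll(currentRuntimeId: string): void {
    for (const scope of [...KNOWN_SCOPES, currentRuntimeId]) {
      this.caches.delete(scope);
    }
  }
}
```

The bug pattern this guards against is flushing only `currentRuntimeId`: a stale token left in `legacy-singleton` would then keep winning precedence after logout.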
**Key Implementation:**

```typescript
// runtimeSettings.ts - added preserved ephemerals
const DEFAULT_PRESERVE_EPHEMERALS = [
  'context-limit',
  'max_tokens',
  'streaming',
];

export async function switchActiveProvider(
  providerName: string,
  options: { preserveEphemerals?: string[] } = {},
) {
  const preserveEphemerals = [
    ...DEFAULT_PRESERVE_EPHEMERALS,
    ...(options.preserveEphemerals ?? []),
  ];
  // ... rest of function
}
```

**Test Coverage**

The PR includes comprehensive test coverage:
The tests verify that `context-limit`, `max_tokens`, and `streaming` are preserved during provider switches, and that other ephemeral settings (like `auth-key` and `temperature`) are still cleared appropriately.

**Assessment**

The fixes are well-designed and address the root causes. The implementation follows existing code patterns and includes proper error handling. The test coverage appears comprehensive for the documented scenarios. Overall, these are solid fixes that resolve the reported issues without introducing significant complexity or breaking changes.
Actionable comments posted: 2
🧹 Nitpick comments (2)
packages/cli/src/runtime/runtimeSettings.ts (2)
**1464-1476:** Consider using a const assertion for better type safety.

The constant is well-documented and addresses Issue #974 correctly. However, a const assertion would provide compile-time immutability and better type inference.
🔎 Suggested improvement
```diff
-const DEFAULT_PRESERVE_EPHEMERALS = [
+const DEFAULT_PRESERVE_EPHEMERALS = [
   'context-limit',
   'max_tokens',
   'streaming',
-];
+] as const;
```

This prevents accidental mutations and provides stricter type checking while keeping the same runtime behavior.
**1490-1494:** Consider deduplicating the preserved ephemerals array.

The merge logic correctly implements the requirement to preserve default context-related settings. The current implementation allows potential duplicates if a caller specifies keys that are already in `DEFAULT_PRESERVE_EPHEMERALS`, though this doesn't cause functional issues.

🔎 Optional improvement to prevent duplicates
```diff
-  // Merge default preserved ephemerals with any caller-specified ones
-  const preserveEphemerals = [
-    ...DEFAULT_PRESERVE_EPHEMERALS,
-    ...(options.preserveEphemerals ?? []),
-  ];
+  // Merge default preserved ephemerals with any caller-specified ones (deduplicated)
+  const preserveEphemerals = [
+    ...new Set([
+      ...DEFAULT_PRESERVE_EPHEMERALS,
+      ...(options.preserveEphemerals ?? []),
+    ]),
+  ];
```

This ensures each key appears only once, making the code more robust against future changes.
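The `Set`-based merge suggested above can be exercised on its own; the helper name `mergePreserved` and the caller keys are illustrative:

```typescript
// Spreading a Set deduplicates while preserving first-occurrence order.
const DEFAULT_PRESERVE_EPHEMERALS = ['context-limit', 'max_tokens', 'streaming'];

function mergePreserved(callerKeys?: string[]): string[] {
  return [...new Set([...DEFAULT_PRESERVE_EPHEMERALS, ...(callerKeys ?? [])])];
}

console.log(mergePreserved(['max_tokens', 'base-url']));
// → ['context-limit', 'max_tokens', 'streaming', 'base-url']
```

Note that a duplicate caller key (`max_tokens` here) is dropped rather than repeated, so downstream loops over the list never process a key twice.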
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (6)
- `packages/cli/src/auth/oauth-manager.ts`
- `packages/cli/src/runtime/__tests__/provider-context-preservation.spec.ts`
- `packages/cli/src/runtime/runtimeSettings.ts`
- `packages/core/src/auth/oauth-logout-cache-invalidation.spec.ts`
- `packages/core/src/auth/precedence.ts`
- `packages/core/src/providers/BaseProvider.ts`
🧰 Additional context used
📓 Path-based instructions (1)
**/*.{ts,tsx}
📄 CodeRabbit inference engine (CLAUDE.md)
`**/*.{ts,tsx}`:
- Don't use `any` - always specify proper types. Use `unknown` if the type is truly unknown and add proper type guards.
- Do not use `console.log` or `console.debug` - use the sophisticated logging system instead. Log files are written to `~/.llxprt/debug/`
- Fix all linting errors, including warnings about `any` types
Files:
- `packages/cli/src/runtime/__tests__/provider-context-preservation.spec.ts`
- `packages/core/src/providers/BaseProvider.ts`
- `packages/core/src/auth/precedence.ts`
- `packages/core/src/auth/oauth-logout-cache-invalidation.spec.ts`
- `packages/cli/src/auth/oauth-manager.ts`
- `packages/cli/src/runtime/runtimeSettings.ts`
🧠 Learnings (1)
📚 Learning: 2025-12-18T14:06:22.557Z
Learnt from: acoliver
Repo: vybestack/llxprt-code PR: 0
File: :0-0
Timestamp: 2025-12-18T14:06:22.557Z
Learning: OpenAIResponsesProvider (packages/core/src/providers/openai-responses/OpenAIResponsesProvider.ts) currently bypasses the ephemeral truncation system by using direct `JSON.stringify(toolResponseBlock.result)` and needs to be updated to support ephemeral settings like the other providers.
Applied to files:
packages/cli/src/runtime/runtimeSettings.ts
🧬 Code graph analysis (1)
packages/core/src/auth/precedence.ts (1)
packages/core/src/index.ts (2)
- `getActiveProviderRuntimeContext` (266-266)
- `flushRuntimeAuthScope` (82-82)
⏰ Context from checks skipped due to timeout of 270000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (7)
- GitHub Check: Test (ubuntu-latest, 24.x)
- GitHub Check: Test (windows-latest, 24.x)
- GitHub Check: Test (macos-latest, 24.x)
- GitHub Check: E2E Test (Linux) - sandbox:none
- GitHub Check: E2E Test (Linux) - sandbox:docker
- GitHub Check: Slow E2E - Win
- GitHub Check: E2E Test (macOS)
🔇 Additional comments (3)
packages/core/src/providers/BaseProvider.ts (1)
**478-493:** LGTM! Well-implemented cache invalidation.

The updated `clearAuthCache()` method properly invokes `authResolver.invalidateCache()` with appropriate defensive checks. The documentation clearly describes the behavior change and references the relevant plan/requirement.

packages/cli/src/auth/oauth-manager.ts (1)
**1386-1414:** LGTM! Robust multi-scope cache invalidation.

The enhanced `clearProviderAuthCaches()` method properly:
- Flushes multiple known runtime scopes (legacy-singleton, provider-manager-singleton, and current context)
- Uses appropriate error handling for both context retrieval and individual scope flushing
- Logs debug information without interrupting the logout flow
- Aligns with the cache invalidation approach in the core precedence resolver
packages/cli/src/runtime/__tests__/provider-context-preservation.spec.ts (1)
1-106: LGTM! Well-structured tests for context preservation.The test suite clearly documents and verifies the expected behavior for Issue #974:
- Defines the list of context-related settings that should be preserved
- Verifies that context-limit, max_tokens, and streaming are preserved during provider switches
- Ensures that other ephemeral settings (like auth-key and temperature) are still cleared appropriately
The tests provide good documentation of the fix and will help prevent regressions.
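A minimal sketch of what such a preservation test checks. The ephemeral store, the `switchActiveProvider` stub, and the setting values here are illustrative stand-ins for the actual spec file, not the real implementation:

```typescript
// Model of provider switching: all ephemerals are cleared except an
// allow-list of preserved, context-related keys.
const DEFAULT_PRESERVE_EPHEMERALS = ['context-limit', 'max_tokens', 'streaming'];

type Ephemerals = Record<string, unknown>;

function switchActiveProvider(ephemerals: Ephemerals, provider: string): Ephemerals {
  const next: Ephemerals = { activeProvider: provider };
  for (const key of DEFAULT_PRESERVE_EPHEMERALS) {
    if (key in ephemerals) next[key] = ephemerals[key];
  }
  return next;
}

const before: Ephemerals = {
  activeProvider: 'openai',
  'context-limit': 200_000,
  max_tokens: 4096,
  streaming: true,
  'auth-key': 'sk-example', // hypothetical placeholder value
  temperature: 0.7,
};
const after = switchActiveProvider(before, 'anthropic');
// context-sizing settings survive; auth-key and temperature are cleared
```

The assertions mirror the two halves of the described spec: preserved keys survive the switch, and everything outside the allow-list is gone.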
**Code Coverage Summary**

Full text reports for the CLI and Core packages are attached; for detailed HTML reports, please see the 'coverage-reports-24.x-ubuntu-latest' artifact from the main CI run.
@CodeRabbit I don't understand how this means the context (aka HistoryService) won't be cleared. I'm not clear why we would preserve the ephemerals? That seems counterproductive and irrelevant to the actual issue: switching means the History is basically cleared (aka the context). Or am I missing something?
@acoliver Great question! I can see the confusion here. Let me clarify what's actually happening:

**The Real Problem**

The issue isn't that the conversation history is literally deleted. Instead, here's what's happening: clearing the context-sizing ephemerals on switch causes the history to be cut down on the next request.
**Why This Fix Works**

By preserving `context-limit`, `max_tokens`, and `streaming` across provider switches, the settings that govern how much history is kept are no longer reset.

**From CodeRabbit's Analysis**

In the issue comments, CodeRabbit explained it well:

So the fix is correct - it preserves the context-sizing settings that prevent history truncation during provider switches. The conversation history itself stays intact; we're just preventing the settings that control its size from being lost. Does this clarify the mechanism? The terminology "context" is admittedly confusing since it refers to both the conversation and the settings that control how much of it is kept!
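A toy illustration of that mechanism. All numbers here are made up: `DEFAULT_CONTEXT_LIMIT`, the token counts, and the oldest-first trimming policy are hypothetical, chosen only to show how losing `context-limit` changes behavior.

```typescript
// If the ephemeral 'context-limit' is lost on switch, the runtime falls back
// to a default, and the next request trims history to fit the smaller limit.
const DEFAULT_CONTEXT_LIMIT = 8_000; // hypothetical fallback

function effectiveLimit(ephemerals: Record<string, number>): number {
  return ephemerals['context-limit'] ?? DEFAULT_CONTEXT_LIMIT;
}

// Drop oldest messages until the total token count fits the limit.
function trimHistory(tokensPerMessage: number[], limit: number): number[] {
  const kept = [...tokensPerMessage];
  let total = kept.reduce((a, b) => a + b, 0);
  while (kept.length > 0 && total > limit) {
    total -= kept.shift()!;
  }
  return kept;
}

const history = [3_000, 3_000, 3_000, 3_000]; // 12k tokens of conversation

// With the user's limit preserved, nothing is trimmed:
const keptWithLimit = trimHistory(history, effectiveLimit({ 'context-limit': 200_000 }));

// With the limit cleared, the fallback forces heavy trimming:
const keptWithFallback = trimHistory(history, effectiveLimit({}));
```

With the user's 200k limit the full four messages survive; with the 8k fallback only the two most recent fit, which reads to the user like the conversation was wiped.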
@CodeRabbit but it isn't acting like it was truncated. The model acts like it is clear. It does this even if the context wasn't full and the size is close.
@acoliver I think I've been misunderstanding the architecture. Let me clarify what I've discovered:

**Key Finding: Providers Are Stateless**

Looking at the code, providers don't store conversation history. The `GenerateChatOptions` interface shows the history is passed in on every call:

```typescript
export interface GenerateChatOptions {
  contents: IContent[]; // <-- Conversation history passed in
  tools?: ProviderToolset;
  // ...
}
```

This means the caller (CLI/UI layer) is responsible for maintaining and passing the conversation history to providers.

**Where This Gets Confusing**

When you reported Issue #974:
This PR's fix preserves these settings: `context-limit`, `max_tokens`, and `streaming`.

But you're saying the actual conversation messages appear to be cleared, not just truncated.

**The Question**

Where does the CLI actually store the conversation history (the `IContent[]` array)? If the conversation history is stored per-provider and gets cleared when switching away, then preserving settings wouldn't fix the problem - the actual messages would still be lost.

Can you clarify what you actually observed in your testing after applying this fix? Does the conversation history truly persist across Anthropic → OpenAI → Anthropic switches now, or are we fixing a different symptom?
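The stateless-provider pattern described above can be sketched as follows. The `IContent` shape, the `Provider` interface, and the `Session` class are simplified stand-ins for the real llxprt types, shown only to illustrate who owns the history:

```typescript
// Providers receive the full history on every call and keep no state of
// their own; the session (caller) owns the IContent[] array.
interface IContent {
  speaker: 'human' | 'ai';
  text: string;
}

interface Provider {
  name: string;
  generate(contents: IContent[]): string;
}

// A stub provider that just reports how much history it was handed.
const echoProvider = (name: string): Provider => ({
  name,
  generate: (contents) => `${name} saw ${contents.length} messages`,
});

class Session {
  history: IContent[] = [];
  constructor(public provider: Provider) {}

  send(text: string): string {
    this.history.push({ speaker: 'human', text });
    const reply = this.provider.generate(this.history);
    this.history.push({ speaker: 'ai', text: reply });
    return reply;
  }

  // Switching providers must not touch this.history.
  switchProvider(next: Provider): void {
    this.provider = next;
  }
}
```

Under this ownership model, a provider switch only swaps the `provider` reference; the next `send` hands the new provider the same accumulated `history` array, so nothing about switching itself should lose messages.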
- Update test comments in `oauth-logout-cache-invalidation.spec.ts` to reflect that `invalidateCache()` now exists and works
- Replace `console.debug` with `DebugLogger` in `precedence.ts` to follow project logging conventions
Summary
This PR fixes two related issues with authentication state management and provider switching:
Issue #975: OAuth logout does not invalidate cached tokens
Problem: When using OAuth authentication (e.g., Anthropic OAuth), logging out and logging back in with a different account did not take effect until switching away from the provider and back. The old credentials continued to be used for API calls due to multiple caching layers that weren't properly invalidated.
Root Cause: The auth system has runtime-scoped state caching (legacy-singleton, provider-manager-singleton, and runtime-specific IDs). On logout, only the current runtime's cache was flushed, leaving stale tokens in other scopes.
Solution:
- Added `invalidateCache()` method to `AuthPrecedenceResolver` that flushes all known runtime scopes
- Updated `BaseProvider.clearAuthCache()` to call the resolver's `invalidateCache()` (was previously a no-op)
- Enhanced `OAuthManager.clearProviderAuthCaches()` to flush all known runtime scopes

Issue #974: Provider switching improperly clears context
Problem: When switching providers (e.g., OpenAI → Anthropic → OpenAI), context-related ephemeral settings like `context-limit`, `max_tokens`, and `streaming` were being cleared, causing unexpected behavior.

Root Cause: `switchActiveProvider()` cleared ALL ephemeral settings except `activeProvider`, without considering which settings should persist across switches.

Solution:
- Added `DEFAULT_PRESERVE_EPHEMERALS` constant with context-related settings that should persist
- Modified `switchActiveProvider()` to merge default preserved ephemerals with any caller-specified ones

Changes
`packages/core/src/auth/precedence.ts`
- Added `invalidateCache()` method to `AuthPrecedenceResolver`

`packages/core/src/providers/BaseProvider.ts`
- Updated `clearAuthCache()` to call resolver's `invalidateCache()`

`packages/cli/src/auth/oauth-manager.ts`
- Enhanced `clearProviderAuthCaches()` to flush all known runtime scopes

`packages/cli/src/runtime/runtimeSettings.ts`
- Added `DEFAULT_PRESERVE_EPHEMERALS` constant
- Modified `switchActiveProvider()` to preserve context-related settings

Test Coverage
- `packages/core/src/auth/oauth-logout-cache-invalidation.spec.ts` - Tests for Issue #975 (Anthropic OAuth logout does not invalidate in-memory cached token until provider switch)
- `packages/cli/src/runtime/__tests__/provider-context-preservation.spec.ts` - Tests for Issue #974 (Provider switching from OpenAI back to Anthropic improperly clears context)

Testing
```shell
node scripts/start.js --profile-load synthetic "write me a haiku"
```
closes #975
closes #974
Summary by CodeRabbit

- Bug Fixes: OAuth logout now invalidates cached tokens across all runtime scopes, and provider switching preserves context-related ephemeral settings.
- Tests: Added regression tests for logout cache invalidation and provider context preservation.