
fix(/context): gracefully handle huge context files + ux #1331


Open · wants to merge 1 commit into main

Conversation

GoodluckH (Collaborator)

#1254

demo-context-file-exceeded-limit.mp4

@GoodluckH GoodluckH self-assigned this Apr 23, 2025
@GoodluckH GoodluckH requested a review from a team as a code owner April 23, 2025 20:41

codecov-commenter commented Apr 23, 2025

Codecov Report

Attention: Patch coverage is 45.63107% with 56 lines in your changes missing coverage. Please review.

Project coverage is 17.93%. Comparing base (1e05983) to head (c2e6d65).

Files with missing lines                 | Patch %  | Lines
crates/q_chat/src/lib.rs                 | 0.00%    | 38 Missing ⚠️
crates/q_chat/src/conversation_state.rs  | 47.05%   | 18 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1331      +/-   ##
==========================================
+ Coverage   17.92%   17.93%   +0.01%     
==========================================
  Files        1844     1844              
  Lines      164489   164580      +91     
  Branches   144685   144776      +91     
==========================================
+ Hits        29477    29520      +43     
- Misses     133562   133609      +47     
- Partials     1450     1451       +1     

☔ View full report in Codecov by Sentry.


@GoodluckH GoodluckH force-pushed the 1254-bug-amazon-q-cli-fails-for-all-model-calls-when-a-really-large-file-is-added-to-context branch 4 times, most recently from 9c96cc4 to a49095c Compare April 24, 2025 17:10
.ok();
}
for (filename, _) in dropped_files {
files.retain(|(f, _)| f != &filename);
Contributor


Is this correct? This looks like every context file will end up being dropped.

Collaborator Author


How? We only keep the context files that are not in dropped_files, so we are basically storing the leftover files.

Contributor


I just verified, and it looks like it does drop everything. If that's not correct, then we should at least have a test case that asserts this behavior is correct.
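To pin down the disagreement, the retain loop can be exercised in isolation. The sketch below is a hypothetical reduction (the `(String, String)` tuple shape and the `remove_dropped` helper are simplified stand-ins for the crate's real types): each iteration removes only the entries whose filename matches a dropped file, so entries absent from `dropped_files` survive. A test like this is what the reviewer is asking for.

```rust
// Minimal reduction of the logic under review (simplified types; the real
// code uses the crate's own context-file representation). For each dropped
// file, retain only the entries whose filename does NOT match, so files
// absent from `dropped_files` are kept.
fn remove_dropped(files: &mut Vec<(String, String)>, dropped_files: &[(String, String)]) {
    for (filename, _) in dropped_files {
        files.retain(|(f, _)| f != filename);
    }
}

fn main() {
    let mut files = vec![
        ("small.md".to_string(), "ok".to_string()),
        ("huge.md".to_string(), "x".repeat(10_000)),
    ];
    let dropped = vec![("huge.md".to_string(), String::new())];
    remove_dropped(&mut files, &dropped);
    // Only the file listed in `dropped` is removed; the rest survive.
    assert_eq!(files.len(), 1);
    assert_eq!(files[0].0, "small.md");
}
```

If this test fails against the actual implementation, the bug is elsewhere (e.g. in how `dropped_files` is populated), not in the `retain` itself.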

.await
.into_fig_conversation_state()
.expect("unable to construct conversation state")
}

/// Returns a conversation state representation which reflects the exact conversation to send
/// back to the model.
pub async fn backend_conversation_state(&mut self, run_hooks: bool, quiet: bool) -> BackendConversationState<'_> {
pub async fn backend_conversation_state(
Contributor


I'd like to avoid adding another parameter show_dropped_context_files_warning to this function if we can; instead, include it as part of BackendConversationState and let the consumer print the message (e.g. in /usage and as_sendable_conversation_state).
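A minimal sketch of that suggestion, under stated assumptions (the `dropped_context_files` field and `warning` method are hypothetical names, not the crate's actual API): the state carries the dropped-file list, and each consumer decides whether and how to print the warning, instead of a boolean flag threaded through the constructor.

```rust
// Hypothetical sketch: instead of a `show_dropped_context_files_warning`
// parameter, the state itself records which context files were dropped.
struct BackendConversationState {
    dropped_context_files: Vec<String>, // filenames dropped for exceeding the size limit
}

impl BackendConversationState {
    // Consumers (e.g. /usage) call this and print the result if present.
    fn warning(&self) -> Option<String> {
        if self.dropped_context_files.is_empty() {
            None
        } else {
            Some(format!(
                "warning: {} context file(s) dropped for exceeding the size limit",
                self.dropped_context_files.len()
            ))
        }
    }
}

fn main() {
    let state = BackendConversationState {
        dropped_context_files: vec!["huge.md".to_string()],
    };
    if let Some(msg) = state.warning() {
        println!("{}", msg);
    }
}
```

This keeps the function signature stable and lets each call site choose its own presentation of the warning.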

.collect();

let dropped_files =
drop_matched_context_files(&mut combined_files, CONTEXT_FILES_MAX_SIZE).ok();
Contributor


I don't really like having to duplicate drop_matched_context_files here and in context_messages. We should have a single method on the context manager that collects context files, with ContextManager instantiated with a configurable max value (CONTEXT_FILES_MAX_SIZE by default). That way we can at least write some reasonable test cases against it.
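A hedged sketch of the suggested consolidation (everything here is hypothetical: the method name, the drop-largest-first policy, and the limit value — the real `CONTEXT_FILES_MAX_SIZE` lives in the crate): `ContextManager` owns a configurable byte limit, and one method splits files into kept and dropped, so both call sites share the logic and it is trivially unit-testable.

```rust
// Assumed placeholder value; the crate defines the real constant.
const CONTEXT_FILES_MAX_SIZE: usize = 150_000;

struct ContextManager {
    max_size: usize,
}

impl ContextManager {
    fn new(max_size: usize) -> Self {
        Self { max_size }
    }

    /// Splits `files` into (kept, dropped): the largest files are dropped
    /// first until the combined size of the remainder fits under `max_size`.
    fn collect_context_files(
        &self,
        mut files: Vec<(String, String)>,
    ) -> (Vec<(String, String)>, Vec<(String, String)>) {
        let mut dropped = Vec::new();
        while files.iter().map(|(_, c)| c.len()).sum::<usize>() > self.max_size {
            // Find and remove the single largest file.
            let (i, _) = files
                .iter()
                .enumerate()
                .max_by_key(|(_, (_, c))| c.len())
                .expect("files is non-empty while total size exceeds the limit");
            dropped.push(files.remove(i));
        }
        (files, dropped)
    }
}

fn main() {
    let mgr = ContextManager::new(10); // tiny limit for the demo
    let files = vec![
        ("a.md".to_string(), "12345".to_string()), // 5 bytes
        ("big.md".to_string(), "x".repeat(20)),    // 20 bytes
    ];
    let (kept, dropped) = mgr.collect_context_files(files);
    assert_eq!(kept.len(), 1);
    assert_eq!(dropped[0].0, "big.md");
}
```

With the limit injected at construction (`ContextManager::new(CONTEXT_FILES_MAX_SIZE)` in production), tests can pass a small limit and assert exactly which files get dropped.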

@GoodluckH GoodluckH force-pushed the 1254-bug-amazon-q-cli-fails-for-all-model-calls-when-a-really-large-file-is-added-to-context branch 3 times, most recently from 51670be to caa85e5 Compare April 29, 2025 21:42
@GoodluckH GoodluckH force-pushed the 1254-bug-amazon-q-cli-fails-for-all-model-calls-when-a-really-large-file-is-added-to-context branch from caa85e5 to c2e6d65 Compare April 29, 2025 21:58
Development

Successfully merging this pull request may close these issues.

bug: Amazon Q CLI fails for all model calls when a really large file is added to context
3 participants