
fix: logging error with more than 1k characters #970

Open
wants to merge 1 commit into main

Conversation


@urbanonymous urbanonymous commented Mar 3, 2025

Fix logging error with more than 1k characters:

  • In this PR, the 1000-character limit on logged responses is raised.

Description:

  • Extract the magic number into a constant
  • Set the constant to 100,000 (100k)

Motivation:

  • Numerous SyntaxError logs in the console

Related Issues:

The simplest and most effective fix is to set MAX_RESPONSE_LENGTH to a higher, practical value that accommodates typical LLM outputs while keeping log messages manageable for broadcasting to clients via WebSockets. However, we must also modify the code to prevent parsing errors when truncation occurs. Here’s why and how:

  • Why Not Remove the Check? While removing the length check would log full responses, it risks sending extremely large messages (e.g., megabytes) to clients, potentially causing performance issues in WebSocket communication or client handling. A reasonable limit provides balance.

  • Why a Higher Limit? LLM outputs can easily exceed 1000 characters. For example, a 4096-token response (a common max for models like GPT-3) might translate to ~16,000 characters or more in JSON form, including metadata. A higher limit ensures most responses are logged fully.

  • Why Fix Truncation? Even with a higher limit, responses exceeding it must be handled gracefully to avoid invalid JSON errors.

Setting MAX_RESPONSE_LENGTH to 100,000 characters (~100KB):

  • Covers Most Cases: This captures full responses for most LLM outputs, including lengthy ones up to tens of thousands of characters, while accounting for JSON overhead (e.g., keys, quotes).

  • WebSocket Friendly: Modern WebSocket implementations, including in Cloudflare Workers (where your app runs with workerd), support messages up to 1MB. At ~100KB, this is well within limits and safe for broadcasting.

  • Practical for Logging: It balances completeness with preventing excessively large log entries.
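To illustrate the parsing problem motivating this PR, here is a minimal, hypothetical sketch (not the actual hooks code; the `MAX_RESPONSE_LENGTH` constant and the response shape are assumptions): calling `JSON.parse` on a truncated serialized response throws, because the cut-off substring is no longer valid JSON.

```javascript
// Hypothetical sketch: truncating a serialized JSON response and then
// calling JSON.parse on the fragment throws a SyntaxError.
const MAX_RESPONSE_LENGTH = 100_000; // raised from the old 1000-char magic number

// A response large enough to exceed the limit once serialized.
const responseString = JSON.stringify({ choices: [{ text: 'x'.repeat(200_000) }] });

let truncationFailed = false;
try {
  // The substring ends mid-string, so this is not valid JSON.
  JSON.parse(responseString.substring(0, MAX_RESPONSE_LENGTH) + '...');
} catch (e) {
  truncationFailed = true; // SyntaxError
}
console.log(truncationFailed); // true
```

This is the same class of SyntaxError the Motivation section describes; a higher limit makes it rarer, but responses over the limit still need a parse-free fallback.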

@urbanonymous urbanonymous changed the title Increase MAX_RESPONSE_LENGTH to 100k fix: log issue with more than 1k characters Mar 3, 2025
@urbanonymous urbanonymous changed the title fix: log issue with more than 1k characters fix: logging error with more than 1k characters Mar 3, 2025
    requestOptionsArray[0].response = responseString.length > MAX_RESPONSE_LENGTH
      ? JSON.parse(responseString.substring(0, MAX_RESPONSE_LENGTH) + '...')
      : response;
Collaborator

I think we can just return the substring instead of trying to deserialize the string
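A sketch of that suggestion, assuming the names from the diff above (`MAX_RESPONSE_LENGTH`, `responseString`; the `truncateResponse` helper is hypothetical): when the serialized response exceeds the limit, store the raw substring rather than a re-parsed object.

```javascript
const MAX_RESPONSE_LENGTH = 100_000; // assumed constant from this PR

// Hypothetical helper illustrating the reviewer's suggestion: return the
// truncated string itself instead of JSON.parse-ing the fragment.
function truncateResponse(response) {
  const responseString = JSON.stringify(response);
  return responseString.length > MAX_RESPONSE_LENGTH
    ? responseString.substring(0, MAX_RESPONSE_LENGTH) + '...'
    : response;
}

// Small responses pass through untouched; oversized ones become a string.
const small = truncateResponse({ ok: true });
const large = truncateResponse({ text: 'x'.repeat(150_000) });
console.log(typeof small, typeof large); // object string
```

The trade-off is that truncated log entries hold a plain string rather than a structured object, but no parse error can occur.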

@narengogi narengogi left a comment


lgtm
