
Features/add stream chat #354


Open
wants to merge 17 commits into base: main

Conversation

@iceljc (Collaborator) commented Jun 27, 2025

PR Type

Enhancement


Description

  • Add streaming chat message support with real-time display

  • Implement message queue handling for smooth text streaming

  • Add new SignalR events for stream message lifecycle (a sketch of the event wiring follows this list)

  • Update UI to handle streaming states and display
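
As context for the SignalR item above, the sketch below shows how stream-lifecycle events are typically registered on the client with @microsoft/signalr. It is for illustration only: the event names (OnStreamMessageBegin/Chunk/End), the hub URL, and the handler bodies are assumptions, not the actual additions in signalr-service.js.

    // Illustrative only: event names, hub URL, and handler bodies are assumptions.
    import * as signalR from '@microsoft/signalr';

    const connection = new signalR.HubConnectionBuilder()
        .withUrl('/chatHub') // hypothetical hub endpoint
        .withAutomaticReconnect()
        .build();

    // One handler per stage of the stream message lifecycle.
    connection.on('OnStreamMessageBegin', (message) => {
        // e.g. append an empty assistant message flagged with is_streaming
    });
    connection.on('OnStreamMessageChunk', (message) => {
        // e.g. push the partial text onto the message queue for animated display
    });
    connection.on('OnStreamMessageEnd', (message) => {
        // e.g. clear the is_streaming flag and finalize the message
    });

    connection.start();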


Changes walkthrough 📝

Relevant files

Configuration changes (2 files)
  • constants.js: Add System role to BOT_SENDERS (+2/-1)
  • .env: Add stream chat feature flag (+1/-0)

Enhancement (6 files)
  • conversationTypes.js: Add is_streaming property to message type (+1/-0)
  • common.js: Add delay utility function and refactor classnames (+13/-1) (a sketch of the delay helper follows this list)
  • conversation-service.js: Convert async functions to Promise-based implementations (+28/-12)
  • signalr-service.js: Add stream message SignalR event handlers (+25/-0)
  • chat-box.svelte: Implement streaming chat with message queue handling (+141/-73)
  • rc-message.svelte: Add conditional rendering for empty messages (+11/-7)

Formatting (1 file)
  • api-endpoints.js: Minor formatting adjustment to endpoints (+1/-0)

Bug fix (1 file)
  • conv-dialogs.svelte: Update sender role checking logic (+2/-2)
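
For reference, a delay helper of the kind added to common.js is usually just a promisified setTimeout. The one-liner below is a hedged sketch and may differ from the actual implementation in name or signature.

    // Hedged sketch of a delay utility: resolves after `ms` milliseconds.
    export function delay(ms) {
        return new Promise((resolve) => setTimeout(resolve, ms));
    }

    // Usage, as seen in the streaming loop quoted later in this review:
    // await delay(12);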

Need help?
  • Type /help how to ... in the comments thread for any questions about Qodo Merge usage.
  • Check out the documentation for more information.

    PR Reviewer Guide 🔍

    Here are some key observations to aid the review process:

    ⏱️ Estimated effort to review: 4 🔵🔵🔵🔵⚪
    🧪 No relevant tests
    🔒 No security concerns identified
    ⚡ Recommended focus areas for review

    Race Condition

    The message queue handling logic has potential race conditions. The isHandlingQueue flag and the messageQueue array are mutated from multiple places, so a burst of stream messages arriving in quick succession, or the component unmounting while the character loop is still awaiting, can leave the flag and the queue in an inconsistent state (a hedged mitigation sketch follows the excerpt below).

    async function handleMesssageQueue(message) {
    	if (isHandlingQueue) return;
    
    	isHandlingQueue = true;
    	while (messageQueue.length > 0) {
    		const item = messageQueue.shift();
    		messageQueue = [...messageQueue];
    		if (!item) {
    			continue;
    		}
    
    		const lastMsg = dialogs[dialogs.length - 1];
    		if (lastMsg?.sender?.role !== UserRole.Assistant
    			|| lastMsg?.message_id !== message.message_id
    		) {
    			continue;
    		}
    
    		try {
    			for (const char of item.text) {
    				dialogs[dialogs.length - 1].text += char;
    				refresh();
    				await delay(12);
    			}
    		} catch (err) {
    			console.log(`Error when processing message queue`, err);
    		}
    	}
    	isHandlingQueue = false;
    }
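
    One possible mitigation, sketched below, keeps the single drain loop but gates it on a cancellation flag set in onDestroy and releases the isHandlingQueue guard in a finally block. This is an illustrative sketch, not the PR's implementation: queueCancelled and drainMessageQueue are hypothetical names, and messageQueue, dialogs, isHandlingQueue, UserRole, refresh() and delay() are assumed to be in component scope as in the excerpt above.

    import { onDestroy } from 'svelte';

    let queueCancelled = false;
    onDestroy(() => { queueCancelled = true; }); // stop draining once the component is destroyed

    async function drainMessageQueue(message) {
        if (isHandlingQueue) return; // concurrent callers just leave their items for the running loop
        isHandlingQueue = true;
        try {
            while (messageQueue.length > 0 && !queueCancelled) {
                const item = messageQueue.shift();
                messageQueue = [...messageQueue];
                const lastMsg = dialogs[dialogs.length - 1];
                if (!item
                    || lastMsg?.sender?.role !== UserRole.Assistant
                    || lastMsg?.message_id !== message.message_id) {
                    continue;
                }
                for (const char of item.text) {
                    if (queueCancelled) break; // bail out mid-message after unmount
                    dialogs[dialogs.length - 1].text += char;
                    refresh();
                    await delay(12);
                }
            }
        } finally {
            isHandlingQueue = false; // always release the guard, even if an update throws
        }
    }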
    Promise Wrapping

    Several functions wrap axios calls in explicit new Promise constructors even though axios already returns a promise. This is the promise-constructor anti-pattern: the extra wrapper adds no behaviour, obscures error propagation, and risks leaks if the outer promise is never handled (a simpler equivalent is sketched after the excerpt below).

        return new Promise((resolve, reject) => {
            axios.get(url).then(response => {
                resolve(response?.data);
            }).catch(err => {
                reject(err);
            });
        });
    }
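
    For comparison, the same behaviour without the wrapper is just returning the axios chain; getData and getDataAsync below are illustrative names, not functions from conversation-service.js.

        // Returning the axios chain directly; rejections propagate unchanged.
        function getData(url) {
            return axios.get(url).then(response => response?.data);
        }

        // Equivalent async/await form.
        async function getDataAsync(url) {
            const response = await axios.get(url);
            return response?.data;
        }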
    Performance Issue

    The character-by-character streaming with a 12ms delay could cause performance issues with long messages: every character triggers a refresh() and a re-render, and the total animation time grows linearly with message length, so a 2,000-character reply takes roughly 24 seconds to finish painting regardless of how quickly the server streamed it (a chunked alternative is sketched after the excerpt below).

    for (const char of item.text) {
    	dialogs[dialogs.length - 1].text += char;
    	refresh();
    	await delay(12);
    }
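
    A common mitigation, sketched below, appends a few characters per animation frame instead of one character per 12ms timer, cutting DOM updates roughly by the chunk size while keeping the typing effect. appendTextInChunks and CHUNK_SIZE are illustrative names; dialogs and refresh() are assumed to be in scope as in the excerpt.

    const CHUNK_SIZE = 4; // tuning value: characters appended per DOM update

    async function appendTextInChunks(text) {
        // One re-render per CHUNK_SIZE characters, paced by the browser's frame rate.
        for (let i = 0; i < text.length; i += CHUNK_SIZE) {
            dialogs[dialogs.length - 1].text += text.slice(i, i + CHUNK_SIZE);
            refresh();
            await new Promise((resolve) => requestAnimationFrame(resolve));
        }
    }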


    qodo-merge-pro bot commented Jun 27, 2025

    PR Code Suggestions ✨

    Explore these optional code suggestions:

    Category: Possible issue
    Ensure loading state resets on error

    If an error occurs during file upload or message sending, isSendingMsg may never
    be reset to false, causing the UI to remain in a loading state. Use a
    try/finally block to ensure isSendingMsg is always reset.

    src/routes/chat/[agentId]/[conversationId]/chat-box.svelte [669-730]

         async function sendChatMessage(msgText, data = null, conversationId = null) {
             isSendingMsg = true;
    -        clearInstantLogs();
    -        renewUserSentMessages(msgText);
    -        const agentId = params.agentId;
    -        const convId = conversationId || params.conversationId;
    -
    -
    -        let postback = data?.postback;
    -        // if (!postback) {
    -        // 	postback = buildPostbackMessage(dialogs, data?.payload || msgText, data?.truncateMsgId);
    -        // }
    -
    -        /** @type {import('$conversationTypes').MessageData?} */
    -        let messageData = {
    -            ...data,
    -            postback: postback,
    -            states: [
    -                ...data?.states || [],
    -                { key: "use_stream_message", value: PUBLIC_LIVECHAT_STREAM_ENABLED }
    -            ]
    -        };
    -
    -        /** @type {any[]} */
    -        let files = [];
    -        if (!!!messageData?.inputMessageId) {
    -            files = getChatFiles();
    +        try {
    +            clearInstantLogs();
    +            renewUserSentMessages(msgText);
    +            const agentId = params.agentId;
    +            const convId = conversationId || params.conversationId;
    +    
    +            let postback = data?.postback;
    +    
    +            /** @type {import('$conversationTypes').MessageData?} */
    +            let messageData = {
    +                ...data,
    +                postback: postback,
    +                states: [
    +                    ...data?.states || [],
    +                    { key: "use_stream_message", value: PUBLIC_LIVECHAT_STREAM_ENABLED }
    +                ]
    +            };
    +    
    +            /** @type {any[]} */
    +            let files = [];
    +            if (!!!messageData?.inputMessageId) {
    +                files = getChatFiles();
    +            }
    +            resetStorage();
    +    
    +            if (files?.length > 0 && !!!messageData.inputMessageId) {
    +                const filePayload = buildFilePayload(files);
    +                const resMessageId = await uploadConversationFiles(agentId, convId, files);
    +                messageData = { ...messageData, inputMessageId: resMessageId };
    +                if (!!filePayload) {
    +                    messageData = {
    +                        ...messageData,
    +                        // @ts-ignore
    +                        postback: {
    +                            ...postback,
    +                            payload: `${postback?.payload || msgText || ''}\r\n${filePayload}`
    +                        }
    +                    };
    +                }
    +            } else if (!!messageData?.inputMessageId) {
    +                const retFiles = await getConversationFiles(convId, messageData.inputMessageId, FileSourceType.User);
    +                const filePayload = buildFilePayload(retFiles);
    +                if (!!filePayload) {
    +                    messageData = {
    +                        ...messageData,
    +                        // @ts-ignore
    +                        postback: {
    +                            ...postback,
    +                            payload: `${postback?.payload || msgText || ''}\r\n${filePayload}`
    +                        }
    +                    };
    +                }
    +            }
    +    
    +            await sendMessageToHub(agentId, convId, msgText, messageData);
    +            deleteMessageDraft();
    +        } finally {
    +            isSendingMsg = false;
             }
    -        resetStorage();
    -
    -        if (files?.length > 0 && !!!messageData.inputMessageId) {
    -            const filePayload = buildFilePayload(files);
    -            const resMessageId = await uploadConversationFiles(agentId, convId, files);
    -            messageData = { ...messageData, inputMessageId: resMessageId };
    -            if (!!filePayload) {
    -                messageData = {
    -                    ...messageData,
    -                    // @ts-ignore
    -                    postback: {
    -                        ...postback,
    -                        payload: `${postback?.payload || msgText || ''}\r\n${filePayload}`
    -                    }
    -                };
    -            }
    -        } else if (!!messageData?.inputMessageId) {
    -            const retFiles = await getConversationFiles(convId, messageData.inputMessageId, FileSourceType.User);
    -            const filePayload = buildFilePayload(retFiles);
    -            if (!!filePayload) {
    -                messageData = {
    -                    ...messageData,
    -                    // @ts-ignore
    -                    postback: {
    -                        ...postback,
    -                        payload: `${postback?.payload || msgText || ''}\r\n${filePayload}`
    -                    }
    -                };
    -            }
    -        }
    -
    -        await sendMessageToHub(agentId, convId, msgText, messageData);
    -        deleteMessageDraft();
    -        isSendingMsg = false;
         }
    Suggestion importance[1-10]: 8


    Why: This suggestion correctly identifies a bug in error handling. If any of the await calls within the sendChatMessage function fail, isSendingMsg would not be reset to false, leaving the UI in a stuck loading state. The proposed try...finally block is the correct solution to ensure the state is always reset, significantly improving the robustness of the component.

    Impact: Medium
    Fix Svelte reactivity for streaming messages

    The function mutates dialogs[dialogs.length - 1].text directly, which may cause
    Svelte reactivity issues and unexpected UI behavior. Instead, reassign the
    dialogs array or the message object to ensure proper reactivity and state
    updates.

    src/routes/chat/[agentId]/[conversationId]/chat-box.svelte [543-572]

         async function handleMesssageQueue(message) {
             if (isHandlingQueue) return;
     
             isHandlingQueue = true;
             while (messageQueue.length > 0) {
                 const item = messageQueue.shift();
                 messageQueue = [...messageQueue];
                 if (!item) {
                     continue;
                 }
     
                 const lastMsg = dialogs[dialogs.length - 1];
                 if (lastMsg?.sender?.role !== UserRole.Assistant
                     || lastMsg?.message_id !== message.message_id
                 ) {
                     continue;
                 }
     
                 try {
    +                let updatedText = lastMsg.text || '';
                     for (const char of item.text) {
    -                    dialogs[dialogs.length - 1].text += char;
    +                    updatedText += char;
    +                    dialogs[dialogs.length - 1] = { ...lastMsg, text: updatedText };
                         refresh();
                         await delay(12);
                     }
                 } catch (err) {
                     console.log(`Error when processing message queue`, err);
                 }
             }
             isHandlingQueue = false;
         }
    Suggestion importance[1-10]: 6


    Why: The suggestion correctly identifies that direct mutation of an object property within an array can lead to reactivity issues in Svelte. Although the original code has a refresh() function that likely mitigates this, the proposed change of creating a new object for the update ({ ...lastMsg, text: updatedText }) is a more robust and idiomatic Svelte pattern, improving code clarity and maintainability.

    Impact: Low
