chatUtilsSample parse the response as text. #1146

Open
kadirtugarantibbva opened this issue Feb 14, 2025 · 0 comments
kadirtugarantibbva commented Feb 14, 2025

Extension sample

chat-sample

VS Code version

1.97.1

What went wrong?

    const libResult = chatUtils.sendChatParticipantRequest(
        request,
        chatContext,
        {
            prompt: 'You are a cat! Answer as a cat.',
            responseStreamOptions: {
                stream,
                references: true,
                responseText: true
            },
            tools
        },
        token
    );

    return await libResult.result;

Hello, I'm trying the chatUtilsSample.ts example and I want to access the final LLM response as a string.

I tried logging the libResult.result object, but it only contains metadata with tool-calling information.

I can see the response in the chat window, but I want to access that text programmatically. Is there a way to do that?
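One workaround I am considering (just a sketch, not confirmed against the library): since the sample forwards the model's output through the `stream` object passed in `responseStreamOptions`, I could wrap that stream and accumulate every markdown chunk before forwarding it. The `ChatResponseStream` interface below is a minimal stand-in for `vscode.ChatResponseStream`; in a real extension the wrapped object would be built around the `stream` the chat handler receives, and `captureMarkdown` is a hypothetical helper name.

```typescript
// Minimal stand-in for the part of vscode.ChatResponseStream used here.
interface ChatResponseStream {
    markdown(value: string): void;
}

// Wrap a response stream so every markdown chunk is both forwarded to the
// chat UI and accumulated into a string we can read afterwards.
function captureMarkdown(stream: ChatResponseStream): {
    stream: ChatResponseStream;
    getText(): string;
} {
    let text = '';
    const wrapped: ChatResponseStream = {
        markdown(value: string): void {
            text += value;          // accumulate the model's text
            stream.markdown(value); // forward to the real chat window
        },
    };
    return { stream: wrapped, getText: () => text };
}
```

The idea would be to pass the wrapped stream into `responseStreamOptions.stream` instead of the original one, then call `getText()` after `await libResult.result` to get the full response as a string. Whether this plays well with `responseText: true` is something I haven't verified.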

Thanks.
BR
