Extension sample: chat-sample
VS Code version: 1.97.1
```typescript
const libResult = chatUtils.sendChatParticipantRequest(
    request,
    chatContext,
    {
        prompt: 'You are a cat! Answer as a cat.',
        responseStreamOptions: {
            stream,
            references: true,
            responseText: true
        },
        tools
    },
    token
);
return await libResult.result;
```
Hello, I'm trying the chatUtilsSample.ts example, and I want to access the final LLM response as a string. I tried logging the `libResult.result` object, but it only shows some metadata with tool-calling information. I can see the response in the chat window, but I want to access that text programmatically. Is there a way to do that?

Thanks. BR
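One workaround, while waiting for an answer, is to capture the text yourself by wrapping the `ChatResponseStream` you pass in `responseStreamOptions.stream` and accumulating each markdown chunk as it is forwarded. The sketch below is a minimal, hedged illustration of that wrapping pattern: `ChatResponseStreamLike` and `makeCapturingStream` are hypothetical names invented here (the real `vscode.ChatResponseStream` has more methods), not part of the chat-sample or the `@vscode/chat-extension-utils` API.

```typescript
// Hedged sketch: collect streamed response text by wrapping the stream.
// `ChatResponseStreamLike` is a simplified stand-in for vscode.ChatResponseStream;
// in a real extension you would wrap the `stream` given to your request handler.
interface ChatResponseStreamLike {
  markdown(value: string): void;
}

function makeCapturingStream(
  inner: ChatResponseStreamLike,
  parts: string[]
): ChatResponseStreamLike {
  return {
    markdown(value: string): void {
      parts.push(value);      // record each chunk for later use
      inner.markdown(value);  // still forward it so the chat window renders it
    },
  };
}

// Usage with a stand-in stream object:
const parts: string[] = [];
const realStream: ChatResponseStreamLike = { markdown: () => {} };
const capturing = makeCapturingStream(realStream, parts);
capturing.markdown("Meow! ");
capturing.markdown("I am a cat.");
const finalText = parts.join(""); // "Meow! I am a cat."
```

After the request finishes, `finalText` holds the concatenated response. Note this only captures what the model streamed as markdown; tool-call metadata still lives in `libResult.result`.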