Is there any way to get lc_kwargs as an output from the stream without the callback handleLLMEnd?  #57

@josuasimatupang

Description

      let llmOutput = null;
      const prompt = getPrompt(autobotParameter?.prePrompt);
      const outputParser = new HttpResponseOutputParser();

      /**
       * Chat models stream message chunks rather than bytes, so this
       * output parser handles serialization and byte-encoding.
       */
      // const chain = RunnableSequence.from([prompt, model, outputParser]);

      const chain = prompt.pipe(model).pipe(outputParser);

      // Create a context object with all required template variables
      const contextData = {
        chat_history: formattedPreviousMessages,
        input: rewrittenIntent?.lc_kwargs.content,
        context: formatContext(getContext.data),
      };

      const stream = await chain.stream(contextData, {
        callbacks: [
          {
            // lc_serializable: true,
            handleLLMEnd: (output: any) => {
              console.log('🚀 ~ POST ~ output:', output);
              // Get lc_kwargs from output
              llmOutput = output.generations[0][0]?.message;
            },
          },
        ],
      });

      // Problem: `llmOutput` is usually still null here — the response is
      // returned before `handleLLMEnd` fires.
      return new StreamingTextResponse(stream);

If I use callbacks, the response is usually returned before the callback fires, so `llmOutput` is still null when I need it. Is there another approach I could use?
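One alternative (a sketch, not a confirmed fix): instead of reading `lc_kwargs` in `handleLLMEnd`, consume the model's chunk stream yourself, accumulate the chunks as they pass through, and forward the encoded bytes to the HTTP response. Below, `mockChunks` and `toByteStream` are hypothetical stand-ins — in the real code the source would be `await prompt.pipe(model).stream(contextData)` and the accumulation would merge `AIMessageChunk`s (e.g. via their `concat` method) so the full message, including `lc_kwargs`, is available once the stream is drained.

```typescript
// Shape of a streamed chunk, reduced to the two fields this sketch uses.
type Chunk = { content: string; lc_kwargs: { content: string } };

// Hypothetical stand-in for `await prompt.pipe(model).stream(contextData)`.
async function* mockChunks(): AsyncGenerator<Chunk> {
  for (const piece of ["Hel", "lo ", "world"]) {
    yield { content: piece, lc_kwargs: { content: piece } };
  }
}

// Forward each chunk's content as bytes while accumulating the full text.
// In real code, `accumulated` would instead be a running AIMessageChunk
// built with `concat`, whose `lc_kwargs` you can inspect at the end.
function toByteStream(source: AsyncIterable<Chunk>) {
  const encoder = new TextEncoder();
  let accumulated = "";
  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      for await (const chunk of source) {
        accumulated += chunk.content;
        controller.enqueue(encoder.encode(chunk.content));
      }
      // Only here, after the last chunk, is the full message available.
      controller.close();
    },
  });
  return { stream, getAccumulated: () => accumulated };
}
```

The caveat is inherent to streaming: the accumulated message is only complete once the stream has been fully consumed, so any logic that needs the final `lc_kwargs` has to run at close time — you cannot set response headers from it after `StreamingTextResponse` has already started sending.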
