Replies: 1 comment 2 replies
Hi @chaliy, thanks for reaching out to us. Could you write a small dummy script that elaborates on your case, and let us know the main blocker you hit when trying to achieve this?
-
An agent running in a loop calls the LLM, which may call a function to fetch extra context; once it has enough context, it generates an answer. Ideally, the agent should also support other common chat features, like chat history, citations, and guardrails.
ReAct Paper - https://arxiv.org/abs/2210.03629
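The loop described above can be sketched roughly as follows. This is a minimal illustration, not any real SDK: `call_llm`, `lookup_weather`, and the message format are all hypothetical stand-ins for a model endpoint and a tool.

```python
def lookup_weather(city):
    # Hypothetical tool: a real agent would call an external API here.
    return f"Sunny in {city}"

TOOLS = {"lookup_weather": lookup_weather}

def call_llm(history):
    # Stub LLM: decides whether to call a tool or answer.
    # A real agent would send `history` to a model endpoint instead.
    if not any(m["role"] == "tool" for m in history):
        return {"type": "tool_call", "name": "lookup_weather",
                "args": {"city": "Kyiv"}}
    context = [m["content"] for m in history if m["role"] == "tool"]
    return {"type": "answer", "content": f"Based on: {context[0]}"}

def run_agent(question, max_steps=5):
    # Chat history accumulates the user question and tool results.
    history = [{"role": "user", "content": question}]
    for _ in range(max_steps):
        step = call_llm(history)
        if step["type"] == "tool_call":
            # Call the requested function and feed the result back as context.
            result = TOOLS[step["name"]](**step["args"])
            history.append({"role": "tool", "content": result})
        else:
            # Enough context gathered: return the final answer.
            return step["content"]
    return "Could not reach an answer within the step budget."

print(run_agent("What's the weather in Kyiv?"))
```

Guardrails and citations would slot into the same loop: validate `step` before executing a tool, and attach tool results as citations to the final answer.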