getting null data in input and output columns 😬 #3396
-
So, I'm trying to trace with `start_trace()` on a Hugging Face LLM using LangChain's LlamaCpp wrapper. Tracing works and I can see the raw JSON spans, but in the trace table the Input, Output, Total tokens, and Status columns are all empty. This is the kind of raw JSON I'm getting, and this is the code I'm working with (the prompt template body and the `llm` definition are omitted here, as in the original post):

```python
from promptflow.tracing import start_trace
from opentelemetry.instrumentation.langchain import LangchainInstrumentor
from langchain_core.callbacks import CallbackManager, StreamingStdOutCallbackHandler
from langchain_core.prompts import PromptTemplate
import os

start_trace()
instrumentor = LangchainInstrumentor()
instrumentor.instrument()  # register the instrumentor so LangChain calls emit spans

callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])

Custom_Prompt_Template = """..."""  # template body truncated in the original post

prompt = PromptTemplate(input_variables=["input"], template=Custom_Prompt_Template)
llm_chain = prompt | llm  # llm (the LlamaCpp model) is defined elsewhere
input = "What is the difference between llm and slm?"
```
-
@thy09 Could you please take a look at this? Thanks.
-
This is by design: the traces are produced by `LangchainInstrumentor`, which follows a third-party contract, so PF only shows the parent-child structure in the UI and no further details.
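To make the empty columns concrete: the trace table fills Input, Output, and Total tokens by reading well-known attributes off each span, and spans emitted by a third-party instrumentor may simply not carry them. Below is a minimal sketch of that extraction idea; the attribute names `inputs`, `output`, and `llm.usage.total_tokens` are illustrative assumptions, not a confirmed PF contract:

```python
import json

def table_row(span: dict) -> dict:
    """Derive one trace-table row from a raw span dict.

    The attribute keys below are assumptions for illustration; spans
    produced by a third-party instrumentor may use different keys, in
    which case every column comes back as None.
    """
    attrs = span.get("attributes", {})

    def maybe_json(value):
        # Span attributes are often JSON-encoded strings; fall back to
        # the raw value (or None) if decoding fails.
        try:
            return json.loads(value)
        except (TypeError, ValueError):
            return value

    return {
        "input": maybe_json(attrs.get("inputs")),
        "output": maybe_json(attrs.get("output")),
        "total_tokens": attrs.get("llm.usage.total_tokens"),
        "status": span.get("status", {}).get("status_code"),
    }

# A span carrying the expected attributes yields a fully populated row...
row = table_row({
    "attributes": {
        "inputs": '{"input": "What is the difference between llm and slm?"}',
        "output": '"An LLM is larger than an SLM."',
        "llm.usage.total_tokens": 42,
    },
    "status": {"status_code": "OK"},
})

# ...while a span without them (the symptom in this thread) yields
# all-null columns even though the raw JSON itself looks fine.
empty = table_row({"attributes": {}, "status": {}})
```

So the raw JSON being visible while the columns stay null is consistent: the span tree exists, but the specific attributes the table looks for were never set by the instrumentor.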
See this issue: #3080