Replies: 1 comment 1 reply
- Have you set custom pricing for vLLM, @Huyueeer? https://docs.litellm.ai/docs/proxy/custom_pricing#usage-with-openai-proxy-server-1
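  For reference, a minimal sketch of what that custom-pricing setup could look like in the proxy's `config.yaml`, following the linked docs. The model names, `api_base` URL, and per-token cost values here are placeholders, not taken from this thread:

  ```yaml
  model_list:
    # Azure OpenAI deployment: LiteLLM knows the pricing for hosted Azure
    # models, so usage/spend is tracked automatically.
    - model_name: azure-qwen
      litellm_params:
        model: azure/<your_deployment_name>
        api_key: os.environ/AZURE_API_KEY
        api_base: os.environ/AZURE_API_BASE

    # vLLM exposed via its OpenAI-compatible endpoint. LiteLLM has no
    # built-in pricing for self-hosted models, so without explicit
    # per-token costs, spend may not be recorded for this model.
    - model_name: qwen-vllm
      litellm_params:
        model: openai/qwen                  # placeholder model name on the vLLM server
        api_base: http://localhost:8000/v1  # placeholder vLLM endpoint
        input_cost_per_token: 0.0000008     # placeholder custom pricing
        output_cost_per_token: 0.0000008
  ```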
1 reply
- My proxy serves a Qwen model through Azure OpenAI and another Qwen model through vLLM inference. I found that the Azure OpenAI calls are recorded in the usage tracking, but calls to the vLLM-served Qwen are not.