[Bug/Feature Request]: Snowflake Cortex now supports function calling but LiteLLM doesn't expose it
Description
The Snowflake Cortex REST API now supports function calling (tools) for certain models like Claude 3.5 Sonnet, but LiteLLM's Snowflake provider does not include `tools` and `tool_choice` in its supported parameters list.
Current Behavior
When attempting to use tools with Snowflake models through the Responses API:
```python
import litellm

litellm.drop_params = True  # required; otherwise LiteLLM raises an unsupported-parameter error

response = litellm.responses(
    model="snowflake/claude-3-5-sonnet",
    input="What's the weather in Paris?",
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"}
                },
                "required": ["location"]
            }
        }
    }]
)
```
Result: the `tools` parameter is silently dropped and the model responds with plain text instead of making function calls.
Expected Behavior
The tools should be passed through to Snowflake's API and the model should respond with function calls.
Root Cause
In `litellm/llms/snowflake/chat/transformation.py`, line 33:
```python
def get_supported_openai_params(self, model: str) -> List:
    return ["temperature", "max_tokens", "top_p", "response_format"]
```
Missing: `tools`, `tool_choice`
Evidence from Snowflake Documentation
According to the official Snowflake Cortex REST API documentation:
Tool/Function Calling:
- Supported only with specific models like Claude 3.5 Sonnet and Claude 3.7 Sonnet
- Allows configuring tool specifications, tool choices, and tool use/results
- Supports a "chain of thought" approach with tool execution
Key Parameters:
- `tools`: defines the available tools for function calling
- `tool_choice`: controls tool usage behavior
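For reference, these are the standard OpenAI-style `tool_choice` values that LiteLLM already normalizes for other providers; whether Snowflake needs provider-specific translation of these is an open question for the fix (the values below are the generic OpenAI forms, not confirmed Snowflake-specific behavior):

```python
# Standard OpenAI-style tool_choice values; Snowflake-specific mapping,
# if any, would live in the provider's transformation layer.
auto_choice = "auto"          # model decides whether to call a tool
required_choice = "required"  # model must call some tool
forced_choice = {             # force one specific function
    "type": "function",
    "function": {"name": "get_weather"},
}
```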
Historical Context
PR #8950 (merged March 13, 2025) added Snowflake REST API support but explicitly noted that Snowflake did NOT support tool calling at that time. Snowflake has since added this feature.
Proposed Fix
This appears to be a simple fix. Following the pattern from other providers (Gemini, Anthropic), we need to:
- Update `litellm/llms/snowflake/chat/transformation.py` line 33:
```python
def get_supported_openai_params(self, model: str) -> List:
    return [
        "temperature",
        "max_tokens",
        "top_p",
        "response_format",
        "tools",        # ADD
        "tool_choice",  # ADD
    ]
```
- Optionally: Add model-specific logic since only certain models support tools (Claude 3.5/3.7 Sonnet, etc.)
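A minimal sketch of that optional model gating (the allow-list contents are an assumption and should be confirmed against Snowflake's model catalog before merging):

```python
from typing import List

# Hypothetical allow-list; the actual set of Snowflake Cortex models that
# support tool calling must be verified against Snowflake's documentation.
TOOL_CALLING_MODELS = {"claude-3-5-sonnet", "claude-3-7-sonnet"}

def get_supported_openai_params(model: str) -> List[str]:
    params = ["temperature", "max_tokens", "top_p", "response_format"]
    # Only advertise tools/tool_choice for models known to support them.
    if model in TOOL_CALLING_MODELS:
        params += ["tools", "tool_choice"]
    return params
```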
Comparison with Other Providers
Gemini (`litellm/llms/gemini/chat/transformation.py:76-99`):
```python
def get_supported_openai_params(self, model: str) -> List[str]:
    supported_params = [
        "temperature",
        "top_p",
        "max_tokens",
        "tools",        # ✅
        "tool_choice",  # ✅
        "functions",
        # ...
    ]
```
Anthropic (`litellm/llms/anthropic/chat/transformation.py`):
```python
def get_supported_openai_params(self, model: str):
    params = [
        "stream",
        "temperature",
        "top_p",
        "max_tokens",
        "tools",        # ✅
        "tool_choice",  # ✅
        # ...
    ]
```
Environment
- LiteLLM version: 1.77.7 (latest)
- Python version: 3.13
- Snowflake Account Region: us-east-1
- Model tested: `snowflake/claude-3-5-sonnet`
Additional Notes
The Responses API transformation layer already handles tools correctly (see `litellm/responses/litellm_completion_transformation/transformation.py:108-118`), so this is purely a provider configuration update.
Impact: Medium - Blocks users from using function calling with Snowflake Cortex models that support it
Complexity: Low - Simple parameter list update
Urgency: Medium - Feature gap with upstream provider capabilities