Commit ccafdda — Update index.mdx
1 parent fb4f43f

1 file changed: docs/features/plugin/tools/index.mdx
Lines changed: 57 additions & 11 deletions
@@ -29,18 +29,8 @@ Explore ready-to-use tools here:

 There are two easy ways to install Tools in Open WebUI:

-### Option 1: Manual Download & Import
-
-1. Visit the [Community Tool Library](https://openwebui.com/tools)
-2. Click on a Tool you like.
-3. Click the blue Get button > “Download as JSON export.”
-4. Go to Workspace ➡️ Tools in Open WebUI.
-5. Click “Import Tools” and upload the downloaded file.
-
-### Option 2: One-Click Import from the Web
-
 1. Go to [Community Tool Library](https://openwebui.com/tools)
-2. Choose a Tool, then click the blue Get button.
+2. Choose a Tool, then click the Get button.
 3. Enter your Open WebUI instance’s IP address or URL.
 4. Click “Import to WebUI” — done!
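For context on what gets imported: under the hood, a community Tool is a Python module that exposes a `Tools` class whose methods become callable tools. A minimal sketch, with a hypothetical method that is not taken from any real community Tool:

```python
import datetime


# Illustrative sketch of a minimal Open WebUI-style Tool module.
# The class name `Tools` follows Open WebUI's convention; the method
# below is hypothetical, for demonstration only.
class Tools:
    def get_current_time(self) -> str:
        """Return the current date and time as an ISO 8601 string."""
        return datetime.datetime.now().isoformat(timespec="seconds")
```

Type hints and docstrings like these describe the tool to the model, so it's worth keeping them accurate and specific.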

@@ -78,6 +68,62 @@ You can also let your LLM auto-select the right Tools using the AutoTool Filter:

✅ And that’s it — your LLM is now Tool-powered! You’re ready to supercharge your chats with web search, image generation, voice output, and more.
---

# 🧠 Choosing How Tools Are Used: Default vs Native

Once Tools are enabled for your model, Open WebUI gives you two ways to let your LLM use them in conversations. You decide how the model should call Tools by choosing between:

- 🟡 Default Mode (prompt-based)
- 🟢 Native Mode (built-in function calling)

Let’s break it down:
### 🟡 Default Mode (Prompt-based Tool Triggering)

This is the default setting in Open WebUI.

Here, your LLM doesn’t need to natively support function calling. Instead, Open WebUI guides the model with ReACT-style prompting (Reasoning + Acting) to select and use a Tool.

✅ Works with almost any model
✅ A great way to unlock Tools with basic or local models
❗ Not as reliable or flexible as Native Mode when chaining tools
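The prompt-based flow can be sketched as a minimal ReACT-style loop. This is purely illustrative — the prompt wording, tool registry, and parsing below are not Open WebUI's actual internals:

```python
import re

# Hypothetical tool registry: name -> callable.
TOOLS = {
    "get_time": lambda: "2024-01-01T12:00:00",
}

# A ReACT-style prompt asks the model, in plain text, to emit an
# "Action:" line when it wants to use a tool.
PROMPT_TEMPLATE = (
    "You may use these tools: {tools}.\n"
    "To use one, reply exactly with: Action: <tool_name>\n"
    "Question: {question}"
)


def run_react_turn(model_reply: str) -> str:
    """Parse the model's reply; run a tool if it requested one."""
    match = re.search(r"Action:\s*(\w+)", model_reply)
    if match and match.group(1) in TOOLS:
        observation = TOOLS[match.group(1)]()
        # The observation would be fed back to the model for its next turn.
        return f"Observation: {observation}"
    return model_reply  # no tool call — treat as a final answer


print(run_react_turn("Action: get_time"))  # → Observation: 2024-01-01T12:00:00
```

Because the tool call is just text that must be parsed, weaker models can mangle the format — which is why this mode is less reliable than native function calling when chaining tools.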
### 🟢 Native Mode (Built-in Function Calling)

If your model supports native function calling (like GPT-4o or GPT-3.5-turbo-1106), you can use this mode to let the LLM decide, in real time, when and how to call multiple Tools within a single chat message.

✅ Fast, accurate, and can chain multiple Tools in one response
✅ The most natural and advanced experience
❗ Requires a model that actually supports native function calling
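In native mode, tools are described to the model as structured schemas rather than prompt text, and the model replies with a structured tool call. A minimal sketch of the OpenAI-style shapes involved (the tool name and schema are illustrative):

```python
import json

# OpenAI-style tool definition: a JSON schema the model receives directly,
# instead of a text prompt. The tool name here is hypothetical.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_time",
            "description": "Get the current date and time.",
            "parameters": {"type": "object", "properties": {}, "required": []},
        },
    }
]

# The model's reply contains a structured tool call, shaped roughly like:
example_tool_call = {
    "id": "call_1",
    "type": "function",
    "function": {"name": "get_current_time", "arguments": "{}"},
}

# The client parses the arguments, runs the named function, and sends the
# result back to the model — no fragile text parsing required.
args = json.loads(example_tool_call["function"]["arguments"])
print(example_tool_call["function"]["name"], args)  # → get_current_time {}
```

This structured round trip is what makes native mode faster and more reliable at chaining several Tools in one response.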
### ✳️ How to Switch Between Modes

Want to enable native function calling in your chats? Here’s how:

1. Open the chat window with your model.
2. Click ⚙️ Chat Controls > Advanced Params.
3. Find the Function Calling setting and switch it from Default → Native.

That’s it! Your chat now uses true native Tool support (as long as the model supports it).

➡️ We recommend GPT-4o or another OpenAI model for the best native function-calling experience.
🔎 Some local models may claim support but often struggle with accurate or complex Tool usage.
💡 Summary:

| Mode    | Who it’s for                | Pros                                 | Cons                               |
|---------|-----------------------------|--------------------------------------|------------------------------------|
| Default | Any model, even local ones  | Broad compatibility, safer, flexible | May be less accurate or slower     |
| Native  | GPT-4o, GPT-3.5-turbo, etc. | Fast, smart, excellent tool chaining | Needs proper function-call support |

Choose the one that works best for your setup — and remember, you can always switch on the fly via Chat Controls.

👏 And that’s it — your LLM now knows how and when to use Tools, intelligently.
---

# 🧠 Summary
