Hi team!
Super stoked about everything that folks have built here. I am trying to develop a custom coding assistance stack using OpenWebUI + Continue as the front end.
My ideal setup would be to configure my MCP servers as "tools" in OpenWebUI (running on localhost and therefore, as I understand it, registered as "personal" tools rather than Admin tools), and then leverage those MCP servers from Continue by using the OpenWebUI endpoint as my model provider. That way I can customize model wrappers in OWUI while still making MCP available in Continue, without having to run separate processes (one for OWUI via mcpo and another via the Continue config).
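To make it concrete, the OWUI side of what I'm describing is roughly this (the mcpo command shape is from its README; the port, model name, and API base are placeholders from my local setup, so treat them as guesses):

```shell
# Expose an MCP server to Open WebUI as an OpenAPI tool server via mcpo
# (the server after "--" is just the example from the mcpo README)
uvx mcpo --port 8000 -- uvx mcp-server-time --local-timezone=America/New_York
```

and then pointing Continue at OWUI's OpenAI-compatible endpoint in `config.yaml`, something like:

```yaml
models:
  - name: owui-wrapper          # placeholder name
    provider: openai            # OWUI speaks the OpenAI-compatible API
    model: my-custom-model      # whatever model wrapper I define in OWUI
    apiBase: http://localhost:3000/api   # not 100% sure this is the right base path
    apiKey: <OWUI API key>
```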
Does this make sense? I'm wondering whether it's even possible. My guess is that since Continue only uses the OWUI endpoint for LLM inference, it would treat that as entirely separate from any MCP service, so I'd need to configure MCP in Continue directly anyway? But I'm kind of a n00b here, so the separation of concerns isn't clear to me. I do understand that tool use originates from the LLM, not from Continue, in any case.
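If that guess is right, I assume the fallback is registering the MCP servers in Continue's own config alongside the model block — something like this unverified sketch based on my reading of the Continue docs:

```yaml
mcpServers:
  - name: time                # placeholder; any MCP server
    command: uvx
    args:
      - mcp-server-time
```

which is exactly the second process I was hoping to avoid running.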
Curious to hear any opinions on best practices. Thanks!