feat: add connection status check for Ollama and LM Studio #2968
Merged: ComputelessComputer merged 5 commits into main from devin/1768061693-local-llm-connection-status on Jan 10, 2026
+271 −11
Conversation
- Add lightweight connection check functions for Ollama and LM Studio
- Create `useLocalProviderStatus` hook to check if local providers are running
- Display connection status badge (Connected/Not Running) in provider cards
- Status is checked on mount and periodically refreshed every 15 seconds

Co-Authored-By: [email protected] <[email protected]>
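The Ollama side of such a check can be a plain HTTP probe against its local API with a short timeout. A minimal sketch of what this might look like (the function name, default port, and error handling here are illustrative assumptions, not the PR's exact code; LM Studio is checked via its SDK instead):

```ts
// Probe the local Ollama server and report whether it is reachable.
// Ollama exposes a cheap /api/tags endpoint, on port 11434 by default.
async function checkOllamaConnection(
  baseUrl = "http://127.0.0.1:11434",
  timeoutMs = 2000, // matches the 2-second timeout described in the PR
): Promise<boolean> {
  try {
    const res = await fetch(`${baseUrl}/api/tags`, {
      // Abort the request if the server does not answer in time.
      signal: AbortSignal.timeout(timeoutMs),
    });
    return res.ok;
  } catch {
    // Connection refused, network error, or timeout: treat as not running.
    return false;
  }
}
```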
Contributor · Author
🤖 Devin AI Engineer: I'll be helping with this pull request! Here's what you should know: ✅ I will automatically:
Note: I can only respond to comments from users who have write access to this repository. ⚙️ Control Options:
Prevent local LLM providers (`ollama`, `lmstudio`) from being selectable in the provider dropdown when they are not connected. The change queries local provider connection status via `useLocalProviderStatus` and uses it to disable the `SelectItem` for those providers and to show a small connected indicator (sketched below). It also treats connected local providers as eligible when building the provider mapping, so they bypass the normal eligibility blockers.
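A rough sketch of that dropdown gating (the hook and `SelectItem` names come from the summary above; the component structure, import paths, and styling are assumptions):

```tsx
import { SelectItem } from "@/components/ui/select"; // assumed path
import { useLocalProviderStatus } from "@/hooks/use-local-provider-status"; // assumed path

// Render a provider option that is only selectable while its server runs.
function LocalProviderItem({ provider }: { provider: "ollama" | "lmstudio" }) {
  const { status } = useLocalProviderStatus(provider);
  const connected = status === "connected";

  return (
    <SelectItem value={provider} disabled={!connected}>
      {provider}
      {/* Small dot as the "connected" indicator */}
      {connected && (
        <span className="ml-2 inline-block size-2 rounded-full bg-green-500" />
      )}
    </SelectItem>
  );
}
```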
Force-pushed from 7a02814 to 9cf7e03

…providers
Co-Authored-By: [email protected] <[email protected]>

Force-pushed from 67f37bb to 6592041
Summary
Adds connection status indicators for local LLM providers (Ollama and LM Studio) in the AI settings view. Previously, these providers were always shown as available options even when not running. Now users can see at a glance whether the local provider is connected or not.
Changes:
- Add lightweight connection check functions for Ollama (via `/api/tags`) and LM Studio (via SDK), each with a 2-second timeout
- Add a `useLocalProviderStatus` hook using React Query to check status on mount and every 15 seconds (see the sketch below)

This was inspired by how Zed handles local LLM providers (a reference screenshot was provided).
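Given that description, a plausible shape for the hook (the status values, query key, and stand-in `checkConnection` function are assumptions; the `{ status, refetch }` return shape and the 15-second interval come from the PR notes):

```ts
import { useQuery } from "@tanstack/react-query";

type LocalProvider = "ollama" | "lmstudio";
type LocalProviderStatus = "connected" | "not_running";

// Stand-in for the PR's per-provider checks (Ollama via /api/tags,
// LM Studio via its SDK), each resolving within the 2-second timeout.
declare function checkConnection(provider: LocalProvider): Promise<boolean>;

export function useLocalProviderStatus(provider: LocalProvider) {
  const query = useQuery({
    queryKey: ["local-provider-status", provider],
    queryFn: async (): Promise<LocalProviderStatus> =>
      (await checkConnection(provider)) ? "connected" : "not_running",
    // Re-check every 15 seconds so the badge tracks the server's state.
    refetchInterval: 15_000,
  });

  // Expose refetch alongside the status so callers can force an
  // immediate re-check.
  return { status: query.data ?? "not_running", refetch: query.refetch };
}
```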
Review & Testing Checklist for Human
- Start Ollama (`ollama serve`), open AI settings, and verify the "Connected" badge appears for Ollama

Notes
- The LM Studio connection URL (`ws:127.0.0.1:${port}`) matches the existing pattern in `list-lmstudio.ts`
- The `useLocalProviderStatus` hook now returns `{ status, refetch }` instead of just the status string

Link to Devin run: https://app.devin.ai/sessions/73abfdc29f61482aa399cb6aa16ae370
Requested by: @ComputelessComputer