@devin-ai-integration bot commented Jan 10, 2026

Summary

Adds connection status indicators for local LLM providers (Ollama and LM Studio) in the AI settings view. Previously, these providers were always shown as available options even when not running. Now users can see at a glance whether the local provider is connected or not.

Changes:

  • Added lightweight connection-check functions that ping Ollama (/api/tags) and LM Studio (via its SDK) with a 2-second timeout; a sketch follows this list
  • Created a useLocalProviderStatus hook using React Query to check status on mount and every 15 seconds (also sketched below)
  • Added a status badge to provider cards showing "Connected" (green) or "Not Running" (gray)
  • Added a "Connect" button that appears when a provider is not running and triggers a manual status recheck
  • Added download and model links for local providers (e.g., "Download Ollama", "View All Models")
  • Integrated status into the provider dropdown: local providers are disabled when not connected and show a green dot when connected

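As a rough illustration (not the PR's actual code), the connection checks could look like the TypeScript sketch below. Only the /api/tags endpoint, the SDK-based LM Studio check, and the 2-second budget come from the description above; the function names, default ports, and the Promise.race-based timeout for the LM Studio side are assumptions.

```ts
import { LMStudioClient } from "@lmstudio/sdk";

// Ollama: GET /api/tags, aborted after 2 seconds.
export async function checkOllamaConnection(
  baseUrl = "http://localhost:11434", // Ollama's default port; assumed here
): Promise<boolean> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), 2000);
  try {
    const res = await fetch(`${baseUrl}/api/tags`, { signal: controller.signal });
    return res.ok;
  } catch {
    // Connection refused, network error, or the 2s abort: treat as not running.
    return false;
  } finally {
    clearTimeout(timer);
  }
}

// LM Studio: open the SDK's WebSocket client and list loaded models,
// racing the call against the same 2-second budget.
export async function checkLmStudioConnection(port = 1234): Promise<boolean> {
  const timeout = new Promise<never>((_, reject) =>
    setTimeout(() => reject(new Error("timeout")), 2000),
  );
  try {
    const client = new LMStudioClient({ baseUrl: `ws://127.0.0.1:${port}` });
    await Promise.race([client.llm.listLoaded(), timeout]);
    return true;
  } catch {
    return false;
  }
}
```
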
This was inspired by how Zed handles local LLM providers (a reference screenshot was provided).
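
The useLocalProviderStatus hook might be wired roughly as follows with React Query; the query key and the injected checker are illustrative. Per the note below, it returns { status, refetch } so the Connect button can force a manual recheck.

```ts
import { useQuery } from "@tanstack/react-query";

export type LocalProviderStatus = "connected" | "not_running";

export function useLocalProviderStatus(
  provider: "ollama" | "lmstudio",
  check: () => Promise<boolean>, // e.g. checkOllamaConnection from the sketch above
) {
  const { data, refetch } = useQuery({
    queryKey: ["local-provider-status", provider], // illustrative key
    queryFn: async (): Promise<LocalProviderStatus> =>
      (await check()) ? "connected" : "not_running",
    refetchInterval: 15_000, // periodic refresh every 15 seconds
  });

  // Treats "still checking" as not running for simplicity; refetch backs
  // the "Connect" button's manual recheck.
  return { status: data ?? "not_running", refetch } as const;
}
```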

Review & Testing Checklist for Human

  • Test with Ollama running: Start Ollama (ollama serve), open AI settings, verify "Connected" badge appears for Ollama
  • Test with Ollama stopped: Stop Ollama, verify "Not Running" badge appears (may take up to 15s to update)
  • Test with LM Studio running: Start LM Studio server, verify "Connected" badge appears
  • Test with LM Studio stopped: Verify "Not Running" badge appears
  • Test Connect button: When provider shows "Not Running", click Connect button and verify it triggers a status recheck
  • Test download/models links: Click the links (e.g., "Download Ollama", "View All Models") and verify they open in browser
  • Test provider dropdown: Verify local providers are disabled in the dropdown when not running, and show green dot when connected
  • Verify UI appearance: Check that the status badge and Connect button don't break the accordion trigger layout

Notes

  • The LM Studio WebSocket URL format (ws:127.0.0.1:${port}) matches the existing pattern in list-lmstudio.ts
  • I was unable to test this locally since it requires the full Tauri desktop environment, so human testing is essential
  • The useLocalProviderStatus hook now returns { status, refetch } instead of just the status string

Link to Devin run: https://app.devin.ai/sessions/73abfdc29f61482aa399cb6aa16ae370
Requested by: @ComputelessComputer

- Add lightweight connection check functions for Ollama and LM Studio
- Create useLocalProviderStatus hook to check if local providers are running
- Display connection status badge (Connected/Not Running) in provider cards
- Status is checked on mount and periodically refreshed every 15 seconds

Co-Authored-By: [email protected] <[email protected]>
@devin-ai-integration bot commented

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR that start with 'DevinAI' or '@devin'.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring


netlify bot commented Jan 10, 2026

Deploy Preview for hyprnote-storybook canceled.

🔨 Latest commit: 6592041
🔍 Latest deploy log: https://app.netlify.com/projects/hyprnote-storybook/deploys/696291c0ed4b810007c96b94


netlify bot commented Jan 10, 2026

Deploy Preview for hyprnote canceled.

🔨 Latest commit: 6592041
🔍 Latest deploy log: https://app.netlify.com/projects/hyprnote/deploys/696291c09edaee0008536d32


netlify bot commented Jan 10, 2026

Deploy Preview for howto-fix-macos-audio-selection canceled.

🔨 Latest commit: 6592041
🔍 Latest deploy log: https://app.netlify.com/projects/howto-fix-macos-audio-selection/deploys/696291c0cc64ed00087c984e

devin-ai-integration bot and others added 2 commits January 10, 2026 16:25
Prevent local LLM providers (ollama, lmstudio) from being selectable in the provider dropdown when they are not connected. The change queries local provider connection status via useLocalProviderStatus and uses it to disable the SelectItem for those providers and to show a small connected indicator. Connected local providers are also treated as eligible when building the provider mapping, so they bypass the normal eligibility blockers.
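
A rough idea of that dropdown wiring is sketched below; the component name, import paths, and styling are hypothetical (a shadcn/Radix-style SelectItem is assumed).

```tsx
import { SelectItem } from "@/components/ui/select"; // assumed shadcn-style select
import { useLocalProviderStatus } from "./use-local-provider-status"; // hypothetical path

function LocalProviderItem({
  provider,
  check,
}: {
  provider: "ollama" | "lmstudio";
  check: () => Promise<boolean>;
}) {
  const { status } = useLocalProviderStatus(provider, check);
  const connected = status === "connected";

  return (
    // Disabled unless the local provider is actually reachable
    <SelectItem value={provider} disabled={!connected}>
      {provider === "ollama" ? "Ollama" : "LM Studio"}
      {/* small green dot when connected */}
      {connected && <span className="ml-2 inline-block size-2 rounded-full bg-green-500" />}
    </SelectItem>
  );
}
```
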
@ComputelessComputer force-pushed the devin/1768061693-local-llm-connection-status branch from 7a02814 to 9cf7e03 on January 10, 2026 at 17:23
@ComputelessComputer force-pushed the devin/1768061693-local-llm-connection-status branch from 67f37bb to 6592041 on January 10, 2026 at 17:51
@ComputelessComputer merged commit ce765d9 into main on Jan 10, 2026
21 of 22 checks passed
@github-project-automation bot moved this from Backlog to Done in Hyprnote v1 on Jan 10, 2026
@ComputelessComputer deleted the devin/1768061693-local-llm-connection-status branch on January 10, 2026 at 17:58
yujonglee added a commit that referenced this pull request Jan 11, 2026