Remote LLM only supporting one chat. #436

@Jflick58

Description

Describe the bug
I have a remote LLM that serves as our internal proxy for Azure OpenAI and Google Gemini. I have configured it correctly, as it does occasionally work. However, I often get an error pop-up:

Error: LLM not found.
Stack Trace: Error: LLM not found.
    at QA (file:///Applications/Reor.app/Contents/Resources/app.asar/dist/assets/index-92e5223c.js:148:644)
    at async O (file:///Applications/Reor.app/Contents/Resources/app.asar/dist/assets/index-92e5223c.js:259:6679)

To Reproduce
Steps to reproduce the behavior:

  1. Set up a remote LLM (proxied Azure OpenAI GPT-4o)
  2. Set the default LLM to GPT-4o
  3. Start a chat (it will work)
  4. Start a new chat with the same LLM (or switch between chats)
  5. See the error

Expected behavior
Chats should function independently so that I can switch LLMs or have multiple separate chats with the same LLM.

Barring that, exposing logs or a more informative error message would be helpful.

Desktop (please complete the following information):

  • OS: macOS
  • Hardware: MacBook Pro 14-inch, 2021 (M1 Pro, 10-core), 16 GB RAM
  • Version: Sonoma 14.6.1

Additional context
Love the tool and hope to contribute when I get some free time!
