
πŸš€ Open Chat (Fork of Open WebUI v0.6.5)


A self-hosted fork of Open WebUI v0.6.5, distributed under the original BSD-3-Clause license. This fork preserves the BSD-3-Clause license terms as of 2024 and removes the post-2025 modifications that introduced rebranding restrictions.


πŸ“¦ Live Demo

Try it for free:
http://ailabs.chat (served via aiLabs.ar infrastructure)


πŸ› οΈ Installation

βœ… Verified Docker Command

docker run -d \
  --network=host \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  -e ENABLE_WEBSOCKET_SUPPORT=false \
  -v open-chat-data:/app/backend/data \
  --name open-chat \
  --restart always \
  ghcr.io/ailabsarg/open-chat:0.7.0
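After starting the container, you can confirm the UI is up before opening a browser. This is a minimal sketch assuming the container name and port from the command above (with `--network=host` the app listens on the host at port 8080); the `wait_for_open_chat` helper name is ours, not part of the project:

```shell
# Poll the UI until it answers or the timeout expires.
# Usage: wait_for_open_chat [url] [tries]
wait_for_open_chat() {
  url="${1:-http://localhost:8080}"
  tries="${2:-30}"
  i=0
  while [ "$i" -lt "$tries" ]; do
    # -f makes curl fail on HTTP errors; -sS keeps output quiet but shows real errors.
    if curl -fsS -o /dev/null "$url" 2>/dev/null; then
      echo "up"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "down"
  return 1
}
```

For example, `wait_for_open_chat http://localhost:8080 60` waits up to a minute for the first startup, which can be slow while the backend initializes its database.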

πŸ› οΈ Optional: GPU Support

docker run -d \
  --network=host \
  --gpus all \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  -e ENABLE_WEBSOCKET_SUPPORT=false \
  -v open-chat-data:/app/backend/data \
  --name open-chat \
  --restart always \
  ghcr.io/ailabsarg/open-chat:0.7.0-cuda
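Before pulling the CUDA image, it can save time to check that the host is actually GPU-ready. A rough sketch (the `gpu_ready` helper name is ours; the `docker info` grep is a heuristic for a registered NVIDIA runtime, not an official check):

```shell
# Heuristic pre-flight check for the CUDA image.
gpu_ready() {
  # nvidia-smi present means the NVIDIA driver is installed.
  command -v nvidia-smi >/dev/null 2>&1 || { echo "no-driver"; return 1; }
  # A registered "nvidia" runtime suggests the Container Toolkit is configured.
  docker info 2>/dev/null | grep -qi nvidia || { echo "no-toolkit"; return 1; }
  echo "ok"
}
```

If `gpu_ready` prints `ok`, the `--gpus all` flag in the command above should work; otherwise install the driver and the NVIDIA Container Toolkit first (see the Tip below on Docker setup).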

⚠️ Important:

  • These commands use our fork's Docker image (ghcr.io/ailabsarg/open-chat)
  • The :0.7.0 tag ensures users get the original BSD-3-Clause version

πŸ“ Key Features

  • License Compliance: BSD-3-Clause (original terms only)
  • No Rebranding Restrictions: Unlike later versions
  • Self-Hosted: Full control over data and deployment
  • Modular Architecture: Easy to extend with plugins

πŸ“¦ Build from Source (Optional)

If you want to build directly from our fork's codebase:

git clone https://github.com/ailabsarg/open-chat.git
cd open-chat

# Build base version
docker build -t ghcr.io/ailabsarg/open-chat:0.7.0 .

# Build CUDA version
docker build -t ghcr.io/ailabsarg/open-chat:0.7.0-cuda --build-arg USE_CUDA=true .
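The two build targets differ only in the tag and the `USE_CUDA` build arg. If you script builds, a tiny helper keeps the tag names consistent with the commands above (the `image_tag` function is our own convenience, not part of the repository):

```shell
# Map a variant name to the image tag used in this README.
image_tag() {
  if [ "${1:-}" = "cuda" ]; then
    echo "ghcr.io/ailabsarg/open-chat:0.7.0-cuda"
  else
    echo "ghcr.io/ailabsarg/open-chat:0.7.0"
  fi
}
```

For example: `docker build -t "$(image_tag cuda)" --build-arg USE_CUDA=true .`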

πŸ›‘οΈ License

This project is licensed under the BSD-3-Clause License (original 2023 terms).

⚠️ Note: This fork removes the license modifications introduced after v0.6.5 (starting in Open WebUI v0.6.6).


πŸ† Credits

Created by Timothy Jaeryang Baek and the Open WebUI community, with license preservation by the aiLabs.ar Open-Chat team.
Special thanks to @tjbck for the original BSD-3-Clause implementation.




Quick Start with Docker 🐳

Note

Certain Docker environments may need additional configuration. If you encounter connection issues, the Open WebUI documentation has a detailed troubleshooting guide.

Warning

When installing Open Chat with Docker, make sure to include `-v open-chat-data:/app/backend/data` in your Docker command. This mounts the named volume that holds the database and prevents data loss when the container is removed or recreated.
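Since all state lives in that named volume, it is also the thing to back up. A sketch using a throwaway Alpine container (the `backup_volume` helper and its `DRY_RUN` switch are our own conveniences; the volume name matches the run commands above):

```shell
# Tar the contents of a named Docker volume into the current directory.
# DRY_RUN=1 prints the docker command instead of running it.
backup_volume() {
  vol="${1:-open-chat-data}"
  out="${2:-$vol-backup.tgz}"
  cmd="docker run --rm -v $vol:/data -v $PWD:/backup alpine tar czf /backup/$out -C /data ."
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "$cmd"
  else
    $cmd
  fi
}
```

Stop the container first (`docker stop open-chat`) so the database is not written to mid-backup.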

Tip

If you wish to run Open Chat with Ollama bundled or with CUDA acceleration, use the official images tagged with either :cuda or :ollama. To enable CUDA, you must install the NVIDIA Container Toolkit on your Linux/WSL system.
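On Debian/Ubuntu the toolkit setup roughly follows NVIDIA's install guide; other distributions differ, so treat this as a sketch (wrapped in a function here so nothing runs until you call it):

```shell
# Debian/Ubuntu sketch; see NVIDIA's Container Toolkit docs for other distros.
install_nvidia_toolkit() {
  sudo apt-get update
  sudo apt-get install -y nvidia-container-toolkit
  # Register the NVIDIA runtime with Docker, then restart the daemon.
  sudo nvidia-ctk runtime configure --runtime=docker
  sudo systemctl restart docker
}
```

After this, `docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi` is a common way to verify the runtime works end to end.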

Open Chat: Server Connection Error

If you're experiencing connection issues, it's often because the Open Chat Docker container cannot reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) from inside the container. Use the --network=host flag in your Docker command to resolve this. Note that the port changes from 3000 to 8080, so the app is served at http://localhost:8080.

Example Docker Command:

docker run -d \
  --network=host \
  -v open-chat-data:/app/backend/data \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  --name open-chat \
  --restart always \
  ghcr.io/ailabsarg/open-chat:0.7.0
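When debugging this, it helps to first confirm Ollama itself is answering on the host. A small sketch using Ollama's `/api/version` endpoint (the `ollama_reachable` helper name is ours):

```shell
# Report whether an Ollama server answers at the given base URL.
ollama_reachable() {
  url="${1:-http://127.0.0.1:11434}"
  if curl -fsS "$url/api/version" >/dev/null 2>&1; then
    echo "reachable"
  else
    echo "unreachable"
  fi
}
```

If this prints `unreachable` on the host, the problem is Ollama (not started, or bound to a different address) rather than the Open Chat container.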

