A self-hosted fork of Open WebUI v0.6.5 under the original BSD-3-Clause license. This fork preserves the BSD-3-Clause license terms as of 2024 and removes the post-2025 modifications that introduced rebranding restrictions.
Try it for free:
http://ailabs.chat (served via aiLabs.ar infrastructure)
docker run -d \
--network=host \
-e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
-e ENABLE_WEBSOCKET_SUPPORT=false \
-v open-chat-data:/app/backend/data \
--name open-chat \
--restart always \
ghcr.io/ailabsarg/open-chat:0.7.0

To run with CUDA acceleration:

docker run -d \
--network=host \
--gpus all \
-e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
-e ENABLE_WEBSOCKET_SUPPORT=false \
-v open-chat-data:/app/backend/data \
--name open-chat \
--restart always \
ghcr.io/ailabsarg/open-chat:0.7.0-cuda
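After starting the container, you can verify that it came up correctly. A minimal sketch, assuming the defaults from the commands above (with --network=host, the UI listens on port 8080):

```shell
# Confirm the container is running
docker ps --filter name=open-chat

# Watch the startup logs until the web server reports it is listening
docker logs open-chat

# The UI should answer on port 8080 when using --network=host
curl -I http://localhost:8080
```

If the curl request fails, check the logs for errors before adjusting any flags.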
⚠️ Important:
- These commands use our fork's Docker image (ghcr.io/ailabsarg/open-chat)
- The :0.7.0 tag ensures users get the original BSD-3-Clause version
- License Compliance: BSD-3-Clause (original terms only)
- No Rebranding Restrictions: unlike later upstream releases
- Self-Hosted: Full control over data and deployment
- Modular Architecture: Easy to extend with plugins
If you want to build directly from our fork's codebase:
git clone https://github.com/ailabsarg/open-chat.git
cd open-chat
# Build base version
docker build -t ghcr.io/ailabsarg/open-chat:0.7.0 .
# Build CUDA version
docker build -t ghcr.io/ailabsarg/open-chat:0.7.0-cuda --build-arg USE_CUDA=true .

This project is licensed under the BSD-3-Clause License (original 2023 terms).
- License file: LICENSE
- License history: docs/LICENSE_HISTORY.md
⚠️ Note: This fork removes the license modifications introduced after v0.6.5 (i.e., in v0.6.6) of Open WebUI.
Created by Timothy Jaeryang Baek and the Open WebUI community, with license preservation by the aiLabs.ar Open-Chat team.
Special thanks to @tjbck for the original BSD-3-Clause implementation.
docker run -d \
--network=host \
-e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
-v open-chat-data:/app/backend/data \
--name open-chat \
--restart always \
ghcr.io/ailabsarg/open-chat:0.7.0
http://ailabs.chat (live demo)
Note
Please note that certain Docker environments may need additional configuration. If you encounter connection issues, our detailed guide in the Open WebUI Documentation can assist you.
Warning
When installing Open Chat with Docker, make sure to include -v open-chat-data:/app/backend/data in your Docker command. This step is crucial: it ensures your database is properly mounted and prevents data loss.
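Because all application state lives in the open-chat-data volume, backing it up amounts to archiving the volume contents. A minimal sketch (the archive filename is only an example):

```shell
# Stop the container so the database is not written to mid-backup
docker stop open-chat

# Archive the volume contents into the current directory
docker run --rm \
  -v open-chat-data:/data:ro \
  -v "$(pwd)":/backup \
  busybox tar czf /backup/open-chat-data.tar.gz -C /data .

# Restart the container
docker start open-chat
```

Restoring is the reverse: extract the archive into a fresh volume before starting the container.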
Tip
If you wish to use Open Chat with Ollama included or with CUDA acceleration, we recommend our official images tagged :cuda or :ollama. To enable CUDA, you must install the NVIDIA Container Toolkit on your Linux/WSL system.
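Before pulling the :0.7.0-cuda image, you can confirm that Docker can see your GPU at all. A minimal sketch, assuming the NVIDIA Container Toolkit is already installed (the CUDA base image tag here is only an example):

```shell
# If the toolkit is configured correctly, this prints your GPU table
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```

If this command errors out, fix the toolkit installation first; the Open Chat CUDA image will hit the same problem.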
If you're experiencing connection issues, it's often because the Open Chat Docker container cannot reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434 from inside the container). Use the --network=host flag in your Docker command to resolve this. Note that the port changes from 3000 to 8080, so the link becomes: http://localhost:8080.
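To check whether Ollama itself is reachable from the host before suspecting the container networking, you can query its HTTP API directly. A minimal sketch using Ollama's version endpoint:

```shell
# Ollama listens on 11434 by default; a JSON version string means it is up
curl http://127.0.0.1:11434/api/version
```

If this fails on the host, the problem is with Ollama, not with the Open Chat container.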
Example Docker Command:
docker run -d --network=host -v open-chat-data:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-chat --restart always ghcr.io/ailabsarg/open-chat:0.7.0