maxron84/Ollama-OpenWebUi-ComfyUi

Docker service stack for a comprehensive local AI experience, running on ordinary home hardware with a single GPU. DevOps dashboards included. WIP.

Services:

  • Ollama Local LLM Host + API
  • Open WebUI Chat Interface
  • ComfyUI Stable Diffusion Image/Video/Audio Generation Host + API
  • Grafana DevOps Metrics Tool
  • Watchtower for Automatic Docker Image Updates (Dev-Stack only)
  • VRAM Manager for effective resource sharing between Ollama and ComfyUI on a single-GPU setup (see the compose sketch below)
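
A minimal docker-compose sketch of how such a stack might be wired together. Service names, image tags, ports, and the OLLAMA_KEEP_ALIVE value are illustrative assumptions, not the repo's published files:

    services:
      ollama:
        image: ollama/ollama:latest
        ports:
          - "11434:11434"
        environment:
          # Assumption: unload idle models after 5 minutes so ComfyUI can claim the VRAM
          - OLLAMA_KEEP_ALIVE=5m
        volumes:
          - ollama-data:/root/.ollama
        deploy:
          resources:
            reservations:
              devices:
                - driver: nvidia
                  count: 1
                  capabilities: [gpu]

      # comfyui: there is no single official image; it is typically built from a
      # local Dockerfile and given the same nvidia device reservation as ollama.

      open-webui:
        image: ghcr.io/open-webui/open-webui:main
        ports:
          - "3000:8080"
        environment:
          - OLLAMA_BASE_URL=http://ollama:11434
        depends_on:
          - ollama

      watchtower:
        # Dev stack only: polls registries and restarts containers on new images
        image: containrrr/watchtower:latest
        volumes:
          - /var/run/docker.sock:/var/run/docker.sock

      grafana:
        image: grafana/grafana:latest
        ports:
          - "3001:3000"

    volumes:
      ollama-data:

With a file like this in place, docker compose up -d brings the stack up and docker compose down tears it down.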

Hardware:

  • Palit GeForce RTX 5080
  • AMD Ryzen 7 9800X3D
  • 64GB DDR5-6000
  • 2TB M.2 SSD PCIe 4.0
  • MSI X870E Motherboard
  • 1000W Max Consumption

Development is in progress; production files will be published in this repo when ready.

All content has been written using various AI assistants. Model selection, prompting, content supervision, review, testing, and refactoring are done by hand.
