Local AI Cloud Deployment

Cole Medin demonstrates a full cloud deployment of a local AI stack, integrating a local LLM runtime (Ollama), databases (Supabase, Qdrant), workflow automation and agent building (n8n and Flowise), private metasearch (SearXNG), and a user interface (Open WebUI). The setup runs each service in a Docker container, secures the endpoints behind a Caddy reverse proxy, and walks through the firewall and DNS configuration needed to expose each service on its own subdomain. This free, open-source system extends an original n8n template and is available on GitHub for immediate use.
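As a rough illustration of the Caddy piece (a sketch, not the exact configuration from the video), a Caddyfile along these lines maps each subdomain to its container and lets Caddy obtain and renew TLS certificates automatically; the domain names and internal ports below are assumptions based on the services' common defaults.

```
# Hypothetical Caddyfile sketch: each public subdomain is reverse-proxied to
# the matching container on the shared Docker network. Hostnames and ports
# are illustrative assumptions, not taken from the video.
n8n.example.com {
    reverse_proxy n8n:5678
}

openwebui.example.com {
    reverse_proxy open-webui:8080
}

flowise.example.com {
    reverse_proxy flowise:3000
}

searxng.example.com {
    reverse_proxy searxng:8080
}
```

For this to work, each subdomain needs a DNS A record pointing at the server's public IP, and the firewall only has to expose ports 80 and 443 to Caddy; the individual services stay reachable solely on the internal Docker network.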

Category: English 🇺🇸🇬🇧