
Run Your No-Code AI Stack in Docker

LLM-powered automations are often stitched together with low-code tools, self-hosted UIs, and local models. That’s great for experimentation, but chaotic for production.

This article shows you how to containerize a complete no-code/LLM automation stack using Docker Compose, including:

  • Ollama (local LLM backend)
  • OpenWebUI (frontend chat interface)
  • N8N (automation/orchestration)
  • PostgreSQL (chat memory / data persistence)

Note: OpenWebUI and N8N don’t communicate directly, so you’ll need a lightweight intermediary (like a custom function or webhook handler) to translate and pass data between them.
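
One simple bridge is an N8N Webhook node: it exposes an HTTP endpoint that a frontend or script can POST chat data to, and the rest of the workflow takes over from there. As a sketch, a Webhook node configured with the path chat-bridge would listen at http://localhost:5678/webhook/chat-bridge and accept a JSON body like this (the path and field names are illustrative choices, not built-in names):

{
  "prompt": "Draft a reply to this message",
  "session_id": "demo-123"
}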

Why This Matters

When you’re developing locally, it’s easy to install everything manually. But when you’re ready to deploy – or even just collaborate – you’ll run into issues like:

  • Dependency hell across tools
  • Conflicting ports
  • Hard-to-reproduce environments
  • Manual steps that don’t scale

Docker solves all of this by turning your AI automation stack into a declarative, portable environment.

What to Do

  • Run each component (Ollama, OpenWebUI, N8N, Postgres) in its own container
  • Use docker-compose.yml to orchestrate startup and networking
  • Mount volumes for persistent data (e.g., N8N workflows, Ollama models)
  • Optionally expose ports only behind a reverse proxy in production

Production Tip

Even in local dev, isolate services in containers first. Then plug in cloud services or reverse proxies when you’re ready for staging or production.
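
When that time comes, the pattern looks like this with Traefik: drop the public ports: mapping from a service and attach routing labels instead. A minimal sketch (the hostname and certificate resolver are placeholders, not values from this stack):

  n8n:
    labels:
      - traefik.enable=true
      - traefik.http.routers.n8n.rule=Host(`n8n.example.com`)
      - traefik.http.routers.n8n.entrypoints=websecure
      - traefik.http.routers.n8n.tls.certresolver=letsencrypt
      - traefik.http.services.n8n.loadbalancer.server.port=5678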

Code Example: docker-compose.yml

Here's a minimal working version that wires up all four services.

version: '3.8'

services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    volumes:
      - ollama_data:/root/.ollama
    ports:
      - "11434:11434"

  openwebui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: openwebui
    depends_on:
      - ollama
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - openwebui_data:/app/backend/data

  postgres:
    image: postgres:15
    container_name: postgres
    restart: always
    environment:
      POSTGRES_USER: n8n
      POSTGRES_PASSWORD: n8npass
      POSTGRES_DB: n8n
    volumes:
      - postgres_data:/var/lib/postgresql/data

  n8n:
    image: n8nio/n8n
    container_name: n8n
    ports:
      - "5678:5678"
    environment:
      DB_TYPE: postgresdb
      DB_POSTGRESDB_HOST: postgres
      DB_POSTGRESDB_PORT: 5432
      DB_POSTGRESDB_DATABASE: n8n
      DB_POSTGRESDB_USER: n8n
      DB_POSTGRESDB_PASSWORD: n8npass
      # Note: n8n v1.0+ replaced these basic-auth variables with built-in user management
      N8N_BASIC_AUTH_ACTIVE: "true"
      N8N_BASIC_AUTH_USER: admin
      N8N_BASIC_AUTH_PASSWORD: adminpass
      N8N_HOST: localhost
      N8N_PORT: 5678
    depends_on:
      - postgres
    volumes:
      - n8n_data:/home/node/.n8n

volumes:
  ollama_data:
  openwebui_data:
  postgres_data:
  n8n_data:

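A refinement worth knowing about: depends_on only orders container startup; it doesn't wait for Postgres to actually accept connections. The modern docker compose CLI closes that gap with healthchecks and readiness conditions. A sketch you could merge into the file above (the timing values are arbitrary choices):

  postgres:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U n8n -d n8n"]
      interval: 5s
      timeout: 3s
      retries: 10

  n8n:
    depends_on:
      postgres:
        condition: service_healthy
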
Example Output

Once started with docker compose up, you’ll have:

  • http://localhost:11434 – Ollama API
  • http://localhost:3000 – OpenWebUI chat
  • http://localhost:5678 – N8N interface (user: admin, pass: adminpass on pre-1.0 images; newer versions prompt you to create an owner account)
  • PostgreSQL running in the background

After pulling a model into Ollama (for example, docker exec -it ollama ollama pull llama3), you can create N8N workflows that call it, receive responses, and pass them to other steps like email, database inserts, etc.
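
Under the hood, that call is just an N8N HTTP Request node POSTing to Ollama's generate endpoint at http://ollama:11434/api/generate (from inside the Compose network, the hostname is the service name). A sketch of the request body and a trimmed response, assuming a model named llama3 has been pulled:

{
  "model": "llama3",
  "prompt": "Summarize this customer email: ...",
  "stream": false
}

Response (trimmed):

{
  "model": "llama3",
  "response": "The customer is asking about...",
  "done": true
}

With "stream": false, the full completion arrives in the response field, which downstream nodes can reference directly.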

Going Further

  • Add a reverse proxy (e.g., Traefik or NGINX) to expose services publicly with HTTPS
  • Persist logs to disk or external services
  • Create a .env file for sensitive values (see the sketch after this list)
  • Use N8N to schedule workflows that pull from external APIs or trigger AI tasks
  • Customize local models with Ollama Modelfiles (system prompts, parameters), then swap models by changing environment vars
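
As an example of the .env approach: Compose automatically reads a .env file next to docker-compose.yml and substitutes ${VAR} references. A sketch with placeholder values:

# .env
POSTGRES_PASSWORD=change-me
N8N_BASIC_AUTH_PASSWORD=also-change-me

# docker-compose.yml (fragment)
  postgres:
    environment:
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}

  n8n:
    environment:
      DB_POSTGRESDB_PASSWORD: ${POSTGRES_PASSWORD}
      N8N_BASIC_AUTH_PASSWORD: ${N8N_BASIC_AUTH_PASSWORD}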

Final Thought

No-code and low-code tools are powerful, but without containerization they become brittle, hard-to-reproduce systems. Docker Compose gives you a repeatable way to bundle your automation logic, models, UI, and data into something portable, testable, and deployable.

Stop running everything manually. Dockerize your AI stack.
