Open source project

Self-hosted workflow automation for local LLMs

Solux is a local-first AI workflow engine that chains inputs, transforms, and local LLM steps into automated pipelines. Everything runs on your machine — no cloud APIs, no vendor lock-in, no data leaving your network.

Quick start

Up and running in 60 seconds

Only requires Python 3.11+ and Ollama. No ffmpeg, whisper, or yt-dlp needed for your first run.

$ pip install solux
$ solux init
$ solux https://example.com/any-article

solux init creates your config, scaffolds a starter workflow, checks Ollama, and prints next steps. You go from zero to a working AI pipeline that fetches, cleans, and summarizes a webpage.

Features

What you get out of the box

30+ built-in modules

Input, transform, AI, output, and meta modules — from webpage fetching and PDF parsing to LLM summarization and Slack notifications.

YAML-defined workflows

Conditional steps, foreach iteration with parallelism, sub-workflows, branching, error handling, and per-step timeouts.
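
A step using these controls might be sketched as below. The `when` key is the same one used in the example workflow on this page; the `foreach`, `parallelism`, `timeout`, and `on_error` key names (and the module name) are assumptions for illustration, not confirmed schema:

```yaml
# Illustrative sketch: `when` appears in the example workflow on this page;
# foreach / parallelism / timeout / on_error are assumed key names.
steps:
  - name: summarize_each
    type: ai.llm_summarize       # assumed module name
    foreach: "articles"          # iterate over a list produced upstream
    parallelism: 4               # run up to 4 iterations concurrently
    timeout: 120                 # per-step timeout, in seconds
    on_error: continue           # skip failed items instead of aborting
    config:
      input_key: article_text
```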

5 trigger types

Folder watch, RSS poll, cron schedules, email inbox polling, and inbound webhooks with HMAC verification and rate limiting.
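
As a sketch, definitions for two of these trigger types could look like the YAML below. The trigger types come from the list above and the `${env:...}` interpolation matches the example workflow, but the key names here are assumptions:

```yaml
# Illustrative triggers -- type values come from the list above,
# but the key names are assumed, not confirmed schema.
triggers:
  - type: cron
    schedule: "0 7 * * *"               # every day at 07:00
    workflow: webpage_sentiment
  - type: webhook
    path: /hooks/ingest
    hmac_secret: "${env:HOOK_SECRET}"   # HMAC verification of inbound calls
    rate_limit: 60                      # max requests per minute
    workflow: webpage_sentiment
```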

CLI-first with web UI

Every feature is available from the terminal. The web UI adds a dashboard, YAML editors, module catalog, and live job history.

Security modes + RBAC

Trusted and untrusted execution modes, OIDC authentication, role-based access control, and encrypted audit logging.
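
A server config enabling these controls might be sketched as follows; every field name here is an assumption, shown only to make the feature list concrete:

```yaml
# Illustrative security config -- all field names are assumptions.
security:
  mode: untrusted            # or: trusted
  oidc:
    issuer: https://idp.example.com
    client_id: solux
  rbac:
    roles:
      viewer: [view]
      operator: [view, run]
      admin: [view, run, edit, manage-secrets]
  audit:
    encrypted: true
```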

MCP server mode

Expose workflows as MCP tools so AI agents like Claude Code, Cursor, and Windsurf can discover and invoke them directly.

Architecture

How it works

Sources enter via CLI, API, or triggers. A SQLite queue manages jobs with atomic claiming and exponential-backoff retry. The workflow engine executes each step through the module system.

Source (URL / file / folder)
       |
       v
CLI / API / Triggers --> SQLite Queue (WAL mode)
                              |
                         Worker Thread
                         (poll / retry / dead-letter)
                              |
                         Workflow Loader
                         (YAML + secrets interpolation)
                              |
                         Workflow Engine
                         (validate -> when? -> foreach? -> timeout -> run)
                              |
        +----------+----------+----------+----------+
        |          |          |          |          |
     input    transform      ai      output      meta
     fetch /   split /     llm /     file /     sub-wf
     rss /     clean /    whisper   webhook /   branch
     email      ocr                  email

Example

A real workflow

Workflows are YAML files that define a pipeline of steps. This one fetches a webpage, cleans the text, analyzes sentiment with a local LLM, and sends a Slack notification when the result is not neutral.

name: webpage_sentiment
description: "Fetch a page, clean it, and analyze sentiment."
steps:
  - name: fetch
    type: input.webpage_fetch
    config: {}   # no options needed; the URL comes from the job source

  - name: clean
    type: transform.text_clean
    config:
      input_key: webpage_text
      output_key: cleaned_text
      strip_html: true
      max_chars: 4000

  - name: sentiment
    type: ai.llm_sentiment
    config:
      input_key: cleaned_text
      scale: pos_neg_neu

  - name: notify
    type: output.slack_notify
    when: "sentiment != 'neutral'"
    config:
      webhook_url: "${env:SLACK_WEBHOOK}"
      message_template: "Sentiment: {sentiment[label]}"

License

Open source

Solux is licensed under the Apache License 2.0. Free to use, modify, and distribute for any purpose.

Deployment

Run it your way

  • pip — pip install solux with optional extras
  • Docker — docker compose up -d starts Ollama + server + worker
  • systemd — hardened unit files with security sandboxing
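
The Docker option above could be wired up roughly like this compose file; the `solux/solux` image name and the `serve`/`worker` subcommands are assumptions, not the project's published names:

```yaml
# Illustrative docker-compose.yml for the Ollama + server + worker trio.
# Image name and subcommands are assumed for this sketch.
services:
  ollama:
    image: ollama/ollama
    volumes: ["ollama:/root/.ollama"]
  server:
    image: solux/solux           # assumed image name
    command: serve               # assumed subcommand
    ports: ["8080:8080"]
    depends_on: [ollama]
  worker:
    image: solux/solux
    command: worker              # assumed subcommand
    depends_on: [ollama]
volumes:
  ollama: {}
```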

Start building

Install Solux, run solux init, and have a working AI pipeline in under a minute. Explore the built-in modules, write your first custom workflow, and automate the things you do every day.