Projects
What I am building, what keeps breaking, and what I am learning as the architecture evolves.
Flagship project
A local-first AI workflow engine that chains inputs, transforms, and local LLM steps into automated pipelines. Everything runs on your machine — no cloud APIs, no vendor lock-in, no data leaving your network.
YAML-defined workflows with 30+ built-in modules, 5 trigger types, CLI and web UI, security modes with RBAC, and MCP server mode for AI agent integration.
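As a concrete sketch, a pipeline chaining an input, a transform, a local LLM step, and an output could look like this. Every module name, trigger field, and config key below is hypothetical — it illustrates the shape of a YAML-defined workflow, not the engine's actual schema.

```yaml
# Hypothetical workflow definition; identifiers are illustrative only.
name: summarize-inbox
trigger:
  type: schedule            # one of the trigger types
  cron: "0 7 * * *"
steps:
  - module: imap_fetch      # input module: pull unread mail
    config:
      folder: INBOX
      unread_only: true
  - module: strip_html      # transform module: clean message bodies
  - module: llm_generate    # local LLM step, no cloud API involved
    config:
      model: llama3
      prompt: "Summarize these emails in five bullet points:"
  - module: file_write      # output module: persist the result
    config:
      path: ./summaries/daily.md
```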
Open source project
An access-control reverse proxy for Ollama. Butler adds multi-user authentication (API keys, JWT, OIDC), per-user model authorization, rate limiting, input filtering, and Prometheus observability — without changing Ollama or your clients.
Single static Go binary, one YAML config file, one external dependency. Fail-closed by design. Supports Keycloak, Okta, and Entra ID out of the box.
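To give a feel for the "one YAML config file" model, here is a hedged sketch of what such a file might contain. The key names are invented for illustration and do not reflect Butler's real configuration schema.

```yaml
# Hypothetical Butler config; key names are illustrative only.
listen: :8443
upstream: http://127.0.0.1:11434    # the Ollama server being proxied
auth:
  api_keys:
    - key: "sk-example-alice"
      user: alice
      models: [llama3, mistral]     # per-user model authorization
  oidc:
    issuer: https://keycloak.example.com/realms/main
    client_id: butler
rate_limit:
  requests_per_minute: 60
metrics:
  prometheus: /metrics
```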
Open source project
Porthole gives security-conscious Android users a safe, controlled way to authenticate with captive portals — hotel WiFi, airport networks, coffee shops — without compromising your VPN tunnel. It opens an isolated, time-limited browser session that operates outside the VPN, authenticates with the portal, and shuts down cleanly when you're done.
Kotlin. Apache 2.0 licensed.
Open source project
In a world where deepfakes can impersonate anyone on a video call, TrueStream sits quietly in your browser and watches for signs of manipulation. When something looks off, it tells you. When you need certainty, it lets you prove who you are through the Vinsium cryptographic identity protocol.
TypeScript browser extension. Apache 2.0 licensed.
Active project
Private AI infrastructure that runs entirely on your network. Vinsium combines zero-trust mesh networking, a local AI workflow engine, and enterprise identity into a single platform with no cloud dependency.
Status: in active development. If you want to follow progress or test early versions, reach out through the contact page.
Why I am building this
Organizations that care about data sovereignty need to run AI workloads without sending sensitive data to cloud APIs. Vinsium gives them local LLM inference, composable processing pipelines, and serious security controls without the enterprise overhead.
What I am learning
The hard problems are not the AI models — they are identity federation at the edge, audit chain integrity across distributed nodes, and building operational visibility that operators actually trust.
Current challenges
Architecture decisions and trade-offs
Local-only LLM inference means higher hardware requirements than cloud AI APIs, but sensitive data never leaves the network and there is no vendor lock-in on model choice.
29 input/transform/AI/output modules instead of monolithic workflows. More wiring, but each module is independently testable and replaceable.
HMAC-SHA256 chain signing adds write overhead, but gives operators a verifiable, tamper-proof audit trail.
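The chain-signing idea can be sketched in a few lines: each audit record carries an HMAC-SHA256 tag computed over its payload plus the previous record's tag, so altering or removing any entry invalidates every tag after it. This is a minimal illustration of the general technique, not Vinsium's actual record format; the key handling and field names are placeholders.

```python
import hmac
import hashlib
import json

SECRET = b"audit-signing-key"  # placeholder; in practice a per-deployment secret

def sign_entry(payload: dict, prev_tag: str) -> dict:
    """Build an audit record whose HMAC covers the payload and the previous tag."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SECRET, prev_tag.encode() + body, hashlib.sha256).hexdigest()
    return {"payload": payload, "prev": prev_tag, "tag": tag}

def verify_chain(entries: list[dict]) -> bool:
    """Recompute every tag in order; tampering anywhere breaks the rest of the chain."""
    prev = "genesis"
    for e in entries:
        body = json.dumps(e["payload"], sort_keys=True).encode()
        expect = hmac.new(SECRET, prev.encode() + body, hashlib.sha256).hexdigest()
        if e["prev"] != prev or not hmac.compare_digest(expect, e["tag"]):
            return False
        prev = e["tag"]
    return True

# Build a two-entry chain, then demonstrate tamper detection.
log, prev = [], "genesis"
for event in [{"actor": "alice", "action": "run_workflow"},
              {"actor": "bob", "action": "delete_model"}]:
    entry = sign_entry(event, prev)
    log.append(entry)
    prev = entry["tag"]

assert verify_chain(log)
log[0]["payload"]["actor"] = "mallory"  # tamper with the first record
assert not verify_chain(log)            # every later tag is now invalid too
```

The write overhead is one HMAC per record; verification is linear in the log length, which is the trade-off the text describes.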
Foundational project
Mistborn started as a personal experiment in making private networking and self-hosted services easier to run. It grew into a real platform used by builders who care about privacy without enterprise overhead.
How it started
A home-lab project that kept growing because each solved problem exposed the next frustrating one.
What went wrong
What I would do differently
I would optimize for simpler operational paths earlier, and treat docs as a product surface from day one.
Community contributions
Fork it, break it, and make it yours.
Exploring
Early development
A browser extension that helps with LinkedIn engagement — surfacing relevant conversations, drafting context-aware responses, and reducing the manual overhead of staying active on the platform.
Still in the early exploration phase. Following the same build-in-public approach as the other projects.
Linux Pro Magazine, Awesome Open Source, and DB Tech covered Mistborn. Those early reviews helped shape where the project went next.