CE vs Pro (5.x RC1)
Community Edition delivers the complete local-first workflow. Pro extends the same foundation toward larger orchestration and deployment scale.
🟦 Community Edition
Built for running and managing models on your own hardware with chat, attachments, logs, GPU monitoring, analytics, and benchmarking in one local-first workspace.
🟥 Pro
Aimed at larger orchestration, broader automation, and advanced multi-system deployment workflows, built on the same platform foundation.
LLM Controller CE is a local-first AI control platform for running, managing, and evaluating language models on your own hardware.
It combines model launch controls, a modern chat workspace, file-aware conversations, live logs, GPU monitoring, analytics, and benchmarking inside one browser-based interface.
Built for practical local use, it keeps runtime visibility, conversation workflows, and administrator controls in the same daily workspace.
Key Features
Model Management — scan, launch, stop, and govern your local model library.
Modern Chat Workflows — streaming output, Markdown, code, math, stop, regenerate, and prompt editing.
File-Aware AI — attach supported text and code files directly to prompts.
Observability — live logs, NVIDIA GPU telemetry, process visibility, and analytics.
Benchmarking — built-in CE evaluation tools for comparing eligible models on your own hardware.
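As background for the observability features above, here is a minimal sketch of how NVIDIA GPU telemetry of this kind can be collected locally with the standard `nvidia-smi` CLI. The query field names are real `nvidia-smi` keys, but the function names are hypothetical and this is an illustration, not LLM Controller CE's actual implementation:

```python
# Sketch: polling local NVIDIA GPU telemetry via the `nvidia-smi` CLI.
# Assumes an NVIDIA driver is installed; not the product's own code.
import subprocess

# Real `nvidia-smi --query-gpu` field names.
FIELDS = ["utilization.gpu", "memory.used", "memory.total", "temperature.gpu"]

def parse_telemetry(csv_line: str) -> dict:
    """Parse one CSV row produced with `--format=csv,noheader,nounits`."""
    values = [v.strip() for v in csv_line.split(",")]
    return dict(zip(FIELDS, values))

def read_gpu_telemetry() -> list[dict]:
    """Query all visible NVIDIA GPUs; returns one dict per GPU."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=" + ",".join(FIELDS),
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [parse_telemetry(line) for line in out.splitlines() if line.strip()]

if __name__ == "__main__":
    # Example row as nvidia-smi would emit it (values are illustrative).
    print(parse_telemetry("37, 8123, 24576, 61"))
```

A dashboard can call a routine like `read_gpu_telemetry()` on a short interval and chart the resulting fields over time.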
LLM Controller CE (Community Edition)
Local AI, done right.