The production runtime
for agentic products.
An embeddable Rust library that enables your agents to run, learn, and adapt. Securely.
What is systemprompt?
You built something smart. But smart isn't shippable. systemprompt adds everything your AI needs to serve real users: auth, isolation, storage, orchestration. Focus on the intelligence, not the infrastructure.
EMBEDDED RUST LIBRARY
50MB binary with everything. A complete AI infrastructure you compile into your project.
- Complete stack in one binary
- Production auth built in
- MCP hosting, secured
- Extensible at compile time
- Observability without instrumentation
UNIFIED CONTROL PLANE
The same commands work for you and your AI agents. Local or remote. Every action audited.
- Superagent-ready interface
- Transparent remote routing
- Eight unified domains
- Complete audit visibility
- Config-driven operations
THE CLOSED LOOP
Your agents can query their own performance and adapt in real time. The feedback loop is built in.
- MCP tool calls -> PostgreSQL -> CLI queries
- A2A self-analysis in real time
- Behavioral pattern recognition
- Dynamic context & skill modification
- Production audit trail with trace IDs
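For example, the read side of that loop can be a plain query against the runtime's PostgreSQL database. A minimal sqlx sketch, assuming a hypothetical tool_calls table (the table and column names are illustrative, not the actual schema):

use sqlx::PgPool;

// Illustrative only: table and column names are placeholders, not the real schema.
#[derive(sqlx::FromRow)]
struct ToolCallStat {
    tool_name: String,
    calls: i64,
}

async fn recent_tool_usage(pool: &PgPool) -> Result<Vec<ToolCallStat>, sqlx::Error> {
    // An agent (or the CLI) reads its own recent tool-call history from the
    // same PostgreSQL database the runtime writes to.
    sqlx::query_as::<_, ToolCallStat>(
        "SELECT tool_name, COUNT(*) AS calls
         FROM tool_calls
         WHERE created_at > NOW() - INTERVAL '1 hour'
         GROUP BY tool_name
         ORDER BY calls DESC",
    )
    .fetch_all(pool)
    .await
}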
Everything your AI needs. Built in.
AI execution. Agent orchestration. MCP. Auth. Production-ready from day one.
Runs AI
Anthropic, OpenAI, Gemini—one API. Vision, reasoning, streaming, cost tracking.
Orchestrates agents
A2A protocol. Multi-agent workflows. Task management. Shared state.
Schedules jobs
Cron-based automation. Background tasks. Content publishing. Maintenance.
Hosts MCP
Production tool servers with real OAuth. Your capabilities, secured.
Runs workflows
Skills and playbooks. Define once, execute anywhere. YAML automation.
Serves web
Templates, navigation, theming. Your brand, your domain, your interface.
Handles auth
OAuth2/OIDC + WebAuthn included. Ship AI, not login screens.
Isolates users
Multi-tenant from day one. Per-user scopes enforced automatically.
Stores files
Upload, serve, and set permissions. No S3 config. No CDN. It just works.
Tracks everything
Costs, usage, audit trails. Every AI request logged automatically.
Manages content
Blog, docs, legal pages. Markdown in, indexed database out. SEO included.
Get started in minutes
Clone
Create your project from the GitHub template
gh repo create my-ai --template systempromptio/systemprompt-template --clone
Build
Build the CLI binary (offline mode for first build)
SQLX_OFFLINE=true cargo build --release
Login
Authenticate with systemprompt.io
systemprompt cloud auth login
Profile
Create a local or cloud deployment profile
systemprompt cloud profile create local
Run
Start all services locally
just start
Machine-Native Guides
Written by agents. For agents.
Point your superagent to a playbook and watch the magic happen. Deterministic. Self-repairing. Executable.
START HERE
The required entry point for every agent. Read this playbook before any task.
$ systemprompt core playbooks show guide_start
Built for production workloads
MCP Server Hosting
Host your MCP servers with real authentication. Connect Claude Desktop to production APIs securely.
# services/mcp/systemprompt.yaml
mcp_servers:
  systemprompt:
    binary: "systemprompt-mcp"
    port: 5010
    oauth:
      required: true
      scopes: ["admin"]
      audience: "mcp"
Learn more →
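For a sense of what that secures in practice, here is a rough sketch of an authenticated MCP tool call over HTTP using reqwest. The /mcp endpoint path and the tool name are assumptions for illustration; the bearer token is an OAuth2 access token carrying the required scope:

use reqwest::Client;
use serde_json::json;

// Sketch only: the endpoint path and tool name are illustrative placeholders.
async fn call_mcp_tool(token: &str) -> Result<serde_json::Value, reqwest::Error> {
    Client::new()
        .post("http://localhost:5010/mcp")
        .bearer_auth(token) // OAuth2 access token with the "admin" scope
        .json(&json!({
            "jsonrpc": "2.0",
            "id": 1,
            "method": "tools/call",
            "params": { "name": "example_tool", "arguments": {} }
        }))
        .send()
        .await?
        .json()
        .await
}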
Multi-Tenant AI Products
Ship AI products where every user gets isolated agent access with proper permissions.
# services/agents/welcome.yaml
card:
  securitySchemes:
    oauth2:
      type: oauth2
      flows:
        authorizationCode:
          scopes:
            anonymous: "Public access"
            user: "Authenticated user access"
Learn more →
Internal AI Tooling
Give your team shared agents with proper permissions. No more credential sharing.
# services/agents/welcome.yaml
agents:
  welcome:
    enabled: true
    card:
      skills:
        - id: "general_assistance"
          name: "General Assistance"
        - id: "content_writing"
          name: "Content Writing"
Learn more →
Agent-to-Agent Orchestration
Multiple agents coordinating via A2A protocol with shared state and permissions.
# services/agents/welcome.yaml (A2A card)
card:
  protocolVersion: "0.3.0"
  preferredTransport: "JSONRPC"
  capabilities:
    streaming: true
    pushNotifications: false
Learn more →
Agentic Mesh
Automated, scalable workflows run by agents. Deterministic scheduling meets agentic intelligence.
# services/scheduler/config.yaml
scheduler:
  enabled: true
  jobs:
    - name: publish_content
      extension: core
      job: publish_content
      schedule: "0 */30 * * * *"
Learn more →
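The schedule above is six-field cron with a leading seconds column, so "0 */30 * * * *" fires at second 0 of every 30th minute. A small sketch of how such an expression reads, using the cron and chrono crates (an illustration of the format, not the runtime's internal parser):

use std::str::FromStr;
use chrono::Utc;
use cron::Schedule;

fn main() {
    // Fields: sec min hour day-of-month month day-of-week.
    let schedule = Schedule::from_str("0 */30 * * * *").expect("valid cron expression");
    // Print the next three firing times: second 0, every 30 minutes.
    for next in schedule.upcoming(Utc).take(3) {
        println!("next run: {next}");
    }
}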
Framework vs. Runtime
AI Frameworks (LangChain, CrewAI)
- Chains, agents, tools
- Your API keys
- Single-user by default
- Auth: build it yourself
- Multi-tenant: build it yourself
- 2-4 weeks on auth alone
Built on open standards
Your binary. Your rules. We're just a library.
No vendor lock-in. Production-grade from day one.
MCP
Connect any AI client to your tools
The open protocol from Anthropic for connecting AI clients to external capabilities. HTTP-native, OAuth2-protected, production-ready.
- Works with Claude Code, Claude Desktop, ChatGPT
- HTTP transport with real authentication
- Per-tool OAuth2 scopes
A2A
Agents that discover and collaborate
Google's open protocol for agent interoperability. Discovery, capability negotiation, and secure multi-agent communication.
- Automatic agent discovery
- Capability negotiation
- Secure cross-agent messaging
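Discovery typically starts by fetching the peer's agent card from a well-known URL. A minimal sketch with reqwest, assuming the /.well-known/agent.json convention used by A2A implementations:

use reqwest::Client;
use serde_json::Value;

// Fetch a peer agent's card (name, skills, transports, security schemes).
// The well-known path is an assumption based on the A2A discovery convention.
async fn discover_agent(base_url: &str) -> Result<Value, reqwest::Error> {
    Client::new()
        .get(format!("{base_url}/.well-known/agent.json"))
        .send()
        .await?
        .json()
        .await
}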
OAuth2
Battle-tested authorization
Full OAuth2 authorization server with OpenID Connect. PKCE flows, token introspection, scoped permissions for every tool call.
- Authorization code with PKCE
- Client credentials flow
- Standard OIDC discovery
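As a concrete example of the PKCE flow's client side, here is a sketch of generating a code_verifier and its S256 code_challenge per RFC 7636, assuming the rand, sha2, and base64 crates:

use base64::{engine::general_purpose::URL_SAFE_NO_PAD, Engine};
use rand::RngCore;
use sha2::{Digest, Sha256};

// Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636).
fn pkce_pair() -> (String, String) {
    // 32 random bytes encode to a 43-character URL-safe verifier.
    let mut bytes = [0u8; 32];
    rand::thread_rng().fill_bytes(&mut bytes);
    let verifier = URL_SAFE_NO_PAD.encode(bytes);

    // code_challenge = BASE64URL(SHA256(code_verifier))
    let challenge = URL_SAFE_NO_PAD.encode(Sha256::digest(verifier.as_bytes()));

    (verifier, challenge)
}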
WebAuthn
Passwordless by default
W3C standard for phishing-resistant authentication. Face ID, Touch ID, YubiKey - no passwords to steal.
- Face ID / Touch ID / Windows Hello
- Hardware security keys
- Phishing-resistant by design
Your extensions. Your binary.
systemprompt.io is a library you control, not a platform that controls you.
impl Extension for MyExtension {
    fn id(&self) -> &'static str { "my-extension" }
    fn name(&self) -> &'static str { "My Extension" }
}

// Discovered at compile time; no registry files, no runtime reflection.
register_extension!(MyExtension);

// Mount extension HTTP routes on the runtime's server.
impl ApiExtension for MyExtension {
    fn router(&self, ctx: &ExtensionContext) -> Option<Router> {
        Some(Router::new()
            .route("/items", get(list_items).post(create_item))
            .with_state(ctx.clone()))
    }
}

// Ship SQL schema alongside the extension.
impl SchemaExtension for MyExtension {
    fn schemas(&self) -> Vec<SchemaDefinition> {
        vec![SchemaDefinition::new(
            "my_items",
            include_str!("../schema/items.sql"),
        )]
    }
}

// Register background jobs for the scheduler.
impl JobExtension for MyExtension {
    fn jobs(&self) -> Vec<Arc<dyn Job>> {
        vec![Arc::new(CleanupJob)]
    }
}

// Plug in custom LLM providers.
impl ProviderExtension for MyExtension {
    fn llm_providers(&self) -> Vec<Arc<dyn LlmProvider>> {
        vec![Arc::new(MyCustomProvider)]
    }
}
Compile-time discovery
Extensions register via the inventory crate. Compile-time discovery. No runtime reflection.
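The register_extension! macro shown above builds on this pattern. A generic sketch of how inventory-based registration works (the Registered type here is illustrative, not systemprompt's actual registration type):

// Each extension submits a static record; the runtime iterates them at startup.
pub struct Registered {
    pub id: &'static str,
}

inventory::collect!(Registered);

inventory::submit! {
    Registered { id: "my-extension" }
}

fn main() {
    // Every submission linked into the binary is visible here,
    // with no runtime reflection involved.
    for ext in inventory::iter::<Registered> {
        println!("extension: {}", ext.id);
    }
}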
Library, not platform
You own the binary
Compile once, deploy anywhere. No runtime dependencies on us.
Library dependency
Add systemprompt.io as a Cargo dependency. Pin your version. Update when ready.
If it compiles, it works
Rust's type system catches errors at compile time. No runtime surprises.
Extensions stay private
Build proprietary logic on open core. We never see your code.
Ship YOUR product
White-label ready. Your domain. Your brand.
Pricing & Licensing
Source-available under the FSL license (free during early access). Bring your own API keys - works with OpenAI, Gemini, and Anthropic. All execution is completely sandboxed. We only charge for cloud hosting - real machines cost real money.
Free
FSL license, self-hosted
- Self-hosted (just needs PostgreSQL)
- BYOK (OpenAI, Gemini, Anthropic)
- All core features
- Unlimited agents & MCP servers
- Completely sandboxed
- Community support
Cloud
Managed hosting, variable resources
- One-click deploy
- Managed PostgreSQL
- Automatic backups
- Your tenant URL
- Scale resources as needed
- Email support
Licensed Enterprise
Private infrastructure & business licensing
- Install on your private infrastructure
- Volume licensing
- Dedicated support
- SLA guarantee
- Custom integrations
Frequently asked questions
What's the difference between systemprompt.io and Claude/ChatGPT?
Claude, ChatGPT, and Clawdbot are single-player AI—they work for YOU. systemprompt.io is multiplayer AI infrastructure—it makes AI work for YOUR USERS. We don't compete with your AI; we're the runtime that lets you ship it as a product.
Can I self-host everything?
Yes. systemprompt.io compiles to a single binary—just point it at any PostgreSQL database. Run it on bare metal, in a VM, or containerized. The self-hosted version includes all core features. Cloud hosting is available for teams who want managed infrastructure.
What MCP clients are supported?
Any MCP client works with systemprompt.io: Claude Code, Claude Desktop, ChatGPT, and any tool that speaks the Model Context Protocol. Our servers use HTTP-native transports supported by modern clients.
How does authentication work?
systemprompt.io uses OAuth2/OIDC for API authentication and WebAuthn for passwordless user login. Every MCP tool call and agent interaction is authenticated and authorized against scoped permissions.
Is this open source?
systemprompt.io is source-available under FSL-1.1-ALv2 (Functional Source License). You can view, modify, and self-host the code. After 2 years, each version converts to Apache 2.0.
SHIP AI TO YOUR USERS
You bring the intelligence. We handle the multiplayer infrastructure.
Get Started Free