LLM Integration
Provider-agnostic client for OpenAI, Anthropic, Azure. Tool calling, context window management. Works standalone — no SDK required.
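Context window management usually comes down to trimming history to a token budget. The sketch below is illustrative, not this library's API: `Message`, `trim_to_budget`, and the 4-chars-per-token estimate are all hypothetical stand-ins (a real client would use the provider's tokenizer).

```rust
// Hypothetical sketch of context-window trimming: drop the oldest
// non-system messages until the estimated token count fits the budget.
#[derive(Debug, Clone)]
struct Message {
    role: &'static str, // "system" | "user" | "assistant"
    content: String,
}

// Crude stand-in for a tokenizer: ~4 chars per token plus per-message overhead.
fn estimate_tokens(m: &Message) -> usize {
    m.content.len() / 4 + 4
}

fn trim_to_budget(mut history: Vec<Message>, budget: usize) -> Vec<Message> {
    let total = |msgs: &[Message]| msgs.iter().map(estimate_tokens).sum::<usize>();
    while total(&history) > budget {
        // Drop the oldest message that is not a system prompt.
        match history.iter().position(|m| m.role != "system") {
            Some(i) => { history.remove(i); }
            None => break, // only system messages left; nothing to trim
        }
    }
    history
}
```

Keeping the system prompt pinned while evicting oldest-first is one common policy; summarization or relevance-based eviction are alternatives.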
Agent-to-Agent (A2A)
Full Google A2A v1.0 support. JSON-RPC, SSE streaming, agent discovery. Build agents that find and delegate tasks to each other. WASM + native.
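A2A rides on JSON-RPC 2.0, so every request is the same small envelope. The helper below is a std-only illustration of that wire shape, not this library's API; the `message/send` method name follows the A2A spec, and the builder itself is hypothetical.

```rust
// Minimal sketch of a JSON-RPC 2.0 request envelope as used by A2A
// transports: a "jsonrpc" version tag, an id, a method, and params.
// Built with plain string formatting for illustration (no serde).
fn jsonrpc_request(id: u64, method: &str, params_json: &str) -> String {
    format!(
        r#"{{"jsonrpc":"2.0","id":{id},"method":"{method}","params":{params_json}}}"#
    )
}
```

A real client would serialize typed structs (e.g. via serde) rather than splice strings, but the envelope on the wire is the same.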
AG-UI Streaming
Stream agent execution to user interfaces via SSE. Run status, IO events, enrichers. Serve alongside A2A on one router.
MCP Tools
Connect to any MCP server. Tool discovery, execution, OAuth 2.1. FastMCP compatible. WASM + native.
Production Observability
Structured logging, OpenTelemetry, Prometheus — all WASM-safe. Use in any Rust project, not just agents.
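Prometheus scrapes a plain-text exposition format from a `/metrics` endpoint. The std-only helper below sketches one counter line in that format; real code would use a metrics crate, and the metric and label names here are illustrative.

```rust
// Sketch of the Prometheus text exposition format for a counter:
// name{label="value",...} <number>\n
fn prometheus_counter(name: &str, labels: &[(&str, &str)], value: u64) -> String {
    let label_str = labels
        .iter()
        .map(|(k, v)| format!(r#"{k}="{v}""#))
        .collect::<Vec<_>>()
        .join(",");
    format!("{name}{{{label_str}}} {value}\n")
}
```

Because this is just text over HTTP, a WASM-safe implementation needs no OS-level instrumentation hooks to serve it.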
Use What You Need
Every crate works standalone. Import just the LLM client, or just observability, or just MCP. Compose them when you’re ready for the full stack.
WASM + Native
Same codebase compiles to Fermyon Spin (WASM) and Axum (native). No rewrites, no feature-flag gymnastics.
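One way a single codebase can target both runtimes is compile-time selection with `cfg` attributes: the shared logic is written once, and only the thin backend layer differs per target. The functions below are a hypothetical sketch of that pattern, not this library's code.

```rust
// Illustrative sketch: cfg attributes pick the backend at compile time.
#[cfg(target_arch = "wasm32")]
fn runtime_name() -> &'static str {
    "spin" // compiled for a wasm32 target such as Fermyon Spin
}

#[cfg(not(target_arch = "wasm32"))]
fn runtime_name() -> &'static str {
    "axum" // compiled natively, served by Axum
}

fn greeting() -> String {
    // Shared logic: identical source on both targets.
    format!("running on {}", runtime_name())
}
```

Because the selection happens at compile time, neither binary carries the other runtime's code or dependencies.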
Standards-First
Google A2A v1.0, AG-UI, MCP, JSON-RPC 2.0, OpenTelemetry. Built on open protocols — not locked into a proprietary runtime.
Feature-Flag Composition
Only pay for what you use. Every capability is behind an opt-in feature flag. Zero hidden dependencies.
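In Cargo terms, opt-in composition looks like disabling default features and listing only what you need. The manifest fragment below is hypothetical: the crate name `agent-stack` and the feature names are illustrative, not the project's actual manifest.

```toml
# Hypothetical Cargo.toml: only the listed features are compiled in;
# default-features = false opts out of everything else.
[dependencies]
agent-stack = { version = "0.1", default-features = false, features = ["llm", "observability"] }
```

Adding a capability later (say, `"mcp"` or `"a2a"`) is a one-line change to the feature list, with no effect on builds that omit it.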