LLM overview

These crates build on each other:

  1. llm_client — provider-neutral HTTP client with typed requests, streaming SSE, model profiles, and wire-format abstraction (OpenAI-compat / Anthropic Messages).
  2. llm_context_core — WASM-compatible context window management: token estimation, budget tracking, history strategies, and long-term memory traits.
  3. llm_tools — lightweight, WASM-safe tool registry plus a registry_from! macro for collecting LLM tools and dispatching them by name.
  4. llm_tool_macros — proc-macro crate providing #[llm_tool] to register plain Rust functions as LLM tools with auto-generated JSON Schema.
  5. tool_web_search — reusable LLM tool that performs web search via the Tavily API and exposes a ToolSpec that any agent’s registry can consume.
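The registry-and-dispatch pattern behind items 3–4 can be sketched with plain standard-library types. The Tool struct and function names below are illustrative assumptions, not llm_tools’ actual API; the crate’s registry_from! and #[llm_tool] automate what this does by hand:

```rust
use std::collections::HashMap;

/// A registered tool: a short description plus the function to call.
/// (Hypothetical shape for illustration; see llm_tools for the real types.)
struct Tool {
    description: &'static str,
    call: fn(&str) -> String,
}

fn echo(input: &str) -> String {
    format!("echo: {input}")
}

fn word_count(input: &str) -> String {
    input.split_whitespace().count().to_string()
}

fn main() {
    // Collect tools into a registry keyed by name -- the step that
    // registry_from! performs for functions annotated with #[llm_tool].
    let mut registry: HashMap<&str, Tool> = HashMap::new();
    registry.insert("echo", Tool { description: "Repeat the input", call: echo });
    registry.insert("word_count", Tool { description: "Count words", call: word_count });

    // Dispatch by name, as an agent would after the model selects a tool.
    let tool = registry.get("word_count").expect("unknown tool");
    let result = (tool.call)("tokens in and tokens out");
    println!("{result}"); // prints "5"
}
```

Keying the map by name is what lets the agent route a model’s tool-call request straight to a Rust function without a match statement per tool.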

Feature flags and public exports are defined in each crate’s Cargo.toml and src/lib.rs. Use those as the source of truth; this site summarizes them and does not replace cargo doc.

See also: LLM Client guide for end-to-end setup, and LLM Tools guide for registering custom tools.

Full API reference: coming soon via published rustdoc (see Getting started for local cargo doc).