LLM Tools

LLM tools let the agent’s language model call deterministic functions during a conversation. The SDK provides two layers: a low-level ToolRegistry / ToolSpec system inside agent_sdk, and a higher-level #[llm_tool] proc macro from llm_tool_macros that generates JSON Schema and executors from plain Rust functions.

Add the crates to Cargo.toml:

[dependencies]
agent_sdk = { package = "pf_agent_sdk", path = "../../crates/agent_sdk", features = ["llm-engine"] }
llm_tools = { path = "../../crates/llm_tools" }
llm_tool_macros = { path = "../../crates/llm_tool_macros" }

  1. Define tools with #[llm_tool]

    Annotate plain Rust functions. The macro generates _llm_tool_info, _llm_tool_exec, and schema companions automatically.

    use llm_tool_macros::llm_tool;

    #[llm_tool(name = "get_weather", description = "Get current weather for a city")]
    fn get_weather(city: String) -> String {
        format!("Weather in {city}: 22°C, sunny")
    }

    The macro derives a GetWeatherParams struct with serde::Deserialize and schemars::JsonSchema, so the LLM sees a proper JSON Schema.

  2. Build a registry with registry_from!

    use llm_tools::registry_from;
    let tools = registry_from!(get_weather);
    // tools.schemas() → Vec<ToolSchema> for the LLM request
    // tools.exec("get_weather", args) → serde_json::Value
  3. Wire into the agent via configure_llm_runtime

    agent.configure_llm_runtime(
        llm_client, // impl IntoLlmInvoker + IntoLlmStreamInvoker
        "gpt-4o",   // model name
        tools,      // impl IntoTools (ToolRegistry, Vec<ToolSpec>, or llm_tools::ToolRegistry)
        Some("You are a helpful weather assistant.".into()), // system message
        None,       // LlmPolicy (default)
        None,       // LlmRequestDefaults (default)
    )?;

    This installs the tool-loop handler: the agent automatically calls tools when the LLM requests them and feeds results back.
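The loop the handler runs can be pictured with a std-only sketch: the model either requests a tool call or returns a final answer, and tool results are fed back until it settles. All types here are illustrative stand-ins, not SDK types.

```rust
use std::collections::HashMap;

// Stand-in for one LLM turn: either a tool request or a final answer.
enum LlmTurn {
    ToolCall { name: String, args: String },
    Final(String),
}

type ToolFn = fn(&str) -> String;

// Drive the loop: dispatch requested tools through a name-indexed registry
// and feed each result back to the model until it produces a final answer.
fn run_tool_loop(
    mut next_turn: impl FnMut(Option<&str>) -> LlmTurn,
    tools: &HashMap<&str, ToolFn>,
) -> String {
    let mut last_result: Option<String> = None;
    loop {
        match next_turn(last_result.as_deref()) {
            LlmTurn::ToolCall { name, args } => {
                let tool = tools.get(name.as_str()).expect("unknown tool");
                last_result = Some(tool(&args));
            }
            LlmTurn::Final(answer) => return answer,
        }
    }
}
```

The real handler also serializes tool results into the conversation history and enforces the configured LlmPolicy; this sketch keeps only the dispatch-and-feed-back shape.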

A complete example combining tool definition, registration, and runtime wiring:
use agent_sdk::Agent;
use llm_tool_macros::llm_tool;
use llm_tools::registry_from;

#[llm_tool(name = "calculate", description = "Evaluate a math expression")]
fn calculate(expression: String) -> String {
    // Simplified; a real implementation would parse the expression
    format!("Result of '{expression}': 42")
}

#[llm_tool(name = "lookup_user", description = "Look up a user by ID")]
fn lookup_user(user_id: u64) -> String {
    format!("User {user_id}: Alice (alice@example.com)")
}

fn build_agent(
    llm_client: impl agent_sdk::agent::llm_orchestrator::IntoLlmInvoker
        + agent_sdk::agent::llm_orchestrator::IntoLlmStreamInvoker
        + Clone,
) -> agent_sdk::SdkResult<Agent> {
    let mut agent = Agent::new("tool-demo")?;

    agent
        .add_skill("math")
        .description("Evaluate math expressions")
        .register()?;

    let tools = registry_from!(calculate, lookup_user);
    agent.configure_llm_runtime(
        llm_client,
        "gpt-4o",
        tools,
        Some("You are a helpful assistant. Use tools when needed.".into()),
        None,
        None,
    )?;
    Ok(agent)
}
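For intuition, each #[llm_tool] function above expands to roughly the shape below. This hand-written, std-only sketch uses illustrative companion names and a hard-coded schema literal; the real macro instead derives serde::Deserialize and schemars::JsonSchema for the params struct.

```rust
// Hand-written approximation of what #[llm_tool] generates for `calculate`.
// Companion names and the schema string are illustrative, not the real expansion.
struct CalculateParams {
    expression: String,
}

// The original function body, unchanged.
fn calculate(expression: String) -> String {
    format!("Result of '{expression}': 42")
}

// Companion exposing (name, description, JSON Schema) for the LLM request.
fn calculate_tool_info() -> (&'static str, &'static str, &'static str) {
    (
        "calculate",
        "Evaluate a math expression",
        r#"{"type":"object","properties":{"expression":{"type":"string"}},"required":["expression"]}"#,
    )
}

// Companion that takes already-deserialized params and invokes the plain function.
fn calculate_tool_exec(params: CalculateParams) -> String {
    calculate(params.expression)
}
```

registry_from! then simply collects these companions into a name-indexed registry, which is why only annotated functions can be passed to it.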

For tools that need async I/O or custom executors, bypass the macro and build ToolSpec directly:

use agent_sdk::agent::tools::{ToolExecutor, ToolRegistry, ToolSpec};
use std::sync::Arc;

let mut registry = ToolRegistry::new();
registry.register(ToolSpec {
    name: "fetch_url".to_string(),
    description: Some("Fetch content from a URL".to_string()),
    parameters: serde_json::json!({
        "type": "object",
        "properties": {
            "url": { "type": "string", "description": "The URL to fetch" }
        },
        "required": ["url"]
    }),
    strict: true,
    parallel_ok: false,
    executor: ToolExecutor::Simple(Arc::new(|args| {
        Box::pin(async move {
            let url = args["url"].as_str().unwrap_or("");
            Ok(serde_json::json!({ "content": format!("Fetched: {url}") }))
        })
    })),
});
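ToolExecutor::Simple wraps the callable in an Arc'd trait object so registries can be cloned and shared. A synchronous, std-only sketch of that same type-erasure pattern, with illustrative types rather than the SDK's:

```rust
use std::sync::Arc;

// A type-erased, shareable executor: the shape behind ToolExecutor::Simple,
// minus the async plumbing (illustrative, not the SDK definition).
type Executor = Arc<dyn Fn(&str) -> Result<String, String> + Send + Sync>;

struct SimpleTool {
    name: &'static str,
    executor: Executor,
}

// Build a tool whose executor is an Arc'd closure, like the fetch_url example.
fn fetch_url_tool() -> SimpleTool {
    SimpleTool {
        name: "fetch_url",
        executor: Arc::new(|args| Ok(format!("Fetched: {args}"))),
    }
}
```

Erasing the concrete closure type is what lets one registry hold tools with different bodies behind a single executor type.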
The #[llm_tool] attribute accepts:

| Attribute | Type | Description |
| --- | --- | --- |
| name = "…" | String | Tool name exposed to the LLM (defaults to the function name) |
| description = "…" | String | Tool description for the LLM |
| context | flag | Require a ToolContext parameter for access to request headers and the trace emitter |

When context is set, one of the function parameters must have type ToolContext:

use agent_sdk::agent::tool_context::ToolContext;

#[llm_tool(name = "ctx_tool", description = "Tool with context", context)]
fn ctx_tool(query: String, ctx: ToolContext) -> String {
    // ctx.request_headers(), ctx.emit(event)
    format!("Handled: {query}")
}
Reference of the main types and macros:

| Type | Module | Purpose |
| --- | --- | --- |
| ToolSpec | agent_sdk::agent::tools | Tool definition: name, description, JSON Schema, executor |
| ToolRegistry | agent_sdk::agent::tools | Name-indexed collection of ToolSpec entries |
| ToolExecutor | agent_sdk::agent::tools | Simple(fn) or WithContext(fn): the callable |
| ToolExecutionResult | agent_sdk::agent::tools | Name plus output value after execution |
| IntoTools | agent_sdk::agent::tools | Trait for ergonomic conversion (accepts ToolRegistry, Vec&lt;ToolSpec&gt;, or llm_tools::ToolRegistry) |
| llm_tools::ToolRegistry | llm_tools | Macro-generated registry with schemas(), exec(), names() |
| registry_from! | llm_tools | Macro to collect #[llm_tool]-annotated functions into a llm_tools::ToolRegistry |