# client

[LlmClient] facade and builder — primary public entry point.
## Structs

### LlmClientBuilder

Configures an [LlmClient].

#### Methods

##### new

`fn new(wire_format: WireFormat) -> Self`

##### base_url

`fn base_url(self, url: impl Into<String>) -> Self`

##### auth

`fn auth(self, auth: impl AuthProvider + 'static) -> Self`

##### api_mode

`fn api_mode(self, mode: ApiMode) -> Self`

For OpenAI-compatible endpoints only. Ignored for Anthropic.

##### streaming_policy

`fn streaming_policy(self, policy: StreamingPolicy) -> Self`

##### default_headers

`fn default_headers(self, headers: HashMap<String, String>) -> Self`

##### openai_paths

`fn openai_paths(self, chat: String, responses: String) -> Self`

Overrides the OpenAI chat and responses URL paths (Azure uses /chat/completions, etc.).
##### build

`fn build(self) -> Result<LlmClient, LlmError>`

### LlmClient

High-level LLM client (single entry point for apps).

#### Methods

##### builder

`fn builder(wire_format: WireFormat) -> LlmClientBuilder`

##### azure_openai_builder

`fn azure_openai_builder(resource_name: &str, deployment_id: &str, api_version: &str, credential: AzureCredential) -> LlmClientBuilder`

Convenience constructor for Azure OpenAI chat deployments.
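A minimal usage sketch of the two construction paths. The crate import path, the `AzureCredential::ApiKey` variant, and the placeholder resource names are assumptions for illustration, not confirmed by this page:

```rust
use llm_client::{AzureCredential, LlmClient, LlmError, WireFormat}; // crate path is an assumption

fn main() -> Result<(), LlmError> {
    // Generic builder: choose the wire format, then point at any
    // OpenAI-compatible base URL.
    let client = LlmClient::builder(WireFormat::OpenAiCompat)
        .base_url("https://openrouter.ai/api/v1")
        .build()?;

    // Azure convenience constructor: resource name, deployment id,
    // API version, and a credential.
    let credential = AzureCredential::ApiKey("...".into()); // variant name is illustrative
    let azure_client = LlmClient::azure_openai_builder(
        "my-resource",
        "my-gpt4o-deployment",
        "2024-06-01",
        credential,
    )
    .build()?;

    let _ = (client, azure_client);
    Ok(())
}
```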
##### chat

`async fn chat(&self, req: LlmRequest) -> Result<LlmResponse, LlmError>`

##### chat_stream

`async fn chat_stream(&self, req: LlmRequest) -> Result<LlmEventStream, LlmError>`

##### capabilities

`fn capabilities(&self) -> ClientCapabilities`
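The two request methods can be exercised as in the sketch below. It assumes `LlmEventStream` implements `futures::Stream` and that `LlmRequest` is `Clone`; neither trait bound is stated on this page:

```rust
use futures::StreamExt; // assumes LlmEventStream: futures::Stream

async fn demo(client: &LlmClient, req: LlmRequest) -> Result<(), LlmError> {
    // One-shot request/response.
    let resp = client.chat(req.clone()).await?; // assumes LlmRequest: Clone

    // Streaming: consume events as they arrive
    // (subject to the configured StreamingPolicy).
    let mut events = client.chat_stream(req).await?;
    while let Some(event) = events.next().await {
        // handle each streamed event
        let _ = event;
    }

    let _ = resp;
    Ok(())
}
```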
## Enums

### WireFormat

JSON schema family for the remote endpoint.

#### Variants

| Variant | Description |
|---|---|
| `OpenAiCompat` | OpenAI chat completions / responses shape (incl. Azure OpenAI, OpenRouter, vLLM, …). |
| `AnthropicMessages` | Anthropic Messages API shape (incl. Bedrock / Vertex Claude when routed that way). |
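The variant to pick follows the endpoint's JSON shape, not the vendor name. A sketch with an illustrative `Provider` enum (not a type from this crate):

```rust
// Illustrative only: Provider is not part of this API.
enum Provider {
    AzureOpenAi,
    OpenRouter,
    Vllm,
    Anthropic,
    BedrockClaude,
}

fn wire_format_for(p: Provider) -> WireFormat {
    match p {
        // Anything speaking the OpenAI chat completions / responses shape.
        Provider::AzureOpenAi | Provider::OpenRouter | Provider::Vllm => {
            WireFormat::OpenAiCompat
        }
        // Anything speaking the Anthropic Messages shape, however routed.
        Provider::Anthropic | Provider::BedrockClaude => WireFormat::AnthropicMessages,
    }
}
```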