client

[LlmClient] facade and builder — primary public entry point.

LlmClientBuilder

Configures an [LlmClient].

Methods

fn new(wire_format: WireFormat) -> Self
fn base_url(self, url: impl Into<String>) -> Self
fn auth(self, auth: impl AuthProvider + 'static) -> Self
fn api_mode(self, mode: ApiMode) -> Self

For OpenAI-compatible endpoints only. Ignored for Anthropic.

fn streaming_policy(self, policy: StreamingPolicy) -> Self
fn default_headers(self, headers: HashMap<String, String>) -> Self
fn openai_paths(self, chat: String, responses: String) -> Self

Override the OpenAI chat and responses URL paths (e.g. Azure uses a different prefix around /chat/completions).

fn build(self) -> Result<LlmClient, LlmError>
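A minimal sketch of the builder flow above, assuming an OpenAI-compatible endpoint. `WireFormat`, `LlmClient`, `LlmError`, and the builder methods follow the signatures listed here; `ApiKeyAuth` is a hypothetical `AuthProvider` implementation used only for illustration.

```rust
use std::collections::HashMap;

// Hypothetical sketch: configure and build an LlmClient for an
// OpenAI-compatible endpoint. `ApiKeyAuth` is an assumed auth provider;
// any `impl AuthProvider + 'static` works per the `auth` signature.
fn build_client() -> Result<LlmClient, LlmError> {
    LlmClient::builder(WireFormat::OpenAiCompat)
        .base_url("https://api.openai.com/v1")
        .auth(ApiKeyAuth::new("sk-..."))
        .default_headers(HashMap::from([(
            "x-request-source".to_string(),
            "docs-example".to_string(),
        )]))
        .build()
}
```

Builder methods consume and return `Self`, so the whole configuration reads as one chain ending in `build()`.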

LlmClient

High-level LLM client (single entry point for apps).

Methods

fn builder(wire_format: WireFormat) -> LlmClientBuilder
fn azure_openai_builder(resource_name: &str, deployment_id: &str, api_version: &str, credential: AzureCredential) -> LlmClientBuilder

Convenience for Azure OpenAI chat deployments.
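A hedged sketch of the Azure convenience path. Only the `azure_openai_builder` signature is from the docs; the `AzureCredential::ApiKey` variant and the argument values are assumptions for illustration.

```rust
// Hypothetical sketch: build a client for an Azure OpenAI chat deployment.
fn build_azure_client() -> Result<LlmClient, LlmError> {
    LlmClient::azure_openai_builder(
        "my-resource",   // resource_name: {resource_name}.openai.azure.com
        "my-deployment", // deployment_id for the chat model
        "2024-06-01",    // api_version (example value)
        AzureCredential::ApiKey("...".into()), // assumed credential variant
    )
    .build()
}
```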

async fn chat(&self, req: LlmRequest) -> Result<LlmResponse, LlmError>
async fn chat_stream(&self, req: LlmRequest) -> Result<LlmEventStream, LlmError>
fn capabilities(&self) -> ClientCapabilities
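The three calls above can be sketched together as follows. Only `chat`, `chat_stream`, and `capabilities` are from the docs; how `LlmRequest` is constructed, and the assumption that `LlmEventStream` implements `futures::Stream`, are illustrative guesses.

```rust
use futures::StreamExt; // assumes LlmEventStream: Stream

// Hypothetical sketch: one-shot chat, then the streaming variant.
async fn ask(client: &LlmClient, req: LlmRequest) -> Result<(), LlmError> {
    // Non-streaming: resolves to a single LlmResponse.
    let _resp = client.chat(req.clone()).await?;

    // Streaming: consume events as the provider emits them.
    let mut stream = client.chat_stream(req).await?;
    while let Some(_event) = stream.next().await {
        // handle each event (token deltas, tool calls, ...)
    }

    // Feature-detect the remote endpoint before relying on a capability.
    let _caps = client.capabilities();
    Ok(())
}
```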

WireFormat

JSON schema family for the remote endpoint.

Variants

| Variant | Description |
| --- | --- |
| `OpenAiCompat` | OpenAI chat completions / responses shape (incl. Azure OpenAI, OpenRouter, vLLM, …). |
| `AnthropicMessages` | Anthropic Messages API shape (incl. Bedrock / Vertex Claude when routed that way). |