llm_context_core

Crate: llm_context_core · Path: crates/llm_context_core
Description (Cargo.toml): WASM-compatible LLM context window management — token estimation, budget tracking, history strategies, and memory traits

WASM-compatible LLM context window management library. Provides token estimation, context budget computation, pluggable history management strategies, and trait ports for long-term memory backends. Three-tier architecture:

  • Tier 1 (Working) — current turn messages, bounded by token budget
  • Tier 2 (Short-Term) — conversation history with sliding window / summarization
  • Tier 3 (Long-Term) — cross-conversation semantic retrieval (Qdrant, etc.)

From crates/llm_context_core/Cargo.toml:

| Feature | Purpose |
| --- | --- |
| `default` | Empty — no features enabled by default |

Modules: budget, history, memory, strategy, tokens

Re-exports:

  • ContextBudget — token budget computation
  • HistoryManager, HistoryManagerConfig — conversation history management
  • LongTermMemory, MemoryEntry, MemoryFilters, MemoryType — trait ports and types for long-term memory backends
  • ContextStrategy, ContextStrategyKind — pluggable context strategies
  • estimate_tokens — token count estimation
Example usage:

```rust
use llm_context_core::{ContextBudget, HistoryManager, HistoryManagerConfig, estimate_tokens};

// Budget for a 128k-token context window
let budget = ContextBudget::new(128_000);

// History manager with the default configuration
let config = HistoryManagerConfig::default();
let mut history = HistoryManager::new(config);

// Estimate the token count of a string
let tokens = estimate_tokens("Hello, how are you?");
```
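Without a model-specific tokenizer, token counts are commonly approximated from character count (roughly 4 characters per token for English text). A rough sketch of such a heuristic — this is a common approximation, not necessarily how this crate's `estimate_tokens` works:

```rust
/// Rough token estimate: ~4 characters per token for English text.
/// A common heuristic, not necessarily the crate's actual algorithm.
fn estimate_tokens_approx(text: &str) -> usize {
    // Round up so short non-empty strings count as at least one token.
    (text.chars().count() + 3) / 4
}

fn main() {
    assert_eq!(estimate_tokens_approx(""), 0);
    // "Hello, how are you?" is 19 characters -> ceil(19 / 4) = 5
    assert_eq!(estimate_tokens_approx("Hello, how are you?"), 5);
    println!("ok");
}
```

Character-based estimates over-count for languages with long tokens and under-count for code or non-Latin scripts, so a real budget should leave headroom.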

Full API reference: cargo doc -p llm_context_core --no-deps