What it does
Provider-agnostic
Anthropic, OpenAI, Gemini, Ollama, or any OpenAI-compatible API. Switch with one line, zero other changes.
MCP client
Connect any MCP server via Streamable HTTP or Stdio. Tools are auto-discovered and namespaced.
Concurrent tool dispatch
Multiple tool calls from a single LLM response run in parallel goroutines automatically.
Multi-agent orchestration
Sequential pipelines and parallel fan-out with partial-failure safety.
Memory stores
Five built-in backends: in-memory, SQLite (zero infra), PostgreSQL, Redis (with TTL), and Qdrant (semantic search). All swap in with one line.
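The one-line swap works because all backends satisfy a single interface, so only the constructor changes. A sketch of that pattern with illustrative names (`Store`, `newMemStore`; the framework's actual interface may differ):

```go
package main

import (
	"fmt"
	"sync"
)

// Store is the shared contract every backend implements.
type Store interface {
	Append(session, msg string)
	History(session string) []string
}

// memStore is the zero-infra in-memory backend.
type memStore struct {
	mu   sync.Mutex
	data map[string][]string
}

func newMemStore() *memStore { return &memStore{data: map[string][]string{}} }

func (s *memStore) Append(session, msg string) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.data[session] = append(s.data[session], msg)
}

func (s *memStore) History(session string) []string {
	s.mu.Lock()
	defer s.mu.Unlock()
	return s.data[session]
}

func main() {
	// The "one line": swap the constructor, keep everything else.
	var store Store = newMemStore() // e.g. newSQLiteStore(path), newRedisStore(addr), ...
	store.Append("s1", "hello")
	store.Append("s1", "hi there")
	fmt.Println(len(store.History("s1"))) // 2
}
```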
HTTP server
Production-ready chi router with SSE streaming, CORS, liveness/readiness probes, and graceful shutdown. One import away.
Observability
OpenTelemetry tracing, structured slog logging, and Prometheus metrics via drop-in middleware. Zero changes to agent code required.
Resilience middleware
Token-bucket rate limiting, multi-provider fallback chains, and tool-result caching, all composable via ProviderBuilder.
Minimal example
Quickstart
Build your first agent in under 5 minutes.