chainforge is a Go library for building AI agent applications without coupling your code to any LLM SDK. Swap providers, tools, and memory stores with a single line.
go get github.com/lioarce01/chainforge

What it does

Provider-agnostic

Anthropic, OpenAI, Gemini, Ollama, or any OpenAI-compatible API. Switch with one line, zero other changes.

MCP client

Connect any MCP server via Streamable HTTP or Stdio. Tools are auto-discovered and namespaced.

Concurrent tool dispatch

Multiple tool calls from a single LLM response run in parallel goroutines automatically.

Multi-agent orchestration

Sequential pipelines and parallel fan-out with partial-failure safety.

Memory stores

Five built-in backends: in-memory, SQLite (zero infra), PostgreSQL, Redis (with TTL), and Qdrant (semantic search). All swap in with one line.

HTTP server

Production-ready chi router with SSE streaming, CORS, liveness/readiness probes, and graceful shutdown. One import away.

Observability

OpenTelemetry tracing, structured slog logging, and Prometheus metrics via drop-in middleware. Zero changes to agent code required.

Resilience middleware

Token-bucket rate limiting, multi-provider fallback chains, and tool result caching — all composable via ProviderBuilder.

Minimal example

import (
    "context"
    "fmt"
    "os"

    chainforge "github.com/lioarce01/chainforge"
    "github.com/lioarce01/chainforge/pkg/tools/calculator"
)

agent, err := chainforge.NewAgent(
    chainforge.WithAnthropic(os.Getenv("ANTHROPIC_API_KEY"), "claude-sonnet-4-6"),
    chainforge.WithSystemPrompt("You are a helpful assistant."),
    chainforge.WithTools(calculator.New()),
)
if err != nil {
    // handle err
}

ctx := context.Background()
result, err := agent.Run(ctx, "session-1", "What is 2^10 + 144?")

// Need token usage?
result, usage, err := agent.RunWithUsage(ctx, "session-1", "What is 2^10 + 144?")
fmt.Printf("tokens: %d in / %d out\n", usage.InputTokens, usage.OutputTokens)

Quickstart

Build your first agent in under 5 minutes.