Install
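The install command appears to have been lost here. Assuming chainforge is a standard Go module, installation would follow the usual `go get` pattern; the module path below is a placeholder, not the real one:

```shell
# Hypothetical module path — substitute the project's real import path:
go get github.com/example/chainforge
```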
1. Create an agent
Use a provider shorthand to set both provider and model in one call:

| Shorthand | Provider |
|---|---|
| `WithAnthropic(key, model)` | Anthropic (Claude) |
| `WithOpenAI(key, model)` | OpenAI |
| `WithGemini(key, model)` | Google Gemini |
| `WithOllama(url, model)` | Ollama (local) |
2. Run it
3. Use any OpenAI-compatible provider
chainforge works with OpenRouter, Ollama, or any provider that speaks the OpenAI API.

4. Serve over HTTP
Turn any agent into an HTTP service with one line. The server exposes POST /v1/chat, POST /v1/chat/stream (SSE), and GET /healthz.
It blocks until SIGINT/SIGTERM and shuts down gracefully.
5. Add middleware
Middleware wraps a provider with cross-cutting behavior (logging, retries, rate limits) and is attached through the ProviderBuilder:
6. Get token usage
Run discards token counts. Use RunWithUsage when you need them:
Next steps
Providers
Anthropic, OpenAI, Ollama, and OpenAI-compatible APIs.
Tools
Built-in tools, custom tools, and MCP servers.
MCP
Connect any MCP server with one line.
Orchestration
Sequential pipelines and parallel fan-out.
Observability
Structured logging and OpenTelemetry tracing.
Deployment
Docker, Kubernetes, Helm, and embedding the HTTP server.