AI Providers
OpenAlice supports two AI backends that can be switched at runtime without restarting. Both implement the same interface and have access to the same tools.
Claude Agent SDK
The default provider. Uses @anthropic-ai/claude-agent-sdk to call Claude.
Authentication methods:
- claudeai (default) — Uses your Claude Pro or Max subscription via the Claude Code CLI. No API key needed — just have Claude Code installed and authenticated.
- api-key — Direct Anthropic API key. Set in `ai-provider-manager.json`.
How tools work: An in-process MCP server is created from ToolCenter's registered tools. The Agent SDK consumes tools through this MCP server, the same protocol Claude Code uses.
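To make the in-process tool server concrete, here is a minimal conceptual sketch. It uses local types only (`InProcessToolServer` and `RegisteredTool` are illustrative names, not the real MCP SDK or ToolCenter API): an in-process server object exposes registered tools so an agent backend can list and invoke them over one interface, mirroring MCP's `tools/list` and `tools/call` operations.

```typescript
// Conceptual sketch only: local stand-ins for ToolCenter tools and an
// in-process MCP-style server (names are illustrative, not the real API).
interface RegisteredTool {
  name: string;
  description: string;
  run: (input: Record<string, unknown>) => Promise<string>;
}

class InProcessToolServer {
  private tools = new Map<string, RegisteredTool>();

  register(tool: RegisteredTool): void {
    this.tools.set(tool.name, tool);
  }

  // Analogous to MCP's tools/list: advertise the available tools.
  listTools(): Array<{ name: string; description: string }> {
    return [...this.tools.values()].map(({ name, description }) => ({
      name,
      description,
    }));
  }

  // Analogous to MCP's tools/call: dispatch a call to the tool's handler.
  async callTool(
    name: string,
    input: Record<string, unknown>,
  ): Promise<string> {
    const tool = this.tools.get(name);
    if (!tool) throw new Error(`unknown tool: ${name}`);
    return tool.run(input);
  }
}

const server = new InProcessToolServer();
server.register({
  name: "echo",
  description: "Echo the provided text back",
  run: async (input) => String(input.text ?? ""),
});
```

Because the server lives in the same process as the tools, no sockets or child processes are involved; the Agent SDK simply talks to it through the MCP protocol shape it already understands.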
Session handling: History is serialized as text (not structured messages). The AgentCenter builds a text prompt from session entries and sends it as a single string.
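A minimal sketch of text-based history serialization (the entry shape and `buildTextPrompt` are illustrative, not the actual AgentCenter API):

```typescript
// Illustrative session entry shape; the real AgentCenter structure may differ.
interface SessionEntry {
  role: "user" | "assistant";
  text: string;
}

// Flatten structured history into one labeled text transcript,
// sent to the Agent SDK as a single prompt string.
function buildTextPrompt(entries: SessionEntry[]): string {
  return entries
    .map((e) => `${e.role === "user" ? "User" : "Assistant"}: ${e.text}`)
    .join("\n\n");
}

const prompt = buildTextPrompt([
  { role: "user", text: "Hello" },
  { role: "assistant", text: "Hi! How can I help?" },
]);
```

The trade-off versus structured messages: the prompt is simple and backend-agnostic, but tool_use/tool_result blocks are not preserved as distinct objects.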
Vercel AI SDK
The alternative provider. Makes direct API calls using the Vercel AI SDK.
Supported model providers:
- Anthropic — Claude models (requires API key)
- OpenAI — GPT models (requires API key)
- Google — Gemini models (requires API key)
How tools work: Tools from ToolCenter are exported in Vercel AI SDK format. The provider runs a ToolLoopAgent that handles the tool call loop in-process.
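The in-process tool call loop can be sketched as follows. This is a generic outline, not ToolLoopAgent's actual implementation: the model interface, `runToolLoop`, and the text transcript are all illustrative. The loop calls the model, executes any requested tools, feeds results back, and repeats until the model answers with plain text.

```typescript
// Illustrative types; the real ToolLoopAgent uses the Vercel AI SDK's
// own message and tool-call representations.
type ToolCall = { toolName: string; input: unknown };
type ModelTurn = { text?: string; toolCalls: ToolCall[] };

async function runToolLoop(
  callModel: (transcript: string[]) => Promise<ModelTurn>,
  tools: Record<string, (input: unknown) => Promise<string>>,
  userMessage: string,
  maxSteps = 8,
): Promise<string> {
  const transcript = [`user: ${userMessage}`];
  for (let step = 0; step < maxSteps; step++) {
    const turn = await callModel(transcript);
    // No tool calls requested: the model has produced its final answer.
    if (turn.toolCalls.length === 0) return turn.text ?? "";
    // Execute each requested tool and append its result for the next turn.
    for (const call of turn.toolCalls) {
      const result = await tools[call.toolName](call.input);
      transcript.push(`tool ${call.toolName}: ${result}`);
    }
  }
  throw new Error("tool loop exceeded maxSteps");
}

// Fake model for demonstration: requests the clock tool once,
// then answers using the tool result.
const fakeModel = async (t: string[]): Promise<ModelTurn> =>
  t.some((line) => line.startsWith("tool clock"))
    ? { text: "It is 12:00.", toolCalls: [] }
    : { text: undefined, toolCalls: [{ toolName: "clock", input: {} }] };
```

A `maxSteps` cap like the one above is a common guard against a model that keeps requesting tools indefinitely.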
Session handling: History is serialized as structured ModelMessage[] arrays (role-based messages with content blocks). This preserves tool_use and tool_result blocks natively.
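For contrast with the Agent SDK's text history, here is an illustrative structured history. The types below approximate the Vercel AI SDK's ModelMessage shape (exact field names vary across SDK versions), showing how a tool call and its result survive as distinct content blocks:

```typescript
// Local types approximating ModelMessage; not imported from the "ai"
// package, since exact shapes differ between SDK versions.
type HistoryMessage =
  | { role: "user"; content: string }
  | {
      role: "assistant";
      content: Array<
        | { type: "text"; text: string }
        | { type: "tool-call"; toolCallId: string; toolName: string; input: unknown }
      >;
    }
  | {
      role: "tool";
      content: Array<{ type: "tool-result"; toolCallId: string; output: unknown }>;
    };

const history: HistoryMessage[] = [
  { role: "user", content: "What time is it?" },
  {
    role: "assistant",
    content: [
      { type: "tool-call", toolCallId: "call_1", toolName: "clock", input: {} },
    ],
  },
  {
    role: "tool",
    content: [{ type: "tool-result", toolCallId: "call_1", output: "12:00" }],
  },
];
```

Because the tool call and its result remain separate, typed blocks linked by `toolCallId`, the model sees exactly what happened on replay instead of a flattened transcript.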
Configuration
The AI provider is configured in data/config/ai-provider-manager.json:
```json
{
  "backend": "agent-sdk",
  "provider": "anthropic",
  "model": "claude-sonnet-4-6",
  "loginMethod": "claudeai",
  "apiKeys": {}
}
```
| Field | Description |
|---|---|
| `backend` | `agent-sdk` or `vercel-ai-sdk` |
| `provider` | Model provider: `anthropic`, `openai`, `google` |
| `model` | Model identifier (e.g. `claude-sonnet-4-6`, `gpt-4o`) |
| `loginMethod` | Agent SDK auth: `claudeai` (Pro/Max) or `api-key` |
| `apiKeys` | API keys per provider: `{ "anthropic": "sk-ant-...", "openai": "sk-..." }` |
| `baseUrl` | Optional custom API endpoint |
Hot-reload — The GenerateRouter re-reads this file on every call. Change the backend or model, and the very next message uses the new config.
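The hot-reload behavior amounts to "read the file on every call instead of caching it at startup." A minimal sketch (function name and types mirror the config fields above; GenerateRouter's real implementation may differ):

```typescript
import { mkdtempSync, writeFileSync, readFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Field names mirror ai-provider-manager.json as documented above.
interface ProviderConfig {
  backend: "agent-sdk" | "vercel-ai-sdk";
  provider: string;
  model: string;
  loginMethod?: "claudeai" | "api-key";
  apiKeys: Record<string, string>;
  baseUrl?: string;
}

// No caching: every message picks up the latest file contents.
function loadProviderConfig(path: string): ProviderConfig {
  return JSON.parse(readFileSync(path, "utf8")) as ProviderConfig;
}

// Demo against a temp file: edit the config, and the very next
// load sees the new backend with no restart.
const dir = mkdtempSync(join(tmpdir(), "alice-"));
const file = join(dir, "ai-provider-manager.json");

writeFileSync(
  file,
  JSON.stringify({
    backend: "agent-sdk",
    provider: "anthropic",
    model: "claude-sonnet-4-6",
    apiKeys: {},
  }),
);
const before = loadProviderConfig(file);

writeFileSync(
  file,
  JSON.stringify({
    backend: "vercel-ai-sdk",
    provider: "openai",
    model: "gpt-4o",
    apiKeys: {},
  }),
);
const after = loadProviderConfig(file);
```

Re-reading a small JSON file per message is cheap, and it avoids any cache-invalidation or file-watching machinery.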
Runtime Switching
You can switch providers in several ways:
- Web UI — Use the provider selector in the settings panel
- Edit config — Modify `ai-provider-manager.json` directly
- Ask Alice — "Switch to GPT-4o" (if evolution mode is on, Alice can edit her own config)
The switch is immediate. No restart, no session loss.
Per-Channel Overrides
Web UI sub-channels can override the AI provider on a per-channel basis. This lets you run different models in different chat tabs:
```json
{
  "id": "research",
  "label": "Research (GPT-4o)",
  "provider": "vercel-ai-sdk",
  "vercelAiSdk": {
    "provider": "openai",
    "model": "gpt-4o",
    "apiKey": "sk-..."
  }
}
```
Configure in data/config/web-subchannels.json. Each sub-channel can also override the system prompt and disable specific tools.
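The override resolution can be sketched as a simple precedence rule: a sub-channel's provider settings win over the global config, and channels without an override fall back to it. (`resolveBackend` and the types below are illustrative, not the actual resolution code.)

```typescript
// Illustrative shapes mirroring ai-provider-manager.json and
// web-subchannels.json as documented above.
interface GlobalConfig {
  backend: string;
  provider: string;
  model: string;
}

interface SubChannel {
  id: string;
  label: string;
  provider?: string; // backend override, e.g. "vercel-ai-sdk"
  vercelAiSdk?: { provider: string; model: string; apiKey?: string };
}

// Channel override wins; otherwise the global config applies.
function resolveBackend(global: GlobalConfig, channel: SubChannel): GlobalConfig {
  if (channel.provider === "vercel-ai-sdk" && channel.vercelAiSdk) {
    return {
      backend: "vercel-ai-sdk",
      provider: channel.vercelAiSdk.provider,
      model: channel.vercelAiSdk.model,
    };
  }
  return { ...global };
}

const globalConfig: GlobalConfig = {
  backend: "agent-sdk",
  provider: "anthropic",
  model: "claude-sonnet-4-6",
};

const research: SubChannel = {
  id: "research",
  label: "Research (GPT-4o)",
  provider: "vercel-ai-sdk",
  vercelAiSdk: { provider: "openai", model: "gpt-4o" },
};

const plain: SubChannel = { id: "general", label: "General" };
```

With this rule, two chat tabs can run different models side by side while the rest of the system keeps a single global default.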
Choosing a Provider
| | Agent SDK | Vercel AI SDK |
|---|---|---|
| Best for | Claude Pro/Max subscribers | API key users, multi-model |
| Auth | Claude Code login or API key | API keys per provider |
| Models | Claude only | Anthropic, OpenAI, Google |
| Tool delivery | In-process MCP server | Vercel native tools |
| Session format | Text history | Structured messages |
For most users, the default Agent SDK with Claude Code login is the simplest setup — no API keys, no configuration. Switch to Vercel AI SDK when you need access to non-Claude models or prefer direct API key auth.