# Provider Setup Guides
This guide covers setup for the most common providers. AgentZero supports 37 providers; run `agentzero providers` for the full list.
## OpenAI

- Get an API key from platform.openai.com/api-keys.
- Configure:

```shell
agentzero onboard --provider openai --model gpt-4o --yes
```

or store the key directly:

```shell
agentzero auth setup-token --provider openai --token sk-...
```

Or set the environment variable:

```shell
export OPENAI_API_KEY="sk-..."
```

TOML config:

```toml
[provider]
kind = "openai"
base_url = "https://api.openai.com/v1"
model = "gpt-4o"
```

Available models: `gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`, `o1`, `o1-mini`, `o3-mini`
## Anthropic

- Get an API key from console.anthropic.com/settings/keys.
- Configure:

```shell
agentzero onboard --provider anthropic --model claude-sonnet-4-6 --yes
```

or store the key directly:

```shell
agentzero auth setup-token --provider anthropic --token sk-ant-...
```

Or set the environment variable:

```shell
export ANTHROPIC_API_KEY="sk-ant-..."
```

TOML config:

```toml
[provider]
kind = "anthropic"
base_url = "https://api.anthropic.com"
model = "claude-sonnet-4-6"
```

Available models: `claude-opus-4-6`, `claude-sonnet-4-6`, `claude-haiku-4-5-20251001`
## OpenRouter

OpenRouter gives you access to hundreds of models through a single API key.

- Get an API key from openrouter.ai/keys.
- Configure:

```shell
agentzero onboard --provider openrouter --model anthropic/claude-sonnet-4-6 --yes
```

or store the key directly:

```shell
agentzero auth setup-token --provider openrouter --token sk-or-v1-...
```

Or set the environment variable:

```shell
export OPENROUTER_API_KEY="sk-or-v1-..."
```

TOML config:

```toml
[provider]
kind = "openrouter"
base_url = "https://openrouter.ai/api/v1"
model = "anthropic/claude-sonnet-4-6"
```

Model names use the `provider/model` format, e.g. `openai/gpt-4o`, `google/gemini-pro`, `meta-llama/llama-3.1-70b-instruct`.
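Tooling that handles OpenRouter-style IDs typically splits on the first slash. A small illustrative Python helper (not part of AgentZero) might look like:

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split an OpenRouter-style ID into (provider, model).

    The provider is everything before the first "/"; the remainder is
    the model name, which may itself contain further slashes or colons.
    """
    provider, sep, model = model_id.partition("/")
    if not sep:
        raise ValueError(f"expected 'provider/model', got {model_id!r}")
    return provider, model

print(split_model_id("anthropic/claude-sonnet-4-6"))
# → ('anthropic', 'claude-sonnet-4-6')
```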
## Ollama (local)

Ollama runs models locally. No API key needed.

- Install Ollama from ollama.com.
- Pull a model:

```shell
ollama pull llama3.1:8b
```

- Start Ollama (it runs on http://localhost:11434 by default):

```shell
ollama serve
```

- Configure AgentZero:

```shell
agentzero onboard --provider ollama --model llama3.1:8b --yes
```

TOML config:

```toml
[provider]
kind = "ollama"
base_url = "http://localhost:11434/v1"
model = "llama3.1:8b"
```

AgentZero can auto-discover local Ollama instances:

```shell
agentzero local discover
```
## Other Local Providers

### LM Studio

```toml
[provider]
kind = "lmstudio"
base_url = "http://localhost:1234/v1"
model = "your-model-name"
```

### llama.cpp server

```toml
[provider]
kind = "llamacpp"
base_url = "http://localhost:8080/v1"
model = "default"
```

### vLLM

```toml
[provider]
kind = "vllm"
base_url = "http://localhost:8000/v1"
model = "your-model-name"
```
## Cloud Providers with Default URLs

These providers have built-in base URLs, so you only need to set the API key:
| Provider | Kind | Env Var |
|---|---|---|
| Groq | groq | GROQ_API_KEY |
| Mistral | mistral | MISTRAL_API_KEY |
| xAI (Grok) | xai | XAI_API_KEY |
| DeepSeek | deepseek | DEEPSEEK_API_KEY |
| Together AI | together | TOGETHER_API_KEY |
| Fireworks AI | fireworks | — |
| Perplexity | perplexity | — |
| Cohere | cohere | — |
| NVIDIA NIM | nvidia | — |
Example for Groq:

```shell
agentzero onboard --provider groq --model llama-3.1-70b-versatile --yes
export GROQ_API_KEY="gsk_..."
```
## Custom Endpoints

For any OpenAI-compatible API not in the catalog:

```toml
[provider]
kind = "custom:https://my-api.example.com/v1"
model = "my-model"
```

For Anthropic-compatible APIs:

```toml
[provider]
kind = "anthropic-custom:https://my-proxy.example.com"
model = "claude-sonnet-4-6"
```
## Transport Configuration

Per-provider transport settings can be configured for timeout, retries, and circuit breaking:

```toml
[provider.transport]
timeout_ms = 30000                # request timeout (default: 30s)
max_retries = 2                   # retry count on failure
circuit_breaker_threshold = 5     # failures before circuit opens
circuit_breaker_reset_ms = 60000  # time before half-open retry
```
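The circuit-breaker settings describe a simple state machine: after `circuit_breaker_threshold` consecutive failures the circuit opens and requests are rejected, until `circuit_breaker_reset_ms` has elapsed and a trial request is allowed through. A minimal Python sketch of that logic (illustrative only, not AgentZero's implementation):

```python
class CircuitBreaker:
    """Sketch of the circuit-breaker behavior configured above."""

    def __init__(self, threshold: int = 5, reset_ms: int = 60_000):
        self.threshold = threshold
        self.reset_ms = reset_ms
        self.failures = 0
        self.opened_at: float | None = None  # timestamp when circuit opened

    def allow_request(self, now_ms: float) -> bool:
        if self.opened_at is None:
            return True  # circuit closed: requests flow normally
        # Half-open after the reset window: allow a trial request.
        return now_ms - self.opened_at >= self.reset_ms

    def record_failure(self, now_ms: float) -> None:
        self.failures += 1
        if self.failures >= self.threshold:
            self.opened_at = now_ms  # threshold reached: open the circuit

    def record_success(self) -> None:
        self.failures = 0
        self.opened_at = None  # close the circuit again

cb = CircuitBreaker(threshold=5, reset_ms=60_000)
for _ in range(5):
    cb.record_failure(now_ms=0)
print(cb.allow_request(now_ms=1_000))   # → False (circuit open)
print(cb.allow_request(now_ms=61_000))  # → True (half-open trial)
```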
Section titled “Checking Provider Status”# List all supported providers (marks active one)agentzero providers
# Check provider quota and API key statusagentzero providers quota
# Diagnose model availabilityagentzero doctor models