
---
summary: Model providers (LLMs) supported by OpenClaw
read_when:
  - You want to choose a model provider
  - You need a quick overview of supported LLM backends
---

# Model Providers

OpenClaw can use many LLM providers. Pick a provider, authenticate, then set the default model as `provider/model`.
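A `provider/model` id is the provider name, a slash, then the provider's model id. Since model ids can themselves contain slashes, only the first slash separates the two parts. A minimal sketch of that split (the `parseModelId` helper is hypothetical, not part of the OpenClaw API):

```typescript
// Hypothetical helper: split a "provider/model" id into its two parts.
// Model ids may contain slashes, so split on the FIRST "/" only.
function parseModelId(id: string): { provider: string; model: string } {
  const slash = id.indexOf("/");
  if (slash === -1) {
    throw new Error(`Expected "provider/model", got "${id}"`);
  }
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}

// Example: the default model from the quick start below.
const { provider, model } = parseModelId("anthropic/claude-opus-4-5");
console.log(provider, model);
```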

Looking for chat channel docs (WhatsApp/Telegram/Discord/Slack/Mattermost (plugin)/etc.)? See Channels.

## Highlight: Venius (Venice AI)

Venius is our recommended Venice AI setup for privacy-first inference with an option to use Opus for hard tasks.

- Default: `venice/llama-3.3-70b`
- Best overall: `venice/claude-opus-4-5` (Opus remains the strongest)

See Venice AI.

## Quick start

1. Authenticate with the provider (usually via `openclaw onboard`).
2. Set the default model:

   ```json5
   {
     agents: { defaults: { model: { primary: "anthropic/claude-opus-4-5" } } }
   }
   ```

## Provider docs

## Transcription providers

## Community tools

For the full provider catalog (xAI, Groq, Mistral, etc.) and advanced configuration, see Model providers.