The Model Router automatically selects the right model for each task, routing simple tasks to cheaper models and complex tasks to more capable ones. It learns from task outcomes over time.

How it works

Five-tier priority system, evaluated in order:
  1. Task-type override — if you’ve pinned a specific model to a task type, that wins
  2. Self-learning — if a model has a strong success record for this task type (at least 20 samples by default), use it
  3. Budget fallback — if monthly spend exceeds your threshold, downgrade non-critical tasks to the fallback model (Haiku by default)
  4. Ollama — if Ollama is running locally and the task is eligible (summary, chat), route there
  5. Task-type defaults — built-in sensible defaults per task type
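
The priority order above can be sketched as a single decision function. This is an illustrative sketch, not OpenClaw's actual internals: the names (`pickModel`, `RouterState`), the 90% success-rate cutoff for "strong success record", and the list of "critical" task types are all assumptions.

```typescript
// Sketch of the five-tier routing decision. All names and thresholds
// here are illustrative assumptions, not OpenClaw's real implementation.
type TaskType = "code" | "research" | "creative" | "summary" | "chat";

interface RouterState {
  taskTypeOverrides: Partial<Record<TaskType, string>>;
  lockedTaskTypes: TaskType[];
  stats: Partial<Record<TaskType, { model: string; samples: number; successRate: number }>>;
  selfLearningSampleThreshold: number;
  monthlySpendPct: number; // percent of the monthly budget already spent
  budgetThreshold: number; // e.g. 80
  fallbackModel: string;
  ollamaEnabled: boolean;
  ollamaModel: string;
  defaults: Record<TaskType, string>;
}

const OLLAMA_ELIGIBLE: TaskType[] = ["summary", "chat"];
const CRITICAL: TaskType[] = ["code"]; // assumption: which tasks are never downgraded

function pickModel(task: TaskType, s: RouterState): string {
  // 1. A pinned task-type override always wins.
  const pinned = s.taskTypeOverrides[task];
  if (pinned) return pinned;

  // 2. Self-learning, unless the task type is locked or under-sampled.
  //    (The 0.9 success-rate cutoff is an assumed stand-in for "strong record".)
  const st = s.stats[task];
  if (st && !s.lockedTaskTypes.includes(task) &&
      st.samples >= s.selfLearningSampleThreshold && st.successRate > 0.9) {
    return st.model;
  }

  // 3. Budget fallback: downgrade non-critical tasks when over threshold.
  if (s.monthlySpendPct >= s.budgetThreshold && !CRITICAL.includes(task)) {
    return s.fallbackModel;
  }

  // 4. Route eligible task types to local Ollama when it is enabled.
  if (s.ollamaEnabled && OLLAMA_ELIGIBLE.includes(task)) {
    return `ollama/${s.ollamaModel}`;
  }

  // 5. Built-in per-task-type default.
  return s.defaults[task];
}
```

Note that the budget check (tier 3) sits above Ollama (tier 4), so an over-budget summary task falls back to the cheap cloud model rather than routing locally.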

Default routing

| Task type | Default model     |
| --------- | ----------------- |
| code      | Claude Sonnet 4.6 |
| research  | Claude Sonnet 4.6 |
| creative  | Claude Opus 4.6   |
| summary   | Claude Haiku 4.5  |
| chat      | Claude Haiku 4.5  |

Configuration

Configure in Settings → Model Router in the dashboard, or directly in ~/.openclaw/openclaw.json:
{
  "modelRouter": {
    "enabled": true,
    "primaryModel": "anthropic/claude-sonnet-4-6",
    "fallbackModel": "anthropic/claude-haiku-4-5",
    "budgetThreshold": 80,
    "selfLearning": true,
    "selfLearningSampleThreshold": 20,
    "ollamaEnabled": false,
    "ollamaBaseUrl": "http://localhost:11434",
    "ollamaModel": "llama3.2",
    "taskTypeOverrides": {
      "code": "anthropic/claude-opus-4-6"
    },
    "lockedTaskTypes": ["code"]
  }
}
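
As a sketch of how such a file might be consumed, the hypothetical loader below merges the modelRouter section of openclaw.json over built-in defaults. The loadRouterConfig function and its merge behavior are assumptions for illustration, not OpenClaw's actual code.

```typescript
// Hypothetical config loader: user settings override built-in defaults.
// The function name and merge strategy are assumptions, not OpenClaw internals.
import { readFileSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

const DEFAULTS = {
  enabled: true,
  primaryModel: "anthropic/claude-sonnet-4-6",
  fallbackModel: "anthropic/claude-haiku-4-5",
  budgetThreshold: 80,
  selfLearning: true,
  selfLearningSampleThreshold: 20,
  taskTypeOverrides: {},
  lockedTaskTypes: [],
  ollamaEnabled: false,
};

function loadRouterConfig(path = join(homedir(), ".openclaw", "openclaw.json")) {
  let user = {};
  try {
    // Only the "modelRouter" section of the file is relevant here.
    user = JSON.parse(readFileSync(path, "utf8")).modelRouter ?? {};
  } catch {
    // Missing or unreadable file: fall back to defaults.
  }
  return { ...DEFAULTS, ...user };
}
```

A shallow merge like this means setting only budgetThreshold in the file leaves every other option at its default.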

Options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| enabled | boolean | true | Enable or disable the router |
| primaryModel | string | anthropic/claude-sonnet-4-6 | Default model for most tasks |
| fallbackModel | string | anthropic/claude-haiku-4-5 | Used when over budget |
| budgetThreshold | number | 80 | Percentage of the monthly budget that triggers the fallback |
| selfLearning | boolean | true | Learn from task outcomes |
| selfLearningSampleThreshold | number | 20 | Minimum samples before self-learning overrides defaults |
| taskTypeOverrides | object | {} | Pin specific models to task types |
| lockedTaskTypes | array | [] | Prevent self-learning from changing these task types |
| ollamaEnabled | boolean | false | Route eligible tasks to local Ollama |
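
The self-learning options can be pictured with a small outcome tracker: record a success or failure per task type and model, and only let a model override the defaults once it clears selfLearningSampleThreshold. This OutcomeTracker class and its success-rate comparison are illustrative assumptions, not the shipped implementation.

```typescript
// Illustrative bookkeeping behind selfLearning / selfLearningSampleThreshold.
// Class and method names are assumptions, not OpenClaw's real internals.
interface Outcome { samples: number; successes: number; }

class OutcomeTracker {
  private stats = new Map<string, Outcome>();

  // Record one task outcome for a (taskType, model) pair.
  record(taskType: string, model: string, success: boolean): void {
    const key = `${taskType}:${model}`;
    const o = this.stats.get(key) ?? { samples: 0, successes: 0 };
    o.samples += 1;
    if (success) o.successes += 1;
    this.stats.set(key, o);
  }

  // Best-performing model for a task type once it clears the sample
  // threshold (default 20); null means the built-in defaults still apply.
  learned(taskType: string, threshold = 20): string | null {
    let best: { model: string; rate: number } | null = null;
    for (const [key, o] of this.stats) {
      const sep = key.indexOf(":");
      if (key.slice(0, sep) !== taskType || o.samples < threshold) continue;
      const rate = o.successes / o.samples;
      if (!best || rate > best.rate) best = { model: key.slice(sep + 1), rate };
    }
    return best?.model ?? null;
  }
}
```

Under this picture, lockedTaskTypes simply means the router never consults learned() for those task types.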