
---
title: Arcee AI
summary: "Arcee AI setup (auth + model selection)"
read_when:
  - "You want to use Arcee AI with OpenClaw"
  - "You need the API key env var or CLI auth choice"
---

# Arcee AI

Arcee AI provides access to the Trinity family of mixture-of-experts models through an OpenAI-compatible API. All Trinity models are Apache 2.0 licensed.

Arcee AI models can be accessed directly via the Arcee platform or through OpenRouter.

| Property | Value |
| --- | --- |
| Provider | `arcee` |
| Auth | `ARCEEAI_API_KEY` (direct) or `OPENROUTER_API_KEY` (via OpenRouter) |
| API | OpenAI-compatible |
| Base URL | `https://api.arcee.ai/api/v1` (direct) or `https://openrouter.ai/api/v1` (OpenRouter) |
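Because the API is OpenAI-compatible, any OpenAI-style client can target the base URL above. A minimal curl sketch of the direct endpoint; note that the unprefixed model id (`trinity-large-thinking` without the `arcee/` prefix) is an assumption about the direct API, not something confirmed by this page:

```bash
# Hedged sketch: send an OpenAI-style chat completions request to the
# direct Arcee endpoint. Assumes ARCEEAI_API_KEY is exported; the
# unprefixed model id is an assumption.
curl -s https://api.arcee.ai/api/v1/chat/completions \
  -H "Authorization: Bearer $ARCEEAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "trinity-large-thinking",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```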

## Getting started

<Steps>
  <Step title="Direct (Arcee API key)">
    Create an API key at [Arcee AI](https://chat.arcee.ai/).

    ```bash
    openclaw onboard --auth-choice arceeai-api-key
    ```

    ```json5
    {
      agents: {
        defaults: {
          model: { primary: "arcee/trinity-large-thinking" },
        },
      },
    }
    ```
  </Step>
  <Step title="Via OpenRouter">
    Create an API key at [OpenRouter](https://openrouter.ai/keys).

    ```bash
    openclaw onboard --auth-choice arceeai-openrouter
    ```

    ```json5
    {
      agents: {
        defaults: {
          model: { primary: "arcee/trinity-large-thinking" },
        },
      },
    }
    ```

    The same model refs work for both direct and OpenRouter setups (for example `arcee/trinity-large-thinking`).
  </Step>
</Steps>

## Non-interactive setup

Direct:

```bash
openclaw onboard --non-interactive \
  --mode local \
  --auth-choice arceeai-api-key \
  --arceeai-api-key "$ARCEEAI_API_KEY"
```

Via OpenRouter:

```bash
openclaw onboard --non-interactive \
  --mode local \
  --auth-choice arceeai-openrouter \
  --openrouter-api-key "$OPENROUTER_API_KEY"
```

## Built-in catalog

OpenClaw currently ships this bundled Arcee catalog:

| Model ref | Name | Input | Context | Cost (in/out per 1M) | Notes |
| --- | --- | --- | --- | --- | --- |
| `arcee/trinity-large-thinking` | Trinity Large Thinking | text | 256K | $0.25 / $0.90 | Default model; reasoning enabled |
| `arcee/trinity-large-preview` | Trinity Large Preview | text | 128K | $0.25 / $1.00 | General-purpose; 400B params, 13B active |
| `arcee/trinity-mini` | Trinity Mini | text | 128K | $0.045 / $0.15 | Fast and cost-efficient; 26B params; function calling |

The onboarding preset sets `arcee/trinity-large-thinking` as the default model.
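To use a different model from the catalog as the default, point `agents.defaults.model.primary` at its model ref, using the same config shape as the onboarding snippet above. For example, a sketch selecting the cheaper Trinity Mini:

```json5
{
  agents: {
    defaults: {
      model: { primary: "arcee/trinity-mini" },
    },
  },
}
```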

## Supported features

| Feature | Supported |
| --- | --- |
| Streaming | Yes |
| Tool use / function calling | Yes |
| Structured output (JSON mode and JSON schema) | Yes |
| Extended thinking | Yes (Trinity Large Thinking) |
If the Gateway runs as a daemon (launchd/systemd), make sure `ARCEEAI_API_KEY` (or `OPENROUTER_API_KEY`) is available to that process (for example, in `~/.openclaw/.env` or via `env.shellEnv`).

When using Arcee models via OpenRouter, the same `arcee/*` model refs apply; OpenClaw handles routing transparently based on your auth choice. See the [OpenRouter provider docs](/providers/openrouter) for OpenRouter-specific configuration details.
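As a concrete sketch of the daemon note above, the key can go in the dotenv file mentioned there (the exact placeholder value below is, of course, hypothetical):

```bash
# ~/.openclaw/.env — read by the Gateway daemon process
ARCEEAI_API_KEY=your-arcee-key-here
# or, when routing through OpenRouter instead:
# OPENROUTER_API_KEY=your-openrouter-key-here
```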