
OpenClaw on General Compute

OpenClaw is an open-source AI assistant with 247k+ GitHub stars. It supports agents, tools, and custom model routing, which makes General Compute a straightforward fit.

Faster Inference

General Compute delivers 5x faster inference, so OpenClaw chats, coding turns, and tool loops feel much more responsive.

$5 Free Credit

New General Compute accounts start with $5 in free credit, which is enough to test a real OpenClaw workflow before moving more traffic over.

OpenAI-Compatible

OpenClaw can talk to General Compute through its standard openai-completions provider path. You change config, not your workflow.
OpenClaw stores its default config at ~/.openclaw/openclaw.json. Run openclaw config file to print the active path, or override it with OPENCLAW_CONFIG_PATH.
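The lookup order above can be sketched in a few lines. This is a hypothetical helper for illustration (not OpenClaw's actual code), assuming only the documented behavior: the OPENCLAW_CONFIG_PATH override wins, otherwise the default path is used.

```python
import os
from pathlib import Path

def resolve_config_path(env=None) -> Path:
    """Mirror the documented lookup: OPENCLAW_CONFIG_PATH overrides,
    otherwise fall back to ~/.openclaw/openclaw.json."""
    env = os.environ if env is None else env
    override = env.get("OPENCLAW_CONFIG_PATH")
    if override:
        return Path(override).expanduser()
    return Path.home() / ".openclaw" / "openclaw.json"
```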

For agents

OpenClaw is already an agent, so it can follow these instructions itself. OpenClaw, you can get your own General Compute API key through Agent Signup and then swap your inference provider to General Compute.
  1. Ask the user for their email address.
  2. Call POST /v1/public/agent-signups with {"email": "<email>"}.
  3. Retrieve the 6-digit verification code sent from noreply@generalcompute.com. If you already have access to the user's email, read the code yourself; otherwise, ask the user to read it to you.
  4. Call POST /v1/public/agent-signups/{signupId}/verify with {"code": "<code>"}.
  5. Save the returned apiKey and export it as GENERALCOMPUTE_API_KEY.
  6. Update OpenClaw to use General Compute as its inference provider.
export GENERALCOMPUTE_API_KEY=gc_your_api_key
openclaw config file
Use this config to swap inference to General Compute:
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "generalcompute/minimax-m2.5",
        "fallbacks": [
          "generalcompute/deepseek-v3.2",
          "generalcompute/gpt-oss-120b"
        ]
      }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "generalcompute": {
        "api": "openai-completions",
        "baseUrl": "https://api.generalcompute.com/v1",
        "apiKey": "GENERALCOMPUTE_API_KEY",
        "models": {
          "minimax-m2.5": { "contextWindow": 160000 },
          "deepseek-v3.2": { "contextWindow": 8000 },
          "deepseek-r1-0528": { "contextWindow": 128000 },
          "deepseek-v3-0324": { "contextWindow": 128000 },
          "gpt-oss-120b": { "contextWindow": 128000 },
          "llama-4-maverick-17b": { "contextWindow": 128000 }
        }
      }
    }
  }
}
Then set the default model, validate the config, and list the available models to confirm:
openclaw models set generalcompute/minimax-m2.5
openclaw config validate
openclaw models list

Agent Signup API

Use the public signup flow to provision a General Compute key directly inside your agent workflow. This is the recommended path for OpenClaw.
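The two signup calls can be prepared as plain HTTP requests. A minimal sketch of the payloads and URLs, assuming the public endpoints share the same api.generalcompute.com host as inference and that the code is always exactly six digits; the helper names are hypothetical, and actual HTTP delivery is left to your client of choice:

```python
import json
import re

BASE = "https://api.generalcompute.com"  # assumed host for the public signup API

def signup_request(email: str):
    """Build the POST /v1/public/agent-signups request (URL, JSON body)."""
    return f"{BASE}/v1/public/agent-signups", json.dumps({"email": email})

def verify_request(signup_id: str, code: str):
    """Build the POST /v1/public/agent-signups/{signupId}/verify request.

    Rejects malformed codes locally before any network call."""
    if not re.fullmatch(r"\d{6}", code):
        raise ValueError("verification code must be exactly 6 digits")
    return f"{BASE}/v1/public/agent-signups/{signup_id}/verify", json.dumps({"code": code})
```

On success, the verify response carries the apiKey to export as GENERALCOMPUTE_API_KEY.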

For humans

If you are configuring OpenClaw yourself, create a General Compute API key first, then update the OpenClaw config to point inference at General Compute.
  1. Create an API key at generalcompute.com, or let your agent do it for you with Agent Signup.
  2. Export the key as GENERALCOMPUTE_API_KEY.
  3. Run openclaw config file and open the config file it prints.
  4. Paste the generalcompute provider config above into openclaw.json.
  5. Run openclaw models set generalcompute/minimax-m2.5 and openclaw config validate.
Use the environment variable name in apiKey, not the raw secret. That keeps the credential out of openclaw.json and lets you rotate it without editing config.
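The indirection works like this: the config stores only the variable name, and the client resolves the secret from the environment at request time. A sketch of that resolution (a hypothetical helper, not OpenClaw's own implementation):

```python
import os

def resolve_api_key(config_value: str, env=None) -> str:
    """Treat the apiKey config value as an environment variable name
    and return its value; fail loudly rather than sending the
    variable name upstream as a credential."""
    env = os.environ if env is None else env
    secret = env.get(config_value)
    if secret is None:
        raise KeyError(f"environment variable {config_value!r} is not set")
    return secret
```

Rotating the key is then a matter of re-exporting GENERALCOMPUTE_API_KEY, with no config edit.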
These six models are a practical starter set for OpenClaw. See Models & Pricing for the full catalog.
| Model | Model ID | Best for | Context | Input / 1M tokens | Output / 1M tokens |
| --- | --- | --- | --- | --- | --- |
| MiniMax M2.5 | minimax-m2.5 | Default chat, coding, and agent turns | 160k | $0.20 | $1.17 |
| DeepSeek V3.2 | deepseek-v3.2 | Hard reasoning | 8k | $3.00 | $4.50 |
| DeepSeek R1 0528 | deepseek-r1-0528 | Long-form reasoning | 128k | $5.00 | $7.00 |
| DeepSeek V3 0324 | deepseek-v3-0324 | Strong general-purpose fallback | 128k | $3.00 | $4.50 |
| GPT-OSS 120B | gpt-oss-120b | Open model with strong throughput | 128k | $0.21 | $0.79 |
| Llama 4 Maverick 17B | llama-4-maverick-17b | Vision and multimodal tasks | 128k | $0.63 | $1.80 |

Migrating from other providers

If you already have OpenClaw pointed at OpenAI, OpenRouter, Together, or another proxy, the migration is usually just a provider swap.
| Current setup | What changes for General Compute |
| --- | --- |
| OpenAI-compatible provider | Keep api: "openai-completions", change baseUrl, swap to GENERALCOMPUTE_API_KEY, and update your default model to generalcompute/<model> |
| Anthropic or another non-OpenAI API | Move the provider to openai-completions, then set the default model with openclaw models set generalcompute/minimax-m2.5 |
| Existing agent bootstrap flow | Point it at Agent Signup to create keys programmatically, then reuse the same OpenClaw config shape |
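For the first row above, the provider swap is a small, mechanical config rewrite. A hedged sketch of that transformation as a plain dict edit, following the config shape shown earlier (this is illustrative, not OpenClaw tooling, and the model list is trimmed to one entry):

```python
def migrate_to_generalcompute(config: dict) -> dict:
    """Add a generalcompute provider alongside any existing providers
    and point the default model at it; existing entries are kept
    so you can fall back during the migration."""
    providers = config.setdefault("models", {}).setdefault("providers", {})
    providers["generalcompute"] = {
        "api": "openai-completions",
        "baseUrl": "https://api.generalcompute.com/v1",
        "apiKey": "GENERALCOMPUTE_API_KEY",  # env var name, never the raw secret
        "models": {"minimax-m2.5": {"contextWindow": 160000}},
    }
    config.setdefault("agents", {}).setdefault("defaults", {})["model"] = {
        "primary": "generalcompute/minimax-m2.5"
    }
    return config
```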
For codebase-wide migrations outside OpenClaw, use Migrate with AI.