Migration prompt
Copy and paste this into your AI coding agent:
OpenAI SDK → General Compute
General Compute is a drop-in replacement — keep using the OpenAI SDK, just point it at General Compute.
Migrate this project from OpenAI to General Compute. General Compute is a drop-in replacement — keep using the OpenAI SDK, just point it at General Compute.
1. Change the base URL and API key in the client constructor:
- Node.js: new OpenAI({ baseURL: "https://api.generalcompute.com/v1", apiKey: process.env.GENERALCOMPUTE_API_KEY })
- Python: OpenAI(base_url="https://api.generalcompute.com/v1", api_key=os.getenv("GENERALCOMPUTE_API_KEY"))
2. Update model names:
- gpt-4o, gpt-4, gpt-3.5-turbo → minimax-m2.5 (general purpose) or deepseek-v3.2 (reasoning)
- See all models: https://docs.generalcompute.com/models
3. Update environment variables:
- Add GENERALCOMPUTE_API_KEY to all .env files, CI configs, and deployment settings.
Everything else stays the same — imports, client.chat.completions.create(), streaming, tool calling, JSON mode, and all parameters work identically.
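The model-name change in step 2 can be mechanized with a small lookup. A sketch in plain Python — the helper name and the fallback choice are ours, not part of any SDK; check https://docs.generalcompute.com/models for the full model list:

```python
# Illustrative mapping for step 2: legacy OpenAI model names to
# General Compute models. All three legacy chat models map to the
# general-purpose model; pass reasoning=True to get the reasoning model.
MODEL_MAP = {
    "gpt-4o": "minimax-m2.5",
    "gpt-4": "minimax-m2.5",
    "gpt-3.5-turbo": "minimax-m2.5",
}

def migrate_model(name: str, reasoning: bool = False) -> str:
    """Return the General Compute model to use in place of `name`."""
    if reasoning:
        return "deepseek-v3.2"
    return MODEL_MAP.get(name, "minimax-m2.5")
```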
Docs: https://docs.generalcompute.com/quickstart
Anthropic SDK → General Compute
General Compute has its own SDK that’s a drop-in replacement for OpenAI — swap the Anthropic SDK for ours.
Migrate this project from the Anthropic SDK to the General Compute SDK. General Compute uses an OpenAI-compatible API, so you'll replace the Anthropic SDK with @generalcompute/sdk (Node.js) or generalcompute (Python).
1. Replace the Anthropic SDK with the General Compute SDK:
- Node.js: npm uninstall @anthropic-ai/sdk && npm install @generalcompute/sdk
- Python: pip uninstall anthropic && pip install generalcompute
2. Replace the client constructor:
- Node.js (before): import Anthropic from "@anthropic-ai/sdk"; const client = new Anthropic();
- Node.js (after): import GeneralCompute from "@generalcompute/sdk"; const client = new GeneralCompute();
- Python (before): from anthropic import Anthropic; client = Anthropic()
- Python (after): from generalcompute import GeneralCompute; client = GeneralCompute()
3. Replace all API calls — the method names and response shapes are different:
- client.messages.create() → client.chat.completions.create()
- message.content[0].text → completion.choices[0].message.content
- max_tokens is optional on General Compute (unlike Anthropic, which requires it)
- system parameter → put system message as first item in messages array with role: "system"
- Streaming: client.messages.stream() → client.chat.completions.create({ stream: true }), then iterate chunks with chunk.choices[0].delta.content
- Tool use: Anthropic's tool_use/tool_result blocks → OpenAI-style tools array and tool_calls in responses
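The request-shape changes in step 3 are a mechanical rewrite. A sketch of that transformation in plain Python — the helper is ours, not part of either SDK, and it operates on dicts so it stays SDK-agnostic:

```python
def anthropic_to_openai_kwargs(system=None, messages=(), max_tokens=None, **rest):
    """Rewrite Anthropic-style messages.create() kwargs into OpenAI-style
    chat.completions.create() kwargs, per step 3 above."""
    out_messages = list(messages)
    if system is not None:
        # Anthropic's top-level `system` param becomes the first message.
        out_messages.insert(0, {"role": "system", "content": system})
    out = {"messages": out_messages, **rest}
    if max_tokens is not None:
        # max_tokens is optional on General Compute; keep it only if set.
        out["max_tokens"] = max_tokens
    return out
```

Reading the reply changes the same way: use completion.choices[0].message.content instead of message.content[0].text.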
4. Update model names:
- claude-sonnet-4-20250514, claude-opus-4-20250514, claude-haiku-3-5-20241022 → minimax-m2.5 (general purpose) or deepseek-v3.2 (reasoning)
- See all models: https://docs.generalcompute.com/models
5. Update environment variables:
- Replace ANTHROPIC_API_KEY with GENERALCOMPUTE_API_KEY in all .env files, CI configs, and deployment settings.
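The streaming change from step 3 can be previewed with plain dicts standing in for SDK chunk objects (the real SDK yields typed objects with the same choices[0].delta.content shape; the accumulator function is ours, for illustration):

```python
def collect_stream(chunks):
    """Accumulate assistant text from OpenAI-style streaming chunks,
    as yielded by chat.completions.create(stream=True)."""
    parts = []
    for chunk in chunks:
        # Each chunk carries an incremental delta; `content` may be absent
        # (e.g. in role-announcement or final chunks), so guard for None.
        delta = chunk["choices"][0]["delta"].get("content")
        if delta is not None:
            parts.append(delta)
    return "".join(parts)
```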
Docs: https://docs.generalcompute.com/quickstart
API reference: https://docs.generalcompute.com/api-reference/introduction
Vercel AI SDK → General Compute provider
The Vercel AI SDK has an OpenAI-compatible provider that works with General Compute out of the box.
Migrate this project's Vercel AI SDK setup to use General Compute as the inference provider.
1. Install the provider: npm install @ai-sdk/openai-compatible
2. Replace the model provider. Find where you create your AI model (e.g. openai("gpt-4o") or anthropic("claude-sonnet-4-20250514")) and replace it with:
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";

const generalcompute = createOpenAICompatible({
  name: "generalcompute",
  baseURL: "https://api.generalcompute.com/v1",
  headers: {
    Authorization: `Bearer ${process.env.GENERALCOMPUTE_API_KEY}`,
  },
});
Then use generalcompute("minimax-m2.5") wherever you previously passed a model.
3. Update model names:
- Use minimax-m2.5 for general purpose or deepseek-v3.2 for reasoning
- See all models: https://docs.generalcompute.com/models
4. Update environment variables:
- Add GENERALCOMPUTE_API_KEY to .env.local and deployment settings
- Remove any previous provider API keys that are no longer needed
5. Important if using AI SDK v6 (ai@6 or later):
- The useChat hook comes from @ai-sdk/react (not ai/react) — install it: npm install @ai-sdk/react
- Use toUIMessageStreamResponse() (not toDataStreamResponse()) in API routes
- Messages use a parts array — render with: m.parts.filter(p => p.type === "text").map(p => p.text).join("")
- handleSubmit/handleInputChange/isLoading are removed — use sendMessage(), manage input with useState, check status
Docs: https://docs.generalcompute.com/quickstart#build-a-streaming-chat-with-vercel-ai-sdk
Any OpenAI-compatible client → General Compute
If your project already uses an OpenAI-compatible client (LangChain, LiteLLM, curl, etc.), just change the base URL.
Migrate this project to use General Compute for inference. General Compute is OpenAI-compatible, so any client that works with OpenAI works with General Compute.
1. Change the base URL to: https://api.generalcompute.com/v1
2. Change the API key to use GENERALCOMPUTE_API_KEY instead of whatever provider key is currently set.
3. Update model names:
- Use minimax-m2.5 for general purpose or deepseek-v3.2 for reasoning
- See all models: https://docs.generalcompute.com/models
4. Update environment variables in all .env files, CI configs, and deployment settings.
That's it. All endpoints, parameters, streaming, tool calling, and JSON mode work identically.
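For a client with no SDK at all, the same migration is just a URL, a header, and a model name. A minimal sketch with Python's standard-library urllib — the /chat/completions path is assumed from OpenAI compatibility, and the helper name is ours:

```python
import json
import os
import urllib.request

def build_request(model="minimax-m2.5", messages=()):
    """Build (but do not send) an OpenAI-style chat completion request
    against General Compute. Send it with urllib.request.urlopen()."""
    body = json.dumps({"model": model, "messages": list(messages)}).encode()
    return urllib.request.Request(
        "https://api.generalcompute.com/v1/chat/completions",
        data=body,
        headers={
            # The key comes from GENERALCOMPUTE_API_KEY, per step 2.
            "Authorization": f"Bearer {os.environ.get('GENERALCOMPUTE_API_KEY', '')}",
            "Content-Type": "application/json",
        },
    )
```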
Docs: https://docs.generalcompute.com/quickstart
API reference: https://docs.generalcompute.com/api-reference/introduction
These prompts reference our documentation so your agent can look up model names, capabilities, and code examples if it needs more context.

