

Installation

General Compute offers OpenAI-compatible SDKs for Node.js and Python, providing a drop-in replacement for OpenAI’s SDK.

npm install @generalcompute/sdk
pip install generalcompute

API Key Setup

Create a key from the dashboard and configure your base URL by following the API Keys & Base URLs guide. Once the key is stored (for example, in GENERALCOMPUTE_API_KEY) you can initialize the SDKs as shown below.

Quick Start

Node.js / TypeScript

import GeneralCompute from "@generalcompute/sdk";

const client = new GeneralCompute();

const completion = await client.chat.completions.create({
  model: "minimax-m2.7",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(completion.choices[0].message.content);

Python

from generalcompute import GeneralCompute

client = GeneralCompute()

response = client.chat.completions.create(
    model="minimax-m2.7",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ],
)

print(response.choices[0].message.content)
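Responses follow the OpenAI chat-completion structure, so extracting the reply is just indexing into choices. A minimal sketch of that shape using a plain dict (the payload below is illustrative, not real API output; real responses come back as SDK objects with the same field layout):

```python
# Illustrative response in the OpenAI chat-completion shape.
sample_response = {
    "id": "chatcmpl-123",
    "model": "minimax-m2.7",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello! How can I help?"},
            "finish_reason": "stop",
        }
    ],
}

def extract_reply(response: dict) -> str:
    """Return the assistant text from the first choice."""
    return response["choices"][0]["message"]["content"]

print(extract_reply(sample_response))
```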

Streaming

Both SDKs support streaming responses for real-time output. In Node.js:

import GeneralCompute from "@generalcompute/sdk";

const client = new GeneralCompute();

const stream = await client.chat.completions.create({
  model: "minimax-m2.7",
  messages: [{ role: "user", content: "Write a short poem" }],
  stream: true,
});

for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) {
    process.stdout.write(content);
  }
}
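The accumulation logic is the same in both SDKs: concatenate each chunk's delta text and skip chunks that carry no content (such as the initial role-only delta). Here it is isolated as a pure Python function, exercised with mock chunks whose shape is assumed to match the OpenAI streaming format:

```python
def collect_stream(chunks) -> str:
    """Join the delta content from a sequence of streaming chunks,
    skipping chunks whose delta carries no text (e.g. role-only deltas)."""
    parts = []
    for chunk in chunks:
        choices = chunk.get("choices") or []
        if not choices:
            continue
        content = choices[0].get("delta", {}).get("content")
        if content:
            parts.append(content)
    return "".join(parts)

# Mock chunks in the shape the streaming API emits.
mock_chunks = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Roses are red,"}}]},
    {"choices": [{"delta": {"content": " violets are blue"}}]},
    {"choices": [{"delta": {}}]},  # final chunk with no content
]
print(collect_stream(mock_chunks))
```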

Build a Streaming Chat with Vercel AI SDK

The Vercel AI SDK works seamlessly with General Compute. Here’s a complete Next.js chat app with streaming. Check out the full example project in our examples/vercel-ai-chat directory.

1. Install dependencies

npx create-next-app@latest my-chat-app
cd my-chat-app
npm install ai @ai-sdk/openai-compatible @ai-sdk/react

2. Create the API route

app/api/chat/route.ts
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";
import { streamText } from "ai";

const generalcompute = createOpenAICompatible({
  name: "generalcompute",
  baseURL: "https://api.generalcompute.com/v1",
  headers: {
    Authorization: `Bearer ${process.env.GENERALCOMPUTE_API_KEY}`,
  },
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: generalcompute("minimax-m2.7"),
    // Convert UI messages (parts format) to model messages (content format)
    messages: messages.map((m: { role: string; parts?: { type: string; text: string }[]; content?: string }) => ({
      role: m.role,
      content: m.parts
        ? m.parts.filter((p) => p.type === "text").map((p) => p.text).join("")
        : m.content,
    })),
  });

  return result.toUIMessageStreamResponse();
}
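The conversion inside the route is worth understanding on its own: useChat sends UI messages whose text lives in a parts array, while the model API expects a flat content string. The same transformation, sketched in Python for clarity (message shapes assumed to match the route above):

```python
def to_model_messages(ui_messages):
    """Flatten UI messages (parts format) into model messages (content format).
    Messages that already carry a plain `content` string pass through unchanged."""
    model_messages = []
    for m in ui_messages:
        if "parts" in m:
            # Keep only text parts and join them into one string.
            content = "".join(p["text"] for p in m["parts"] if p["type"] == "text")
        else:
            content = m.get("content", "")
        model_messages.append({"role": m["role"], "content": content})
    return model_messages

ui = [
    {"role": "user", "parts": [{"type": "text", "text": "Hi"}]},
    {"role": "assistant", "content": "Hello!"},
]
print(to_model_messages(ui))
```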

3. Create the chat UI

app/page.tsx
"use client";

import { useChat } from "@ai-sdk/react";
import { useState } from "react";

export default function Chat() {
  const { messages, sendMessage, status } = useChat();
  const [input, setInput] = useState("");

  const isLoading = status === "streaming" || status === "submitted";

  return (
    <div style={{ maxWidth: 600, margin: "0 auto", padding: 20 }}>
      <h1>Chat with General Compute</h1>

      <div style={{ marginBottom: 20 }}>
        {messages.map((m) => (
          <div key={m.id} style={{ marginBottom: 10 }}>
            <strong>{m.role === "user" ? "You" : "AI"}:</strong>
            <p>
              {m.parts
                ?.filter((p) => p.type === "text")
                .map((p) => p.text)
                .join("")}
            </p>
          </div>
        ))}
      </div>

      <form
        onSubmit={(e) => {
          e.preventDefault();
          if (!input.trim()) return;
          sendMessage({ text: input });
          setInput("");
        }}
        style={{ display: "flex", gap: 8 }}
      >
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Say something..."
          style={{ flex: 1, padding: 8 }}
        />
        <button type="submit" disabled={isLoading}>
          Send
        </button>
      </form>
    </div>
  );
}

4. Add your API key

.env.local
GENERALCOMPUTE_API_KEY=gc_your_api_key_here

Run npm run dev and open http://localhost:3000 to try out your streaming chat app.

Build a Streaming Chat with Node.js

No framework needed — here’s a complete streaming chat using just Node.js and the General Compute SDK. Check out the full example project in our examples/node-streaming directory.
index.ts
import GeneralCompute from "@generalcompute/sdk";
import * as readline from "readline";

const client = new GeneralCompute();

const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
});

const messages: { role: "system" | "user" | "assistant"; content: string }[] = [
  { role: "system", content: "You are a helpful assistant." },
];

async function chat(userMessage: string) {
  messages.push({ role: "user", content: userMessage });

  const stream = await client.chat.completions.create({
    model: "minimax-m2.7",
    messages,
    stream: true,
  });

  let assistantMessage = "";

  process.stdout.write("\nAssistant: ");
  for await (const chunk of stream) {
    const content = chunk.choices[0]?.delta?.content;
    if (content) {
      process.stdout.write(content);
      assistantMessage += content;
    }
  }
  console.log("\n");

  messages.push({ role: "assistant", content: assistantMessage });
}

function prompt() {
  rl.question("You: ", async (input) => {
    if (input.toLowerCase() === "exit") {
      rl.close();
      return;
    }
    await chat(input);
    prompt();
  });
}

console.log('Chat with General Compute (type "exit" to quit)\n');
prompt();

Run it with:
npx tsx index.ts
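One caveat: the example appends to messages without bound, so very long sessions will eventually exceed the model's context window. One way to cap it is to keep the system prompt plus only the most recent turns. A sketch of that pattern in Python (the cutoff of 20 is an arbitrary choice, not an SDK limit):

```python
def trim_history(messages, max_turns: int = 20):
    """Keep the leading system message (if any) plus the last `max_turns`
    non-system messages. Returns a new list; the input is not modified."""
    system = [m for m in messages if m["role"] == "system"][:1]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-max_turns:]

# Simulate a long conversation: 1 system message + 30 user/assistant exchanges.
history = [{"role": "system", "content": "You are a helpful assistant."}]
for i in range(30):
    history.append({"role": "user", "content": f"question {i}"})
    history.append({"role": "assistant", "content": f"answer {i}"})

trimmed = trim_history(history, max_turns=4)
print(len(trimmed))  # system message + last 4 messages
```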

Migration from OpenAI

Switching from OpenAI’s SDK to General Compute takes only a few small changes:

Before (OpenAI)

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

const completion = await client.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello!" }],
});

After (GeneralCompute)

import GeneralCompute from "@generalcompute/sdk";

const client = new GeneralCompute({
  apiKey: process.env.GENERALCOMPUTE_API_KEY, // Changed
});

const completion = await client.chat.completions.create({
  model: "minimax-m2.7", // Changed
  messages: [{ role: "user", content: "Hello!" }],
});
What changed:
  1. Import: openai → @generalcompute/sdk (Node) or generalcompute (Python)
  2. API key: OPENAI_API_KEY → GENERALCOMPUTE_API_KEY
  3. Model name: gpt-4 → minimax-m2.7 (or your chosen model)
What stayed the same:
  • Method name: client.chat.completions.create()
  • Parameters: messages, temperature, stream, etc.
  • Response format: Same structure and types
  • Streaming: Same async iteration pattern
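Because only the import, key, and model name change, a codebase can keep provider selection in one place and switch via configuration. A minimal sketch of that pattern (the provider table and demo key below are illustrative, not part of either SDK):

```python
import os

# Illustrative provider table: which env var holds the key and which
# default model to use. Extend as needed.
PROVIDERS = {
    "openai": {"env_var": "OPENAI_API_KEY", "model": "gpt-4"},
    "generalcompute": {"env_var": "GENERALCOMPUTE_API_KEY", "model": "minimax-m2.7"},
}

def provider_config(name: str) -> dict:
    """Resolve a provider's API key (from its env var) and default model."""
    p = PROVIDERS[name]
    return {"api_key": os.environ.get(p["env_var"], ""), "model": p["model"]}

os.environ["GENERALCOMPUTE_API_KEY"] = "gc_example_key"  # demo value only
print(provider_config("generalcompute"))
```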

Next Steps

Models & Pricing

See all available models with pricing and capabilities.

Rate Limits

Understand rate limits and plan quotas.

API Reference

Explore the complete API documentation.

Example Projects

Clone and run complete example projects.
Need help? Contact us at support@generalcompute.com or visit our documentation.