
Installation

General Compute offers OpenAI-compatible SDKs for Node.js and Python, providing a drop-in replacement for OpenAI’s SDK.
npm install @generalcompute/sdk

Quick Start

Node.js / TypeScript

import GeneralCompute from "@generalcompute/sdk";

const client = new GeneralCompute({
  apiKey: process.env.GENERALCOMPUTE_API_KEY,
});

const completion = await client.chat.completions.create({
  model: "llama3.1-8b",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(completion.choices[0].message.content);

Python

import os
from generalcompute import GeneralCompute

client = GeneralCompute(api_key=os.getenv("GENERALCOMPUTE_API_KEY"))

response = client.chat.completions.create(
    model="llama3.1-8b",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ],
    temperature=0.7,
    max_tokens=100
)

print(response.choices[0].message.content)

API Key Setup

You can provide your API key in two ways:
  1. Environment variable (recommended):
export GENERALCOMPUTE_API_KEY=gc_your_api_key_here
  2. Constructor option:
const client = new GeneralCompute({
  apiKey: "gc_your_api_key_here",
});
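For the Python SDK the same two options apply. The sketch below reads the key from the environment and adds a local sanity check on the gc_ prefix shown in these docs; the check is an illustration of defensive setup, not an SDK requirement, and the example key is a placeholder.

```python
import os

def load_api_key() -> str:
    """Read the GeneralCompute API key from the environment.

    Keys in these docs start with gc_; the prefix check below is a
    local sanity check, not something the SDK enforces.
    """
    key = os.getenv("GENERALCOMPUTE_API_KEY", "")
    if not key.startswith("gc_"):
        raise RuntimeError("GENERALCOMPUTE_API_KEY is missing or malformed")
    return key

# Placeholder key for demonstration; use your real key in practice.
os.environ["GENERALCOMPUTE_API_KEY"] = "gc_example_key"
print(load_api_key())  # gc_example_key
```

Failing fast on a missing key gives a clearer error than an authentication failure on the first API call.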

Streaming

Both SDKs support streaming responses for real-time output.
const stream = await client.chat.completions.create({
  model: "llama3.1-8b",
  messages: [{ role: "user", content: "Write a short poem" }],
  stream: true,
});

for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) {
    process.stdout.write(content);
  }
}
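In Python the iteration pattern is the same: pass stream=True and loop over the chunks, skipping empty deltas. Since the loop is the part that trips people up, the sketch below runs it against stubbed chunk objects (SimpleNamespace stand-ins for SDK responses) so the accumulation logic itself is runnable; with the real SDK you would iterate the object returned by client.chat.completions.create(..., stream=True) instead.

```python
from types import SimpleNamespace

def collect_stream(stream):
    """Accumulate the text deltas from a streamed chat completion."""
    parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:  # final chunks may carry no content
            parts.append(delta)
    return "".join(parts)

# Stubbed chunks standing in for SDK stream events, for illustration only.
fake_stream = [
    SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=c))])
    for c in ["Roses ", "are ", "red", None]
]
print(collect_stream(fake_stream))  # Roses are red
```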

Migration from OpenAI

Switching from OpenAI’s SDK to GeneralCompute is simple:

Before (OpenAI)

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

const completion = await client.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello!" }],
});

After (GeneralCompute)

import GeneralCompute from "@generalcompute/sdk";

const client = new GeneralCompute({
  apiKey: process.env.GENERALCOMPUTE_API_KEY, // Changed
});

const completion = await client.chat.completions.create({
  model: "llama-3.3-70b", // Changed
  messages: [{ role: "user", content: "Hello!" }],
});
What changed:
  1. Import: openai → @generalcompute/sdk (Node) or generalcompute (Python)
  2. API key: OPENAI_API_KEY → GENERALCOMPUTE_API_KEY
  3. Model name: gpt-4 → llama-3.3-70b (or your chosen model)
What stayed the same:
  • Method name: client.chat.completions.create()
  • Parameters: messages, temperature, stream, etc.
  • Response format: Same structure and types
  • Streaming: Same async iteration pattern

Model Compatibility

Model ID                       | Model Name           | Notes
llama3.1-8b                    | Llama 3.1 8B         | Fast, cost-effective
llama-3.3-70b                  | Llama 3.3 70B        | High capability, latest generation
qwen-3-32b                     | Qwen 3 32B           | Balanced performance and speed
qwen-3-235b-a22b-instruct-2507 | Qwen 3 235B Instruct | Largest model, highest capability
gpt-oss-120b                   | OpenAI GPT OSS       | Open-source GPT alternative
zai-glm-4.7                    | Z.ai GLM 4.7         | Advanced reasoning capabilities

OpenAI Model Equivalents

OpenAI Model  | Recommended GeneralCompute Model | Notes
gpt-3.5-turbo | llama3.1-8b                      | Fast, cost-effective
gpt-4         | llama-3.3-70b                    | High capability, similar performance
gpt-4-turbo   | qwen-3-235b-a22b-instruct-2507   | Highest capability option
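If you are porting code that selects models by name, the table above can be encoded as a simple lookup. This mapping is illustrative (built from the recommendations above, with a fallback to the original ID for models not listed), not an SDK feature.

```python
# Illustrative mapping based on the equivalents table above.
OPENAI_TO_GC = {
    "gpt-3.5-turbo": "llama3.1-8b",
    "gpt-4": "llama-3.3-70b",
    "gpt-4-turbo": "qwen-3-235b-a22b-instruct-2507",
}

def port_model(openai_model: str) -> str:
    """Return the recommended GeneralCompute model for an OpenAI model ID,
    falling back to the original ID when there is no listed equivalent."""
    return OPENAI_TO_GC.get(openai_model, openai_model)

print(port_model("gpt-4"))  # llama-3.3-70b
```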

Next Steps

API Reference

Explore the complete API documentation.

Error Handling

Learn how to handle errors and rate limits.

Advanced Usage

Custom timeouts, headers, and configurations.

List Models

Discover all available models via the API.
Need help? Contact us at support@generalcompute.com or visit our documentation.