Build a complete streaming chat application using Next.js and the Vercel AI SDK, powered by General Compute’s fast inference. Clone the full example project:
git clone https://github.com/generalcompute/docs
cd docs/examples/vercel-ai-chat
npm install

1. Create a Next.js app and install dependencies

npx create-next-app@latest my-chat-app
cd my-chat-app
npm install ai @ai-sdk/openai-compatible

2. Create the API route

The Vercel AI SDK has an @ai-sdk/openai-compatible provider that works with any OpenAI-compatible API — including General Compute.
app/api/chat/route.ts
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";
import { streamText } from "ai";

const generalcompute = createOpenAICompatible({
  name: "generalcompute",
  baseURL: "https://api.generalcompute.com/v1",
  headers: {
    Authorization: `Bearer ${process.env.GENERALCOMPUTE_API_KEY}`,
  },
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: generalcompute("minimax-m2.5"),
    messages,
  });

  return result.toDataStreamResponse();
}
We’re using minimax-m2.5 here — our best general-purpose model with 160k context at an unbeatable price. Swap it for deepseek-v3.2 if you need stronger reasoning. See all models on the Models & Pricing page.

3. Create the chat UI

The useChat hook from the Vercel AI SDK handles message state, streaming, and form submission.
app/page.tsx
"use client";

import { useChat } from "ai/react";

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } =
    useChat();

  return (
    <div style={{ maxWidth: 600, margin: "0 auto", padding: 20 }}>
      <h1>Chat with General Compute</h1>

      <div style={{ marginBottom: 20 }}>
        {messages.map((m) => (
          <div key={m.id} style={{ marginBottom: 10 }}>
            <strong>{m.role === "user" ? "You" : "AI"}:</strong>
            <p>{m.content}</p>
          </div>
        ))}
      </div>

      <form onSubmit={handleSubmit} style={{ display: "flex", gap: 8 }}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
          style={{ flex: 1, padding: 8 }}
        />
        <button type="submit" disabled={isLoading}>
          Send
        </button>
      </form>
    </div>
  );
}

4. Add your API key

Create a .env.local file in your project root. Next.js loads it automatically, and variables without the NEXT_PUBLIC_ prefix are only available server-side, so your API key never reaches the browser:
.env.local
GENERALCOMPUTE_API_KEY=gc_your_api_key_here
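Before running the app, you can sanity-check the key directly against the API. This assumes the standard OpenAI-compatible /chat/completions path under the baseURL used in the route above; it is a quick check, not part of the app:

```shell
# A valid key returns a JSON chat completion; a bad key returns a 401 error body.
curl https://api.generalcompute.com/v1/chat/completions \
  -H "Authorization: Bearer $GENERALCOMPUTE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "minimax-m2.5", "messages": [{"role": "user", "content": "Hello"}]}'
```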

5. Run it

npm run dev
Open http://localhost:3000 and start chatting. Responses stream in token-by-token.

Next steps

  • Swap minimax-m2.5 for deepseek-v3.2 to try reasoning capabilities
  • Add system prompts by prepending to the messages array in the API route
  • Deploy to Vercel with vercel deploy — just add GENERALCOMPUTE_API_KEY to your environment variables
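For the system-prompt step, one pattern is a small helper in the API route that prepends a system message before the array is passed to streamText (the SDK also accepts a `system` option on streamText directly). The sketch below is illustrative; `withSystemPrompt` and `ChatMessage` are hypothetical names, not part of the SDK:

```typescript
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Prepend a system message unless the client already sent one.
function withSystemPrompt(
  messages: ChatMessage[],
  prompt: string
): ChatMessage[] {
  if (messages[0]?.role === "system") return messages;
  return [{ role: "system", content: prompt }, ...messages];
}

// In the POST handler from step 2, this would replace the bare `messages`:
//   const result = streamText({
//     model: generalcompute("minimax-m2.5"),
//     messages: withSystemPrompt(messages, "You are a concise assistant."),
//   });
```

Keeping the system prompt server-side means clients can't override it by sending their own messages array.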