1. Install dependencies
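The original install command isn't shown here; assuming npm and a Next.js App Router project, the two packages needed are the AI SDK core and the OpenAI-compatible provider:

```shell
npm install ai @ai-sdk/openai-compatible
```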
2. Create the API route
The Vercel AI SDK has an @ai-sdk/openai-compatible provider that works with any OpenAI-compatible API, including General Compute.
app/api/chat/route.ts
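The route code itself isn't reproduced here. A minimal sketch, assuming the AI SDK v4 API (streamText and toDataStreamResponse) and a placeholder base URL; check both against the General Compute dashboard:

```typescript
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { streamText } from 'ai';

// The baseURL below is a placeholder; use the endpoint shown in your
// General Compute dashboard.
const generalCompute = createOpenAICompatible({
  name: 'general-compute',
  baseURL: 'https://api.generalcompute.example/v1',
  apiKey: process.env.GENERALCOMPUTE_API_KEY,
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: generalCompute('minimax-m2.5'),
    messages,
  });

  return result.toDataStreamResponse();
}
```

In AI SDK 5 the response helper was renamed (toUIMessageStreamResponse), so match this to the SDK version you installed.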
We’re using minimax-m2.5 here, our best general-purpose model with 160k context at an unbeatable price. Swap it for deepseek-v3.2 if you need stronger reasoning. See all models on the Models & Pricing page.
3. Create the chat UI
The useChat hook from the Vercel AI SDK handles message state, streaming, and form submission.
app/page.tsx
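The page component isn't reproduced here. A minimal sketch, assuming the v4 useChat hook imported from ai/react (in AI SDK 5 it moved to @ai-sdk/react):

```tsx
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  // useChat defaults to POSTing to /api/chat, matching the route above.
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role}: </strong>
          {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
        />
      </form>
    </div>
  );
}
```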
4. Add your API key
Create a .env.local file in your project root:
.env.local
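The file's contents aren't shown here; it needs one line, using the GENERALCOMPUTE_API_KEY name referenced in the deploy step:

```shell
GENERALCOMPUTE_API_KEY=your-api-key-here
```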
5. Run it
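Assuming the default Next.js scripts, start the dev server and open http://localhost:3000:

```shell
npm run dev
```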
Next steps
- Swap minimax-m2.5 for deepseek-v3.2 to try reasoning capabilities
- Add system prompts by prepending to the messages array in the API route
- Deploy to Vercel with vercel deploy; just add GENERALCOMPUTE_API_KEY to your environment variables
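Prepending a system prompt is just an array spread in the route handler. A self-contained sketch; the Message type and withSystemPrompt helper are illustrative, not part of the SDK:

```typescript
type Message = { role: 'system' | 'user' | 'assistant'; content: string };

// Prepend a system prompt so it always precedes the conversation history.
function withSystemPrompt(messages: Message[], prompt: string): Message[] {
  return [{ role: 'system', content: prompt }, ...messages];
}

const incoming: Message[] = [{ role: 'user', content: 'Hello!' }];
const prepared = withSystemPrompt(incoming, 'You are a concise assistant.');
console.log(prepared.map((m) => m.role).join(',')); // system,user
```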

