

Vercel AI SDK

Last updated April 17, 2026

Build AI UIs with the Vercel AI SDK and Cencori's first-party provider.

Cencori works directly with the Vercel AI SDK. The recommended path is the first-party cencori/vercel provider so you do not have to wire an OpenAI-compatible client manually.

1. Install Dependencies

npm install ai @ai-sdk/react cencori

2. Set Your API Key

# .env.local
CENCORI_API_KEY=csk_live_...
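Next.js loads `.env.local` automatically on the server. If you want a clearer failure than an opaque 401 when the key is missing, a small startup guard can help. The `requireCencoriKey` helper below is illustrative, not part of the cencori package:

```typescript
// lib/env.ts — illustrative helper, not part of the cencori package.
// Throws early with a readable message instead of failing later with a 401.
export function requireCencoriKey(): string {
  const key = process.env.CENCORI_API_KEY;
  if (!key) {
    throw new Error(
      'CENCORI_API_KEY is not set. Add it to .env.local on the server.'
    );
  }
  return key;
}
```

Call it once in your route handler (or at module scope in `lib/cencori.ts`) so misconfiguration surfaces immediately.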

3. Export The Provider

// lib/cencori.ts
import { cencori } from 'cencori/vercel';
 
export { cencori };

4. Stream In Your Route Handler

// app/api/chat/route.ts
import { streamText } from 'ai';
import { cencori } from '@/lib/cencori';
 
export async function POST(req: Request) {
  const { messages, model = 'gpt-4o' } = await req.json();
 
  const result = streamText({
    model: cencori(model),
    messages,
  });
 
  return result.toDataStreamResponse();
}
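The handler above trusts the request body as-is. If the endpoint is publicly reachable, you may want to validate the payload before handing it to `streamText`. A minimal sketch follows; the `parseChatRequest` name and the allow-listed models are assumptions, so adjust them to whatever your Cencori project enables:

```typescript
// Illustrative validation helper; the allowed-model list is an assumption.
const ALLOWED_MODELS = new Set(['gpt-4o', 'gpt-4o-mini']);

export function parseChatRequest(body: unknown): {
  messages: unknown[];
  model: string;
} {
  if (typeof body !== 'object' || body === null) {
    throw new Error('Request body must be a JSON object');
  }
  const { messages, model = 'gpt-4o' } = body as {
    messages?: unknown;
    model?: string;
  };
  if (!Array.isArray(messages)) {
    throw new Error('`messages` must be an array');
  }
  if (!ALLOWED_MODELS.has(model)) {
    throw new Error(`Model not allowed: ${model}`);
  }
  return { messages, model };
}
```

In the route handler you would replace the bare destructuring with `const { messages, model } = parseChatRequest(await req.json());` and return a 400 response when it throws.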

5. Use useChat() In The Client

'use client';
 
import { useChat } from '@ai-sdk/react';
 
export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
    body: { model: 'gpt-4o' },
  });
 
  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>{message.content}</div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}

Alternative: OpenAI-Compatible Mode

If you must use @ai-sdk/openai, point it to Cencori's OpenAI-compatible base URL:

import { createOpenAI } from '@ai-sdk/openai';
 
export const cencori = createOpenAI({
  baseURL: 'https://api.cencori.com/v1',
  apiKey: process.env.CENCORI_API_KEY,
});

Use this mode only when a project already depends on the OpenAI provider abstraction. For new Cencori integrations, prefer cencori/vercel.

Security

Keep CENCORI_API_KEY on the server. Do not expose it through NEXT_PUBLIC_* environment variables or browser-side fetch calls.
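If you want this rule enforced in code rather than by convention, a guard like the one below throws whenever the key is read in a browser context. The `getServerOnlyCencoriKey` helper is illustrative and not part of the cencori package:

```typescript
// Illustrative server-only accessor; not part of the cencori package.
export function getServerOnlyCencoriKey(): string {
  // `window` exists only in the browser; reading the key there means it
  // would be shipped to (or leaked into) client-side code.
  if (typeof (globalThis as { window?: unknown }).window !== 'undefined') {
    throw new Error('CENCORI_API_KEY must never be read in the browser');
  }
  return process.env.CENCORI_API_KEY ?? '';
}
```

Routing all key access through one accessor like this also gives you a single place to grep when auditing what can see the credential.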