Vercel AI SDK

Last updated April 17, 2026

Use Cencori's first-party provider with streamText(), generateText(), and useChat().

Cencori ships a first-party provider for the Vercel AI SDK. Use it when you want Vercel's chat and streaming primitives with Cencori's routing, security, and observability behind the model call.

Installation

npm install ai @ai-sdk/react cencori

Environment

# .env.local
CENCORI_API_KEY=csk_live_...

Create a shared provider export for your server code:

// lib/cencori.ts
import { cencori } from 'cencori/vercel';
 
export { cencori };

If you need custom provider settings, use createCencori() instead:

import { createCencori } from 'cencori/vercel';
 
export const cencori = createCencori({
  apiKey: process.env.CENCORI_API_KEY!,
});

Route Handler

// app/api/chat/route.ts
import { streamText } from 'ai';
import { cencori } from '@/lib/cencori';
 
export async function POST(req: Request) {
  const { messages, model = 'gpt-4o' } = await req.json();
 
  const result = streamText({
    model: cencori(model),
    messages,
  });
 
  return result.toDataStreamResponse();
}
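Because the handler reads the model id straight from the request body, it is worth validating it before passing it to cencori(). A minimal sketch, assuming you maintain your own allowlist (the helper and list below are illustrative, not part of the Cencori SDK):

```typescript
// lib/models.ts (hypothetical helper)
const ALLOWED_MODELS = new Set(['gpt-4o', 'claude-sonnet-4.5', 'gemini-2.5-flash']);

export function resolveModel(requested: unknown, fallback = 'gpt-4o'): string {
  // Only accept known model ids; anything else falls back to the default.
  if (typeof requested === 'string' && ALLOWED_MODELS.has(requested)) {
    return requested;
  }
  return fallback;
}
```

In the route handler, replace the destructured default with `const model = resolveModel(body.model);` so an unexpected client value never reaches the provider.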

React Chat UI

// app/page.tsx
'use client';
 
import { useChat } from '@ai-sdk/react';
 
export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
    body: {
      model: 'gpt-4o',
    },
  });
 
  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>
          {message.role}: {message.content}
        </div>
      ))}
 
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}

Text Generation

import { generateText } from 'ai';
import { cencori } from '@/lib/cencori';
 
const result = await generateText({
  model: cencori('gemini-2.5-flash'),
  prompt: 'Write a haiku about AI infrastructure.',
});
 
console.log(result.text);
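generateText() rejects on provider errors, so one-shot calls like this can benefit from a retry with backoff around transient failures. A sketch of such a wrapper; it is a generic helper, not part of the AI SDK or Cencori:

```typescript
// Hypothetical retry helper; wraps any async call, including generateText().
export async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 250,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Exponential backoff between attempts: 250ms, 500ms, 1000ms, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

Usage: `const result = await withRetry(() => generateText({ model: cencori('gemini-2.5-flash'), prompt }));`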

Switching Models

await generateText({
  model: cencori('gpt-4o'),
  prompt: 'Hello',
});
 
await generateText({
  model: cencori('claude-sonnet-4.5'),
  prompt: 'Hello',
});
 
await generateText({
  model: cencori('gemini-2.5-flash'),
  prompt: 'Hello',
});
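Since the model id is just a string argument to cencori(), routing different workloads to different models reduces to a lookup. A sketch using the ids from this page; the task names and map are illustrative:

```typescript
// Hypothetical task-to-model routing table.
const MODEL_FOR_TASK: Record<string, string> = {
  chat: 'gpt-4o',
  reasoning: 'claude-sonnet-4.5',
  summarize: 'gemini-2.5-flash',
};

export function modelForTask(task: string): string {
  // Unknown tasks fall back to the general-purpose default.
  return MODEL_FOR_TASK[task] ?? 'gpt-4o';
}
```

Then a call site stays model-agnostic: `generateText({ model: cencori(modelForTask('summarize')), prompt })`.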

When To Use The OpenAI-Compatible Base URL

If a library expects an OpenAI-compatible baseURL and apiKey instead of a first-party provider, use:

  • baseURL: https://api.cencori.com/v1
  • apiKey: your Cencori project key (csk_...)

Use cencori/vercel when you are already inside the Vercel AI SDK.
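Those two values can be packaged once and handed to any OpenAI-compatible client. A sketch; the export name is illustrative:

```typescript
// Shared config for OpenAI-compatible clients pointed at Cencori.
export const cencoriOpenAIConfig = {
  baseURL: 'https://api.cencori.com/v1',
  apiKey: process.env.CENCORI_API_KEY ?? '',
};

// For example, with the official openai package:
// import OpenAI from 'openai';
// const client = new OpenAI(cencoriOpenAIConfig);
```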