Vercel AI SDK

Last updated March 3, 2026

Drop-in integration with Vercel AI SDK. Works with streamText(), generateText(), and useChat().

Cencori integrates seamlessly with Vercel AI SDK. Use the same familiar API with added security, multi-provider routing, and observability.

Installation

npm install ai cencori
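The setup step below reads the API key from the environment. Assuming the variable name used in that snippet, a minimal `.env.local` looks like this (keep it out of version control):

```shell
# .env.local
CENCORI_API_KEY=your_cencori_api_key_here
```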

Setup

Create the Cencori provider:

import { createCencoriProvider } from 'cencori/vercel';
 
const cencori = createCencoriProvider({
  apiKey: process.env.CENCORI_API_KEY,
});
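The chat route later on this page imports the provider from `@/lib/cencori`. A minimal sketch of that shared module (the path is an assumption taken from that import):

```typescript
// lib/cencori.ts — one shared provider instance for the whole app,
// using the same createCencoriProvider factory shown above.
import { createCencoriProvider } from 'cencori/vercel';

export const cencori = createCencoriProvider({
  apiKey: process.env.CENCORI_API_KEY,
});
```

Creating the provider once and importing it everywhere keeps configuration in one place.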

Text Generation

import { generateText } from 'ai';
 
const result = await generateText({
  model: cencori('gpt-4o'),
  prompt: 'Write a haiku about AI',
});
 
console.log(result.text);

Streaming

import { streamText } from 'ai';
 
const result = await streamText({
  model: cencori('claude-opus-4'),
  prompt: 'Explain quantum computing',
});
 
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
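If you also need the complete response after streaming, the AI SDK's stream result exposes the assembled text as a promise. A sketch, assuming Cencori's result matches the upstream streamText shape:

```typescript
import { streamText } from 'ai';

const result = await streamText({
  model: cencori('claude-opus-4'),
  prompt: 'Explain quantum computing',
});

// Stream chunks to stdout as they arrive...
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}

// ...then read the full assembled response once the stream finishes.
const fullText = await result.text;
console.log(`\n\nResponse length: ${fullText.length} characters`);
```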

Chat (React)

// app/api/chat/route.ts
import { streamText } from 'ai';
import { cencori } from '@/lib/cencori';
 
export async function POST(req: Request) {
  const { messages } = await req.json();
 
  const result = await streamText({
    model: cencori('gpt-4o'),
    messages,
  });
 
  return result.toDataStreamResponse();
}
// app/page.tsx
'use client';
import { useChat } from 'ai/react';
 
export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();
 
  return (
    <div>
      {messages.map(m => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}

Switching Providers

Cencori's key advantage is easy provider switching: change the model string, and the request is routed to the matching provider — no other code changes needed.

// Use OpenAI
const openai = await generateText({
  model: cencori('gpt-4o'),
  prompt: 'Hello',
});
 
// Use Anthropic
const anthropic = await generateText({
  model: cencori('claude-opus-4'),
  prompt: 'Hello',
});
 
// Use Google
const google = await generateText({
  model: cencori('gemini-2.5-flash'),
  prompt: 'Hello',
});
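Because every provider goes through the same generateText call, a simple cross-provider fallback is just a try/catch. A sketch, assuming the provider instance from Setup is in scope and that failed requests throw (the error shape is not specified here):

```typescript
import { generateText } from 'ai';

// Try OpenAI first, then fall back to Anthropic on failure.
// Model ids are the ones used throughout this page.
async function generateWithFallback(prompt: string) {
  try {
    return await generateText({ model: cencori('gpt-4o'), prompt });
  } catch (err) {
    console.warn('gpt-4o failed, retrying with claude-opus-4:', err);
    return await generateText({ model: cencori('claude-opus-4'), prompt });
  }
}

const result = await generateWithFallback('Hello');
console.log(result.text);
```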

All requests are logged, secured, and tracked automatically.