Vercel AI SDK
Last updated March 3, 2026
Drop-in integration with Vercel AI SDK. Works with streamText(), generateText(), and useChat().
Cencori integrates seamlessly with Vercel AI SDK. Use the same familiar API with added security, multi-provider routing, and observability.
Installation
```bash
npm install ai cencori
```
Setup
Create the Cencori provider:
```ts
import { createCencoriProvider } from 'cencori/vercel';

const cencori = createCencoriProvider({
  apiKey: process.env.CENCORI_API_KEY,
});
```
Text Generation
```ts
import { generateText } from 'ai';

const result = await generateText({
  model: cencori('gpt-4o'),
  prompt: 'Write a haiku about AI',
});

console.log(result.text);
```
Streaming
```ts
import { streamText } from 'ai';

const result = await streamText({
  model: cencori('claude-opus-4'),
  prompt: 'Explain quantum computing',
});

for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
```
Chat (React)
```ts
// app/api/chat/route.ts
import { streamText } from 'ai';
import { cencori } from '@/lib/cencori';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: cencori('gpt-4o'),
    messages,
  });

  return result.toDataStreamResponse();
}
```

```tsx
// app/page.tsx
'use client';
import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map(m => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}
```
Switching Providers
The power of Cencori lies in easy provider switching: just change the model string.
```ts
// Use OpenAI
const openai = await generateText({
  model: cencori('gpt-4o'),
  prompt: 'Hello',
});

// Use Anthropic
const anthropic = await generateText({
  model: cencori('claude-opus-4'),
  prompt: 'Hello',
});

// Use Google
const google = await generateText({
  model: cencori('gemini-2.5-flash'),
  prompt: 'Hello',
});
```
All requests are logged, secured, and tracked automatically.
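Because every provider is reached through the same `cencori()` factory, you can also chain models as fallbacks: if one provider errors, retry the same request against the next. A minimal sketch — the `withFallback` helper below is illustrative, not part of Cencori's API:

```typescript
// Try each attempt in order, returning the first result that succeeds.
// Generic over the return type, so it works with generateText, streamText, etc.
async function withFallback<T>(attempts: Array<() => Promise<T>>): Promise<T> {
  let lastError: unknown;
  for (const attempt of attempts) {
    try {
      return await attempt();
    } catch (err) {
      lastError = err; // remember the failure, move on to the next provider
    }
  }
  throw lastError; // every attempt failed; surface the last error
}

// Usage with the Vercel AI SDK (assumes `cencori` from the Setup section):
// const result = await withFallback([
//   () => generateText({ model: cencori('gpt-4o'), prompt: 'Hello' }),
//   () => generateText({ model: cencori('claude-opus-4'), prompt: 'Hello' }),
// ]);
```

Each attempt is a thunk rather than an eager promise, so later providers are only called if the earlier ones actually fail.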