Guides
Migrating from Anthropic
Last updated March 3, 2026
Step-by-step guide to migrate from Anthropic SDK to Cencori. Keep Claude while gaining security, logging, and multi-provider flexibility.
Why Migrate to Cencori?
If you're already using Claude, switching to Cencori adds powerful infrastructure without sacrificing model quality:
- Keep using Claude: Same models, same quality
- Add OpenAI & Gemini: Switch providers without code changes
- Built-in Security: Automatic PII and prompt injection detection
- Cost Tracking: See exact costs per request
- Complete Logging: Audit trail for compliance
Code Comparison
Before (Anthropic SDK)
```typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});

const response = await anthropic.messages.create({
  model: 'claude-3-opus-20240229',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello!' }],
});
```

After (Cencori SDK)
```typescript
import { Cencori } from 'cencori';

const cencori = new Cencori({
  apiKey: process.env.CENCORI_API_KEY,
});

const response = await cencori.ai.chat({
  model: 'claude-3-opus',
  maxTokens: 1024,
  messages: [{ role: 'user', content: 'Hello!' }],
});
```

Migration Steps
Step 1: Install Cencori SDK
```shell
npm uninstall @anthropic-ai/sdk
npm install cencori
```

Step 2: Get Cencori API Key
- Sign up at the Cencori dashboard
- Create a project
- Generate an API key
Step 3: Add Your Anthropic Key to Cencori
- In Cencori dashboard, go to Project Settings
- Navigate to "Provider Keys"
- Add your Anthropic API key and Save
[!NOTE] Cencori uses your Anthropic key to make requests on your behalf. You keep full control over your provider accounts.
Step 4: Update Your Code
Old:

```typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});
```

New:
```typescript
import { Cencori } from 'cencori';

const cencori = new Cencori({
  apiKey: process.env.CENCORI_API_KEY,
});
```

API Mapping Reference
| Anthropic SDK | Cencori SDK |
|---|---|
| anthropic.messages.create() | cencori.ai.chat() |
| claude-3-opus-20240229 | claude-3-opus |
| max_tokens | maxTokens |
| stream: true | cencori.ai.chatStream() |
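The mapping above can be sketched as a small conversion helper. This is an illustrative sketch, not part of either SDK: the function name `toCencoriParams` and the alias table are assumptions for demonstration.

```typescript
// Illustrative sketch: convert an Anthropic messages.create() request
// into the equivalent cencori.ai.chat() shape.
type AnthropicParams = {
  model: string;
  max_tokens: number;
  messages: { role: string; content: string }[];
  system?: string;
};

type CencoriParams = {
  model: string;
  maxTokens: number;
  messages: { role: string; content: string }[];
};

// Dated Anthropic model IDs map to undated aliases (assumed alias table).
const MODEL_ALIASES: Record<string, string> = {
  'claude-3-opus-20240229': 'claude-3-opus',
};

function toCencoriParams(params: AnthropicParams): CencoriParams {
  // A separate system parameter becomes a leading system message.
  const messages = params.system
    ? [{ role: 'system', content: params.system }, ...params.messages]
    : params.messages;
  return {
    model: MODEL_ALIASES[params.model] ?? params.model,
    maxTokens: params.max_tokens, // snake_case -> camelCase
    messages,
  };
}
```

For example, `toCencoriParams({ model: 'claude-3-opus-20240229', max_tokens: 1024, messages: [...] })` yields `{ model: 'claude-3-opus', maxTokens: 1024, messages: [...] }`.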
System Message Handling
The Anthropic SDK takes a separate system parameter. Cencori handles this automatically via the messages array:
Anthropic Native:

```typescript
await anthropic.messages.create({
  model: 'claude-3-opus-20240229',
  system: 'You are helpful',
  messages: [{ role: 'user', content: 'Hello' }],
});
```

Cencori (standard format):
```typescript
await cencori.ai.chat({
  model: 'claude-3-opus',
  messages: [
    { role: 'system', content: 'You are helpful' },
    { role: 'user', content: 'Hello' }
  ],
});
```

[!TIP] Cencori automatically converts system messages to Anthropic's native format when routing.
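The conversion described above can be sketched as follows. The helper name `toAnthropicFormat` is illustrative, not Cencori's actual routing code, and the join behavior for multiple system messages is an assumption.

```typescript
// Illustrative sketch: pull system messages out of a unified messages
// array into Anthropic's separate `system` parameter.
type Message = { role: 'system' | 'user' | 'assistant'; content: string };

function toAnthropicFormat(messages: Message[]): {
  system?: string;
  messages: Message[];
} {
  const systemParts = messages
    .filter((m) => m.role === 'system')
    .map((m) => m.content);
  return {
    // Anthropic takes a single string; join if there are several (assumed).
    system: systemParts.length > 0 ? systemParts.join('\n') : undefined,
    messages: messages.filter((m) => m.role !== 'system'),
  };
}
```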
Migrating Streaming Code
Anthropic Streaming
```typescript
const stream = await anthropic.messages.create({
  model: 'claude-3-opus-20240229',
  messages: messages,
  stream: true,
});

for await (const event of stream) {
  if (event.type === 'content_block_delta') {
    process.stdout.write(event.delta.text || '');
  }
}
```

Cencori Streaming
```typescript
const stream = cencori.ai.chatStream({
  model: 'claude-3-opus',
  messages: messages,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.delta);
  if (chunk.finish_reason) {
    console.log('\nDone!');
  }
}
```

Bonus: Multi-Provider Freedom
Now that you're on Cencori, switching to other providers is trivial:
```typescript
// Use Claude (same as before)
const claudeResponse = await cencori.ai.chat({
  model: 'claude-3-opus',
  messages: messages,
});

// Try GPT-4o (just change the model!)
const gptResponse = await cencori.ai.chat({
  model: 'gpt-4o',
  messages: messages,
});

// Or Gemini 2.5 Flash
const geminiResponse = await cencori.ai.chat({
  model: 'gemini-2.5-flash',
  messages: messages,
});
```