Getting Started

Introduction

Last updated March 3, 2026

Cencori is the infrastructure layer for AI production. We provide security, multi-provider routing, and observability so you can build resilient AI applications.

Cencori is a Cloud Intelligence Provider (CIP): a unified infrastructure layer for the AI-first world. It bridges the gap between building a demo and shipping a production-ready application that is secure, observable, and multi-cloud by default.

The Production Gap

Most AI applications break when they hit production. They fail because of:

  • Security: Prompt injection, PII leakage, and jailbreaks are trivial to exploit without a proxy layer.
  • Reliability: Relying on a single provider (like OpenAI) creates a single point of failure.
  • Visibility: Tracking costs and latency across multiple models is complex.

Cencori solves this by sitting between your application and AI providers, providing a unified API with built-in security and routing.


Before you begin

To get the most out of Cencori, we recommend having the following ready:

  • A Cencori account to access your project dashboard.
  • An API key for at least one AI provider (OpenAI, Anthropic, or Google).
  • A basic understanding of AI Gateway concepts.

Core Primitives

Cencori is built on five fundamental primitives that handle the heavy lifting of AI infrastructure:

  1. AI Gateway: A single, secure endpoint for all your model routing.
  2. Compute: Secure, ephemeral execution for AI agents and logic.
  3. Workflow: Visual orchestration for multi-step AI pipelines.
  4. Data Storage: AI-native storage for context, vector sync, and integrity.
  5. Integration: Pre-built connectors to external tools and databases.

SDKs & Integrations

Cencori is designed to work where your code already lives. We provide three primary integration paths:

  • Official SDKs: Dedicated, feature-rich libraries for TypeScript, Python, and Go.
  • Framework Adapters: Deep integrations for Vercel AI SDK, TanStack AI, and LangChain.
  • Universal Proxy: Use your existing OpenAI or Anthropic SDKs by simply changing the base_url. We are 100% compatible with the native ecosystem.
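The Universal Proxy path above needs only a base URL swap. The sketch below builds an OpenAI-compatible chat request with the built-in fetch API (Node 18+); the gateway URL and endpoint path are illustrative assumptions, not documented Cencori values.

```typescript
// Hypothetical gateway URL -- an assumption for illustration, not a
// documented Cencori endpoint. With the official OpenAI SDK you would
// pass it as `baseURL` when constructing the client.
const CENCORI_BASE_URL = 'https://api.cencori.com/v1';

// Standard OpenAI-compatible chat payload; nothing Cencori-specific here.
const body = {
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello via the universal proxy' }],
};

// Equivalent raw request using fetch, showing that only the destination
// changes while the request shape stays identical to the native API.
async function chatViaProxy(apiKey: string): Promise<unknown> {
  const res = await fetch(`${CENCORI_BASE_URL}/chat/completions`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Gateway error: ${res.status}`);
  return res.json();
}
```

Because the payload matches the OpenAI chat schema, existing SDKs keep working unchanged; only the destination moves behind the gateway.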

Quick Example

Making your first request with the Cencori SDK is simple. We normalize the response format across all 14+ supported providers.

import { Cencori } from 'cencori';
 
// Initialize with your Cencori Project Key
const cencori = new Cencori({ apiKey: 'csk_live_...' });
 
const response = await cencori.ai.chat({
  model: 'gpt-4o', // Or 'claude-3-5-sonnet', 'gemini-1.5-pro', etc.
  messages: [
    { role: 'user', content: 'What is a Cloud Intelligence Provider?' }
  ]
});
 
console.log(response.content);

Next Steps

Ready to dive in? Here is the best way to navigate our documentation:

Goal                  Path
Get running fast      Quick Start
Secure your app       Cencori Scan
Route to providers    AI Gateway
Manage context        AI Memory