Custom Providers

Connect Cencori to your own AI models or proxy services while maintaining full security and observability.

What are Custom Providers?

Custom providers allow you to add your own AI model endpoints to Cencori. This is useful if you:

  • Self-host open-source models (Llama, Mistral)
  • Use Azure OpenAI Service with your own deployment
  • Have a custom AI gateway or proxy
  • Want unified observability across all AI vendors

Supported API Formats

Cencori supports providers that use:

  • OpenAI API format - Most common (vLLM, Ollama, Azure OpenAI)
  • Anthropic API format - Claude-compatible endpoints

Adding a Custom Provider

Via Dashboard:

  1. Go to your organization settings
  2. Click "Custom Providers" in the sidebar
  3. Click "Add Custom Provider"
  4. Fill in the provider details:

Name: Friendly name (e.g., "Azure GPT-4")

Base URL: Your endpoint (e.g., https://your-api.openai.azure.com)

API Key: Your provider's API key

Format: OpenAI or Anthropic

  5. Click "Test Connection" to verify
  6. Click "Add Provider" to save

Example: Azure OpenAI Service

In the Azure portal, copy your resource's endpoint and key. Use the endpoint as the Base URL, the key as the API Key, and select the OpenAI format. For the model, use your deployment name rather than the underlying model name.
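As a sketch, the form values for an Azure deployment might look like this (the resource URL and key handling below are illustrative placeholders, not required values):

```typescript
// Illustrative values only: substitute your own Azure resource details.
const azureProvider = {
  name: "Azure GPT-4",                               // friendly display name
  baseUrl: "https://your-api.openai.azure.com",      // Azure OpenAI endpoint
  apiKey: process.env.AZURE_OPENAI_API_KEY ?? "",    // keep keys out of source
  format: "openai" as const,                         // Azure uses the OpenAI format
};

console.log(azureProvider.format); // "openai"
```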

After adding the provider, reference it in your code the same way as any built-in provider.
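A minimal usage sketch. It assumes custom-provider models are referenced as `<provider name>/<deployment>`; that naming scheme is an assumption here, so check your dashboard for the exact form:

```typescript
// Hypothetical request shape for a chat call through a custom provider.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatBody(providerName: string, deployment: string, messages: ChatMessage[]) {
  // For Azure, `deployment` is the deployment name, not the model name.
  return { model: `${providerName}/${deployment}`, messages };
}

const body = buildChatBody("azure-gpt4", "my-gpt4-deployment", [
  { role: "user", content: "Hello from a custom provider" },
]);
console.log(body.model); // "azure-gpt4/my-gpt4-deployment"
```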

Example: Self-Hosted Model (vLLM/Ollama)

Start your model server with its OpenAI-compatible API enabled (vLLM serves one by default; Ollama exposes one at /v1), add its publicly reachable URL as the Base URL, and select the OpenAI format.
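For a local Ollama server, the provider values and a request body might look like this sketch (provider and model names are placeholders; /v1 is Ollama's OpenAI-compatible path):

```typescript
// Illustrative self-hosted setup. In production, use a publicly reachable
// URL so Cencori can proxy requests to it.
const selfHostedProvider = {
  name: "Local Llama",
  baseUrl: "http://localhost:11434/v1", // vLLM's default is http://localhost:8000/v1
  format: "openai" as const,
};

// The model name must match what the server actually serves,
// e.g. `ollama list` shows the models available locally.
const request = {
  model: "llama3",
  messages: [{ role: "user" as const, content: "Hello" }],
};

console.log(`${selfHostedProvider.baseUrl}/chat/completions`);
```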

Security: API Key Storage

When you add a custom provider, Cencori securely stores your API key:

  • Encrypted at rest in our database
  • Never exposed in logs or responses
  • Only used to proxy requests to your endpoint
  • Can be rotated or removed at any time

Benefits of Custom Providers

Unified Observability

View usage, costs, and security incidents for all providers in one dashboard, whether it's OpenAI, Anthropic, or your self-hosted model.

Security Everywhere

Cencori's PII detection, prompt injection protection, and content filtering work with custom providers too.

Cost Tracking

Track token usage and compute costs across all providers, including your own infrastructure.

Seamless Migration

Switch between providers by changing only the model name; the rest of your code stays the same.
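If models are selected by a single string, a migration can be as small as this sketch (the model identifiers are illustrative):

```typescript
// Swapping providers by changing one constant; the call site stays identical
// because both endpoints speak the same chat format.
const MODEL = "gpt-4o"; // hosted model
// const MODEL = "azure-gpt4/my-gpt4-deployment"; // your Azure deployment
// const MODEL = "local-llama/llama3";            // your self-hosted model

function chatBody(userText: string) {
  return { model: MODEL, messages: [{ role: "user", content: userText }] };
}

console.log(chatBody("hi").model); // "gpt-4o"
```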

Troubleshooting

Connection Test Failed

  • Verify the Base URL is correct and accessible
  • Check that the API key is valid
  • Ensure the endpoint supports OpenAI or Anthropic format
  • Check firewall rules allow Cencori's IP addresses
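One quick local check, assuming the endpoint speaks the OpenAI format (which exposes a model listing at /v1/models):

```typescript
// Builds the model-listing URL from a base URL, tolerating a trailing slash.
// Fetching the result (e.g. with curl) should return JSON listing the models
// if the endpoint is reachable and healthy.
function modelsUrl(baseUrl: string): string {
  const base = baseUrl.replace(/\/+$/, ""); // strip trailing slashes
  return base.endsWith("/v1") ? `${base}/models` : `${base}/v1/models`;
}

console.log(modelsUrl("http://localhost:11434/v1/")); // http://localhost:11434/v1/models
```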

Model Not Found Error

  • Ensure the model name matches exactly
  • For Azure, use the deployment name, not the model name
  • Check that the model is available on your endpoint