

Wrapped keys are for provider SDK traffic through the CostHawk proxy. If you only remember one thing, remember this:
A wrapped key must be used with a CostHawk proxy URL like https://costhawk.ai/api/proxy/anthropic. It will fail if you send it directly to api.anthropic.com, api.openai.com, or Google’s direct endpoint.

The Easy Rule

  • If you are configuring MCP, OTel, or the CostHawk API, use a CostHawk access token with COSTHAWK_API_KEY.
  • If you are configuring an OpenAI, Anthropic, or Google SDK/client, use a wrapped key and change the base URL to the CostHawk proxy.
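The rule above can be sketched as a small lookup. Only the COSTHAWK_API_KEY name and the proxy URL pattern come from this page; the function itself is illustrative.

```python
# Illustrative sketch of "The Easy Rule". The env var name COSTHAWK_API_KEY and
# the proxy URL pattern are from this page; the task names are made up here.
PROXY_BASE = "https://costhawk.ai/api/proxy/{provider}"

def credential_for(task: str) -> str:
    """Which CostHawk credential a given integration should use."""
    if task in {"mcp", "otel", "costhawk-api"}:
        # CostHawk-facing traffic: access token.
        return "access token via COSTHAWK_API_KEY"
    if task in {"openai", "anthropic", "google"}:
        # Provider SDK traffic: wrapped key plus a repointed base URL.
        return f"wrapped key, base URL {PROXY_BASE.format(provider=task)}"
    raise ValueError(f"unknown task: {task}")

print(credential_for("anthropic"))
# → wrapped key, base URL https://costhawk.ai/api/proxy/anthropic
```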

Two Different CostHawk Credentials

Both credentials may look like ch_sk_.... Do not identify them by prefix alone.
CostHawk access token
  • Created in: Browser login, or Dashboard → Access Setup → MCP + OTel Tokens
  • Used for: MCP, OTel ingest, and CostHawk API auth
  • Send it to: COSTHAWK_API_KEY, or Authorization: Bearer ... on CostHawk API routes
  • Do not send it to: Provider SDKs or direct OpenAI/Anthropic/Google endpoints

Wrapped key
  • Created in: Dashboard → Access Setup → Wrapped Proxy Keys
  • Used for: OpenAI / Anthropic / Google model requests through the CostHawk proxy
  • Send it to: Authorization: Bearer ch_sk_... or the provider SDK api_key field, against https://costhawk.ai/api/proxy/{provider}
  • Do not send it to: Direct provider endpoints
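Since both credentials can look like ch_sk_..., tell them apart by destination, not by prefix. A minimal sketch with the standard library: the proxy base URL comes from this page, but the /v1/messages suffix and the CostHawk API route shown are placeholders, not documented paths.

```python
import urllib.request

# Access token: goes to CostHawk's own API routes.
api_req = urllib.request.Request(
    url="https://costhawk.ai/api/example",  # placeholder CostHawk API route
    headers={"Authorization": "Bearer ch_sk_your_access_token_here"},
)

# Wrapped key: goes only to the CostHawk proxy, never to a provider directly.
proxy_req = urllib.request.Request(
    url="https://costhawk.ai/api/proxy/anthropic/v1/messages",  # path assumed
    headers={"Authorization": "Bearer ch_sk_your_wrapped_key_here"},
    method="POST",
)

print(proxy_req.full_url)
# → https://costhawk.ai/api/proxy/anthropic/v1/messages
```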

How To Use A Wrapped Key

1. Create a wrapped key

Go to Access Setup, then open Wrapped Proxy Keys. When prompted, paste your real provider inference key. CostHawk stores that real key behind the wrapped key.

2. Keep provider and proxy aligned

An Anthropic wrapped key must use the Anthropic proxy. An OpenAI wrapped key must use the OpenAI proxy.

3. Change your SDK base URL

Replace the provider’s base URL with https://costhawk.ai/api/proxy/{provider}.

4. Use the wrapped key as the API key

Keep the wrapped key in the auth header or SDK api_key field. CostHawk resolves it to your real provider key server-side.

Wrapped keys sit in front of real inference-capable provider keys. Admin or read-only provider keys belong in Admin API Sync, not in proxy traffic.
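The steps above can be sketched with the standard library for an OpenAI wrapped key. The proxy base URL and Bearer auth come from this page; the /chat/completions path and the model name are assumptions borrowed from OpenAI's own API shape, not documented proxy paths.

```python
import json
import urllib.request

BASE_URL = "https://costhawk.ai/api/proxy/openai"  # step 3: proxy, not api.openai.com

req = urllib.request.Request(
    url=f"{BASE_URL}/chat/completions",  # path assumed to mirror OpenAI's API
    data=json.dumps({
        "model": "gpt-4o-mini",  # illustrative model name
        "messages": [{"role": "user", "content": "ping"}],
    }).encode(),
    headers={
        "Authorization": "Bearer ch_sk_your_wrapped_key_here",  # step 4: wrapped key
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; CostHawk resolves the wrapped key
# to your real OpenAI key server-side before forwarding.
print(req.full_url)
# → https://costhawk.ai/api/proxy/openai/chat/completions
```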

Wrong vs Right

Wrong

This will fail, because the wrapped key is being sent directly to the provider:
from anthropic import Anthropic

client = Anthropic(
    api_key="ch_sk_your_wrapped_key_here",
    base_url="https://api.anthropic.com",
)

Right

This works, because the base URL points to the CostHawk proxy:
from anthropic import Anthropic

client = Anthropic(
    api_key="ch_sk_your_wrapped_key_here",
    base_url="https://costhawk.ai/api/proxy/anthropic",
)

Proxy Endpoints

OpenAI
  • Base URL: https://costhawk.ai/api/proxy/openai
  • Typical auth: Authorization: Bearer ch_sk_...

Anthropic
  • Base URL: https://costhawk.ai/api/proxy/anthropic
  • Typical auth: Authorization: Bearer ch_sk_...

Google Gemini
  • Base URL: https://costhawk.ai/api/proxy/google
  • Typical auth: Authorization: Bearer ch_sk_... or x-goog-api-key: ch_sk_...
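A sketch of the Gemini header variant from the table above. Only the base URL and the x-goog-api-key header come from this page; the models/...:generateContent path mirrors Google's public API shape and is an assumption about how the proxy forwards paths.

```python
import urllib.request

BASE_URL = "https://costhawk.ai/api/proxy/google"

req = urllib.request.Request(
    # Path assumed to follow Google's own generateContent format.
    url=f"{BASE_URL}/v1beta/models/gemini-1.5-flash:generateContent",
    # Wrapped key goes in the Google-style header, not a real Google key.
    headers={"x-goog-api-key": "ch_sk_your_wrapped_key_here"},
    method="POST",
)
print(req.full_url)
```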

Provider-Specific Setup

OpenAI Proxy

OpenAI SDK examples and endpoint details

Anthropic Proxy

Anthropic SDK examples and endpoint details

Google Gemini Proxy

Google Gemini proxy setup and request format

When Not To Use Wrapped Keys

  • If you want fast org-level visibility with no app code changes, start with Admin API Sync.
  • If you want Claude Code / Codex analytics and assistant-native tools, use MCP.
  • If you need CostHawk API auth, see Authentication.

Wrapped keys make more sense when you understand the architecture around them:

LLM Gateway

Why centralized routing layers exist and what controls they add.

Serverless Inference

The default runtime model you are buying when requests flow through provider APIs.

Max Tokens

The request-level output cap most teams enforce once traffic is routed through a proxy.