CostHawk supports two OpenAI integration modes.
## Mode A: Admin API Sync (Recommended start)

Use an OpenAI organization admin key for low-friction reporting.

- Endpoint coverage:
  - `/v1/organization/usage/completions`
  - `/v1/organization/costs`
- No app code changes required
- Best for quick organization-wide visibility
### Admin API Setup

Use this mode first for reporting coverage.
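As a starting point, the two endpoints above can be polled directly with an admin key. The sketch below is a minimal standard-library example of pulling recent cost buckets; `OPENAI_ADMIN_KEY` is an assumed environment-variable name, and the `start_time` query parameter (Unix seconds) follows the OpenAI Costs API.

```python
import json
import os
import time
import urllib.parse
import urllib.request

ADMIN_API_BASE = "https://api.openai.com/v1/organization"

def fetch_costs(admin_key: str, days: int = 7) -> list:
    """Fetch cost buckets for the last `days` days using an org admin key."""
    query = urllib.parse.urlencode({"start_time": int(time.time()) - days * 86400})
    req = urllib.request.Request(
        f"{ADMIN_API_BASE}/costs?{query}",
        headers={"Authorization": f"Bearer {admin_key}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        # The API returns paginated buckets under "data".
        return json.load(resp).get("data", [])

if __name__ == "__main__":
    for bucket in fetch_costs(os.environ["OPENAI_ADMIN_KEY"]):
        print(bucket)
```

The same pattern applies to `/v1/organization/usage/completions` by swapping the path.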
## Mode B: Proxy + Wrapped Key (Advanced controls)

Use proxy routing when you need strict runtime controls and enforcement.

A wrapped key works only against the CostHawk proxy; never send a `ch_sk_...` key directly to `api.openai.com`. If you are not sure whether you have a wrapped key or a CostHawk access token, read Wrapped Keys first.
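Since sending the wrong credential to the wrong host is the most common setup mistake, a small guard can help. This sketch only checks the `ch_sk_` prefix documented above; the shape of other CostHawk token types is an assumption, so treat it as a sanity check, not validation.

```python
def looks_like_wrapped_key(key: str) -> bool:
    """Wrapped CostHawk keys start with the ch_sk_ prefix.

    A raw OpenAI key (sk-...) or any other token must not be sent to the
    CostHawk proxy as if it were a wrapped key.
    """
    return key.startswith("ch_sk_")

# looks_like_wrapped_key("ch_sk_abc123")  -> True
# looks_like_wrapped_key("sk-abc123")     -> False
```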
### Proxy URL

```
POST https://costhawk.ai/api/proxy/openai
```

This replaces `https://api.openai.com/v1/chat/completions`.
### Python (OpenAI SDK)

```python
from openai import OpenAI

client = OpenAI(
    api_key="ch_sk_your_wrapped_key_here",
    base_url="https://costhawk.ai/api/proxy/openai",
)

response = client.chat.completions.create(
    model="gpt-4.1",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)
```
### Node.js (OpenAI SDK)

```javascript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "ch_sk_your_wrapped_key_here",
  baseURL: "https://costhawk.ai/api/proxy/openai",
});

const response = await client.chat.completions.create({
  model: "gpt-4.1",
  messages: [{ role: "user", content: "Hello, world!" }],
});
console.log(response.choices[0].message.content);
```
### curl

```shell
curl -X POST https://costhawk.ai/api/proxy/openai \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ch_sk_your_wrapped_key_here" \
  -d '{
    "model": "gpt-4.1",
    "messages": [{"role": "user", "content": "Hello, world!"}]
  }'
```
## Recommended Pattern
- Start with Admin API sync for baseline visibility.
- Keep MCP enabled for assistant-native analysis and operations.
- Add proxy routing for workloads that need policy enforcement or hard-stop limits.
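One way to stage that rollout is to make routing a configuration decision rather than a code change. The sketch below builds OpenAI client kwargs from the environment; `COSTHAWK_WRAPPED_KEY` and `OPENAI_API_KEY` are assumed variable names, not part of the CostHawk product.

```python
import os

def client_config() -> dict:
    """Return kwargs for the OpenAI client constructor.

    Route through the CostHawk proxy when a wrapped key is configured;
    otherwise go direct to api.openai.com with a regular key.
    """
    wrapped = os.environ.get("COSTHAWK_WRAPPED_KEY")
    if wrapped:
        return {
            "api_key": wrapped,
            "base_url": "https://costhawk.ai/api/proxy/openai",
        }
    return {"api_key": os.environ["OPENAI_API_KEY"]}

# Usage (per workload, no other code changes):
# client = OpenAI(**client_config())
```

Workloads that need hard-stop limits get the wrapped key in their environment; everything else keeps calling OpenAI directly while the Admin API sync covers reporting.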