
Migrating from OpenAI

If you’re already using the OpenAI API, switching to Gonka Broker takes two changes: the base URL and the API key.

```python
from openai import OpenAI

client = OpenAI(
    api_key="gk-prx-your-managed-key",            # ← your Gonka key
    base_url="https://proxy.gonkabroker.com/v1",  # ← Gonka endpoint
)

# Everything else stays exactly the same
response = client.chat.completions.create(
    model="Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
    messages=[{"role": "user", "content": "Hello!"}],
)
```
```javascript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "gk-prx-your-managed-key",            // ← your Gonka key
  baseURL: "https://proxy.gonkabroker.com/v1",  // ← Gonka endpoint
});

// Everything else stays exactly the same
const response = await client.chat.completions.create({
  model: "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
  messages: [{ role: "user", content: "Hello!" }],
});
```

Many projects read the API key and base URL from environment variables. A clean way to migrate:

`.env`

```
OPENAI_API_KEY=gk-prx-your-managed-key
OPENAI_BASE_URL=https://proxy.gonkabroker.com/v1
```

If your OpenAI client reads these variables automatically (the official Python and JS SDKs do), no code changes are needed at all.
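To illustrate that fallback behavior, here is a minimal sketch that simulates what the SDKs do at construction time (the `os.environ` lookups stand in for the SDK's own resolution; `OpenAI()` with no arguments would resolve the same values):

```python
import os

# Simulate the env-var migration: set the two variables the official
# Python and JS SDKs read automatically when no explicit arguments
# are passed to the client constructor.
os.environ["OPENAI_API_KEY"] = "gk-prx-your-managed-key"
os.environ["OPENAI_BASE_URL"] = "https://proxy.gonkabroker.com/v1"

# An argument-less OpenAI() / new OpenAI({}) would now resolve to:
api_key = os.environ["OPENAI_API_KEY"]
base_url = os.environ["OPENAI_BASE_URL"]

print(base_url)  # → https://proxy.gonkabroker.com/v1
```

With the variables set, `OpenAI()` in Python or `new OpenAI({})` in JavaScript points at Gonka with no code changes.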

Replace your OpenAI model name with a Gonka-supported model. See Supported Models for the full list.

Common mappings:

| Instead of | Try |
| --- | --- |
| `gpt-4o` | `Qwen/Qwen3-235B-A22B-Instruct-2507-FP8` |
| `gpt-3.5-turbo` | Check Supported Models for smaller models |
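If you migrate several call sites at once, a small lookup helper can centralize the swap. This is a hypothetical helper, not part of any SDK; the only mapping shown is the documented `gpt-4o` pairing above — add further entries from the Supported Models list yourself:

```python
# Hypothetical model-name mapping for a migration script. Only the
# gpt-4o entry comes from the documented table; extend it with models
# you verify against the Supported Models list.
MODEL_MAP = {
    "gpt-4o": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
}

def to_gonka_model(openai_model: str) -> str:
    """Return the Gonka replacement, or raise if none is known."""
    try:
        return MODEL_MAP[openai_model]
    except KeyError:
        raise ValueError(
            f"No Gonka mapping for {openai_model!r}; see Supported Models"
        )

print(to_gonka_model("gpt-4o"))  # → Qwen/Qwen3-235B-A22B-Instruct-2507-FP8
```

Failing loudly on unmapped names is deliberate: it surfaces any call site you forgot to migrate instead of silently sending an unsupported model name.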

What about streaming, function calling, etc.?


Managed keys support the same features as the OpenAI Chat Completions API, including streaming. See API Compatibility for details on supported parameters.
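For example, streaming works the usual way: pass `stream=True` and accumulate the `delta.content` of each chunk. In the sketch below, `iter_chunks` is a stub standing in for the iterator that `client.chat.completions.create(..., stream=True)` returns, so the loop runs without a network call:

```python
from types import SimpleNamespace

def iter_chunks():
    # Stub chunks shaped like Chat Completions stream events: each chunk
    # carries an incremental piece of text in choices[0].delta.content.
    for piece in ["Hel", "lo", "!"]:
        yield SimpleNamespace(
            choices=[SimpleNamespace(delta=SimpleNamespace(content=piece))]
        )

reply = ""
for chunk in iter_chunks():  # with the real SDK: for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:  # the final chunk's delta may be empty
        reply += delta

print(reply)  # → Hello!
```

The same loop body works unchanged against a real streamed response from the Gonka endpoint.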