# Migrating from OpenAI
If you’re already using the OpenAI API, switching to Gonka Broker takes two changes: the base URL and the API key.
## Python

```python
from openai import OpenAI

client = OpenAI(
    api_key="gk-prx-your-managed-key",            # ← your Gonka key
    base_url="https://proxy.gonkabroker.com/v1",  # ← Gonka endpoint
)

# Everything else stays exactly the same
response = client.chat.completions.create(
    model="Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
    messages=[{"role": "user", "content": "Hello!"}],
)
```

## JavaScript / TypeScript
```javascript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "gk-prx-your-managed-key",            // ← your Gonka key
  baseURL: "https://proxy.gonkabroker.com/v1",  // ← Gonka endpoint
});

// Everything else stays exactly the same
const response = await client.chat.completions.create({
  model: "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
  messages: [{ role: "user", content: "Hello!" }],
});
```

## Environment variables
Many projects read the API key and base URL from environment variables. A clean way to migrate:

```shell
OPENAI_API_KEY=gk-prx-your-managed-key
OPENAI_BASE_URL=https://proxy.gonkabroker.com/v1
```

If your OpenAI client reads these variables automatically (the official Python and JS SDKs do), no code changes are needed at all.
## Choosing a model

Replace your OpenAI model name with a Gonka-supported model. See Supported Models for the full list.
Common mappings:
| Instead of | Try |
|---|---|
| `gpt-4o` | `Qwen/Qwen3-235B-A22B-Instruct-2507-FP8` |
| `gpt-3.5-turbo` | Check Supported Models for smaller models |
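If you have many call sites, the mapping above can be applied mechanically. This is a hypothetical helper (the dictionary and function names are illustrative, not part of any SDK), seeded with the one mapping the table gives:

```python
# Map OpenAI model names to Gonka-supported equivalents.
# Only gpt-4o has a documented mapping; extend this dict from the
# Supported Models list as needed.
MODEL_MAP = {
    "gpt-4o": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
}

def gonka_model(openai_model: str) -> str:
    """Return the Gonka replacement for an OpenAI model name."""
    try:
        return MODEL_MAP[openai_model]
    except KeyError:
        raise ValueError(
            f"No Gonka mapping for {openai_model!r}; "
            "check the Supported Models list"
        )

print(gonka_model("gpt-4o"))  # → Qwen/Qwen3-235B-A22B-Instruct-2507-FP8
```

Failing loudly on unmapped names is deliberate: silently falling back to an OpenAI model name would produce confusing errors from the Gonka endpoint instead.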
## What about streaming, function calling, etc.?

Managed keys support the same features as the OpenAI Chat Completions API, including streaming. See API Compatibility for details on supported parameters.