
Third-Party Tools

Managed keys are fully OpenAI-compatible, which means they work with any tool or framework that supports the OpenAI API. Here are setup instructions for popular tools.

LangChain:

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    api_key="gk-prx-your-managed-key",
    base_url="https://proxy.gonkabroker.com/v1",
    model="Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
)
response = llm.invoke("What is the capital of France?")
print(response.content)
```
LlamaIndex:

```python
from llama_index.llms.openai import OpenAI

llm = OpenAI(
    api_key="gk-prx-your-managed-key",
    api_base="https://proxy.gonkabroker.com/v1",
    model="Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
)
response = llm.complete("What is the capital of France?")
print(response)
```
To use a managed key in Cursor:

  1. Open Cursor settings
  2. Go to Models > OpenAI API Key
  3. Enter your managed key: gk-prx-your-managed-key
  4. Set the API base URL to: https://proxy.gonkabroker.com/v1
  5. Select a Gonka-supported model from the model list

Add this to your Continue configuration (~/.continue/config.json):

```json
{
  "models": [
    {
      "title": "Gonka Broker",
      "provider": "openai",
      "model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
      "apiKey": "gk-prx-your-managed-key",
      "apiBase": "https://proxy.gonkabroker.com/v1"
    }
  ]
}
```
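If you prefer to script the setup, the same entry can be generated with Python's standard json module. This is a sketch: the dict below simply mirrors the config above, and the commented-out write targets the `~/.continue/config.json` path mentioned earlier.

```python
import json

# The Continue model entry from above, expressed as a Python dict.
config = {
    "models": [
        {
            "title": "Gonka Broker",
            "provider": "openai",
            "model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
            "apiKey": "gk-prx-your-managed-key",
            "apiBase": "https://proxy.gonkabroker.com/v1",
        }
    ]
}

# Uncomment to write the file in place (overwrites any existing config):
# from pathlib import Path
# Path.home().joinpath(".continue", "config.json").write_text(json.dumps(config, indent=2))
print(json.dumps(config, indent=2))
```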

The general pattern for any tool that supports a custom OpenAI endpoint:

  1. Find the setting for API key and enter your managed key
  2. Find the setting for Base URL (sometimes called API base, endpoint, or host) and set it to https://proxy.gonkabroker.com/v1
  3. Select a model from the Supported Models list
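At the HTTP level, the pattern above always reduces to the same two pieces: the managed key goes in an `Authorization: Bearer` header, and the model goes in the JSON body. A minimal sketch with the standard library (the request is built but not sent):

```python
import json
import urllib.request

base_url = "https://proxy.gonkabroker.com/v1"
payload = {
    "model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
    "messages": [{"role": "user", "content": "What is the capital of France?"}],
}

# Every OpenAI-compatible tool ultimately issues a request shaped like this.
req = urllib.request.Request(
    f"{base_url}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer gk-prx-your-managed-key",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would send it; omitted here.
print(req.full_url)
```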

If the tool uses environment variables:

```sh
export OPENAI_API_KEY=gk-prx-your-managed-key
export OPENAI_BASE_URL=https://proxy.gonkabroker.com/v1
```
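These are the two standard variables most OpenAI-compatible tools read at startup. A minimal sketch of the lookup, falling back to the placeholder values from this page when the variables are unset (`gonka_settings` is an illustrative helper, not part of any SDK):

```python
import os

def gonka_settings(env=os.environ):
    """Return (api_key, base_url), falling back to the values on this page."""
    return (
        env.get("OPENAI_API_KEY", "gk-prx-your-managed-key"),
        env.get("OPENAI_BASE_URL", "https://proxy.gonkabroker.com/v1"),
    )

api_key, base_url = gonka_settings()
print(api_key, base_url)
```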