# Third-Party Tools
Managed keys are fully OpenAI-compatible, which means they work with any tool or framework that supports the OpenAI API. Here are setup instructions for popular tools.
## LangChain

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    api_key="gk-prx-your-managed-key",
    base_url="https://proxy.gonkabroker.com/v1",
    model="Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
)

response = llm.invoke("What is the capital of France?")
print(response.content)
```
## LlamaIndex

```python
from llama_index.llms.openai import OpenAI

llm = OpenAI(
    api_key="gk-prx-your-managed-key",
    api_base="https://proxy.gonkabroker.com/v1",
    model="Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
)

response = llm.complete("What is the capital of France?")
print(response)
```
## Cursor

1. Open Cursor settings
2. Go to Models > OpenAI API Key
3. Enter your managed key: `gk-prx-your-managed-key`
4. Set the API base URL to `https://proxy.gonkabroker.com/v1`
5. Select a Gonka-supported model from the model list
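Under the hood, Cursor (like any OpenAI-compatible client) combines the base URL with the standard chat-completions path. A minimal standard-library sketch of the request such a client would issue, built here but not sent (the payload fields follow the OpenAI API convention; the model name is taken from the examples above):

```python
import json
import urllib.request

BASE_URL = "https://proxy.gonkabroker.com/v1"
API_KEY = "gk-prx-your-managed-key"

# The chat-completions payload an OpenAI-compatible client sends.
payload = {
    "model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
    "messages": [{"role": "user", "content": "What is the capital of France?"}],
}

# Build the request: base URL + /chat/completions, key as a Bearer token.
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

print(req.full_url)  # https://proxy.gonkabroker.com/v1/chat/completions
```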
## Continue (VS Code extension)

Add this to your Continue configuration (`~/.continue/config.json`):

```json
{
  "models": [
    {
      "title": "Gonka Broker",
      "provider": "openai",
      "model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
      "apiKey": "gk-prx-your-managed-key",
      "apiBase": "https://proxy.gonkabroker.com/v1"
    }
  ]
}
```
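If you manage the Continue config programmatically, the entry above can be merged into an existing `config.json` with a small helper. This is a sketch with a hypothetical function name (`add_gonka_model`), operating on the dict form of the config:

```python
import json

def add_gonka_model(config: dict) -> dict:
    """Merge the Gonka Broker model entry into a Continue config dict.

    Hypothetical helper; the entry mirrors the JSON snippet above.
    """
    entry = {
        "title": "Gonka Broker",
        "provider": "openai",
        "model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
        "apiKey": "gk-prx-your-managed-key",
        "apiBase": "https://proxy.gonkabroker.com/v1",
    }
    models = config.setdefault("models", [])
    # Skip the append if the entry is already present, so the helper
    # can be run repeatedly without duplicating models.
    if not any(m.get("title") == entry["title"] for m in models):
        models.append(entry)
    return config

config = add_gonka_model({})
print(json.dumps(config, indent=2))
```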
## Any OpenAI-compatible tool

The general pattern for any tool that supports a custom OpenAI endpoint:

1. Find the setting for the API key and enter your managed key
2. Find the setting for the base URL (sometimes called API base, endpoint, or host) and set it to `https://proxy.gonkabroker.com/v1`
3. Select a model from the Supported Models list

If the tool uses environment variables:

```shell
OPENAI_API_KEY=gk-prx-your-managed-key
OPENAI_BASE_URL=https://proxy.gonkabroker.com/v1
```
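How a typical tool resolves these variables can be sketched as follows, assuming the common convention (followed by the official OpenAI SDK) of falling back to the default endpoint when `OPENAI_BASE_URL` is unset:

```python
import os

# Simulate the environment the tool would see, using the values
# from this guide.
os.environ["OPENAI_API_KEY"] = "gk-prx-your-managed-key"
os.environ["OPENAI_BASE_URL"] = "https://proxy.gonkabroker.com/v1"

# Typical resolution logic: the key is required, the base URL falls
# back to the official OpenAI endpoint when not set.
api_key = os.environ["OPENAI_API_KEY"]
base_url = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")

print(base_url)  # https://proxy.gonkabroker.com/v1
```

With both variables exported, most OpenAI-compatible tools need no further configuration beyond picking a supported model.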