Managed Keys

Managed keys route your API requests through Gonka Broker’s proxy layer, which translates them into a format compatible with the Gonka decentralized network and returns responses in OpenAI-compatible format.

Your app → Gonka Broker Proxy → Gonka GPU Network → Response
  1. Your application sends a request using the OpenAI API format
  2. Gonka Broker receives it and forwards it to the decentralized GPU network
  3. The inference runs on available GPUs
  4. The response is converted to OpenAI format and returned to your app

From your application’s perspective, this is identical to calling the OpenAI API.

Managed keys support the OpenAI Chat Completions API, including:

  • Chat completions (/v1/chat/completions)
  • Streaming responses (stream: true)
  • System, user, and assistant messages
  • Temperature, top_p, max_tokens, and other standard parameters

See API Compatibility for the full list of supported endpoints and parameters.
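As a sketch of how `stream: true` works: a streaming response arrives as server-sent events, where each `data:` line carries an OpenAI-style chunk and the stream ends with `data: [DONE]`. The minimal parser below follows that convention; the sample lines are illustrative, not a real Gonka response:

```python
import json

def iter_stream_content(lines):
    """Yield content deltas from OpenAI-style SSE lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Illustrative sample of what a stream might look like:
sample = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    'data: [DONE]',
]
print("".join(iter_stream_content(sample)))  # prints "Hello"
```

In practice the OpenAI SDKs handle this parsing for you; the sketch only shows what travels over the wire.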

All managed key requests go to:

https://proxy.gonkabroker.com/v1

Set this as the base_url (Python) or baseURL (JavaScript) in your OpenAI client configuration.
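To make the request shape concrete, here is a minimal sketch using only the Python standard library. The key secret and model name are placeholders (any OpenAI SDK pointed at the same `base_url` would build an equivalent request):

```python
import json
import urllib.request

GONKA_BASE_URL = "https://proxy.gonkabroker.com/v1"
API_KEY = "gk-prx-example"  # placeholder; use your managed key secret

# Standard OpenAI Chat Completions payload
payload = {
    "model": "example-model",  # placeholder model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "temperature": 0.7,
}

req = urllib.request.Request(
    f"{GONKA_BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# response = urllib.request.urlopen(req)  # uncomment to send the request
```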

| Use case | Managed key? |
| --- | --- |
| Replacing OpenAI in existing code | Yes |
| Using LangChain, LlamaIndex, or similar frameworks | Yes |
| Using Cursor, Continue, or other AI-powered dev tools | Yes |
| Building a product with the Gonka SDK | No — use a direct key |

  • Your key secret starts with gk-prx- and should be treated like any API credential
  • Gonka Broker does not log or store your prompts and completions
  • If a key is compromised, rotate it immediately — this invalidates the old secret and issues a new one