Custom proxies
Custom proxies let you route Junie's LLM traffic through a self-hosted or third-party proxy endpoint instead of the default JetBrains AI service. This is useful when your organization runs its own inference gateway, needs to add custom authentication headers, or wants to use a private Ingrazzio-compatible deployment.
About the Ingrazzio proxy
Ingrazzio is JetBrains' internal proxy protocol that Junie uses to communicate with LLM providers. The production Ingrazzio endpoint is https://ingrazzio-cloud-prod.labs.jb.gg.
When Junie connects through an Ingrazzio proxy, it uses the base URL to access several sub-endpoints:
- LLM chat — the base path handles streaming chat completion requests.
- Web search — the `/search` path provides web search capabilities.
- URL extraction — the `/extract` path fetches and extracts content from URLs.
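For example, with a hypothetical base URL of `https://llm-proxy.example.com`, the sub-endpoints would resolve as follows:

```
https://llm-proxy.example.com           → LLM chat (streaming completions)
https://llm-proxy.example.com/search    → web search
https://llm-proxy.example.com/extract   → URL extraction
```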
When using the default JetBrains provider, Junie authenticates automatically via JetBrains Account (JBA). When using a custom proxy, JBA authentication is bypassed — you must supply any required credentials (such as `Authorization: Bearer <token>`) through the proxy's `headers` field.
Configure a proxy
Add a `proxies` array to your `config.json`. Each entry describes a named proxy endpoint:
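A minimal sketch of such an entry. The endpoint URL and token are placeholders, and rendering each header as a `Name: value` string is an assumption:

```json
{
  "proxies": [
    {
      "name": "my-proxy",
      "kind": "Ingrazzio",
      "url": "https://llm-proxy.example.com",
      "headers": ["Authorization: Bearer <token>"]
    }
  ]
}
```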
Proxy fields
| Field | Required | Description |
|---|---|---|
| `name` | Yes | A unique name for this proxy. Used to reference it from the `provider` field. |
| `kind` | No | The proxy protocol type. Defaults to `Ingrazzio`. |
| `url` | Yes | The base URL of the proxy endpoint. |
| `headers` | No | A list of extra HTTP headers to send with every request. Each entry uses the standard `Name: value` HTTP header format. |
Select a proxy as the active provider
To make Junie use a configured proxy, set the `provider` field to the proxy name:
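For example, assuming a proxy named `my-proxy` is defined in the `proxies` array:

```json
{
  "provider": "my-proxy"
}
```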
You can also override the provider at runtime with the `--provider` CLI flag:
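A sketch, assuming the CLI binary is named `junie` and `my-proxy` is a configured proxy:

```shell
junie --provider my-proxy
```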
When a proxy is selected as the provider, Junie does not use JetBrains AI authentication. All authentication must be handled through the headers you configure on the proxy entry.
Supported proxy kinds
The `kind` field determines which protocol Junie uses to communicate with the proxy.
| Kind | Status | Description |
|---|---|---|
| `Ingrazzio` | Supported | Junie's native proxy protocol. Compatible with JetBrains Ingrazzio deployments. This is the default when `kind` is omitted. |
| OpenAI | Planned | OpenAI-compatible API. |
| Anthropic | Planned | Anthropic-compatible API. |
| Gateway | Planned | JetBrains AI Gateway. |
| OpenRouter | Planned | OpenRouter-compatible API. |
Quick setup example
Create a configuration file (for example, `my-config.json`):
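A minimal sketch of such a file. The proxy name, URL, and token are placeholders, and the header string format is an assumption:

```json
{
  "provider": "my-proxy",
  "proxies": [
    {
      "name": "my-proxy",
      "kind": "Ingrazzio",
      "url": "https://llm-proxy.example.com",
      "headers": ["Authorization: Bearer <token>"]
    }
  ]
}
```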
Run Junie with the custom config:
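A sketch, assuming the binary is named `junie` and that a `--config` flag selects the configuration file (both names are assumptions):

```shell
junie --config my-config.json
```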
You can still override the model at runtime with the `--model` flag:
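For example (the `junie` binary name and `--config` flag are assumptions; `<model-id>` is a placeholder):

```shell
junie --config my-config.json --model <model-id>
```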
Merging proxies across configuration files
When multiple configuration files define proxies, they are merged by name:
- If two files define a proxy with the same `name`, the higher-priority file's fields override the lower-priority file's fields on a per-field basis.
- Headers from both files are combined and deduplicated.
- Proxies with different names are all included in the final list.
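The merge rules above can be sketched in Python. This is an illustration of the described semantics, not Junie's actual implementation, and the proxy entries used below are hypothetical:

```python
def merge_proxies(higher: list[dict], lower: list[dict]) -> list[dict]:
    """Merge two proxy lists by name; higher-priority values win per field."""
    by_name = {p["name"]: dict(p) for p in lower}
    for p in higher:
        if p["name"] in by_name:
            merged = by_name[p["name"]]
            # Combine headers from both files, dropping duplicates
            # while preserving order (lower-priority headers first).
            headers = merged.get("headers", []) + p.get("headers", [])
            # Every other field is overridden by the higher-priority file.
            merged.update({k: v for k, v in p.items() if k != "headers"})
            if headers:
                merged["headers"] = list(dict.fromkeys(headers))
        else:
            by_name[p["name"]] = dict(p)
    return list(by_name.values())
```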
For the configuration precedence order, see Configuration files.
Limitations
- Only the `Ingrazzio` proxy kind is currently supported. Other kinds are reserved for future use.
- Proxy configuration is only available through `config.json`. There are no dedicated CLI flags for defining proxy entries.
- When using a proxy provider, JetBrains AI authentication is bypassed entirely. You must supply any required credentials via the `headers` field.