# Model selection
| Environment variable | CLI equivalent | Description |
|---|---|---|
| `JUNIE_MODEL` | `--model` | LLM model to use. If not specified, Junie uses the default model. |
To see all available models, run `junie --help` or use `/model` in interactive mode.
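As a minimal sketch of the two selection mechanisms from the table above (`<model-id>` is a placeholder; run `junie --help` for the real list of IDs):

```shell
# One-off: pass the model on the command line
junie --model <model-id>

# Persistent: set the environment variable for the session
export JUNIE_MODEL=<model-id>
junie
```

Check `junie --help` for the exact precedence when both the flag and the environment variable are set.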
## How Junie uses models internally
The model you select (or the default, if you don't specify one) is your primary model: Junie uses it for the main reasoning and code generation work.
However, Junie also uses a separate model for certain internal tasks, such as:

- Summarizing context
- Classifying and routing tasks
- Extracting information for memory
- Filtering capabilities
- Other helper tasks
These internal tasks don't need the full power of the primary model, so Junie automatically picks an alternative from the same provider. Note that these helper models are not always smaller: routing, for example, may use GPT-4.1 by default. If your primary model is from Anthropic, the helper model is typically Claude Haiku; if it's from Google, it is Gemini Flash.
This applies even when using BYOK. If you bring your own API key, Junie will still use a separate model from the same provider for these internal tasks. This means you may see API calls to a model you didn't explicitly select — that's expected and helps keep costs down and responses fast. For more details on configuring custom LLM providers, see Custom LLM Models.
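As an illustration, the provider-matched fallback described above can be sketched as a simple lookup. The mapping and function below are hypothetical and only reflect the examples given in the text; they are not Junie's actual implementation:

```python
# Hypothetical sketch of provider-matched helper-model selection.
# The entries mirror the examples in the text; Junie's real logic may differ.
HELPER_MODELS = {
    "anthropic": "claude-haiku",   # helper for Anthropic primaries
    "google": "gemini-flash",      # helper for Google primaries
    "openai": "gpt-4.1",           # e.g. used for routing by default
}

def helper_model_for(provider: str) -> str:
    """Pick a helper model from the same provider as the primary model."""
    return HELPER_MODELS[provider.lower()]

print(helper_model_for("Anthropic"))  # claude-haiku
```

The key point is that the helper is chosen per provider, not per primary model, which is why you may see calls to a model you never selected.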
Junie CLI also supports model aliases. Each model alias ID always points to the latest supported version of that model.
| Model ID | Provider | Description |
|---|---|---|
|  | Anthropic | The latest Claude Sonnet model. |
|  | Anthropic | The latest Claude Opus model. |
|  | OpenAI | The latest GPT model. |
|  | OpenAI | The latest GPT Codex model. |
|  | Google | The latest Gemini Pro model. |
|  | Google | The latest Gemini Flash model. |
|  | xAI | The latest Grok model. |
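Alias resolution as described above can be pictured as a one-step lookup from a stable alias to the provider's current model version. All IDs below are invented placeholders for illustration, not real Junie aliases:

```python
# Hypothetical alias table: stable alias -> latest concrete model version.
# Both strings are invented placeholders for illustration.
ALIASES = {
    "example-alias": "example-model-2025-01",
}

def resolve(model_id: str) -> str:
    """Resolve an alias to its latest supported version.

    Concrete model IDs that are not aliases pass through unchanged.
    """
    return ALIASES.get(model_id, model_id)

print(resolve("example-alias"))  # example-model-2025-01
```

Because the alias always points at the latest supported version, pinning an alias (rather than a concrete version) means you pick up provider-side model updates automatically.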