Commit dccdd6e

Numman Ali authored and committed
Add variant config support and modern presets
1 parent a8c1395 commit dccdd6e

17 files changed (+545, -138 lines)

CHANGELOG.md

Lines changed: 4 additions & 4 deletions
@@ -18,7 +18,7 @@ All notable changes to this project are documented here. Dates use the ISO forma
 ### Changed
 - **Prompt selection alignment**: GPT 5.2 general now uses `gpt_5_2_prompt.md` (Codex CLI parity).
 - **Reasoning configuration**: GPT 5.2 Codex supports `xhigh` but does **not** support `"none"`; `"none"` auto-upgrades to `"low"` and `"minimal"` normalizes to `"low"`.
-- **Config presets**: `config/full-opencode.json` now includes 22 pre-configured variants (adds GPT 5.2 Codex).
+- **Config presets**: `config/opencode-legacy.json` includes the 22 pre-configured presets (adds GPT 5.2 Codex); `config/opencode-modern.json` provides the variant-based setup.
 - **Docs**: Updated README/AGENTS/config docs to include GPT 5.2 Codex and new model family behavior.
 
 ## [4.1.1] - 2025-12-17
@@ -161,12 +161,12 @@ This release brings full parity with Codex CLI's prompt engineering:
 
 ## [3.2.0] - 2025-11-14
 ### Added
-- GPT 5.1 model family support: normalization for `gpt-5.1`, `gpt-5.1-codex`, and `gpt-5.1-codex-mini` plus new GPT 5.1-only presets in the canonical `config/full-opencode.json`.
+- GPT 5.1 model family support: normalization for `gpt-5.1`, `gpt-5.1-codex`, and `gpt-5.1-codex-mini` plus new GPT 5.1-only presets in the canonical `config/opencode-legacy.json`.
 - Documentation updates (README, docs, AGENTS) describing the 5.1 families, their reasoning defaults, and how they map to ChatGPT slugs and token limits.
 
 ### Changed
 - Model normalization docs and tests now explicitly cover both 5.0 and 5.1 Codex/general families and the two Codex Mini tiers.
-- The legacy GPT 5.0 full configuration is now published as `config/full-opencode-gpt5.json`; new installs should prefer the 5.1 presets.
+- The legacy GPT 5.0 full configuration is now published separately; new installs should prefer the 5.1 presets in `config/opencode-legacy.json`.
 
 ## [3.1.0] - 2025-11-11
 ### Added
@@ -179,7 +179,7 @@ This release brings full parity with Codex CLI's prompt engineering:
 ## [3.0.0] - 2025-11-04
 ### Added
 - Codex-style usage-limit messaging that mirrors the 5-hour and weekly windows reported by the Codex CLI.
-- Documentation guidance noting that OpenCode's context auto-compaction and usage sidebar require the canonical `config/full-opencode.json`.
+- Documentation guidance noting that OpenCode's context auto-compaction and usage sidebar require the canonical `config/opencode-legacy.json`.
 
 ### Changed
 - Prompt caching now relies solely on the host-supplied `prompt_cache_key`; conversation/session headers are forwarded only when OpenCode provides one.

README.md

Lines changed: 64 additions & 70 deletions
@@ -34,6 +34,7 @@ Follow me on [X @nummanthinks](https://x.com/nummanthinks) for future updates an
 
 - **ChatGPT Plus/Pro OAuth authentication** - Use your existing subscription
 - **22 pre-configured model variants** - GPT 5.2, GPT 5.2 Codex, GPT 5.1, GPT 5.1 Codex, GPT 5.1 Codex Max, and GPT 5.1 Codex Mini presets for all reasoning levels
+- **OpenCode variants system support** - Works with v1.0.210+ variant cycling (Ctrl+T) and legacy per-model presets
 - **GPT 5.2 + GPT 5.2 Codex support** - Latest models with `low/medium/high/xhigh` reasoning levels (Codex excludes `none`)
 - **Full image input support** - All models configured with multimodal capabilities for reading screenshots, diagrams, and images
 - ⚠️ **GPT 5.1+ only** - Older GPT 5.0 models are deprecated and may not work reliably
@@ -101,43 +102,44 @@ If you’re using an AI agent (Codex/Claude/etc.) to install or update this plug
 
 1. **Use the official repo**: https://github.com/numman-ali/opencode-openai-codex-auth
 2. **Pin the plugin version** in the user’s OpenCode config (`~/.config/opencode/opencode.json` or project `.opencode.json`).
-3. **Replace the config** with the latest `config/full-opencode.json` from the repo.
+3. **Replace the config** with the latest `config/opencode-modern.json` (OpenCode v1.0.210+) or `config/opencode-legacy.json` (v1.0.209 and below).
 4. **Refresh the plugin cache** so OpenCode reinstalls the updated version.
 5. **Restart OpenCode**.
 
 ```bash
 # 1) Update plugin version (replace <latest> with newest release tag)
 
-# 2) Copy full config
-cp <repo>/config/full-opencode.json ~/.config/opencode/opencode.json
+# 2) Copy config (choose based on your OpenCode version)
+# opencode --version
+# Modern (v1.0.210+):
+cp <repo>/config/opencode-modern.json ~/.config/opencode/opencode.json
+# Legacy (v1.0.209 and below):
+cp <repo>/config/opencode-legacy.json ~/.config/opencode/opencode.json
 
 # 3) Refresh OpenCode plugin cache
 rm -rf ~/.cache/opencode/node_modules ~/.cache/opencode/bun.lock
 
-# 4) Optional sanity check for GPT-5.2-Codex presets
-jq '.provider.openai.models | keys | map(select(startswith("gpt-5.2-codex")))' \
-  ~/.config/opencode/opencode.json
+# 4) Optional sanity check for GPT-5.2 models
+jq '.provider.openai.models | keys' ~/.config/opencode/opencode.json
 ```
 
 > **Note**: If using a project-local config, replace the target path with `<project>/.opencode.json`.
 
 ---
 
-#### ⚠️ REQUIRED: Full Configuration (Only Supported Setup)
+#### ⚠️ REQUIRED: Use the Supported Configuration
 
-**IMPORTANT**: You MUST use the full configuration from [`config/full-opencode.json`](./config/full-opencode.json). Other configurations are not officially supported and may not work reliably.
+**Pick the config file that matches your OpenCode version:**
+- **OpenCode v1.0.210+** → `config/opencode-modern.json` (variants system)
+- **OpenCode v1.0.209 and below** → `config/opencode-legacy.json` (legacy per-variant model list)
 
-**Why the full config is required:**
-- GPT 5 models can be temperamental - some work, some don't, some may error
-- The full config has been tested and verified to work
-- Minimal configs lack proper model metadata for OpenCode features
+**Why this is required:**
+- GPT 5 models can be temperamental and need proper configuration
+- Full model metadata is required for OpenCode features (limits, usage widgets, compaction)
 - Older GPT 5.0 models are deprecated and being phased out by OpenAI
 
-1. **Copy the full configuration** from [`config/full-opencode.json`](./config/full-opencode.json) to your opencode config file.
-
-   The config includes 22 models with image input support. Here's a condensed example showing the structure:
-
+**Modern config (variants) example:**
 ```json
 {
   "$schema": "https://opencode.ai/config.json",
@@ -152,52 +154,44 @@ jq '.provider.openai.models | keys | map(select(startswith("gpt-5.2-codex")))' \
         "store": false
       },
       "models": {
-        "gpt-5.2-high": {
-          "name": "GPT 5.2 High (OAuth)",
-          "limit": { "context": 272000, "output": 128000 },
-          "modalities": { "input": ["text", "image"], "output": ["text"] },
-          "options": {
-            "reasoningEffort": "high",
-            "reasoningSummary": "detailed",
-            "textVerbosity": "medium",
-            "include": ["reasoning.encrypted_content"],
-            "store": false
-          }
-        },
-        "gpt-5.1-codex-max-high": {
-          "name": "GPT 5.1 Codex Max High (OAuth)",
+        "gpt-5.2": {
+          "name": "GPT 5.2 (OAuth)",
           "limit": { "context": 272000, "output": 128000 },
           "modalities": { "input": ["text", "image"], "output": ["text"] },
-          "options": {
-            "reasoningEffort": "high",
-            "reasoningSummary": "detailed",
-            "textVerbosity": "medium",
-            "include": ["reasoning.encrypted_content"],
-            "store": false
+          "variants": {
+            "low": { "reasoningEffort": "low", "reasoningSummary": "auto", "textVerbosity": "medium" },
+            "high": { "reasoningEffort": "high", "reasoningSummary": "detailed", "textVerbosity": "medium" }
           }
         }
-        // ... 20 more models - see config/full-opencode.json for complete list
       }
     }
   }
 }
 ```
 
-**⚠️ Copy the complete file** from [`config/full-opencode.json`](./config/full-opencode.json) - don't use this truncated example.
+**Usage (modern config):**
+```bash
+opencode run "task" --model=openai/gpt-5.2 --variant=medium
+opencode run "task" --model=openai/gpt-5.2 --variant=high
+```
 
-**Global config**: `~/.config/opencode/opencode.json`
-**Project config**: `<project>/.opencode.json`
+**Usage (legacy config):**
+```bash
+opencode run "task" --model=openai/gpt-5.2-medium
+opencode run "task" --model=openai/gpt-5.2-high
+```
 
 This gives you 22 model variants with different reasoning levels:
 - **gpt-5.2** (none/low/medium/high/xhigh) - Latest GPT 5.2 model with full reasoning support
 - **gpt-5.2-codex** (low/medium/high/xhigh) - GPT 5.2 Codex presets
 - **gpt-5.1-codex-max** (low/medium/high/xhigh) - Codex Max presets
 - **gpt-5.1-codex** (low/medium/high) - Codex model presets
 - **gpt-5.1-codex-mini** (medium/high) - Codex mini tier presets
 - **gpt-5.1** (none/low/medium/high) - General-purpose reasoning presets
 
 All appear in the opencode model selector as "GPT 5.1 Codex Low (OAuth)", "GPT 5.1 High (OAuth)", etc.
 
+> **⚠️ IMPORTANT:** Use the config file above. Minimal configs are NOT supported and may fail unpredictably.
 ### Prompt caching & usage limits
 
 Codex backend caching is enabled automatically. When OpenCode supplies a `prompt_cache_key` (its session identifier), the plugin forwards it unchanged so Codex can reuse work between turns. The plugin no longer synthesizes its own cache IDs—if the host omits `prompt_cache_key`, Codex will treat the turn as uncached. The bundled CODEX_MODE bridge prompt is synchronized with the latest Codex CLI release, so opencode and Codex stay in lock-step on tool availability. When your ChatGPT subscription nears a limit, opencode surfaces the plugin's friendly error message with the 5-hour and weekly windows, mirroring the Codex CLI summary.
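
To make that pass-through contract concrete, here is a minimal TypeScript sketch; the type and function names are hypothetical and not taken from the plugin's actual source.

```ts
// Sketch only: forward the host-supplied prompt_cache_key unchanged,
// and never synthesize one when the host omits it.
type HostRequest = {
  body: Record<string, unknown>;
  promptCacheKey?: string; // OpenCode's session identifier, if provided
};

function buildCodexPayload(req: HostRequest): Record<string, unknown> {
  const payload: Record<string, unknown> = { ...req.body };
  if (req.promptCacheKey) {
    // Pass the key through as-is so Codex can reuse cached work between turns.
    payload.prompt_cache_key = req.promptCacheKey;
  }
  // No else branch: a missing key simply means the turn goes out uncached.
  return payload;
}
```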
@@ -245,27 +239,25 @@ If you're on SSH/WSL/remote and the browser callback fails, choose **"ChatGPT Pl
 
 ## Usage
 
-If using the full configuration, select from the model picker in opencode, or specify via command line:
+If using the supported configuration, select from the model picker in opencode, or specify via command line.
 
 ```bash
-# Use different reasoning levels for gpt-5.1-codex
-opencode run "simple task" --model=openai/gpt-5.1-codex-low
-opencode run "complex task" --model=openai/gpt-5.1-codex-high
-opencode run "large refactor" --model=openai/gpt-5.1-codex-max-high
-opencode run "research-grade analysis" --model=openai/gpt-5.1-codex-max-xhigh
+# Modern config (v1.0.210+): use --variant
+opencode run "simple task" --model=openai/gpt-5.1-codex --variant=low
+opencode run "complex task" --model=openai/gpt-5.1-codex --variant=high
+opencode run "large refactor" --model=openai/gpt-5.1-codex-max --variant=high
+opencode run "research-grade analysis" --model=openai/gpt-5.1-codex-max --variant=xhigh
 
-# Use different reasoning levels for gpt-5.1
+# Legacy config: use model names
 opencode run "quick question" --model=openai/gpt-5.1-low
 opencode run "deep analysis" --model=openai/gpt-5.1-high
-
-# Use Codex Mini variants
-opencode run "balanced task" --model=openai/gpt-5.1-codex-mini-medium
-opencode run "complex code" --model=openai/gpt-5.1-codex-mini-high
 ```
 
-### Available Model Variants (Full Config)
+### Available Model Variants (Legacy Config)
 
-When using [`config/full-opencode.json`](./config/full-opencode.json), you get these pre-configured variants:
+When using [`config/opencode-legacy.json`](./config/opencode-legacy.json), you get these pre-configured variants:
+
+For the modern config (`opencode-modern.json`), use the same variant names via `--variant` or `Ctrl+T` in the TUI (e.g., `--model=openai/gpt-5.2 --variant=high`).
 
 | CLI Model ID | TUI Display Name | Reasoning Effort | Best For |
 |--------------|------------------|-----------------|----------|
@@ -299,7 +291,7 @@ When using [`config/full-opencode.json`](./config/full-opencode.json), you get t
 >
 > **Note**: GPT 5.2, GPT 5.2 Codex, and Codex Max all support `xhigh` reasoning. Use explicit reasoning levels (e.g., `gpt-5.2-high`, `gpt-5.2-codex-xhigh`, `gpt-5.1-codex-max-xhigh`) for precise control.
 
-> **⚠️ Important**: GPT 5 models can be temperamental - some variants may work better than others, some may give errors, and behavior may vary. Stick to the presets above configured in `full-opencode.json` for best results.
+> **⚠️ Important**: GPT 5 models can be temperamental - some variants may work better than others, some may give errors, and behavior may vary. Stick to the presets above configured in `opencode-legacy.json` or the variants in `opencode-modern.json` for best results.
 
 All accessed via your ChatGPT Plus/Pro subscription.
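
For example, assuming the modern config exposes `gpt-5.2-codex` as a base model (as the variant list above suggests), the same `xhigh` preset can be reached from either config style:

```bash
# Legacy config: reasoning level baked into the model ID
opencode run "deep code review" --model=openai/gpt-5.2-codex-xhigh

# Modern config (v1.0.210+): pick the variant explicitly
opencode run "deep code review" --model=openai/gpt-5.2-codex --variant=xhigh
```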

@@ -339,19 +331,21 @@ These defaults are tuned for Codex CLI-style usage and can be customized (see Co
 
 ## Configuration
 
-### ⚠️ REQUIRED: Use Pre-Configured File
+### ⚠️ REQUIRED: Use a Supported Config File
+
+Choose the config file that matches your OpenCode version:
 
-**YOU MUST use [`config/full-opencode.json`](./config/full-opencode.json)** - this is the only officially supported configuration:
-- 22 pre-configured model variants (GPT 5.2, GPT 5.2 Codex, GPT 5.1, Codex, Codex Max, Codex Mini)
+- **OpenCode v1.0.210+** → [`config/opencode-modern.json`](./config/opencode-modern.json)
+- **OpenCode v1.0.209 and below** → [`config/opencode-legacy.json`](./config/opencode-legacy.json)
+
+Both provide:
+- 22 reasoning variants across GPT 5.2, GPT 5.2 Codex, GPT 5.1, Codex, Codex Max, Codex Mini
 - Image input support enabled for all models
-- Optimal configuration for each reasoning level
-- All variants visible in the opencode model selector
-- Required metadata for OpenCode features to work properly
+- Required metadata for OpenCode features (limits, usage widgets, compaction)
 
-**Do NOT use other configurations** - they are not supported and may fail unpredictably with GPT 5 models.
+**Do NOT use other configurations** - minimal configs are not supported and may fail unpredictably with GPT 5 models.
 
 See [Installation](#installation) for setup instructions.
-
 ### Custom Configuration
 
 If you want to customize settings yourself, you can configure options at provider or model level.
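
As a rough illustration of that split (values are placeholders, and only option keys already shown in the examples above are used), provider-level options set broad defaults while an individual model entry carries its own options:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openai": {
      "options": {
        "reasoningSummary": "auto",
        "textVerbosity": "medium",
        "store": false
      },
      "models": {
        "gpt-5.2": {
          "options": {
            "reasoningEffort": "high",
            "reasoningSummary": "detailed"
          }
        }
      }
    }
  }
}
```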
