refactor(ai): use opencode v2 SDK session.prompt for structured output #101
StumHuang wants to merge 1 commit into tickernelz:main
Conversation
Replaces the auth.json-reading + manual OAuth refresh + direct provider
HTTPS flow with opencode's v2 client SDK. Per call we open a transient
session, prompt with format: json_schema, then delete the session.
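The per-call lifecycle described above can be sketched as follows. This is a minimal illustration: the `SessionClient` interface below is a simplified stand-in for the opencode v2 SDK surface, not its exact API.

```typescript
// Stand-in for the relevant slice of opencode's v2 client (illustrative shape).
interface SessionClient {
  session: {
    create(args: { title: string }): Promise<{ id: string }>;
    prompt(args: {
      sessionID: string;
      text: string;
      format?: { type: "json_schema"; schema: Record<string, unknown> };
    }): Promise<{ text: string }>;
    delete(args: { sessionID: string }): Promise<void>;
  };
}

async function promptOnce(
  client: SessionClient,
  text: string,
  schema: Record<string, unknown>,
): Promise<unknown> {
  // 1. Open a transient session just for this call.
  const { id } = await client.session.create({ title: "structured-output" });
  try {
    // 2. Prompt with a JSON-schema response format.
    const result = await client.session.prompt({
      sessionID: id,
      text,
      format: { type: "json_schema", schema },
    });
    return JSON.parse(result.text);
  } finally {
    // 3. Always clean up the session, even when prompting fails.
    await client.session.delete({ sessionID: id });
  }
}
```

The `try/finally` matters: without it, a prompt error would leak a session on every failed call.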
Why:
- opencode already owns the user's auth, token refresh, and provider
routing for every connected provider; reimplementing that here was
fragile (broke whenever opencode tweaked auth.json layout) and limited
us to anthropic + openai.
- Going through session.prompt unlocks every provider opencode supports
out of the box, including github-copilot personal/business plans, OAuth
flows like Claude Pro/Max, and any custom provider users wire up.
Changes:
- src/services/ai/opencode-provider.ts: 363 -> 147 lines. Drop auth.json
parser, OAuthFetch with token refresh, and AI-SDK Anthropic/OpenAI
clients. Add createV2Client + getV2Client + new generateStructuredOutput
signature taking { client, providerID, modelID, ... }.
- src/index.ts: wire setV2Client(createV2Client(ctx.serverUrl)) at plugin
init; drop now-unused setStatePath(client.path.get()) call.
- src/services/auto-capture.ts and user-memory-learning.ts: migrate to
the new generateStructuredOutput signature; fail fast if v2 client is
not yet initialized.
- tests/opencode-provider.test.ts: rewrite to mock OpencodeClient and
cover session lifecycle, schema serialization, and error paths.
- package.json + bun.lock: drop @ai-sdk/anthropic, @ai-sdk/openai, ai.
- src/config.ts + README.md: document github-copilot as a first-class
supported provider.
Tests: 11/11 passing in tests/opencode-provider.test.ts. Typecheck clean.
Pull request overview
This PR refactors the plugin’s “opencodeProvider” structured-output path to use opencode’s v2 SDK (session.prompt) instead of reading auth.json and calling provider HTTPS endpoints directly, enabling broader provider support and simplifying auth/refresh handling.
Changes:
- Replace the legacy auth.json + manual OAuth/provider client flow with opencode v2 session.create → session.prompt(format: json_schema) → session.delete.
- Wire v2 client initialization at plugin startup and migrate auto-capture + user-profile learning to the new generateStructuredOutput signature.
- Update tests to mock the v2 client/session lifecycle and remove unused AI SDK dependencies.
Reviewed changes
Copilot reviewed 8 out of 9 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| src/services/ai/opencode-provider.ts | Implements v2 SDK-based structured output with transient sessions and Zod validation. |
| src/index.ts | Initializes v2 client + connected providers state during plugin startup. |
| src/services/auto-capture.ts | Switches auto-capture structured output calls to v2 client usage with initialization checks. |
| src/services/user-memory-learning.ts | Switches user-profile learning structured output calls to v2 client usage with initialization checks. |
| tests/opencode-provider.test.ts | Rewrites tests to mock v2 session API calls and validate lifecycle/error handling. |
| src/config.ts | Updates config template docs for opencode providers (incl. GitHub Copilot). |
| README.md | Documents opencode session-based provider support more broadly. |
| package.json | Removes unused AI SDK dependencies after switching to opencode v2 SDK flow. |
| bun.lock | Lockfile updates to reflect removed dependencies. |
Comments suppressed due to low confidence (1)
src/index.ts:54
`setV2Client(createV2Client(ctx.serverUrl))` is currently executed inside a fire-and-forget async IIFE that also awaits `ctx.client.provider.list()`. This introduces a race where calls into `getV2Client()` (e.g., auto-capture / profile learning) can see `undefined` and hard-fail even though `ctx.serverUrl` is already available. Consider initializing the v2 client synchronously (outside the IIFE) and keeping only the provider listing async, or exposing/awaiting a shared initialization promise before first use.
```ts
(async () => {
  try {
    const { setConnectedProviders, setV2Client, createV2Client } =
      await import("./services/ai/opencode-provider.js");
    setV2Client(createV2Client(ctx.serverUrl));
    const providerResult = await ctx.client.provider.list();
    if (providerResult.data?.connected) {
      setConnectedProviders(providerResult.data.connected);
    }
```
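The shared-initialization-promise alternative mentioned in the comment could look roughly like this. A minimal sketch: the names and the placeholder `Client` type are illustrative, not the plugin's actual exports.

```typescript
// Shared-initialization-promise pattern: the first caller creates the promise,
// later callers reuse it, so nobody can observe a half-initialized client.
// `Client` is a placeholder type standing in for the v2 SDK client.
type Client = { serverUrl: string };

let clientPromise: Promise<Client> | undefined;

function initV2Client(serverUrl: string): Promise<Client> {
  // Memoize: repeated init calls return the same promise.
  clientPromise ??= Promise.resolve({ serverUrl });
  return clientPromise;
}

async function getV2Client(): Promise<Client> {
  if (!clientPromise) {
    throw new Error("v2 client not initialized: call initV2Client first");
  }
  return clientPromise;
}
```

Consumers then `await getV2Client()` instead of assuming a fire-and-forget IIFE has already finished.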
```ts
// zod v4 exposes JSON Schema export natively (instance `.toJSONSchema()`
// and global `z.toJSONSchema()`); we prefer instance, fall back to global.
// This avoids pulling in a separate `zod-to-json-schema` dependency.
const jsonSchema =
  (
    schema as unknown as {
      toJSONSchema?: () => Record<string, unknown>;
    }
  ).toJSONSchema?.() ?? (await import("zod")).z.toJSONSchema(schema);
```
generateStructuredOutput dynamically imports zod on every call to access z.toJSONSchema. Even though module imports are cached, this still adds per-call async overhead in a hot path (auto-capture/profile learning). Consider a static import of z at module scope, or caching the z module / the generated JSON schema alongside the schema instance.
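The caching side of this suggestion could be sketched like so. The converter is injected here to keep the sketch dependency-free; in the plugin it would be zod's `z.toJSONSchema` (an assumption about the wiring, not a statement about the PR's code).

```typescript
// Compute the JSON schema once per schema instance and reuse it, instead of
// paying a dynamic import + conversion on every structured-output call.
const jsonSchemaCache = new WeakMap<object, Record<string, unknown>>();

function cachedJsonSchema(
  schema: object,
  convert: (s: object) => Record<string, unknown>,
): Record<string, unknown> {
  let json = jsonSchemaCache.get(schema);
  if (json === undefined) {
    json = convert(schema); // conversion cost is paid exactly once per schema
    jsonSchemaCache.set(schema, json);
  }
  return json;
}
```

A `WeakMap` keyed on the schema instance means cached entries are garbage-collected together with the schemas themselves.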
```jsonc
// Use any provider that is already authenticated in opencode for auto-capture
// and user profile learning. The plugin calls opencode's session.prompt API
// (with structured output) instead of talking to provider HTTPS endpoints
// directly, so opencode owns the auth, token refresh, and provider routing.
//
// No separate API key is needed in this plugin — whatever you configured in
// opencode (OAuth like Claude Pro/Max, GitHub Copilot personal/business,
// bring-your-own API key, custom provider, ...) just works.
```
In the config template string, the newly added OpenCode Provider section is indented inconsistently compared to surrounding sections (extra leading spaces before // lines). This makes the generated JSONC harder to read and looks like an accidental formatting regression; consider re-aligning indentation to match the rest of the template.
```ts
export interface StructuredOutputOptions<T> {
  client: OpencodeClient;
  providerID: string;
  modelID: string;
  systemPrompt: string;
  userPrompt: string;
  schema: z.ZodType<T>;
  directory?: string;
  retryCount?: number;
}
```
generateStructuredOutput dropped the previous temperature option (and call sites no longer forward CONFIG.memoryTemperature). This is a behavior change: users can still configure memoryTemperature, but it no longer affects the opencodeProvider/opencodeModel path. If opencode v2 session.prompt supports temperature (or equivalent sampling controls), consider adding it back to StructuredOutputOptions and forwarding it; otherwise, document clearly that temperature is ignored when using opencodeProvider so the config knob isn't misleading.
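If session.prompt does accept sampling controls, restoring the knob could look roughly like this. The field names and the forwarding shape are assumptions for illustration, not the verified v2 API.

```typescript
// Hypothetical shape for re-adding temperature to the structured-output
// options and forwarding it only when it is actually set.
interface SamplingOptions {
  providerID: string;
  modelID: string;
  temperature?: number; // call sites would pass CONFIG.memoryTemperature
}

function buildPromptArgs(opts: SamplingOptions): Record<string, unknown> {
  return {
    model: { providerID: opts.providerID, modelID: opts.modelID },
    // Spread only when defined: an absent knob must not override provider defaults.
    ...(opts.temperature !== undefined ? { temperature: opts.temperature } : {}),
  };
}
```

Forwarding conditionally keeps the current default behavior for users who never touch `memoryTemperature`.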
Verification
- bun run typecheck — clean
- bun test tests/opencode-provider.test.ts — 11 pass / 0 fail (27 expect calls)
Net diff
+468 / −437 across 9 files. The additions are mostly new test coverage for the v2 client mock; the production module shrinks substantially.
Note on ordering
Independent of #100 in scope, but the diffs touch overlapping lines in src/index.ts (warmup wiring). Recommend merging #100 first; happy to rebase this on top once #100 lands.