
refactor(ai): use opencode v2 SDK session.prompt for structured output#101

Open
StumHuang wants to merge 1 commit into tickernelz:main from StumHuang:feat/opencode-native-auth

Conversation

@StumHuang

Summary

Replaces the auth.json-reading + manual OAuth refresh + direct provider HTTPS flow with opencode's v2 client SDK. Per call we open a transient session, prompt with format: json_schema, then delete the session.
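The create → prompt → delete lifecycle can be sketched as follows. This is an illustrative sketch only: the SessionAPI interface and the generateStructuredOutput signature below are simplified assumptions, not the real opencode SDK surface or the PR's actual code.

```typescript
// Hypothetical, simplified shape of the session API used per call.
interface SessionAPI {
  create(): Promise<{ id: string }>;
  prompt(
    sessionID: string,
    body: {
      system: string;
      text: string;
      format: { type: "json_schema"; schema: Record<string, unknown> };
    },
  ): Promise<{ text: string }>;
  delete(sessionID: string): Promise<void>;
}

async function generateStructuredOutput<T>(
  session: SessionAPI,
  systemPrompt: string,
  userPrompt: string,
  schema: Record<string, unknown>,
  parse: (raw: unknown) => T,
): Promise<T> {
  // Open a transient session for this single structured-output call.
  const { id } = await session.create();
  try {
    const res = await session.prompt(id, {
      system: systemPrompt,
      text: userPrompt,
      format: { type: "json_schema", schema },
    });
    // The model's reply is JSON text constrained by the schema; validate via parse.
    return parse(JSON.parse(res.text));
  } finally {
    // Always clean up the transient session, even if the prompt throws.
    await session.delete(id);
  }
}
```

The try/finally ensures no sessions leak when a prompt or parse fails mid-call.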

Why

  • opencode already owns the user's auth, token refresh, and provider routing for every connected provider; reimplementing that here was fragile (broke whenever opencode tweaked auth.json layout) and limited us to anthropic + openai.
  • Going through session.prompt unlocks every provider opencode supports out of the box, including github-copilot personal/business plans, OAuth flows like Claude Pro/Max, and any custom provider users wire up.

What changed

  • src/services/ai/opencode-provider.ts: 363 → 147 lines. Drop auth.json parser, OAuthFetch with token refresh, and AI-SDK Anthropic/OpenAI clients. Add createV2Client + getV2Client + new generateStructuredOutput signature taking { client, providerID, modelID, ... }.
  • src/index.ts: wire setV2Client(createV2Client(ctx.serverUrl)) at plugin init; drop now-unused setStatePath(client.path.get()) call.
  • src/services/auto-capture.ts and user-memory-learning.ts: migrate to the new generateStructuredOutput signature; fail fast if v2 client is not yet initialized.
  • tests/opencode-provider.test.ts: rewrite to mock OpencodeClient and cover session lifecycle, schema serialization, and error paths.
  • package.json + bun.lock: drop @ai-sdk/anthropic, @ai-sdk/openai, ai.
  • src/config.ts + README.md: document github-copilot as a first-class supported provider.

Verification

  • bun run typecheck — clean
  • bun test tests/opencode-provider.test.ts — 11 pass / 0 fail (27 expect calls)
  • Pre-commit hook (typecheck + prettier) passed

Net diff

+468 / −437 across 9 files. The +s are mostly new test coverage for the v2 client mock; the production module shrinks substantially.

Note on ordering

Independent of #100 in scope, but the diffs touch overlapping lines in src/index.ts (warmup wiring). Recommend merging #100 first; happy to rebase this on top once #100 lands.

Copilot AI review requested due to automatic review settings April 23, 2026 01:37

Copilot AI left a comment

Pull request overview

This PR refactors the plugin’s “opencodeProvider” structured-output path to use opencode’s v2 SDK (session.prompt) instead of reading auth.json and calling provider HTTPS endpoints directly, enabling broader provider support and simplifying auth/refresh handling.

Changes:

  • Replace the legacy auth.json + manual OAuth/provider client flow with opencode v2 session.create → session.prompt(format: json_schema) → session.delete.
  • Wire v2 client initialization at plugin startup and migrate auto-capture + user-profile learning to the new generateStructuredOutput signature.
  • Update tests to mock the v2 client/session lifecycle and remove unused AI SDK dependencies.

Reviewed changes

Copilot reviewed 8 out of 9 changed files in this pull request and generated 3 comments.

Summary per file:

  • src/services/ai/opencode-provider.ts: Implements v2 SDK-based structured output with transient sessions and Zod validation.
  • src/index.ts: Initializes v2 client + connected providers state during plugin startup.
  • src/services/auto-capture.ts: Switches auto-capture structured output calls to v2 client usage with initialization checks.
  • src/services/user-memory-learning.ts: Switches user-profile learning structured output calls to v2 client usage with initialization checks.
  • tests/opencode-provider.test.ts: Rewrites tests to mock v2 session API calls and validate lifecycle/error handling.
  • src/config.ts: Updates config template docs for opencode providers (incl. GitHub Copilot).
  • README.md: Documents opencode session-based provider support more broadly.
  • package.json: Removes unused AI SDK dependencies after switching to opencode v2 SDK flow.
  • bun.lock: Lockfile updates to reflect removed dependencies.
Comments suppressed due to low confidence (1)

src/index.ts:54

  • setV2Client(createV2Client(ctx.serverUrl)) is currently executed inside a fire-and-forget async IIFE that also awaits ctx.client.provider.list(). This introduces a race where calls into getV2Client() (e.g., auto-capture / profile learning) can see undefined and hard-fail even though ctx.serverUrl is already available. Consider initializing the v2 client synchronously (outside the IIFE) and keeping only the provider listing async, or exposing/awaiting a shared initialization promise before first use.
  (async () => {
    try {
      const { setConnectedProviders, setV2Client, createV2Client } =
        await import("./services/ai/opencode-provider.js");
      setV2Client(createV2Client(ctx.serverUrl));
      const providerResult = await ctx.client.provider.list();
      if (providerResult.data?.connected) {
        setConnectedProviders(providerResult.data.connected);
      }


Comment on lines +61 to +70
// zod v4 exposes JSON Schema export natively (instance `.toJSONSchema()`
// and global `z.toJSONSchema()`); we prefer instance, fall back to global.
// This avoids pulling in a separate `zod-to-json-schema` dependency.
const jsonSchema =
  (
    schema as unknown as {
      toJSONSchema?: () => Record<string, unknown>;
    }
  ).toJSONSchema?.() ?? (await import("zod")).z.toJSONSchema(schema);


Copilot AI Apr 23, 2026


generateStructuredOutput dynamically imports zod on every call to access z.toJSONSchema. Even though module imports are cached, this still adds per-call async overhead in a hot path (auto-capture/profile learning). Consider a static import of z at module scope, or caching the z module / the generated JSON schema alongside the schema instance.
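One way to implement the caching this comment suggests: memoize the converted JSON schema per zod schema instance with a WeakMap, so the conversion (and any dynamic import behind it) runs at most once per schema. A sketch; cachedJsonSchema and the injected converter parameter are hypothetical names, not code from the PR.

```typescript
// Keyed by schema instance; entries are garbage-collected with their schemas.
const schemaCache = new WeakMap<object, Record<string, unknown>>();

function cachedJsonSchema(
  schema: object,
  toJSONSchema: (s: object) => Record<string, unknown>,
): Record<string, unknown> {
  let json = schemaCache.get(schema);
  if (!json) {
    // First call for this schema: convert once and memoize the result.
    json = toJSONSchema(schema);
    schemaCache.set(schema, json);
  }
  return json;
}
```

Since auto-capture and profile learning reuse a small, fixed set of schemas, this removes the per-call conversion cost from the hot path entirely.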

Comment thread src/config.ts
Comment on lines +265 to +272
// Use any provider that is already authenticated in opencode for auto-capture
// and user profile learning. The plugin calls opencode's session.prompt API
// (with structured output) instead of talking to provider HTTPS endpoints
// directly, so opencode owns the auth, token refresh, and provider routing.
//
// No separate API key is needed in this plugin — whatever you configured in
// opencode (OAuth like Claude Pro/Max, GitHub Copilot personal/business,
// bring-your-own API key, custom provider, ...) just works.

Copilot AI Apr 23, 2026


In the config template string, the newly added OpenCode Provider section is indented inconsistently compared to surrounding sections (extra leading spaces before // lines). This makes the generated JSONC harder to read and looks like an accidental formatting regression; consider re-aligning indentation to match the rest of the template.

Comment on lines +40 to 49
export interface StructuredOutputOptions<T> {
  client: OpencodeClient;
  providerID: string;
  modelID: string;
  systemPrompt: string;
  userPrompt: string;
  schema: z.ZodType<T>;
  directory?: string;
  retryCount?: number;
}

Copilot AI Apr 23, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

generateStructuredOutput dropped the previous temperature option (and call sites no longer forward CONFIG.memoryTemperature). This is a behavior change: users can still configure memoryTemperature, but it no longer affects the opencodeProvider/opencodeModel path. If opencode v2 session.prompt supports temperature (or equivalent sampling controls), consider adding it back to StructuredOutputOptions and forwarding it; otherwise, document clearly that temperature is ignored when using opencodeProvider so the config knob isn't misleading.
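If session.prompt does accept sampling controls, the option could be restored and forwarded conditionally, so provider defaults still apply when it is unset. A sketch only; buildPromptBody and the PromptOptions subset are hypothetical, and whether the v2 API actually accepts temperature is exactly the open question this comment raises.

```typescript
// Hypothetical options subset with `temperature` restored as an optional knob.
interface PromptOptions {
  providerID: string;
  modelID: string;
  temperature?: number;
}

function buildPromptBody(opts: PromptOptions): Record<string, unknown> {
  return {
    model: { providerID: opts.providerID, modelID: opts.modelID },
    // Spread temperature only when configured, so the provider's own default
    // sampling applies whenever the user leaves memoryTemperature unset.
    ...(opts.temperature !== undefined ? { temperature: opts.temperature } : {}),
  };
}
```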
