The @lightweight-ai/opencode-plugin connects the OpenCode CLI to the Lightweight AI API gateway. It provides seamless access to a wide range of large language models through a single unified interface.
This plugin integrates the Lightweight AI API (`api.lightweight.one`) into OpenCode, allowing users to leverage over 50 models including GPT-5, Claude Opus 4.6, and Gemini. It handles model discovery, capability mapping, and authentication automatically.
Install the package via npm:

```sh
npm install @lightweight-ai/opencode-plugin
```

To use this plugin, you must register at the Lightweight AI platform to obtain an API key. Your API key should follow the format `lw_sk_...`.
The plugin requires the `LIGHTWEIGHT_API_KEY` environment variable to be set:

```sh
export LIGHTWEIGHT_API_KEY=lw_sk_your_api_key_here
```

Once installed and configured, OpenCode automatically detects the "lightweight" provider. The plugin fetches available models from the API and caches them for one hour.
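Since a missing or malformed key only surfaces later as an authentication error, it can help to validate it up front. A minimal sketch, assuming the `lw_sk_` prefix described above; the function name and error messages are illustrative, not part of the plugin's API:

```typescript
// Sketch: read and sanity-check the API key before wiring up the provider.
// readApiKey is a hypothetical helper, not exported by the plugin.
function readApiKey(env: Record<string, string | undefined>): string {
  const key = env.LIGHTWEIGHT_API_KEY;
  if (!key) {
    throw new Error("LIGHTWEIGHT_API_KEY is not set");
  }
  if (!key.startsWith("lw_sk_")) {
    throw new Error("LIGHTWEIGHT_API_KEY should start with lw_sk_");
  }
  return key;
}
```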
Reasoning models automatically include effort level variants:
- low
- medium
- high
- xhigh
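Conceptually, each reasoning-capable model is expanded into one entry per effort level. The sketch below illustrates that expansion; the `:<effort>` suffix is an assumed naming scheme for the variant IDs, not one confirmed by the plugin:

```typescript
const EFFORT_LEVELS = ["low", "medium", "high", "xhigh"] as const;

// Hypothetical expansion: non-reasoning models are listed once, reasoning
// models get one variant per effort level (ID format is an assumption).
function expandEffortVariants(modelId: string, reasoning: boolean): string[] {
  if (!reasoning) return [modelId];
  return EFFORT_LEVELS.map((effort) => `${modelId}:${effort}`);
}
```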
The plugin implements two primary hooks:
- `config`: Registers the "lightweight" provider using `@ai-sdk/openai` as the underlying SDK. It points to `https://api.lightweight.one/v1` and maps models based on their reported capabilities (tool calling, attachments, reasoning).
- `chat.headers`: Adds `X-Client: opencode` and `X-Plugin-Version: 1.0.0` headers to all outgoing requests for better telemetry and support.
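The `chat.headers` hook amounts to merging the two fixed headers into each request. A minimal sketch under the assumption that the hook receives and returns a plain header map (the real OpenCode hook signature may differ):

```typescript
// Sketch of the chat.headers behavior described above: merge the plugin's
// telemetry headers into whatever headers the request already carries.
function addPluginHeaders(
  headers: Record<string, string>
): Record<string, string> {
  return {
    ...headers,
    "X-Client": "opencode",
    "X-Plugin-Version": "1.0.0",
  };
}
```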
Model discovery is performed by fetching `/v1/models` with the provided Bearer token. The plugin maps technical constraints such as context and output limits automatically.
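The mapping step can be pictured as translating each entry of the `/v1/models` response into the provider's model descriptor. In the sketch below, every field name in `RemoteModel` is an assumption about the payload shape, not the documented schema:

```typescript
// Assumed shape of one entry returned by GET /v1/models (illustrative only).
interface RemoteModel {
  id: string;
  context_length: number;
  max_output_tokens: number;
  capabilities: {
    tool_calling: boolean;
    attachments: boolean;
    reasoning: boolean;
  };
}

// Hypothetical descriptor the plugin might register with OpenCode.
interface ProviderModel {
  id: string;
  contextLimit: number;
  outputLimit: number;
  toolCall: boolean;
  attachment: boolean;
  reasoning: boolean;
}

// Translate limits and capability flags; no network access here, so the
// fetch itself (with the Bearer token) is left out of the sketch.
function mapModel(m: RemoteModel): ProviderModel {
  return {
    id: m.id,
    contextLimit: m.context_length,
    outputLimit: m.max_output_tokens,
    toolCall: m.capabilities.tool_calling,
    attachment: m.capabilities.attachments,
    reasoning: m.capabilities.reasoning,
  };
}
```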
License: MIT