Use AlphaNeural with OpenCode

OpenCode can talk to any OpenAI-compatible API by setting a provider baseURL and giving it an API key. AlphaNeural fits that shape, so the integration is mostly config and credentials.

Add your AlphaNeural API key to OpenCode

In OpenCode, run:

/connect

Choose Other, enter a provider id such as alphaneural, then paste your AlphaNeural API key. OpenCode stores credentials in ~/.local/share/opencode/auth.json.
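
If you prefer the terminal, the same flow is available as a CLI command (assuming a recent OpenCode version; the prompts may differ slightly between releases):

opencode auth login

Choose Other when prompted, enter alphaneural as the provider id, and paste the key as above.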

Create a project config

Create opencode.json in your project root. Project config has high precedence and is safe to commit. It overrides global config at ~/.config/opencode/opencode.json.

{
  "$schema": "https://opencode.ai/config.json",
  "model": "alphaneural/qwen3",
  "provider": {
    "alphaneural": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "AlphaNeural",
      "options": {
        "baseURL": "https://proxy.alfnrl.io/v1"
      },
      "models": {
        "qwen3": { "name": "Qwen 3" }
      }
    }
  }
}

Notes:

  • OpenCode supports overriding a provider’s baseURL via options.baseURL.

  • The model format is providerId/modelId, for example anthropic/claude-sonnet-4-5 in the OpenCode docs.

  • Replace qwen3 with any model id exposed by your AlphaNeural /v1/models endpoint.
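
If you are not sure which ids are available, you can query the endpoint directly. This is the standard OpenAI-compatible models route; the ALPHANEURAL_API_KEY environment variable is just an assumption for wherever you keep your key:

# list the model ids exposed by the AlphaNeural proxy
curl -s https://proxy.alfnrl.io/v1/models \
  -H "Authorization: Bearer $ALPHANEURAL_API_KEY"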

Pick a model in the UI and run

OpenCode will detect your provider and models. Use:

/models

Select AlphaNeural, then pick the model you configured.

A quick smoke test from the CLI also works:
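
A minimal sketch, assuming the opencode run one-shot command and the default model from the project config above:

# one-shot prompt; uses the model set in opencode.json
opencode run "Reply with OK if you can read this."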


Troubleshooting

“It ignores my baseURL” or “NotFoundError” for custom OpenAI-compatible providers

There is an open issue where OpenCode’s bundled @ai-sdk/openai-compatible provider may fail to forward options like baseURL, causing requests to be sent without your custom endpoint settings.

What to do

  • Upgrade OpenCode to the latest version available for your platform, then retry (see the sketch after this list).

  • If it still fails, use the workaround below.
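
If you installed the standalone build, the bundled updater is the quickest route; if you used a package manager, update through that instead. A sketch assuming the standalone install:

# update OpenCode in place to the latest release
opencode upgrade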

Workaround. Use the built-in openai provider with a custom baseURL

OpenCode docs state you can customise the base URL for any provider by setting options.baseURL. So you can route the built-in openai provider to AlphaNeural:
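
A sketch that mirrors the earlier project config; the qwen3 model id is carried over as an assumption, so use whatever your endpoint actually serves:

{
  "$schema": "https://opencode.ai/config.json",
  "model": "openai/qwen3",
  "provider": {
    "openai": {
      "options": {
        "baseURL": "https://proxy.alfnrl.io/v1"
      },
      "models": {
        "qwen3": { "name": "Qwen 3" }
      }
    }
  }
}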

This keeps you on the built-in OpenAI provider path while still pointing traffic at AlphaNeural.
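
One caveat: the built-in openai provider resolves its key through its usual channels (a key stored for openai via /connect, or typically the OPENAI_API_KEY environment variable), so make sure your AlphaNeural key is attached there rather than under an alphaneural provider id.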


Optional. Put AlphaNeural in global config

If you want AlphaNeural available in every repo, add the provider block to ~/.config/opencode/opencode.json. OpenCode merges global and project configs, with project taking precedence.
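
A sketch of the global file, reusing the provider block from above; the model line can stay out of the global config so each repo keeps its own default in opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "alphaneural": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "AlphaNeural",
      "options": {
        "baseURL": "https://proxy.alfnrl.io/v1"
      },
      "models": {
        "qwen3": { "name": "Qwen 3" }
      }
    }
  }
}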
