# Authentication

### Where to put the API key

#### Option A. OpenAI-style (recommended for OpenAI SDK compatibility)

Send the key as a Bearer token:

```bash
-H "Authorization: Bearer $ALPHANEURAL_API_KEY"
```

This is the style used throughout the proxy examples in the OpenAPI spec.

#### Option B. Proxy-native header style

Send the key in `x-alphaneural-api-key`:

```bash
-H "x-alphaneural-api-key: $ALPHANEURAL_API_KEY"
```

This header is defined as the security scheme in the OpenAPI spec (source of truth).
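Both styles target the same endpoints, so the choice is mostly about client compatibility. As a minimal sketch (nothing is sent over the network; `sk-example` is a placeholder key), the two styles differ only in the header name:

```python
import urllib.request

BASE_URL = "https://proxy.alfnrl.io/v1"
API_KEY = "sk-example"  # placeholder -- load from the environment in real code

# Option A: OpenAI-style Bearer token
req_a = urllib.request.Request(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)

# Option B: proxy-native header
req_b = urllib.request.Request(
    f"{BASE_URL}/models",
    headers={"x-alphaneural-api-key": API_KEY},
)

# Nothing is sent; we only inspect the prepared headers.
# (urllib stores header names normalized via str.capitalize().)
print(req_a.get_header("Authorization"))          # Bearer sk-example
print(req_b.get_header("X-alphaneural-api-key"))  # sk-example
```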

### Quick test

List models (any authenticated endpoint works):

```bash
curl https://proxy.alfnrl.io/v1/models \
  -H "Authorization: Bearer $ALPHANEURAL_API_KEY"
```
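With a valid key, an OpenAI-compatible `GET /v1/models` typically returns a JSON list object. The exact fields are defined by the OpenAPI spec; the shape below is only a representative sketch (the `owned_by` value is illustrative):

```json
{
  "object": "list",
  "data": [
    { "id": "qwen3", "object": "model", "owned_by": "alphaneural" }
  ]
}
```

A `401` response usually means the key is missing, malformed, or revoked.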

### Using the OpenAI SDKs

Because AlphaNeural is OpenAI-compatible, you can point the OpenAI SDKs at the AlphaNeural base URL and keep the rest of your code the same.

#### Python

```python
from openai import OpenAI
import os

client = OpenAI(
  api_key=os.environ["ALPHANEURAL_API_KEY"],
  base_url="https://proxy.alfnrl.io/v1",
)

resp = client.chat.completions.create(
  model="qwen3",
  messages=[{"role":"user","content":"Hello AlphaNeural"}],
)
print(resp.choices[0].message.content)
```

#### Node.js / JavaScript

```js
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.ALPHANEURAL_API_KEY,
  baseURL: "https://proxy.alfnrl.io/v1",
});

const resp = await client.chat.completions.create({
  model: "qwen3",
  messages: [{ role: "user", content: "Hello AlphaNeural" }],
});

console.log(resp.choices[0].message.content);
```

### Security notes

* Treat API keys like passwords. Keep them server-side and load them from environment variables or a secrets manager.
* Rotate keys immediately if they are ever exposed.
* Admin endpoints and usage reporting are protected by the same API-key mechanism defined in the spec.
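
A small helper makes the "load from environment variables" advice concrete. This is a generic sketch: the variable name `ALPHANEURAL_API_KEY` matches the examples above, and `load_api_key` is an illustrative name, not part of any SDK:

```python
import os

def load_api_key(env_var: str = "ALPHANEURAL_API_KEY") -> str:
    """Read the API key from the environment, failing loudly if it is missing."""
    key = os.environ.get(env_var, "").strip()
    if not key:
        raise RuntimeError(
            f"{env_var} is not set; export it or fetch it from a secrets manager"
        )
    return key
```

Failing fast at startup beats discovering a missing key on the first request, and it keeps the key out of source code and shell history.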
