# AICU LLM API Documentation
> OpenAI-compatible chat completion API - Beta (no authentication required)
## Base URL
```
https://api.aicu.ai/v1
```
## Quick Start
```bash
curl -X POST https://api.aicu.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-v3",
    "messages": [{"role": "user", "content": "Hello!"}],
    "max_tokens": 100
  }'
```
---
## POST /v1/chat/completions
OpenAI-compatible chat completion endpoint.
**Request:**
```json
{
  "model": "deepseek-v3",
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
  ],
  "max_tokens": 1000,
  "temperature": 0.7,
  "stream": false
}
```
**Parameters:**
| Name | Type | Required | Description |
|------|------|----------|-------------|
| model | string | Yes | Model ID |
| messages | array | Yes | Array of messages |
| max_tokens | number | No | Maximum tokens to generate (default: 4096) |
| temperature | number | No | Sampling temperature, 0-2 (default: 1) |
| stream | boolean | No | Stream the response (default: false) |
**Response:**
```json
{
  "id": "gen-xxx",
  "object": "chat.completion",
  "created": 1234567890,
  "model": "deepseek-v3",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 10,
    "completion_tokens": 15,
    "total_tokens": 25
  }
}
```
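A minimal sketch of pulling the assistant reply and token usage out of the response shape above — the payload here is the sample response hard-coded as a Python dict, standing in for a parsed `response.json()`:

```python
# Sample /v1/chat/completions response, mirroring the schema above.
response = {
    "id": "gen-xxx",
    "object": "chat.completion",
    "created": 1234567890,
    "model": "deepseek-v3",
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "Hello! How can I help you today?",
            },
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 10, "completion_tokens": 15, "total_tokens": 25},
}

# The reply text lives in choices[0].message.content;
# billing-relevant counts live under usage.
reply = response["choices"][0]["message"]["content"]
total_tokens = response["usage"]["total_tokens"]
print(reply)         # Hello! How can I help you today?
print(total_tokens)  # 25
```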
**Response Headers:**
- `X-AICU-Model`: model used
- `X-AICU-Provider`: provider (openrouter/groq/aicu)
- `X-AICU-Tokens`: tokens consumed
- `X-AICU-AP-Cost`: AP consumed
- `X-AICU-Latency-Ms`: latency in milliseconds
---
## GET /v1/chat/models
Retrieves the list of available models.
```bash
curl https://api.aicu.ai/v1/chat/models
```
**Response:**
```json
{
  "object": "list",
  "data": [
    {
      "id": "deepseek-v3",
      "owned_by": "openrouter",
      "pricing": {"ap_per_1k_tokens": 3},
      "stage": "beta",
      "description": "General-purpose, good cost-performance"
    }
  ]
}
```
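Since each entry carries its AP price, the list above can be filtered client-side — for example to find the zero-cost models. The payload here is a hand-built sample following the schema above, not a live response:

```python
# Sample /v1/chat/models response (schema as documented above).
models_response = {
    "object": "list",
    "data": [
        {"id": "groq-llama-3.1-8b", "owned_by": "groq",
         "pricing": {"ap_per_1k_tokens": 0}, "stage": "beta"},
        {"id": "deepseek-v3", "owned_by": "openrouter",
         "pricing": {"ap_per_1k_tokens": 3}, "stage": "beta"},
    ],
}

# Keep only models that cost 0 AP per 1K tokens.
free_models = [
    m["id"] for m in models_response["data"]
    if m["pricing"]["ap_per_1k_tokens"] == 0
]
print(free_models)  # ['groq-llama-3.1-8b']
```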
---
## Available Models
| Model | Provider | Cost | Description |
|-------|----------|------|-------------|
| groq-llama-3.3-70b | Groq | 0 AP (free) | High quality; 30 RPM limit |
| groq-llama-3.1-8b | Groq | 0 AP (free) | Fast; 30 RPM limit |
| deepseek-v3 | OpenRouter | 3 AP/1K | General-purpose, good cost-performance |
| gemini-flash | OpenRouter | 5 AP/1K | Multimodal |
| llama-3.1-8b | OpenRouter | 2 AP/1K | Lightweight |
| llama-3.1-70b | OpenRouter | 20 AP/1K | High quality |
| qwen3-32b | OpenRouter | 10 AP/1K | Large context window |
---
## OpenAI SDK Compatibility
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.aicu.ai/v1",
    api_key="dummy",  # Beta: no authentication required
)

response = client.chat.completions.create(
    model="deepseek-v3",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```
```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.aicu.ai/v1',
  apiKey: 'dummy', // Beta: no authentication required
});

const response = await client.chat.completions.create({
  model: 'deepseek-v3',
  messages: [{ role: 'user', content: 'Hello!' }],
});
console.log(response.choices[0].message.content);
```
---
## Credits (Billing)
- 100 AP = ¥1
- Groq models: free (rate-limited)
- Beta: no authentication required
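Combining the two conversion rates above (100 AP = ¥1, plus each model's AP/1K price from the table), the yen cost of a request works out as a sketch like this — the helper name is mine, and any rounding the billing system applies is an assumption not covered here:

```python
def yen_cost(total_tokens: int, ap_per_1k_tokens: int) -> float:
    """Estimate the yen cost of a request from the published rates.

    Each model charges `ap_per_1k_tokens` AP per 1,000 tokens,
    and 100 AP = ¥1.
    """
    ap = total_tokens / 1000 * ap_per_1k_tokens  # AP consumed
    return ap / 100                              # 100 AP = ¥1

# e.g. a 2,500-token deepseek-v3 call (3 AP/1K) costs 7.5 AP = ¥0.075
print(yen_cost(2500, 3))
```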
---
## Rate Limits
| Provider | Limit |
|----------|-------|
| Groq | 30 RPM |
| OpenRouter | Depends on plan |
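When a limit is hit, providers typically answer HTTP 429, so clients usually retry with exponential backoff. A minimal sketch — `RateLimitError` and the fake request are stand-ins for whatever your HTTP client raises on 429, not part of this API:

```python
import time

class RateLimitError(Exception):
    """Stand-in for the HTTP 429 error your client library raises."""

def call_with_backoff(make_request, max_retries=4, base_delay=0.1):
    """Retry `make_request` with exponential backoff on rate-limit errors."""
    for attempt in range(max_retries):
        try:
            return make_request()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error
            time.sleep(base_delay * 2 ** attempt)  # 0.1s, 0.2s, 0.4s, ...

# Demo with a fake request that is throttled twice, then succeeds:
calls = {"n": 0}
def fake_request():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError("429 Too Many Requests")
    return "ok"

print(call_with_backoff(fake_request))  # ok
```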
---
## Support
- Skill: https://api.aicu.ai/skills/llm
- Dashboard: https://api.aicu.ai/dashboard
- Contact: https://corp.aicu.ai/ja/contact
© 2026 AICU Japan K.K.