Get started with Eden AI’s OpenAI-compatible LLM endpoint in minutes.
You need credits to make API calls. For testing without using credits, use a sandbox token.

Prerequisites

  1. API Token - Get your token from the Eden AI dashboard
  2. Credits - Ensure your account has sufficient credits

Make Your First Call

Eden AI provides access to 100+ LLM models through a single OpenAI-compatible endpoint.
Python
import requests

url = "https://api.edenai.run/v3/llm/chat/completions"
headers = {
    "Authorization": "Bearer YOUR_API_KEY",  # replace with your Eden AI API token
    "Content-Type": "application/json"
}

payload = {
    "model": "openai/gpt-4",  # provider/model format (see Model Format below)
    "messages": [
        {"role": "user", "content": "Hello! How are you?"}
    ]
}

response = requests.post(url, headers=headers, json=payload)
result = response.json()
print(result["choices"][0]["message"]["content"])
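Because the endpoint is OpenAI-compatible, standard chat-completion parameters such as temperature and max_tokens can be added to the same payload. A minimal sketch of an extended payload (parameter support may vary by model and provider):

```python
# Hypothetical extension of the payload above with standard OpenAI-style
# sampling parameters; exact support may vary by provider.
payload = {
    "model": "openai/gpt-4",
    "messages": [
        {"role": "user", "content": "Hello! How are you?"}
    ],
    "temperature": 0.7,  # sampling temperature; lower is more deterministic
    "max_tokens": 256,   # cap on tokens generated in the reply
}
```

The extended payload is sent with the same requests.post call shown above.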

Response

{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "choices": [
    {
      "message": {"role": "assistant", "content": "Hello! I'm doing well, thank you."},
      "index": 0,
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 12, "completion_tokens": 10, "total_tokens": 22}
}
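The fields of this response can be read directly from the parsed JSON. For instance, using the sample body shown above:

```python
# Sample response body from the chat completions endpoint (as shown above)
result = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "choices": [
        {
            "message": {"role": "assistant", "content": "Hello! I'm doing well, thank you."},
            "index": 0,
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 12, "completion_tokens": 10, "total_tokens": 22},
}

# Pull out the assistant reply, why generation stopped, and token usage
content = result["choices"][0]["message"]["content"]
finish_reason = result["choices"][0]["finish_reason"]
total_tokens = result["usage"]["total_tokens"]
print(content)        # Hello! I'm doing well, thank you.
print(total_tokens)   # 22
```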

Using OpenAI SDK

You can use the official OpenAI SDK with Eden AI:
Python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_EDEN_AI_API_KEY",           # your Eden AI token, not an OpenAI key
    base_url="https://api.edenai.run/v3/llm"  # point the SDK at Eden AI
)

response = client.chat.completions.create(
    model="anthropic/claude-sonnet-4-5",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)

Model Format

Use the format provider/model:
  • openai/gpt-4
  • anthropic/claude-sonnet-4-5
  • google/gemini-2.5-flash
  • mistral/mistral-large
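Every model ID follows the same provider/model shape, so the provider can be recovered by splitting on the first slash. A small illustrative helper (split_model_id is not part of any SDK, just a sketch):

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split an Eden AI-style model ID ("provider/model") into its parts."""
    provider, _, model = model_id.partition("/")
    return provider, model

print(split_model_id("anthropic/claude-sonnet-4-5"))  # ('anthropic', 'claude-sonnet-4-5')
print(split_model_id("google/gemini-2.5-flash"))      # ('google', 'gemini-2.5-flash')
```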

List all available models

Discover all 100+ LLM models available through Eden AI.

Next Steps