Overview
Eden AI V3 provides full OpenAI API compatibility with multi-provider support. The endpoint follows OpenAI’s exact format, making it a drop-in replacement.

Endpoint:

Model Format
Use the simplified `provider/model` string format for LLMs:

- openai/gpt-4
- anthropic/claude-sonnet-4-5
- google/gemini-2.5-flash
- cohere/command-r-plus
Basic Chat Completion
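The original code sample did not survive extraction; below is a minimal sketch using only the standard library. The endpoint URL in the comment is an assumption (substitute the URL from the Endpoint section above), and `EDEN_API_KEY` is a hypothetical environment variable name.

```python
import json
import os
import urllib.request

def chat_completion(payload: dict, url: str, api_key: str) -> dict:
    """POST an OpenAI-format chat completion request and return the parsed JSON reply."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# A basic single-turn request body in the OpenAI chat format.
payload = {
    "model": "openai/gpt-4",  # provider/model string
    "messages": [{"role": "user", "content": "Hello! What can you do?"}],
}

# To actually send it (needs a real key and the correct endpoint URL):
# reply = chat_completion(payload, "https://api.edenai.run/v3/llm/chat", os.environ["EDEN_API_KEY"])
# print(reply["choices"][0]["message"]["content"])
```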
Multi-Turn Conversations
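The code sample for this section is missing from the source; here is a sketch of what the request body might look like, reusing the OpenAI message format:

```python
# Each request carries the whole conversation so far; append the assistant's
# reply and the next user turn before calling the API again.
history = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "The capital of France is Paris."},
    {"role": "user", "content": "Roughly how many people live there?"},
]

payload = {
    "model": "openai/gpt-4",
    "messages": history,
}
```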
Build conversations by sending the accumulated message history with each request.

System Messages
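The original example for system messages is missing; a sketch of a request that includes one (the model string is chosen for illustration):

```python
# The system message sets persistent instructions for the whole conversation.
payload = {
    "model": "anthropic/claude-sonnet-4-5",
    "messages": [
        {
            "role": "system",
            "content": "You are a concise technical assistant. Answer in at most two sentences.",
        },
        {"role": "user", "content": "Explain nucleus sampling."},
    ],
}
```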
Guide the model’s behavior with a system message.

Temperature and Parameters
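A sketch of a request body using the tuning parameters documented in the table below; the values themselves are illustrative:

```python
payload = {
    "model": "openai/gpt-4",
    "messages": [{"role": "user", "content": "Write a tagline for a coffee shop."}],
    "temperature": 1.2,        # 0-2: higher values are more random
    "max_tokens": 60,          # hard cap on response length
    "top_p": 0.9,              # nucleus sampling threshold
    "frequency_penalty": 0.5,  # discourage repeated tokens (-2 to 2)
}
```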
Control response creativity and behavior through request parameters.

Available Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| model | string | Required | Model string (e.g., openai/gpt-4) |
| messages | array | Required | Conversation messages |
| stream | boolean | false | Enable streaming (uses SSE when true) |
| temperature | float | 1.0 | Randomness (0-2) |
| max_tokens | integer | - | Maximum response tokens |
| top_p | float | 1.0 | Nucleus sampling threshold |
| frequency_penalty | float | 0.0 | Penalize repeated tokens (-2 to 2) |
| presence_penalty | float | 0.0 | Penalize topic repetition (-2 to 2) |
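Setting `stream` to true switches the response to server-sent events, per the table above. A sketch of parsing such a stream, assuming the chunks follow the OpenAI `data: {...}` / `data: [DONE]` convention:

```python
import json

def parse_sse_chunks(lines):
    """Yield parsed JSON payloads from an OpenAI-style SSE stream."""
    for raw in lines:
        line = raw.strip()
        if not line.startswith("data: "):
            continue  # skip blank lines and keep-alive comments
        data = line[len("data: "):]
        if data == "[DONE]":  # end-of-stream sentinel
            break
        yield json.loads(data)

# Canned stream lines for illustration (a live stream comes from the HTTP response body):
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
text = "".join(c["choices"][0]["delta"]["content"] for c in parse_sse_chunks(sample))
print(text)  # -> Hello
```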
Response Format
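The original sample response is missing from the source; below is a sketch of the OpenAI-style body with illustrative values (exact fields returned by Eden AI may differ):

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1719000000,
  "model": "openai/gpt-4",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "Hello! How can I help?"},
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 9, "completion_tokens": 8, "total_tokens": 17}
}
```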
Responses use the standard OpenAI chat completion JSON schema.

Available Models
For the full list of supported models and their capabilities (PDF support, reasoning, web search, tool calling), see List LLM Models.

OpenAI Python SDK Integration
Use Eden AI with the OpenAI SDK by overriding the client’s base URL.

Next Steps
- First Expert Model Call: try OCR, image, and audio features with the Expert Models endpoint
- LLMs vs Expert Models: understand when to use each endpoint