LibreChat
Configure LibreChat, the open-source ChatGPT alternative, to use Eden AI for access to 200+ AI models.
Overview
LibreChat is a free, open-source AI chat platform that supports multiple providers. By connecting it to Eden AI, you get:
- 200+ models: Access OpenAI, Anthropic, Google, Cohere, and more through one interface
- Self-hosted: Full control over your data and infrastructure
- Cost savings: Leverage Eden AI’s competitive pricing
- Unified experience: Single chat interface for all providers
Prerequisites
- Docker and Docker Compose installed
- Eden AI API key from https://app.edenai.run
- Basic knowledge of environment variables
Installation
Option 1: Docker Compose (Recommended)
Clone LibreChat and set it up with Docker:
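A minimal sketch of the standard Docker Compose workflow, assuming the official danny-avila/LibreChat repository layout:

```bash
# Clone the LibreChat repository
git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat

# Create your environment file from the provided template
cp .env.example .env

# Start the stack (API, MongoDB, and supporting services)
docker compose up -d
```

Option 2: Manual Installation
For a non-Docker setup, clone the repository, install the Node.js dependencies, build the frontend, and run the backend directly; MongoDB must be available separately. See the LibreChat documentation for the exact commands.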
Configuration
Step 1: Configure Environment Variables
Edit the .env file to add the Eden AI configuration:
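A sketch of the relevant .env entries, assuming Eden AI exposes an OpenAI-compatible endpoint; the base URL and model names below are assumptions to replace with the values shown in your Eden AI dashboard:

```env
# Your Eden AI API key (from https://app.edenai.run)
OPENAI_API_KEY=your-eden-ai-api-key

# Route OpenAI-endpoint traffic through Eden AI
# (URL is an assumption; confirm the exact value in your dashboard)
OPENAI_REVERSE_PROXY=https://api.edenai.run/v2/llm/chat

# Models to expose in the model selector (placeholder identifiers)
OPENAI_MODELS=gpt-4o,gpt-4o-mini,claude-3-5-sonnet,gemini-1.5-pro
```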
Step 2: Configure librechat.yaml
Create or edit librechat.yaml for advanced configuration:
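A sketch of a custom endpoint definition using LibreChat's custom-endpoint schema; the baseURL and model identifiers are assumptions to replace with the values from your Eden AI dashboard:

```yaml
version: 1.0.5
cache: true

endpoints:
  custom:
    - name: "Eden AI"
      # Reads the key from the EDEN_AI_API_KEY variable in .env
      apiKey: "${EDEN_AI_API_KEY}"
      # Assumed OpenAI-compatible base URL; confirm in your Eden AI dashboard
      baseURL: "https://api.edenai.run/v2/llm"
      models:
        default: ["gpt-4o", "gpt-4o-mini", "claude-3-5-sonnet", "gemini-1.5-pro"]
        fetch: false
      titleConvo: true
      titleModel: "gpt-4o-mini"
      modelDisplayLabel: "Eden AI"
```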
Step 3: Start LibreChat
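Assuming the Docker Compose setup from Option 1 (the service name api matches the stock compose file):

```bash
# Start (or restart) the stack with the new configuration
docker compose up -d

# Watch the API logs to confirm a clean start
docker compose logs -f api
```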
Access LibreChat at http://localhost:3080.
Available Models
Configure which models appear in the LibreChat interface:
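The model selector is driven by the models list in librechat.yaml (or OPENAI_MODELS in .env). A sketch with placeholder identifiers; use the model names exposed by your Eden AI account:

```yaml
endpoints:
  custom:
    - name: "Eden AI"
      models:
        default:
          - "gpt-4o"
          - "gpt-4o-mini"
          - "claude-3-5-sonnet"
          - "claude-3-haiku"
          - "gemini-1.5-pro"
          - "mistral-large"
        # Keep fetch off so only the models listed above are shown
        fetch: false
```

Features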
Multi-Model Conversations
Switch between models mid-conversation:
- Start a conversation with Claude
- Click the model selector
- Switch to GPT-4 or Gemini
- Continue the conversation seamlessly
File Attachments
Upload files for vision-capable models:
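File handling can be tuned via the fileConfig block in librechat.yaml; a sketch under the assumption that the default limits need adjusting (values are illustrative):

```yaml
fileConfig:
  endpoints:
    default:
      fileLimit: 5           # max files per request
      fileSizeLimit: 10      # MB per file
      totalSizeLimit: 50     # MB per request
      supportedMimeTypes:
        - "image/jpeg"
        - "image/png"
```

Preset Prompts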
Create custom prompts for common tasks by saving a preset (model, system prompt, and parameters) from the chat interface and reusing it across conversations.
Advanced Configuration
Custom Model Parameters
Configure temperature, max tokens, and other parameters:
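For custom endpoints, extra request parameters can be injected with addParams (and removed with dropParams) in librechat.yaml; a sketch with illustrative values:

```yaml
endpoints:
  custom:
    - name: "Eden AI"
      apiKey: "${EDEN_AI_API_KEY}"
      baseURL: "https://api.edenai.run/v2/llm"   # assumed; see Step 2
      models:
        default: ["gpt-4o"]
      # Injected into every request sent to this endpoint
      addParams:
        temperature: 0.7
        max_tokens: 2000
        top_p: 0.9
```

User Authentication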
Enable user registration and authentication:
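Registration and login are controlled from .env; a sketch of the commonly used flags (replace the secrets with your own long random values):

```env
# Allow users to create accounts with email + password
ALLOW_REGISTRATION=true
ALLOW_EMAIL_LOGIN=true
ALLOW_SOCIAL_LOGIN=false

# Token secrets used to sign sessions (use your own random strings)
JWT_SECRET=replace-with-a-long-random-string
JWT_REFRESH_SECRET=replace-with-another-long-random-string
```

Rate Limiting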
Protect your API key with rate limiting:
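LibreChat ships message rate-limit flags in .env; a sketch, assuming the variable names from the stock .env.example (values are illustrative):

```env
# Limit simultaneous messages per user
LIMIT_CONCURRENT_MESSAGES=true
CONCURRENT_MESSAGE_MAX=2

# Limit messages per IP address (window in minutes)
LIMIT_MESSAGE_IP=true
MESSAGE_IP_MAX=40
MESSAGE_IP_WINDOW=1

# Limit messages per authenticated user (window in minutes)
LIMIT_MESSAGE_USER=true
MESSAGE_USER_MAX=40
MESSAGE_USER_WINDOW=1
```

Conversation History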
Configure MongoDB for persistent conversations:
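Conversations are stored in MongoDB; point MONGO_URI at your instance in .env. With the bundled Docker Compose setup, the hostname is the mongodb service name:

```env
# Docker Compose: use the mongodb service name as the host
MONGO_URI=mongodb://mongodb:27017/LibreChat

# Standalone MongoDB on the same machine instead:
# MONGO_URI=mongodb://127.0.0.1:27017/LibreChat
```

Docker Deployment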
Production Docker Compose
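A minimal sketch of a production-style compose file (API plus MongoDB only); the LibreChat repository also ships its own deploy-compose.yml, which is the better starting point:

```yaml
services:
  api:
    image: ghcr.io/danny-avila/librechat:latest
    restart: always
    ports:
      - "3080:3080"
    env_file:
      - .env
    volumes:
      # Mount the custom endpoint configuration into the container
      - ./librechat.yaml:/app/librechat.yaml
    depends_on:
      - mongodb

  mongodb:
    image: mongo:7
    restart: always
    volumes:
      - mongo_data:/data/db

volumes:
  mongo_data:
```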
Deploy to Production
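Assuming the repository's deploy-compose.yml (or the sketch above) is present on the production host:

```bash
# Pull the latest images and start in detached mode
docker compose -f ./deploy-compose.yml pull
docker compose -f ./deploy-compose.yml up -d

# Verify the containers are healthy
docker compose -f ./deploy-compose.yml ps
```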
Troubleshooting
Models Not Appearing
If models don’t show up in the interface, work through these checks (a command sketch follows the list):
- Check librechat.yaml syntax
- Verify your API key
- Clear the cache and restart
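A command sketch for the three checks, assuming the Docker Compose setup and an OpenAI-compatible /models route on the Eden AI base URL (confirm the exact URL in your dashboard):

```bash
# 1. Check librechat.yaml syntax (any YAML parser will do)
python3 -c "import yaml; yaml.safe_load(open('librechat.yaml'))" && echo "YAML OK"

# 2. Verify the API key against the endpoint (URL is an assumption)
curl -s -H "Authorization: Bearer $EDEN_AI_API_KEY" https://api.edenai.run/v2/llm/models

# 3. Clear cached containers and restart
docker compose down
docker compose up -d --force-recreate
```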
Authentication Errors
If you see 401 errors:
- Check that .env contains the correct API key
- Ensure there are no extra spaces in the key
- Verify the OPENAI_REVERSE_PROXY URL is correct
- Restart the services after changing .env
Slow Responses
If responses are slow:
- Use a faster model for chat titles (see the yaml sketch below)
- Disable unnecessary features such as conversation title generation (see the yaml sketch below)
- Check your internet connection and Eden AI status
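A librechat.yaml sketch for the first two items, reusing the custom endpoint from Step 2 (model names are placeholders):

```yaml
endpoints:
  custom:
    - name: "Eden AI"
      apiKey: "${EDEN_AI_API_KEY}"
      baseURL: "https://api.edenai.run/v2/llm"   # assumed; see Step 2
      models:
        default: ["gpt-4o", "gpt-4o-mini"]
      # Use a small, fast model for auto-generated conversation titles
      titleConvo: true
      titleModel: "gpt-4o-mini"
      # Or disable title generation entirely if it is not needed
      # titleConvo: false
```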
Connection Refused
If LibreChat can’t connect to MongoDB, work through these checks (a command sketch follows the list):
- Check that MongoDB is running
- Verify MONGO_URI in .env
- Check network connectivity between the containers
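A command sketch for the three checks, assuming the Docker Compose service names api and mongodb:

```bash
# 1. Check that the MongoDB container is running
docker compose ps mongodb
docker compose logs --tail=50 mongodb

# 2. Verify MONGO_URI in .env (should point at the mongodb service when using Compose)
grep MONGO_URI .env

# 3. Check connectivity by looking for connection errors in the API logs
docker compose logs --tail=100 api | grep -i mongo
```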
Security Best Practices
1. Secure API Keys
Never commit API keys to version control:
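A sketch: keep secrets in .env and make sure git ignores it (the stock LibreChat .gitignore already does this, but it is worth confirming):

```bash
# Ensure the environment file is never tracked
grep -qx ".env" .gitignore || echo ".env" >> .gitignore

# Confirm it is not already staged or committed
git status --short .env
```

2. Use Environment-Specific Configs
Keep separate environment files (for example .env.development and .env.production) and copy the appropriate one to .env at deploy time, so development keys never reach production.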
3. Enable HTTPS
Use a reverse proxy such as Nginx to terminate TLS in front of LibreChat:
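A minimal Nginx sketch that terminates TLS and proxies to LibreChat on port 3080; the domain and certificate paths are placeholders:

```nginx
server {
    listen 443 ssl;
    server_name chat.example.com;

    ssl_certificate     /etc/letsencrypt/live/chat.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/chat.example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:3080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
        # Needed for streaming responses
        proxy_buffering off;
    }
}
```

4. Implement Rate Limiting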
Protect against abuse by enabling the rate-limiting variables shown in the Rate Limiting section above.
Cost Optimization
1. Use Appropriate Models
Configure cheaper models for simple tasks:
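One option is to expose an inexpensive model alongside the premium ones and route lightweight work (titles, quick questions) to it; a librechat.yaml sketch with placeholder model names:

```yaml
endpoints:
  custom:
    - name: "Eden AI"
      apiKey: "${EDEN_AI_API_KEY}"
      baseURL: "https://api.edenai.run/v2/llm"   # assumed; see Step 2
      models:
        # Keep an inexpensive model first so it is the easy default
        default: ["gpt-4o-mini", "claude-3-haiku", "gpt-4o", "claude-3-5-sonnet"]
      # Titles are generated constantly; always use a cheap model for them
      titleModel: "gpt-4o-mini"
```

2. Monitor Usage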
Track costs through the Eden AI dashboard:
- View usage at https://app.edenai.run
- Set up billing alerts
- Review monthly reports
3. Limit Token Usage
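A sketch that caps response length for the custom endpoint via addParams (the parameter name assumes an OpenAI-compatible API):

```yaml
endpoints:
  custom:
    - name: "Eden AI"
      apiKey: "${EDEN_AI_API_KEY}"
      baseURL: "https://api.edenai.run/v2/llm"   # assumed; see Step 2
      models:
        default: ["gpt-4o-mini"]
      addParams:
        # Hard cap on completion tokens per request
        max_tokens: 1024
```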
Example Use Cases
1. Team Collaboration
Set up LibreChat for your team:
- Enable user registration for team members
- Configure multiple endpoints for different projects
- Use presets for common workflows (code review, documentation, etc.)
2. Customer Support
Deploy as an internal support tool:
- Create presets for support responses
- Use conversation history to maintain context
- Configure rate limits to prevent abuse
3. Development Assistant
Integrate with your development workflow:
- Code assistance with Claude or GPT-4
- Documentation generation with presets
- Bug analysis with vision models (screenshots)
Next Steps
- Open WebUI - Alternative chat interface
- Python SDK - Programmatic access
- API Reference - Complete API documentation
- Cost Management - Track spending