
Use Echoo with LiteLLM

Use LiteLLM as a proxy to connect Echoo to 100+ AI providers through a unified interface. Maximum flexibility for your AI stack.

Setup Guide

1

Install LiteLLM

Run: pip install litellm, or use Docker.
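The install commands might look like this (a sketch: the `[proxy]` extra pulls in the proxy-server dependencies, and the Docker image name follows LiteLLM's published image — check the project's docs for the current tag):

```shell
# Install LiteLLM with the proxy-server extras (quotes keep zsh happy):
pip install 'litellm[proxy]'

# Or pull the official Docker image instead:
docker pull ghcr.io/berriai/litellm:main-latest
```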

2

Configure providers

Set up your API keys for any supported providers in LiteLLM config.
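A minimal `config.yaml` might look like this (a sketch following LiteLLM's documented config format; the model names and `os.environ/` key references are examples — swap in whichever providers you use):

```yaml
# Each entry maps a public model_name to an underlying provider model.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY   # read from the environment
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
```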

3

Start the proxy

Run: litellm --model gpt-4o (or your preferred model).
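With a config file from the previous step, starting the proxy might look like this (a sketch; `--port 4000` is LiteLLM's default and is shown only for clarity):

```shell
# Quick start with a single model:
litellm --model gpt-4o

# Or start from the config file, listening on the default port:
litellm --config config.yaml --port 4000
```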

4

Open Echoo settings

Click the Echoo icon and open Settings > AI Provider.

5

Connect to LiteLLM

Select "Internal Provider" (the OpenAI-compatible endpoint option) and enter the LiteLLM proxy URL (default: http://localhost:4000). There is no dedicated LiteLLM entry in the provider dropdown; because LiteLLM exposes an OpenAI-compatible API, it connects through the Internal Provider.
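Before pointing Echoo at the proxy, you can confirm it answers OpenAI-style requests (a sketch assuming the default port and no master key; if you configured a master key, add it as a bearer token):

```shell
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o", "messages": [{"role": "user", "content": "Hello"}]}'
```

A JSON chat-completion response here means Echoo's Internal Provider should connect with the same URL.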

Features

Unified interface for 100+ LLM providers
Load balancing across multiple providers
Fallback routing if one provider is down
Cost tracking and usage monitoring
OpenAI-compatible API format
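Load balancing and fallbacks are configured in the same `config.yaml`. A hedged sketch (the Azure deployment name is hypothetical, and in some LiteLLM versions the `fallbacks` key lives under `litellm_settings` rather than `router_settings` — verify against the current docs):

```yaml
# Two deployments under one model_name = requests load-balanced across them.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: gpt-4o
    litellm_params:
      model: azure/my-gpt4o-deployment   # hypothetical Azure deployment name
      api_key: os.environ/AZURE_API_KEY
      api_base: os.environ/AZURE_API_BASE

# If every gpt-4o deployment fails, route the request to claude-sonnet.
router_settings:
  fallbacks:
    - gpt-4o: ["claude-sonnet"]
```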

Supported Models

Any model from 100+ providers, including OpenAI, Anthropic, Google, Azure, and AWS Bedrock.


Get Started with LiteLLM

Download Echoo and connect to LiteLLM in minutes.