
Use Echoo with LocalAI

Self-host AI models with LocalAI and use them with Echoo for private, offline text transformation on macOS.

Setup Guide

1. Install LocalAI

Follow the LocalAI installation guide at localai.io. Docker is the easiest method.
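If you go the Docker route, a typical invocation looks like the following sketch. The image tag and port mapping are assumptions based on LocalAI's published images; check localai.io for the current recommended tag for your hardware:

```shell
# Run the LocalAI all-in-one CPU image and expose its API on port 8080.
# The image tag is an assumption; see localai.io for current tags and
# GPU-enabled variants.
docker run -p 8080:8080 --name local-ai -ti localai/localai:latest-aio-cpu
```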

2. Download a model

Use the LocalAI model gallery to download a compatible model.
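As a sketch, gallery models can also be browsed and installed through LocalAI's HTTP API. The endpoint paths below follow the LocalAI gallery documentation and may differ between versions; `some-model-id` is a hypothetical placeholder, not a real gallery entry:

```shell
# List models available in the configured galleries.
curl http://localhost:8080/models/available

# Install one by id. "some-model-id" is a placeholder; pick a real id
# from the gallery listing above.
curl http://localhost:8080/models/apply \
  -H "Content-Type: application/json" \
  -d '{"id": "some-model-id"}'
```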

3. Start LocalAI

Run LocalAI. It serves an OpenAI-compatible API endpoint, by default at http://localhost:8080.
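Once it is running, you can sanity-check the endpoint with a quick request (this assumes LocalAI's default port 8080):

```shell
# List the models LocalAI has loaded. A JSON response with a "data"
# array indicates the OpenAI-compatible API is up and reachable.
curl http://localhost:8080/v1/models
```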

4. Open Echoo settings

Click the Echoo icon and open Settings > AI Provider.

5. Configure endpoint

Select "Internal Provider" (the OpenAI-compatible endpoint option) and enter your LocalAI endpoint URL. LocalAI has no dedicated entry in the provider dropdown; because it exposes an OpenAI-compatible API, it connects through the Internal Provider.
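Because the Internal Provider speaks the OpenAI API, you can verify the endpoint before pointing Echoo at it by sending a minimal chat completion. `your-model` is a placeholder for whichever model you installed in step 2, and whether Echoo expects the base URL with or without the `/v1` suffix may vary:

```shell
# Send a minimal OpenAI-style chat completion to LocalAI.
# Replace "your-model" with the model you downloaded in step 2.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "your-model",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

If this returns a completion, use the same address (e.g. `http://localhost:8080/v1`, or your host's address if LocalAI runs elsewhere) as the endpoint URL in Echoo.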

Features

OpenAI-compatible API for easy integration
Self-hosted for maximum control
Supports multiple model formats (GGML, GGUF, etc.)
GPU acceleration support

Supported Models

Any GGUF model
Llama-based models
Whisper (for voice)
Stable Diffusion


Get Started with LocalAI

Download Echoo and connect to LocalAI in minutes.