
Use Echoo with Ollama (Local LLM)

Run AI text transformation 100% locally on your Mac with Ollama and Echoo. Maximum privacy, zero API costs, offline capable.

Setup Guide

1. Install Ollama: Download Ollama from ollama.com and install it on your Mac.

2. Pull a model: Open Terminal and run ollama pull llama3 (or any model you prefer).

3. Start Ollama: Run ollama serve in Terminal; on macOS, the Ollama app usually starts the server automatically after installation.

4. Open Echoo settings: Click the Echoo icon and open Settings > AI Provider.

5. Select Ollama: Choose Ollama as your provider. The default endpoint (localhost:11434) is pre-configured.
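Once the steps above are complete, Echoo talks to the Ollama server over its local HTTP API. As an illustration of what that connection looks like, the sketch below sends a prompt to the default /api/generate endpoint; the function names are hypothetical, while the endpoint, payload fields, and response shape come from Ollama's API:

```python
import json
import urllib.request

# Default local Ollama endpoint (matches the pre-configured value in step 5).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Payload for Ollama's /api/generate endpoint; stream=False asks
    for the whole completion in a single JSON response."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return its reply text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the completion under "response".
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server with llama3 pulled):
# print(generate("llama3", "Rewrite this politely: send the file now."))
```

Because everything goes to localhost:11434, no request ever leaves the machine.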

Features

Secure processing - your data never leaves your Mac
Zero API costs - runs entirely on your hardware
Works offline without internet
Choose from hundreds of open-source models
Apple Silicon optimized for fast inference

Supported Models

Llama 3.2, Mistral, Phi-3, CodeLlama, Gemma, Qwen
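Any model you have pulled appears as a choice once it is installed locally. Ollama exposes a GET /api/tags endpoint that lists installed models; a minimal sketch of querying and parsing it (the helper names are assumptions, the endpoint and JSON shape are Ollama's):

```python
import json
import urllib.request

# Lists locally installed models on the default Ollama endpoint.
TAGS_URL = "http://localhost:11434/api/tags"

def model_names(tags_response: dict) -> list:
    """Extract model names from Ollama's /api/tags JSON, e.g.
    {"models": [{"name": "llama3:latest"}, ...]}."""
    return [m["name"] for m in tags_response.get("models", [])]

def installed_models() -> list:
    """Query the local Ollama server (must be running) for installed models."""
    with urllib.request.urlopen(TAGS_URL) as resp:
        return model_names(json.loads(resp.read()))
```

If a model you want is missing from the list, pull it first with ollama pull followed by its name.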


Get Started with Ollama

Download Echoo and connect to Ollama in minutes.