v0.9.11 Beta: Local LLM Support & Commands Marketplace
Local LLMs on Mac - Run Ollama, LocalAI, and LiteLLM with Echoo
Want to run AI models locally on your Mac with zero API costs and maximum privacy? v0.9.11-beta makes it possible. Connect Ollama, LocalAI, or LiteLLM and keep every byte of data on your machine. This release also introduces the Commands Marketplace - a community hub for sharing and discovering workflows.
Local / In-House LLM Support
Why Local LLMs?
- Maximum security - your data never leaves your machine
- Zero costs - no API fees, no usage limits
- Full control - choose any open-source model you want
- Offline capable - works without an internet connection
Supported Providers
- Ollama - the easiest way to run models locally
- LocalAI - a self-hosted, OpenAI-compatible API server
- LiteLLM - a proxy that unifies 100+ LLM providers behind a single OpenAI-compatible interface
Privacy first: With local models, your text never leaves your device. Not to us, not to any cloud provider. It's the most private way to use AI.
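All three providers can speak an OpenAI-compatible HTTP API, which is why one small client covers them. Here is a minimal sketch using only the Python standard library; the base URLs are the common defaults for each tool (Ollama on port 11434, LocalAI on 8080, a LiteLLM proxy on 4000), and the model name `llama3` is just an example - Echoo's own connection settings may differ.

```python
import json
import urllib.request

# Common default endpoints (assumptions, not Echoo internals):
#   Ollama:  http://localhost:11434/v1
#   LocalAI: http://localhost:8080/v1
#   LiteLLM: http://localhost:4000/v1

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Prepare an OpenAI-style chat completion request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def chat(base_url: str, model: str, prompt: str) -> str:
    """Send the request to the local server and return the assistant's reply."""
    req = build_chat_request(base_url, model, prompt)
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Requires a running local server, e.g. `ollama serve` after `ollama pull llama3`.
    print(chat("http://localhost:11434/v1", "llama3", "Say hello in five words."))
```

Because the request shape is identical across providers, switching from Ollama to LocalAI or LiteLLM is just a matter of changing the base URL and model name - no cloud credentials involved.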
Commands Marketplace
What You Can Do
- Browse a growing library of community-created commands
- Download commands instantly and start using them right away
- Share your own custom commands with the world
- Discover workflows you hadn't thought of
Community-powered: The Marketplace turns Echoo into a collaborative platform. Every user can contribute, and everyone benefits.
What's Next?
Adoption is accelerating worldwide, and the feedback has been incredible. With the Marketplace and local LLM support, Echoo is becoming a platform that adapts to your exact needs - whether you prioritize privacy, cost savings, or workflow efficiency.
We're excited for what's coming next!
Upgrade Now
Ready to try local LLMs and the Commands Marketplace? Update to the latest version and explore what's new.
Have questions or feedback? We'd love to hear from you! Get in touch with us on GitHub.
Mike
Creator of Echoo