This comprehensive guide walks you through every step—from prerequisites to advanced features—ensuring a smooth and efficient setup of Cherry Studio and Ollama on macOS.
Recommended: Install via Homebrew

brew install ollama

Start Ollama as a background service:

brew services start ollama

Ollama will now be running at: http://localhost:11434/
Alternative: Manual installation

Download the .zip file from the official site, then drag Ollama.app into your Applications folder.

To verify, visit http://localhost:11434/ in your browser. If Ollama is running, you’ll see a status confirmation.
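The same check can be done from a script. This is a minimal sketch that probes Ollama's default address (http://localhost:11434/); the function name is illustrative, not part of any library.

```python
import urllib.request
import urllib.error

OLLAMA_URL = "http://localhost:11434/"  # Ollama's default local address

def ollama_is_running(url: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Return True if the local Ollama server answers at `url`."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # The root endpoint replies with a plain "Ollama is running" banner.
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    print("Ollama running:", ollama_is_running())
```

If the function returns False, start the service first (e.g. via `brew services start ollama`).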
Check the Ollama models page or use the terminal.
Example:
ollama pull deepseek-r1
ollama run deepseek-r1
You can now interact with the model directly from your terminal.
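Beyond the terminal, a pulled model can also be queried through Ollama's local REST API (POST to /api/generate). The sketch below assumes the default port and the `deepseek-r1` model pulled above; the helper names are my own, not part of any library.

```python
import json
import urllib.request

def build_payload(prompt: str, model: str = "deepseek-r1") -> dict:
    """Request body for a single, non-streaming completion."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "deepseek-r1",
             base_url: str = "http://localhost:11434") -> str:
    """POST to /api/generate and return the model's full reply text."""
    data = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=300) as resp:
        return json.loads(resp.read())["response"]
```

For example, `generate("Why is the sky blue?")` returns the model's answer as a string once the model has been pulled and the service is running.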
Option 1: Download from the official website

Download the .dmg or .zip file, then drag Cherry Studio to your Applications folder.

Option 2: Install via Homebrew

brew install --cask cherry-studio

This method ensures you're installing the latest version.
Open the application from your Applications folder. In the settings, add Ollama as a model provider and point its API address at:

http://localhost:11434/
Compare responses by interacting with multiple models (OpenAI, Gemini, Ollama, etc.) in a single conversation window.
Create assistants for tasks like summarization, code generation, document analysis, and more. Cherry Studio includes 300+ prebuilt assistants, or you can customize your own.
Upload and analyze files; your selected model will handle the extraction and interpretation.
Utilize translation, global search, and other productivity tools—all powered by your chosen LLM.
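A custom assistant of the kind described above boils down to a system prompt sent alongside the user's message. As a sketch of the idea against Ollama's /api/chat endpoint (the helper names here are illustrative assumptions, not Cherry Studio's internals):

```python
import json
import urllib.request

def assistant_messages(system_prompt: str, user_text: str) -> list:
    """Message list for /api/chat: the system prompt defines the assistant."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_text},
    ]

def chat(system_prompt: str, user_text: str, model: str = "deepseek-r1",
         base_url: str = "http://localhost:11434") -> str:
    """One non-streaming round trip through Ollama's chat endpoint."""
    body = {
        "model": model,
        "messages": assistant_messages(system_prompt, user_text),
        "stream": False,
    }
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=300) as resp:
        return json.loads(resp.read())["message"]["content"]
```

Swapping the system prompt ("Summarize tersely", "Review this code", and so on) is what turns one local model into many task-specific assistants.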
Missing a model? Pull it with:

ollama pull model-name

Ollama not running? Restart it with:

brew services restart ollama
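To see which models are already pulled without leaving a script, Ollama lists them at /api/tags. A small sketch, assuming the default port (the parsing helper is my own):

```python
import json
import urllib.request

def model_names(tags_json: dict) -> list:
    """Extract model names from an /api/tags response body."""
    return [m["name"] for m in tags_json.get("models", [])]

def list_local_models(base_url: str = "http://localhost:11434") -> list:
    """Fetch and return the names of locally available models."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return model_names(json.loads(resp.read()))
```

If a model you expect is missing from the list, `ollama pull` it before selecting it in Cherry Studio.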
Q: Can I use Cherry Studio without Ollama?
Yes, Cherry Studio supports cloud-based LLMs like OpenAI, Gemini, and Anthropic. Ollama enhances your setup by allowing offline, local inference.
Q: Do I need to be online after setup?
Only for downloading updates or models. Once installed, everything works offline.
Q: Can I switch between multiple models?
Absolutely. Cherry Studio enables dynamic switching and comparison between various local and cloud-based models.
| Step | Cherry Studio | Ollama |
| --- | --- | --- |
| Download | Official website / Homebrew | Official website / Homebrew |
| Install | Drag to Applications / brew install | Drag to Applications / brew install |
| Configure | Add Ollama provider, select model | Run service, pull and run models |
| Use | Chat, assistants, tools | Terminal or through Cherry Studio |
With Cherry Studio and Ollama on macOS, you can unlock the full power of modern AI—locally, privately, and efficiently. Whether you're building custom AI assistants, analyzing documents, or experimenting with powerful open-source models, this setup provides a seamless and flexible environment for both casual and advanced users.