This comprehensive guide walks you through installing and running Cherry Studio with Ollama on Ubuntu Linux. Learn how to set up a robust local environment for running large language models (LLMs) privately, securely, and efficiently.
Whether you're a developer, researcher, or privacy-conscious user, this setup will give you powerful AI capabilities without relying on cloud services.
Cherry Studio is a cross-platform desktop application for managing and interacting with LLMs. It supports local models via Ollama as well as cloud providers such as OpenAI, Gemini, and Anthropic.
Ollama is an open-source tool that runs LLMs locally, with support for popular models such as LLaMA 2, DeepSeek, Mistral, and Gemma. It exposes an OpenAI-compatible API, making integration with other tools seamless.
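To illustrate that OpenAI-compatible API, here is a minimal sketch of a chat request you could send once a model is running. The `/v1/chat/completions` route is Ollama's OpenAI-compatibility endpoint; the payload assumes you have pulled `llama3`:

```shell
# Build a chat request body for Ollama's OpenAI-compatible endpoint.
# Assumes the llama3 model has been pulled; swap in any local model name.
payload='{"model": "llama3", "messages": [{"role": "user", "content": "Say hello"}]}'
echo "$payload"

# With Ollama running, send it like this:
# curl http://localhost:11434/v1/chat/completions \
#   -H "Content-Type: application/json" \
#   -d "$payload"
```

Because the endpoint mirrors OpenAI's API shape, most OpenAI-compatible client libraries can also be pointed at it by changing only the base URL.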
Cherry Studio provides Linux builds in .AppImage and .deb formats.
wget https://github.com/Cherry-AI/CherryStudio/releases/download/vX.Y.Z/CherryStudio-x86_64.AppImage
chmod +x CherryStudio-x86_64.AppImage
./CherryStudio-x86_64.AppImage
sudo dpkg -i CherryStudio-x86_64.deb
sudo apt-get install -f
Then run:
cherrystudio
Install Ollama using the official script:
curl -fsSL https://ollama.com/install.sh | sh
Verify the installation:
ollama --version
Choose and download a model:
ollama pull llama3
Start the model:
ollama run llama3
Keep this terminal window open—Cherry Studio connects to Ollama’s local API at http://localhost:11434.
In Cherry Studio, set the Ollama API address to http://localhost:11434 and add the model by its exact name (e.g., llama3). Use Ollama to manage multiple models:
ollama list # View available models
ollama pull gemma # Download another model
ollama rm llama3 # Remove a model
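For scripting, the table that `ollama list` prints can be reduced to bare model names with a little awk. The sample output below is illustrative; the exact columns may vary between Ollama versions:

```shell
# Illustrative `ollama list` output (column layout may differ by version).
sample="NAME            ID              SIZE     MODIFIED
llama3:latest   365c0bd3c000    4.7 GB   2 days ago
gemma:latest    a72c7f4d0a15    5.0 GB   1 hour ago"

# Print just the model names, skipping the header row.
echo "$sample" | awk 'NR > 1 { print $1 }'
# Prints:
#   llama3:latest
#   gemma:latest
```

The same one-liner is handy for looping over models, e.g. to pull updates for everything you have installed.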
Add any additional models to Cherry Studio under “Providers > Ollama > Add Model.”
If Ollama runs on a non-default port, update the API Address in Cherry Studio settings accordingly:
http://localhost:12345
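One way to move Ollama to a different port is the OLLAMA_HOST environment variable, which the server reads at startup. A sketch, assuming you run `ollama serve` manually rather than via the systemd service:

```shell
# Ollama reads OLLAMA_HOST to decide where to listen (host:port).
export OLLAMA_HOST=127.0.0.1:12345

# Start the server on the new address (requires Ollama installed):
# ollama serve

# This is the value to enter as the API Address in Cherry Studio:
echo "http://${OLLAMA_HOST}"
```

If Ollama was installed as a systemd service by the official script, set the variable in the service environment instead (e.g., via `systemctl edit ollama`) so it survives restarts.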
Install FUSE for AppImage support:
sudo apt install libfuse2
Check if the API is live:
curl http://localhost:11434
If the request fails, ensure the Ollama server is running and that nothing is blocking port 11434.
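If you want to check the port itself without curl, bash's built-in /dev/tcp redirection gives a quick probe (a sketch; 11434 is Ollama's default port):

```shell
#!/usr/bin/env bash
# Probe a local TCP port using bash's /dev/tcp (no extra tools required).
port=11434
if (echo > "/dev/tcp/127.0.0.1/${port}") 2>/dev/null; then
  echo "port ${port} is open"
else
  echo "port ${port} is closed"
fi
```

If the port is closed, start the model again with `ollama run llama3` and re-check.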
Ensure the model name exactly matches the name shown in:
ollama list
| Step | Command/Action | Notes |
|---|---|---|
| Download Cherry Studio | `wget ...` | Get latest version from GitHub |
| Run Cherry Studio | `./CherryStudio-x86_64.AppImage` | Or install with .deb package |
| Install Ollama | `curl -fsSL ... \| sh` | Official install script |
| Download Model | `ollama pull llama3` | Choose a supported model |
| Run Model | `ollama run llama3` | Keep terminal open |
| Configure in Cherry | Settings → Model Providers → Add Ollama | Use default API address |
| Add Model to Cherry | Use + Add and model name | Must match name from `ollama list` |
By integrating Cherry Studio with Ollama on Ubuntu, you unlock the full power of local LLMs with zero reliance on cloud infrastructure. This setup is ideal for developers, researchers, and privacy advocates seeking flexibility, performance, and security.