The rise of AI-powered coding tools has transformed software development, but many popular solutions—like Cursor and GitHub Copilot—are closed-source and cloud-based, raising concerns about privacy and data control.
Enter Void, an open-source, locally-hosted AI code editor, and Ollama, a robust tool for running large language models (LLMs) on your own machine.
This guide provides a comprehensive walkthrough for running Void AI with Ollama on Ubuntu, creating a powerful, private, and customizable alternative to Cursor.
Void is an open-source AI code editor designed as a direct alternative to Cursor. Built as a fork of VS Code, it retains full extension and theme compatibility, while adding powerful AI features for code completion, editing, and chat—all with your data kept local.
Ollama is an open-source tool for running large language models directly on your local machine. It supports a wide range of LLMs (like Llama 2, Llama 3, Mistral, Gemma, and more), offering full data privacy, offline operation, and multi-platform support (Linux, macOS, Windows).
Combining Void and Ollama gives you a fully local, private, and extensible AI coding environment:
| Feature | Cursor | Void + Ollama |
|---|---|---|
| Open Source | No | Yes |
| Local Model Hosting | No (cloud-based) | Yes (Ollama) |
| Data Privacy | Limited | Full (local-only) |
| Extension Support | No | Yes (VS Code compatible) |
| Model Choice | Fixed | Any supported by Ollama |
| Cost | Paid/Subscription | Free & Open Source |
| Platform Support | macOS, Windows | Linux, macOS, Windows |
Update your package lists and upgrade existing packages:
```bash
sudo apt update
sudo apt upgrade -y
```
Ollama is required to host your LLMs locally.
a. Open Terminal and Install Ollama:
```bash
curl -fsSL https://ollama.com/install.sh | sh
```
b. Allow Ollama’s Default Port (if using a firewall):
```bash
sudo ufw allow 11434/tcp
```
c. Start the Ollama Daemon:
```bash
ollama serve
```
(On most Ubuntu setups the install script registers Ollama as a systemd service that starts automatically, so this step may be unnecessary; you can check with `systemctl status ollama`.)
d. Verify Ollama is Running:
Visit http://localhost:11434 in your browser. You should see the message: “Ollama is running”.
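If you prefer the terminal to a browser, the same check can be scripted. This is a minimal sketch using `curl` against Ollama's default port:

```shell
# Check whether the Ollama daemon is answering on its default port (11434).
if curl -fsS http://localhost:11434 >/dev/null 2>&1; then
  echo "Ollama is reachable"
else
  echo "Ollama is not reachable"
fi
```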
Ollama supports many models. For this guide, let’s use Llama 3.1 as an example.
a. Pull the Model:
```bash
ollama pull llama3.1:8b
```
b. Run the Model (Optional Test):
```bash
ollama run llama3.1:8b "Hello, world!"
```
You can list all available models:
```bash
ollama list
```
a. Download the Latest .deb Package:
Get the latest release from the Void GitHub Releases page.
b. Install Void Using APT:
```bash
cd ~/Downloads
sudo apt update
sudo apt install ./void_1.99.30034_amd64.deb
```
(Replace the filename with the latest version as needed).
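To confirm the installation succeeded, you can query dpkg. A small sketch, assuming the package is named `void` (check the actual name in the downloaded .deb file if this differs):

```shell
# Verify the Void package is registered with dpkg (assumed package name: "void").
if dpkg -s void >/dev/null 2>&1; then
  echo "void is installed"
else
  echo "void is not installed"
fi
```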
Start Void from your applications menu or by running:
```bash
void
```
On first launch, Void will detect your local Ollama instance running at `localhost:11434`.
Void automatically detects running Ollama instances on the default port (11434). If you have multiple models, you can choose which to use (e.g., `llama3.1:8b`) from within Void’s settings or model selection menu.

Void provides several AI-powered features, all running locally with your chosen model:
- Press `Tab` to accept AI-generated code completions.
- Press `Ctrl + K` to invoke AI-powered editing.
- Press `Ctrl + L` to open a chat window, ask questions, or attach files for context-aware answers.

All these features operate locally, ensuring your code and queries remain private.
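Under the hood, Void talks to Ollama's local HTTP API, which you can also query directly. A minimal sketch using Ollama's `/api/generate` endpoint and the `llama3.1:8b` model pulled earlier (it prints a notice instead if the daemon isn't running):

```shell
# Send a prompt straight to the local Ollama HTTP API — the same endpoint Void uses.
if curl -fsS http://localhost:11434 >/dev/null 2>&1; then
  curl -s http://localhost:11434/api/generate \
    -d '{"model": "llama3.1:8b", "prompt": "Say hello", "stream": false}'
else
  echo "Ollama is not running on localhost:11434"
fi
```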
You can manage models from the command line:
- Pull a model: `ollama pull <model-name>`
- Remove a model: `ollama rm <model-name>`
- Show model details: `ollama show <model-name>`
Ollama supports a wide range of models, including Llama 2, Llama 3, Mistral, Gemma, and more. You can pull and use any supported model:
```bash
ollama pull mistral
ollama run mistral "Summarize this code:"
```
Switch models in Void’s settings as described above.
Since Void is a fork of VS Code, you can install VS Code extensions and themes for additional functionality and customization.
Q: Void can’t connect to Ollama. What should I check?
- Make sure the Ollama daemon is running (`ollama serve`, or check the service with `systemctl status ollama`).
- Confirm Ollama is listening on `localhost:11434`.
- If you use a firewall, allow the port (`sudo ufw allow 11434/tcp`).
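A quick way to test the second point is to check whether anything is listening on Ollama's default port, for example:

```shell
# Quick diagnostic: is anything listening on Ollama's default port 11434?
if ss -ltn 2>/dev/null | grep -q ':11434'; then
  echo "something is listening on 11434"
else
  echo "nothing is listening on 11434"
fi
```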
Q: How do I improve AI performance?
Smaller models such as `llama3.1:8b` respond faster on modest hardware, while larger models generally give better answers if you have enough RAM or a capable GPU. Pull a few models with `ollama pull` and pick the best trade-off for your machine.

Q: Can I use Ollama and Void on Windows or macOS?
Yes. Both Ollama and Void support Linux, macOS, and Windows.

Q: Is my code or data ever sent to the cloud?
No. With Void and Ollama, the models run entirely on your machine, so your code and prompts never leave it.
Running Void AI with Ollama on Ubuntu gives you a powerful, private, and fully customizable AI coding environment—without the privacy trade-offs or costs of commercial solutions like Cursor.
Need expert guidance? Connect with a top Codersera professional today!