Void Linux is a lightweight, systemd-free Linux distribution lauded for its speed, minimalism, and control. With the rise of local AI and Large Language Models (LLMs), tools like Ollama have made it easier for users to run advanced AI models on their own hardware.
This guide provides a thorough walkthrough for installing and running Ollama (and the Void AI editor) on Void Linux, covering everything from system preparation to advanced configuration and troubleshooting.
Before you begin, note three things about Void: it uses runit instead of systemd as its init system, it manages software as `.xbps` packages through the XBPS tools, and you will need sudo privileges. Start by updating the system:

```bash
sudo xbps-install -Syu
```
Then run the official install script, which places the Ollama binary in `/usr/local/bin`:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

On Void, particularly the musl flavor, Ollama may not work out of the box due to ABI issues.
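Whether the prebuilt binary will run depends on which libc your Void install uses. A quick way to check (a sketch, assuming `ldd` prints a version banner, which both glibc and musl do):

```shell
# Detect whether this system uses musl or glibc.
# musl's ldd banner mentions "musl"; glibc's says "GNU libc".
if ldd --version 2>&1 | grep -qi musl; then
    LIBC=musl
else
    LIBC=glibc
fi
echo "Detected libc: $LIBC"
```

If this reports `musl`, expect to build from source or run Ollama inside a glibc container.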
If the prebuilt binary fails, build Ollama from source:

```bash
sudo xbps-install -Sy bash git go gcc cmake make
git clone https://github.com/ollama/ollama.git && cd ollama
go generate ./...
go build .
./ollama serve &
```
With the server running, pull and chat with a model such as Phi-3:

```bash
ollama pull phi3
ollama run phi3
```
Models are stored in `/usr/share/ollama/.ollama/models` (system-wide install) or `$HOME/.ollama/models` (per-user). Backup example:

```bash
tar -cvf /mnt/hdd/backup/models.tar $HOME/.ollama/models
```
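The backup above can be wrapped in a small script that produces a date-stamped archive (a sketch; the `MODELS_DIR` default and the `/tmp` destination are illustrative assumptions, adjust the paths to your setup):

```shell
# Archive the per-user Ollama model directory to a date-stamped tarball.
MODELS_DIR="${MODELS_DIR:-$HOME/.ollama/models}"
BACKUP="/tmp/ollama-models-$(date +%Y%m%d).tar"
mkdir -p "$MODELS_DIR"   # no-op if the directory already exists
tar -cf "$BACKUP" -C "$(dirname "$MODELS_DIR")" "$(basename "$MODELS_DIR")"
echo "Wrote $BACKUP"
```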
To use Ollama through its API or other tools, start the server manually:

```bash
ollama serve &
```

The API then listens at `http://127.0.0.1:11434/`. Use `ollama ps` to list running models and `ollama stop phi3` to unload one; inside an interactive chat, type `/bye` to exit. For a persistent background service, write a custom runit script (Void has no systemd, so `systemctl` commands do not apply).
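To confirm the server is actually up, you can query the API root, which replies with a short status string when the server is running (a sketch; the two-second timeout and the `unreachable` fallback are arbitrary choices):

```shell
# Probe the Ollama API root; fall back to a marker string if it is down.
RESP=$(curl -s --max-time 2 http://127.0.0.1:11434/ || echo "unreachable")
echo "$RESP"
```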
The Void AI editor is distributed as a `.deb` file on its official GitHub releases page. On a Debian-based system it installs like this:

```bash
cd ~/Downloads
sudo apt update
sudo apt install ./void_1.99.30034_amd64.deb
```

Void Linux itself has no `apt`, so there you would need to unpack the `.deb` manually (or convert it with a helper such as `xdeb`).
Tunnel Ollama from another system:

```bash
ssh -L 7500:127.0.0.1:11434 user@ollamaserver
```
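Once the tunnel is open, the local CLI can be pointed at the forwarded port via Ollama's standard `OLLAMA_HOST` environment variable (port 7500 matches the tunnel above):

```shell
# Point the Ollama CLI at the SSH-forwarded port instead of the default 11434.
export OLLAMA_HOST=127.0.0.1:7500
# List models on the remote server (guarded so the snippet is safe to paste
# on a machine without ollama installed):
if command -v ollama >/dev/null 2>&1; then
    ollama list
fi
```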
Troubleshooting: Ollama keeps its state under `$HOME/.ollama`. The server listens on port `11434` by default; change it if something else occupies that port. If a model misbehaves, re-download it with `ollama pull phi3` and restart the server with `ollama serve &`.
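Changing the port is done with the same `OLLAMA_HOST` variable, which controls both where `ollama serve` binds and where the CLI connects (11500 here is just an example of a free port):

```shell
# Move the server off the default 11434 port.
export OLLAMA_HOST=127.0.0.1:11500
echo "Server will bind to http://${OLLAMA_HOST}/"
# ollama serve &   # start the server on the new port
```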
Can I use Ollama on Void-musl?
Yes, but it may require building from source or using glibc in a container.
Does Ollama support GPU on Void?
Yes, but manual setup is required. Use CPU mode as fallback.
Where are models stored?
In `/usr/share/ollama/.ollama/models` or `$HOME/.ollama/models`.
Can I run Ollama as a background service?
Yes, by running `ollama serve &` or writing a custom runit service script (Void has no systemd).
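On Void, a "custom runit script" means a service directory under `/etc/sv`. A minimal sketch (the service name and logging-to-stdout setup are assumptions, adjust to taste):

```shell
# Create the runit service directory and its run script.
sudo mkdir -p /etc/sv/ollama
sudo tee /etc/sv/ollama/run >/dev/null <<'EOF'
#!/bin/sh
exec ollama serve 2>&1
EOF
sudo chmod +x /etc/sv/ollama/run
# Enable the service by linking it into the active service directory.
sudo ln -s /etc/sv/ollama /var/service/
```

runit then supervises the process; `sudo sv status ollama` shows its state and `sudo sv restart ollama` restarts it.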
| Task | Command |
|---|---|
| Update Void Linux | `sudo xbps-install -Syu` |
| Install Ollama | `curl -fsSL https://ollama.com/install.sh \| sh` |
| Pull Model | `ollama pull phi3` |
| Run Model | `ollama run phi3` |
| Start Ollama Server | `ollama serve &` |
| List Running Models | `ollama ps` |
| Stop a Model | `ollama stop phi3` |
| Install Void IDE (.deb) | `sudo apt install ./void_xxx_amd64.deb` |
| SSH Tunnel Remote Ollama | `ssh -L 7500:127.0.0.1:11434 user@server` |
Running Ollama on Void Linux empowers users to run advanced AI locally with full control over privacy and performance. While Void’s minimal design can present compatibility hurdles, its flexibility and speed make it a great choice for power users. With proper setup, you can enjoy seamless AI-driven development, fully offline.
Need expert guidance? Connect with a top Codersera professional today!