AI-powered code editors are transforming how developers write, refactor, and understand code. Among the most popular commercial options is Cursor, but its closed-source nature and subscription fees have prompted the rise of open-source alternatives.
Void is one such tool, designed as a privacy-first, flexible, and powerful AI coding IDE that can run local large language models (LLMs) using Ollama—making it a true Cursor alternative for Windows users.
What is Void?
Void is an open-source AI code editor, forked from Visual Studio Code (VS Code), and designed to offer advanced AI-powered coding features without the privacy or cost concerns of commercial alternatives like Cursor.
Key Principles:
Open Source: Free to use, modify, and contribute to.
Privacy-First: No data is sent to third-party servers unless you explicitly connect to a cloud model.
Flexible AI Integration: Supports both local and cloud-based LLMs, including direct integration with Ollama.
VS Code Compatibility: Seamlessly migrate themes, keybinds, and settings from VS Code for a familiar experience.
Key Features of Void
Void aims to replicate and extend the best features of Cursor, with unique advantages:
AI-Powered Autocomplete: Press Tab for smart code suggestions.
Inline AI Editing: Use Ctrl+K to edit code selections with AI assistance.
AI Chat and File Context: Press Ctrl+L to ask questions, with the option to attach files for contextual answers.
Intelligent File Indexing: AI can reference your entire codebase for more relevant suggestions.
Advanced Search: Use AI to find and edit code across large projects.
Prompt Customization: View and edit the prompts used by the AI for fine-tuned results.
Multi-Model Support: Connect to any LLM, local or remote (Claude, GPT, Gemini, etc.).
Experimental Features: Fast code application, contextual awareness, and community-contributed plugins.
No Subscription Fees: 100% open source and community supported.
Use Cases:
Software development with AI code completion and refactoring
Machine learning model integration and experimentation
Open-source collaboration and plugin development
Why Use Ollama?
Ollama is a tool for running large language models locally on your machine, eliminating the need for cloud APIs and ensuring that your code and queries never leave your computer.
It’s especially useful for privacy-conscious developers or those who want to avoid API costs and latency.
Ollama’s Advantages:
Runs Locally: No data leaves your device.
Supports Many Models: Including Llama, CodeLlama, and specialized coding models.
Simple Setup: Easy installation on Windows, with a user-friendly interface.
Resource Efficient: Can run smaller models on CPU or leverage GPU for larger models.
Step-by-Step: Installing and Running Ollama on Windows
Prerequisites
Windows 10 or newer
Sufficient RAM and CPU/GPU resources: at least 16GB RAM and a modern CPU are recommended; a GPU with at least 24GB VRAM is ideal for larger models, but smaller models can run on CPU alone
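If you're not sure what your machine has, a quick PowerShell check of total memory helps you pick a model size. As a rough rule of thumb, 7B-parameter models want about 8GB of RAM and 13B models about 16GB:

```powershell
# Report total physical memory in GB to gauge which model sizes will fit
$mem = (Get-CimInstance Win32_ComputerSystem).TotalPhysicalMemory
"{0:N1} GB RAM" -f ($mem / 1GB)
```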
Installation Steps
Download Ollama for Windows
Visit the Ollama website and download the Windows installer (.exe file).
Run the Installer
Double-click the downloaded file and follow the prompts to install Ollama.
Launch Ollama
After installation, start Ollama. You should see its interface indicating it’s running on localhost:11434 by default.
Download a Coding Model
Open a command prompt or PowerShell.
To download and run a model (e.g., CodeLlama or CodeGemma), use:

```
ollama run codellama
```

or

```
ollama run codegemma
```
Ollama will download the model, verify it, and start serving it locally.
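Once the download completes, you can sanity-check the model with a one-off prompt straight from the terminal (the model name below assumes you pulled codellama):

```
ollama run codellama "Write a Python function that reverses a string."
```

Ollama prints the model's answer and exits, confirming the model is ready to serve requests.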
Verify Ollama is Running
By default, Ollama listens on port 11434. You can check this by visiting http://localhost:11434 in your browser (it should respond with "Ollama is running") or from the command line, for example:
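A couple of quick checks from PowerShell, assuming the default port:

```powershell
# The root endpoint should reply "Ollama is running"
Invoke-RestMethod -Uri "http://localhost:11434"

# /api/tags lists the models Ollama has downloaded locally
Invoke-RestMethod -Uri "http://localhost:11434/api/tags"
```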
Step-by-Step: Installing and Running Void on Windows
Current Status
As of now, Void does not provide official binary releases for Windows, but you can build it from source if you’re comfortable with development tools.
Building Void from Source
Install Prerequisites
Git: For cloning the repository.
Node.js and npm: Required for building VS Code-based projects.
Yarn: Sometimes required for dependency management.
Visual Studio Build Tools: For compiling native modules (if prompted).
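Before cloning, it's worth confirming these tools are on your PATH:

```
git --version
node --version
npm --version
```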
Clone the Void Repository
Open a terminal and run:

```
git clone https://github.com/void-editor/void.git
cd void
```
Install Dependencies
Run:

```
npm install
```

or

```
yarn install
```
Build Void
Execute:

```
npm run build
```

or

```
yarn build
```
Run Void
Start the editor:

```
npm run start
```

or

```
yarn start
```
The Void editor should launch, presenting a familiar VS Code-like interface.
Integrating Void with Ollama
Void is designed to work seamlessly with local LLMs via Ollama. Here’s how to connect them:
Ensure Ollama is Running
Verify that Ollama is running on localhost:11434 and that your desired model is active.
Open Void’s AI Integration Settings
In Void, navigate to the AI integration or model settings (often found in the sidebar or settings menu).
Add a Local Model
Select the option to add a local model, and choose “Ollama” or enter the endpoint as http://localhost:11434.
Select Your Model
Void should automatically detect the models available in Ollama. Select your preferred coding model (e.g., CodeLlama or CodeGemma).
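If no models appear, confirm what Ollama actually has installed; Void can only surface models that Ollama reports:

```
ollama list
```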
Test the Connection
Try using AI autocomplete (Tab), inline editing (Ctrl+K), or chat (Ctrl+L). If everything is set up correctly, the AI responses will come from your local Ollama instance.
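If the editor shows no response, take Void out of the equation and query Ollama's HTTP API directly. This sketch assumes PowerShell and the codellama model:

```powershell
# Send a single, non-streaming generation request to the local Ollama server
$body = @{
    model  = "codellama"
    prompt = "Write a hello-world program in Python."
    stream = $false
} | ConvertTo-Json

(Invoke-RestMethod -Uri "http://localhost:11434/api/generate" -Method Post -ContentType "application/json" -Body $body).response
```

If this returns code but Void stays silent, the problem is in the editor's model settings rather than in Ollama.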
Using Void AI Features with Local Models
Once integrated, you can leverage all of Void’s AI features with your local model:
Autocomplete: Press Tab for context-aware code suggestions.
Inline Editing: Select code, press Ctrl+K, and let the AI refactor or improve it.
AI Chat: Press Ctrl+L to ask questions about your codebase, attach files, or request code generation.
File Indexing: The AI can reference your entire project for more accurate suggestions.
Prompt Customization: Advanced users can tweak the prompts sent to the LLM for tailored results.
Performance Tips:
For best results, use a model optimized for code (e.g., CodeLlama, CodeGemma).
If you have limited hardware, choose quantized or smaller models to avoid slowdowns (see the example below).
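For example, a 4-bit quantized 7B variant is far lighter than the full-precision default. Exact tag names vary, so check the Ollama model library for what's currently published:

```
ollama pull codellama:7b-instruct-q4_0
```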
Void vs. Cursor
Void gives you more control, privacy, and flexibility. Cursor offers a polished, ready-to-use experience, but at a subscription cost and with privacy trade-offs.
Troubleshooting and Tips
Common Issues:
Ollama Not Detected: Ensure Ollama is running and accessible at localhost:11434 (a quick port check is shown after this list).
Model Not Downloading: Check your internet connection and disk space.
Performance Issues: Use smaller or quantized models if you lack a powerful GPU.
Build Errors (Void): Make sure all dependencies are installed and compatible with your system.
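A quick way to confirm the port is actually open, from PowerShell:

```powershell
# TcpTestSucceeded : True means the Ollama server is listening
Test-NetConnection -ComputerName localhost -Port 11434
```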
Tips:
Regularly update both Void and Ollama to benefit from new features and bug fixes.
Explore community forums and Discord channels for troubleshooting and tips.
Community, Customization, and Future Roadmap
Void is under active development, with a strong community focus. You can:
Join the Waitlist: For official binary releases as they become available.
Contribute to Development: Submit pull requests, suggest features, or build plugins.
Integrate New Models: Add support for additional LLMs or AI tools as they emerge.
Customize Prompts and Workflows: Tailor the AI’s behavior to your coding style and needs.
Conclusion
Void, paired with Ollama, delivers a powerful, privacy-first, and highly customizable AI coding experience on Windows. Whether you’re a developer looking to escape the limitations of commercial tools like Cursor or simply want to experiment with local LLMs, Void offers a robust and future-proof platform.
Key Benefits:
Full local control and privacy
No subscription fees
Familiar VS Code environment
Flexible AI model integration
Active open-source community
As AI coding tools continue to evolve, Void stands out as a leading open-source alternative—empowering developers to shape the future of AI-assisted programming on their own terms.