Based on a tutorial by Code with Nathan
Are you frustrated with Cursor’s limitations in the free version but not ready to commit to the $20 monthly subscription? You’re not alone. Many developers love Cursor’s AI-powered features but find the price point difficult to justify, especially for hobby projects.
In this article, I’ll show you how to create your own free, open-source alternative to Cursor using VS Code and a couple of powerful extensions. This setup gives you even more control over your data while keeping costs at zero.
Quick Navigation
- Understanding Cursor and VS Code (00:00-01:30)
- Required Tools Overview (01:31-03:00)
- Installing VS Code and Ollama (03:01-05:15)
- Setting Up Continue.dev and Cline Extensions (05:16-08:30)
- Configuring Local AI Models (08:31-11:45)
- Testing the Setup with Real Examples (11:46-15:00)
- Using Open Router for Free AI Models (15:01-17:30)
Understanding Cursor and VS Code (00:00-01:30)
Before diving into the setup, it’s important to understand what Cursor is and why VS Code makes an excellent alternative. Cursor is essentially a premium code editor with AI features that allow you to generate files, write code, refactor existing code, and debug errors.
However, Cursor itself is actually a fork of Visual Studio Code (VS Code), which is Microsoft’s powerful open-source code editor beloved by professional developers worldwide.
Key Points:
- Cursor is a fork of VS Code with added AI features
- Cursor’s free version has limitations, and the Pro version costs $20/month
- VS Code is open-source, highly extensible, and completely free
- Using VS Code with specific extensions provides similar functionality to Cursor
My Take:
While Cursor doesn’t violate VS Code’s license agreement, it’s worth noting that you can recreate much of its functionality without the monthly subscription cost. For hobbyists or developers working on personal projects, this free alternative is particularly valuable.
Required Tools Overview (01:31-03:00)
To turn VS Code into a powerful Cursor alternative, you’ll need just a few key components: VS Code itself, Ollama for running local AI models, and two essential extensions – Continue.dev and Cline.
This combination provides full control over your data, allowing you to use free and local models while keeping costs at zero. Plus, you won’t have to worry about enabling privacy mode to prevent third parties from using your code to train their AI models.
Key Points:
- VS Code – The base code editor
- Ollama – For running AI models locally on your machine
- Continue.dev extension – For code completion and editing features
- Cline extension – For multi-file editing and advanced AI reasoning
My Take:
This setup actually provides better privacy than Cursor since all processing can happen locally. For developers concerned about their intellectual property or working on sensitive projects, this alone makes the setup worth considering.
Installing VS Code and Ollama (03:01-05:15)
Let’s start with the basic installations needed for our Cursor alternative. First, you’ll need to download and install Visual Studio Code, followed by Ollama for local AI model access.
Key Points:
- Download VS Code from code.visualstudio.com
- Install Ollama from ollama.com
- Ollama allows you to download and run state-of-the-art open-source models locally
- Local models require sufficient storage space on your computer
- If you have limited memory or slow internet, Open Router API can be used as an alternative
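Once both are installed, a quick terminal check confirms that Ollama is working before you add any extensions (a minimal sketch; the small model shown is only an example and can be removed afterwards):
# Confirm the Ollama CLI is installed and the background service is running
ollama --version
ollama list
# Optionally pull and chat with a small model as a smoke test (type /bye to exit)
ollama run qwen2.5-coder:1.5b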
My Take:
Running AI models locally does require some system resources, so if you’re working on an older machine with limited RAM or storage, you might want to consider the Open Router option mentioned later in this guide.
Setting Up Continue.dev and Cline Extensions (05:16-08:30)
With VS Code and Ollama installed, it’s time to add the extensions that will transform your code editor into a Cursor alternative. These two extensions work together to provide a complete AI-assisted coding experience.
Key Points:
- Open VS Code and click on the Extensions tab on the left sidebar
- Search for “Continue.dev” and install it
- Search for “Cline” and install it as well
- Continue.dev enables auto code completion and multi-line edits
- Cline lets the AI reason through a task and make coordinated edits across multiple files
# Commands to download the recommended AI models with Ollama
ollama pull deepseek-r1:7b
ollama pull qwen2.5-coder:1.5b
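If you prefer the terminal, VS Code’s code command can install both extensions directly. The marketplace IDs below are the ones the two extensions are currently published under (Cline still uses its former “Claude Dev” identifier), so verify them in the Extensions tab if the install fails:
# Install Continue.dev and Cline from the command line instead of the Extensions tab
code --install-extension Continue.continue
code --install-extension saoudrizwan.claude-dev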
My Take:
The combination of these two extensions creates a powerful synergy. Continue.dev handles most of the code-level interactions, while Cline provides the multi-file, project-level capabilities that make Cursor so useful.
Configuring Local AI Models (08:31-11:45)
After installing the extensions, we need to configure them to use our local AI models through Ollama. The tutorial recommends using DeepSeek R1 for chat mode and Qwen 2.5 Coder for code completion.
Key Points:
- Open the Continue extension and click the cube icon to configure models
- For chat mode, use DeepSeek R1 with 7 billion parameters
- For code completion, use Qwen 2.5 Coder with 1.5 billion parameters
- Download each model from the terminal with ollama pull <model-name> (the exact commands are shown in the previous section)
- Configure Continue.dev to use the downloaded models by specifying name, provider, and model
- Set the appropriate role for each model in the configuration
# Example Continue.dev configuration (the newer config.yaml format; older installs use config.json)
models:
  - name: DeepSeek R1 7B
    provider: ollama
    model: deepseek-r1:7b
    roles:
      - chat
  - name: Qwen 2.5 Coder 1.5B
    provider: ollama
    model: qwen2.5-coder:1.5b
    roles:
      - autocomplete
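Before relying on these models inside the editor, it’s worth confirming from the terminal that both are downloaded and respond (a quick sanity check using standard Ollama commands; the prompts are arbitrary examples):
# Verify both models are present, then send each a one-shot prompt
ollama list
ollama run deepseek-r1:7b "Explain what a Python decorator does in one sentence."
ollama run qwen2.5-coder:1.5b "Write a Python function that reverses a string."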
My Take:
Using smaller models for code completion is a smart approach. The tutorial mentions that larger models don’t significantly improve autocompletion quality, so the 1.5B-parameter model is more than sufficient while being much lighter on system resources.
Testing the Setup with Real Examples (11:46-15:00)
Now that everything is set up, let’s test our new Cursor alternative with some real coding tasks. The tutorial demonstrates both basic code completion and more complex project generation.
Key Points:
- Test code completion by creating a simple Python function
- Press Tab to accept code suggestions
- Test Cline’s ability to create a complete project (e.g., a Flappy Bird game)
- Cline can create the necessary files, write the code, and even install dependencies
- Continue.dev provides keyboard shortcuts for highlighting code and getting suggestions
- Use Control+I (or Command+I on Mac) to edit highlighted code inline, Continue.dev’s equivalent of Cursor’s inline edit shortcut
My Take:
The Flappy Bird example demonstrates the real power of this setup. Being able to generate an entire working game from a single prompt is impressive and shows that this free alternative can truly compete with Cursor’s paid features.
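If you reproduce the Flappy Bird test yourself, the generated project is typically a small Python script built on Pygame. Assuming a hypothetical entry file named main.py (the actual file names depend on what the AI generates in your run), launching it looks like this:
# Install the game's dependency and run the generated script (file name is hypothetical)
pip install pygame
python main.py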
Using Open Router for Free AI Models (15:01-17:30)
If you prefer not to download large AI models or your system has limited resources, Open Router provides an excellent alternative. It offers access to various AI models through a unified API, including several free options.
Key Points:
- Open Router is a platform providing access to multiple AI models
- Sign up for an account and create an API key
- Configure Cline to use Open Router as the provider
- Select a powerful free model such as DeepSeek R1 (671 billion parameters)
- Free models may have longer request times when many people are using them
// Values to enter in Cline's settings when using Open Router (shown as JSON for illustration; the configuration happens in the extension's settings panel)
{
  "provider": "openrouter",
  "apiKey": "your-api-key-here",
  "model": "deepseek/deepseek-r1:free"
}
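Before wiring the key into Cline, you can confirm it works with a direct request to Open Router’s OpenAI-compatible API (a minimal sketch; replace the placeholder key, and note that free model IDs can change over time):
# Quick test of an Open Router API key against the free DeepSeek R1 model
curl https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer your-api-key-here" \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek/deepseek-r1:free", "messages": [{"role": "user", "content": "Say hello in one short sentence."}]}'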
My Take:
Open Router is a practical middle ground: you skip the multi-gigabyte downloads and the load on your own hardware, at the cost of sending requests to a cloud service. The free tier might be a bit slower during peak usage times, but for many developers that trade-off is worth it.
This article summarizes the excellent tutorial created by Code with Nathan. If you found this summary helpful, please support the creator by watching the full video and subscribing to their channel.