In this article I will show you how to get a local AI coding assistant in Visual Studio Code, at no cost other than your own computer's power (running the AI locally uses your machine's processing power).
Firstly, to set up a local AI we need a free and open-source project called Ollama. But what is it? Ollama lets you run various open-source LLMs locally and exposes a local API so you can use those LLMs with ease. It has amassed a whopping 77k stars on GitHub as of writing this article, and it is easy to install and set up on all operating systems. I personally use WSL, so I used their simple one-liner command to install and set up Ollama on Linux.
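For reference, the Linux/WSL install one-liner looked like this at the time of writing (the script URL may change, so check Ollama's website for the current command):
curl -fsSL https://ollama.com/install.sh | sh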
Now that you've got Ollama, you need to choose which LLM will be your AI coding partner. At the time of writing this article I use CodeQwen, which you can get by first starting up Ollama with
ollama serve
and then downloading the LLM with
ollama pull codeqwen
However, stronger and more powerful LLMs are coming out all the time, so I recommend doing your own research to find the LLM most suitable for you; there are trade-offs between the storage space these LLMs require, how well they perform, and so on.
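Once the model has been pulled, you can do a quick sanity check against Ollama's local API (it listens on port 11434 by default) before wiring anything into your editor. A rough example using curl, with a placeholder prompt you can swap for anything:
curl http://localhost:11434/api/generate -d '{
  "model": "codeqwen",
  "prompt": "Write a function that reverses a string in Python.",
  "stream": false
}'
If you get a JSON response back with generated text, Ollama is serving the model correctly.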
How can we use our newly acquired AI within our code editor? With a nifty extension called Continue, which you can get directly from the VS Code marketplace. There are a lot of different things you can do with AI using this extension, such as tab-to-autocomplete code suggestions, asking questions about your codebase, and much more. Be sure to check out their documentation.
To let Continue use Ollama's AI capabilities, make sure you're running Ollama with the
ollama serve
command, then edit the JSON in the Continue extension settings like so. For example, if you're using CodeQwen:
{ "models": [ { "title": "Ollama CodeQwen", "provider": "ollama", "model": "codeqwen" } ] }
If you're stuck, please read the documentation.
Now you should be all set. Start coding to see autocompletion and check out their website to make use of all their awesome features.
I hope this was useful for you. Thank you for reading!