Code Llama for VSCode

An API that mocks llama.cpp to enable support for Code Llama with the Continue Visual Studio Code extension: a local LLM alternative to GitHub Copilot.

As of the time of writing and to my knowledge, this is the only way to use Code Llama with VSCode locally without signing up for a service or obtaining an API key. The one alternative is Continue with Ollama, but Ollama does not support Windows or Linux. Code Llama for VSCode, on the other hand, is completely cross-platform and will run wherever Meta's own codellama code runs.
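To give a sense of what the mock looks like, here is a minimal, self-contained sketch (not the repository's actual llamacpp_mock_api.py). It assumes the llama.cpp-server-style /completion route and "content" response field that Continue's llama.cpp provider expects; the real script plugs Meta's codellama generator in where the placeholder function sits.

    # Minimal sketch of a llama.cpp-style mock endpoint (assumptions noted above).
    # The real llamacpp_mock_api.py wraps Meta's codellama generator; the model
    # call is stubbed out here so only the API surface is shown.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def run_codellama(prompt: str) -> str:
        # Placeholder: in the real script this would call into Meta's
        # codellama generator instead of returning a canned string.
        return "// completion for: " + prompt[:40]

    @app.route("/completion", methods=["POST"])
    def completion():
        # The provider POSTs a JSON body containing the prompt.
        body = request.get_json(force=True)
        prompt = body.get("prompt", "")
        # llama.cpp's server returns the generated text under "content".
        return jsonify({"content": run_codellama(prompt)})

    if __name__ == "__main__":
        app.run(host="127.0.0.1", port=8080)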

Now let's get started!

Setup

Prerequisites:

  • Continue installed and working in VSCode.
  • Code Llama downloaded and working by following the instructions in Meta's codellama repository.

After you are able to use both independently, we will glue them together with Code Llama for VSCode.

Steps:

  1. Move llamacpp_mock_api.py to your codellama folder and install Flask to your environment with pip install flask.
  2. Run llamacpp_mock_api.py with your Code Llama Instruct torchrun command. For example:
     torchrun --nproc_per_node 1 llamacpp_mock_api.py \
         --ckpt_dir CodeLlama-7b-Instruct/ \
         --tokenizer_path CodeLlama-7b-Instruct/tokenizer.model \
         --max_seq_len 512 --max_batch_size 4
  3. Click the settings button at the bottom right of Continue's UI in VSCode and edit config.json so it looks like this. Replace MODEL_NAME with codellama-7b. Once the server is running, you can optionally sanity-check it with the snippet below before reloading VSCode.
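The snippet below is a hypothetical smoke test for the running mock server: it assumes the llama.cpp-style /completion route and port 8080 used in the sketch earlier, so check llamacpp_mock_api.py for the actual route and port, and it needs the requests package (pip install requests).

    # Hypothetical smoke test for the mock API; adjust the URL to match the
    # route and port that llamacpp_mock_api.py actually serves.
    import requests

    resp = requests.post(
        "http://127.0.0.1:8080/completion",
        json={"prompt": "def fibonacci(n):"},
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json().get("content", ""))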

Restart VSCode or reload the Continue extension and you should now be able to use Code Llama for VSCode!
