
Local Model Support #124

Open

@MrMuhannadObeidat

Description
This is a great project and effort, thanks for sharing.
I tried the online version against a couple of GitHub repos I am familiar with, and it works like a charm.

Running locally, however, is more challenging. First, I had to add a custom method to support Ollama-based interaction, since it does not have an OpenAI interface. Then I struggled with the choice of model, since only smaller models are practical when running locally.
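In case it helps others, the custom method I added looks roughly like this — a minimal sketch against Ollama's documented REST endpoint (POST /api/chat), using only the standard library. The URL, model name, and function names here are my own choices, not part of this project:

```python
import json
import urllib.request

# Default Ollama server address; adjust if running on a remote box.
OLLAMA_URL = "http://localhost:11434/api/chat"


def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming chat request body for Ollama's /api/chat."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # one JSON object back instead of a stream
    }


def ask_ollama(model: str, prompt: str) -> str:
    """Send a single prompt to a local Ollama server and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]


# Usage (requires a running Ollama server with the model pulled):
# print(ask_ollama("llama3.1:latest", "Summarize this repository."))
```

As an aside, if I understand the Ollama docs correctly, recent versions also expose an OpenAI-compatible endpoint at /v1/chat/completions, so pointing the project's existing OpenAI client at localhost might be an alternative to a custom method.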

Do you have any recommendations on smaller models to use through ollama locally on laptop/local server?

I tried the three models below. The only one whose responses the rest of the code was able to parse is llama3.1.
qwen2.5:latest 845dbda0ea48 4.7 GB 6 weeks ago
gemma3:latest a2af6cc3eb7f 3.3 GB 8 weeks ago
llama3.1:latest 62757c860e01 4.7 GB 9 months ago

The output was pretty weird, though. While it listed file names correctly, most of the text produced was laced with hallucinations.
