
Add ability to use local llm #28

@camarokris

Description

Some software, such as ollama, can emulate the OpenAI API. If the endpoint URL could be set to a localhost or local-network host where something like ollama is running, users could use free alternatives to OpenAI and choose whichever model they prefer.

As you can see here, ollama's OpenAI API emulation supports the same chat-completion endpoint you use in your code.
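As a minimal sketch of what this would take: make the base URL configurable instead of hard-coding api.openai.com, so the same chat-completion request can be routed to a local ollama server. The default URL below is ollama's usual OpenAI-compatible endpoint, and the model name `llama3` is only an illustrative assumption, not something from this project's code.

```python
import json
import urllib.request

def build_chat_request(prompt: str,
                       base_url: str = "http://localhost:11434/v1",
                       model: str = "llama3") -> urllib.request.Request:
    """Build the same /v1/chat/completions request, but against a
    configurable base URL so a local ollama server can stand in for
    api.openai.com (hypothetical sketch, not the project's actual code)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            # ollama accepts any bearer token; the real OpenAI API needs a key
            "Authorization": "Bearer ollama",
        },
        method="POST",
    )

# Actually sending the request is left out so the sketch does not
# require a running server, e.g.:
#   urllib.request.urlopen(build_chat_request("Hello"))
```

The only change user-facing code would need is a setting (or environment variable) for `base_url`; everything else about the chat-completion payload stays identical.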

Metadata

Assignees

No one assigned

    Labels

    No labels

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests