Explicitly define models #47

@ghostdevv

Description

I'd like to be able to use Olla with Cloudflare Workers AI, but for one reason or another its OpenAI compatibility doesn't include /v1/models. Something I think traceloop/hub does well is that it allows you to define providers and models more explicitly.

As I understand it, their config would allow something like this:

providers:
    - key: cloudflare
      type: openai
      api_key: ...
      base_url: ...

models:
    - key: llama-test
      type: "@cf/meta/llama-3.2-1b-instruct"
      provider: cloudflare

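For comparison, here is a purely hypothetical sketch of how the same idea could surface in Olla's configuration. None of these key names come from Olla's actual schema; they are assumptions that mirror the hub-style layout above, with models declared explicitly so no /v1/models discovery call is needed:

# Hypothetical sketch only; key names are assumptions, not Olla's real config schema.
endpoints:
    - name: cloudflare
      type: openai                 # treat the backend as OpenAI-compatible
      base_url: ...
      api_key: ...
      models:                      # declared explicitly instead of discovered via /v1/models
        - "@cf/meta/llama-3.2-1b-instruct"

A request through the proxy could then target the declared model name directly, with Olla routing it to the cloudflare endpoint because that model is listed there.
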
Labels

enhancement (New feature or request), roadmap-feature (This issue will be implemented in the roadmap of features for Olla.)
