Commit ac55773
fix: route bedrock/ model ids via LiteLLM Bedrock adapter (#89)
_resolve_llm_params treated any model id without an anthropic/ or openai/
prefix as an HF router model, wrapping it as 'openai/<id>' pointed at
router.huggingface.co. A 'bedrock/...' id therefore fell through to the HF
router, which rejected it with 'model does not exist'. Add a dedicated
bedrock/ branch that passes the model id through as-is so LiteLLM's Converse
adapter picks it up with the standard AWS env creds.

1 parent 2fac9ff
1 file changed
Lines changed: 8 additions & 0 deletions
(The diff body was not captured in this extraction; only the line-number gutter survives. It shows a single hunk inserting 8 lines after original line 156, becoming new lines 157-164, consistent with the "8 additions & 0 deletions" summary above.)