How to run with Docker for local LLM and local codebase #140
Unanswered · mailonebox asked this question in Q&A
Replies: 1 comment 1 reply
-
Yeah, just implement https://github.com/The-Pocket/PocketFlow-Tutorial-Codebase-Knowledge/blob/main/utils/call_llm.py with your local Llama. Also see https://the-pocket.github.io/PocketFlow/utility_function/llm.html
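Since a local Ollama exposes a plain HTTP API, `call_llm` can be reimplemented without any API key. Here is a minimal sketch using only the standard library; the host `http://ollama:11434` (the Ollama container's name on the shared network) and the model name `llama3` are assumptions, so adjust both to your setup:

```python
import json
import os
import urllib.request

# Assumptions: "ollama" is the Ollama container's name on the shared
# Docker network, and "llama3" is a model you have already pulled.
OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://ollama:11434")
MODEL = os.environ.get("OLLAMA_MODEL", "llama3")

def build_payload(prompt: str) -> dict:
    # Ollama's /api/generate endpoint takes a model name and a prompt;
    # stream=False returns the whole completion as a single JSON object.
    return {"model": MODEL, "prompt": prompt, "stream": False}

def call_llm(prompt: str) -> str:
    data = json.dumps(build_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=300) as resp:
        # The generated text is returned in the "response" field.
        return json.loads(resp.read())["response"]
```

Setting `OLLAMA_HOST` via an environment variable keeps the same code working both inside the container (container-name DNS) and on the host (`http://localhost:11434`).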
-
Hi there,
First of all, thanks for this great tool!
Is it possible to run your project in a Docker container with a local LLM and a local codebase?
I have a local Ollama running in a Docker container on a custom network (ollama-net), and I'd like to generate a tutorial for a codebase that also resides in a local directory.
How can I set up the docker run command so your project runs in a container on the same ollama-net and generates the documentation? I do not have any API keys for the local LLM.
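One way to wire this up is a sketch along these lines; the image name `pocketflow-tutorial`, the Ollama container name `ollama`, the mount path, and the `--dir` flag are all assumptions to adapt to your local build:

```shell
# Hypothetical invocation: join the container to the existing ollama-net
# so it can reach Ollama by container name, and mount the local codebase.
docker run --rm \
  --network ollama-net \
  -e OLLAMA_HOST=http://ollama:11434 \
  -v /path/to/my-project:/app/codebase \
  pocketflow-tutorial \
  --dir /app/codebase
```

Because both containers share the user-defined `ollama-net` bridge, Docker's built-in DNS resolves the `ollama` container name, so no API key or host networking is needed.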