Commit 6981597

Merge branch 'main' of https://github.com/abetlen/llama-cpp-python into main
2 parents: d5dbb3f + 84380fe

2 files changed (+2, -1 lines)

README.md (+1)
@@ -14,6 +14,7 @@ This package provides:
 - High-level Python API for text completion
 - OpenAI-like API
 - [LangChain compatibility](https://python.langchain.com/docs/integrations/llms/llamacpp)
+- [LlamaIndex compatibility](https://docs.llamaindex.ai/en/stable/examples/llm/llama_2_llama_cpp.html)
 - OpenAI compatible web server
 - [Local Copilot replacement](https://llama-cpp-python.readthedocs.io/en/latest/server/#code-completion)
 - [Function Calling support](https://llama-cpp-python.readthedocs.io/en/latest/server/#function-calling)

llama_cpp/server/settings.py (+1, -1)
@@ -60,7 +60,7 @@ class ModelSettings(BaseSettings):
     seed: int = Field(
         default=llama_cpp.LLAMA_DEFAULT_SEED, description="Random seed. -1 for random."
     )
-    n_ctx: int = Field(default=2048, ge=1, description="The context size.")
+    n_ctx: int = Field(default=2048, ge=0, description="The context size.")
     n_batch: int = Field(
         default=512, ge=1, description="The batch size to use per eval."
     )
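
For context, a minimal sketch of what the changed Pydantic constraint does. ExampleSettings below is an illustrative stand-in, not the repo's ModelSettings, and the note that llama.cpp can treat n_ctx=0 as "use the model's trained context size" is an assumption about the downstream behavior:

# Illustrative sketch (not part of the repo): relaxing ge=1 to ge=0 lets
# callers pass n_ctx=0, while negative values are still rejected by Pydantic.
from pydantic import BaseModel, Field, ValidationError

class ExampleSettings(BaseModel):
    n_ctx: int = Field(default=2048, ge=0, description="The context size.")

print(ExampleSettings().n_ctx)          # 2048 (default)
print(ExampleSettings(n_ctx=0).n_ctx)   # 0 is accepted once the bound is ge=0

try:
    ExampleSettings(n_ctx=-1)           # -1 violates ge=0 and raises ValidationError
except ValidationError as err:
    print(err.errors()[0]["loc"])       # ('n_ctx',)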
