Tags: ngxson/wllama
2.3.7
sync upstream llama.cpp (b7179) (#194)
* sync upstream llama.cpp
* v2.3.7
2.3.6
sync with llama.cpp upstream (#192)
* sync with llama.cpp upstream
* update package-lock
2.3.5
Sync with latest upstream, fix useCache, add getLibllamaVersion() (#189)
* fix problem with useCache
Co-authored-by: khromov <[email protected]>
* bump to latest upstream llama.cpp
* add api for getting libllama version number (usage sketch after this entry)
* correct doc
* fix CI
* v2.3.5
* fix submodule
---------
Co-authored-by: khromov <[email protected]>
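The release note above confirms the method name getLibllamaVersion() but nothing about its signature. A minimal usage sketch, assuming the usual Wllama constructor that takes a map of wasm asset paths; the paths and the return shape below are assumptions, not part of the release note:

```ts
import { Wllama } from '@wllama/wllama';

// Placeholder asset paths: point these at wherever your app serves
// wllama's wasm builds (this mapping is an assumption).
const CONFIG_PATHS = {
  'single-thread/wllama.wasm': '/assets/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm': '/assets/multi-thread/wllama.wasm',
};

const wllama = new Wllama(CONFIG_PATHS);

// getLibllamaVersion() is the API added in this release; treating its
// result as a printable version identifier is an assumption.
console.log('libllama version:', await wllama.getLibllamaVersion());
```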
2.3.4
if KV rm fails, we should clear the whole cache (#188)
* if KV rm fails, we should clear the whole cache (sketch of the strategy below)
* bump version
* rm redundant fields
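A minimal sketch of the fallback strategy this release describes, with hypothetical callbacks standing in for the real KV-cache operations (none of these names come from the codebase):

```ts
// Try to remove a token range from the KV cache; if that partial removal
// fails, clear the entire cache so the next decode starts from a known
// consistent state instead of a partially-trimmed one.
function trimKvCache(
  removeRange: (begin: number, end: number) => boolean, // hypothetical
  clearAll: () => void,                                  // hypothetical
  begin: number,
  end: number,
): void {
  if (!removeRange(begin, end)) {
    // Partial removal can fail; dropping everything is the safe fallback.
    clearAll();
  }
}
```

Clearing everything costs a full re-decode of the prompt, but it guarantees the cache never sits in a half-trimmed, inconsistent state.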
2.3.3
sync with latest upstream llama.cpp (#187)
* sync with latest upstream llama.cpp
* v2.3.3
2.3.2
2.3.1
sync with upstream llama.cpp source code (#171)
2.3.0
2.2.1
2.2.0
Fix a bug with kv_remove, release v2.2.0 (#157)
* fix a bug with kv remove
* v2.2.0