## Installation

---

Install from PyPI (requires a C compiler):
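A minimal sketch of the install, including an optional rebuild against a BLAS backend via `CMAKE_ARGS` (the specific flag names below are assumptions taken from upstream `llama.cpp` build options; check the install docs for your backend):

```shell
# Plain install: builds llama.cpp from source with default options
pip install llama-cpp-python

# Rebuild with a specific BLAS backend, e.g. OpenBLAS (flag names assumed
# from upstream llama.cpp CMake options; adjust for your backend)
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" \
  pip install --force-reinstall --no-cache-dir llama-cpp-python
```

The `--force-reinstall --no-cache-dir` flags ensure pip rebuilds the package rather than reusing a previously compiled wheel.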
Detailed MacOS Metal GPU install documentation is available at [docs/install/macos.md](https://llama-cpp-python.readthedocs.io/en/latest/install/macos/)

[Docker on termux (requires root)](https://gist.github.com/FreddieOliveira/efe850df7ff3951cb62d74bd770dce27) is currently the only known way to run this on phones; see the [termux support issue](https://github.com/abetlen/llama-cpp-python/issues/389).
Check out the [examples folder](examples/low_level_api) for more examples of using the low-level API.

## Documentation

---

Documentation is available via [https://llama-cpp-python.readthedocs.io/](https://llama-cpp-python.readthedocs.io/).
If you find any issues with the documentation, please open an issue or submit a PR.

## Development

---

This package is under active development and I welcome any contributions.
make clean
```
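A typical from-source development setup might look like the following sketch; the `--recurse-submodules` clone (for the vendored `llama.cpp` sources) is an assumption beyond the `pip install -e .[all]` step shown above:

```shell
# Clone including the vendored llama.cpp submodule (assumed layout)
git clone --recurse-submodules https://github.com/abetlen/llama-cpp-python.git
cd llama-cpp-python

# Editable install with all optional extras, as shown in the docs above
pip install -e .[all]
```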
## FAQ

---

### Are there pre-built binaries / binary wheels available?

The recommended installation method is to install from source, as described above.
The reason for this is that `llama.cpp` is built with compiler optimizations specific to your system.
Using pre-built binaries would require disabling these optimizations or supporting a large number of pre-built binaries for each platform.

That being said, there are some pre-built binaries available through the Releases, as well as some community-provided wheels.

In the future, I would like to provide pre-built binaries and wheels for common platforms, and I'm happy to accept any useful contributions in this area.
This is currently being tracked in #741.

### How does this compare to other Python bindings of `llama.cpp`?

I originally wrote this package for my own use with two goals in mind:
Any contributions and changes to this package will be made with these goals in mind.

## License

---

This project is licensed under the terms of the MIT license.