Commit 8ddf63b

Remove reference to FORCE_CMAKE from docs
1 parent 109123c commit 8ddf63b

1 file changed (+6 -10 lines)

README.md

@@ -52,45 +52,41 @@ Otherwise, while installing it will build the llama.ccp x86 version which will b
To install with OpenBLAS, set the `LLAMA_BLAS and LLAMA_BLAS_VENDOR` environment variables before installing:

```bash
-CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" FORCE_CMAKE=1 pip install llama-cpp-python
+CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
```

To install with cuBLAS, set the `LLAMA_CUBLAS=1` environment variable before installing:

```bash
-CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 pip install llama-cpp-python
+CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
```

To install with CLBlast, set the `LLAMA_CLBLAST=1` environment variable before installing:

```bash
-CMAKE_ARGS="-DLLAMA_CLBLAST=on" FORCE_CMAKE=1 pip install llama-cpp-python
+CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
```

To install with Metal (MPS), set the `LLAMA_METAL=on` environment variable before installing:

```bash
-CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 pip install llama-cpp-python
+CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
```

To install with hipBLAS / ROCm support for AMD cards, set the `LLAMA_HIPBLAS=on` environment variable before installing:

```bash
-CMAKE_ARGS="-DLLAMA_HIPBLAS=on" FORCE_CMAKE=1 pip install llama-cpp-python
+CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
```

#### Windows remarks

-To set the variables `CMAKE_ARGS` and `FORCE_CMAKE` in PowerShell, follow the next steps (Example using, OpenBLAS):
+To set the variable `CMAKE_ARGS` in PowerShell, follow the next steps (example using OpenBLAS):

```ps
$env:CMAKE_ARGS = "-DLLAMA_OPENBLAS=on"
```

-```ps
-$env:FORCE_CMAKE = 1
-```
-
Then, call `pip` after setting the variable:
```
pip install llama-cpp-python
```
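
For reference, here is a minimal end-to-end PowerShell sketch combining the two steps shown in the diff (OpenBLAS flags taken from the example above; swap `CMAKE_ARGS` for whichever backend you want):

```ps
# Choose the backend by setting CMAKE_ARGS (OpenBLAS shown, as in the example above);
# FORCE_CMAKE is no longer required.
$env:CMAKE_ARGS = "-DLLAMA_OPENBLAS=on"

# Install llama-cpp-python; the build picks up CMAKE_ARGS from the environment.
pip install llama-cpp-python
```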
