PureCPP is the C++ backend powering the core logic of the RAG (Retrieval-Augmented Generation) system. It provides high-performance native modules that integrate seamlessly with Python via bindings.
We welcome contributions to PureCPP!
Before submitting a pull request or issue, please read our Contribution Guide.
```
.
├── scripts/          # Shell utilities and setup scripts
├── package/          # Python package
│   └── purecpp/      # Contains the compiled .so
├── build/            # Generated build files
├── libs/             # Third-party dependencies
├── CMakeLists.txt    # Main build config
├── Dockerfile        # Build environment
└── README.md
```
For full installation and setup instructions, visit our official documentation:
To install the package via pip (for end-users):

```bash
pip install purecpp
```

You can either build locally or use our Docker environment to ensure consistency.
To simplify setup and avoid installing system-wide dependencies, use the provided Dockerfile.
```bash
docker build -t purecpp .
docker run -it --rm purecpp bash
./build
```

This will generate the shared object (`RagPUREAI.cpython-<python-version>*.so`) in the `build/Release/` directory.
To test the Python bindings, copy the .so file to your test script directory:
```bash
cp build/Release/RagPUREAI*.so /some-test-folder
```

You may also build the project manually without Docker, if your environment satisfies the requirements.
- Python ≥ 3.8
- CMake ≥ 3.22
- Conan ≥ 2.0
- Rust
- GCC/G++ = 13
- Protobuf Compiler
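Before running the build, you can sanity-check the toolchain from Python. This is an illustrative snippet, not part of the repository's scripts; the helper names (`version_tuple`, `check_tool`) and the `REQUIRED_TOOLS` table are hypothetical, with minimum versions mirroring the list above.

```python
import shutil
import subprocess
import sys

# Minimum versions taken from the requirements list above (illustrative table).
REQUIRED_TOOLS = {"cmake": (3, 22), "conan": (2, 0)}

def version_tuple(text):
    """Extract the first dotted version number from output like 'cmake version 3.22.1'."""
    for token in text.replace(",", " ").split():
        parts = token.split(".")
        if len(parts) >= 2 and all(p.isdigit() for p in parts):
            return tuple(int(p) for p in parts)
    return ()

def check_tool(name, minimum):
    """Return True if `name` is on PATH and `name --version` reports at least `minimum`."""
    exe = shutil.which(name)
    if exe is None:
        return False
    out = subprocess.run([exe, "--version"], capture_output=True, text=True).stdout
    return version_tuple(out) >= minimum

# Python itself must be >= 3.8.
assert sys.version_info >= (3, 8), "Python >= 3.8 is required"
```

Usage: `check_tool("cmake", REQUIRED_TOOLS["cmake"])` returns `False` either when the tool is missing or when it is too old, so it can gate the setup scripts below.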
```bash
# Make the helper scripts executable
chmod +x scripts/install_python_dependencies.sh
chmod +x scripts/install_torch.sh
chmod +x scripts/install_libs.sh
chmod +x scripts/configure_conan_profile.sh
chmod +x build

# Install dependencies
./scripts/install_python_dependencies.sh
./scripts/install_torch.sh
./scripts/install_libs.sh
./scripts/configure_conan_profile.sh

# Build the project
./build
```

The output `.so` file will be located in `build/Release/`.
To test the Python bindings:
```python
from RagPUREAI import SomeExposedFunction
```

Ensure `RagPUREAI*.so` is placed in the same folder as your Python project.
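If you prefer not to copy the `.so` next to every script, an alternative is to load the extension from an explicit path. This is a generic sketch using Python's standard `importlib` machinery, not a helper shipped with PureCPP; the path in the example is illustrative.

```python
import importlib.util
from pathlib import Path

def load_module_from_path(path):
    """Load a Python module (pure .py or compiled .so extension) from a file path.

    importlib picks the right loader from the file suffix, so the same helper
    works for RagPUREAI.cpython-<version>*.so and for plain .py files.
    """
    path = Path(path)
    # 'RagPUREAI.cpython-311-x86_64-linux-gnu.so' -> module name 'RagPUREAI'
    name = path.name.split(".")[0]
    spec = importlib.util.spec_from_file_location(name, path)
    if spec is None or spec.loader is None:
        raise ImportError(f"cannot load a module from {path}")
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

# Example (path is illustrative):
# rag = load_module_from_path("build/Release/RagPUREAI.cpython-311-x86_64-linux-gnu.so")
# rag.SomeExposedFunction(...)
```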
To build and upload the Python package:
```bash
./scripts/create_pip_package <your-pypi-api-key>
```

This script will:

- Copy the `.so` file to the appropriate location.
- Package the Python module using `setuptools`.
- Upload to PyPI using `twine`.
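For orientation, the packaging step can be sketched as a minimal `setup()` call. This is a hypothetical illustration of how a prebuilt extension might be bundled; the version is a placeholder and the authoritative logic lives in `scripts/create_pip_package`.

```python
from setuptools import setup

# Hypothetical, minimal packaging sketch -- version and metadata are placeholders;
# the real packaging logic lives in scripts/create_pip_package.
setup(
    name="purecpp",
    version="0.0.0",                      # placeholder
    packages=["purecpp"],
    package_dir={"purecpp": "package/purecpp"},
    package_data={"purecpp": ["*.so"]},   # bundle the compiled RagPUREAI extension
    zip_safe=False,                       # compiled extensions cannot run from a zip
    script_args=["--version"],            # display-only command so the sketch runs standalone
)
```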
You can convert HuggingFace models to ONNX using:
```bash
python3 scripts/hf_model_to_onnx.py -m="dbmdz/bert-large-cased-finetuned-conll03-english" -o="bert-large-cased-finetuned-conll03-english"
python3 scripts/hf_model_to_onnx.py -m="sentence-transformers/all-MiniLM-L6-v2" -o="sentence-transformers/all-MiniLM-L6-v2"
```

Stay tuned for updates and new model integrations! 🚀
