
Conversation

@maickrau

@maickrau maickrau commented Jun 3, 2024

Adds a very simple CPU-only mode for inference on servers without GPUs. Modifies the -d parameter to accept the value cpu, which uses a CPU instead of a GPU device. Multiple threads are requested by specifying the number of CPUs in unary, e.g. -d cpu,cpu,cpu,cpu uses four threads. I suggest -t 1 and a low batch size.
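The unary device list described above is just a comma-joined repetition of cpu. A minimal sketch of building that argument value programmatically (the helper name here is my own, not part of the tool):

```python
def cpu_device_arg(n_threads: int) -> str:
    """Build the -d value for CPU-only inference: one 'cpu' entry per thread."""
    if n_threads < 1:
        raise ValueError("need at least one thread")
    return ",".join(["cpu"] * n_threads)

# Four threads -> the example from the PR description.
print(cpu_device_arg(4))  # cpu,cpu,cpu,cpu
```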

This is much slower than GPU inference. Using 32 threads (with -t 1 -b 4) on an 18x-coverage whole-human-genome ultralong R10.4.1 dataset took 3 days 7 hours with a peak of 116 GB RAM.

@xingjianfeng100

Thank you for the modified mode! It needs to be re-compiled, right? Could you provide a Singularity image version of the modified mode? The environment needed for compiling is too hard to create o(╥﹏╥)o.

@1Wencai mentioned this pull request Dec 2, 2024
@xiekunwhy

Has the CPU-only version been released?

