Conversation

@jfolz (Contributor) commented Dec 20, 2023

Currently, enroot calls unsquashfs such that it uses all installed processors, which is not always desirable. For example, a non-exclusive Slurm job restricts the number of usable processors (2 by default on our cluster). I timed unsquashfs under these conditions, with and without explicitly setting the number of processors to use:

$ time unsquashfs -p 2 -user-xattrs -d /run/enroot-data/user-$UID/container /netscratch/enroot/nvcr.io_nvidia_pytorch_23.12-py3.sqsh
Parallel unsquashfs: Using 2 processors
...
real    0m37.332s
user    0m40.923s
sys     0m28.742s
$ rm -rf /run/enroot-data/user-$UID/container
$ time unsquashfs -user-xattrs -d /run/enroot-data/user-$UID/container /netscratch/enroot/nvcr.io_nvidia_pytorch_23.12-py3.sqsh
Parallel unsquashfs: Using 40 processors
...
real    0m41.553s
user    0m38.024s
sys     0m28.396s

On this older machine, setting an appropriate number of processors reduces the runtime by about 10%.

This pull request applies ENROOT_MAX_PROCESSORS to unsquashfs, which allows controlling the number of processors it uses. When the variable is unset, the current behavior is maintained.
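
For illustration, a minimal sketch of how the flag could be wired in (hypothetical shell, not the actual enroot source; rootfs and image are placeholder variables):

    # Sketch only: pass -p when ENROOT_MAX_PROCESSORS is set; otherwise
    # keep unsquashfs's default of using all installed processors.
    if [ -n "${ENROOT_MAX_PROCESSORS}" ]; then
        unsquashfs -p "${ENROOT_MAX_PROCESSORS}" -user-xattrs -d "${rootfs}" "${image}"
    else
        unsquashfs -user-xattrs -d "${rootfs}" "${image}"
    fi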

Signed-off-by: Joachim Folz <[email protected]>
@3XX0 (Member) commented Jan 10, 2024

There already is ENROOT_MAX_PROCESSORS, why introduce another configuration parameter?

@jfolz jfolz changed the title Add ENROOT_NUM_THREADS to control how many threads unsquashfs uses Apply ENROOT_MAX_PROCESSORS to unsquashfs Jan 10, 2024
@jfolz (Contributor, Author) commented Jan 10, 2024

@3XX0 I have a very good explanation: I did not see it ;)
The PR now applies ENROOT_MAX_PROCESSORS to unsquashfs instead.
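
As a usage sketch (illustrative, not part of this PR): inside a Slurm job, the limit can be matched to the allocated CPUs before creating a container, since Slurm exports SLURM_CPUS_ON_NODE:

    $ export ENROOT_MAX_PROCESSORS="${SLURM_CPUS_ON_NODE:-$(nproc)}"
    $ enroot create --name container /netscratch/enroot/nvcr.io_nvidia_pytorch_23.12-py3.sqsh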

@3XX0 3XX0 merged commit eb4300b into NVIDIA:master Jan 12, 2024
@3XX0 (Member) commented Jan 12, 2024

Merged, thanks!

@jfolz jfolz deleted the enroot_num_threads branch January 12, 2024 12:54