
Actions: NVIDIA-NeMo/RL

Automodel Integration and Submodule Checks

2,183 workflow runs


feat: Adding quantization aware training support with Model-Optimizer · #2183 · Pull request #1756 opened by mxinO · 5m 53s
feat: Adding quantization aware training support with Model-Optimizer · #2182 · Pull request #1749 synchronize by mxinO · 5m 15s
fix: split dtensorv1 vllm dependency · #2181 · Pull request #1638 synchronize by yuki-97 · 5m 56s
feat: refactor common data utilities of dtensor policy v2 · #2180 · Pull request #1710 synchronize by hemildesai · 5m 29s
feat: refactor init of dtensor policy v2 · #2179 · Pull request #1709 synchronize by hemildesai · 5m 30s
feat: refactor init of dtensor policy v2 · #2178 · Pull request #1709 synchronize by hemildesai · 5m 36s
build: Resolve CVEs for gnupg and aiohttp · #2177 · Pull request #1755 opened by chtruong814 · 5m 51s
feat: refactor init of dtensor policy v2 · #2176 · Pull request #1709 synchronize by hemildesai · 5m 40s
build: Resolve CVEs for gnupg and aiohttp for nano-v3 · #2175 · Pull request #1754 opened by chtruong814 · 5m 32s
feat: refactor init of dtensor policy v2 · #2174 · Pull request #1709 synchronize by hemildesai · 5m 55s
feat: Adding quantization aware training support with Model-Optimizer · #2173 · Pull request #1749 synchronize by mxinO · 5m 21s
feat: Adding quantization aware training support with Model-Optimizer · #2172 · Pull request #1749 synchronize by mxinO · 5m 30s
cp: fix: patch pytorch aten.alias.default shard strategy (1728) into r0.5.0 · #2171 · Pull request #1753 opened by chtruong814 · 6m 2s
feat: Support lora in dtensor grpo workflow[3/3]: async vllm · #2170 · Pull request #1752 synchronize by RayenTian · 5m 41s
feat: Support lora in dtensor grpo workflow[3/3]: async vllm · #2169 · Pull request #1752 synchronize by RayenTian · 6m 11s
feat: Support lora in dtensor grpo workflow[2/3]: sync and non-colocated setup · #2168 · Pull request #1751 synchronize by RayenTian · 6m 2s
feat: Support lora in dtensor grpo workflow[2/3]: sync and non-colocated setup · #2167 · Pull request #1751 synchronize by RayenTian · 5m 37s
feat: Support lora in dtensor grpo workflow[1/3]: sync and colocated setup · #2166 · Pull request #1748 synchronize by RayenTian · 5m 47s
refactor: split train and val dataset in response dataset · #2165 · Pull request #1649 synchronize by yuki-97 · 5m 33s
feat: Support lora in dtensor grpo workflow[3/3]: async vllm · #2164 · Pull request #1752 opened by RayenTian · 6m 10s
feat: Support lora in dtensor grpo workflow[2/3]: sync and non-colocated setup · #2163 · Pull request #1751 opened by RayenTian · 5m 39s
feat: Support lora in dtensor grpo workflow[1/3]: sync and colocated setup · #2162 · Pull request #1748 synchronize by RayenTian · 5m 55s
feat: Support lora in dtensor grpo workflow[1/3]: sync and colocated setup · #2161 · Pull request #1748 synchronize by RayenTian · 5m 51s
feat: Adding quantization aware training support with Model-Optimizer · #2160 · Pull request #1749 synchronize by mxinO · 5m 27s
feat: Adding quantization aware training support with Model-Optimizer · #2159 · Pull request #1749 synchronize by mxinO · 5m 14s