Documentation, LR scheduler step on iteration, new LR schedulers


@lRomul released this 24 May 18:37

New Features

  • Documentation is now available at https://pytorch-argus.readthedocs.io
  • Add a step_on_iteration option for LR schedulers, so a scheduler can step on every training iteration instead of once per epoch:
    from argus.callbacks import CosineAnnealingLR
    
    # Anneal the learning rate over 10000 iterations (T_max=10000).
    CosineAnnealingLR(10000, step_on_iteration=True)
  • New LR schedulers (see the sketch after this list):
    • argus.callbacks.lr_schedulers.MultiplicativeLR: multiplies the learning rate by the factor returned by a user-specified function.
    • argus.callbacks.lr_schedulers.OneCycleLR: the One Cycle learning rate policy.
  • LR schedulers now step on epoch complete instead of epoch start.
  • Metric scores are now computed under torch.no_grad().
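
A minimal sketch of the new schedulers, assuming the argus callbacks mirror the torch.optim.lr_scheduler constructors with the optimizer argument dropped (as CosineAnnealingLR does above); the lambda factor, max_lr, and total_steps values below are purely illustrative:

    from argus.callbacks import MultiplicativeLR, OneCycleLR

    # Multiply the learning rate by 0.95 on each scheduler step.
    multiplicative_lr = MultiplicativeLR(lambda epoch: 0.95)

    # One Cycle policy: ramp the learning rate up to max_lr,
    # then anneal it back down over total_steps steps.
    one_cycle_lr = OneCycleLR(max_lr=0.01, total_steps=10000)

Like other argus callbacks, these would be passed to the model's fit method through its callbacks argument.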

Fixes

  • Fix LR logging when the optimizer has several parameter groups.
  • Fix a KeyError in the metric redefinition warning.

Breaking Changes

  • PyTorch requirement raised to torch>=1.1.0.