Conversation

@The-truthh (Collaborator) commented on May 22, 2023

Thank you for your contribution to the MindCV repo.
Before submitting this PR, please make sure:

Motivation

Add the OneCycleLR and CyclicLR schedulers.

Test Plan

It's tested in dynamic_lr.
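
As an illustration of what such a test might check, a minimal sketch is below; the import path, the "cos" strategy value, and the assertion bounds are assumptions, not the repo's actual test code.

# Hypothetical test sketch for the new schedule; the import path is an
# assumption based on MindCV's layout, not necessarily the PR's test file.
from mindcv.scheduler.dynamic_lr import one_cycle_lr

def test_one_cycle_lr_shape():
    lrs = one_cycle_lr(
        0.3, "cos", False, 0.00005, 1.0,
        lr=0.05, steps_per_epoch=10, epochs=5,
    )
    assert len(lrs) == 10 * 5      # one value per optimization step
    assert max(lrs) <= 1.0 + 1e-8  # the peak never exceeds max_lr
    assert lrs[-1] < lrs[0]        # the schedule ends below the initial lr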

Related Issues and PRs

(Is this PR part of a group of changes? Link the other relevant PRs and Issues here. Use https://help.github.com/en/articles/closing-issues-using-keywords for help on GitHub syntax)

@The-truthh force-pushed the lr branch 6 times, most recently from 4b4ce59 to b5d88b0, on May 22, 2023 07:01
    return lrs


def one_cycle_lr(pct_start, anneal_strategy, three_phase, min_lr, max_lr, *, lr, steps_per_epoch, epochs):
Collaborator:

Suggest adding a comment noting where this method comes from.
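
For reference, the one-cycle policy originates in Smith and Topin's super-convergence paper and is also implemented as torch.optim.lr_scheduler.OneCycleLR in PyTorch. A minimal sketch of the two-phase (non-three-phase) policy as a per-step list, assuming cosine annealing, could look like the following; this is an illustration only, not the PR's implementation:

import math

def one_cycle_sketch(pct_start, min_lr, max_lr, *, lr, total_steps):
    # Illustrative two-phase one-cycle policy: rise from lr to max_lr over
    # pct_start of the run, then anneal from max_lr down to min_lr.
    def cos_anneal(start, end, pct):
        return end + (start - end) * (1.0 + math.cos(math.pi * pct)) / 2.0

    up_steps = max(int(pct_start * total_steps), 1)
    down_steps = max(total_steps - up_steps, 1)
    lrs = []
    for step in range(total_steps):
        if step < up_steps:
            lrs.append(cos_anneal(lr, max_lr, step / up_steps))
        else:
            lrs.append(cos_anneal(max_lr, min_lr, (step - up_steps) / down_steps))
    return lrs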

train.py Outdated
        three_phase=args.three_phase,
        step_size_up=args.step_size_up,
        step_size_down=args.step_size_down,
        decay_mode=args.decay_mode,
Collaborator:

This pile of decay parameters is quite confusing. Suggest improving the names, or adding a prefix to tell them apart, and consider trimming the number of parameters. For example (see the sketch after this list):
decay_strategy -> oc_decay_strategy
three_phase -> is_oc_three_phase
step_size_up -> oc_ascend_steps
step_size_down -> oc_descend_steps
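
Purely to illustrate the prefixed naming, the config entries might read as follows, in the same argparse style as config.py; the names and defaults here are hypothetical placeholders, not what the PR finally uses.

# Hypothetical sketch of the prefixed names suggested above.
import argparse

parser = argparse.ArgumentParser()
group = parser.add_argument_group('Scheduler parameters')
group.add_argument('--oc_decay_strategy', type=str, default='cos',
                   help='Annealing strategy for OneCycleLR/CyclicLR (cos or linear)')
group.add_argument('--is_oc_three_phase', action='store_true',
                   help='Use the three-phase variant of OneCycleLR')
group.add_argument('--oc_ascend_steps', type=int, default=2000,
                   help='Number of steps in the increasing half of a CyclicLR cycle')
group.add_argument('--oc_descend_steps', type=int, default=None,
                   help='Number of steps in the decreasing half of a CyclicLR cycle')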

Contributor:

Maybe don't expose these for now; it feels overly complex. Don't pass these parameters into create_scheduler here.

config.py Outdated
                   help='Learning rate (default=0.001)')
group.add_argument('--min_lr', type=float, default=1e-6,
                   help='The minimum value of learning rate if scheduler supports (default=1e-6)')
group.add_argument('--max_lr', type=float, default=2.0,
Contributor:

Is this really necessary to add? lr could already stand for max_lr.

Collaborator (Author):

No. According to the settings in the paper, the ResNet one-cycle learning rate goes from 0.05 (lr) up to 1.0 (max_lr) and finally decays to 0.00005 (min_lr).

Collaborator (Author):

Also, that way we could no longer use warm_up to reach the initial learning rate.
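
To make the three reference points concrete: with the paper's ResNet settings quoted above, and the PyTorch-style ratios that appear later in the diff, the numbers work out as in this small sketch (the variable names are assumptions for illustration):

# The three reference points discussed above, for ResNet one-cycle training.
initial_lr = 0.05    # lr: the value that warm-up should reach
peak_lr = 1.0        # max_lr: the top of the cycle
final_lr = 0.00005   # min_lr: the value annealed to at the very end

# Expressed as the two ratios used by the PyTorch-style OneCycleLR API:
div_factor = peak_lr / initial_lr          # 20.0
final_div_factor = initial_lr / final_lr   # 1000.0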

    return lrs


def one_cycle_refined_lr(pct_start, anneal_strategy, three_phase, min_lr, max_lr, *, lr, steps_per_epoch, epochs):
Contributor:

This one shouldn't need a refined variant; it already varies at every step.

Contributor:

Is there no way to do the momentum (cycling) part?

train.py Outdated
        three_phase=args.three_phase,
        step_size_up=args.step_size_up,
        step_size_down=args.step_size_down,
        decay_mode=args.decay_mode,
Contributor:

Maybe don't expose these for now; it feels overly complex. Don't pass these parameters into create_scheduler here.

@The-truthh force-pushed the lr branch 6 times, most recently from 36354b1 to ba4c0e1, on June 5, 2023 09:08
    final_div_factor = initial_lr / min_lr
    main_lr_scheduler = one_cycle_lr(
        max_lr=lr,
        pct_start=0.3,
Contributor:

These are just default values; they don't need to be written in the factory.

Collaborator (Author):

Fixed.
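
After that change, the trimmed factory call might reduce to something like the sketch below. This is hypothetical: the example values come from the discussion above, the import path is an assumption, and pct_start, anneal_strategy, and three_phase are assumed to fall back to defaults declared on one_cycle_lr itself.

# Hypothetical sketch of the trimmed factory call: only user-configurable
# values are forwarded; the remaining arguments are assumed to have defaults
# on one_cycle_lr itself.
from mindcv.scheduler.dynamic_lr import one_cycle_lr  # path is an assumption

lr, max_lr, min_lr = 0.05, 1.0, 0.00005   # example values from the discussion
steps_per_epoch, main_epochs = 500, 90    # placeholder training dimensions
main_lr_scheduler = one_cycle_lr(
    min_lr=min_lr,
    max_lr=max_lr,
    lr=lr,
    steps_per_epoch=steps_per_epoch,
    epochs=main_epochs,
)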

@The-truthh changed the title from "[Feature] Add OneCycleLR and CyclicLR scheduler" to "feat: Add OneCycleLR and CyclicLR scheduler" on Jun 8, 2023
@The-truthh changed the title from "feat: Add OneCycleLR and CyclicLR scheduler" to "feat: add OneCycleLR and CyclicLR scheduler" on Jun 8, 2023