Conversation

@eywalker (Member) commented Mar 5, 2018

  • It appears that, due to an update in PyTorch, the previous implementation of BiasBatchNorm2d no longer works. I have updated the implementation to be cleaner and, hopefully, less dependent on the exact implementation of nn.BatchNorm2d and nn.Module (see the diff and the sketch further below).
  • The Pyramid implementation is now fixed to follow the standard Laplacian image pyramid construction (a reference sketch follows this list). Some special parameters were added to maintain backward compatibility with networks already trained using Pyramid.
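
For reference, a minimal sketch of the standard Laplacian pyramid construction in PyTorch. This illustrates the general scheme only; it is not the repo's Pyramid module, and the function name `laplacian_pyramid`, the use of average pooling as a stand-in for Gaussian blurring, and the bilinear upsampling are assumptions:

```python
import torch
import torch.nn.functional as F

def laplacian_pyramid(img, n_levels):
    """Standard Laplacian pyramid: each level stores the detail lost when
    the image is downsampled, plus the final low-pass residual."""
    levels = []
    current = img  # expected shape: (batch, channels, height, width)
    for _ in range(n_levels - 1):
        down = F.avg_pool2d(current, 2)  # stand-in for Gaussian blur + downsample
        up = F.interpolate(down, size=current.shape[-2:], mode='bilinear',
                           align_corners=False)
        levels.append(current - up)      # band-pass detail at this scale
        current = down
    levels.append(current)               # low-pass residual
    return levels

# Reconstruction inverts the loop: start from the residual, upsample,
# and add back each detail level in reverse order.
```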

@eywalker requested a review from fabiansinz on March 5, 2018 03:33
@eywalker changed the title from "Fix BiasBatchNorm2d layer" to "Fix BiasBatchNorm2d layer and Pyramid implementation" on Mar 7, 2018
```diff
- super().__init__(features, **kwargs)
- self.bias = nn.Parameter(torch.Tensor(features))
+ super().__init__()
+ self.bn = nn.BatchNorm2d(features, **kwargs)
```
Contributor

Shouldn't that have an `affine=False` by default then?

Member Author

Making `affine=False` a default would allow the user to override it, and it also suggests that the behavior can be changed. Given this, I prefer to simply fix the value in `__init__`, as sketched below.
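
Combining the diff above with this decision, a minimal sketch of what the wrapper plausibly looks like; the forward pass, the zero-initialized bias, and fixing affine by passing `affine=False` explicitly (rather than, say, overwriting it in kwargs) are assumptions, not the repo's exact code:

```python
import torch
import torch.nn as nn

class BiasBatchNorm2d(nn.Module):
    """Batch norm with a learnable per-channel bias but no learnable scale.

    Sketch only: composition over nn.BatchNorm2d instead of subclassing,
    with affine fixed to False inside __init__ rather than exposed as an
    overridable default argument.
    """

    def __init__(self, features, **kwargs):
        super().__init__()
        # affine is fixed here; passing affine via kwargs raises a TypeError,
        # which matches the intent of not letting callers change it.
        self.bn = nn.BatchNorm2d(features, affine=False, **kwargs)
        self.bias = nn.Parameter(torch.zeros(features))  # assumed initialization

    def forward(self, x):
        # normalize without a learned scale, then add the learned bias
        return self.bn(x) + self.bias.view(1, -1, 1, 1)
```

For example, `BiasBatchNorm2d(64)(torch.randn(8, 64, 16, 16))` normalizes each channel without a learned scale and then adds the learned per-channel bias.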

@fabiansinz merged commit baccca5 into atlab:master on Mar 22, 2018