
feature(norm): Add GroupNorm #963

Merged: 5 commits into tracel-ai:main on Nov 21, 2023
Conversation

@dcvz (Contributor) commented Nov 16, 2023

Pull Request Template

Checklist

  • Confirm that the run-checks script has been executed.

Related Issues/PRs

Resolves #685
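
For reference, group normalization (Wu & He, 2018) splits the C channels into num_groups groups and standardizes each group per sample, followed by an optional per-channel affine transform:

$$ y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} \cdot \gamma + \beta $$

Here the mean and variance are computed over the channels and spatial positions of each group, and gamma and beta are the learnable parameters that exist only when affine = true.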

@dcvz dcvz marked this pull request as ready for review November 16, 2023 22:20

codecov bot commented Nov 16, 2023

Codecov Report

Attention: 22 lines in your changes are missing coverage. Please review.

Comparison is base (945014b) 87.50% compared to head (7adeefe) 87.40%.

Files                          | Patch % | Lines
burn-core/src/nn/norm/group.rs | 80.18%  | 22 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #963      +/-   ##
==========================================
- Coverage   87.50%   87.40%   -0.10%     
==========================================
  Files         502      503       +1     
  Lines       51080    51074       -6     
==========================================
- Hits        44697    44642      -55     
- Misses       6383     6432      +49     


@antimora (Collaborator) commented:

@nathanielsimard and @louisfd would know more, but backward pass of GroupNorm also needs to be implemented.

@nathanielsimard (Member) left a comment:

Thanks a lot for this implementation! I only have minor comments, they should be easy to fix!

Comment on lines 33 to 34
gamma: Param<Tensor<B, 1>>,
beta: Param<Tensor<B, 1>>,
@nathanielsimard (Member):

I would wrap those in Option so that when group norm is configured with affine=false, no weights are added!
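
A minimal sketch of the suggested shape (the non-parameter fields are assumptions, not the merged code):

use burn::module::{Module, Param};
use burn::tensor::{backend::Backend, Tensor};

#[derive(Module, Debug)]
pub struct GroupNorm<B: Backend> {
    num_groups: usize,
    num_channels: usize,
    epsilon: f64,
    affine: bool,
    // None when affine == false, so no weights are registered or serialized.
    gamma: Option<Param<Tensor<B, 1>>>,
    beta: Option<Param<Tensor<B, 1>>>,
}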

use crate::TestBackend;

#[test]
fn group_norm_forward() {
@nathanielsimard (Member):

I think it would be valuable to have two tests, one with affine=true and one with affine=false.

@dcvz (Contributor, Author):

What would you like to see tested with affine=true? That gamma and beta get set in the struct, or that gamma and beta are updated during learning?

@nathanielsimard (Member):

Just testing the forward pass with another set of numbers.
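
Something along these lines, as a sketch (config and method names assumed from this PR's API; shapes and values hypothetical):

use burn::tensor::Distribution;

#[test]
fn group_norm_forward_affine_false() {
    let module = GroupNormConfig::new(2, 6)
        .with_affine(false)
        .init::<TestBackend>();

    let input = Tensor::<TestBackend, 3>::random([2, 6, 4], Distribution::Default);
    let output = module.forward(input);

    // Without gamma/beta, each group of channels is only standardized.
    assert_eq!(output.dims(), [2, 6, 4]);
}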

@nathanielsimard (Member) commented:

> @nathanielsimard and @louisfd would know more, but backward pass of GroupNorm also needs to be implemented.

Not at this level, the autodiff system will take care of that!

@antimora (Collaborator) commented:

> @nathanielsimard and @louisfd would know more, but backward pass of GroupNorm also needs to be implemented.

> Not at this level, the autodiff system will take care of that!

Is it because GN uses Burn op primitives, which Autodiff already implements?
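
That is the idea: the forward pass is written purely in terms of tensor primitives (reshape, mean, variance, sqrt, element-wise ops), each of which already has a gradient rule in burn-autodiff, so the chain rule composes them automatically. A rough sketch of what that buys (TestADBackend is a hypothetical autodiff test-backend alias):

#[test]
fn group_norm_backward_via_autodiff() {
    let module = GroupNormConfig::new(2, 6).init::<TestADBackend>();
    let x = Tensor::<TestADBackend, 3>::random([2, 6, 4], Distribution::Default)
        .require_grad();
    // No GroupNorm-specific backward code is needed: every primitive op
    // used in forward() supplies its own gradient rule.
    let _grads = module.forward(x).sum().backward();
}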

@dcvz mentioned this pull request on Nov 18, 2023.
@dcvz (Contributor, Author) commented Nov 19, 2023

@nathanielsimard I think I've addressed everything now!

@antimora (Collaborator) commented:

> @nathanielsimard I think I've addressed everything now!

Failing due to formatting

@dcvz (Contributor, Author) commented Nov 20, 2023

> @nathanielsimard I think I've addressed everything now!

> Failing due to formatting

Should be resolved now

@antimora (Collaborator) commented:

Can we also add an entry to the book?

Here is the section: https://burn.dev/book/building-blocks/module.html#general

@dcvz (Contributor, Author) commented Nov 20, 2023

> Can we also add an entry to the book?
> Here is the section: https://burn.dev/book/building-blocks/module.html#general

Added!

@dcvz (Contributor, Author) commented Nov 20, 2023

Looks like CI is failing, but not because of this PR: it's failing in the lavapipe install step. It might need a retrigger.

@antimora (Collaborator) left a comment:

LGTM. Thank you!

@louisfd (Member) commented Nov 21, 2023

@nathanielsimard your requests have been fulfilled, so I'll merge.

@louisfd merged commit 88b4420 into tracel-ai:main on Nov 21, 2023
6 checks passed