feature(norm): Add GroupNorm #963
Conversation
Codecov Report

@@            Coverage Diff            @@
##             main     #963     +/-  ##
==========================================
- Coverage   87.50%   87.40%   -0.10%
==========================================
  Files         502      503       +1
  Lines       51080    51074       -6
==========================================
- Hits        44697    44642      -55
- Misses       6383     6432      +49

View full report in Codecov by Sentry.
@nathanielsimard and @louisfd would know more, but the backward pass of GroupNorm also needs to be implemented.
Thanks a lot for this implementation! I only have minor comments, they should be easy to fix!
burn-core/src/nn/norm/group.rs
Outdated
gamma: Param<Tensor<B, 1>>,
beta: Param<Tensor<B, 1>>,
I would wrap those in Option, so when group norm is configured with affine=false, no weights are added!
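A rough sketch of that suggestion in plain Rust (using a hypothetical GroupNormSketch struct with Vec<f32> stand-ins rather than burn's actual Param/Tensor types): the learnable scale and shift only exist when affine is true, so the affine=false configuration genuinely carries zero parameters.

```rust
// Sketch only: plain-Rust stand-in for the suggested design, not burn's API.
// gamma/beta exist only when `affine` is true, so affine=false adds no weights.
struct GroupNormSketch {
    gamma: Option<Vec<f32>>, // per-channel scale, None when affine = false
    beta: Option<Vec<f32>>,  // per-channel shift, None when affine = false
}

impl GroupNormSketch {
    fn new(num_channels: usize, affine: bool) -> Self {
        if affine {
            Self {
                gamma: Some(vec![1.0; num_channels]),
                beta: Some(vec![0.0; num_channels]),
            }
        } else {
            Self { gamma: None, beta: None }
        }
    }

    // Count of learnable parameters; zero when affine = false.
    fn num_params(&self) -> usize {
        self.gamma.as_ref().map_or(0, |g| g.len())
            + self.beta.as_ref().map_or(0, |b| b.len())
    }
}

fn main() {
    let with_affine = GroupNormSketch::new(4, true);
    let without_affine = GroupNormSketch::new(4, false);
    println!("{} {}", with_affine.num_params(), without_affine.num_params());
}
```

In burn itself this would be expressed with `Option<Param<Tensor<B, 1>>>` fields, so the module's parameter list stays empty in the non-affine case.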
burn-core/src/nn/norm/group.rs
Outdated
use crate::TestBackend;

#[test]
fn group_norm_forward() {
I think it would be valuable to have two tests, one with affine=true and one with affine=false.
What would you like to see tested with affine=true? That gamma and beta get set in the struct, or that gamma and beta get updated during learning?
Just testing the forward pass with another set of numbers
Not at this level, the autodiff system will take care of that!
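For readers following along, the forward pass being tested above can be sketched in plain Rust (a hypothetical group_norm_forward function over flat Vec<f32> data, not burn's tensor implementation): channels are split into groups and each group is normalized by its own mean and variance.

```rust
// Sketch of the GroupNorm forward math on a single sample, using plain
// slices instead of burn tensors. `x` is flattened [channels, spatial];
// each of the `num_groups` channel groups is normalized independently.
fn group_norm_forward(
    x: &[f32],
    channels: usize,
    spatial: usize,
    num_groups: usize,
    eps: f32,
) -> Vec<f32> {
    assert_eq!(x.len(), channels * spatial);
    assert_eq!(channels % num_groups, 0);
    let group_size = (channels / num_groups) * spatial;
    let mut out = vec![0.0; x.len()];
    for g in 0..num_groups {
        let slice = &x[g * group_size..(g + 1) * group_size];
        let mean: f32 = slice.iter().sum::<f32>() / group_size as f32;
        let var: f32 =
            slice.iter().map(|v| (v - mean).powi(2)).sum::<f32>() / group_size as f32;
        let inv_std = 1.0 / (var + eps).sqrt();
        for (i, v) in slice.iter().enumerate() {
            // Per-channel gamma/beta would be applied here when affine = true.
            out[g * group_size + i] = (v - mean) * inv_std;
        }
    }
    out
}

fn main() {
    // 4 channels, 2 spatial positions, 2 groups of 2 channels each.
    let x: Vec<f32> = (0..8).map(|i| i as f32).collect();
    let y = group_norm_forward(&x, 4, 2, 2, 1e-5);
    // Each group of the output should have mean close to 0.
    let group_mean: f32 = y[..4].iter().sum::<f32>() / 4.0;
    println!("{:.3}", group_mean);
}
```

A forward test like the one in this PR would compare such outputs, for a couple of fixed inputs, against reference values (e.g. from another framework).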
Is it because GN uses Burn op primitives, which Autodiff already implements?
@nathanielsimard I think I've addressed everything now!
Failing due to formatting
Should be resolved now
Can we also add an entry to the book? Here is the section: https://burn.dev/book/building-blocks/module.html#general
Added!
Looks like CI is failing, but not due to the PR: it fails in the lavapipe install step. Might need a retrigger.
LGTM. Thank you!
@nathanielsimard your requests have been fulfilled, so I'll merge.
Pull Request Template
Checklist
The run-checks script has been executed.
Related Issues/PRs
Resolves #685