Refactor/autotune/key #924
Conversation
burn-wgpu/src/compute/tune_key.rs
Outdated
    Matmul {
        /// The key with specific matmul information
        matmul_key: MatmulAutotuneKey,
    },
}
The field name is not necessary:

enum WgpuAutotuneKey {
    Matmul(MatmulAutotuneKey),
}
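For illustration only, a minimal sketch of the suggested shape; the derives and the reduced field list here are my assumptions, not taken from the diff:

```rust
/// Simplified stand-in for the matmul-specific key; its real fields are
/// discussed further down in this review.
#[derive(Hash, Eq, PartialEq, Debug, Clone)]
pub struct MatmulAutotuneKey {
    anchored_m: usize,
    anchored_k: usize,
    anchored_n: usize,
}

/// Top-level autotune key: one tuple variant per tunable operation, so the
/// extra `matmul_key` field name disappears.
#[derive(Hash, Eq, PartialEq, Debug, Clone)]
pub enum WgpuAutotuneKey {
    Matmul(MatmulAutotuneKey),
}
```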
burn-compute/src/tune/tuner.rs
Outdated
-pub struct Tuner<S, C> {
-    tune_cache: TuneCache<S>,
+pub struct Tuner<S: ComputeServer, C> {
+    tune_cache: TuneCache<S::AutotuneKey>,
+    _server: PhantomData<S>,
I don't think this phantom data is necessary.
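Assuming AutotuneKey is an associated type on ComputeServer (as the diff suggests), the type parameter S is already "used" through the TuneCache<S::AutotuneKey> field, so the marker for S can go. A hedged sketch with simplified stand-ins for the trait and the cache:

```rust
use std::collections::HashMap;
use std::marker::PhantomData;

/// Simplified stand-in for the real trait: the server picks its own key type.
pub trait ComputeServer {
    type AutotuneKey: std::hash::Hash + Eq;
}

/// Simplified stand-in for TuneCache, keyed by the server's autotune key.
pub struct TuneCache<K> {
    fastest: HashMap<K, usize>,
}

/// `S` appears in `TuneCache<S::AutotuneKey>`, so no `PhantomData<S>` field is
/// required for it. `C` still needs a marker in this sketch only because
/// nothing else here references it.
pub struct Tuner<S: ComputeServer, C> {
    tune_cache: TuneCache<S::AutotuneKey>,
    _channel: PhantomData<C>,
}
```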
pub struct MatmulAutotuneKey {
    round: bool,     // True when all matmul dims are multiples of 64
    broadcast: bool, // True when there are differences in batch size
    anchored_m: usize,
    anchored_k: usize,
    anchored_n: usize,
}
I think we wanted to include the round batch size as well!
I'm not sure if you mean a bool for whether batch sizes are all multiples of 64, or an anchored batch size as a usize (for all batch dims?)
anchored batch size for all batch dims!
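If I read the thread correctly, the key would gain an anchored batch field. A hypothetical sketch of the resulting struct plus an anchoring helper; the anchored_batch name and the power-of-two bucketing are my assumptions, not part of the PR:

```rust
/// Matmul autotune key with an anchored batch size, as suggested above.
/// Dimensions are anchored (bucketed) so that nearby shapes reuse one cache
/// entry instead of retuning for every exact shape.
#[derive(Hash, Eq, PartialEq, Debug, Clone)]
pub struct MatmulAutotuneKey {
    round: bool,           // True when all matmul dims are multiples of 64
    broadcast: bool,       // True when there are differences in batch size
    anchored_m: usize,
    anchored_k: usize,
    anchored_n: usize,
    anchored_batch: usize, // Anchored product of all batch dims (assumption)
}

/// Hypothetical anchoring helper: rounding up to the next power of two is an
/// assumption here, the actual bucketing scheme may differ.
fn anchor(dim: usize) -> usize {
    dim.next_power_of_two()
}
```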
Codecov Report
Additional details and impacted files

@@            Coverage Diff             @@
##             main     #924      +/-   ##
==========================================
- Coverage   86.54%   86.52%   -0.03%
==========================================
  Files         469      471       +2
  Lines       44191    44238      +47
==========================================
+ Hits        38244    38275      +31
- Misses       5947     5963      +16

☔ View full report in Codecov by Sentry.
Refactored the wgpu autotune key into an enum and a struct. It should now be more robust, clearer, and more efficient.
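For context, a hedged sketch of how a key like this could drive a fastest-kernel lookup; the plain HashMap cache and the usize kernel index are placeholders, not the crate's actual TuneCache API:

```rust
use std::collections::HashMap;
use std::hash::Hash;

/// Placeholder lookup: for any autotune key type that derives Hash + Eq (such
/// as the WgpuAutotuneKey sketched above), finding the previously tuned
/// fastest kernel index is a plain map access.
fn fastest_index<K: Hash + Eq>(cache: &HashMap<K, usize>, key: &K) -> Option<usize> {
    cache.get(key).copied()
}
```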