Fix fantasization with FixedNoiseGP and outcome transforms and use FantasizeMixin #2011
Conversation
This pull request was exported from Phabricator. Differential Revision: D49200325
Codecov Report
```
@@            Coverage Diff            @@
##              main     #2011   +/-   ##
===========================================
- Coverage   100.00%    99.99%   -0.01%
===========================================
  Files          179       179
  Lines        15798     15806       +8
===========================================
+ Hits         15798     15805       +7
- Misses           0         1       +1
```
…ntasizeMixin (pytorch#2011) Summary: This fixes fantasization with FixedNoiseGP when using outcome transforms: previously, already-transformed noise was transformed again during fantasization. It also improves fantasization for batched and batched multi-output models to use the average noise for each batch and output, removes repeated code by reusing the logic in `FantasizeMixin.fantasize` for handling `X` with size 0 on the -2 dimension, and deprecates the use of `observation_noise` as a boolean argument to `fantasize`. Differential Revision: D49200325
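The double-transformation bug described above can be illustrated with a toy sketch (this is not BoTorch's actual code; `standardize_noise_var`, `sigma`, and the numbers are purely illustrative). A standardizing outcome transform rescales noise variances by the squared outcome scale, so applying it to noise that is already in the transformed space silently shrinks the noise:

```python
# Toy illustration of the bug: noise already in the transformed
# (standardized) outcome space being transformed a second time.

def standardize_noise_var(noise_var, sigma):
    """Map a noise variance into standardized-outcome space."""
    return noise_var / sigma**2

sigma = 2.0        # hypothetical outcome standard deviation
raw_noise = 0.4    # noise variance in the original outcome space

# Correct: transform the raw noise exactly once.
once = standardize_noise_var(raw_noise, sigma)

# Bug: the model already stores transformed noise, yet fantasize
# transformed it again during fantasization.
twice = standardize_noise_var(once, sigma)

assert abs(once - 0.1) < 1e-12
assert twice != once  # double transformation silently shrinks the noise
```

The fix is simply to skip the second transformation when the stored noise is already in transformed space.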
Force-push history (the same commit message re-exported from Phabricator for D49200325; revisions from c3c814b onward add "Reviewed By: Balandat"):
d548d6b → 33d0523 → f7319cb → b1a493d → c3c814b → cde2f7d → 77137d4 → 169cb69
…ntasizeMixin (pytorch#2011) Summary: Pull Request resolved: pytorch#2011. This fixes fantasization with FixedNoiseGP when using outcome transforms: previously, already-transformed noise was transformed again during fantasization. It also improves fantasization for batched and batched multi-output models to use the average noise for each batch and output, removes repeated code by reusing the logic in `FantasizeMixin.fantasize` for handling `X` with size 0 on the -2 dimension, and deprecates the use of `observation_noise` as a boolean argument to `fantasize`. Differential Revision: https://internalfb.com/D49200325 fbshipit-source-id: f2150e08009bfdf0f86b0b9e5908610dbb6709ee
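The deprecation mentioned in the commit message follows a common pattern: a boolean flag is replaced by an explicit value (here, a tensor of noise observations). The sketch below is hypothetical and stdlib-only; the function name and fallback behavior are illustrative, not BoTorch's exact code:

```python
import warnings

def fantasize(X, observation_noise=None):
    # Illustrative deprecation shim: booleans were previously accepted
    # for `observation_noise`; callers should now pass actual noise
    # values (or None to use the model's default noise).
    if isinstance(observation_noise, bool):
        warnings.warn(
            "Passing a boolean for `observation_noise` is deprecated; "
            "pass a tensor of noise values (or None) instead.",
            DeprecationWarning,
        )
        observation_noise = None  # fall back to the default noise
    return observation_noise

# A boolean still works during the deprecation window, but warns.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    fantasize([[0.5]], observation_noise=True)

assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```

Warning instead of raising keeps existing call sites working while the signature migrates.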
Summary: Pull Request resolved: pytorch#2013. See title; this is causing a failure in the tutorials on pytorch#2011. Reviewed By: Balandat. Differential Revision: D49382057
This pull request has been merged in cbb9ce4.
Summary:
This fixes fantasization with FixedNoiseGP and outcome transforms, where already-transformed noise was outcome-transformed again. This also improves fantasization for batched and batched multi-output models to use the average noise for each batch and output. This also removes repeated code and uses the logic in `FantasizeMixin.fantasize` for handling `X` with size 0 on the -2 dimension.

Differential Revision: D49200325
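The size-0 special case can be sketched as follows (a stdlib-only illustration, not BoTorch's implementation; the model representation and function signature are hypothetical). When `X` has no rows along the -2 dimension, there is nothing to condition on, so an independent copy of the model is returned instead of running the conditioning path:

```python
from copy import deepcopy

def fantasize(model, X, num_fantasies):
    # Special case: `X` has size 0 on the -2 (row) dimension, i.e. no
    # candidate points. There is nothing to condition on, so return an
    # independent copy of the model unchanged.
    if len(X) == 0:
        return deepcopy(model)
    # Otherwise (omitted in this sketch): draw `num_fantasies` posterior
    # samples at `X` and condition the model on the sampled outcomes.
    raise NotImplementedError

base_model = {"train_X": [[0.0], [1.0]], "train_Y": [0.0, 1.0]}
fantasy_model = fantasize(base_model, X=[], num_fantasies=4)

assert fantasy_model == base_model        # same training data
assert fantasy_model is not base_model    # but an independent copy
```

Centralizing this branch in `FantasizeMixin.fantasize` is what lets the PR delete the duplicated copies of it from individual model classes.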