
Fix fantasization with FixedNoiseGP and outcome transforms and use FantasizeMixin #2011

Closed · wants to merge 1 commit

Conversation

sdaulton (Contributor)

Summary:
This fixes fantasization with FixedNoiseGP and outcome transforms: previously, the already-transformed observation noise was outcome-transformed a second time.
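
To see why this matters, here is a minimal, hypothetical sketch (plain Python, not BoTorch's actual code) of what double-applying a Standardize-style outcome transform does to a noise variance. The `transform_variance` helper and the numbers are illustrative assumptions:

```python
# Hypothetical illustration of the double-transform bug (not BoTorch's code).
# A Standardize-style outcome transform scales observed noise variances by
# 1 / std**2. If the stored train_Yvar is already in the transformed space,
# applying the transform again during fantasization divides by std**2 twice.

def transform_variance(yvar, std):
    """Scale a noise variance into standardized outcome space."""
    return yvar / std**2

raw_yvar, std = 4.0, 2.0

# Correct: the noise is transformed exactly once, at model construction.
stored_yvar = transform_variance(raw_yvar, std)            # 1.0

# Bug: fantasization transformed the already-transformed noise again,
# so the fantasy model saw noise that was too small by a factor of std**2.
double_transformed = transform_variance(stored_yvar, std)  # 0.25

assert stored_yvar == 1.0
assert double_transformed == 0.25
```

The fix is to hand the stored (already-transformed) noise to the fantasization path without passing it through the transform again.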

This also improves fantasization for batched and batched multi-output models by using the average noise for each batch and output.
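
A minimal sketch of that averaging, using plain Python lists rather than tensors (the function name and shapes are illustrative assumptions, not BoTorch's API): the observed noise is averaged over the training-point dimension (-2), so each batch and each output keeps its own noise level instead of sharing one global mean.

```python
# Hypothetical sketch (not BoTorch's implementation): when fantasizing at new
# points, a fixed-noise model needs a noise level for them. Averaging the
# observed noise per batch and per output preserves a distinct level for
# each batch/output combination.

# noise[b][n][m]: batch b, training point n, output m
noise = [
    [[1.0, 10.0], [3.0, 30.0]],   # batch 0
    [[2.0, 20.0], [4.0, 40.0]],   # batch 1
]

def mean_noise_per_batch_and_output(noise):
    """Average over the training-point dimension (-2), keeping batch/output."""
    out = []
    for batch in noise:
        n = len(batch)
        m = len(batch[0])
        out.append([sum(row[j] for row in batch) / n for j in range(m)])
    return out

assert mean_noise_per_batch_and_output(noise) == [[2.0, 20.0], [3.0, 30.0]]
```

With tensors this would amount to something like a mean over `dim=-2`, broadcast to the fantasy points.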

This also removes repeated code by reusing the logic in FantasizeMixin.fantasize for handling X with size 0 on the -2 dimension.
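
A hypothetical sketch of that shared shortcut (names and return values are illustrative, not BoTorch's exact code): when `X` is empty along the -2 (number-of-points) dimension there is nothing to fantasize, so the mixin can handle this once rather than each model class reimplementing it.

```python
# Hypothetical sketch of the empty-X special case handled in one place
# (illustrative only, not FantasizeMixin's actual code).

def fantasize(model, X_shape):
    if X_shape[-2] == 0:
        # No candidate points: skip posterior sampling and condition on
        # zero new observations.
        return "conditioned-on-empty"
    return "fantasy-model"

assert fantasize(None, (5, 0, 3)) == "conditioned-on-empty"
assert fantasize(None, (5, 4, 3)) == "fantasy-model"
```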

Differential Revision: D49200325

facebook-github-bot added the "CLA Signed" label on Sep 13, 2023.
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D49200325

codecov bot commented Sep 13, 2023

Codecov Report

Merging #2011 (529d5ba) into main (fa51038) will decrease coverage by 0.01%.
The diff coverage is 96.29%.

❗ The current head 529d5ba differs from the pull request's most recent head 169cb69. Consider uploading reports for commit 169cb69 to get more accurate results.

@@             Coverage Diff             @@
##              main    #2011      +/-   ##
===========================================
- Coverage   100.00%   99.99%   -0.01%     
===========================================
  Files          179      179              
  Lines        15798    15806       +8     
===========================================
+ Hits         15798    15805       +7     
- Misses           0        1       +1     
Files Changed Coverage Δ
botorch/acquisition/active_learning.py 100.00% <ø> (ø)
botorch/acquisition/knowledge_gradient.py 100.00% <ø> (ø)
botorch/acquisition/max_value_entropy_search.py 100.00% <ø> (ø)
...sition/multi_objective/max_value_entropy_search.py 100.00% <ø> (ø)
botorch/acquisition/multi_step_lookahead.py 100.00% <ø> (ø)
botorch/models/model.py 99.45% <88.88%> (-0.55%) ⬇️
botorch/models/gp_regression.py 100.00% <100.00%> (ø)
botorch/models/gpytorch.py 100.00% <100.00%> (ø)
botorch/utils/testing.py 100.00% <100.00%> (ø)


sdaulton added a commit to sdaulton/botorch that referenced this pull request Sep 14, 2023
…ntasizeMixin (pytorch#2011)

Summary:

This fixes fantasization with FixedNoiseGP when using outcome transforms: previously, already-transformed noise was transformed again during fantasization.

This also improves the fantasization for batched and batched multi-output models to use the average noise for each batch and output.

This also removes repeated code and uses the logic in `FantasizeMixin.fantasize` for handling `X` with size 0 on the -2 dimension.

This also deprecates the use of `observation_noise` as a boolean argument to fantasize.
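
A common shape for such a deprecation is a shim that accepts the old boolean, warns, and maps it onto the new convention. The sketch below is a hypothetical illustration (the mapping and signature are assumptions, not BoTorch's actual `fantasize` API):

```python
# Hypothetical deprecation shim (illustrative only): a bool observation_noise
# still works but emits a DeprecationWarning and is mapped onto the new
# convention, giving callers time to migrate to a noise tensor or None.

import warnings

def fantasize(X, observation_noise=None):
    if isinstance(observation_noise, bool):
        warnings.warn(
            "Passing a bool for observation_noise is deprecated; pass a "
            "noise tensor or None instead.",
            DeprecationWarning,
        )
        # Assumed mapping: True -> use the model's noise (None),
        # False -> noiseless fantasies (0.0).
        observation_noise = None if observation_noise else 0.0
    return observation_noise

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = fantasize([[0.0]], observation_noise=True)

assert result is None
assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```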

Differential Revision: D49200325
sdaulton added a commit to sdaulton/botorch that referenced this pull request Sep 18, 2023
…ntasizeMixin (pytorch#2011)

Summary:

This fixes fantasization with FixedNoiseGP when using outcome transforms: previously, already-transformed noise was transformed again during fantasization.

This also improves the fantasization for batched and batched multi-output models to use the average noise for each batch and output.

This also removes repeated code and uses the logic in `FantasizeMixin.fantasize` for handling `X` with size 0 on the -2 dimension.

This also deprecates the use of `observation_noise` as a boolean argument to fantasize.

Reviewed By: Balandat

Differential Revision: D49200325

sdaulton added a commit to sdaulton/botorch that referenced this pull request Sep 18, 2023
…ntasizeMixin (pytorch#2011)

Summary:
Pull Request resolved: pytorch#2011

Differential Revision: https://internalfb.com/D49200325

fbshipit-source-id: f2150e08009bfdf0f86b0b9e5908610dbb6709ee
sdaulton added a commit to sdaulton/botorch that referenced this pull request Sep 18, 2023
Summary: see title. This is causing a failure in the tutorials on pytorch#2011.

Differential Revision: D49382057

fbshipit-source-id: f1c62d17d7485ce7e2a381646fad84885ce0d94b
facebook-github-bot pushed a commit that referenced this pull request Sep 18, 2023
Summary:
Pull Request resolved: #2013

see title. This is causing a failure in the tutorials on #2011.

Reviewed By: Balandat

Differential Revision: D49382057

fbshipit-source-id: e75494aab138e1a7ea67d5bcfab8a8698f77ab6c
@facebook-github-bot (Contributor)

This pull request has been merged in cbb9ce4.
