
[Feature] Max pool Transform #841

Merged: 21 commits merged into pytorch:main on Jan 23, 2023

Conversation

@albertbou92 (Contributor) commented Jan 18, 2023

Description

A new Transform that max-pools values over the time dimension. It was used, for example, in https://www.nature.com/articles/nature14236 to deal with flickering objects in Atari 2600 environments.

The PR also includes tests for the Transform.
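For illustration only (a sketch of the idea, not the exact API added by this PR), max pooling a stack of recent observations over the time dimension can be written as:

```python
import torch

def max_pool_over_time(frames: torch.Tensor) -> torch.Tensor:
    """Elementwise max over the time dimension of a [T, C, H, W] frame stack."""
    return frames.max(dim=0).values

# Example: pool the two most recent 84x84 grayscale observations,
# as done in Mnih et al. (2015) to remove flickering Atari sprites.
stack = torch.rand(2, 1, 84, 84)
pooled = max_pool_over_time(stack)  # shape [1, 84, 84]
```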

Types of changes

What types of changes does your code introduce? Remove all that do not apply:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds core functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation (update in the documentation)
  • Example (update in the folder of examples)

Checklist

Go over all the following points, and put an x in all the boxes that apply.
If you are unsure about any of these, don't hesitate to ask. We are here to help!

  • I have read the CONTRIBUTION guide (required)
  • My change requires a change to the documentation.
  • I have updated the tests accordingly (required for a bug fix or a new feature).
  • I have updated the documentation accordingly.

@facebook-github-bot added the CLA Signed label on Jan 18, 2023
@vmoens added the enhancement (New feature or request) label on Jan 19, 2023
@vmoens (Contributor) left a comment

Great implementation, I like it!
Can you address the small comments I left?

Four review threads on torchrl/envs/transforms/transforms.py (three outdated, all resolved)
@vmoens (Contributor) commented Jan 19, 2023

FYI, you may want to merge main; we have deprecated SavedTensorDict and tests fail because of it.

@vmoens (Contributor) commented Jan 19, 2023

I love the logic so much that I will duplicate it for cat frames :p

@albertbou92 (Contributor, Author) commented

I think the registration of buffers should be fine now
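As a minimal sketch of what buffer registration buys here (hypothetical class name, not the PR's actual implementation), keeping the rolling window as a registered buffer makes it move with `.to(device)` and appear in `state_dict()`:

```python
import torch
import torch.nn as nn

class MaxPoolOverTime(nn.Module):
    """Hypothetical illustration of buffer registration for a pooling window."""

    def __init__(self, T: int, obs_shape: tuple):
        super().__init__()
        # Registered buffers follow .to(device) and are saved/restored
        # with state_dict()/load_state_dict().
        self.register_buffer("window", torch.zeros(T, *obs_shape))

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        # Shift the window in time and append the newest observation in place,
        # so the registered buffer object itself is preserved.
        self.window.copy_(torch.roll(self.window, shifts=-1, dims=0))
        self.window[-1].copy_(obs)
        return self.window.max(dim=0).values
```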

@vmoens (Contributor) commented Jan 23, 2023

In #847 I introduced a bug: uninitialized buffers were not found by load_state_dict.
I solved it in #855.
Sorry about all this.
Have a look!

@albertbou92 (Contributor, Author) commented

Oh interesting, I did not know about UninitializedBuffers in PyTorch. I will adapt the code, no problem.
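For reference, a minimal sketch of the UninitializedBuffer pattern (assumed usage with hypothetical names, not this PR's final code): the buffer is registered without a shape and materialized on the first call.

```python
import torch
import torch.nn as nn
from torch.nn.parameter import UninitializedBuffer

class LazyMaxPoolOverTime(nn.Module):
    """Hypothetical illustration: the window shape is only known at first call."""

    def __init__(self, T: int):
        super().__init__()
        self.T = T
        # Placeholder buffer with no shape yet; allocated lazily below.
        self.register_buffer("window", UninitializedBuffer())

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        if isinstance(self.window, UninitializedBuffer):
            # Allocate the real buffer the first time an observation is seen.
            self.window.materialize((self.T, *obs.shape), dtype=obs.dtype, device=obs.device)
            self.window.zero_()
        self.window.copy_(torch.roll(self.window, shifts=-1, dims=0))
        self.window[-1].copy_(obs)
        return self.window.max(dim=0).values
```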

@vmoens (Contributor) left a comment

LGTM

@vmoens vmoens merged commit 985f5d1 into pytorch:main Jan 23, 2023
@albertbou92 albertbou92 deleted the max_pool_transform branch January 18, 2024 10:08
Labels: CLA Signed, enhancement (New feature or request)
Projects: None yet
3 participants