[Refactor] Update all instances of exploration *Wrapper to *Module #2298

Merged: 2 commits merged into pytorch:main from update-exploration-modules-0 on Jul 22, 2024

Conversation

@kurtamohler (Collaborator) commented Jul 19, 2024

Description

Update instances of

  • AdditiveGaussianWrapper --> AdditiveGaussianModule
  • OrnsteinUhlenbeckProcessWrapper --> OrnsteinUhlenbeckProcessModule

everywhere in the code base, except in test/test_exploration.py, which should continue to test both the wrappers and the modules until the wrappers are eventually removed.
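
For illustration, a minimal before/after sketch of the pattern this PR applies (not taken from the diff; the actor, the BoundedTensorSpec, and the constructor arguments are assumptions about the surrounding torchrl API):

import torch
from tensordict.nn import TensorDictModule, TensorDictSequential
from torchrl.data import BoundedTensorSpec
from torchrl.modules import OrnsteinUhlenbeckProcessModule

action_spec = BoundedTensorSpec(low=-1.0, high=1.0, shape=(2,))
actor = TensorDictModule(
    torch.nn.Linear(4, 2), in_keys=["observation"], out_keys=["action"]
)

# Before (deprecated): the wrapper enveloped the policy itself.
# exploration_policy = OrnsteinUhlenbeckProcessWrapper(actor, spec=action_spec)

# After: the exploration module is appended to the policy in a
# TensorDictSequential, so it is reachable as exploration_policy[1].
exploration_policy = TensorDictSequential(
    actor,
    OrnsteinUhlenbeckProcessModule(spec=action_spec),
)
exploration_policy[1].step(100)  # advance the noise annealing by 100 frames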

Motivation and Context

Closes #2295.

  • I have raised an issue to propose this change (required for new features and bug fixes)

Types of changes

What types of changes does your code introduce? Remove all that do not apply:

Checklist

Go over all the following points, and put an x in all the boxes that apply.
If you are unsure about any of these, don't hesitate to ask. We are here to help!

  • I have read the CONTRIBUTION guide (required)
  • My change requires a change to the documentation.
  • I have updated the tests accordingly (required for a bug fix or a new feature).
  • I have updated the documentation accordingly.

@kurtamohler requested a review from vmoens on July 19, 2024 20:29
pytorch-bot (bot) commented Jul 19, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/rl/2298

Note: Links to docs will display an error until the docs builds have been completed.

❌ 3 New Failures, 1 Pending, 3 Unrelated Failures

As of commit 085bef2 with merge base bdc9784:

NEW FAILURES - The following jobs have failed:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label on Jul 19, 2024
@@ -108,7 +108,7 @@ def main(cfg: "DictConfig"):  # noqa: F821
     for _, tensordict in enumerate(collector):
         sampling_time = time.time() - sampling_start
         # Update exploration policy
-        exploration_policy.step(tensordict.numel())
+        exploration_policy[1].step(tensordict.numel())
kurtamohler (Collaborator, Author) commented:
I'm not completely sure this is the best way to do it. Is there an alternative to TensorDictSequential that does essentially the same thing but also provides a step function?

kurtamohler (Collaborator, Author) commented Jul 19, 2024:

Well, it looks like the same thing was done when EGreedyWrapper was updated to EGreedyModule, so I guess it's alright:

policy = TensorDictSequential(
    actor,
    EGreedyModule(...),
)

policy[1].step()

vmoens (Contributor) commented:

Yep, either that or:

def update_exploration(module):
    if isinstance(module, ExplorationModule):
        module.set()
policy.apply(update_exploration)

We could make sure that all exploration modules have the same parent class and use that update function across examples.
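
For illustration, a hypothetical sketch of that shared base class; torchrl does not currently define an ExplorationModule parent, so the name and the step signature below are assumptions:

from tensordict.nn import TensorDictModuleBase

class ExplorationModule(TensorDictModuleBase):
    """Assumed common parent for EGreedyModule, AdditiveGaussianModule, etc."""

    def step(self, frames: int = 1) -> None:
        # Each subclass advances its own annealing schedule (eps, sigma, ...).
        raise NotImplementedError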

@kurtamohler force-pushed the update-exploration-modules-0 branch from 9c4ccbd to b40430b on July 19, 2024 21:08
@vmoens changed the title from "Update all instances of exploration *Wrapper to *Module" to "[Refactor] Update all instances of exploration *Wrapper to *Module" on Jul 22, 2024
@vmoens added the Refactoring label on Jul 22, 2024
@vmoens (Contributor) left a review:

LGTM, thanks for this!
As a second step, we could consider refactoring the update methods; happy to read your thoughts on this.

@@ -200,7 +203,7 @@ def train(cfg: "DictConfig"):  # noqa: F821
         optim.zero_grad()
         target_net_updater.step()

-    policy_explore.step(frames=current_frames)  # Update exploration annealing
+    policy_explore[1].step(frames=current_frames)  # Update exploration annealing
vmoens (Contributor) commented:

In the example I gave above, update_exploration (or step_exploration) should be turned into a class so that we can pass the current_frames.
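
For illustration, a rough sketch of that class-based updater; ExplorationModule is the assumed shared base class from the earlier comment, and policy_explore / current_frames come from the diff context:

class StepExploration:
    """Callable for nn.Module.apply(); carries the frame count along."""

    def __init__(self, frames: int):
        self.frames = frames

    def __call__(self, module) -> None:
        # apply() visits every submodule, so only exploration modules react.
        if isinstance(module, ExplorationModule):
            module.step(frames=self.frames)

policy_explore.apply(StepExploration(frames=current_frames))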

@vmoens merged commit 87f66e8 into pytorch:main on Jul 22, 2024
49 of 55 checks passed
Labels: CLA Signed, Refactoring

Linked issues: Convert TensorDictModuleWrappers to TensorDictModules (#2295)

3 participants