Skip tuner algorithms on fast dev #3903
Conversation
Hello @SkafteNicki! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻
Comment last updated at 2020-11-09 18:37:45 UTC
Codecov Report
@@           Coverage Diff           @@
##           master   #3903   +/-   ##
======================================
  Coverage      93%     93%
======================================
  Files         116     116
  Lines        8831    8837      +6
======================================
+ Hits         8232    8238      +6
  Misses        599     599
This pull request is now in conflict... :(
@SkafteNicki is this still relevant/required??
@rohitgr7 yes, it is still relevant, and it is probably ready for merging
Great. LGTM :)
Great catch!
lgtm
* skip on fast dev
* fix error
* changelog
* fix recursive issue
* combine tests
* pep8
* move logic to base funcs
* fix mistake
* Update pytorch_lightning/tuner/lr_finder.py
  Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
* pep

Co-authored-by: William Falcon <waf2107@columbia.edu>
Co-authored-by: Nicki Skafte <nugginea@gmail.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Co-authored-by: chaton <thomas@grid.ai>
(cherry picked from commit 4f3160b)
This reverts commit 189ed25
* skip on fast dev
* fix error
* changelog
* fix recursive issue
* combine tests
* pep8
* move logic to base funcs
* fix mistake
* Update pytorch_lightning/tuner/lr_finder.py
  Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
* pep

Co-authored-by: William Falcon <waf2107@columbia.edu>
Co-authored-by: Nicki Skafte <nugginea@gmail.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Co-authored-by: chaton <thomas@grid.ai>
What does this PR do?
Fixes #3241
Using fast_dev_run=True in combination with the tuner algorithms currently gives strange error messages. fast_dev_run=True is meant to check that trainer.fit() runs without errors, not to run the tuner algorithms. With this PR, if the user calls a tuner algorithm while fast_dev_run=True, the tuning is skipped and a warning is raised (see the sketch below).
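For context, here is a minimal sketch of the kind of guard this PR adds at the top of a tuner routine. The function signature and warning text are illustrative assumptions, not the exact pytorch_lightning.tuner API; trainer.fast_dev_run is the real Trainer flag:

```python
import warnings


def lr_find(trainer, model):
    """Sketch of a tuner entry point guarded against fast_dev_run."""
    if trainer.fast_dev_run:
        # Tuning makes no sense on the single fast-dev batch, so skip it
        # entirely and tell the user why nothing happened.
        warnings.warn(
            "Skipping learning rate finder since fast_dev_run is enabled.",
            UserWarning,
        )
        return None  # no tuning result is produced
    # ... the actual learning-rate search runs here ...
```

With a guard like this, something like Trainer(fast_dev_run=True, auto_lr_find=True) completes with a warning instead of the confusing errors reported in #3241.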
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃