Add IOpt tuner #70
Conversation
Codecov Report
@@ Coverage Diff @@
## main #70 +/- ##
==========================================
+ Coverage 71.31% 71.95% +0.63%
==========================================
Files 120 122 +2
Lines 6691 6835 +144
==========================================
+ Hits 4772 4918 +146
+ Misses 1919 1917 -2
Hello @YamLyubov! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found: There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2023-05-26 10:25:55 UTC
self._default_metric_value = np.inf
def Calculate(self, point: Point, functionValue: FunctionValue) -> FunctionValue:
Why not `calculate`?
This method implements an abstract method of the `Problem` class from IOpt, so it can only be renamed there.
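To make the constraint discussed here concrete: `Calculate` keeps iOpt's capitalised name because it overrides an abstract method of iOpt's `Problem` base class, so the override must match. The stand-in classes below only imitate the shapes involved (they are NOT iOpt's real classes, just minimal stubs for illustration), so this sketch runs without iOpt installed.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Point:
    """Stand-in for iOpt's Point: a vector of continuous variables."""
    float_variables: List[float] = field(default_factory=list)


@dataclass
class FunctionValue:
    """Stand-in for iOpt's FunctionValue: holds the objective value."""
    value: float = 0.0


class Problem:
    """Stand-in for iOpt's abstract Problem base class."""

    def Calculate(self, point: Point, functionValue: FunctionValue) -> FunctionValue:
        # Abstract in iOpt; subclasses must override it under this exact name.
        raise NotImplementedError


class SphereProblem(Problem):
    def Calculate(self, point: Point, functionValue: FunctionValue) -> FunctionValue:
        # Write the objective into functionValue and return it, following
        # the callback convention the PR's snippet above overrides.
        functionValue.value = sum(x * x for x in point.float_variables)
        return functionValue
```

Because the base class owns the method name, a rename would indeed have to happen upstream in iOpt, exactly as the comment says.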
OK. In general, we could submit a PR with that fix to the iOpt colleagues :)
Regarding nested parameters: in principle, a two-way mapping could be implemented that converts nested dictionaries into flat ones by renaming the parameters in some special way. Something along the lines of
Yes, that sounds good; there is nothing we can do about it on our side anyway, unless the optimiser step can somehow be controlled separately for each parameter. UPD: Actually, it would be more correct to file this as an issue for the colleagues rather than building workarounds around it. @YamLyubov, will you do it?
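The two-way mapping idea discussed above could be sketched as follows. This is a minimal illustration, not code from the PR: nested parameter dictionaries are flattened into single-level dictionaries whose keys encode the nesting path (the `"::"` separator is an arbitrary choice), and can be restored afterwards.

```python
def flatten_params(params: dict, prefix: str = "", sep: str = "::") -> dict:
    """Flatten a nested parameter dict into a flat one with path-encoded keys."""
    flat = {}
    for key, value in params.items():
        name = f"{prefix}{sep}{key}" if prefix else key
        if isinstance(value, dict):
            # Recurse into nested dicts, extending the key path.
            flat.update(flatten_params(value, prefix=name, sep=sep))
        else:
            flat[name] = value
    return flat


def unflatten_params(flat: dict, sep: str = "::") -> dict:
    """Invert flatten_params: rebuild the nested dict from path-encoded keys."""
    nested = {}
    for name, value in flat.items():
        parts = name.split(sep)
        node = nested
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return nested
```

The flat representation is what a tuner that only understands a flat search space would see; the original nesting is recovered before the parameters are applied to the pipeline.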
* WIP add iOpt tuner
* Refactor SearchSpace
* Fix tests
* Add parameters for Solver
* Add show progress
* Add docstrings
* Fix pep8
* Fix pep8
* Refactor SearchSpace
* Fix line breaks
* Move methods
* Fix typings
* Add line breaks
* Add type alias for SearchSpace
* Fix typing
* Minor
* Fix pep8
* Move hyperopt tuner
* Add kwargs to IOptTuner
* Correct tuning docs
* Correct search space in docs
* Fix typo
* Move initial check
IOptTuner
BaseTuner
Concerns:
IOpt supports only continuous parameters, so discrete ones are optimised as continuous and then rounded. There is also no support for a timeout or an initial point.
Despite this, on classification tasks IOpt often outperforms hyperopt (but it also usually takes more time, especially when the search space has many dimensions).
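The round-then-clamp treatment of discrete parameters mentioned above can be illustrated in a few lines. This is an illustrative sketch, not the PR's actual code: the optimiser proposes a float inside continuous bounds, and the tuner snaps it to the nearest valid choice before the pipeline is evaluated.

```python
from typing import Sequence, Any


def decode_discrete(raw_value: float, choices: Sequence[Any]) -> Any:
    """Map a continuous value in [0, len(choices) - 1] to a discrete choice.

    The optimiser works on the continuous index range; rounding picks the
    nearest choice, and clamping guards against proposals slightly outside
    the bounds.
    """
    index = int(round(raw_value))
    index = max(0, min(index, len(choices) - 1))  # clamp to a valid index
    return choices[index]
```

One consequence of this encoding, relevant to the comparison figures below, is that several distinct continuous proposals can collapse to the same discrete value, so some optimiser steps are effectively spent re-evaluating identical configurations.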
Metric improvement comparison
Time spent