
Add IOpt tuner #70

Merged · 23 commits merged from iopt-tuner into main on May 30, 2023

Conversation

@YamLyubov (Collaborator) commented Mar 23, 2023

  • Adds a new IOptTuner
  • Introduces a new base class for tuners, BaseTuner
  • Changes the way the search space is defined (see the sketch below)
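
For illustration, a hedged sketch of what the new dict-based search space definition could look like; the operation name, parameter names, and the exact dictionary keys here are assumptions, so check golem/core/tuning/search_space.py for the authoritative format:

```python
from hyperopt import hp

from golem.core.tuning.search_space import SearchSpace

# Hypothetical operation and parameters, purely for illustration;
# the key names ('hyperopt-dist', 'sampling-scope', 'type') are assumed.
params_per_operation = {
    'my_operation': {
        'alpha': {
            'hyperopt-dist': hp.uniform,
            'sampling-scope': [0.01, 10.0],
            'type': 'continuous'
        },
        'n_components': {
            'hyperopt-dist': hp.uniformint,
            'sampling-scope': [1, 10],
            'type': 'discrete'
        }
    }
}
search_space = SearchSpace(params_per_operation)
```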

Concerns:

IOpt supports only continuous parameters, so discrete ones are optimised as continuous values and then rounded (see the sketch below). There is also no support for a timeout or an initial point.
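
As an illustration of the rounding workaround, a minimal sketch (this helper is hypothetical, not the exact PR code):

```python
from typing import Union


def convert_parameter(value: float, param_type: str) -> Union[int, float]:
    """Map a float suggested by IOpt back to the declared parameter type.

    Discrete parameters are optimised on a continuous scale,
    so the suggested float is rounded to the nearest integer.
    """
    if param_type == 'discrete':
        return round(value)  # e.g. 3.7 -> 4
    return value             # continuous parameters pass through unchanged
```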

Despite this, on classification tasks IOpt often outperforms hyperopt (but it also usually takes more time, especially when the search space has many dimensions).

[Figure: metric improvement comparison between tuners]

[Figure: time spent comparison between tuners]

@YamLyubov added the enhancement (New feature or request) and architecture ((re)design of existing or new framework subsystem) labels on Mar 23, 2023
@codecov-commenter commented Mar 23, 2023

Codecov Report

Merging #70 (ac1acd8) into main (09bd64e) will increase coverage by 0.63%.
The diff coverage is 98.72%.

@@            Coverage Diff             @@
##             main      #70      +/-   ##
==========================================
+ Coverage   71.31%   71.95%   +0.63%     
==========================================
  Files         120      122       +2     
  Lines        6691     6835     +144     
==========================================
+ Hits         4772     4918     +146     
+ Misses       1919     1917       -2     
Impacted Files                           Coverage Δ
golem/core/tuning/tuner_interface.py      91.86% <94.11%>  (+0.72%) ⬆️
golem/core/tuning/hyperopt_tuner.py       97.56% <97.56%>  (ø)
golem/core/tuning/iopt_tuner.py          100.00% <100.00%> (ø)
golem/core/tuning/search_space.py        100.00% <100.00%> (ø)
golem/core/tuning/sequential.py           95.31% <100.00%> (-0.28%) ⬇️
golem/core/tuning/simultaneous.py         94.11% <100.00%> (+3.32%) ⬆️

@aim-pep8-bot (Collaborator) commented Apr 13, 2023

Hello @YamLyubov! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2023-05-26 10:25:55 UTC

@YamLyubov YamLyubov requested review from gkirgizov and nicl-nno April 13, 2023 10:04

self._default_metric_value = np.inf

def Calculate(self, point: Point, functionValue: FunctionValue) -> FunctionValue:
Collaborator:
Why not calculate?

Collaborator (author):

This method implements an abstract method of the Problem class from IOpt, so it could only be renamed there.

Collaborator:

OK. Well, in principle we could submit a PR with that fix to the IOpt maintainers :)
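
For context, a minimal sketch of an iOpt Problem subclass with the mandatory Calculate override, modelled on iOpt examples from around this time; the import paths and attribute names such as numberOfFloatVariables are assumptions and may differ across iOpt versions:

```python
# Import paths per iOpt circa 2023 (assumed).
from iOpt.problem import Problem
from iOpt.trial import FunctionValue, Point


class SphereProblem(Problem):
    """Toy objective: minimise the sum of squares."""

    def __init__(self, dimension: int = 2):
        super().__init__()
        self.dimension = dimension
        self.numberOfFloatVariables = dimension
        self.numberOfDiscreteVariables = 0
        self.numberOfObjectives = 1
        self.numberOfConstraints = 0
        self.floatVariableNames = [f'x{i}' for i in range(dimension)]
        self.lowerBoundOfFloatVariables = [-5.0] * dimension
        self.upperBoundOfFloatVariables = [5.0] * dimension

    # The capitalised name is dictated by the abstract method in iOpt's Problem,
    # which is why it cannot simply be renamed to `calculate` on the GOLEM side.
    def Calculate(self, point: Point, functionValue: FunctionValue) -> FunctionValue:
        functionValue.value = sum(x ** 2 for x in point.floatVariables)
        return functionValue
```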

Resolved review threads: golem/core/tuning/tuner_interface.py; requirements.txt (outdated); golem/core/tuning/search_space.py (outdated)
@gkirgizov (Collaborator) commented Apr 17, 2023

IOpt does not support nested search spaces, which can be necessary

Regarding nested parameters: in general, we could implement a two-way mapping that converts nested dictionaries into flat ones by renaming the parameters in a special way (see the sketch below). Along the lines of:

  1. Transform nested into flat: {'p1': 25, 'p_nested': {'a': 2}} -> {'p1': 25, 'p_nested/a': 2}
  2. Hand the flat dictionary to the optimiser
  3. Transform it back
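
A minimal sketch of such a two-way mapping (not part of this PR; the separator and helper names are arbitrary):

```python
from typing import Any, Dict

SEP = '/'  # separator encoding nesting in flat parameter names


def flatten(params: Dict[str, Any], prefix: str = '') -> Dict[str, Any]:
    """{'p1': 25, 'p_nested': {'a': 2}} -> {'p1': 25, 'p_nested/a': 2}"""
    flat = {}
    for key, value in params.items():
        name = f'{prefix}{SEP}{key}' if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=name))
        else:
            flat[name] = value
    return flat


def unflatten(flat: Dict[str, Any]) -> Dict[str, Any]:
    """Inverse mapping: rebuild the nested structure from the flat names."""
    nested: Dict[str, Any] = {}
    for name, value in flat.items():
        *path, leaf = name.split(SEP)
        node = nested
        for part in path:
            node = node.setdefault(part, {})
        node[leaf] = value
    return nested
```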

IOpt supports only continuous parameters, so discrete ones are optimised as continuous and are rounded.

Yes, that sounds fine; there is nothing we can do about it on our side anyway. Unless the optimiser's step could somehow be controlled separately for each parameter.

UPD: Actually, the right thing would be to file this as an issue with the IOpt maintainers instead of building workarounds around it. @YamLyubov, will you do that?

Resolved review threads: golem/core/tuning/iopt_tuner.py (several, some outdated); golem/core/tuning/search_space.py (outdated); golem/core/tuning/tuner_interface.py (outdated); test/unit/tuning/test_tuning.py
@YamLyubov YamLyubov requested review from nicl-nno and gkirgizov April 18, 2023 09:49
Resolved review threads: golem/core/tuning/iopt_tuner.py (outdated); golem/core/tuning/tuner_interface.py (outdated)
@YamLyubov YamLyubov merged commit 4aadb7a into main May 30, 2023
@YamLyubov YamLyubov deleted the iopt-tuner branch May 30, 2023 09:57
YamLyubov added a commit that referenced this pull request Jun 13, 2023
* WIP add iOpt tuner

* Refactor SearchSpace

* Fix tests

* Add parameters for Solver

* Add show progress

* Add docstrings

* Fix pep8

* Fix pep8

* Refactor SearchSpace

* Fix line breaks

* Move methods

* Fix typings

* Add line breaks

* Add type alias for SearchSpace

* Fix typing

* Minor

* Fix pep8

* Move hyperopt tuner

* Add kwargs to IOptTuner

* Correct tuning docs

* Correct search space in docs

* Fix typo

* Move initial check
YamLyubov added a commit that referenced this pull request Jun 14, 2023, with the same commit message as above.
Labels: architecture ((re)design of existing or new framework subsystem), enhancement (New feature or request)

5 participants