TensorBoardLogger should be able to add metric names in hparams #1111

Closed
@tstumm

Description

🚀 Feature

TensorBoard allows investigating the effect of hyperparameters in the hparams tab. Unfortunately, the log_hyperparams function in TensorBoardLogger provides no way to indicate which of the logged values is actually a "metric" that can be used for such a comparison.

Motivation

I would like to use the built-in hparams module of TensorBoard to evaluate my trainings.

Pitch

PyTorch-Lightning should let me declare the metrics of my model in some way, so that any logger can derive which of them may be used for hyperparameter comparison, along with any other characteristics defined for them.
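A minimal sketch of what such an interface could look like. All names here (the logger class, the `metrics` argument) are hypothetical illustrations of the pitch, not part of PyTorch-Lightning's actual API:

```python
# Hypothetical sketch: a logger whose log_hyperparams also accepts a
# metric specification, so it can register metric names with
# TensorBoard's hparams plugin. Names are illustrative only.

class HParamsAwareLogger:
    def __init__(self):
        self.logged = None

    def log_hyperparams(self, params, metrics=None):
        # `metrics` maps metric names to placeholder values; a real
        # TensorBoard logger would forward the names to the hparams
        # summary so the tab knows which scalars to compare across runs.
        self.logged = {"hparams": dict(params), "metrics": dict(metrics or {})}
        return self.logged


logger = HParamsAwareLogger()
result = logger.log_hyperparams(
    {"lr": 1e-3, "batch_size": 32},
    metrics={"val_loss": 0.0},
)
```

The point of the sketch is only that the metric names travel together with the hyperparameters, so every logger backend can decide what to do with them.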

Additional context

The hparams method of a summary takes the following parameters:

def hparams(hparam_dict=None, metric_dict=None):

metric_dict is basically a dictionary mapping metric names to values; the function itself only uses the names and ignores the values.
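For comparison, torch.utils.tensorboard already exposes this mechanism through SummaryWriter.add_hparams, which writes both the hyperparameter values and the metric names the hparams tab should track. The log directory, hyperparameter names, and metric values below are illustrative only:

```python
from torch.utils.tensorboard import SummaryWriter

# Write one run's hyperparameters plus the metric the hparams tab
# should use for comparison. The metric_dict keys become the metric
# columns in the tab; the values are the scalars logged for this run.
with SummaryWriter("runs/hparams_demo") as writer:
    writer.add_hparams(
        {"lr": 1e-3, "dropout": 0.1},
        {"hparam/val_accuracy": 0.92},
    )
```

A logger-level API in Lightning would presumably call into this same machinery, which is why the metric names need to be known at log_hyperparams time.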

Labels

- feature — Is an improvement or enhancement
- help wanted — Open to be worked on
- won't fix — This will not be worked on
