
added default values and one more argument "max_checkpoints" to --help #215

Merged
merged 2 commits into ibab:master on Feb 9, 2017

Conversation

Fjolnir-Dvorak
Contributor

Added default values to the --help output and a new parameter --max_checkpoints to prevent checkpoints from being deleted by default once there are more than 5.

train.py Outdated
@@ -267,7 +270,7 @@ def main():
sess.run(init)

# Saver for storing checkpoints of the model.
-    saver = tf.train.Saver(var_list=tf.trainable_variables())
+    saver = tf.train.Saver(var_list=tf.trainable_variables(), max_to_keep=args.max_checkpoints)  # TODO hier ansetzen
Collaborator


What does this TODO mean?
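For context, the change under discussion can be sketched as follows. This is a minimal, self-contained sketch: only the new flag is shown, the `MAX_TO_KEEP` constant and the surrounding train.py structure are assumptions, and the `tf.train.Saver` wiring is indicated in a comment because it needs a TensorFlow session to run.

```python
import argparse

MAX_TO_KEEP = 5  # assumed module-level default, mirroring the PR's keep-5 behaviour

def get_arguments(argv=None):
    # Parser sketch: only the flag added by this PR is shown.
    parser = argparse.ArgumentParser(description='WaveNet example network')
    parser.add_argument('--max_checkpoints', type=int, default=MAX_TO_KEEP,
                        help='Maximum amount of checkpoints that will be kept alive. '
                             'Default: ' + str(MAX_TO_KEEP))
    return parser.parse_args(argv)

# In main(), the parsed value would then be handed to the saver, e.g.:
#   saver = tf.train.Saver(var_list=tf.trainable_variables(),
#                          max_to_keep=args.max_checkpoints)
```

`max_to_keep` is the standard `tf.train.Saver` argument that bounds how many recent checkpoint files are retained; exposing it as a CLI flag lets users raise the limit without editing the source.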

train.py Outdated
@@ -44,13 +45,13 @@ def _str_to_bool(s):

parser = argparse.ArgumentParser(description='WaveNet example network')
parser.add_argument('--batch_size', type=int, default=BATCH_SIZE,
-                    help='How many wav files to process at once.')
+                    help='How many wav files to process at once. Default: 1')
Collaborator


This should be taken from BATCH_SIZE variable.
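The suggested fix is to build the help text from the constant rather than hard-coding the number, so the two can never drift apart. A minimal sketch (the value `1` for `BATCH_SIZE` is assumed here):

```python
import argparse

BATCH_SIZE = 1  # module-level constant, as used for the argparse default in train.py

parser = argparse.ArgumentParser(description='WaveNet example network')
parser.add_argument('--batch_size', type=int, default=BATCH_SIZE,
                    help='How many wav files to process at once. '
                         'Default: ' + str(BATCH_SIZE))
```

If the constant changes, the `--help` output now follows automatically.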

train.py Outdated
parser.add_argument('--data_dir', type=str, default=DATA_DIRECTORY,
help='The directory containing the VCTK corpus.')
parser.add_argument('--store_metadata', type=bool, default=False,
help='Whether to store advanced debugging information '
'(execution time, memory consumption) for use with '
-                         'TensorBoard.')
+                         'TensorBoard. Default: False')
Collaborator


Let's also use a variable with default parameter value here (and other similar places).
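Applied to `--store_metadata`, that would look like the sketch below. The `STORE_METADATA` name is hypothetical, and the body of `_str_to_bool` (the helper visible in the hunk header above) is an assumption; the reason such a helper exists is that argparse's plain `type=bool` treats any non-empty string, even `"False"`, as true.

```python
import argparse

STORE_METADATA = False  # hypothetical named default, per the reviewer's suggestion

def _str_to_bool(s):
    # Assumed mirror of the repo's helper: accept only 'true'/'false'
    # (case-insensitive), since type=bool cannot parse these correctly.
    if s.lower() not in ('true', 'false'):
        raise ValueError('Argument needs to be a boolean, got {}'.format(s))
    return s.lower() == 'true'

parser = argparse.ArgumentParser(description='WaveNet example network')
parser.add_argument('--store_metadata', type=_str_to_bool, default=STORE_METADATA,
                    help='Whether to store advanced debugging information '
                         '(execution time, memory consumption) for use with '
                         'TensorBoard. Default: ' + str(STORE_METADATA))
```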

…er.add_argument functions. Also removed two TODOs I forgot to delete. Sorry about that.
@akademi4eg
Collaborator

What is a use case where keeping more than the last few checkpoints would be useful?

@Fjolnir-Dvorak
Contributor Author

  • To be able to perform better bug analysis, e.g. locating where a NaN loss first appears on the latest tensorflow-wavenet commit (I forgot to grep and pipe the console output into a file because I thought the issue was closed, and am now trying to figure out where the NaN loss came from).
  • To analyse how the net learns, research ideas for improving it, and evaluate why the net behaves the way it does and how it evolved to that state.
  • Usability and functionality (in my opinion, features that come ready-made, need no extra testing, and do not overcomplicate the program are a good thing).
  • More information for researching human recognition of sounds and the quality and cardinality of the net.
  • To let the user decide whether they only want the last five checkpoints or want to do an in-depth study.

@akademi4eg
Collaborator

@Fjolnir-Dvorak ok, sounds reasonable.

@akademi4eg akademi4eg merged commit cb0f86b into ibab:master Feb 9, 2017