
Commit 9c7313d: Prepare run
denisvstepanov committed Jun 4, 2018
1 parent c8d5c01
Showing 3 changed files with 3 additions and 3 deletions.
scripts/ast/run.sh (2 changes: 1 addition & 1 deletion)

@@ -1,3 +1,3 @@
 #!/bin/bash
 
-./scripts/ast/train.sh nt2n_base 04Jul_nt2n_base_new_embedding_size
+./scripts/ast/train.sh nt2n_base_attention_plus_layered 04Jun_nt2n_base_attention_plus_layered_new_embedding_size
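
For context, run.sh passes two positional arguments to train.sh: $1 picks the model variant and $2 names the run (train.sh, shown below, uses it as --model_save_dir saved/$2). A minimal sketch of how a name-based variant dispatch like this is typically wired up on the Python side; the registry and constructor names here are illustrative assumptions, not taken from the repository:

    # Hypothetical registry mapping the variant name passed as $1 to a
    # model constructor; the actual wiring in zerogercrnn may differ.
    def build_nt2n_base():
        print("building nt2n_base")

    def build_nt2n_base_attention_plus_layered():
        print("building nt2n_base_attention_plus_layered")

    MODEL_REGISTRY = {
        "nt2n_base": build_nt2n_base,
        "nt2n_base_attention_plus_layered": build_nt2n_base_attention_plus_layered,
    }

    def build_model(variant):
        # Fail loudly on an unknown variant name instead of a bare KeyError.
        if variant not in MODEL_REGISTRY:
            raise ValueError("unknown model variant: " + variant)
        return MODEL_REGISTRY[variant]()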
scripts/ast/train.sh (2 changes: 1 addition & 1 deletion)

@@ -13,7 +13,7 @@ PYTHONPATH=. python3 -m cProfile -o program.prof zerogercrnn/experiments/ast_lev
     --data_limit 100000 \
     --model_save_dir saved/$2 \
     --seq_len 50 \
-    --batch_size 80 \
+    --batch_size 128 \
     --learning_rate 0.001 \
     --epochs 8 \
     --decay_after_epoch 0 \
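
The functional change here is the batch size increase from 80 to 128. The hunk header also shows that train.sh runs training under the standard-library profiler (python3 -m cProfile -o program.prof ...), so each run writes profiling data to program.prof. That file can be inspected afterwards with the standard-library pstats module, for example:

    import pstats

    # Load the profile written by `python3 -m cProfile -o program.prof ...`
    # and print the 20 functions with the highest cumulative time.
    stats = pstats.Stats("program.prof")
    stats.sort_stats("cumulative").print_stats(20)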
zerogercrnn/experiments/ast_level/common.py (2 changes: 1 addition & 1 deletion)

@@ -153,7 +153,7 @@ def optimize(self, loss):
         # Backward pass
         loss.backward()
         torch.nn.utils.clip_grad_norm_(filter_requires_grad(self.model.parameters()), 5)
-        # torch.nn.utils.clip_grad_norm_(filter_requires_grad(self.model.sparse_parameters()), 5)
+        torch.nn.utils.clip_grad_norm_(filter_requires_grad(self.model.sparse_parameters()), 5)
 
         # Optimizer step
         for optimizer in self.optimizers:
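
This change re-enables gradient-norm clipping for the model's sparse parameters (typically sparse embedding tables trained with a sparse-gradient optimizer), clipping them as a separate group from the dense parameters, since each clip_grad_norm_ call computes a single global norm over the tensors it is given. Below is a minimal sketch of the pattern: filter_requires_grad is reimplemented with its assumed behavior, and because clip_grad_norm_ support for sparse gradients varies across PyTorch versions, the sparse group is clipped with an explicit equivalent rather than the stock call used in the commit:

    import torch
    import torch.nn as nn

    def filter_requires_grad(params):
        # Assumed behavior of the helper used in common.py: keep only
        # parameters that actually receive gradients.
        return [p for p in params if p.requires_grad]

    def clip_sparse_grad_norm_(params, max_norm, eps=1e-6):
        # Manual equivalent of clip_grad_norm_ for sparse gradients:
        # compute one global 2-norm over all gradient values, then
        # rescale every gradient in place if it exceeds max_norm.
        grads = [p.grad for p in params if p.grad is not None]
        total_norm = torch.sqrt(sum(g.coalesce().values().pow(2).sum() for g in grads))
        if total_norm > max_norm:
            clip_coef = (max_norm / (total_norm + eps)).item()
            for g in grads:
                g.mul_(clip_coef)
        return total_norm

    class TinyModel(nn.Module):
        # Toy stand-in for the experiment model: one sparse embedding
        # (sparse=True makes its gradient a sparse tensor) plus a dense layer.
        def __init__(self):
            super().__init__()
            self.embedding = nn.Embedding(1000, 50, sparse=True)
            self.linear = nn.Linear(50, 10)

        def dense_parameters(self):
            return self.linear.parameters()

        def sparse_parameters(self):
            return self.embedding.parameters()

        def forward(self, x):
            return self.linear(self.embedding(x))

    model = TinyModel()
    loss = model(torch.randint(0, 1000, (8, 5))).sum()
    loss.backward()

    # Dense gradients go through the stock PyTorch clipper, as in common.py.
    torch.nn.utils.clip_grad_norm_(filter_requires_grad(model.dense_parameters()), 5)
    # Sparse gradients are clipped with the explicit helper above.
    clip_sparse_grad_norm_(filter_requires_grad(model.sparse_parameters()), 5)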
