20 Jan 2017, Keunwoo Choi
- A compact CNN: 5 convolutional layers, batch normalization, ELU, and 32 feature maps per convolutional layer (a minimal Keras sketch follows this list).
- AUC is about 0.849 for the tagging task.
- Best of pre-trained: the best-performing combination, cherry-picked from the pre-trained features
- Concatenating 1,2,3,4,5 + MFCCs: concat(pre-trained features from all 5 layers, MFCCs)
- MFCCs: means and variances of {MFCC, dMFCC, ddMFCC} (see the librosa sketch below)
- SoTA: reported state-of-the-art scores
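For illustration, here is a minimal Keras 1.x sketch of a compact CNN in this spirit (5 conv layers, batch normalization, ELU, 32 feature maps each). The input shape, pooling sizes, and the 50-unit sigmoid output are assumptions for illustration, not values taken from this repo's `main.py`.

```python
from keras.models import Sequential
from keras.layers import Convolution2D, MaxPooling2D, Flatten, Dense
from keras.layers.normalization import BatchNormalization
from keras.layers.advanced_activations import ELU

model = Sequential()
# Input: 1-channel mel-spectrogram, (channel, n_mels, n_frames) under 'th' ordering (assumed shape).
model.add(Convolution2D(32, 3, 3, border_mode='same', input_shape=(1, 96, 1366)))
for i, pool_size in enumerate([(2, 4), (2, 4), (2, 4), (3, 5), (4, 4)]):
    if i > 0:
        model.add(Convolution2D(32, 3, 3, border_mode='same'))  # 32 feature maps per conv layer
    model.add(BatchNormalization(axis=1))   # normalize over the channel axis ('th' ordering)
    model.add(ELU())
    model.add(MaxPooling2D(pool_size=pool_size))
model.add(Flatten())                         # feature map is 32 x 1 x 1 at this point
model.add(Dense(50, activation='sigmoid'))   # e.g. 50 tags, multi-label output (assumption)
model.compile(optimizer='adam', loss='binary_crossentropy')
```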
More details coming soon.
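The "MFCCs" baseline above is a fixed-length summary vector. Below is a minimal sketch of how such a feature could be computed with librosa; the sample rate, `n_mfcc`, and the helper name `mfcc_feature` are assumptions for illustration, not taken from this repo.

```python
import numpy as np
import librosa

def mfcc_feature(path, sr=12000, n_mfcc=20):
    """Means and variances of {MFCC, dMFCC, ddMFCC} over time -> 6 * n_mfcc dims."""
    y, sr = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)  # shape: (n_mfcc, n_frames)
    dmfcc = librosa.feature.delta(mfcc)                      # first-order delta
    ddmfcc = librosa.feature.delta(mfcc, order=2)            # second-order delta
    return np.concatenate([m.mean(axis=1) for m in (mfcc, dmfcc, ddmfcc)] +
                          [m.var(axis=1) for m in (mfcc, dmfcc, ddmfcc)])

# "Concatenating 1,2,3,4,5 + MFCCs" then amounts to something like:
# feature = np.concatenate([pretrained_features, mfcc_feature('song.mp3')])
```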
- Set the Keras image dimension ordering so that `image_dim_ordering()` returns `'th'` (see the snippet below).
- It works on both the TensorFlow and Theano backends.
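For reference, a minimal sketch of checking and switching the ordering at runtime with the Keras 1.x backend utilities; the persistent alternative is setting `"image_dim_ordering": "th"` in `~/.keras/keras.json`.

```python
# Check the current Keras image dim ordering and switch to 'th' (channels-first)
# for the current process only.
from keras import backend as K

if K.image_dim_ordering() != 'th':
    K.set_image_dim_ordering('th')
print(K.image_dim_ordering())  # expected: 'th'
```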
- Install kapre (OLD VERSION) by:

```
$ git clone https://github.com/keunwoochoi/kapre.git
$ cd kapre
$ git checkout a3bde3e
$ python setup.py install
```
- See `main.py` for an example.
- It is not the most efficient implementation, but it was the easiest for me :) Still, it is not slow, even for CPU-based inference.
- Tested on Keras 1.2.1.
Coming soon.