An implementation of the ID3 algorithm for decision trees.
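ID3 grows the tree greedily: at each node it picks the categorical feature with the highest information gain (entropy reduction), splits on every value of that feature, and recurses until the labels are pure, no features remain, or a depth limit is hit. A minimal sketch of that recursion is shown below; the names (`Node`, `id3`, `information_gain`) and the dict-per-example data layout are illustrative assumptions, not necessarily how this repository structures its code.

```python
import math
from collections import Counter

class Node:
    """A decision-tree node: either a leaf with a label, or a split on one feature."""
    def __init__(self, value=None, feature=None, label=None):
        self.value = value        # feature value on the incoming edge (None at the root)
        self.feature = feature    # feature this node splits on (None for leaves)
        self.label = label        # predicted label (None for internal nodes)
        self.children = []

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(examples, labels, feature):
    """Entropy reduction from splitting the examples on one categorical feature."""
    base = entropy(labels)
    remainder = 0.0
    for v in set(ex[feature] for ex in examples):
        subset = [y for ex, y in zip(examples, labels) if ex[feature] == v]
        remainder += len(subset) / len(labels) * entropy(subset)
    return base - remainder

def id3(examples, labels, features, depth_limit, value=None):
    """Recursively grow a tree; `examples` are dicts mapping feature name -> value."""
    majority = Counter(labels).most_common(1)[0][0]
    # Stop on pure labels, exhausted features, or the depth limit.
    if len(set(labels)) == 1 or not features or depth_limit <= 0:
        return Node(value=value, label=majority)
    best = max(features, key=lambda f: information_gain(examples, labels, f))
    node = Node(value=value, feature=best)
    for v in sorted(set(ex[best] for ex in examples)):
        sub = [(ex, y) for ex, y in zip(examples, labels) if ex[best] == v]
        sub_ex, sub_y = [e for e, _ in sub], [y for _, y in sub]
        remaining = [f for f in features if f != best]
        node.children.append(id3(sub_ex, sub_y, remaining, depth_limit - 1, value=v))
    return node
```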
Plots of error vs. tree depth (a sketch of such an evaluation loop appears at the end of this file)
Learned decision trees and confusion matrices for depths 1 and 2 on the Monks1 data.
Depth-1 tree:

Node value = None
  Feature f1{1, 2, 3}
    Node value = 1
      Label value = 0
    Node value = 2
      Label value = 1
    Node value = 3
      Label value = 1
Confusion matrix, training set (124 examples):
 48 | 14
 31 | 31

Confusion matrix, test set (432 examples):
144 | 72
144 | 72
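The tree printouts in this file follow a "Node value / Feature / Label value" pattern, one block per node. A recursive printer in the spirit of those printouts could look like the sketch below, built on the hypothetical `Node` class from the earlier sketch; the exact spacing of the repository's own output may differ.

```python
def print_tree(node, indent=0):
    """Recursively print a tree in the 'Node value / Feature / Label value' style used above."""
    pad = "  " * indent
    print(f"{pad}Node value = {node.value}")
    if node.label is not None and not node.children:
        # Leaf: report the predicted label.
        print(f"{pad}  Label value = {node.label}")
    else:
        # Internal node: report the split feature and its observed values, then recurse.
        values = sorted(child.value for child in node.children)
        print(f"{pad}  Feature {node.feature}{{{', '.join(str(v) for v in values)}}}")
        for child in node.children:
            print_tree(child, indent + 2)
```

Calling `print_tree(root)` on the depth-1 tree above would produce the same kind of nested layout.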
Depth-2 tree:

Node value = None
  Feature f1{1, 2, 3}
    Node value = 1
      Feature f2{1, 2, 3}
        Node value = 1
          Label value = 1
        Node value = 2
          Label value = 0
        Node value = 3
          Label value = 0
    Node value = 2
      Feature f2{1, 2, 3}
        Node value = 1
          Label value = 0
        Node value = 2
          Label value = 1
        Node value = 3
          Label value = 0
    Node value = 3
      Feature f2{1, 2, 3}
        Node value = 1
          Label value = 0
        Node value = 2
          Label value = 0
        Node value = 3
          Label value = 1
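Classifying an example with a tree like the one above is a walk from the root: at each internal node, follow the child whose value matches the example's value for the split feature, and return the label at the leaf. A minimal sketch, again assuming the hypothetical `Node` class from the earlier sketch (the `default` fallback for unseen feature values is an assumption, not necessarily what this repository does):

```python
def predict(node, example, default=0):
    """Walk the tree from the root, following the branch matching the example's feature value."""
    while node.children:
        value = example[node.feature]
        match = next((c for c in node.children if c.value == value), None)
        if match is None:
            return default  # feature value never seen during training
        node = match
    return node.label
```

The confusion matrices below compare such depth-limited predictions against the true labels.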
Confusion matrix, training set (124 examples):
 41 | 21
  0 | 62

Confusion matrix (432 examples):
144 |  72
  0 | 216

Confusion matrix (432 examples):
200 |  16
 36 | 180
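Each 2x2 block above is a confusion matrix. One straightforward way to tally such a matrix from a list of predictions is sketched below; the row/column ordering (true class by predicted class, positive class first) is an assumption made for the sketch, not a statement about how this repository lays out its matrices.

```python
def confusion_matrix(true_labels, predicted_labels, classes=(1, 0)):
    """2x2 tally: rows index the true class, columns the predicted class."""
    matrix = [[0, 0], [0, 0]]
    index = {c: i for i, c in enumerate(classes)}
    for t, p in zip(true_labels, predicted_labels):
        matrix[index[t]][index[p]] += 1
    return matrix
```

With the earlier sketches, something like `confusion_matrix(test_labels, [predict(root, x) for x in test_examples])` would produce a table of this shape (variable names hypothetical).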
Learned decision trees and confusion matrices for depths 1 and 2 on a second dataset (binary features):

Depth-1 tree:

Node value = None
  Feature f1{0, 1}
    Node value = 0
      Label value = 0
    Node value = 1
      Label value = 1
Confusion matrix (80 examples):
18 | 22
11 | 29

Confusion matrix (187 examples):
 87 | 85
  3 | 12
Depth-2 tree:

Node value = None
  Feature f1{0, 1}
    Node value = 0
      Feature f2{0, 1}
        Node value = 0
          Label value = 0
        Node value = 1
          Label value = 1
    Node value = 1
      Feature f2{0, 1}
        Node value = 0
          Label value = 1
        Node value = 1
          Label value = 1
Confusion matrix (80 examples):
22 | 18
13 | 27

Confusion matrix (187 examples):
105 | 67
  3 | 12

Confusion matrix (187 examples):
118 | 54
  3 | 12
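The error vs. tree depth plots mentioned at the top of this file come from retraining the tree at a range of depth limits and measuring misclassification error on the training and test splits. A hedged sketch of such an evaluation loop, built on the earlier sketches (`id3`, `predict`), is below; how the data is loaded and which depths are swept are assumptions made for illustration.

```python
import matplotlib.pyplot as plt

def error_rate(tree, examples, labels):
    """Fraction of examples the tree misclassifies."""
    wrong = sum(predict(tree, x) != y for x, y in zip(examples, labels))
    return wrong / len(labels)

def plot_error_vs_depth(train_x, train_y, test_x, test_y, features, max_depth=10):
    """Train one tree per depth limit and plot training/test error against depth."""
    depths = range(1, max_depth + 1)
    train_err, test_err = [], []
    for d in depths:
        tree = id3(train_x, train_y, features, depth_limit=d)
        train_err.append(error_rate(tree, train_x, train_y))
        test_err.append(error_rate(tree, test_x, test_y))
    plt.plot(depths, train_err, marker="o", label="training error")
    plt.plot(depths, test_err, marker="o", label="test error")
    plt.xlabel("tree depth limit")
    plt.ylabel("error rate")
    plt.legend()
    plt.show()
```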