MISS ONE OUT: A CROSS-VALIDATION METHOD UTILIZING INDUCED TEACHER NOISE

KingHodor/MISS-ONE-OUT

Leave-one-out (LOO) and its generalization, K-fold, are among the most well-known cross-validation methods. They divide the sample into several folds, each of which is, in turn, left out for testing while the remaining folds are used for training. In this study, as an extension of this idea, we propose a new cross-validation approach, called miss-one-out (MOO), that mislabels the example(s) in each fold and keeps that fold in the training set, rather than leaving it out as LOO does. MOO then tests whether the trained classifier can correct the erroneous labels of those training samples. In principle, deliberately mislabeling only one fold should have only a small effect on a classifier trained on that bad fold together with the K - 1 good folds, and this effect can be used as a generalization measure of the classifier. Experimental results on a number of benchmark datasets and three real bioinformatics datasets show that MOO can better estimate the test set accuracy of the classifier.
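The procedure described above can be sketched for binary labels as follows. This is a minimal illustration of the idea, not the repository's actual code: the function name `miss_one_out_score`, its parameters, and the use of scikit-learn's `KFold` are assumptions made for the example.

```python
# Sketch of miss-one-out (MOO): for each fold, flip that fold's labels,
# train on ALL the data (corrupted fold included), and score how often the
# trained classifier recovers the fold's original, correct labels.
# NOTE: illustrative only; names and API are assumptions, not the authors' code.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LogisticRegression

def miss_one_out_score(X, y, make_clf, n_splits=5, seed=0):
    """Mean fraction of deliberately mislabeled examples whose original
    (correct) label the trained classifier recovers, averaged over folds."""
    X, y = np.asarray(X), np.asarray(y)
    scores = []
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    for _, fold_idx in kf.split(X):
        y_noisy = y.copy()
        y_noisy[fold_idx] = 1 - y_noisy[fold_idx]   # mislabel this fold (0/1 labels)
        clf = make_clf().fit(X, y_noisy)            # keep the bad fold in training
        recovered = clf.predict(X[fold_idx]) == y[fold_idx]
        scores.append(recovered.mean())
    return float(np.mean(scores))
```

A score near 1 means the induced teacher noise had little effect on the learned decision rule, which, per the abstract, serves as a proxy for the classifier's generalization ability.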

Read more in the article.

Getting help

Please contact sfa.alptekin@gmail.com
