This paper presents a novel approach to feeding data to a Convolutional Neural Network (CNN) during training. Normally, neural networks are fed shuffled data with no control over which types of examples a mini-batch contains. When data are abundant and the classes are balanced, shuffling the training data is enough to ensure balanced mini-batches. In contrast, most real-world problems yield databases where some classes predominate over others, ill-conditioning the trained network to learn the majority classes while forgetting the rest.
For such imbalanced cases, the most common remedies simply discard samples until the data are balanced. This paper instead proposes an ordered method of feeding data that preserves randomness in the mini-batch composition while using all available samples. The method is shown to address the class-imbalance problem while remaining competitive with other approaches. The paper focuses on a well-known CNN architecture, Deep Residual Networks.
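The core idea, class-balanced mini-batch composition that oversamples minority classes rather than discarding majority samples, can be sketched as follows. This is a minimal illustration under assumed details (equal per-class quotas, per-class shuffled cycling); the function name `balanced_batches` and its parameters are hypothetical, not taken from the paper.

```python
import random
from collections import defaultdict

def balanced_batches(samples, labels, batch_size, seed=0):
    """Yield mini-batches with equal class representation.

    Hypothetical sketch: each class contributes batch_size // n_classes
    samples per batch; within each class the order is shuffled, and every
    sample is cycled through before any repeats, so minority classes are
    oversampled instead of majority samples being discarded.
    """
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for s, y in zip(samples, labels):
        by_class[y].append(s)
    classes = sorted(by_class)
    per_class = batch_size // len(classes)

    def cycler(items):
        # Endless shuffled stream over one class's samples.
        while True:
            order = items[:]
            rng.shuffle(order)
            yield from order

    streams = {c: cycler(by_class[c]) for c in classes}
    # One epoch: enough batches to cover the largest class once.
    n_batches = -(-max(len(v) for v in by_class.values()) // per_class)
    for _ in range(n_batches):
        batch = [next(streams[c]) for c in classes for _ in range(per_class)]
        rng.shuffle(batch)  # randomize order within the balanced batch
        yield batch
```

Each emitted batch is balanced across classes yet internally shuffled, matching the paper's stated goal of order with preserved randomness.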