We propose an unsupervised learning algorithm for estimating the number of components and the parameters of a mixture model. It starts from a single mixture component covering the whole data set (thereby avoiding the ill-posed problem of component initialization and also reducing the computational burden). It then incrementally splits that component during expectation-maximization steps, exploring the space of solutions along a binary tree structure. After each component insertion, it decides whether to accept the new solution or discard it according to the chosen information criterion. We show that the method is faster than state-of-the-art alternatives, is insensitive to initialization (its initialization strategy is deterministic), and achieves better data fits on average. This is illustrated through a series of experiments with both synthetic and real images.
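To make the overall idea concrete, the sketch below illustrates a greedy split-and-test EM loop of the kind the abstract describes: start from one Gaussian component, repeatedly try splitting an existing component, refit with EM, and keep the split only if the information criterion improves. This is an illustrative approximation built on scikit-learn's GaussianMixture with BIC as the criterion; the function name, the split heuristic (perturbing the mean along the principal axis of the component covariance), and the stopping rule are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture


def incremental_split_em(X, max_components=10, random_state=0):
    """Greedy incremental split-EM sketch (hypothetical helper).

    Starts from a single Gaussian component, then repeatedly tries to
    split each component into two, refits with EM, and accepts the split
    only if BIC improves (lower is better).
    """
    best = GaussianMixture(n_components=1, random_state=random_state).fit(X)
    best_score = best.bic(X)

    improved = True
    while improved and best.n_components < max_components:
        improved = False
        k = best.n_components
        for j in range(k):
            # Split component j: offset its mean along the principal
            # axis of its covariance to seed two child components.
            eigval, eigvec = np.linalg.eigh(best.covariances_[j])
            offset = eigvec[:, -1] * np.sqrt(eigval[-1])
            means = np.vstack([best.means_, best.means_[j] + offset])
            means[j] = best.means_[j] - offset

            cand = GaussianMixture(n_components=k + 1,
                                   means_init=means,
                                   random_state=random_state).fit(X)
            score = cand.bic(X)
            if score < best_score:
                # Accept the split and restart the search from the new model.
                best, best_score, improved = cand, score, True
                break

    return best


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, size=(200, 2)),
                   rng.normal(5.0, 1.0, size=(200, 2))])
    model = incremental_split_em(X)
    print("Estimated number of components:", model.n_components)
```

A usage note on the design choice assumed here: accepting only splits that lower BIC gives a deterministic, initialization-free procedure, since the search always starts from the single-component fit and each candidate is warm-started from the current solution.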