Learning systems are now widely used to analyze large real-world datasets in areas such as image and speech recognition, psychology, and economics. A central problem is to approximate an unknown true density function, using a learning model, from training data drawn independently and identically from that density. Many hierarchical stochastic models proposed for analyzing real data have proved effective; however, they are singular, so the classical theory for regular models does not apply to them. This has increased the need for model selection methods suited to singular models, and several information criteria for singular models have been developed, for example the singular Bayesian information criterion, the widely applicable information criterion, the widely applicable Bayesian information criterion, and cross-validation, all grounded in theorems from algebraic analysis and algebraic geometry. In this paper, we consider learning coefficients, which measure the main term of learning efficiency in singular learning models. These coefficients play an important role in information criteria and are mathematically equal to the log canonical thresholds of Kullback functions. We prove several theorems for obtaining these coefficients and apply them to Poisson mixture models.
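For context, the standard definitions behind these notions in singular learning theory can be sketched as follows; the symbols (q, p, K, φ, ζ, λ, m, F_n) are introduced here for illustration and are not taken verbatim from the paper:

```latex
% Kullback function of a model p(x|w) with parameter w,
% relative to the true density q(x):
K(w) \;=\; \int q(x)\,\log\frac{q(x)}{p(x\mid w)}\,dx .

% Zeta function of learning theory, with prior \varphi(w):
\zeta(z) \;=\; \int K(w)^{z}\,\varphi(w)\,dw .

% The learning coefficient \lambda is the log canonical threshold of K:
% (-\lambda) is the largest pole of \zeta(z), with multiplicity m.
% It governs the main term of the Bayes free energy F_n for sample size n:
F_n \;=\; n S_n \;+\; \lambda \log n \;-\; (m-1)\,\log\log n \;+\; O_p(1),
```

where S_n denotes the empirical entropy of the true distribution. For regular models λ equals half the parameter dimension, whereas for singular models (such as the mixture models treated in this paper) it is generally smaller, which is why it matters for information criteria.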