Mixtures of truncated basis functions (MoTBFs) have recently been proposed as a generalisation of mixtures of truncated exponentials and mixtures of polynomials for modelling univariate and conditional distributions in hybrid Bayesian networks. In this paper we analyse the problem of incorporating prior knowledge when learning univariate MoTBFs. We consider scenarios where the prior knowledge is expressed as an MoTBF that is combined with another MoTBF density estimated from the available data. An important property, from the point of view of inference in hybrid Bayesian networks, is that the density resulting from the combination is again an MoTBF. We evaluate the performance of the proposed method in a series of experiments with simulated data. The experiments suggest that incorporating prior knowledge improves the estimates, especially when data are scarce.
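To make the closure property concrete, the following minimal sketch illustrates one way such a combination can preserve the MoTBF form. It is not the paper's implementation: the convex (linear) pooling rule, the polynomial basis, the common support [0, 1], and the names `combine`, `prior`, and `data` are all illustrative assumptions. The point it demonstrates is that mixing only the coefficients over a fixed shared basis yields a density in the same family.

```python
import numpy as np

def combine(prior_coeffs, data_coeffs, w):
    """Convex combination of two polynomial-basis MoTBF densities.

    Both densities are given as coefficient vectors over the same basis
    1, x, x^2, ... on a common support; w in [0, 1] is the weight placed
    on the prior. The result is a coefficient vector of the same form,
    i.e., again an MoTBF (hypothetical combination rule, for illustration).
    """
    prior = np.asarray(prior_coeffs, dtype=float)
    data = np.asarray(data_coeffs, dtype=float)
    # Pad the shorter coefficient vector so both share one basis.
    k = max(prior.size, data.size)
    prior = np.pad(prior, (0, k - prior.size))
    data = np.pad(data, (0, k - data.size))
    return w * prior + (1.0 - w) * data

# Example: two densities on [0, 1].
prior = [1.0]        # uniform prior belief: f(x) = 1
data = [0.0, 2.0]    # density estimated from data: f(x) = 2x
combined = combine(prior, data, w=0.3)   # 0.3*1 + 0.7*2x = 0.3 + 1.4x

# The combination still integrates to 1 over [0, 1], so it remains a
# valid density expressed in the same polynomial basis.
integral = sum(c / (i + 1) for i, c in enumerate(combined))
print(combined, integral)   # [0.3 1.4] 1.0
```

Linear pooling is only one possible combination rule; closure of this kind holds whenever the combined density is a linear combination over the shared basis, which is what makes the result directly usable in standard MoTBF inference.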