Extreme Learning Machine (ELM) is an approach recently proposed in the literature for Neural Network (NN) training. It has been shown to be much faster than traditional gradient-based learning algorithms, and many variants, extensions and applications have been investigated in recent years. Among them, the ELM paradigm has been applied to train Time-Variant Neural Networks (TV-NN), greatly reducing training time with respect to common Back-Propagation (BP). However, this approach may require more hidden nodes, and the right type of basis function (through which time-dependence is introduced in the TV-NN weight parameters), to attain good generalization. In this paper, we propose a hybrid learning algorithm that applies a differential evolution method (to determine the input weights) and a group selection method (to determine the type of basis function). Experimental results show that the proposed method yields more compact networks with better generalization performance, at the cost of longer training time; nevertheless, the algorithm still significantly outperforms BP.
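To make the core idea concrete, the following is a minimal sketch of the standard (time-invariant) ELM training step that the paper builds on: input weights and biases are assigned at random, and only the output weights are computed analytically via the Moore-Penrose pseudoinverse. All function and parameter names here are illustrative, not taken from the paper, and this omits the time-variant extension, the differential-evolution search over input weights, and the basis-function group selection.

```python
import numpy as np

def elm_train(X, T, n_hidden=20, rng=None):
    """Minimal ELM sketch: random hidden layer, least-squares output weights.

    X : (n_samples, n_features) inputs
    T : (n_samples, n_outputs) targets
    Returns the random input weights W, biases b, and output weights beta.
    """
    rng = np.random.default_rng(rng)
    n_features = X.shape[1]
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random biases
    H = np.tanh(X @ W + b)                                   # hidden-layer activations
    beta = np.linalg.pinv(H) @ T                             # least-squares solution
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression: approximate y = sin(x) on [-3, 3]
X = np.linspace(-3, 3, 200).reshape(-1, 1)
T = np.sin(X)
W, b, beta = elm_train(X, T, n_hidden=30, rng=0)
mse = np.mean((elm_predict(X, W, b, beta) - T) ** 2)
```

Because the hidden layer is fixed, training reduces to a single linear least-squares solve, which is what makes ELM so much faster than BP; the hybrid method in the paper replaces the purely random choice of `W` with a differential-evolution search to compensate with fewer hidden nodes.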