Regularized online sequential learning algorithm for single-hidden layer feedforward neural networks

Year of publication

2011-10-15

Journal title

ISSN

Volume title

Publisher

Abstract

Online learning algorithms have been preferred in many applications because of their ability to learn from sequentially arriving data. One effective algorithm recently proposed for training single-hidden-layer feedforward neural networks (SLFNs) is the online sequential extreme learning machine (OS-ELM), which can learn data one-by-one or chunk-by-chunk with fixed or varying chunk sizes. It is based on the ideas of the extreme learning machine (ELM), in which the input weights and hidden layer biases are chosen randomly and the output weights are then determined by the pseudo-inverse operation. The learning speed of this algorithm is extremely high. However, it does not yield good generalization models for noisy data, and it is difficult to initialize its parameters so as to avoid singular and ill-posed problems. In this paper, we propose an improvement of OS-ELM based on a bi-objective optimization approach: it minimizes both the empirical error and the norm of the network weight vector. Singular and ill-posed problems are overcome by Tikhonov regularization. This approach can also learn data one-by-one or chunk-by-chunk. Experimental results on benchmark datasets show the better generalization performance of the proposed approach.
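
To make the abstract concrete, the following is a minimal sketch (not the authors' code) of a regularized OS-ELM-style learner: random sigmoid hidden nodes as in ELM, a Tikhonov-regularized initialization of the output weights, and a recursive least-squares update applied one-by-one or chunk-by-chunk. The class name ReOSELMSketch, the ridge parameter lam, and all other names are illustrative assumptions.

    # Sketch of a Tikhonov-regularized online sequential ELM-style learner.
    import numpy as np

    class ReOSELMSketch:
        def __init__(self, n_inputs, n_hidden, n_outputs, lam=1e-2, seed=None):
            rng = np.random.default_rng(seed)
            # Random input weights and hidden biases, fixed after creation (ELM idea).
            self.W = rng.uniform(-1.0, 1.0, size=(n_inputs, n_hidden))
            self.b = rng.uniform(-1.0, 1.0, size=n_hidden)
            # Tikhonov regularization enters through the initial P = (lam * I)^-1,
            # so the batch equivalent is beta = (H'H + lam*I)^-1 H'T (ridge solution).
            self.P = np.eye(n_hidden) / lam
            self.beta = np.zeros((n_hidden, n_outputs))

        def _hidden(self, X):
            # Sigmoid activations of the random hidden layer.
            return 1.0 / (1.0 + np.exp(-(np.atleast_2d(X) @ self.W + self.b)))

        def partial_fit(self, X, T):
            # Recursive least-squares update for one sample or one chunk (rows of X).
            H = self._hidden(X)
            T = np.atleast_2d(T)
            K = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
            self.P = self.P - self.P @ H.T @ K @ H @ self.P
            self.beta = self.beta + self.P @ H.T @ (T - H @ self.beta)
            return self

        def predict(self, X):
            return self._hidden(X) @ self.beta

Because the recursive update only ever inverts a matrix of the current chunk size, the per-chunk cost stays small, while the regularized initialization keeps P well-conditioned even when the initial block of data is small or noisy.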

Description

page 6

Subject keywords

Neural networks, Online learning algorithm, ELM, OS-ELM, ReOS-ELM, Multiobjective training algorithms

Citation

Pattern Recognition Letters

Collection