Extended Hierarchical Extreme Learning Machine with Multilayer Perceptron
Abstract
The Deep Learning approach achieves high classification performance, particularly on image classification problems. However, a shortcoming of traditional Deep Learning methods is their long training time. The hierarchical extreme learning machine (H-ELM) framework, based on the hierarchical learning architecture of the multilayer perceptron, was proposed to address this problem. H-ELM is composed of two parts: the first performs unsupervised multilayer encoding, and the second performs supervised feature classification. H-ELM achieves a higher accuracy rate than the traditional ELM, but there is still room to improve its classification performance. This paper therefore proposes a new method, termed the extended hierarchical extreme learning machine (EH-ELM), which extends the supervised portion of H-ELM from a single layer to multiple layers. To evaluate the performance of EH-ELM, various classification datasets were studied, and the results were compared with those of H-ELM, the multilayer ELM, and various state-of-the-art deep architecture methods. The experimental results show that EH-ELM improves the accuracy rate over most of the other methods.
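The building block of both H-ELM and EH-ELM is the basic ELM: the hidden-layer weights are drawn at random and left untrained, and only the output weights are solved for in closed form, which is why training avoids the long iterative optimization of conventional deep networks. The following is a minimal sketch of that training step, not the authors' implementation; the function names, NumPy usage, and regularization constant are illustrative assumptions:

```python
import numpy as np

def elm_train(X, T, n_hidden=50, reg=1e-3, seed=0):
    """Train a basic single-hidden-layer ELM.

    The input weights W and biases b are random and stay fixed;
    only the output weights beta are computed, via regularized
    least squares -- no backpropagation is involved.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid hidden outputs
    # Closed-form solution: beta = (H^T H + reg*I)^{-1} H^T T
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass: recompute hidden outputs, apply learned beta."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

In the H-ELM framework described above, several ELM-based autoencoder layers first encode the features without supervision, and a single supervised ELM of this form sits on top; EH-ELM's change is to stack multiple such supervised layers instead of one.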
References
G.-B. Huang, H. Zhou, X. Ding, and R. Zhang, “Extreme learning machine for regression and multiclass classification,” IEEE Trans. Syst., Man, Cybern. B, Cybern., vol. 42, no. 2, pp. 513-529, Apr. 2012.
K. Hornik, M. Stinchcombe, and H. White, “Multilayer feedforward networks are universal approximators,” Neural Netw., vol. 2, no. 5, pp. 359-366, 1989.
K. Hornik, “Approximation capabilities of multilayer feedforward networks,” Neural Netw., vol. 4, no. 2, pp. 251-257, 1991.
C. Bishop, Pattern Recognition and Machine Learning. New York, NY, USA: Springer-Verlag, 2006.
J. A. K. Suykens and J. Vandewalle, “Least squares support vector machine classifiers,” Neural Process. Lett., vol. 9, no. 3, pp. 293-300, 1999.
A. A. Mohammed, R. Minhas, Q. M. J. Wu, and M. A. Sid-Ahmed, “Human face recognition based on multidimensional PCA and extreme learning machine,” Pattern Recognit., vol. 44, nos. 10-11, pp. 2588-2597, 2011.
C. Pan, D. S. Park, Y. Yang, and H. M. Yoo, “Leukocyte image segmentation by visual attention and extreme learning machine,” Neural Comput. Appl., vol. 21, no. 6, pp. 1217-1227, 2012.
R. Minhas, A. Baradarani, S. Seifzadeh, and Q. M. J. Wu, “Human action recognition using extreme learning machine based on visual vocabularies,” Neurocomputing, vol. 73, nos. 10-12, pp. 1906-1917, 2010.
S. Cheng, J. Yan, D. Zhao, et al., “Short-term load forecasting method based on ensemble improved extreme learning machine,” J. Xi’an Jiaotong University, 2:029, 2009.
L. Mao, Y. Wang, X. Liu, et al., “Short-term power load forecasting method based on improved extreme learning machine,” Power Syst. Prot. Control, vol. 40, no. 20, pp. 140-144, 2012.
L. L. C. Kasun, H. Zhou, G.-B. Huang, and C.
M. Vong, “Representational learning with extreme learning machine for big data,” IEEE Intell. Syst., vol. 28, no. 6, pp. 31-34, Nov. 2013.
G. Hinton, S. Osindero, and Y. Teh, “A fast
learning algorithm for deep belief nets,” Neural
Comput., vol. 18, no. 7, pp. 1527-1554, Jul. 2006.
R. Salakhutdinov and G. Hinton, “Deep Boltzmann machines,” in Proc. 12th Int. Conf. Artif. Intell. Statist., Clearwater Beach, FL, USA, Jul. 2009, pp. 448-455.
ECTI TRANSACTIONS ON COMPUTER AND INFORMATION TECHNOLOGY VOL.10, NO.2 November 2016
G.-B. Huang, L. Chen, and C.-K. Siew, “Universal approximation using incremental constructive feedforward networks with random hidden nodes,” IEEE Trans. Neural Netw., vol. 17, no. 4, pp. 879-892, Jul. 2006.
Y. Bengio, “Learning deep architectures for AI,” Found. Trends Mach. Learn., vol. 2, no. 1, pp. 1-127, 2009.
Y. Bengio, A. Courville, and P. Vincent, “Representation learning: A review and new perspectives,” IEEE Trans. Pattern Anal. Mach. Intell.,
vol. 35, no. 8, pp. 1798-1828, Aug. 2013.
J. Tang, C. Deng, and G.-B. Huang, “Extreme learning machine for multilayer perceptron,” IEEE Trans. Neural Netw. Learn. Syst., vol. 27, no. 4, pp. 809-821, 2016.
H.-G. Han, L.-D. Wang, and J.-F. Qiao, “Hierarchical extreme learning machine for feedforward neural network,” Neurocomputing, pp. 128-135.
B. Y. Qu et al., “Two-hidden-layer extreme learning machine for regression and classification,” Neurocomputing, vol. 175, pp. 826-834, Jan. 2016.
S. I. Tamura and M. Tateishi, “Capabilities of a four-layered feedforward neural network: Four layers versus three,” IEEE Trans. Neural Netw., vol. 8, no. 2, pp. 251-255, 1997.
G.-B. Huang, “Learning capability and storage capacity of two-hidden-layer feedforward networks,” IEEE Trans. Neural Netw., vol. 14, no. 2, pp. 274-281, 2003.
C. L. Blake and C. J. Merz, “UCI Repository of Machine Learning Databases,” Dept. Inf. Comput. Sci., Univ. California, Irvine, CA, 1998. [Online]. Available: http://www.ics.uci.edu/~mlearn/MLRepository.html
T. R. Golub, D. K. Slonim, P. Tamayo, C. Huard, M. Gaasenbeek, J. P. Mesirov, H. Coller, M. L. Loh, J. R. Downing, M. A. Caligiuri, C. D. Bloomfield, and E. S. Lander, “Molecular classification of cancer: Class discovery and class prediction by gene expression monitoring,” Science, vol. 286, pp. 531-537, 1999.
Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, “Gradient-based learning applied to document recognition,” Proc. IEEE, vol. 86, no. 11, pp. 2278-2324, Nov. 1998.
Y. LeCun, F. J. Huang, and L. Bottou, “Learning methods for generic object recognition with invariance to pose and lighting,” in Proc. IEEE Comput. Soc. Conf. Computer Vision and Pattern Recognition (CVPR), vol. 2, 2004.
G. E. Hinton and R. R. Salakhutdinov, “Reducing the dimensionality of data with neural networks,” Science, vol. 313, no. 5786, pp. 504-507, 2006.
P. Vincent, H. Larochelle, Y. Bengio, and P.-A. Manzagol, “Extracting and composing robust features with denoising autoencoders,” in Proc. 25th Int. Conf. Mach. Learn., Helsinki, Finland, Jul. 2008, pp. 1096-1103.