Regularization activation function for Extreme Learning Machine

Noraini Ismail, Zulaiha Ali Othman, Noor Azah Samsudin

Research output: Contribution to journal › Article

Abstract

The Extreme Learning Machine (ELM) algorithm, based on single-hidden-layer feedforward neural networks, has been shown to be among the best time series prediction techniques, offering good generalization performance with extremely fast learning speed. However, ELM faces an overfitting problem that can affect model quality, because it is implemented using an empirical risk minimization scheme. This study therefore aims to improve ELM by introducing activation function regularization, yielding a method called RAF-ELM. The experiment was conducted in two phases. First, the performance of the modified RAF-ELM was investigated using four activation functions: Sigmoid, Sine, Tribas, and Hardlim. Input weights and biases for the hidden layer were selected randomly, while the best number of hidden neurons was searched between 5 and 100. This experiment used UCI benchmark datasets; the configuration with 99 neurons and the Sigmoid activation function showed the best performance. The proposed method improved accuracy by up to 0.016205 MAE and learning speed by 0.007 seconds of processing time compared with conventional ELM, and improved accuracy by up to 0.0354 MSE compared with a state-of-the-art algorithm. The second experiment validated the proposed RAF-ELM on 15 regression benchmark datasets, comparing it with four neural network techniques: conventional ELM, Back Propagation, Radial Basis Function, and Elman networks. The results show that RAF-ELM obtains the best accuracy among these techniques on time series data drawn from various domains.
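The abstract describes the core ELM procedure: input weights and hidden-layer biases are drawn at random, and only the output weights are solved in closed form. The paper's specific RAF-ELM regularization is not detailed here, so the sketch below shows a conventional ELM for regression with a Sigmoid activation and a ridge (Tikhonov) term standing in for the proposed regularization; `n_hidden`, `reg`, and the toy data are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def elm_train(X, y, n_hidden=99, reg=1e-3, seed=None):
    """Train a single-hidden-layer ELM for regression.

    Input weights W and biases b are drawn at random (never trained);
    only the output weights beta are solved, via ridge-regularized
    least squares on the hidden-layer output matrix H.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = sigmoid(X @ W + b)  # hidden-layer output matrix, shape (n_samples, n_hidden)
    # Closed-form regularized solution: (H'H + reg*I) beta = H'y
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return sigmoid(X @ W + b) @ beta

# Usage: fit a noisy sine curve and measure MAE on the training data.
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(200)
W, b, beta = elm_train(X, y, n_hidden=50, reg=1e-3, seed=0)
mae = np.mean(np.abs(elm_predict(X, W, b, beta) - y))
```

Because the hidden layer is fixed, training reduces to one linear solve, which is where ELM's speed advantage over iteratively trained networks such as Back Propagation comes from; the `reg` term is one simple way to counter the overfitting the abstract attributes to pure empirical risk minimization.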

Original language: English
Pages (from-to): 241-248
Number of pages: 8
Journal: International Journal of Advanced Computer Science and Applications
Volume: 10
Issue number: 3
DOI: 10.14569/IJACSA.2019.0100331
Publication status: Published - 1 Jan 2019

Keywords

  • Extreme learning machine
  • Neural networks
  • Prediction
  • Regularization
  • Time series

ASJC Scopus subject areas

  • Computer Science (all)

Cite this

Regularization activation function for Extreme Learning Machine. / Ismail, Noraini; Ali Othman, Zulaiha; Samsudin, Noor Azah.

In: International Journal of Advanced Computer Science and Applications, Vol. 10, No. 3, 01.01.2019, p. 241-248.

Research output: Contribution to journal › Article

@article{4e3e5bb02d1141c4a9ba5ec39938128c,
title = "Regularization activation function for Extreme Learning Machine",
abstract = "The Extreme Learning Machine (ELM) algorithm, based on single-hidden-layer feedforward neural networks, has been shown to be among the best time series prediction techniques, offering good generalization performance with extremely fast learning speed. However, ELM faces an overfitting problem that can affect model quality, because it is implemented using an empirical risk minimization scheme. This study therefore aims to improve ELM by introducing activation function regularization, yielding a method called RAF-ELM. The experiment was conducted in two phases. First, the performance of the modified RAF-ELM was investigated using four activation functions: Sigmoid, Sine, Tribas, and Hardlim. Input weights and biases for the hidden layer were selected randomly, while the best number of hidden neurons was searched between 5 and 100. This experiment used UCI benchmark datasets; the configuration with 99 neurons and the Sigmoid activation function showed the best performance. The proposed method improved accuracy by up to 0.016205 MAE and learning speed by 0.007 seconds of processing time compared with conventional ELM, and improved accuracy by up to 0.0354 MSE compared with a state-of-the-art algorithm. The second experiment validated the proposed RAF-ELM on 15 regression benchmark datasets, comparing it with four neural network techniques: conventional ELM, Back Propagation, Radial Basis Function, and Elman networks. The results show that RAF-ELM obtains the best accuracy among these techniques on time series data drawn from various domains.",
keywords = "Extreme learning machine, Neural networks, Prediction, Regularization, Time series",
author = "Ismail, Noraini and {Ali Othman}, Zulaiha and Samsudin, {Noor Azah}",
year = "2019",
month = "1",
day = "1",
doi = "10.14569/IJACSA.2019.0100331",
language = "English",
volume = "10",
pages = "241--248",
journal = "International Journal of Advanced Computer Science and Applications",
issn = "2158-107X",
publisher = "Science and Information Organization",
number = "3",

}

TY - JOUR

T1 - Regularization activation function for Extreme Learning Machine

AU - Ismail, Noraini

AU - Ali Othman, Zulaiha

AU - Samsudin, Noor Azah

PY - 2019/1/1

Y1 - 2019/1/1

N2 - The Extreme Learning Machine (ELM) algorithm, based on single-hidden-layer feedforward neural networks, has been shown to be among the best time series prediction techniques, offering good generalization performance with extremely fast learning speed. However, ELM faces an overfitting problem that can affect model quality, because it is implemented using an empirical risk minimization scheme. This study therefore aims to improve ELM by introducing activation function regularization, yielding a method called RAF-ELM. The experiment was conducted in two phases. First, the performance of the modified RAF-ELM was investigated using four activation functions: Sigmoid, Sine, Tribas, and Hardlim. Input weights and biases for the hidden layer were selected randomly, while the best number of hidden neurons was searched between 5 and 100. This experiment used UCI benchmark datasets; the configuration with 99 neurons and the Sigmoid activation function showed the best performance. The proposed method improved accuracy by up to 0.016205 MAE and learning speed by 0.007 seconds of processing time compared with conventional ELM, and improved accuracy by up to 0.0354 MSE compared with a state-of-the-art algorithm. The second experiment validated the proposed RAF-ELM on 15 regression benchmark datasets, comparing it with four neural network techniques: conventional ELM, Back Propagation, Radial Basis Function, and Elman networks. The results show that RAF-ELM obtains the best accuracy among these techniques on time series data drawn from various domains.

AB - The Extreme Learning Machine (ELM) algorithm, based on single-hidden-layer feedforward neural networks, has been shown to be among the best time series prediction techniques, offering good generalization performance with extremely fast learning speed. However, ELM faces an overfitting problem that can affect model quality, because it is implemented using an empirical risk minimization scheme. This study therefore aims to improve ELM by introducing activation function regularization, yielding a method called RAF-ELM. The experiment was conducted in two phases. First, the performance of the modified RAF-ELM was investigated using four activation functions: Sigmoid, Sine, Tribas, and Hardlim. Input weights and biases for the hidden layer were selected randomly, while the best number of hidden neurons was searched between 5 and 100. This experiment used UCI benchmark datasets; the configuration with 99 neurons and the Sigmoid activation function showed the best performance. The proposed method improved accuracy by up to 0.016205 MAE and learning speed by 0.007 seconds of processing time compared with conventional ELM, and improved accuracy by up to 0.0354 MSE compared with a state-of-the-art algorithm. The second experiment validated the proposed RAF-ELM on 15 regression benchmark datasets, comparing it with four neural network techniques: conventional ELM, Back Propagation, Radial Basis Function, and Elman networks. The results show that RAF-ELM obtains the best accuracy among these techniques on time series data drawn from various domains.

KW - Extreme learning machine

KW - Neural networks

KW - Prediction

KW - Regularization

KW - Time series

UR - http://www.scopus.com/inward/record.url?scp=85063720400&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85063720400&partnerID=8YFLogxK

U2 - 10.14569/IJACSA.2019.0100331

DO - 10.14569/IJACSA.2019.0100331

M3 - Article

VL - 10

SP - 241

EP - 248

JO - International Journal of Advanced Computer Science and Applications

JF - International Journal of Advanced Computer Science and Applications

SN - 2158-107X

IS - 3

ER -