Performance comparison of multi-layer perceptron (Back Propagation, Delta Rule and Perceptron) algorithms in neural networks

Mutasem Khalil Alsmadi, Khairuddin Omar, Shahrul Azman Mohd Noah, Ibrahim Almarashdah

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

20 Citations (Scopus)

Abstract

A multilayer perceptron (MLP) is a feedforward artificial neural network model that maps sets of input data onto a set of appropriate outputs. It is a modification of the standard linear perceptron in that it uses three or more layers of neurons (nodes) with nonlinear activation functions, and it is more powerful than the perceptron in that it can distinguish data that is not linearly separable, i.e. not separable by a hyperplane. MLP networks are general-purpose, flexible, nonlinear models consisting of a number of units organised into multiple layers. The complexity of an MLP network can be changed by varying the number of layers and the number of units in each layer. Given enough hidden units and enough data, it has been shown that MLPs can approximate virtually any function to any desired accuracy. This paper presents a performance comparison of multi-layer perceptron training algorithms (Back Propagation, Delta Rule and Perceptron). The Perceptron is a steepest-descent-type algorithm that normally has a slow convergence rate, and its search for the global minimum often becomes trapped at poor local minima. The current study investigates the performance of these three algorithms in training MLP networks. It was found that the Perceptron algorithm performs much better than the other algorithms.
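For readers who want a concrete picture of the three learning rules the abstract compares, the sketch below implements each of them with NumPy on the XOR problem, a classic task that is not linearly separable. This is a minimal illustration and not the authors' code: the network size, learning rates, epoch counts and all function names are assumptions chosen only to make the contrast between the rules visible.

    # Minimal sketch (illustrative, not the paper's implementation) of the
    # three learning rules on XOR; hyperparameters are arbitrary assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 1, 1, 0], dtype=float)  # XOR: not linearly separable

    def perceptron(X, y, epochs=100, lr=0.1):
        """Rosenblatt perceptron: threshold unit, updates only on mistakes."""
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            for xi, ti in zip(X, y):
                pred = float(w @ xi + b > 0)
                w += lr * (ti - pred) * xi
                b += lr * (ti - pred)
        return ((X @ w + b) > 0).astype(float)

    def delta_rule(X, y, epochs=100, lr=0.1):
        """Widrow-Hoff delta rule: gradient descent on the squared error
        of a single linear unit (batch form)."""
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            err = y - (X @ w + b)
            w += lr * X.T @ err / len(X)
            b += lr * err.mean()
        return ((X @ w + b) > 0.5).astype(float)

    def backprop_mlp(X, y, hidden=4, epochs=5000, lr=0.5):
        """One-hidden-layer MLP with sigmoid units, trained by
        back propagation of the squared-error gradient."""
        W1 = rng.normal(0, 1, (X.shape[1], hidden)); b1 = np.zeros(hidden)
        W2 = rng.normal(0, 1, (hidden, 1));          b2 = np.zeros(1)
        sig = lambda z: 1 / (1 + np.exp(-z))
        t = y.reshape(-1, 1)
        for _ in range(epochs):
            h = sig(X @ W1 + b1)              # forward pass
            o = sig(h @ W2 + b2)
            d_o = (o - t) * o * (1 - o)       # output-layer delta
            d_h = (d_o @ W2.T) * h * (1 - h)  # hidden-layer delta
            W2 -= lr * h.T @ d_o; b2 -= lr * d_o.sum(0)
            W1 -= lr * X.T @ d_h; b1 -= lr * d_h.sum(0)
        return (sig(sig(X @ W1 + b1) @ W2 + b2) > 0.5).ravel().astype(float)

    for name, fn in [("perceptron", perceptron),
                     ("delta rule", delta_rule),
                     ("back propagation", backprop_mlp)]:
        acc = (fn(X, y) == y).mean()
        print(f"{name:>16}: accuracy {acc:.2f} on XOR")

On XOR the single-layer perceptron and the delta rule cannot reach full accuracy, since no hyperplane separates the classes, while the back-propagation MLP usually can; this mirrors the abstract's point about the extra power of hidden layers with nonlinear activations.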

Original language: English
Title of host publication: 2009 IEEE International Advance Computing Conference, IACC 2009
Pages: 296-299
Number of pages: 4
ISBN (Print): 9781424429288
DOIs: 10.1109/IADCC.2009.4809024
Publication status: Published - 2009
Event: 2009 IEEE International Advance Computing Conference, IACC 2009 - Patiala
Duration: 6 Mar 2009 – 7 Mar 2009

Other

Other: 2009 IEEE International Advance Computing Conference, IACC 2009
City: Patiala
Period: 6/3/09 – 7/3/09

Keywords

  • Backpropagation
  • Classification
  • Delta rule learning
  • Perceptron

ASJC Scopus subject areas

  • Software
  • Electrical and Electronic Engineering

Cite this

Alsmadi, M. K., Omar, K., Mohd Noah, S. A., & Almarashdah, I. (2009). Performance comparison of multi-layer perceptron (Back Propagation, Delta Rule and Perceptron) algorithms in neural networks. In 2009 IEEE International Advance Computing Conference, IACC 2009 (pp. 296-299). Article 4809024. https://doi.org/10.1109/IADCC.2009.4809024
