An improved error signal for the backpropagation model for classification problems

S. M. Shamsuddin, M. N. Sulaiman, Maslina Darus

Research output: Contribution to journal › Article

14 Citations (Scopus)

Abstract

Most supervised neural networks are trained by minimizing the mean squared error of the training set. However, using mean squared error is problematic: whenever the target output equals the actual output, the error signal tends to zero, which also leads to instability in the internal structure of the network. In this paper, we discuss improved convergence rates of the standard backpropagation model obtained through some modifications to its learning strategies. The modified backpropagation model is tested on the XOR problem, profitability-analysis data, the Kuala Lumpur Composite Index (KLCI) at the Kuala Lumpur Stock Exchange (KLSE), and handwritten/handprinted digits. The results are compared with those of the standard backpropagation model, which is based on mean squared error.
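The vanishing-error-signal behaviour the abstract describes can be seen directly in the standard MSE-based backpropagation delta for a sigmoid output unit, delta = (t - y) * y * (1 - y). This is a minimal illustrative sketch of that standard formulation only; it does not reproduce the paper's improved error signal.

```python
import math

def sigmoid(x: float) -> float:
    """Logistic activation commonly used in classic backpropagation."""
    return 1.0 / (1.0 + math.exp(-x))

def mse_delta(target: float, output: float) -> float:
    """Standard error signal for a sigmoid output unit trained on MSE.

    delta = (t - y) * y * (1 - y): it vanishes when the output equals
    the target, and also shrinks when the unit saturates near 0 or 1.
    """
    return (target - output) * output * (1.0 - output)

# As the net input grows and the output approaches the target of 1.0,
# the error signal driving the weight updates collapses toward zero.
for net in (-2.0, 0.0, 2.0, 6.0):
    y = sigmoid(net)
    print(f"output={y:.4f}  delta={mse_delta(1.0, y):.6f}")
```

Note that the delta is largest when the output is maximally uncertain (y = 0.5 gives delta = 0.125 for a target of 1.0) and nearly zero once the output saturates, which is the weakness that motivates modifying the error signal.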

Original language: English
Pages (from-to): 297-305
Number of pages: 9
Journal: International Journal of Computer Mathematics
Volume: 76
Issue number: 3
Publication status: Published - 2001


Keywords

  • Backpropagation model
  • Handwritten/handprinted digits
  • KLCI
  • KLSE
  • Mean square error
  • Profitability analysis

ASJC Scopus subject areas

  • Applied Mathematics
  • Computer Science Applications
  • Computational Theory and Mathematics

Cite this

An improved error signal for the backpropagation model for classification problems. / Shamsuddin, S. M.; Sulaiman, M. N.; Darus, Maslina.

In: International Journal of Computer Mathematics, Vol. 76, No. 3, 2001, p. 297-305.

Research output: Contribution to journal › Article

@article{dac40c2207d94e47bd478672c88eedc0,
title = "An improved error signal for the backpropagation model for classification problems",
abstract = "Most supervised neural networks are trained by minimizing the mean squared error of the training set. However, using mean squared error is problematic: whenever the target output equals the actual output, the error signal tends to zero, which also leads to instability in the internal structure of the network. In this paper, we discuss improved convergence rates of the standard backpropagation model obtained through some modifications to its learning strategies. The modified backpropagation model is tested on the XOR problem, profitability-analysis data, the Kuala Lumpur Composite Index (KLCI) at the Kuala Lumpur Stock Exchange (KLSE), and handwritten/handprinted digits. The results are compared with those of the standard backpropagation model, which is based on mean squared error.",
keywords = "Backpropagation model, Handwritten/handprinted digits, KLCI, KLSE, Mean square error, Profitability analysis",
author = "Shamsuddin, {S. M.} and Sulaiman, {M. N.} and Maslina Darus",
year = "2001",
language = "English",
volume = "76",
pages = "297--305",
journal = "International Journal of Computer Mathematics",
issn = "0020-7160",
publisher = "Taylor and Francis Ltd.",
number = "3",

}

TY - JOUR

T1 - An improved error signal for the backpropagation model for classification problems

AU - Shamsuddin, S. M.

AU - Sulaiman, M. N.

AU - Darus, Maslina

PY - 2001

Y1 - 2001

N2 - Most supervised neural networks are trained by minimizing the mean squared error of the training set. However, using mean squared error is problematic: whenever the target output equals the actual output, the error signal tends to zero, which also leads to instability in the internal structure of the network. In this paper, we discuss improved convergence rates of the standard backpropagation model obtained through some modifications to its learning strategies. The modified backpropagation model is tested on the XOR problem, profitability-analysis data, the Kuala Lumpur Composite Index (KLCI) at the Kuala Lumpur Stock Exchange (KLSE), and handwritten/handprinted digits. The results are compared with those of the standard backpropagation model, which is based on mean squared error.

AB - Most supervised neural networks are trained by minimizing the mean squared error of the training set. However, using mean squared error is problematic: whenever the target output equals the actual output, the error signal tends to zero, which also leads to instability in the internal structure of the network. In this paper, we discuss improved convergence rates of the standard backpropagation model obtained through some modifications to its learning strategies. The modified backpropagation model is tested on the XOR problem, profitability-analysis data, the Kuala Lumpur Composite Index (KLCI) at the Kuala Lumpur Stock Exchange (KLSE), and handwritten/handprinted digits. The results are compared with those of the standard backpropagation model, which is based on mean squared error.

KW - Backpropagation model

KW - Handwritten/handprinted digits

KW - KLCI

KW - KLSE

KW - Mean square error

KW - Profitability analysis

UR - http://www.scopus.com/inward/record.url?scp=0034460317&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0034460317&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0034460317

VL - 76

SP - 297

EP - 305

JO - International Journal of Computer Mathematics

JF - International Journal of Computer Mathematics

SN - 0020-7160

IS - 3

ER -