### Abstract

Most supervised neural networks are trained by minimizing the mean squared error (MSE) of the training set. MSE is problematic, however, whenever the target output equals the actual output, because the error signal then tends to zero; this can also destabilize the internal structure of the network. In this paper, we discuss improved convergence rates for the standard backpropagation model obtained through modifications to its learning strategy. The modified backpropagation model is tested on the XOR problem, profitability-analysis data, the Kuala Lumpur Composite Index (KLCI) at the Kuala Lumpur Stock Exchange (KLSE), and handwritten/handprinted digits. The results are compared with those of the standard backpropagation model, which is based on mean squared error.
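The vanishing-error-signal issue the abstract describes can be illustrated with the standard output-layer error signal for MSE under a sigmoid activation, `delta = (t - y) * y * (1 - y)`. The paper's improved error signal is not reproduced here; this is only a minimal sketch of the standard formula being critiqued, with hypothetical function names:

```python
import numpy as np

def sigmoid(x):
    # Logistic activation used in the classic backpropagation model.
    return 1.0 / (1.0 + np.exp(-x))

def mse_error_signal(target, output):
    """Standard backpropagation output-layer error signal for MSE
    with a sigmoid unit: delta = (t - y) * y * (1 - y)."""
    return (target - output) * output * (1.0 - output)

# The signal is exactly zero when the output matches the target ...
print(mse_error_signal(0.9, 0.9))  # 0.0

# ... and it is also tiny when the unit saturates far from the
# target (y near 0 or 1), even though the error itself is large,
# which slows convergence:
print(mse_error_signal(1.0, sigmoid(-6.0)))
```

Both behaviors follow from the `y * (1 - y)` derivative factor, which vanishes at saturation; modified error signals in the literature typically aim to remove or reshape that factor.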

| Original language | English |
|---|---|
| Pages (from-to) | 297-305 |
| Number of pages | 9 |
| Journal | International Journal of Computer Mathematics |
| Volume | 76 |
| Issue number | 3 |
| Publication status | Published - 2001 |

### Keywords

- Backpropagation model
- Handwritten/handprinted digits
- KLCI
- KLSE
- Mean square error
- Profitability analysis

### ASJC Scopus subject areas

- Applied Mathematics
- Computer Science Applications
- Computational Theory and Mathematics

### Cite this

Research output: Contribution to journal › Article

Shamsuddin, S. M., Sulaiman, M. N., & Darus, M. (2001). An improved error signal for the backpropagation model for classification problems. *International Journal of Computer Mathematics*, *76*(3), 297-305.

TY - JOUR

T1 - An improved error signal for the backpropagation model for classification problems

AU - Shamsuddin, S. M.

AU - Sulaiman, M. N.

AU - Darus, Maslina

PY - 2001

Y1 - 2001

N2 - Most supervised neural networks are trained by minimizing the mean squared error (MSE) of the training set. MSE is problematic, however, whenever the target output equals the actual output, because the error signal then tends to zero; this can also destabilize the internal structure of the network. In this paper, we discuss improved convergence rates for the standard backpropagation model obtained through modifications to its learning strategy. The modified backpropagation model is tested on the XOR problem, profitability-analysis data, the Kuala Lumpur Composite Index (KLCI) at the Kuala Lumpur Stock Exchange (KLSE), and handwritten/handprinted digits. The results are compared with those of the standard backpropagation model, which is based on mean squared error.

KW - Backpropagation model

KW - Handwritten/handprinted digits

KW - KLCI

KW - KLSE

KW - Mean square error

KW - Profitability analysis

UR - http://www.scopus.com/inward/record.url?scp=0034460317&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0034460317&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0034460317

VL - 76

SP - 297

EP - 305

JO - International Journal of Computer Mathematics

JF - International Journal of Computer Mathematics

SN - 0020-7160

IS - 3

ER -