Inter-rater reliability of actual tagged emotion categories validation using Cohen’s Kappa coefficient

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

Establishing human inter-rater agreement is necessary in emotion recognition research, especially when working with a publicly available database. This paper applies the Cohen’s Kappa coefficient technique to verify the actual tagged emotion categories of a hybrid emotion model that uses music videos as the stimulus. The method measures the degree of inter-rater reliability among five selected raters. The resulting Cohen’s Kappa coefficients exceed 0.87 for all four actual tagged emotion categories, namely happy, relaxed, sad and angry, indicating an excellent degree of inter-rater agreement. The actual tagged emotion categories are selected based on the division of the average arousal-valence rating values.
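
For illustration, the sketch below shows one plausible way to reproduce the two steps described in the abstract: mapping each averaged arousal-valence rating to one of the four emotion categories by splitting the plane at its mid values, and computing pairwise Cohen’s Kappa between raters. The quadrant scheme, rater names, toy data and scikit-learn-based implementation are assumptions for demonstration; the paper does not specify its exact computation.

# Illustrative Python sketch only; names, thresholds and data are assumptions,
# not taken from the paper.
from itertools import combinations

from sklearn.metrics import cohen_kappa_score


def quadrant_label(valence, arousal, v_mid, a_mid):
    # Assumed quadrant scheme: split the arousal-valence plane at the
    # average (mid) rating values and name the four quadrants.
    if valence >= v_mid:
        return "happy" if arousal >= a_mid else "relaxed"
    return "angry" if arousal >= a_mid else "sad"


def pairwise_kappas(ratings):
    # ratings: dict of rater name -> list of category labels, one label per
    # stimulus, in the same stimulus order for every rater.
    return {
        (r1, r2): cohen_kappa_score(ratings[r1], ratings[r2])
        for r1, r2 in combinations(ratings, 2)
    }


# Toy usage with five hypothetical raters and six stimuli; identical label
# sequences give the maximum kappa of 1.0 for every rater pair.
labels = ["happy", "relaxed", "sad", "angry", "happy", "sad"]
raters = {f"rater_{i}": list(labels) for i in range(1, 6)}
print(pairwise_kappas(raters))

The per-pair kappas can be reported individually or averaged; values above 0.75 are conventionally interpreted as excellent agreement, consistent with the coefficients above 0.87 reported here.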

Original language: English
Pages (from-to): 259-264
Number of pages: 6
Journal: Journal of Theoretical and Applied Information Technology
Volume: 95
Issue number: 2
Publication status: Published - 31 Jan 2017

Keywords

  • Cohen’s Kappa coefficient
  • Emotion recognition

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science(all)

Cite this

@article{b4f26a0590bf4b1c8724d7875cdca7d8,
title = "Inter-rater reliability of actual tagged emotion categories validation using Cohen’s Kappa coefficient",
abstract = "It is necessary to find the human inter-rater agreement in emotion recognition research especially when handling with publicly available database. This paper discusses the Cohen’s Kappa coefficient technique to verify the actual tagged emotion categories for hybrid emotion model using music video as stimulus. This method has been done by finding the degree of inter-rater reliability between the five selected raters. As the results, the values of Cohen’s Kappa coefficients are over 0.87 for four actual tagged emotion categories which are happy, relaxed, sad and angry. These values demonstrate that the degree of inter-rater agreement are excellent. The actual tagged emotion categories are selected based on the division of average value of arousal-valence rating.",
keywords = "Cohen’s Kappa coefficient, Emotion recognition",
author = "Juremi, {Nor Rashidah Md} and Zulkifley, {Mohd Asyraf} and Aini Hussain and {Wan Zaki}, {Wan Mimi Diyana}",
year = "2017",
month = "1",
day = "31",
language = "English",
volume = "95",
pages = "259--264",
journal = "Journal of Theoretical and Applied Information Technology",
issn = "1992-8645",
publisher = "Asian Research Publishing Network (ARPN)",
number = "2",

}
