Subject-dependent and subject-independent emotional classification of CMAC-based features using EFuNN

Bibliographic Details
Main Authors: Yaacob, Hamwira Sakti, Abdul Rahman, Abdul Wahab, Kamaruddin, Norhaslinda
Format: Conference or Workshop Item
Language: English
Published: International Society of Computers and Their Applications (ISCA) 2014
Online Access:http://irep.iium.edu.my/43492/
http://irep.iium.edu.my/43492/1/43492_Subject-dependen_complete.pdf
http://irep.iium.edu.my/43492/2/43492_Subject-dependen_scopus.pdf
Description
Summary: Emotions are postulated to be generated in the brain. To capture brain activity during emotional processing, several neuro-imaging techniques have been adopted, including the electroencephalogram (EEG). In existing studies, different techniques have been employed to extract features from EEG signals for emotion classification. However, existing feature extraction techniques do not consider the spatial and temporal neural dynamics of emotion. Furthermore, the non-linearity of EEG and the self-adaptive nature of neural activations are disregarded. Consequently, the classification accuracy of any feature extraction technique is inconsistent when applied with different classifiers. Hence, in this study, a new feature extraction technique is proposed that incorporates the qualities of the EEG signal and the behavior of neural activations, based on the Cerebellar Model Articulation Controller (CMAC) model. Classification performance for calm, fear, happiness and sadness using Evolving Fuzzy Neural Network (EFuNN) classifiers is compared under subject-dependent and subject-independent validation. The proposed technique is observed to yield accuracies ranging from above 50% to above 90% for subject-dependent classification. For the subject-independent approach, the highest accuracy is barely 40%. The results suggest that this approach is comparable to existing feature extraction techniques for classifying emotions.
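The record does not detail the paper's CMAC formulation, but the general idea behind CMAC-based feature extraction is tile coding: several offset quantizations of the signal range each activate one cell per sample, giving a sparse, locally generalizing encoding. The sketch below is a minimal generic illustration of that principle applied to a windowed 1-D EEG trace; the parameter names (`n_tilings`, `n_tiles`, the value range) are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def cmac_features(window, n_tilings=4, n_tiles=8, lo=-1.0, hi=1.0):
    """Encode a 1-D EEG sample window as a CMAC-style (tile-coding) feature vector.

    Each tiling is an offset quantization of [lo, hi); every sample activates
    exactly one tile per tiling. Activation counts are accumulated over the
    window and normalized, so each tiling's block sums to 1.
    Note: this is a generic tile-coding sketch, not the paper's exact model.
    """
    width = (hi - lo) / n_tiles                 # tile width within one tiling
    features = np.zeros(n_tilings * n_tiles)
    for x in np.clip(window, lo, hi - 1e-9):    # keep samples inside the range
        for t in range(n_tilings):
            offset = t * width / n_tilings      # each tiling is shifted slightly
            idx = int((x - lo + offset) / width)
            idx = min(idx, n_tiles - 1)         # clamp the shifted edge case
            features[t * n_tiles + idx] += 1
    return features / len(window)               # normalize by window length

# Illustrative usage on a synthetic window standing in for one EEG epoch
rng = np.random.default_rng(0)
epoch = rng.uniform(-1.0, 1.0, size=256)
feats = cmac_features(epoch)                    # length n_tilings * n_tiles = 32
```

A vector like `feats` would then serve as the input to a downstream classifier such as EFuNN; the overlapping tilings are what give CMAC encodings their spatial locality, while computing them per window preserves temporal structure.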