Please use this identifier to cite or link to this item: http://dspace.unimap.edu.my:80/xmlui/handle/123456789/7375
Full metadata record
DC Field | Value | Language
dc.contributor.author | M. Murugappan | -
dc.contributor.author | Ramachandran, Nagarajan | -
dc.contributor.author | Sazali, Yaacob | -
dc.date.accessioned | 2009-12-09T01:16:30Z | -
dc.date.available | 2009-12-09T01:16:30Z | -
dc.date.issued | 2009-07-25 | -
dc.identifier.citation | p.70-75 | en_US
dc.identifier.isbn | 978-1-4244-2886-1 | -
dc.identifier.uri | http://ieeexplore.ieee.org/xpls/abs_all.jsp?=&arnumber=5224237 | -
dc.identifier.uri | http://dspace.unimap.edu.my/123456789/7375 | -
dc.description | Link to publisher's homepage at http://ieeexplore.ieee.org | en_US
dc.description.abstract | In recent years, assessing human emotions through the Electroencephalogram (EEG) has become one of the active research areas in Brain Computer Interface (BCI) development. A combination of surface Laplacian filtering, time-frequency analysis (Wavelet Transform) and linear classifiers (K Nearest Neighbor (KNN) and Linear Discriminant Analysis (LDA)) is used to detect discrete emotions (happy, surprise, fear, disgust, and neutral) from human EEG signals. The database is generated from 20 subjects aged 21-39 years, using 64 channels at a sampling frequency of 256 Hz. An audio-visual induction protocol based on video clips has been designed to evoke the discrete emotions. The raw EEG signals are preprocessed with the surface Laplacian filtering method and decomposed into five EEG frequency bands using the Wavelet Transform (WT), and statistical features from the alpha frequency band are used to classify the emotions. In our work, four different wavelet functions ("db4", "db8", "sym8" and "coif5") are used to derive linear and non-linear features for classifying the emotions. The statistical features are validated using 5-fold cross-validation. KNN outperforms LDA, offering a maximum average classification rate of 78.04% on 62 channels, and 77.61% and 71.30% on 24 and 8 channels, respectively. Finally, we present the average and individual classification accuracies of the two classifiers to justify the performance of our emotion recognition system. (An illustrative code sketch of this pipeline is given below the metadata record.) | en_US
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US
dc.relation.ispartofseries | Proceedings of the 2009 Innovative Technologies in Intelligent Systems and Industrial Applications (CITISIA 2009) | en_US
dc.subject | EEG | en_US
dc.subject | KNN | en_US
dc.subject | LDA | en_US
dc.subject | Surface Laplacian filtering | en_US
dc.subject | Wavelet transforms | en_US
dc.subject | Emotion recognition | en_US
dc.subject | Human computer interaction | en_US
dc.subject | Medical signal processing | en_US
dc.title | Appraising human emotions using time frequency analysis based EEG alpha band features | en_US
dc.type | Working Paper | en_US
dc.contributor.url | m.murugappan@gmail.com | en_US
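
The abstract describes a pipeline of surface Laplacian filtering, wavelet decomposition into frequency bands, statistical features from the alpha band, and KNN/LDA classification validated with 5-fold cross-validation. The sketch below is a minimal, hypothetical illustration of such a pipeline, not the authors' code: the synthetic data, the choice of the level-4 detail coefficients as an alpha-band proxy, the three statistics per channel, and the use of PyWavelets and scikit-learn are all assumptions made for illustration.

```python
# Hypothetical sketch of a wavelet-based alpha-band emotion classification
# pipeline, loosely following the abstract. Data and parameter choices are
# illustrative assumptions, not the authors' actual setup.
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 256          # sampling frequency (Hz), as reported in the abstract
N_CHANNELS = 62   # channel subset associated with the 78.04% result
EMOTIONS = ["happy", "surprise", "fear", "disgust", "neutral"]

def alpha_band_features(epoch, wavelet="db4"):
    """Decompose one EEG epoch (channels x samples) with a discrete wavelet
    transform and return simple statistics of an approximate alpha band.

    With fs = 256 Hz, a 5-level decomposition places the level-4 detail
    coefficients (cD4) at roughly 8-16 Hz, used here as an alpha-band proxy.
    """
    feats = []
    for channel in epoch:
        coeffs = pywt.wavedec(channel, wavelet, level=5)  # [cA5, cD5, cD4, cD3, cD2, cD1]
        alpha = coeffs[2]                                  # cD4: ~8-16 Hz
        feats.extend([alpha.mean(), alpha.std(), np.abs(alpha).max()])
    return np.array(feats)

# Synthetic stand-in for the 20-subject EEG database (illustrative only).
rng = np.random.default_rng(0)
epochs = rng.standard_normal((100, N_CHANNELS, 2 * FS))  # 100 two-second epochs
labels = rng.integers(0, len(EMOTIONS), size=100)

X = np.vstack([alpha_band_features(e) for e in epochs])

for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("LDA", LinearDiscriminantAnalysis())]:
    scores = cross_val_score(clf, X, labels, cv=5)  # 5-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```

Note that the cD4 sub-band (about 8-16 Hz at 256 Hz sampling) only approximates the conventional 8-13 Hz alpha range; the paper's exact band assignment and feature definitions may differ.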
Appears in Collections: Conference Papers
Sazali Yaacob, Prof. Dr.
Ramachandran, Nagarajan, Prof. Dr.
M. Murugappan, Dr.

Files in This Item:
File | Description | Size | Format
abstract.pdf | - | 8.1 kB | Adobe PDF


Items in UniMAP Library Digital Repository are protected by copyright, with all rights reserved, unless otherwise indicated.