Please use this identifier to cite or link to this item:
http://dspace.unimap.edu.my:80/xmlui/handle/123456789/7375
Title: | Appraising human emotions using time frequency analysis based EEG alpha band features |
Authors: | Murugappan, M.; Nagarajan, Ramachandran; Yaacob, Sazali (m.murugappan@gmail.com)
Keywords: | EEG; KNN; LDA; Surface Laplacian filtering; Wavelet transforms; Emotion recognition; Human computer interaction; Medical signal processing |
Issue Date: | 25-Jul-2009 |
Publisher: | Institute of Electrical and Electronics Engineers (IEEE) |
Citation: | p.70-75 |
Series/Report no.: | Proceedings of the 2009 Innovative Technologies in Intelligent Systems and Industrial Applications (CITISIA 2009) |
Abstract: | In recent years, assessing human emotions through the Electroencephalogram (EEG) has become one of the active research areas in Brain Computer Interface (BCI) development. A combination of Surface Laplacian filtering, time-frequency analysis (Wavelet Transform), and linear classifiers (K Nearest Neighbor (KNN) and Linear Discriminant Analysis (LDA)) is used to detect discrete human emotions (happy, surprise, fear, disgust, and neutral) from EEG signals. The database is generated with 20 subjects in the age group of 21~39 years using 64 channels at a sampling frequency of 256 Hz. An audio-visual induction (video clips) based protocol has been designed for evoking the discrete emotions. The raw EEG signals are preprocessed through the Surface Laplacian filtering method and decomposed into five EEG frequency bands using the Wavelet Transform (WT), and the statistical features from the alpha frequency band are considered for classifying the emotions. In our work, four different wavelet functions ("db4", "db8", "sym8" and "coif5") are used to derive the linear and non-linear features for classifying the emotions. The statistical features are validated using 5-fold cross validation. In this work, KNN outperforms LDA by offering a maximum average classification rate of 78.04% on 62 channels, and 77.61% and 71.30% on 24 channels and 8 channels respectively. Finally, we present the average and individual classification accuracies of the two classifiers to justify the performance of our emotion recognition system. |
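The following is a minimal Python sketch of the pipeline described in the abstract: wavelet decomposition of an EEG channel, simple statistical features from the alpha band, and KNN versus LDA evaluated with 5-fold cross validation. It assumes PyWavelets and scikit-learn, uses the 8-channel subset and synthetic random data in place of the authors' 20-subject database, takes the cD4 detail coefficients (roughly 8-16 Hz at 256 Hz sampling) as the alpha band, and picks arbitrary epoch length and feature choices; it is illustrative only, not the authors' implementation.

```python
# Sketch of: DWT decomposition -> alpha-band statistics -> KNN/LDA with 5-fold CV.
# Synthetic data and feature choices are assumptions, not the paper's exact setup.
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 256          # sampling frequency reported in the abstract (Hz)
N_CHANNELS = 8    # smallest channel subset reported (62 / 24 / 8)

def alpha_band_features(signal, wavelet="db4", level=5):
    """Decompose one EEG channel with a discrete wavelet transform and
    return simple statistics of the approximate alpha band.
    With fs = 256 Hz and a 5-level decomposition, coeffs are
    [cA5, cD5, cD4, cD3, cD2, cD1]; cD4 spans roughly 8-16 Hz,
    taken here as the alpha band (an assumption)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    alpha = coeffs[2]  # cD4 ~ 8-16 Hz
    return [np.mean(np.abs(alpha)), np.std(alpha), np.sum(alpha ** 2)]

# Synthetic stand-in for the real 64-channel, 20-subject database.
rng = np.random.default_rng(0)
n_trials, n_samples = 100, FS * 5          # 5-second epochs (assumption)
X = np.array([
    np.concatenate([alpha_band_features(ch)
                    for ch in rng.standard_normal((N_CHANNELS, n_samples))])
    for _ in range(n_trials)
])
y = rng.integers(0, 5, n_trials)           # 5 discrete emotion labels

# 5-fold cross validation for both classifiers, as in the abstract.
for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("LDA", LinearDiscriminantAnalysis())]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```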
Description: | Link to publisher's homepage at http://ieeexplore.ieee.org |
URI: | http://ieeexplore.ieee.org/xpls/abs_all.jsp?=&arnumber=5224237 http://dspace.unimap.edu.my/123456789/7375 |
ISBN: | 978-1-4244-2886-1 |
Appears in Collections: | Conference Papers; Sazali Yaacob, Prof. Dr.; Ramachandran, Nagarajan, Prof. Dr.; M. Murugappan, Dr. |
Files in This Item:
File | Description | Size | Format
---|---|---|---
abstract.pdf | | 8.1 kB | Adobe PDF
Items in UniMAP Library Digital Repository are protected by copyright, with all rights reserved, unless otherwise indicated.