
dc.contributor.author: Mohd Rizon
dc.contributor.author: Hazry, Desa
dc.contributor.author: Karthigayan, M.
dc.contributor.author: Nagarajan, Ramachandran
dc.contributor.author: Alajlan, N.
dc.contributor.author: Sazali, Yaacob, Prof. Dr.
dc.contributor.author: Nor Azmi, Johari
dc.contributor.author: Ina Suryani, Ab Rahim
dc.date.accessioned: 2010-08-16T03:23:10Z
dc.date.available: 2010-08-16T03:23:10Z
dc.date.issued: 2009-07-05
dc.identifier.citation: p. 224-228 en_US
dc.identifier.isbn: 978-0-7695-3734-4
dc.identifier.uri: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5230751&tag=1
dc.identifier.uri: http://dspace.unimap.edu.my/123456789/8686
dc.description: Link to publisher's homepage at http://ieeexplore.ieee.org/ en_US
dc.description.abstract: In this paper, lip and eye features are applied to classify human emotion through a set of irregular and regular ellipse fitting equations using a genetic algorithm (GA). South East Asian faces are considered in this study. All six universally accepted emotions and one neutral expression are considered for classification. The fastest method of extracting lip features is adopted in this study. Observation of the subject's various emotions leads to unique characteristics of the lips and eyes. The GA is adopted to optimize the irregular-ellipse and regular-ellipse characteristics of the lip and eye features, respectively, for each emotion. That is, the top portion of the lip configuration is part of one ellipse and the bottom portion is part of a different ellipse. Two ellipse-based fitness equations are proposed for the lip configuration, and the relevant parameters that define the emotion are listed. One ellipse-based fitness function is proposed for the eye. The GA-based approach has achieved reasonably successful classification of emotion. During classification, however, the optimized values can overlap with the ranges of other emotions. To overcome this overlap between emotions and at the same time improve the classification, a neural network (NN) approach is implemented. The GA-NN based process achieves 83% - 90% classification of emotion from the optimized features of the top lip, bottom lip and eye. en_US
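The abstract describes fitting ellipse parameters to facial features with a genetic algorithm. As a rough illustration only (the paper's actual fitness equations, parameter ranges, and GA settings are not given here, so every value below is an assumption), a minimal GA that fits a regular ellipse (cx, cy, a, b) to 2-D feature points could be sketched as:

```python
import math
import random

def ellipse_residual(params, points):
    # Mean squared deviation of the points from the ellipse
    # ((x - cx) / a)**2 + ((y - cy) / b)**2 = 1
    cx, cy, a, b = params
    total = 0.0
    for x, y in points:
        total += (((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1.0) ** 2
    return total / len(points)

def fit_ellipse_ga(points, pop_size=60, generations=200, seed=0):
    # Elitist GA sketch: averaging crossover plus Gaussian mutation.
    # All hyperparameters here are illustrative assumptions.
    rng = random.Random(seed)
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    lo = (min(xs), min(ys), 0.1, 0.1)
    hi = (max(xs), max(ys),
          max(xs) - min(xs) + 1.0, max(ys) - min(ys) + 1.0)

    # Random initial population within the data's bounding box.
    pop = [[rng.uniform(l, h) for l, h in zip(lo, hi)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: ellipse_residual(ind, points))
        elite = pop[:pop_size // 5]          # keep the best 20%
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            child = [(g1 + g2) / 2.0 for g1, g2 in zip(p1, p2)]
            if rng.random() < 0.5:           # mutate one gene
                i = rng.randrange(4)
                child[i] += rng.gauss(0.0, 0.2)
                if i >= 2:                   # semi-axes must stay positive
                    child[i] = max(0.05, child[i])
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda ind: ellipse_residual(ind, points))
```

In the paper's setting, one such fit would be run per feature (top lip, bottom lip, eye), and the optimized parameters then fed to the NN classifier to resolve overlapping emotion ranges.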
dc.language.iso: en en_US
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE) en_US
dc.relation.ispartofseries: Proceedings of the 2nd International Conference in Visualisation (VIZ) 2009 en_US
dc.subject: Classification en_US
dc.subject: Classification of emotions en_US
dc.subject: Ellipse fitting en_US
dc.subject: Fitness functions en_US
dc.subject: Human emotion en_US
dc.subject: Lip features en_US
dc.title: Personalized human emotion classification using genetic algorithm en_US
dc.type: Working Paper en_US

