
dc.contributor.author: Muthukaruppan, Karthigayan
dc.contributor.author: Mohd Rizon, Mohamed Juhari
dc.contributor.author: Ramachandran, Nagarajan
dc.contributor.author: Masanori, Sugisaka
dc.contributor.author: Sazali, Yaacob
dc.contributor.author: Mohd Rozailan, Mamat
dc.contributor.author: Hazry, Desa
dc.date.accessioned: 2009-08-06T12:28:02Z
dc.date.available: 2009-08-06T12:28:02Z
dc.date.issued: 2007-08-10
dc.identifier.citation: Artificial Life and Robotics, vol. 11 (2), 2007, pages 197-203.
dc.identifier.issn: 1433-5298 (Print)
dc.identifier.issn: 1614-7456 (Online)
dc.identifier.uri: http://www.springerlink.com/content/5031051346335636/
dc.identifier.uri: http://dspace.unimap.edu.my/123456789/6688
dc.description: Link to publisher's homepage at http://www.springerlink.com
dc.description.abstract: In this article, two subjects, one South East Asian (SEA) and the other Japanese, are considered for face emotion recognition using a genetic algorithm (GA). The parameters relating the face to the emotions differ entirely between the two subjects. All six universally accepted emotions plus the neutral expression are considered for each subject. Eyes and lips are the features usually used to recognize emotions. The paper has two parts. The first part investigates a set of image processing methods suitable for recognizing face emotion. The acquired images undergo several preprocessing steps, such as gray-scale conversion, histogram equalization, and filtering. Edge detection must remain reliable even when the lighting is uneven, so the histogram-equalized image is split into two regions of interest (ROIs), the eye and lip regions, which receive the same preprocessing but different threshold values. The Sobel edge detection method was found to perform very well in segmenting the image. Three feature extraction methods are considered and their performances compared; the method that extracts eye features fastest is adopted. The second part of the paper discusses how to recognize emotions from the eye features alone. Observation of the various emotions of the two subjects led to a unique eye characteristic: the eye exhibits an ellipse with different parameters in each emotion. The GA is used to optimize the ellipse characteristics of the eye features in each emotion, based on an ellipse-based fitness function. This has yielded successful emotion classifications, and the emotions of the two subjects are compared. (Illustrative sketches of the preprocessing and GA steps follow this record.)
dc.language.iso: en
dc.publisher: Springer Japan
dc.subject: Feature extraction
dc.subject: Ellipse fitness function
dc.subject: Genetic algorithm
dc.subject: Face emotion recognition
dc.subject: Face emotion
dc.subject: Detectors
dc.subject: Recognition system
dc.subject: Genetic algorithm (GA)
dc.title: Development of a personified face emotion recognition technique using fitness function
dc.type: Article
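
The abstract outlines a concrete pipeline (per-ROI histogram equalization and Sobel edge detection, followed by a GA driven by an ellipse-based fitness function), so the two sketches below illustrate one possible reading of it. They are not the authors' implementation: OpenCV and NumPy, the ROI boxes, the threshold values, the GA settings, and the tolerance in the fitness function are all assumptions chosen for illustration, and the function names (preprocess_roi, ellipse_fitness, fit_eye_ellipse) are hypothetical.

import cv2

def preprocess_roi(gray_roi, blur_ksize=5, edge_threshold=60):
    # Histogram equalization evens out uneven lighting before edge detection
    equalized = cv2.equalizeHist(gray_roi)
    smoothed = cv2.GaussianBlur(equalized, (blur_ksize, blur_ksize), 0)
    # Sobel gradients in x and y, combined into an edge-magnitude image
    gx = cv2.Sobel(smoothed, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(smoothed, cv2.CV_64F, 0, 1, ksize=3)
    edges = cv2.convertScaleAbs(cv2.magnitude(gx, gy))
    # Each ROI gets its own threshold, as described in the abstract
    _, binary = cv2.threshold(edges, edge_threshold, 255, cv2.THRESH_BINARY)
    return binary

image = cv2.imread("face.png")                  # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
eye_edges = preprocess_roi(gray[100:160, 80:220], edge_threshold=60)  # eye ROI box is illustrative
lip_edges = preprocess_roi(gray[220:290, 90:210], edge_threshold=40)  # lip ROI box is illustrative

With a binary eye-edge image in hand, a simple GA can search the five ellipse parameters (centre, semi-axes, rotation) for the candidate that best explains the edge pixels, in the spirit of the ellipse-based fitness function described in the abstract.

import numpy as np

rng = np.random.default_rng(0)

def ellipse_fitness(params, points, tol=0.15):
    # Fraction of edge points lying close to the candidate ellipse boundary
    cx, cy, a, b, theta = params
    x, y = points[:, 0] - cx, points[:, 1] - cy
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotate into the ellipse frame
    yr = -x * np.sin(theta) + y * np.cos(theta)
    value = (xr / a) ** 2 + (yr / b) ** 2        # equals 1.0 exactly on the boundary
    return np.mean(np.abs(value - 1.0) < tol)

def fit_eye_ellipse(edge_image, generations=100, pop_size=60, sigma=0.05):
    ys, xs = np.nonzero(edge_image)
    points = np.column_stack([xs, ys]).astype(float)
    h, w = edge_image.shape
    lo = np.array([0.0, 0.0, 2.0, 1.0, -np.pi / 4])   # bounds on (cx, cy, a, b, theta)
    hi = np.array([float(w), float(h), w / 2.0, h / 2.0, np.pi / 4])
    pop = rng.uniform(lo, hi, size=(pop_size, 5))
    for _ in range(generations):
        fit = np.array([ellipse_fitness(ind, points) for ind in pop])
        best = pop[np.argmax(fit)].copy()
        # Tournament selection: keep the fitter of two randomly chosen individuals
        a_idx, b_idx = rng.integers(pop_size, size=(2, pop_size))
        parents = np.where((fit[a_idx] > fit[b_idx])[:, None], pop[a_idx], pop[b_idx])
        # Uniform crossover with a shifted copy of the parents, then Gaussian mutation
        mask = rng.random((pop_size, 5)) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        children += rng.normal(0.0, sigma, children.shape) * (hi - lo)
        pop = np.clip(children, lo, hi)
        pop[0] = best                            # elitism: keep the best ellipse found so far
    fit = np.array([ellipse_fitness(ind, points) for ind in pop])
    return pop[np.argmax(fit)]                   # best (cx, cy, a, b, theta)

Calling fit_eye_ellipse(eye_edges) would return one set of ellipse parameters for that expression; in the paper it is these per-emotion ellipse characteristics of the eye that distinguish the emotions of each subject.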

