Please use this identifier to cite or link to this item: http://dspace.unimap.edu.my:80/xmlui/handle/123456789/10306
Full metadata record
DC Field | Value | Language
dc.contributor.author | M., Satiyan | -
dc.contributor.author | M., Hariharan | -
dc.contributor.author | Ramachandran, Nagarajan, Prof. Dr. | -
dc.date.accessioned | 2010-11-25T06:15:28Z | -
dc.date.available | 2010-11-25T06:15:28Z | -
dc.date.issued | 2010-05-21 | -
dc.identifier.isbn | 978-1-4244-7122-5 | -
dc.identifier.uri | http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5545262 | -
dc.identifier.uri | http://dspace.unimap.edu.my/123456789/10306 | -
dc.description | Link to publisher's homepage at http://ieeexplore.ieee.org/ | en_US
dc.description.abstract | This paper investigates the performance of the Daubechies wavelet family in recognizing facial expressions. A set of luminance stickers was fixed on the subject's face, and the subject was instructed to perform the required facial expressions while being recorded on video. A set of 2D coordinate values was obtained by tracking the movements of the stickers in the video using tracking software. The Daubechies wavelet transform with different orders (db1 to db20) was performed on the obtained data, and the standard deviation was derived from the wavelet approximation coefficients for each Daubechies wavelet order. This standard deviation was used as the input to a neural network for classifying 8 facial expressions. | en_US
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US
dc.relation.ispartofseries | Proceedings of the 6th International Colloquium on Signal Processing and Its Applications (CSPA) 2010 | en_US
dc.subject | Daubechies wavelet transform | en_US
dc.subject | Artificial neural network | en_US
dc.subject | Facial expression recognition | en_US
dc.title | Comparison of performance using Daubechies Wavelet family for facial expression recognition | en_US
dc.type | Working Paper | en_US
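The feature-extraction step described in the abstract (wavelet-transform the tracked sticker trajectories, then take the standard deviation of the approximation coefficients) can be sketched as follows. This is a minimal illustration, not code from the paper: the function names and array shapes are hypothetical, and only the db1 (Haar) order is implemented by hand; the other orders (db2 to db20) would in practice come from a wavelet library such as PyWavelets (`pywt.wavedec`).

```python
import numpy as np

def haar_approx(signal):
    """One level of the db1 (Haar) DWT: scaled pairwise averages.
    Stands in for the paper's Daubechies approximation coefficients."""
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:                      # pad odd-length signals
        x = np.append(x, x[-1])
    return (x[0::2] + x[1::2]) / np.sqrt(2.0)

def sticker_features(tracks):
    """tracks: array of shape (n_stickers, n_frames, 2) holding the
    (x, y) coordinates of each tracked sticker per video frame.
    Returns one std-dev feature per sticker per coordinate axis,
    suitable as input to a neural-network classifier."""
    feats = []
    for sticker in tracks:              # shape (n_frames, 2)
        for coord in sticker.T:         # x trajectory, then y trajectory
            feats.append(np.std(haar_approx(coord)))
    return np.array(feats)

# Hypothetical example: 3 stickers tracked over 64 frames
rng = np.random.default_rng(0)
tracks = rng.normal(size=(3, 64, 2))
features = sticker_features(tracks)
print(features.shape)                   # one feature per sticker per axis
```

Repeating this with each Daubechies order and comparing the resulting classification accuracy is what the paper's title refers to as comparing performance across the wavelet family.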
Appears in Collections: Conference Papers
Ramachandran, Nagarajan, Prof. Dr.
Hariharan Muthusamy, Dr.



Items in UniMAP Library Digital Repository are protected by copyright, with all rights reserved, unless otherwise indicated.