dc.contributor.author: Bashir, Mohammed Ghandi
dc.date.accessioned: 2014-01-10T07:26:41Z
dc.date.available: 2014-01-10T07:26:41Z
dc.date.issued: 2012
dc.identifier.uri: http://dspace.unimap.edu.my/123456789/31081
dc.description.abstract: Automatic facial emotion recognition has become a very active research area in recent years. The reason for this interest is that the traditional means of interacting with computers, namely the keyboard, mouse and screen, have not kept pace with the advancement of computer technology and its widening use in everyday tasks. For better human-computer interaction, future computers need the intelligence to initiate appropriate actions on their own rather than waiting for explicit commands from users. To achieve this, computers must be able to perceive facial emotions, which are the primary means through which humans express their state of mind and provide cues and explanations during interactions. While many different approaches to facial emotion recognition have been proposed, most involve computationally expensive image processing techniques, making them unsuitable for real-time applications. In this thesis, Facial Points (FPs) specified on the face of a subject are themselves proposed as the features to be analyzed for identifying emotions. This approach, combined with the Lucas-Kanade optical flow algorithm used to track the positions of the FPs at run time, eliminates the need for pre-processing and cuts down the computational time required to extract features from image sequences of the face. A novel algorithm, Guided Particle Swarm Optimization (GPSO), is proposed to analyze the run-time positions of the FPs and recognize the emotion expressed by the subject. Real-time emotion recognition software is then developed using the GPSO algorithm as the classifier. The software takes a live video stream of the subject as input; once the FPs are specified on the stream, it classifies the emotion expressed by the subject in each frame and instantaneously displays the result. The performance of the software was evaluated by testing it with 25 subjects of different ethnic backgrounds; it correctly recognized the six basic emotions, namely happy, sad, surprise, disgust, fear and anger, achieving an average recognition success rate of 86.17% at an average processing rate of 31.58 frames per second. To further evaluate the GPSO method, two other AI techniques were explored, namely the Backpropagation Neural Network (BPNN) and the Genetic Algorithm (GA). The emotion recognition software was implemented using each of these methods and tested with the same data from the 25 subjects; the GPSO method was found to be the best overall. Finally, to investigate the feasibility of embedding the software in an autonomous robot that can recognize emotions and possibly offer services to the elderly and the disabled, the software was integrated with a humanoid robot. In this arrangement, the wireless camera on the head of the robot captures the subject's face and transmits the video stream to the software; the software identifies the emotion in real time and transmits the result to the robot, which then performs pre-programmed actions corresponding to the recognized emotion. The software was found to perform well in this scenario as well. [en_US]
dc.language.iso: en [en_US]
dc.publisher: Universiti Malaysia Perlis (UniMAP) [en_US]
dc.subject: Automatic facial emotion recognition [en_US]
dc.subject: Facial Points (FPs) [en_US]
dc.subject: Face recognition [en_US]
dc.subject: Guided Particle Swarm Optimization (GPSO) [en_US]
dc.subject: Artificial intelligence (AI) [en_US]
dc.title: Development of facial emotion recognition system using a modified particle swarm optimization technique [en_US]
dc.type: Thesis [en_US]
dc.publisher.department: School of Mechatronic Engineering [en_US]
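
The abstract describes a pipeline in which manually specified Facial Points (FPs) are tracked across a live video stream with the Lucas-Kanade optical flow algorithm, and their per-frame positions are then classified into one of six emotions. The sketch below illustrates only the tracking half of that pipeline, assuming OpenCV's pyramidal Lucas-Kanade implementation (cv2.calcOpticalFlowPyrLK). The point coordinates, window parameters and the classify_emotion placeholder are illustrative assumptions; they do not reproduce the thesis's GPSO classifier, which is kept abstract here.

import cv2
import numpy as np

# Hypothetical stand-in for the thesis's GPSO classifier: it would map the
# tracked FP coordinates of the current frame to one of the six basic emotions.
def classify_emotion(points):
    return "unknown"

cap = cv2.VideoCapture(0)  # live video stream of the subject
ok, frame = cap.read()
if not ok:
    raise RuntimeError("could not read from the camera")
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Facial Points specified on the first frame (e.g. eyebrow, eye and mouth
# corners). The coordinates below are purely illustrative.
fps_points = np.array([[120, 80], [160, 80], [140, 120], [110, 160], [170, 160]],
                      dtype=np.float32).reshape(-1, 1, 2)

lk_params = dict(winSize=(21, 21), maxLevel=3,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

while True:
    ok, frame = cap.read()
    if not ok or len(fps_points) == 0:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pyramidal Lucas-Kanade optical flow: estimate each FP's new position in
    # the current frame from its position in the previous frame.
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, fps_points, None, **lk_params)
    fps_points = new_points[status.flatten() == 1].reshape(-1, 1, 2)
    prev_gray = gray

    emotion = classify_emotion(fps_points.reshape(-1, 2))

    # Overlay the tracked points and the classification result on the frame.
    for x, y in fps_points.reshape(-1, 2):
        cv2.circle(frame, (int(x), int(y)), 3, (0, 255, 0), -1)
    cv2.putText(frame, emotion, (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
    cv2.imshow("FP tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # press Esc to stop
        break

cap.release()
cv2.destroyAllWindows()

Because only a handful of points are tracked per frame and no image pre-processing is performed, this kind of loop can comfortably run at the real-time frame rates reported in the abstract; the classification step then operates on a small vector of point coordinates rather than on the image itself.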

