
dc.contributor.author: Mohammed Elzaroug, Alshrief
dc.date.accessioned: 2019-04-10T04:03:20Z
dc.date.available: 2019-04-10T04:03:20Z
dc.date.issued: 2014
dc.identifier.uri: http://dspace.unimap.edu.my:80/xmlui/handle/123456789/59422
dc.description.abstract: Multimodal biometric systems that integrate biometric traits from several modalities can overcome the limitations of single-modal biometrics. Fusing information at an intermediate stage, by consolidating the evidence provided by the different traits, can give better results because of the richness of the information available at this stage. In this thesis, information fusion at the matching score level is used to integrate face and palm-print modalities. Three matching score fusion rules are used: the sum, product and minimum rules. A linear statistical projection method based on principal component analysis (PCA) is used to capture the important information and reduce the dimensionality of the feature space. Fusion is performed on the matching scores computed with a Euclidean distance classifier. Experiments are conducted on the benchmark ORL face and PolyU palm-print datasets to examine the recognition rate of the proposed technique. The best recognition rate of 98.96% is achieved using the sum rule fusion method. The recognition rate can also be improved by increasing the number of training images and the number of PCA coefficients. [en_US]
dc.language.iso: en [en_US]
dc.publisher: Universiti Malaysia Perlis (UniMAP) [en_US]
dc.subject: Multimodal biometric systems [en_US]
dc.subject: Biometric [en_US]
dc.subject: Fusion [en_US]
dc.subject: Palm-print [en_US]
dc.title: Information fusion of face and palm-print multimodal biometric at matching score level [en_US]
dc.type: Thesis [en_US]
dc.contributor.advisor: Dr. Muhammad Imran Ahmad [en_US]
dc.publisher.department: School of Computer and Communication [en_US]
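
The abstract above describes score-level fusion only at a high level. The sketch below is a minimal illustration, not the thesis code, of fusing face and palm-print matching scores with the sum, product and minimum rules. The helper names, the min-max normalisation step, and the toy distance values are assumptions added for illustration; matching scores are taken to be Euclidean distances between PCA-projected probe and gallery features, as the abstract indicates.

import numpy as np

def euclidean_scores(probe, gallery):
    # Distance from one probe feature vector (d,) to each gallery template (N, d).
    return np.linalg.norm(gallery - probe, axis=1)

def to_similarity(distances):
    # Min-max normalise distances to [0, 1] and flip so larger means a better match.
    # (Normalisation before fusion is an assumption, not stated in the record.)
    d = np.asarray(distances, dtype=float)
    d = (d - d.min()) / (d.max() - d.min() + 1e-12)
    return 1.0 - d

def fuse(face_distances, palm_distances, rule="sum"):
    # Combine per-identity similarity scores from the two modalities.
    f = to_similarity(face_distances)
    p = to_similarity(palm_distances)
    if rule == "sum":
        return f + p
    if rule == "product":
        return f * p
    if rule == "min":
        return np.minimum(f, p)
    raise ValueError(f"unknown fusion rule: {rule}")

# Toy example: hypothetical Euclidean distances for four enrolled identities.
face_dist = np.array([12.3, 4.1, 9.8, 15.0])
palm_dist = np.array([30.2, 11.7, 25.4, 40.1])

for rule in ("sum", "product", "min"):
    fused = fuse(face_dist, palm_dist, rule)
    print(rule, "rule -> best match: identity", int(np.argmax(fused)))

Per the abstract, the sum rule gave the best recognition rate (98.96%) on the ORL face and PolyU palm-print datasets.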

