Please use this identifier to cite or link to this item: http://dspace.unimap.edu.my:80/xmlui/handle/123456789/42106
Full metadata record
DC Field | Value | Language
dc.contributor.author | Siti Nurhawa, Hassan | -
dc.date.accessioned | 2016-06-17T08:35:18Z | -
dc.date.available | 2016-06-17T08:35:18Z | -
dc.date.issued | 2015-05 | -
dc.identifier.uri | http://dspace.unimap.edu.my:80/xmlui/handle/123456789/42106 | -
dc.description | Access is limited to the UniMAP community. | en_US
dc.description.abstract | Building speech recognition systems is an active area of research. Speech has long been central to human communication, and we depend on recognizable sounds for everyday interaction. In this project, spoken commands are recognized and executed by the system. The project consists of two phases: the first extracts the formant frequencies associated with the input voice, and the second performs pattern classification using Artificial Neural Networks. | en_US
dc.language.iso | en | en_US
dc.publisher | Universiti Malaysia Perlis (UniMAP) | en_US
dc.subject | Speech recognition | en_US
dc.subject | Artificial Neural Network | en_US
dc.subject | Speech recognition -- Design and construction | en_US
dc.subject | Speech recognition systems | en_US
dc.title | Speech recognition using artificial neural network | en_US
dc.type | Learning Object | en_US
dc.contributor.advisor | Dr. Md Mijanur Rahman | en_US
dc.publisher.department | School of Computer and Communication Engineering | en_US
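
The abstract's two-phase pipeline (formant-frequency extraction, then Artificial Neural Network classification) could be sketched roughly as below. This is a minimal illustration only, assuming LPC-based formant estimation and a small NumPy multi-layer perceptron; the function names, network size, and training data are all hypothetical and do not come from the thesis itself.

```python
import numpy as np

def lpc(signal, order):
    """LPC coefficients via the autocorrelation method (Levinson-Durbin)."""
    x = signal * np.hamming(len(signal))
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]
    a, e = [1.0], r[0]
    for i in range(1, order + 1):
        acc = r[i] + sum(a[j] * r[i - j] for j in range(1, i))
        k = -acc / e
        a = [1.0] + [a[j] + k * a[i - j] for j in range(1, i)] + [k]
        e *= 1.0 - k * k
    return np.array(a)

def formant_freqs(signal, sr, order=8):
    """Phase 1: formants = angles of the strong complex poles of the LPC model."""
    roots = np.roots(lpc(signal, order))
    return sorted(np.angle(z) * sr / (2 * np.pi)
                  for z in roots if z.imag > 0 and abs(z) > 0.9)

# Synthetic "vowel" with resonances near 700 Hz and 1200 Hz
sr = 8000
t = np.arange(0, 0.05, 1.0 / sr)
sig = np.exp(-60 * t) * (np.sin(2*np.pi*700*t) + np.sin(2*np.pi*1200*t))
print(formant_freqs(sig, sr))  # the strong poles should sit near 700 and 1200 Hz

# Phase 2: a tiny MLP classifying commands from (hypothetical) formant vectors
rng = np.random.default_rng(0)
X = np.array([[0.70, 1.20], [0.72, 1.25],   # command 0: (F1, F2) in kHz
              [0.30, 2.30], [0.28, 2.20]])  # command 1
y = np.array([0, 0, 1, 1])
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 1, (8, 2)), np.zeros(2)
for _ in range(2000):                       # plain batch gradient descent
    h = np.tanh(X @ W1 + b1)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)       # softmax probabilities
    g = p.copy(); g[np.arange(len(y)), y] -= 1.0; g /= len(y)
    dh = (g @ W2.T) * (1.0 - h ** 2)        # backprop through tanh
    W2 -= 0.2 * (h.T @ g); b2 -= 0.2 * g.sum(axis=0)
    W1 -= 0.2 * (X.T @ dh); b1 -= 0.2 * dh.sum(axis=0)
pred = (np.tanh(X @ W1 + b1) @ W2 + b2).argmax(axis=1)
print(pred)  # the trained network should match y on the training set
```

A real system would extract formants frame by frame from recorded utterances and train the classifier on many examples per command; this condensed sketch only shows how the two phases connect.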
Appears in Collections:School of Computer and Communication Engineering (FYP)

Files in This Item:
File | Size | Format
Abstract,Acknowledgement.pdf | 346.14 kB | Adobe PDF
Introduction.pdf | 263.41 kB | Adobe PDF
Literature Review.pdf | 435.29 kB | Adobe PDF
Methodology.pdf | 715.32 kB | Adobe PDF
Results and Discussion.pdf | 174.59 kB | Adobe PDF
Conclusion and Recommendation.pdf | 185.58 kB | Adobe PDF
Refference and Appendics.pdf | 258.96 kB | Adobe PDF


Items in UniMAP Library Digital Repository are protected by copyright, with all rights reserved, unless otherwise indicated.