Please use this identifier to cite or link to this item:
http://dspace.unimap.edu.my:80/xmlui/handle/123456789/79028
Title: | Implementation of a cloud based embedded platform for object detection and recognition |
Authors: | Said Amirul Anwar, Ab.Hamid @ Ab Majib, Dr. |
Keywords: | Embedded computer systems; Single-board computers; Raspberry Pi (Computer); Computer vision |
Publisher: | Universiti Malaysia Perlis (UniMAP) |
Abstract: | With recent advancements in deep learning-based computer vision models, object detection and recognition applications such as video surveillance, bio-imaging, and autonomous cars are growing in number. Object detection techniques require large image datasets, substantial memory, and a GPU-equipped machine for training, and they have high power consumption. Embedded platforms are constrained in power, space, and energy resources, which makes deploying such algorithms on them difficult. To overcome these drawbacks, the detection algorithm (Faster R-CNN) is trained and tested with an image dataset obtained from ImageNet. The algorithm is implemented on a computer with MATLAB. An image acquisition device is set up using a Raspberry Pi and Pi camera to capture, process, and send images to the detector via the Dropbox cloud platform with Python; Dropbox serves as the interface between the Raspberry Pi and the remote detector. The detector was trained to locate five classes of objects, namely Broom, Fan, Keyboard, Mouse, and Television. The multi-class object detector was trained on 2500 images, with 500 still images per class, and tested on 500 still images. The system was tested in real time by capturing images on the Raspberry Pi and transmitting them to and from the detector over the internet in order to determine the process duration. Detector accuracy is measured using the average precision (AP) metric for each class and the mean average precision (mAP) over all classes. The multi-class object detector achieved a mAP of 0.67, and the entire procedure from image capture to final display executed in an average of 45 seconds. |
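The mAP figure quoted in the abstract is the arithmetic mean of the per-class average precision (AP) values. A minimal Python sketch of that calculation follows; the five per-class AP values are hypothetical (the thesis reports only the overall mAP of 0.67) and are chosen solely to illustrate how the mean is taken over the five classes:

```python
# Hypothetical per-class average precision (AP) values for the five
# object classes; only the overall mAP of 0.67 is reported in the thesis.
ap_per_class = {
    "Broom": 0.55,
    "Fan": 0.62,
    "Keyboard": 0.70,
    "Mouse": 0.73,
    "Television": 0.75,
}

# Mean average precision (mAP) = arithmetic mean of the per-class APs.
map_score = sum(ap_per_class.values()) / len(ap_per_class)
print(f"mAP = {map_score:.2f}")
```

In practice, each per-class AP would itself be computed from the precision-recall curve of the detector on that class's test images before being averaged this way.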
Description: | Master of Science in Embedded System Design Engineering |
URI: | http://dspace.unimap.edu.my:80/xmlui/handle/123456789/79028 |
Appears in Collections: | School of Computer and Communication Engineering (Theses) |
Files in This Item:
File | Description | Size | Format
---|---|---|---
Page 1-24.pdf | Access is limited to UniMAP community. | 471.97 kB | Adobe PDF
Full text.pdf | This item is protected by original copyright. | 2.46 MB | Adobe PDF
Olalekan.pdf | Declaration Form | 202.78 kB | Adobe PDF
Items in UniMAP Library Digital Repository are protected by copyright, with all rights reserved, unless otherwise indicated.