iRepository at Perpustakaan UniMAP > University Library > Conference Papers
Title: Estimation of mobile robot orientation using neural networks
Authors: Pandiyan, Paulraj Murugesa; R. Badlishah, Ahmad; Hema, Chengalvarayan Radhakrishnamurthy
Keywords: Mobile robot; Neural network; Orientation; Stereo imaging; Colloquium on Signal Processing & Its Applications (CSPA)
Issue Date: 6-Mar-2009
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Citation: pp. 42-46
Series/Report no.: Proceedings of the 5th International Colloquium on Signal Processing and Its Applications (CSPA) 2009
Abstract: Computing a mobile robot's position and orientation is a common task in computer vision and image processing. For a successful application, the position and orientation of the robot must be determined accurately. This paper presents a simple procedure for determining the orientation of a mobile robot using two cameras, which capture images of the robot at various orientations. Four simple neural network models are developed to associate the input features with the output (orientation). The first and second models estimate the orientation using features derived from the first and second camera, respectively. The third model estimates the orientation using features derived from both cameras, and the fourth using features derived from the combined images. Simulation results show that the proposed algorithm estimates the orientation of the mobile robot accurately.
Description: Link to publisher's homepage at http://ieeexplore.ieee.org/
Appears in Collections: Conference Papers
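The abstract's third model, which maps features from both cameras to a single orientation estimate, can be sketched as a small feed-forward regressor. The code below is a minimal illustration, not the authors' implementation: the feature dimensions, network size, training settings, and the synthetic stand-in data are all assumptions, and a plain one-hidden-layer numpy MLP stands in for whatever network architecture the paper actually uses.

```python
# Hypothetical sketch of the paper's third model: a tiny one-hidden-layer
# MLP (numpy) trained to map concatenated two-camera features to an
# orientation value. All shapes, hyperparameters, and data are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(X, y, hidden=8, lr=0.05, epochs=2000):
    """Train a 1-hidden-layer tanh MLP regressor by gradient descent."""
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)       # hidden activations
        pred = h @ W2 + b2             # linear output (orientation estimate)
        err = pred - y                 # residual
        # Backpropagate the mean-squared-error gradient.
        gW2 = h.T @ err / n
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h ** 2)
        gW1 = X.T @ dh / n
        gb1 = dh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    return W1, b1, W2, b2

def predict(params, X):
    W1, b1, W2, b2 = params
    return np.tanh(X @ W1 + b1) @ W2 + b2

# Synthetic stand-in data: 3 features per "camera"; the target orientation
# is an arbitrary smooth function of both feature sets (purely illustrative).
f1 = rng.uniform(-1, 1, size=(200, 3))   # features from camera 1
f2 = rng.uniform(-1, 1, size=(200, 3))   # features from camera 2
X = np.hstack([f1, f2])                  # concatenated, as in model three
y = np.sin(f1[:, :1]) + 0.5 * f2[:, 1:2]

params = train_mlp(X, y)
mse = float(np.mean((predict(params, X) - y) ** 2))
```

In this sketch the fit is judged by comparing the training mean-squared error against the variance of the target, a crude baseline corresponding to always predicting the mean orientation; the paper's own accuracy criterion is not stated in the abstract.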
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.