Investigation of Touch-Based Perception for Object Manipulation Using GloveMAP
Abstract
The use of technology keeps growing in response to public demand in daily life, and user-friendly tools help people handle their everyday tasks more effectively. This research concerns the design of a low-cost wired glove system called GloveMAP. The purpose of the project is to reinvent the 'Wired Glove' as a UniMAP version that provides functions similar to a conventional data glove. The system captures finger movements during grasping activities in order to investigate the force exerted on the fingertips while grasping objects of various types and weights. Force sensing resistors (FSR) are attached to the thumb, index and middle fingers of the glove to capture the voltage changes produced by the fingers during grasping. The output voltage signals are then converted to force signals by applying a polynomial regression equation (an illustrative sketch follows the abstract). Twenty weight samples were tested for each of three object shapes, cylindrical, rectangular and spherical, to investigate the relationship between weight, object shape and the force obtained from the grasping activities. Mathematical operations such as normalization, gradient and averaging were used to find the relationship between these three factors (also sketched below). Twenty human subjects carried out the experiment on five selected objects. The force data obtained from the experiments were then analyzed to extract features of the grasping force: the standard deviation and the area under the graph were selected as features from the force-versus-weight graphs plotted for all objects (see the feature-extraction sketch below). The classification method used in this research successfully recognized the objects from the grasping force data of the thumb, index and middle fingers. Combining the information from these three fingers makes the objects easier to recognize, and the classification rate for all objects is above 85%.
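The abstract states that FSR output voltages are converted to force by a polynomial regression equation. The Python sketch below shows one way such a calibration could be implemented; the calibration data, the polynomial degree and the example readings are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Illustrative calibration data (assumed): FSR output voltage (V) recorded
# while known reference forces (N) are applied to the fingertip sensor.
calib_voltage = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])   # volts
calib_force   = np.array([0.4, 1.1, 2.3, 3.9, 6.0, 8.6])   # newtons

# Fit a polynomial regression (degree 2 assumed here) mapping voltage -> force.
coeffs = np.polyfit(calib_voltage, calib_force, deg=2)
voltage_to_force = np.poly1d(coeffs)

def convert_grasp_signal(voltages):
    """Convert a stream of FSR voltages from one finger into force estimates."""
    return voltage_to_force(np.asarray(voltages, dtype=float))

# Example: convert readings from the thumb, index and middle finger sensors.
thumb  = convert_grasp_signal([1.2, 1.4, 1.6])
index  = convert_grasp_signal([1.0, 1.3, 1.5])
middle = convert_grasp_signal([0.9, 1.1, 1.2])
print(thumb, index, middle)
```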
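The abstract also mentions normalization, gradient and average operations used to relate weight, shape and grasping force. A minimal sketch follows, assuming min-max normalization and NumPy's finite-difference gradient; the paper does not specify the exact forms of these operations, and the force data here are synthetic.

```python
import numpy as np

# Assumed example: mean grasping force (N) for one object shape at twenty
# increasing weight samples (weight range is an assumption).
weights = np.linspace(50, 1000, 20)      # grams
forces  = 0.004 * weights + 0.3          # synthetic force curve (N)

# Normalization (min-max assumed) scales the force curve to [0, 1].
normalized = (forces - forces.min()) / (forces.max() - forces.min())

# Gradient approximates dF/dW, i.e. how quickly grasp force grows with weight.
gradient = np.gradient(forces, weights)

# Average force summarizes the overall grasp effort for this shape.
average = forces.mean()

print(normalized.round(2), gradient.mean().round(4), round(average, 2))
```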
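Finally, the abstract names the standard deviation and the area under the force-versus-weight graph as classification features. The sketch below shows how these two features might be computed and used; the nearest-centroid classifier and all data values are stand-in assumptions, since this section does not specify the classification method.

```python
import numpy as np

def grasp_features(weights, forces):
    """Two features named in the abstract for one finger: standard deviation
    of the force readings and the area under the force-versus-weight curve
    (trapezoidal rule)."""
    weights = np.asarray(weights, dtype=float)
    forces = np.asarray(forces, dtype=float)
    return np.array([forces.std(), np.trapz(forces, weights)])

# Assumed example data: forces (N) measured at increasing object weights (g)
# for one finger grasping two different object shapes.
weights        = np.array([100, 200, 300, 400, 500])
cylinder_force = np.array([1.1, 1.8, 2.6, 3.5, 4.2])
sphere_force   = np.array([0.9, 1.3, 1.9, 2.4, 3.0])

# Feature vectors could be concatenated across thumb, index and middle
# fingers; a single finger is shown here for brevity.
templates = {
    "cylinder": grasp_features(weights, cylinder_force),
    "sphere":   grasp_features(weights, sphere_force),
}

def classify(weights, forces):
    """Stand-in nearest-centroid classifier over the two features."""
    f = grasp_features(weights, forces)
    return min(templates, key=lambda k: np.linalg.norm(f - templates[k]))

print(classify(weights, [1.0, 1.7, 2.5, 3.4, 4.1]))  # expected: "cylinder"
```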