Sensing a Physical Object Gripping using Haptic Technology and Machine Learning Algorithms

Swati CHOWDHURI1, Swati BARUI2, Sagnik MONDAL3, Sourav MANDAL4, Biswarup NEOGI5
1 Department of EEE, Institute of Engineering & Management, Kolkata, India
2 Department of ECE, Narula Institute of Technology, Kolkata, India
3 Department of ECE, RCC Institute of Information Technology, Kolkata, India
4 School of CSE, XIM University, Bhubaneswar, India
5 Department of ECE, JIS College of Engineering, Nadia, India

Abstract: This study describes a new method for gripping and sensing a physical object (or material) with a prosthetic arm using haptic technologies, kinaesthetic communication, and machine learning. Haptic technology is used to determine whether an object is firm or soft, as if it were gripped by a human, and how much gripping force the object can withstand without being crushed. The bending moment is measured with a flex sensor worn on the human fingers, and the gripping force with a pressure sensor at the fingertips. Three types of objects (soft sponge, hard sponge, and plastic) are studied and tested in this work by pressing them with varying gripping pressures (soft, firm, and firmer). In addition, a model (the Haptic Intelligence Recorder arm) is proposed that can predict the object type and gripping force from the recorded intelligence data. The main goal is to train the prosthetic hand to grip various objects with varying finger pressures, much as a human hand does naturally. Finally, a glove is created, tailored to the intelligent arm's ability to predict how to grasp objects.
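The classification step the abstract describes, predicting object type from flex- and pressure-sensor readings, can be sketched as a minimal nearest-neighbour classifier. The sensor values, the (flex, pressure) feature pairing, and the 1-NN choice below are illustrative assumptions for this sketch, not the paper's actual recorded data or trained model.

```python
import math

# Hypothetical recorded samples: (flex_bend, fingertip_pressure) in
# normalised units. Assumption: a soft object lets the finger curl further
# (high flex) at low pressure, while a rigid object resists (low flex,
# high pressure). These numbers are invented for illustration.
TRAINING = [
    ((0.90, 0.10), "soft sponge"),
    ((0.85, 0.15), "soft sponge"),
    ((0.55, 0.50), "hard sponge"),
    ((0.50, 0.45), "hard sponge"),
    ((0.20, 0.90), "plastic"),
    ((0.15, 0.85), "plastic"),
]

def classify(flex: float, pressure: float) -> str:
    """Predict the object type by the nearest recorded sensor sample."""
    return min(TRAINING, key=lambda s: math.dist(s[0], (flex, pressure)))[1]
```

A reading with high finger bend and low fingertip pressure, e.g. `classify(0.88, 0.12)`, would fall nearest the soft-sponge samples; in practice a model trained on the recorded intelligence data would replace this toy lookup.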

Keywords: Haptic technology, intelligent recorder arm, flex sensor, pressure sensor, machine learning.


Swati CHOWDHURI, Swati BARUI, Sagnik MONDAL, Sourav MANDAL, Biswarup NEOGI, Sensing a Physical Object Gripping using Haptic Technology and Machine Learning Algorithms, Romanian Journal of Information Technology and Automatic Control, ISSN 1220-1758, vol. 32(2), pp. 93-104, 2022.