Tactile Grasp Stability Classification Based on Graph Convolutional Networks

2021 IEEE International Conference on Real-time Computing and Robotics (RCAR)

Abstract
One of the challenges in robotic grasping of unknown objects is predicting, at the start of a grasp, whether the object will be dropped. Evaluating the robotic grasp state accurately and efficiently is therefore a key step toward addressing this issue. In this paper, we propose two novel Graph Convolutional Network (GCN) methods for robotic grasp stability classification, each built on a different way of fusing multi-sensor tactile signals: GCN based on data-level fusion (GCN-DF) and GCN based on feature-level fusion (GCN-FF). We explore the optimal parameters for transforming the sensor signals into a graph structure, and we verify the effectiveness of the proposed methods on the BioTac Grasp Stability (BiGS) dataset. The experimental results show that the proposed approaches achieve higher classification accuracy than Support Vector Machine (SVM) and Long Short-Term Memory (LSTM) baselines.
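The abstract describes converting multi-sensor tactile signals into a graph and classifying grasp stability with a GCN, but gives no implementation details. The following is a minimal sketch of one plausible realization, not the authors' method: it assumes each BioTac electrode is a graph node whose feature vector is its signal over a fixed time window, a fully connected edge set, and a two-layer GCN with global pooling implemented in PyTorch Geometric. All sizes and the graph construction are illustrative assumptions.

    # Minimal sketch (assumptions, not the authors' released code): tactile
    # signals -> graph -> GCN-based stable/unstable classification.
    import torch
    import torch.nn.functional as F
    from torch_geometric.data import Data
    from torch_geometric.nn import GCNConv, global_mean_pool


    def signals_to_graph(signals: torch.Tensor) -> Data:
        """signals: (num_electrodes, window_len) readings -> one graph.

        Assumption: one node per electrode, node features are the raw
        signal window, and the graph is fully connected (no self-loops).
        """
        n = signals.size(0)
        src, dst = zip(*[(i, j) for i in range(n) for j in range(n) if i != j])
        edge_index = torch.tensor([src, dst], dtype=torch.long)
        return Data(x=signals, edge_index=edge_index)


    class GraspStabilityGCN(torch.nn.Module):
        """Two GCN layers + global mean pooling + binary classification head."""

        def __init__(self, window_len: int, hidden: int = 64):
            super().__init__()
            self.conv1 = GCNConv(window_len, hidden)
            self.conv2 = GCNConv(hidden, hidden)
            self.head = torch.nn.Linear(hidden, 2)  # stable vs. unstable

        def forward(self, data: Data) -> torch.Tensor:
            x = F.relu(self.conv1(data.x, data.edge_index))
            x = F.relu(self.conv2(x, data.edge_index))
            batch = getattr(data, "batch", None)
            if batch is None:  # single-graph case
                batch = torch.zeros(x.size(0), dtype=torch.long)
            x = global_mean_pool(x, batch)  # one embedding per grasp
            return self.head(x)             # class logits

    # Usage example: 19 BioTac electrodes, 50-sample window (illustrative sizes).
    graph = signals_to_graph(torch.randn(19, 50))
    logits = GraspStabilityGCN(window_len=50)(graph)

In this sketch the node features already stack raw signals from the sensors, which loosely corresponds to data-level fusion (GCN-DF); a feature-level variant (GCN-FF) would instead encode each sensor separately and fuse the learned embeddings before classification.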
Key words
Robotic Grasping, Stability Classification, Tactile Sensors, Graph Convolutional Network