Vision-based tactile intelligence with soft robotic metamaterial

Tianyu Wu, Yujian Dong, Xiaobo Liu, Xudong Han, Yang Xiao, Jinqi Wei, Fang Wan, Chaoyang Song

Materials & Design (2024)

Abstract
Robotic metamaterials represent an innovative approach to creating synthetic structures that combine desired material characteristics with embodied intelligence, blurring the boundaries between materials and machinery. Inspired by the functional qualities of biological skin, integrating tactile intelligence into these materials has gained significant interest for research and practical applications. This study introduces a Soft Robotic Metamaterial (SRM) design featuring omnidirectional adaptability and superior tactile sensing, combining vision-based motion tracking and machine learning. The study compares two sensory integration methods against a baseline of a state-of-the-art motion tracking system and a force/torque sensor: an internal-vision design with high frame rates and an external-vision design offering cost-effectiveness. The results show that the internal-vision SRM design achieves a tactile accuracy of 98.96%, enabling soft and adaptive tactile interactions that are especially beneficial for dexterous robotic grasping. The external-vision design offers similar performance at a reduced cost and can be adapted for portability, enhancing material science education and robotic learning. This research significantly advances tactile sensing using vision-based motion tracking in soft robotic metamaterials, and the open-source availability on GitHub fosters collaboration and further exploration of this technology (https://github.com/bionicdl-sustech/SoftRoboticTongs).
Key words
Soft robotic metamaterials, Vision-based tactile sensing, Machine learning