
Autonomous Vision-Guided Two-Arm Collaborative Microassembly Using Learned Manipulation Model

IEEE ROBOTICS AND AUTOMATION LETTERS (2024)

Abstract
This letter presents an integrated micromanipulation system suited for precise assembly of micro-parts in intricate environments. By embedding the Global Attention Mechanism (GAM) into YOLOv8, the system not only improves detection performance but also accurately identifies target keypoints, from which it recovers the center position and rotation angle of the target. Building on this, we introduce a learning-driven pushing assembly method that harnesses a regression model trained on historical micromanipulation data. This model captures the nuanced dynamics of micro-components, addressing challenges such as frictional contact and other uncertain factors that are typically difficult to model analytically. Our experimental suite, executed in complex settings, covers transportation and assembly tasks using dual micro-manipulators. The system's hallmark is its synthesis of YOLOv8-driven keypoint recognition with an advanced pushing manipulation mechanism. Its design and methodologies can be adapted to assembling micro-parts of diverse geometric configurations, especially in sophisticated biological settings such as manipulating biological cells.
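The two components described in the abstract, keypoint-based pose recovery and a regression model learned from push data, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `tip`/`tail` landmark names, the two push parameters, and the synthetic training data are all assumptions standing in for the YOLOv8 keypoints and historical manipulation records the authors actually use.

```python
import math
import numpy as np

def pose_from_keypoints(tip, tail):
    """Recover a part's center and in-plane rotation angle (degrees)
    from two detected keypoints. The 'tip'/'tail' landmark names are
    illustrative; the paper's exact keypoint set is not specified here."""
    cx = (tip[0] + tail[0]) / 2.0
    cy = (tip[1] + tail[1]) / 2.0
    angle = math.degrees(math.atan2(tip[1] - tail[1], tip[0] - tail[0]))
    return (cx, cy), angle

# Learning-driven pushing model, sketched as a least-squares regression
# from push parameters to the part's resulting motion. The data here are
# synthetic stand-ins for historical micromanipulation records.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))           # [push length, contact offset]
true_W = np.array([[0.9, 0.1], [0.2, 1.5]])        # hidden contact dynamics
Y = X @ true_W + 0.01 * rng.normal(size=(200, 2))  # [translation, rotation]
W, *_ = np.linalg.lstsq(X, Y, rcond=None)          # fitted manipulation model
```

Once fitted, `W` predicts the motion a candidate push would produce, so a planner could search over push parameters to drive the part toward its assembly goal; the real system would additionally handle the frictional and uncertain effects the abstract mentions.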
Key words
Microassembly, Probes, Task analysis, End effectors, Sensors, Micromanipulators, Collaboration, Learning model for control, recognition tracking, micro/nano robots, micromanipulation