Multimodal Data-Driven Robot Control for Human-Robot Collaborative Assembly

Journal of Manufacturing Science and Engineering, Transactions of the ASME (2022)

Cited by 11
Abstract
In human-robot collaborative assembly, leveraging multimodal commands for intuitive robot control remains a challenge, from command translation to efficient collaborative operation. This paper investigates multimodal data-driven robot control for human-robot collaborative assembly. Building on function blocks, a programming-free human-robot interface is designed to fuse multimodal human commands and accurately trigger defined robot control modalities. Deep learning is explored to develop a command classification system for low-latency, high-accuracy robot control, in which a spatial-temporal graph convolutional network is developed to translate brainwave command phrases into robot commands reliably and accurately. Multimodal data-driven high-level robot control during assembly is then facilitated by event-driven function blocks: the high-level commands serve as triggering events for the execution of algorithms for fine robot manipulation and assembly-feature-based collaborative assembly. Finally, a partial car engine assembly deployed to a robot team is chosen as a case study to demonstrate the effectiveness of the developed system.
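The abstract's central technical step is translating brainwave (EEG) command phrases into discrete robot commands with a spatial-temporal graph convolutional network. The sketch below illustrates that idea in PyTorch under stated assumptions: the electrode adjacency, layer sizes, command set, and all names (`STGCNBlock`, `BrainCommandClassifier`) are illustrative, not the paper's actual architecture or parameters.

```python
# Minimal sketch: classify EEG command phrases into robot commands with a
# spatial-temporal graph convolutional network. All sizes, the electrode
# graph, and the command count are illustrative assumptions.
import torch
import torch.nn as nn

class STGCNBlock(nn.Module):
    """One spatial-temporal unit: graph convolution over EEG electrodes
    (spatial), then 1-D convolution along the time axis (temporal)."""
    def __init__(self, in_ch, out_ch, adj):
        super().__init__()
        self.register_buffer("adj", adj)       # normalized electrode graph
        self.spatial = nn.Linear(in_ch, out_ch)  # per-node feature map
        self.temporal = nn.Conv2d(out_ch, out_ch, kernel_size=(9, 1),
                                  padding=(4, 0))
        self.relu = nn.ReLU()

    def forward(self, x):                      # x: (batch, C, T, nodes)
        x = x.permute(0, 2, 3, 1)              # -> (batch, T, nodes, C)
        x = torch.einsum("nm,btmc->btnc", self.adj, x)  # mix neighbor nodes
        x = self.spatial(x)                    # map node features
        x = x.permute(0, 3, 1, 2)              # -> (batch, C', T, nodes)
        return self.relu(self.temporal(x))     # convolve along time

class BrainCommandClassifier(nn.Module):
    def __init__(self, adj, n_channels=1, n_classes=6):
        super().__init__()
        self.blocks = nn.Sequential(STGCNBlock(n_channels, 32, adj),
                                    STGCNBlock(32, 64, adj))
        self.head = nn.Linear(64, n_classes)   # e.g. pick/place/stop/...

    def forward(self, x):                      # x: (batch, 1, T, electrodes)
        x = self.blocks(x)
        x = x.mean(dim=(2, 3))                 # pool over time and nodes
        return self.head(x)                    # logits per robot command

# Toy usage: 14 electrodes, 2 s of 128 Hz EEG, chain-graph adjacency.
n_nodes = 14
adj = (torch.eye(n_nodes)
       + torch.diag(torch.ones(n_nodes - 1), 1)
       + torch.diag(torch.ones(n_nodes - 1), -1))
adj = adj / adj.sum(dim=1, keepdim=True)       # row-normalize
model = BrainCommandClassifier(adj)
logits = model(torch.randn(8, 1, 256, n_nodes))
print(logits.shape)                            # torch.Size([8, 6])
```

In the pipeline the abstract describes, the predicted high-level command would then act as a triggering event for an event-driven function block that executes the corresponding fine-manipulation or assembly algorithm.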
Keywords
robot, assembly, multimodal data, human-robot collaboration, brain robotics