Semantic-Aware Vision-Assisted Integrated Sensing and Communication: Architecture and Resource Allocation

IEEE Wireless Communications (2024)

Abstract
Many intelligent (mobile) applications are driven by real-time environmental information, which may be unavailable at the core network and is challenging to transmit given limited spectrum resources. This article proposes an innovative architecture, referred to as semantic-aware, vision-assisted integrated sensing and communication (SA-VA-ISAC), to enable real-time environmental information collection and transmission by integrating emerging paradigms and key technologies, including computer vision (CV), ISAC, mobile edge computing (MEC), semantic communications, and beamforming. First, CV and ISAC are employed to capture abundant environmental information, which is then aggregated at an MEC server. Second, semantic communications compress the information to satisfy stringent reliability and latency requirements, and beamforming provides high-quality wireless coverage. To facilitate resource allocation in the proposed architecture, deep learning (DL) is adopted for environmental information collection and aggregation, semantic encoder/decoder design, and beamforming design. Numerical results demonstrate the advantages of the proposed architecture and the DL-based resource allocation schemes.
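To make the DL-based components of the abstract concrete, the sketch below pairs a toy semantic encoder/decoder (a small autoencoder that compresses an environmental feature vector into a few semantic symbols before transmission) with a maximum-ratio-transmission (MRT) beamformer. This is a minimal illustration, not the paper's actual design: the layer sizes, feature dimension, and the choice of MRT as the beamforming rule are assumptions made for the example.

```python
# Minimal sketch (assumed, not from the paper) of a DL-based semantic
# encoder/decoder plus a simple MRT beamformer. Dimensions are illustrative.
import torch
import torch.nn as nn

class SemanticEncoder(nn.Module):
    def __init__(self, in_dim=128, latent_dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, latent_dim),   # compressed semantic symbols
        )
    def forward(self, x):
        return self.net(x)

class SemanticDecoder(nn.Module):
    def __init__(self, latent_dim=16, out_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, out_dim),      # reconstructed environmental features
        )
    def forward(self, z):
        return self.net(z)

def mrt_beamformer(h: torch.Tensor) -> torch.Tensor:
    """Maximum-ratio-transmission beamformer for a single-user MISO link:
    align the transmit vector with the conjugate channel, unit power."""
    return h.conj() / torch.linalg.vector_norm(h)

# Toy end-to-end semantic reconstruction objective (channel noise not modeled).
enc, dec = SemanticEncoder(), SemanticDecoder()
x = torch.randn(32, 128)                       # batch of environmental feature vectors
loss = nn.functional.mse_loss(dec(enc(x)), x)  # train encoder/decoder jointly
loss.backward()
```

In practice the encoder/decoder and the beamformer would be trained or optimized jointly against the channel and latency constraints; the separate MRT rule here only stands in for the beamforming design step named in the abstract.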
Keywords
Semantics, Symbols, Servers, Real-time systems, Resource management, Wireless communication, Sensors