Multi-Channel Nonlinearity Mitigation Using Machine Learning Algorithms.

IEEE Trans. Mob. Comput. (2024)

Abstract
This paper investigates multi-channel machine learning (ML) techniques in the presence of receiver nonlinearities and noise, and compares the results with those of a single-channel receiver architecture. It is known that the multi-channel architecture relaxes the sampling-speed requirement of analog-to-digital conversion and provides significant robustness to clock jitter and front-end noise owing to the bandwidth-splitting property inherent in these receivers. However, when a high-voltage-swing signal is used in a wireline communication link, the received signal suffers from third-order harmonic distortion and intermodulation products caused by the nonlinearity profile of the analog front-end (AFE). To this end, this paper proposes the channel decision passing (CDP) algorithm in combination with nonlinear feedback cancellation as a low-complexity candidate for nonlinearity mitigation and compares the performance of this solution with other well-known ML algorithms. Simulation results show significant improvement in a multi-channel receiver architecture equipped with nonlinear feedback cancellation and CDP over its single-channel counterpart under practical nonlinearity profiles and noise conditions.
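
To illustrate the kind of impairment the abstract describes, the sketch below models a memoryless third-order AFE nonlinearity acting on a PAM-4 wireline signal and cancels it with a simple decision-feedback-style correction. This is a minimal toy example, not the paper's CDP algorithm: the channel taps, the coefficient a3, and the two-pass decision logic are assumptions made purely for illustration.

# Illustrative sketch only: a memoryless third-order AFE nonlinearity on a toy
# PAM-4 wireline link, followed by decision-directed cancellation of the ISI and
# the cubic distortion. Channel taps, a3, and the decision logic are assumed for
# illustration and do not reproduce the paper's CDP algorithm.
import numpy as np

rng = np.random.default_rng(0)

# PAM-4 symbols through a short post-cursor ISI channel (assumed taps).
symbols = rng.choice([-3.0, -1.0, 1.0, 3.0], size=10_000)
channel = np.array([0.7, 0.25, 0.05])
a3 = 0.02                                      # assumed third-order coefficient
rx_linear = np.convolve(symbols, channel)[: len(symbols)]
rx = rx_linear + a3 * rx_linear**3             # third-order AFE distortion
rx += 0.05 * rng.standard_normal(len(rx))      # front-end noise

def slicer(x):
    """Nearest PAM-4 level decision."""
    levels = np.array([-3.0, -1.0, 1.0, 3.0])
    return levels[np.argmin(np.abs(levels - x))]

# Decision-feedback-style cancellation: reconstruct the ISI and the cubic term
# from past decisions and subtract both before slicing the current sample.
decisions = np.zeros(len(rx))
for n in range(len(rx)):
    # ISI reconstructed from past decisions.
    isi = sum(channel[k] * decisions[n - k]
              for k in range(1, len(channel)) if n - k >= 0)
    # First-pass symbol guess, ignoring the cubic term.
    s_hat = slicer((rx[n] - isi) / channel[0])
    # Reconstruct the linear AFE input and cancel the third-order distortion.
    lin_est = channel[0] * s_hat + isi
    corrected = rx[n] - a3 * lin_est**3 - isi
    decisions[n] = slicer(corrected / channel[0])

ser = np.mean(decisions != symbols)
print(f"symbol error rate with nonlinear feedback cancellation: {ser:.4f}")

In this toy setup, skipping the "a3 * lin_est**3" subtraction noticeably degrades the symbol error rate, which is the intuition behind pairing decision feedback with explicit nonlinearity cancellation.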
Keywords
Machine learning, multi-channel receiver, nonlinearities, reinforcement learning, supervised learning, unsupervised learning