A Survey of Applied Machine Learning Techniques for Optical OFDM based Networks

CoRR (2021)

Abstract
In this survey, we analyze the newest machine learning (ML) techniques for optical orthogonal frequency division multiplexing (O-OFDM)-based optical communications. ML has been proposed to mitigate channel and transceiver imperfections. For instance, ML can improve signal quality under a low modulation extinction ratio and can tackle both deterministic and stochastically induced nonlinearities, such as parametric noise amplification in long-haul transmission. The proposed ML algorithms for O-OFDM can, in particular, tackle inter-subcarrier nonlinear effects such as four-wave mixing and cross-phase modulation. In essence, these ML techniques could benefit any multi-carrier approach (e.g., filter bank modulation). Supervised and unsupervised ML techniques are analyzed in terms of both O-OFDM transmission performance and computational complexity for potential real-time implementation. We indicate the strict conditions under which an ML algorithm should perform classification, regression, or clustering. The survey also discusses open research issues and future directions for ML implementation.
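To make the unsupervised case concrete, the sketch below shows how clustering can act as a blind decision stage for a single O-OFDM subcarrier whose constellation is distorted by a nonlinear phase rotation. This is a minimal, hypothetical illustration, not the survey's method: it uses scikit-learn's K-means, a toy Kerr-like rotation model, and illustrative values for the noise level, rotation strength, and symbol count.

```python
# Minimal sketch (illustrative, not from the survey): unsupervised K-means
# clustering as a blind decision stage for one O-OFDM subcarrier. Received
# 16-QAM symbols distorted by a toy Kerr-like phase rotation and AWGN are
# clustered, and each cluster centroid is mapped to the nearest ideal
# constellation point. All parameters here are assumptions for illustration.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Ideal 16-QAM constellation, normalized to unit average power.
levels = np.array([-3, -1, 1, 3])
ideal = np.array([complex(i, q) for i in levels for q in levels])
ideal /= np.sqrt((np.abs(ideal) ** 2).mean())

# Simulate one subcarrier: random symbols + power-dependent phase rotation
# (a crude stand-in for an inter-subcarrier nonlinearity) + Gaussian noise.
n = 4000
tx = ideal[rng.integers(0, 16, n)]
rx = tx * np.exp(1j * 0.1 * np.abs(tx) ** 2)
rx = rx + rng.normal(0, 0.05, n) + 1j * rng.normal(0, 0.05, n)

# Cluster the received points; centroids track the distorted constellation.
feats = np.column_stack([rx.real, rx.imag])
km = KMeans(n_clusters=16, n_init=10, random_state=0).fit(feats)

# Map each centroid to its nearest ideal symbol, then decide per sample.
centroids = km.cluster_centers_[:, 0] + 1j * km.cluster_centers_[:, 1]
mapping = np.abs(centroids[:, None] - ideal[None, :]).argmin(axis=1)
decisions = ideal[mapping[km.labels_]]

print(f"symbol error rate after clustering: {np.mean(decisions != tx):.4f}")
```

Because the centroids follow the rotated clusters rather than the ideal grid, this decision rule needs no training labels, which is why the survey weighs such unsupervised schemes against supervised regression and classification in terms of both performance and computational cost.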
Key words
optical OFDM, applied machine learning techniques, machine learning, networks