Minibatch Least-Squares Reverse Time Migration In A Deep-Learning Framework

GEOPHYSICS (2021)

Abstract
Migration techniques are an integral part of seismic imaging workflows. Least-squares reverse time migration (LSRTM) overcomes some of the shortcomings of conventional migration algorithms by compensating for illumination and removing sampling artifacts to increase spatial resolution. However, the computational cost of iterative LSRTM is high, and convergence can be slow in complex media. We implement prestack LSRTM in a deep-learning framework and adopt strategies from the data-science domain to accelerate convergence. Our hybrid framework leverages existing physics-based models and machine-learning optimizers to achieve better and cheaper solutions. Using a time-domain formulation, we find that minibatch gradients can reduce the computational cost by using a subset of the total shots in each iteration. The minibatch approach not only reduces source crosstalk but is also less memory-intensive. Combining minibatch gradients with deep-learning optimizers and loss functions can improve the efficiency of LSRTM. Deep-learning optimizers such as adaptive moment estimation are generally well suited for noisy and sparse data. We compare different optimizers and determine their efficacy in mitigating migration artifacts. To further accelerate the inversion, we adopt a regularized Huber loss function in conjunction with these optimizers. We apply these techniques to the 2D Marmousi and 3D SEG/EAGE salt models and find improvements over conventional LSRTM baselines. Our approach achieves higher spatial resolution in less computation time, as measured by various qualitative and quantitative evaluation metrics.
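
To make the workflow described in the abstract concrete, the sketch below shows one way minibatch LSRTM could be expressed as gradient-based optimization in a deep-learning framework (PyTorch here). This is not the authors' code: the forward operator `born_modeling(image, shot_ids)`, the data container `observed`, and all parameter values (`batch_size`, `lr`, `reg_weight`) are hypothetical stand-ins, and the L1 term is only one possible choice of regularizer added to the Huber data misfit. Each iteration draws a random subset of shots, forward-models them, and updates the reflectivity image with the Adam (adaptive moment estimation) optimizer.

```python
# Minimal sketch (assumed, not the authors' implementation) of minibatch LSRTM
# as gradient descent in PyTorch. `born_modeling(image, shot_ids)` is a
# hypothetical differentiable Born modeling operator returning predicted
# shot gathers; `observed[s]` holds the recorded data for shot s.
import torch

def minibatch_lsrtm(born_modeling, observed, n_shots, image_shape,
                    batch_size=8, n_iters=100, lr=1e-3, reg_weight=1e-4):
    image = torch.zeros(image_shape, requires_grad=True)   # reflectivity image to invert for
    optimizer = torch.optim.Adam([image], lr=lr)            # adaptive moment estimation
    huber = torch.nn.HuberLoss()                             # robust data-misfit term

    for _ in range(n_iters):
        shot_ids = torch.randperm(n_shots)[:batch_size]      # random subset of total shots
        optimizer.zero_grad()
        predicted = born_modeling(image, shot_ids)            # forward-model only the minibatch
        target = torch.stack([observed[s] for s in shot_ids.tolist()])
        # Regularized Huber loss: data misfit plus an (assumed) L1 penalty on the image
        loss = huber(predicted, target) + reg_weight * image.abs().sum()
        loss.backward()                                       # gradient via automatic differentiation
        optimizer.step()                                      # Adam update of the image
    return image.detach()
```

In this kind of setup, only `batch_size` shots are modeled per iteration, which is what reduces both the computation and the memory footprint relative to using all shots; swapping `torch.optim.Adam` for another optimizer is a one-line change, which is how different optimizers could be compared.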