Applying a stochastic quasi-Newton optimizer to least-squares reverse time migration

Computers & Geosciences (2023)

Abstract
As we move toward large-scale 3D datasets, wave-equation-based inversions such as least-squares reverse time migration (LSRTM) must converge rapidly to keep the computational burden manageable while delivering their well-known advantages: compensating for uneven illumination and providing accurate amplitudes. To that end, we explore stochastic optimization methods for LSRTM beyond minibatch stochastic gradient descent (SGD), which is considered the state of the art for large-scale machine learning problems. We apply a second-order stochastic method, the Sum of Functions Optimizer (SFO), which uses a quasi-Newton approach, to the LSRTM problem. This method also works well with minibatches of data and has shown good performance on the optimization of multi-layer neural networks. It maintains computational tractability and limits memory requirements even for high-dimensional optimization problems, and, as is typical for quasi-Newton methods, it requires no hyperparameter tuning. In the experiments presented here, using synthetic data, we demonstrate that the SFO algorithm yields better results and faster convergence than minibatch SGD.
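To illustrate the two optimization strategies the abstract contrasts, the sketch below minimizes a toy linear least-squares misfit (a stand-in for the LSRTM objective, not the paper's actual wave-equation misfit) with minibatch SGD and with a quasi-Newton method. L-BFGS is used here as a readily available quasi-Newton proxy; the paper's SFO algorithm differs in how it shares curvature information across minibatches. All problem sizes, names, and parameters are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy least-squares problem standing in for the LSRTM misfit ||A x - b||^2,
# where x would be the reflectivity image and A the Born modeling operator.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
x_true = rng.standard_normal(5)
b = A @ x_true  # noiseless "observed data"

def minibatch_sgd(A, b, n_epochs=500, batch_size=2, lr=0.01, seed=0):
    """Minibatch SGD on the least-squares misfit; each batch of rows
    plays the role of a subset of shots in LSRTM."""
    rng = np.random.default_rng(seed)
    n, m = A.shape
    x = np.zeros(m)
    for _ in range(n_epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Ab, bb = A[batch], b[batch]
            grad = 2.0 * Ab.T @ (Ab @ x - bb)  # gradient of the batch misfit
            x -= lr * grad
    return x

x_sgd = minibatch_sgd(A, b)

# Quasi-Newton baseline: L-BFGS on the full misfit, used here as an
# illustrative proxy for SFO's curvature-aware updates.
loss = lambda x: float(np.sum((A @ x - b) ** 2))
grad = lambda x: 2.0 * A.T @ (A @ x - b)
x_qn = minimize(loss, np.zeros(5), jac=grad, method="L-BFGS-B").x
```

On this small, well-conditioned problem both recover `x_true`; the point of the paper is that the quasi-Newton update reaches a given misfit in fewer passes over the data, which matters when each gradient requires wave-equation solves.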
Key words
LSRTM, stochastic optimization, Devito, quasi-Newton