Sample as You Infer: Predictive Coding With Langevin Dynamics.
CoRR (2023)
Abstract
We present a novel algorithm for parameter learning in generic deep
generative models that builds upon the predictive coding (PC) framework of
computational neuroscience. Our approach modifies the standard PC algorithm to
bring performance on par with, and exceeding, that obtained from standard
variational auto-encoder (VAE) training. By injecting Gaussian noise into the
PC inference procedure, we re-envision it as overdamped Langevin sampling,
which facilitates optimisation with respect to a tight evidence lower bound
(ELBO). We improve the resulting encoder-free training method by incorporating
an encoder network to provide an amortised warm start to our Langevin sampling,
and we test three different objectives for doing so. Finally, to increase
robustness to the sampling step size and reduce sensitivity to curvature, we
validate a lightweight and easily computable form of preconditioning, inspired
by Riemann Manifold Langevin methods and adaptive optimisers from the SGD
literature. We compare against VAEs by training like-for-like generative models
with our technique and with standard reparameterisation-trick-based ELBOs. Our
method outperforms or matches VAE training across a number of metrics,
including sample quality, while converging in a fraction of the number of SGD
training iterations.
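The overdamped Langevin dynamics referred to in the abstract can be sketched on a toy target. This is a minimal, illustrative implementation of the generic unadjusted Langevin update, not the paper's actual PC inference procedure; the function names and the standard-Gaussian target are assumptions for demonstration.

```python
import numpy as np

def langevin_sample(grad_log_p, x0, step=0.1, n_steps=5000, rng=None):
    """Overdamped (unadjusted) Langevin sampling.

    Iterates x <- x + (step/2) * grad log p(x) + sqrt(step) * N(0, I),
    whose stationary distribution approximates p for small step sizes.
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + 0.5 * step * grad_log_p(x) + np.sqrt(step) * noise
        samples.append(x.copy())
    return np.array(samples)

# Toy target: standard Gaussian, where grad log p(x) = -x.
samples = langevin_sample(lambda x: -x, x0=np.zeros(2))
burned = samples[1000:]  # discard burn-in iterations
```

After burn-in, the empirical mean and variance of the chain should be close to 0 and 1 respectively, up to a small discretisation bias that grows with the step size (which is what the paper's preconditioning aims to make less sensitive).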