Backpropagation with Callbacks: Towards Efficient and Expressive Differentiable Programming

Neural Information Processing Systems (2018)

Citations: 40 | Views: 34
Abstract
Deep learning rests in crucial ways on gradient-descent optimization and end-to-end differentiation. Under the slogan of differentiable programming, there is an increasing demand for efficient automatic gradient computation for emerging network architectures that incorporate dynamic control flow. In this paper we take a fresh look at backpropagation, and propose an implementation using functions with callbacks, where the forward pass is executed as a sequence of function calls and the backward pass when the functions return. A key realization is that this technique of chaining callbacks is well known in the programming languages community under the name continuation-passing style (CPS), and any program can be converted to this form using standard techniques. Our approach achieves the same flexibility as other reverse-mode automatic differentiation (AD) techniques, but it can be implemented without any auxiliary data structures, and it can easily be combined with native code generation techniques, leading to a highly efficient implementation that combines the performance benefits of deep learning frameworks based on explicit reified computation graphs (e.g., TensorFlow) with the expressiveness of pure library approaches (e.g., PyTorch).
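
To make the callback idea concrete, here is a minimal sketch of reverse-mode AD in continuation-passing style, not the paper's actual Lantern implementation: each operator takes a callback k that runs the remainder of the forward pass, and its own backward rule executes after k returns. The names NumR and grad are illustrative assumptions, written in Scala since the paper's implementation language is Scala.

```scala
// Sketch only: a differentiable scalar carrying a value x and an
// accumulated adjoint d. Each operator is written in CPS: it builds its
// result, invokes the callback k (the rest of the forward pass), and
// then propagates gradients backward once k returns.
class NumR(val x: Double, var d: Double = 0.0) {
  def +(that: NumR)(k: NumR => Unit): Unit = {
    val y = new NumR(this.x + that.x)
    k(y)              // forward continues; on return, y.d holds dL/dy
    this.d += y.d     // backward rule for addition
    that.d += y.d
  }
  def *(that: NumR)(k: NumR => Unit): Unit = {
    val y = new NumR(this.x * that.x)
    k(y)
    this.d += that.x * y.d  // backward rule for multiplication
    that.d += this.x * y.d
  }
}

// grad runs f in CPS; the innermost callback seeds the output adjoint,
// and the gradient is read off the input after all callbacks return.
def grad(f: NumR => (NumR => Unit) => Unit)(x0: Double): Double = {
  val x = new NumR(x0)
  f(x) { y => y.d = 1.0 }
  x.d
}

object Demo extends App {
  // d/dx (x*x + x) = 2x + 1, which is 7.0 at x = 3.
  println(grad { x => k => (x * x) { y => (y + x)(k) } }(3.0))
}
```

Note how no tape or graph data structure is needed: the call stack of the chained callbacks plays the role of the tape, which is the key point the abstract makes. The explicit nesting of callbacks shown here is what the standard CPS conversion (or delimited control operators such as shift/reset) would produce automatically from direct-style code.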