A Framework for Time-Varying Optimization via Derivative Estimation
CoRR (2024)
Abstract
Optimization algorithms have a rich and fundamental relationship with
ordinary differential equations, given by their continuous-time limits. When
the cost function varies with time, typically in response to a dynamically
changing environment, online optimization becomes a continuous-time
trajectory-tracking problem. To accommodate these time variations, one
typically requires some inherent knowledge of their nature, such as a time
derivative.
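A minimal sketch (not from the paper) of why a time derivative matters: for the moving quadratic cost f(x, t) = ½(x − r(t))², plain gradient flow ẋ = −k(x − r(t)) lags behind the moving minimizer, while adding the known derivative ṙ(t) as a feedforward term drives the tracking error to zero. The gain k and the Euler discretization are illustrative assumptions.

```python
import math

def track(r, rdot, use_feedforward, k=5.0, dt=1e-3, T=10.0):
    """Euler-integrate gradient flow on f(x,t) = 0.5*(x - r(t))**2.

    With use_feedforward=True, the known time derivative rdot(t) is
    added as a correction term; the tracking error then decays to
    (discretization-level) zero instead of settling at a phase lag.
    Illustrative sketch; k, dt, T are arbitrary choices.
    """
    x, t = 0.0, 0.0
    while t < T:
        ff = rdot(t) if use_feedforward else 0.0
        x += dt * (-k * (x - r(t)) + ff)
        t += dt
    return abs(x - r(t))  # final tracking error

lag_err = track(math.sin, math.cos, use_feedforward=False)
ff_err = track(math.sin, math.cos, use_feedforward=True)
print(ff_err < lag_err)  # feedforward tracks strictly better
```

Without feedforward the error settles at roughly |1 − k/(k + jω)| of the signal amplitude; with exact ṙ the error dynamics reduce to ė = −k e and vanish.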
In this paper, we propose a novel construction and analysis of a
continuous-time derivative estimation scheme based on "dirty derivatives",
and show how it naturally interfaces with continuous-time optimization
algorithms via the language of input-to-state stability (ISS). More
generally, we show how a simple Lyapunov redesign technique leads to
provable suboptimality guarantees when composing this estimator with any
well-behaved optimization algorithm for time-varying costs.
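A "dirty derivative" in its classical form is a band-limited differentiator with transfer function s/(σs + 1): a finite difference smoothed by a first-order low-pass filter, trading estimation lag for noise rejection. The sketch below is a generic discrete-time version (Tustin discretization); the time constant σ and discretization choice are assumptions for illustration, not the paper's construction.

```python
import math

def dirty_derivative(samples, dt, sigma=0.02):
    """Band-limited differentiator s/(sigma*s + 1), discretized with
    Tustin's method:
        d[k] = a*d[k-1] + b*(x[k] - x[k-1]),
        a = (2*sigma - dt)/(2*sigma + dt),  b = 2/(2*sigma + dt).
    Larger sigma rejects more noise but adds more lag."""
    a = (2 * sigma - dt) / (2 * sigma + dt)
    b = 2 / (2 * sigma + dt)
    d, prev, out = 0.0, samples[0], []
    for x in samples:
        d = a * d + b * (x - prev)
        prev = x
        out.append(d)
    return out

# Estimate d/dt sin(t) = cos(t) from samples of sin(t).
dt = 1e-3
ts = [k * dt for k in range(5000)]
est = dirty_derivative([math.sin(t) for t in ts], dt)
# After the filter transient, the estimate tracks cos(t) closely.
err = max(abs(e - math.cos(t)) for e, t in zip(est[1000:], ts[1000:]))
print(err < 0.05)
```

The residual error here is the filter's phase lag (about arctan(ωσ) at frequency ω); it is precisely this kind of estimation error that an ISS-style analysis treats as a bounded input to the downstream optimization dynamics.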