Efficient Second-Order Algorithms for Decentralized Optimization with Provable Convergence Guarantees

Event Information

  • Start time: 2023-12-25 15:30:00
  • Venue: Haishan Building A1101
  • Speaker: Jiaojiao Zhang

Event Overview

Big data over geographically distributed devices necessitates the development of decentralized optimization. Although first-order methods enjoy low per-iteration computational complexity, second-order methods are attractive due to their faster convergence rates. Motivated by this, we aim to propose decentralized second-order algorithms that inherit the fast convergence of the single-machine setting while avoiding high communication cost.

In the first work, we propose a Newton tracking algorithm in which no Hessian matrices are exchanged over the network. In the single-machine setting, the Newton method has a theoretically faster rate than first-order methods; however, developing a communication-efficient decentralized variant of the Newton method with the condition-number-independence property or a super-linear rate is non-trivial. In the second work, we fill this gap by proposing a decentralized Newton method and establishing a theoretically faster rate than first-order methods. In the third work, we move on to the stochastic setting, where each node has so many samples that computing the local full gradients is not affordable. We develop a general algorithmic framework that incorporates stochastic quasi-Newton approximations with variance reduction, and then specify two fully decentralized variants.
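For readers unfamiliar with the decentralized setting, the "tracking" idea referenced above can be illustrated with plain first-order gradient tracking, a standard building block on which Newton-tracking-style methods are built. The sketch below is not the speaker's algorithm; it is a minimal toy example (quadratic local objectives on a 4-node ring, with an assumed doubly stochastic mixing matrix) showing how nodes reach the global optimum by exchanging only local iterates and gradient trackers, never Hessians:

```python
import numpy as np

# Toy problem: each of n nodes holds a local quadratic
# f_i(x) = 0.5 * (x - b_i)^2, so the global minimizer of
# sum_i f_i is the average of the b_i.
n = 4
rng = np.random.default_rng(0)
b = rng.normal(size=n)

# Doubly stochastic mixing matrix for a ring graph:
# each node averages with its two neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

grad = lambda x: x - b   # stacked local gradients f_i'(x_i)
x = np.zeros(n)          # local iterates, one per node
g = grad(x)              # gradient trackers, initialized at local gradients
alpha = 0.1              # step size (assumed small enough for convergence)

for _ in range(500):
    x_new = W @ x - alpha * g             # consensus averaging + descent step
    g = W @ g + grad(x_new) - grad(x)     # track the network-average gradient
    x = x_new

# Every node's iterate approaches the global minimizer mean(b).
print(np.allclose(x, b.mean(), atol=1e-6))
```

A second-order "Newton tracking" scheme replaces the tracked quantity with locally preconditioned directions, which is what allows Newton-like rates without communicating Hessian matrices.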

About the Speaker

Jiaojiao Zhang is a postdoctoral researcher in the Division of Decision and Control Systems at KTH Royal Institute of Technology. She received her Ph.D. from the Department of Systems Engineering and Engineering Management at The Chinese University of Hong Kong. Her research interests include decentralized optimization algorithms and their theoretical analysis, and her work has appeared in journals such as IEEE Transactions on Signal Processing, IEEE Transactions on Automatic Control, and IEEE Transactions on Signal and Information Processing over Networks.