SpeedMapping accelerates the convergence of a mapping to a fixed point using the alternating cyclic extrapolation algorithm. Since gradient descent is an example of such a mapping, it can also perform multivariate optimization based on the gradient function. Typical uses are accelerating a fixed-point mapping and optimizing a function based on its gradient, as in the sketch below.
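For instance, both uses might look like the following minimal sketch. The `speedmapping(x0; m!)` and `speedmapping(x0; g!)` keyword calls and the `minimizer` field reflect the package's documented interface, but treat the details as illustrative and check the package docs for the exact API:

```julia
using SpeedMapping, LinearAlgebra

# Fixed-point acceleration: power iteration for the dominant eigenvector of A.
A = [4.0 1.0; 1.0 3.0]
function power!(x_out, x_in)      # one power-iteration step, in place
    mul!(x_out, A, x_in)
    x_out ./= norm(x_out, Inf)
end
res = speedmapping(ones(2); m! = power!)
res.minimizer                     # ≈ dominant eigenvector, scaled to unit ∞-norm

# Gradient-based optimization: minimizing the Rosenbrock function.
function rosen_g!(grad, x)        # in-place gradient of (1 - x₁)² + 100(x₂ - x₁²)²
    grad[1] = -2 * (1 - x[1]) - 400 * x[1] * (x[2] - x[1]^2)
    grad[2] = 200 * (x[2] - x[1]^2)
end
res = speedmapping(zeros(2); g! = rosen_g!)
res.minimizer                     # ≈ [1.0, 1.0]
```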
Let $F:\mathbb{R}^{n}\rightarrow\mathbb{R}^{n}$ denote a mapping which admits continuous, bounded partial derivatives. A $p$-order cyclic extrapolation may be expressed as

$$x_{k+1}=x_{k}+\sum_{i=1}^{p}\binom{p}{i}\left(\sigma_{k}^{(p)}\right)^{i}\Delta^{i}x_{k},$$

where $\sigma_{k}^{(p)}=\frac{\left\vert\left\langle\Delta^{p}x_{k},\Delta^{p-1}x_{k}\right\rangle\right\vert}{\left\Vert\Delta^{p}x_{k}\right\Vert^{2}}$, $\binom{p}{i}=\frac{p!}{i!\left(p-i\right)!}$, $\Delta^{1}x_{k}=F\left(x_{k}\right)-x_{k}$, and $\Delta^{p}x_{k}=\Delta^{p-1}F\left(x_{k}\right)-\Delta^{p-1}x_{k}.$
The extrapolation step size is $\sigma_{k}^{(p)}$, and $\Delta^{i}x_{k}$ follows Aitken's notation. The algorithm alternates between extrapolations of order $p=3$ and $p=2$. For gradient descent acceleration, $\sigma_{k}^{(p)}$ is used to adjust the learning rate dynamically.
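To make the formulas concrete, here is one $p$-order extrapolation step written directly from the definitions above (a minimal sketch for $p\geq 2$, not the package's internal implementation):

```julia
using LinearAlgebra

# One p-order cyclic extrapolation step (p ≥ 2), following the formulas above.
function acx_step(F, x, p)
    # Orbit x, F(x), …, Fᵖ(x): p applications of the mapping.
    orbit = [x]
    for _ in 1:p
        push!(orbit, F(orbit[end]))
    end
    # Repeated forward differencing: after pass i, d[1] holds Δⁱx_k.
    Δ = Vector{typeof(x)}(undef, p)
    d = [orbit[j+1] - orbit[j] for j in 1:p]
    for i in 1:p
        Δ[i] = d[1]
        d = [d[j+1] - d[j] for j in 1:length(d)-1]
    end
    # σ⁽ᵖ⁾ = |⟨Δᵖx_k, Δᵖ⁻¹x_k⟩| / ‖Δᵖx_k‖²
    σ = abs(dot(Δ[p], Δ[p-1])) / dot(Δ[p], Δ[p])
    # x_{k+1} = x_k + Σᵢ binom(p, i) (σ⁽ᵖ⁾)ⁱ Δⁱx_k
    return x + sum(binomial(p, i) * σ^i * Δ[i] for i in 1:p)
end
```

The alternation described above then amounts to repeating `x = acx_step(F, x, 3); x = acx_step(F, x, 2)` until $\Vert F(x)-x\Vert$ is small.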
Reference:
N. Lepage-Saucier, Alternating cyclic vector extrapolation technique for accelerating nonlinear optimization algorithms and fixed-point mapping applications, Journal of Computational and Applied Mathematics, 439 (2024), 115607. https://www.sciencedirect.com/science/article/abs/pii/S0377042723005514