Vistat

a reproducible gallery of statistical graphics

contact

  • github.com/supstat
  • weibo.com/supstat

Demonstration of the Gradient Descent Algorithm

  • Lijia Yu (yu@lijiayu.net / GitHub / Twitter), a master's candidate in Bioinformatics at the Beijing Institute of Genomics.

The animation package provides a function named grad.desc(), which visually illustrates the process of minimizing a real-valued function with the gradient descent algorithm. The two examples below show how to use grad.desc().
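The algorithm itself is short: from a starting point, repeatedly step against the gradient, x_{k+1} = x_k - gamma * grad f(x_k), until the gradient is nearly zero. Here is a minimal base-R sketch, independent of the animation package (the name grad_desc_sketch and its defaults are illustrative, not part of the package):

```r
# Gradient descent on f(x, y) = x^2 + 2 * y^2, whose gradient is (2x, 4y).
# gamma is the step length (learning rate).
grad_desc_sketch <- function(x0 = c(-3, -3), gamma = 0.05,
                             tol = 1e-7, nmax = 1000) {
  f    <- function(x) x[1]^2 + 2 * x[2]^2
  grad <- function(x) c(2 * x[1], 4 * x[2])
  x <- x0
  for (i in seq_len(nmax)) {
    g <- grad(x)
    if (sqrt(sum(g^2)) < tol) break  # stop when the gradient is tiny
    x <- x - gamma * g               # step against the gradient
  }
  list(minimum = x, value = f(x), iterations = i)
}

res <- grad_desc_sketch()
res$minimum  # very close to c(0, 0), the global minimum
```

grad.desc() animates exactly this kind of iteration, drawing one arrow per step on a contour plot of the objective function.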

A simple function

The default objective function in grad.desc() is f(x, y) = x^2 + 2 * y^2. The arrows will take you to the minimum step by step:

library(animation)
par(mar = c(4, 4, 2, 0.1))
grad.desc()

When the algorithm fails

This example shows how the gradient descent algorithm fails when the step length is too large.

To find a local minimum of a bivariate objective function:

ani.options(nmax = 70)
par(mar = c(4, 4, 2, 0.1))
f2 = function(x, y) sin(1/2 * x^2 - 1/4 * y^2 + 3) * cos(2 * x + 1 - exp(y))
grad.desc(f2, c(-2, -2, 2, 2), c(-1, 0.5), gamma = 0.3, tol = 1e-04)
## Warning: Maximum number of iterations reached!

Apparently the arrows eventually get lost. You can replace gamma = 0.3 with a smaller value and retry the function.
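As a quick sanity check outside the animation package, the same descent can be re-run in base R with a numerical gradient (the helpers num_grad and descend below are illustrative, not part of the package). With a smaller step length such as gamma = 0.05, the iterates should settle toward a local minimum instead of wandering:

```r
# f2 is the objective from the example above
f2 <- function(x, y) sin(1/2 * x^2 - 1/4 * y^2 + 3) * cos(2 * x + 1 - exp(y))

# Central-difference approximation to the gradient of f at point p
num_grad <- function(f, p, h = 1e-6) {
  c((f(p[1] + h, p[2]) - f(p[1] - h, p[2])) / (2 * h),
    (f(p[1], p[2] + h) - f(p[1], p[2] - h)) / (2 * h))
}

# Plain fixed-step gradient descent, stopping when the gradient is small
descend <- function(f, p, gamma, nmax = 2000, tol = 1e-4) {
  for (i in seq_len(nmax)) {
    g <- num_grad(f, p)
    if (sqrt(sum(g^2)) < tol) break
    p <- p - gamma * g
  }
  p
}

p_small <- descend(f2, c(-1, 0.5), gamma = 0.05)
f2(p_small[1], p_small[2])  # lower than the starting value f2(-1, 0.5)
```

The same change applied to the grad.desc() call above (a smaller gamma) lets the animated arrows converge rather than overshoot.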

Meta

You can find the R Markdown source document in the vistat repository on GitHub.