How To Wiki

How to perform and use a gradient descent algorithm


Objective: Gradient descent

To find a local minimum of a function with the gradient descent algorithm:

1. If the function has many variables, e.g., f(x1, x2, ..., xn), choose an arbitrary
point A0 in the n-dimensional argument space with coordinates (x1, x2, ..., xn)
(i.e., give some initial value to every x)
2. Define a scalar step size M for descending the function
3. Repeat
     Calculate the gradient of the function at the point A0
     Calculate a new point A0(x1, x2, ..., xn) by computing a new coordinate for every x.
     Each new value xi will be equal to the previous value xi - M * partial derivative
     of the function with respect to xi
   Until    the gradient of the function equals 0 or its absolute value is "small enough"

After the procedure ends, the coordinates (x1, x2, ..., xn) are the coordinates of a local minimum, with minimal value Fmin = f(x1, x2, ..., xn).
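The procedure above can be sketched in Python. This is a minimal illustration, not part of the article: the function name `gradient_descent`, the test function, the tolerance, and the iteration cap are all assumptions made for the example.

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-6, max_iter=10_000):
    """Minimize a function given its gradient `grad`, starting from point `x0`.

    Repeats x <- x - step * grad(x) until the gradient's norm is
    "small enough" (below `tol`) or `max_iter` iterations pass.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = np.asarray(grad(x))
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

# Example (assumed for illustration): f(x1, x2) = (x1 - 3)^2 + (x2 + 1)^2,
# whose gradient is 2 * (x - (3, -1)) and whose minimum is at (3, -1).
minimum = gradient_descent(lambda p: 2 * (p - np.array([3.0, -1.0])), [0.0, 0.0])
```

With step M = 0.1 the iterates contract geometrically toward the minimum, so the loop stops well before the iteration cap.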

Example 1

Given function: f(x,y) = -(x^2-7)-(y^2+12)+12
Problem: Find a local maximum when argument x is between -25 and 25, and argument y is between -25 and 25.

1. Let's choose the initial point A0 to be (0, 0). So x = 0; y = 0.

2. Let's choose the step M to be one tenth: M = 0.1.

3. Let's run the loop, but because we are searching for a maximum, not a minimum, we have to change the - to + in the update:

... xn + M * partial derivative ...

Until the gradient equals 0 or its absolute value is "small enough"
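As a sketch, here is this gradient ascent loop for Example 1 in Python. The starting point (10, -8) is an assumption made for illustration, since the article's initial point (0, 0) happens to be the maximum itself, where the gradient is already zero:

```python
def f(x, y):
    return -(x**2 - 7) - (y**2 + 12) + 12

def grad_f(x, y):
    # Partial derivatives of f: df/dx = -2x, df/dy = -2y
    return -2 * x, -2 * y

M = 0.1             # step size chosen in the article
x, y = 10.0, -8.0   # assumed starting point (not from the article)

for _ in range(1000):
    gx, gy = grad_f(x, y)
    if (gx**2 + gy**2) ** 0.5 < 1e-8:   # gradient is "small enough"
        break
    # ascent: + instead of - in the update
    x, y = x + M * gx, y + M * gy

# converges to the maximum at (0, 0), where f = 7
```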

Various mathematics software packages provide functions that will calculate the gradient, so let's construct the loop in some of them.
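For instance, here is a sketch of computing the gradient symbolically with Python's SymPy library (an assumed stand-in for illustration; the article itself uses Maple):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = -(x**2 - 7) - (y**2 + 12) + 12

# The gradient is the vector of partial derivatives of f
grad = [sp.diff(f, v) for v in (x, y)]
print(grad)  # [-2*x, -2*y]
```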

In Maple 10

1. Defining the function (press Ctrl+M before editing math text):

restart;  # recommended before starting a new calculation in Maple
z := (x,y) -> -(x^2-7)-(y^2+12)+12;  # defining the function
From HowTo Wiki, a Wikia wiki.
