One Dimensional Search Method: Newton’s Method Part 2

Last post we went over how Newton’s method works and derived the iterative step. This post we will walk through an example step by step. We are going to use the same function from GSS and FS, f(x) = x^2 + 3x + 7, over the interval [-3, 1]. This time, however, we do not need to concern ourselves with the interval; we simply need an initial guess. Here’s the graph again (from -5 to 5).

[Graph of f(x) = x^2 + 3x + 7 over [-5, 5], with the starting point marked by a dot.]

Notice the little dot. This will be our starting point. I chose the left endpoint of our old interval, x = -3. With Newton’s method we use this point to evaluate two functions: the first derivative and the second derivative. Hopefully this increase in function evaluations will mean a reduction in iterations, or at least a more accurate and precise answer.

Before we go any further we need the first and second derivative. Given the function above, I am sure you can find them yourself, but I have listed them below for your convenience.

f'(x) = 2x + 3
f''(x) = 2

Now we evaluate the first and second derivative at our initial point, which gives us -3 and 2. Plugging these values into our iterative equation from last post, x(i+1) = x(i) - f'(x(i))/f''(x(i)), produces x(2) = -3 - (-3)/2 = -1.5. Like GSS, Newton’s method stops on a tolerance condition. Remember, tolerance is defined here as the absolute value of the difference between two consecutive x values. Because this hasn’t been reached yet, we run our method again.
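The first step above can be sketched in a few lines of Python (the function names `df` and `ddf` are mine, not from the post):

```python
# One Newton step for f(x) = x^2 + 3x + 7, starting from x = -3.
def df(x):
    return 2 * x + 3   # f'(x)

def ddf(x):
    return 2.0         # f''(x)

x0 = -3.0
x1 = x0 - df(x0) / ddf(x0)   # x(2) = -3 - (-3)/2
print(x1)  # -1.5
```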

Using -1.5 as our x, we evaluate f' and f'', which gives 0 and 2 respectively. Plugging these values in gives a new x of -1.5, the same value we started with. Thus the tolerance condition is met and our search stops. The graph is below, though it’s rather uninteresting: the first point is in green, the second/third in red.
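The whole procedure, including the tolerance stopping rule, can be sketched as a short loop. This is a minimal sketch, not the post’s exact code; `newton_min`, `tol`, and `max_iter` are names I’ve chosen for illustration:

```python
# Newton's method for 1D minimization: iterate x <- x - f'(x)/f''(x)
# until two consecutive x values differ by less than tol.
def newton_min(df, ddf, x0, tol=1e-8, max_iter=50):
    x = x0
    for i in range(max_iter):
        x_new = x - df(x) / ddf(x)
        if abs(x_new - x) < tol:       # tolerance condition from the post
            return x_new, i + 1
        x = x_new
    return x, max_iter

# f(x) = x^2 + 3x + 7, so f'(x) = 2x + 3 and f''(x) = 2.
x_star, iters = newton_min(lambda x: 2 * x + 3, lambda x: 2.0, x0=-3.0)
print(x_star, iters)  # prints: -1.5 2
```

Note that the second iteration produces the same x as the first, which triggers the stopping condition, matching the two-iteration run described above.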

[Graph of f(x) with the first point (x = -3) in green and the second/third point (x = -1.5) in red.]

Now why did Newton’s method work so well? It found the exact minimum in only two iterations. If you remember from last post, Newton’s method approximates the function with a quadratic. If our function is already quadratic, that approximation is exact, so Newton’s method finds the minimum of quadratic functions very fast.
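To see that this holds for any quadratic, not just ours, note that for f(x) = ax^2 + bx + c (with a > 0) a single Newton step from any starting point lands exactly on the minimizer -b/(2a). A quick sketch, with a helper name of my own choosing:

```python
# One Newton step for f(x) = a*x^2 + b*x + c:
#   x - f'(x)/f''(x) = x - (2ax + b)/(2a) = -b/(2a),
# independent of the starting x.
def newton_step_quadratic(a, b, x):
    return x - (2 * a * x + b) / (2 * a)

for x0 in (-3.0, 10.0, 0.5):
    print(newton_step_quadratic(1.0, 3.0, x0))  # always -1.5
```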

Well, that lame example was Newton’s method. Next post we will hopefully go over a more interesting problem, as well as the method’s limitations.

-Marcello