One Dimensional Search Method: Newton’s Method Part 3

Unlike GSS and FS, Newton's method has its own limitations and restrictions. Before we jump in, though, let's look at a harder, more elaborate function. Part 2 used a deliberately simple function for the sake of consistency, so this time let's take f(x) = x^4 + 6x^3 + x^2 - 2x + cos(x). This function is plotted below.

[Figure: plot of f(x) = x^4 + 6x^3 + x^2 - 2x + cos(x)]

As you can see, the function has two minima, so we can watch how our method behaves. I am going to skip the step-by-step and just show the results. We will choose our first point at x = -8. This produces the following graph, with each step shown as a colored point.

[Figure: Newton's method steps starting from x = -8]

I had to make this graph a little bigger so you can see what's going on. Each point represents a step in our method. It took 5 steps to reach our minimum (not including the starting point). Newton's method produced a minimum of -4.3462, which is close to the actual minimum, -4.34613. With only 5 iterations, this is an impressive method. Let's try a new initial point, 5. This produces the following.

[Figure: Newton's method steps starting from x = 5]

Uh oh. It looks like our method converged to the wrong point. Like FS and GSS, Newton's method will find a minimum, but that minimum might not be the global minimum. Local vs. global minima are a major issue with all three ODSM. Newton's method only looks for a point where f'(x) = 0, so it can be drawn to any stationary point; in this case it settled into the shallow local dip on the right instead of the global minimum.
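This sensitivity to the starting point is easy to reproduce. Here is a minimal Python sketch (not the original MATLAB script; it uses the same derivatives and the same 0.005 tolerance):

```python
import math

def newton_min(x, tol=0.005, max_iter=100):
    """Newton's method on f(x) = x^4 + 6x^3 + x^2 - 2x + cos(x)."""
    for _ in range(max_iter):
        dx = 4*x**3 + 18*x**2 + 2*x - 2 - math.sin(x)   # f'(x)
        d2x = 12*x**2 + 36*x + 2 - math.cos(x)          # f''(x)
        x_new = x - dx / d2x                            # Newton step
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(newton_min(-8.0))  # lands near the global minimum, x ≈ -4.346
print(newton_min(5.0))   # lands on a different stationary point
```

Running it from the two starting points above shows exactly the behavior in the graphs: one run finds the global minimum, the other does not.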

Another, more obvious limitation stems from the method itself: we need to be able to find the second derivative of our function. If there is no second derivative, there is no method. Well, mathematicians don't exactly take no for an answer. The secant method was developed to remedy this: it is basically a modified Newton's method that requires only the first derivative, approximating the second from successive iterates.
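A sketch of that idea in Python (function names here are my own; the secant step replaces f'' with a finite difference built from the two most recent first-derivative values, so it needs two starting points instead of one):

```python
import math

def fprime(x):
    # f'(x) for f(x) = x^4 + 6x^3 + x^2 - 2x + cos(x)
    return 4*x**3 + 18*x**2 + 2*x - 2 - math.sin(x)

def secant_min(x0, x1, tol=0.005, max_iter=100):
    """Newton-like step with f'' replaced by a finite-difference estimate."""
    for _ in range(max_iter):
        d0, d1 = fprime(x0), fprime(x1)
        x2 = x1 - d1 * (x1 - x0) / (d1 - d0)   # secant step
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

print(secant_min(-8.0, -7.0))  # ≈ -4.346, same minimum, no second derivative
```
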

The last limitation involves our initial conditions, as with GSS and FS. The initial value for Newton's method largely determines its effectiveness, as the two graphs above show.

I hope you understand Newton's method a little better. You can experiment more if you like; I have copied the code below.

%array to get values of our function
z = -8:.001:5;

% our function x^4+6x^3+x^2-2x+cos(x)
l = z.^4+6*z.^3+z.^2-2*z+cos(z);

tol = 1; %initialize above the threshold so the loop runs
%initialize arrays to store step values
x = zeros(1,50);
y = zeros(1,50);
%initial guess
x(1)=5;
y(1) = x(1).^4+6*x(1).^3+x(1).^2-2*x(1)+cos(x(1));
i=2;
while tol > 0.005 %tolerance stop condition
     %first derivative
     dx = 4*x(i-1).^3+18*x(i-1).^2+2*x(i-1)-2-sin(x(i-1));
     %second derivative
     d2x = 12*x(i-1).^2+36*x(i-1)+2-cos(x(i-1));
     %newtons method step
     x(i) = x(i-1)-(dx/d2x);
     y(i)= x(i).^4+6*x(i).^3+x(i).^2-2*x(i)+cos(x(i));

     tol = abs(x(i)-x(i-1));
     i= i +1;
end

x %display the step values
%plot our results (only the steps actually computed)
n = i-1;
plot(z,l,x(1:n),y(1:n),'r*')

title('Newton''s Method example 2')
xlabel('x')
ylabel('y')

-Marcello

One Dimensional Search Method: Newton’s Method Part 2

Last post we went over how Newton's method works and derived the iterative step. This post we will walk through our example step by step. We are going to use the same function from GSS and FS, f(x) = x^2 + 3x + 7 over the interval [-3,1]. This time, however, we do not need to concern ourselves with the interval; we simply need an initial guess. Here's the graph again (from -5 to 5).

[Figure: plot of f(x) = x^2 + 3x + 7 with the starting point marked]

Notice the little dot. This will be our starting point; I chose the left end of our old interval, x = -3. With Newton's method we use this point to evaluate two functions, the first derivative as well as the second derivative. Hopefully this increase in function evaluations will mean a reduction in iterations, or at least a more accurate and precise answer.

Before we go any further we need the first and second derivatives. Using our function above, I am sure you can find them yourself, but I have listed them below for your convenience.

f'(x) = 2x + 3, f''(x) = 2

Now we evaluate the first and second derivatives at our initial point, which gives us -3 and 2. Plugging these values into the iterative equation from last post, x(i+1) = x(i) - f'(x(i))/f''(x(i)), produces an x(2) of -1.5. Like GSS, Newton's method stops on a tolerance condition. Remember, tolerance is defined here as the absolute value of the difference between two consecutive x values. Because it hasn't been reached yet, we run our method again.

Using -1.5 as our x, we evaluate f' and f'', which gives 0 and 2 respectively. Plugging these values in gives a new x of -1.5, the same value we started with. Thus the tolerance condition is met and our search stops. The graph is below, but it is rather uninteresting: first point in green, second/third in red.
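The two hand computations above can be checked in a few lines of Python (a quick sketch, not part of the original code):

```python
def newton_step(x):
    dx = 2*x + 3    # f'(x) for f(x) = x^2 + 3x + 7
    d2x = 2         # f''(x)
    return x - dx / d2x

x1 = -3.0
x2 = newton_step(x1)   # -3 - (-3/2) = -1.5
x3 = newton_step(x2)   # -1.5 - (0/2) = -1.5, so the tolerance is met
print(x2, x3)
```
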

[Figure: Newton's method steps for f(x) = x^2 + 3x + 7]

Now, why did Newton's method work so well? It found the exact minimum in only two iterations. If you remember from last post, Newton's method approximates the function with a quadratic at each step. If our function is already quadratic, that approximation is perfect: for f(x) = ax^2 + bx + c, a single step gives x - (2ax + b)/(2a) = -b/(2a), the exact minimizer, no matter where you start. Because of this, Newton's method finds the minimum of quadratic functions very fast.

Well, that lame example was Newton's method in action. Next post we will go over a hopefully more interesting problem, as well as the method's limitations.

-Marcello