One Dimensional Search Methods: The Recap

In the past couple of posts we have gone over three different search methods. We started with the Golden Section Search (GSS) as the most basic and intuitive way to find a minimum. We then optimized GSS with varying reduction step sizes and came up with the Fibonacci Search (FS) method. Finally, we looked at the properties of functions and their derivatives to create our last search method, Newton's Method (NM). Each method had its own shortcomings and benefits, but the question remains: which method should we be focusing on? Is there a best method?

Well, I have some follow-up questions. What makes a method the best? The two obvious criteria are speed and accuracy (and, to an extent, precision). So which of our methods came out on top? If you remember, we used the same example function throughout our journey through ODSM: a simple quadratic. GSS took six iterations, FS took five, and NM only took one. To be honest, this was not the best example, as Newton's Method will always find the exact minimum of a quadratic in a single step. So for this recap we will use a new function and see how they all stack up. I have chosen a function that is interesting on paper but rather boring graphically: f(x) = x^4 − x^3 + cos(x) − 12, plotted below (taken from Wolfram Alpha).

[Plot of f(x) = x^4 − x^3 + cos(x) − 12, from Wolfram Alpha]
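If you want to follow along, here is a minimal sketch of the function and its first two derivatives in Python (Newton's Method will need the derivatives later). The names f, df, and d2f are just labels I am using here, not code from the earlier posts.

```python
import math

# The recap function and its first two derivatives:
#   f(x)   = x^4 - x^3 + cos(x) - 12
#   f'(x)  = 4x^3 - 3x^2 - sin(x)
#   f''(x) = 12x^2 - 6x - cos(x)
def f(x):
    return x**4 - x**3 + math.cos(x) - 12

def df(x):
    return 4*x**3 - 3*x**2 - math.sin(x)

def d2f(x):
    return 12*x**2 - 6*x - math.cos(x)
```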

Let's use this new function with our three methods. I capped each at five iterations so we could see the graphs and the methods without things getting too cluttered. Running the three methods produces the following plot.

[Plot of the three methods on f(x), each capped at five iterations]

Each method was able to get decently close to our minimum. However, as you can see, each method goes about it in a different way. Newton's Method converges fastest, really only taking four iterations to home in on the minimum. GSS and FS both come decently close within our five-iteration limit. But what about accuracy?
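Here is a rough sketch of how the three methods might be run side by side for five iterations. The bracket [0, 2], the starting point x0 = 2, and the implementation details are my own choices for illustration, so the exact numbers you get may differ a bit from the ones quoted below.

```python
def golden_section(f, a, b, iters=5):
    """Golden Section Search: shrink [a, b] by a fixed ratio each iteration."""
    rho = (3 - 5**0.5) / 2                       # ~0.382
    for _ in range(iters):
        x1, x2 = a + rho*(b - a), b - rho*(b - a)
        if f(x1) < f(x2):
            b = x2                               # minimum lies in [a, x2]
        else:
            a = x1                               # minimum lies in [x1, b]
    return (a + b) / 2

def fibonacci_search(f, a, b, iters=5, eps=1e-3):
    """Fibonacci Search: like GSS, but the reduction ratio changes each step."""
    F = [1, 1]                                   # Fibonacci numbers 1, 1, 2, 3, 5, ...
    while len(F) < iters + 2:
        F.append(F[-1] + F[-2])
    for k in range(iters):
        rho = 1 - F[iters - k] / F[iters - k + 1]
        if k == iters - 1:
            rho -= eps                           # last step: keep the two test points distinct
        x1, x2 = a + rho*(b - a), b - rho*(b - a)
        if f(x1) < f(x2):
            b = x2
        else:
            a = x1
    return (a + b) / 2

def newton(df, d2f, x0, iters=5):
    """Newton's Method: iterate x <- x - f'(x)/f''(x)."""
    x = x0
    for _ in range(iters):
        x -= df(x) / d2f(x)
    return x

print("GSS:", golden_section(f, 0, 2))
print("FS: ", fibonacci_search(f, 0, 2))
print("NM: ", newton(df, d2f, 2.0))
```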

The true minimum of this function is at x ≈ 0.969366, with f(x) ≈ −11.4621. Newton's Method after five iterations produced a minimum at x ≈ 0.9694, GSS produced x ≈ 0.8754, and FS produced x ≈ 0.7077. As you can see, there is no competition between the three with regard to this function: Newton's Method is both the fastest and the most accurate at five iterations. However, this is not to say it is the best method. It is time I introduced the No Free Lunch theorem of optimization.
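If you want to double-check that reference value yourself, a bounded scalar minimizer such as SciPy's will do the trick (this is just a sanity check of my own, not one of our three methods):

```python
from scipy.optimize import minimize_scalar

# Independent check of the reference minimum of f on [0, 2]
res = minimize_scalar(f, bounds=(0, 2), method='bounded')
print(res.x, res.fun)   # expect roughly x ~ 0.9694, f(x) ~ -11.4621
```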

The No Free Lunch theorem, introduced by David Wolpert and William G. Macready, says “that for certain types of mathematical problems, the computational cost of finding a solution, averaged over all problems in the class, is the same for any solution method” (from Wikipedia). How does that apply here? Well, it says that even though Newton's Method is the clear winner with regard to this function, when we consider all possible functions and limitations, Newton's Method is no better than GSS or FS. This is not to say that we should just choose our method without regard or rationale.

For example, if we have a function with a nasty first derivative or an awful second derivative, we might favor FS or GSS. On the other hand, if evaluating the derivatives is easy, we would choose Newton's Method. It all depends on whether or not you believe in such a thing as a free lunch.

Marcello