One Dimensional Search Method: Fibonacci Search Part 2

Last post we went over how Fibonacci search works and how it got its name, just like GSS. This post we will go through our example step by step. We are going to use the same function from GSS, f(x) = x^2 + 3x + 7, over the interval [-3, 1]. Here's the graph again (from -5 to 5).

[Figure: graph of f(x) = x^2 + 3x + 7 from x = -5 to 5]

Before we start anything, we need to choose how many iterations we want and calculate our Fibonacci sequence. To match the GSS example, we will use N = 5 iterations. This means we need 6 numbers of the Fibonacci sequence (N + 1), not counting F0 and F-1. Using the recursive formula, this gives us 1, 2, 3, 5, 8, 13 as the numbers we need. From these we can calculate all of our necessary z values. Our next step is to find our first two interior points. From Part 1, x3 = x1 + z*(x2 - x1) and x4 = x1 + (1 - z)*(x2 - x1), where z is one minus the ratio of two sequential Fibonacci numbers, starting at N + 1 and working down to 1 (check back one post for the formula). Substituting x1 = -3 and x2 = 1 gives an initial x3 = -1.4615 and x4 = -0.5385. I added these values as lines to our graph below.
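To make the arithmetic concrete, here is a minimal sketch of this setup step in Python. It is my own illustration; the function, interval, and N = 5 come from the post, while variable names and the list layout are assumptions.

```python
# Sketch of the setup step: f(x) = x^2 + 3x + 7 on [-3, 1], N = 5 iterations.
def f(x):
    return x**2 + 3*x + 7

# Fibonacci numbers under the post's convention F_-1 = 0, F_0 = 1,
# stored so that F[k + 1] holds F_k.
F = [0, 1]
for _ in range(6):
    F.append(F[-1] + F[-2])          # F = [0, 1, 1, 2, 3, 5, 8, 13]

N = 5
x1, x2 = -3.0, 1.0
z1 = 1 - F[N + 1] / F[N + 2]         # first z: 1 - F_5/F_6 = 1 - 8/13
x3 = x1 + z1 * (x2 - x1)
x4 = x1 + (1 - z1) * (x2 - x1)
print(round(x3, 4), round(x4, 4))    # -1.4615 -0.5385
```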

[Figure: the same graph with x3 and x4 marked as green vertical lines]

Just like with GSS (actually even more so), our new interval has already drastically decreased from our original interval. Again, we need to determine which interval we keep. Graphically we can see that the green line on the left is the one we want to keep. We can also evaluate our function at the two points and see which gives the lower y value; doing this confirms our guess of the left line. We keep this line because it contains our minimum, so we create a new interval that contains the left green line. This time we only have to evaluate one additional point, which will lie between our leftmost boundary (the black line) and the left green line. First we re-evaluate our interval: x1 is still x1, and x4 is now our rightmost boundary (the new x2). Now we need our two interior points. We already have one, the left green line, so we only need to calculate the other. Using the formula above for x3, we find x3b = -2.0769, which is plotted below.

[Figure: the graph with the new boundaries shown in red and the new interior point x3b]
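Here is how this one update step looks in code (a sketch under the same assumptions as before; the names x3b and x4b are mine):

```python
# Second iteration: the old x4 (-0.5385) becomes the new right boundary.
def f(x):
    return x**2 + 3*x + 7

x1, x2 = -3.0, -0.53846          # new interval after keeping the left section
z2 = 1 - 5 / 8                   # next z: 1 - F_4/F_5 from 1, 2, 3, 5, 8, 13
x3b = x1 + z2 * (x2 - x1)        # the one genuinely new point
x4b = x1 + (1 - z2) * (x2 - x1)  # lands back on the old x3 = -1.4615 (reused)
print(round(x3b, 4))             # -2.0769
```

Note that x4b does not cost a new function evaluation: the z schedule is designed so it coincides with the point we already have.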

The two “new” boundaries are shown in red. As you can see, one of the lines that was green before is now red. As before, if we had chosen the other interval we would need to calculate a new x4, which could be found with the equation for x4 above. But in this case we only need x3. We are narrowing down on the minimum, but we are not there yet, so we need another iteration.

[Figure: the graph after the third iteration, with another line changing color and a new interior point]

As before, one of our lines changed color, we have a new line, and thus a new interval. Notice that our interval is no longer shrinking by the same proportion each time. This is where Fibonacci search differs from GSS: we are allowed to change the reduction factor at every iteration. Unlike GSS, Fibonacci search does not terminate on a tolerance condition but after a fixed number of iterations (we chose 5 at the beginning). Let's jump ahead to 5 total iterations.
[Figure: the graph after all 5 iterations, with the last two pairs of points in yellow and magenta]

I ran the search 2 more times: first the yellow lines, then the magenta. Graphically it looks like the Fibonacci search narrowed down to almost exactly the point we are looking for. At this scale the magenta lines seem to be right on top of each other. Remember, though, that this is by design: the z value for the last step is 1/2 - epsilon, so since epsilon is small (I chose 0.05), the last two lines should be very close to each other. So what value did the search close in on? Fibonacci search produced a minimum at x = -1.4462. The actual minimum of this function is at x = -1.5, so we are quite close. With more iterations we could get closer to the actual minimum, but even with only 5 iterations we are within 4% of the actual value.
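Putting the whole walkthrough together, here is a sketch of the full search in Python. This is my own illustration of the method as described, not the post's implementation; depending on details such as which point gets reported at the end, the result can differ slightly from the -1.4462 quoted above.

```python
def f(x):
    return x**2 + 3*x + 7

def fibonacci_search(f, x1, x2, N=5, eps=0.05):
    # Fibonacci numbers with F_-1 = 0, F_0 = 1; F[k + 1] holds F_k.
    F = [0, 1]
    while len(F) < N + 3:
        F.append(F[-1] + F[-2])
    best = x1
    for k in range(1, N + 1):
        if k < N:
            z = 1 - F[N - k + 2] / F[N - k + 3]  # z_k = 1 - F_(N-k+1)/F_(N-k+2)
        else:
            z = 0.5 - eps                        # epsilon fix on the last step
        x3 = x1 + z * (x2 - x1)
        x4 = x1 + (1 - z) * (x2 - x1)
        if f(x3) < f(x4):
            x2, best = x4, x3                    # keep the left interval [x1, x4]
        else:
            x1, best = x3, x4                    # keep the right interval [x3, x2]
    return best                                  # better of the last two test points

print(round(fibonacci_search(f, -3, 1), 4))      # -1.4923, close to the true -1.5
```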

Now you have another search method under your belt. But there is more we need to go over (just like with GSS). Next post we will write a simple program for it in MATLAB and Python, and we will also go over the limitations of Fibonacci search.

-Marcello

One Dimensional Search Method: Fibonacci Search Part 1

Our next one dimensional search method, or line search method, is the Fibonacci search method. Like the golden section search method, the Fibonacci search method finds an extremum of a unimodal function. Also like GSS, Fibonacci search's name directly describes how the search method reduces its search space: in GSS we use the golden ratio, and in Fibonacci search we use, you guessed it, the Fibonacci sequence.

Just a brief refresher: the Fibonacci sequence is defined by the following recursive equation:

F_k = F_(k-1) + F_(k-2)

Each number in the sequence is the sum of the two numbers before it. By convention, F = 0 when k = -1 and F = 1 when k = 0. This allows us to calculate values for all k > 0.
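As a quick sketch (mine, not from the post), generating the sequence under this convention:

```python
def fibonacci(n):
    # Return [F_1, ..., F_n], using the convention F_-1 = 0, F_0 = 1.
    a, b = 0, 1              # F_-1, F_0
    out = []
    for _ in range(n):
        a, b = b, a + b      # shift the pair forward one index
        out.append(b)
    return out

print(fibonacci(6))          # [1, 2, 3, 5, 8, 13]
```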

OK, back to the search method. With the GSS method we were limited in how we could reduce our search interval: each iteration the interval shrinks proportionally by 0.618 (remember (1 - z) from the previous posts). This limits the speed of convergence and how close the search can get to our minimum. Fibonacci search remedies this by allowing us to vary our z values, so that the space we search decreases faster and gets closer to our target. The follow-up question is obvious: how do we determine how z changes? Well, that is an optimization problem in itself.

Quickly going back to GSS, we know that with every iteration our uncertainty range reduces by 0.618, or (1 - z). So it follows that for N iterations we have a total reduction factor of (0.618)^N, or (1 - z)^N. Now, if we are allowed to vary z, we get the following reduction factor:

(1 - z1)(1 - z2)···(1 - zN)

The objective is to minimize the above; minimizing this product results in a smaller final uncertainty range. However, we cannot just pick any z values for this expression. They are subject to certain requirements, or constraints. The first is that any z must be greater than or equal to 0 and less than or equal to 1/2. 1/2 is chosen as the upper limit to ensure that only one new test point is needed per iteration: with a reduction of exactly 1/2, the two test points coincide in the middle of the interval, which prevents the method from continuing. The new z values must also satisfy the condition that the distance between our two test points equals the distance from the selected test point to the nearest boundary point. With these constraints we can form the optimization problem and find our minimum.
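For a concrete feel, here is my own numeric check (not from the post) comparing the GSS reduction factor with the one produced by the Fibonacci z values for N = 5:

```python
# Total reduction after N = 5 iterations: GSS vs. the Fibonacci z schedule.
N, eps = 5, 0.05
F = [0, 1]                   # F_-1 = 0, F_0 = 1; F[k + 1] holds F_k
while len(F) < N + 3:
    F.append(F[-1] + F[-2])

# z_k = 1 - F_(N-k+1)/F_(N-k+2) for k < N; the last z is 1/2 - eps.
z = [1 - F[N - k + 2] / F[N - k + 3] for k in range(1, N)] + [0.5 - eps]

fib_reduction = 1.0
for zk in z:
    fib_reduction *= 1 - zk  # product of (1 - z_k)

gss_reduction = 0.618**N
print(round(fib_reduction, 4), round(gss_reduction, 4))   # 0.0846 0.0901
```

So with the same 5 iterations, the varying z schedule leaves a slightly smaller final uncertainty range than GSS's fixed 0.618 per step.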

How does the Fibonacci sequence fit into all this? It turns out that the optimal solution to our minimization problem involves the ratio of two sequential numbers in the sequence. The optimal z values are as follows:

z_k = 1 - F_(N-k+1) / F_(N-k+2),  for k = 1, ..., N

This leads us to our first limitation: Fibonacci search, unlike GSS, requires a finite, predetermined number of iterations. As seen from z_N, the end of the search uses the beginning of the Fibonacci sequence. We need to take a quick pause at the last z. When we plug F1 and F2 into the formula, we get z_N = 1/2. Earlier I warned of the dangers of a 1/2 reduction. To avoid falling into that pit, we introduce an ε (epsilon) that gently moves z_N away from 1/2, giving z_N = 1/2 - ε. For the epsilon fix to work, ε must be small. Now that we have our step reductions, we can find our test points.
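To see the whole schedule at once, here is a sketch (my variable names, with the N and ε used in Part 2 above):

```python
# The z schedule for N = 5 iterations with eps = 0.05.
N, eps = 5, 0.05
F = [0, 1]                   # F_-1 = 0, F_0 = 1; F[k + 1] holds F_k
while len(F) < N + 3:
    F.append(F[-1] + F[-2])

z = [1 - F[N - k + 2] / F[N - k + 3] for k in range(1, N)]
z.append(0.5 - eps)          # epsilon fix: 1/2 - eps instead of exactly 1/2
print([round(zk, 4) for zk in z])   # [0.3846, 0.375, 0.4, 0.3333, 0.45]
```

Notice the z values vary from step to step, unlike GSS where z is fixed.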

We find our test points just like in GSS. x1 and x2 are the initial boundaries; x3 and x4 are calculated by the following equations:

x3 = x1 + z*(x2 - x1)
x4 = x1 + (1 - z)*(x2 - x1)

With all this information we can form our search. The method takes the same form as GSS: the range is split into sections, one section is kept and the other rejected, and the accepted section is then further split as the method repeats. Now that we have the basics down and a rough outline of how to proceed, we can try our method. That's for Part 2.

-Marcello