One Dimensional Search Method: Golden Section Search Part 3

We've covered what GSS is in part 1 and how exactly to use it in part 2. Now we need to know when we can use GSS, and explore a real-world example. You may have noticed that our earlier example was pretty simple: the minimum could be found by inspection or a simple derivative test. Notice also that the function had exactly one minimum, which was also the global minimum of the function. Let's look at what happens to our search method when we have two minima in our interval.

Let's take a look at the function y = x^4 - 6x^2 + 3x + 7. This function has two local minima fairly close to each other, one of which is the global minimum. The function is plotted below.

[Figure: plot of y = x^4 - 6x^2 + 3x + 7 showing two local minima]

Here you can clearly see there are two minima, one around -2 and the other around 2. However, the minimum near -2 appears to be lower than the one near 2. Let's see how GSS tackles this problem. I am going to expand our interval to [-3,5]. This produces the following plot, with each step in a different color.
[Figure: GSS iterations on [-3,5], each step in a different color]

As you can see, GSS closes in on only one of our minima. This is one limitation of GSS: it is not a global search. GSS can find a minimum, but there is no guarantee that that minimum is not just a local one. What happens if we change our interval? I will reduce it to [-3,3] and run GSS again. This produces the following plot, with each step in a different color.

[Figure: GSS iterations on [-3,3], each step in a different color]

This interval produces a very different search. Simply changing the interval we search over switches which minimum GSS converges on. That is not exactly something we want in a search method. However, in both intervals the method did converge to a minimum, which is something we do want in a search method.
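The behavior above can be reproduced with a short sketch. This is my own minimal Python version (the function and variable names here are mine, not from a library), using the interior-point update gr-1 ≈ 0.618 of the interval width:

```python
def gss(f, x1, x2, tol=1e-5):
    """Golden section search for a minimum of f on [x1, x2]."""
    gr = (1 + 5 ** 0.5) / 2          # golden ratio, ~1.61803
    x3 = x2 - (gr - 1) * (x2 - x1)   # left interior point
    x4 = x1 + (gr - 1) * (x2 - x1)   # right interior point
    while abs(x4 - x3) > tol:
        if f(x3) < f(x4):
            x2, x4 = x4, x3          # keep [x1, x4]; old x3 becomes new x4
            x3 = x2 - (gr - 1) * (x2 - x1)
        else:
            x1, x3 = x3, x4          # keep [x3, x2]; old x4 becomes new x3
            x4 = x1 + (gr - 1) * (x2 - x1)
    return (x3 + x4) / 2

f = lambda x: x**4 - 6*x**2 + 3*x + 7

print(gss(f, -3, 5))   # lands near the local minimum around 1.59
print(gss(f, -3, 3))   # lands near the global minimum around -1.85
```

Running it on both intervals shows exactly the sensitivity described above: the same function, searched over two different brackets, converges to two different minima.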

The local vs. global issue points to a major requirement of GSS: the function must be unimodal. Now what does unimodal mean? It basically means that the function has a single local extremum on the interval. As we saw above, this is what guarantees convergence to the correct (and, if unimodal, only) minimum.

To review, when should we use GSS: the function has one minimum on the interval (it is unimodal), you have an idea of the interval that contains the minimum, and the function is of one variable.

Hopefully this gave you a better understanding of a basic search method. This is only the beginning. I will clean up my code and post it below, for both MATLAB and Python. I will also post it to my github at github.com/mgric.

-Marcello

One Dimensional Search Method: Golden Section Search Part 2

Last post we went over how Golden Section Search works and how it got its name. This post we will go through our example step by step. If you remember from last time, our function was f(x) = x^2 + 3x + 7 over the interval [-3,1]. Here's the graph again below (this time made with MATLAB).

[Figure: f(x) over [-3,1] with the interval boundaries marked]

This time I have included our interval. Our next step is to find our next two points. From part 1, x3 = x2 - (gr-1)*(x2-x1) and x4 = x1 + (gr-1)*(x2-x1), where gr ≈ 1.61803 is the golden ratio. Substituting -3 for x1 and 1 for x2 gives an initial x3 = -1.4721 and x4 = -0.5279. I have plotted the new lines on our graph below.

[Figure: the interval with the two interior points drawn in green]

As you can see, our new interval has already decreased drastically from the original. However, we need to determine which interval to keep. Graphically we can see that the green line on the left is the one we want to keep, so we create a new interval ensuring that line stays inside it. This time we only have to evaluate one additional point, which will lie between our leftmost boundary (black line) and the left green line. We first re-evaluate our interval: x1 is still x1, and x4 is now our rightmost boundary (x2). Now we need our two interior points. We already have one, the left green line, so we only need to calculate the other.

[Figure: the narrowed interval with the two new boundaries shown in red]

The two new boundaries are shown in red. As you can see, one of the green lines from before is now red, and the other boundary is found by the formula x3 = x2 - (gr-1)*(x2-x1). If we had kept the other section, we would instead need a new x4, which could be found by x4 = x1 + (gr-1)*(x2-x1). But in this case we only need x3. We are narrowing down on the minimum, but we are not there yet. This means we need to do another iteration.

[Figure: the next iteration, with the interval narrowed again]

As before, one of our lines changed color, and we have a new line and thus a new interval. Notice that our interval is getting proportionally smaller and smaller. This brings us to a good question: when do we decide to stop? We have to set a tolerance condition. For the sake of brevity, let's set our tolerance to 0.1, which isn't a very tight tolerance, but it will get us pretty close to our minimum. Our tolerance is defined as the absolute value of the difference between our interior points x3 and x4. Let's run it a few more times until we pass our threshold.
[Figure: the final three iterations in yellow, magenta, and blue]

I ran the search 3 more times: first the yellow line, then magenta, and finally blue. The blue lines give us a tolerance of 0.08453, with our minimum in the middle of this interval at -1.5147. The actual minimum of this function is at -1.5, which is close to the minimum we calculated. With more iterations and a stricter tolerance we could get closer still. But with only 6 iterations we are within 1% of the actual value.
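The whole walkthrough above can be reproduced in a few lines of Python. This is my own sketch (names are mine), printing the interior points at each iteration so you can follow along with the plots:

```python
def gss_steps(f, x1, x2, tol):
    """Golden section search, printing the interior points each iteration."""
    gr = (1 + 5 ** 0.5) / 2          # golden ratio, ~1.61803
    x3 = x2 - (gr - 1) * (x2 - x1)   # left interior point
    x4 = x1 + (gr - 1) * (x2 - x1)   # right interior point
    while abs(x4 - x3) > tol:
        if f(x3) < f(x4):            # minimum lies in [x1, x4]
            x2, x4 = x4, x3
            x3 = x2 - (gr - 1) * (x2 - x1)
        else:                        # minimum lies in [x3, x2]
            x1, x3 = x3, x4
            x4 = x1 + (gr - 1) * (x2 - x1)
        print(f"x3 = {x3:.4f}, x4 = {x4:.4f}, |x4 - x3| = {abs(x4 - x3):.5f}")
    return (x3 + x4) / 2             # midpoint of the final bracket

f = lambda x: x**2 + 3*x + 7
print(gss_steps(f, -3, 1, 0.1))      # ~ -1.5147, vs. the true minimum at -1.5
```

With tolerance 0.1 this stops once |x4 - x3| drops to 0.08514, returning roughly -1.5147, matching the numbers in the walkthrough.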

Hopefully this shed a little light on how golden section search converges on our minimum. But there is more we need to go over! Next post we will write a simple implementation in MATLAB and Python, and also go over the limitations of GSS.

-Marcello

One Dimensional Search Method: Golden Section Search Part 1

When approaching optimization problems it is essential to know what you are looking for. Whether we want a minimum or maximum, we need a direction to look or a way to search. One of the simplest search methods is the golden section search.

Golden Section Search (GSS) uses a closing window to narrow down on our minimum, "boxing it in". By restricting the area with each step, the minimum becomes trapped between the shrinking boundaries. To help visualize this, imagine the trash compactor scene from Star Wars Episode IV.

Our minimum is Han Solo, the trash compactor is our function, and the space between the walls is our search area. When the compactor walls are not activated, there is plenty of room for our heroes to occupy. However, as the walls start to come together, there is less room for our heroes to move around in. Eventually, if allowed to progress, the walls will frame our heroes, making them very easy to spot amongst the garbage.

This is not exactly how GSS works: that would be terribly inefficient, since we would have to try every point from the left and the right until the walls meet at our minimum. That becomes very expensive in terms of time and calculations, which is exactly why we use and develop search methods.

GSS works a little differently. In order to cut down on the number of iterations and function evaluations, GSS breaks our domain into sections, then systematically eliminates sections which (hopefully) do not contain our minimum. The section that remains is further divided and the method repeats. This happens until a termination condition is met or a tolerance is reached. So now that we have a basic idea of how this works, let's jump in.

First let's take a very easy function and run through all the steps. Below is a graph of the function y = x² + 3x - 7 (taken from Wolfram Alpha).

[Figure: plot of y = x² + 3x - 7]

Now this problem can be solved simply by inspection, but let's use GSS anyway. First we have to pick an interval on which we believe our minimum lies; let's use [-3,1]. This interval contains our minimum along with a good amount of extra room, so we can see the effects of the search. Now that we have an interval, we need to make our sections.

We first choose the boundaries as the first two points at which to evaluate our function; call them x1 and x2. However, this doesn't exactly break our interval into the sections we need, so we choose two more points. We first choose a general x3 which lies between x1 and x2. For x4, we need to ensure that we get two equal outer sections and that it also lies between x3 and x2. If we had unequal sections, our method might keep the wider one, which would slow convergence to our minimum. Also, if we chose a point outside this range, we would need to redefine the constraints we put on this equation. Because of that we need a formula for a fourth point that produces equal sections. GSS chooses the point x4 by the formula x4 = x1 + (x2 - x3). This ensures that the distance between x1 and x3 is the same as the distance between x4 and x2.

But where exactly do we choose our x3? GSS has a procedure for this. GSS relies upon a constant proportional reduction of the interval we search: the uncertainty range is reduced by the same factor at every iteration. This also means that our new evaluation points must maintain this proportion. At this point GSS starts to get a little confusing. We need to derive a way to keep a constant proportion between the points as well as between iterations.

So we need the ratio of x4-x3 to x3-x1 to be equal to the ratio of x3-x1 to x2-x3. It should look like this:

(x4 - x3)/(x3 - x1) = (x3 - x1)/(x2 - x3)

This equation says that the proportions of the two sections in the first iteration equal the proportions of the two sections in the second iteration of our search. This covers the case where the interval between x4 and x1 brings us closer to the minimum. If it is the interval between x2 and x3 that brings us closer, we have the following proportion:

(x4 - x3)/(x2 - x4) = (x2 - x4)/(x2 - x3)

Both of these equations must hold, since either section should be viable even if our search does not pick it. So we use these two equations to find out what our proportion is. Eliminating (x4-x3) and substituting in for x4, with a bit of rearranging, gives us:

(x2 - x3)/(x3 - x1) - 1 = (x3 - x1)/(x2 - x3)

Substituting rho for (x2-x3)/(x3-x1) gives rho - 1 = 1/rho, or rho^2 - rho - 1 = 0. Solving gives rho = 1.61803..., the Golden Ratio. This is where GSS gets its name. So, to come back to our main point, each subsequent test point is chosen so that the golden ratio is maintained within each new section. This shrinks the search range by a factor of 1/rho, or about 0.61803, with each iteration.

But how does that give us our points? We know the outer sections must be equal and the proportion must be 1/rho ≈ 0.618. This, along with a little algebra, is enough to determine our points: x3 = x2 - (gr-1)*(x2-x1) and x4 = x1 + (gr-1)*(x2-x1), where gr = rho ≈ 1.61803, so gr-1 ≈ 0.61803.
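As a quick numeric sanity check (here gr is the golden ratio, ~1.618, and the interval is the [-3,1] from our example):

```python
gr = (1 + 5 ** 0.5) / 2                 # golden ratio, ~1.61803
x1, x2 = -3.0, 1.0
x3 = x2 - (gr - 1) * (x2 - x1)          # left interior point
x4 = x1 + (gr - 1) * (x2 - x1)          # right interior point
print(round(x3, 4), round(x4, 4))       # -1.4721 -0.5279
print(round((x2 - x3) / (x3 - x1), 4))  # 1.618, the golden ratio
```

The two interior points are symmetric in the interval, and the ratio of the larger section to the smaller comes out to the golden ratio, as derived above.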

Now that we know how the method works, we can solve our original problem. I'll save that for part two.

-Marcello