GitHub Repository: DataScienceUWL/DS775
Path: blob/main/Homework/Lesson 06 HW - Global Optimization 1/Homework_06.ipynb
Kernel: Python 3 (system-wide)
# imports, add to this as needed
# change to matplotlib notebook for classic view
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
sns.set_style("darkgrid")
from scipy.optimize import minimize, dual_annealing
import json
from simanneal import Annealer
from warnings import filterwarnings
from skopt import gp_minimize

Homework 6

When asking questions about homework in Piazza please use a tag in the subject line like HW1.3 to refer to Homework 1, Question 3. So the subject line might be HW1.3 question. Note there are no spaces in "HW1.3". This really helps keep Piazza easily searchable for everyone!

For full credit, all code in this notebook must be both executed in this notebook and copied to the Canvas quiz where indicated.

Hints for 1-2

This should be a straightforward adaptation of the code in the lesson.

Be sure you're using the simanneal package and not the simanneal_tsp() function from the lesson.

Note, the distances in the distance matrix are in meters so you'll need to divide by 1000 to get distances in kilometers.

Question 1 (1 point, manual)

Adapt the code in the lesson for using the simanneal package for the Traveling Salesman Problem to find a tour of the 48 state capitals that is less than 18,000 km in length. You don't need to make any changes to the TravellingSalesmanProblem class, but if your tour is longer than 18,000 km or has self-intersection (crossovers) then repeat the search. The global minimum for total distance is 17,357 km, but you don't need to find that. Use the plotting code from Lesson 5 to show a picture of the tour that includes the distance. In the space below, include your picture of the tour that includes the distance. Do not include code here, instead include it in Question 2.

Question 2 (5 points, manual)

Include your code for using simanneal and plotting the tour in the space below. Also make sure your code is in CoCalc and that it has been executed with output showing.

Hints for 3-5

We recognize many of you aren't familiar with object-oriented programming, so we've provided a template
for your solution in the code cell in Question 5.

You can also refer to the object-oriented solution to Question 12 in Homework 5. The setup is similar.
In the Homework 5 solutions we wrote separate functions for making moves and evaluating the objective function.
Then we called those functions from our LocalSearcher object.
You can follow that approach here, or you can include the code directly in the class.
For example, you could pick a random index and "flip that bit" in self.state directly in your code for the class.

Question 3 (1 point, manual)

Use simulated annealing in the simanneal package to solve the knapsack problem introduced in Lesson 5 and Homework 5. Your "move" function/method should use a "hard constraint" and reject any potential knapsack solutions with weight greater than 50:

  • copy self.state with new_state = self.state.copy()

  • make a move on new_state (toggle a random item like in HW 5)

  • if the total weight of new_state is $\leq 50$ then set self.state = new_state

  • else don't change self.state.

Your fitness function/method should return the negative total value of the items in the knapsack since simanneal is for minimizing functions, but here we want the maximum. Use this exact code to set up the knapsack. Do not change anything.

Note: the move method doesn't return anything, it should simply modify self.state.
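As a minimal sketch of the steps above (written as a standalone function rather than the required class method, and assuming self.state is a boolean numpy array and self.weights holds the item weights):

```python
import numpy as np

# hypothetical standalone version of the move method; in your solution this
# belongs on your Annealer subclass
def move(self):
    new_state = self.state.copy()           # work on a copy of the state
    i = np.random.randint(len(new_state))   # pick a random item
    new_state[i] = not new_state[i]         # toggle it in the copy
    if sum(self.weights[new_state]) <= 50:  # hard constraint: reject overweight
        self.state = new_state              # accept the move
    # otherwise self.state is left unchanged; nothing is returned
```

Note that the overweight candidate is simply discarded, so self.state is always a feasible knapsack.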

# generate random weights and values for a knapsack problem
# DO NOT CHANGE ANYTHING in this block of code
import numpy as np
num_items = 20
np.random.seed(seed=123)
values = np.random.randint(low=5, high=50, size=num_items)
weights = np.random.randint(low=1, high=10, size=num_items)
max_weight = 50
np.random.seed()  # use system clock to reset the seed so future random numbers will appear random

Simanneal should be able to find the global maximum value. Execute your code a few times and report the maximum (positive) total value for the items in the knapsack. Note: simanneal is for minimizing functions.

Question 4 (1 point, auto)

How many items are included in your optimized knapsack?

Question 5 (4 points, manual)

Include the complete code for your simanneal solution to the "hard constraint" knapsack problem in the cell below. Your complete and executed code must be in CoCalc also.

# solution code for Question 5 - this time we code the move and fitness
# directly into the methods of the Annealer object

# generate random weights and values for a knapsack problem
### DO NOT CHANGE ANYTHING in this block of code
import numpy as np
num_items = 20
np.random.seed(seed=123)
values = np.random.randint(low=5, high=50, size=num_items)
weights = np.random.randint(low=1, high=10, size=num_items)
max_weight = 50
np.random.seed()  # use system clock to reset the seed so future random numbers will appear random
###

from simanneal import Annealer

class KnapsackProblem(Annealer):
    # pass extra data into the constructor
    def __init__(self, state, weights, XXX, XXX):
        assert( len(state) == len(weights) )
        self.weights = weights
        self.XXX = XXX
        self.XXX = XXX
        super(KnapsackProblem, self).__init__(state)  # important!

    def move(self):
        # note, you can refer to self.weights, etc. as needed here
        new_state = self.state.copy()
        # choose a random index and flip that bit in new_state
        # if the total weight of the items in the new_state knapsack is lower
        # than the max, then self.state = new_state

    def energy(self):
        # return the negated total value of the items in the knapsack
        # represented by self.state

# initialize an empty knapsack
init_knapsack = np.zeros(num_items, dtype=bool)

# create an object of type KnapsackProblem.  Pass in initial knapsack, values, etc.
knapsack = KnapsackProblem(init_knapsack, XXX, XXX)

# set the annealing schedule
knapsack.set_schedule(knapsack.auto(minutes=.2))

# call the annealer to find a solution
best_knapsack, best_value = knapsack.anneal()

# print the results
print(f"The maximum knapsack value found is {-best_value:.0f}")
print(f"The weight of that knapsack is {sum(weights[best_knapsack])}")
print(f"Include items {[i for i,b in enumerate(best_knapsack) if b]} in knapsack")

Hints for 6-7

This is pretty similar to Questions 3-5, but make sure you don't reject any knapsacks inside the move method
- it's OK to have overweight knapsacks in this approach.
The penalized objective function should steer the solution away from knapsacks that are too heavy.

Question 6 (1 point, auto)

Use simanneal to solve the knapsack problem again. This time use a "soft constraint" approach. This means all knapsacks can be considered, even those that are over 50 in total weight, so the move function should not reject any potential solutions. Soft constraints are implemented by including a penalty in the objective function when the proposed solution doesn't satisfy the constraint. To do this, you'll modify the function called by the energy() method:

def knapsack_value_penalty(x, values, weights, max_tot_weight):
    # x is a vector of booleans of which items to include
    tot_value = sum(values[x])
    penalty = sum(values) * min(max_tot_weight - sum(weights[x]), 0)
    return tot_value + penalty

The penalty here is negative when the total weight is too large so that the optimizer knows it hasn't found a good maximizing solution and will continue to look elsewhere.
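To see the penalty in action, here is a small worked example with made-up values and weights (three items, not the homework's twenty) showing that an overweight knapsack gets a large negative objective while a feasible one is scored by value alone:

```python
import numpy as np

def knapsack_value_penalty(x, values, weights, max_tot_weight):
    # x is a vector of booleans of which items to include
    tot_value = sum(values[x])
    penalty = sum(values) * min(max_tot_weight - sum(weights[x]), 0)
    return tot_value + penalty

# tiny illustrative data (NOT the homework setup)
values  = np.array([10, 20, 30])
weights = np.array([30, 30, 30])
max_tot_weight = 50

over  = np.array([True, True, False])   # weight 60 > 50: value 30, penalty 60*(-10)
under = np.array([True, False, False])  # weight 30 <= 50: penalty is 0

print(knapsack_value_penalty(over,  values, weights, max_tot_weight))   # -570
print(knapsack_value_penalty(under, values, weights, max_tot_weight))   # 10
```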

You should be able to find the global maximum with this approach. What is the maximum total value of the knapsack? If you've done everything correctly the answer should be the same as in Question 3.

Question 7 (5 points, manual)

Include your complete code for the soft constraint approach to the knapsack problem in the space below. Also make sure the code is in CoCalc and that it has been executed there.

Questions 8 - 20 Text

In the following you're going to minimize the ten-dimensional Rastrigin function several different ways and compare the results. The Rastrigin function was introduced in Lesson 5 and Homework 5. Make sure you're using decision variables $x_1, x_2, \ldots, x_{10}$ that are in one array / list / numpy array with 10 elements. The bounds are $-5.12 \leq x_i \leq 5.12$ for each $i$. The global minimum value is 0 and occurs when $x_1 = x_2 = \cdots = x_{10} = 0$. At the end you'll compare the efficiency of the optimization algorithms.

Note, the Rastrigin function is a popular function for testing optimization algorithms. While it doesn't come from an application, it is realistic to have an objective function with MANY local minima. In fact, one of the challenges of deep learning is that the underlying objective function has many local minima (and has thousands or even billions of weights or decision variables to be tuned).

The code in the next cell shows you how to set up the Rastrigin function with a global counter so that you can easily track how many function evaluations are made. Set func_count to zero before the optimization and then print out the value of func_count after.

Hints

To make the Rastrigin function 10-dimensional, you have to evaluate the rastrigin() function on a numpy array of length 10.
Set the dimension by changing the length of the input array.
Note, we introduced the Rastrigin function in Lesson 5 in the section "Multistart for finding Global Optima". You may find it helpful to review that section.
We also included an example of a local search for a 10-dimensional Rastrigin function below.
# rastrigin definition
def rastrigin(x):
    # x can be any length ... the length of x determines the number of
    # input variables and is called the dimension
    global func_count
    func_count += 1
    x = np.array(x)  # force a numpy array here so that the math below works
    # pass a single vector of length n (=dim) to evaluate Rastrigin
    return sum(x**2 + 10 - 10 * np.cos(2 * np.pi * x))

# as a demo we'll use scipy.optimize.minimize to find a local min and
# see how we can retrieve the number of function evaluations required by
# the optimization
func_count = 0  # must set to zero before evaluation
x0 = np.random.uniform(-5.12, 5.12, size=10)
res = minimize(rastrigin, x0)
print(f'The Rastrigin function was evaluated {func_count} times')
# not all optimization routines track the number of evaluations,
# but minimize does, like this
print(f'According to minimize the number of function evaluations is {res.nfev}.')
print(f'The minimum value found was: {res.fun:0.4f}')
print(f'This minvalue occurs at: {res.x}')

The Rastrigin function was evaluated 429 times
According to minimize the number of function evaluations is 429.
The minimum value found was: 86.5610
This minvalue occurs at: [-2.98485571 -3.97978387 -2.98485572 -4.97469141 0.99495863 1.98991223 -1.98991224 0.99495863 2.98485571 2.98485569]

Question 8 (1 point, auto)

Use the simanneal package as described in the lesson (with the "bumpy" function and in the self-assessment) to estimate the global minimum of the ten-dimensional Rastrigin function. Use sigma = 2. Also use the "auto schedule" with minutes = 0.2. What's the lowest value of the objective function you are able to obtain?

Question 9 (1 point, auto)

How many times did simanneal evaluate the Rastrigin function to find the best answer you reported in Question 8?

Question 10 (1 point, manual)

Experiment with the value of sigma. Larger sigma corresponds to larger random moves. Explain how to adjust sigma to achieve lower estimates of the global minimum value (closer to zero).

Question 11 (4 points, manual)

Include the complete code to minimize the ten-dimensional Rastrigin function with simanneal. Make sure your code is in CoCalc and has been run there.

Question 12 (1 point, auto)

Now include local search in your simanneal solution. To do this, you'll want to keep the move method as above and modify the energy method to go like this:

The energy method should:

  • initiate a local search of Rastrigin starting at self.state; incorporate the bounds and use `tol=0.1` or `tol=0.5` for a faster (and less accurate) local search

  • set self.state to result.x from the local search

  • return result.fun (the best function value from the local search)

For the local search use minimize from scipy.optimize with tol = 0.5. After your annealing is done, run one extra local search using the default value of tol so that your final answer is an accurate minimum. Don't forget to include bounds in minimize() so that the local search doesn't move to an out-of-bounds solution.
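The coarse-then-polish local search can be sketched like this (re-defining Rastrigin without the global counter, and using a fixed random start purely for illustration):

```python
import numpy as np
from scipy.optimize import minimize

def rastrigin(x):
    x = np.array(x)
    return sum(x**2 + 10 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 10           # keep the search inside the box
np.random.seed(0)                       # fixed start point for illustration
x0 = np.random.uniform(-5.12, 5.12, size=10)

rough = minimize(rastrigin, x0, bounds=bounds, tol=0.5)   # fast, low accuracy
final = minimize(rastrigin, rough.x, bounds=bounds)       # default-tol polish
print(final.fun, final.x)
```

Inside your energy method you'd run only the tol=0.5 search; the default-tolerance polish happens once, after anneal() returns.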

The auto scheduler in simanneal is very slow here since the function evaluations are pretty slow. You should not use auto_schedule and instead add a manual temperature schedule like this:

rast.Tmax = 10000.0  # Max (starting) temperature
rast.Tmin = 0.1      # Min (ending) temperature
rast.steps = 5000    # Number of iterations

where rast is the name of your instance of the Annealer object.

Use sigma = 2. What's the lowest value of the objective function you are able to obtain?

Question 13 (1 point, auto)

How many times did simanneal evaluate the Rastrigin function to find the best answer you reported in Question 12?

Question 14 (5 points, manual)

Include the complete code to minimize the ten-dimensional Rastrigin function with simanneal and local search. Make sure your code is in CoCalc and has been run there.

def energy(self):
    result = minimize(rastrigin, self.state, bounds=..., tol=0.1)
    # result.x is best_x, result.fun is best_y
    self.state = result.x
    return result.fun

Question 15 (1 point, auto)

Now use the gp_minimize function to approximate the global minimum value of the ten-dimensional Rastrigin function. Include n_random_starts = 5 like in the examples.

What's the lowest value of the objective function you are able to obtain?

Question 16 (1 point, auto)

How many times did gp_minimize evaluate the Rastrigin function to find the best answer you reported in Question 15?

Question 17 (5 points, manual)

Include the complete code to minimize the ten-dimensional Rastrigin function with gp_minimize. Use Expected Improvement for the acquisition function and set n_calls = 40 (this will be slow as there is a lot of overhead for Bayesian Optimization). Make sure your code is in CoCalc and has been run there. (Hint: to make a list of 10 tuples for the bounds, you could do this: [(-5.12,5.12)]*10.)

Question 18 (1 point, auto)

Use dual_annealing from scipy.optimize to approximate the global minimum value of the ten-dimensional Rastrigin function.

What's the lowest value of the objective function you are able to obtain? Round your answer to two decimal places, e.g. 0.53 or 9.74. Notice that 4.2e-14 is 0.00 to two decimal places.
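A minimal sketch of the call (again re-defining Rastrigin without the counter; if you use the counter version from above, func_count gives you the evaluation count for Question 19, as does res.nfev):

```python
import numpy as np
from scipy.optimize import dual_annealing

def rastrigin(x):
    x = np.array(x)
    return sum(x**2 + 10 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 10       # one (low, high) pair per dimension
res = dual_annealing(rastrigin, bounds, seed=1234)  # seed only for reproducibility
print(f"minimum value {res.fun:.2f} after {res.nfev} evaluations")
```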

Question 19 (1 point, auto)

How many times did dual_annealing evaluate the Rastrigin function to find the best answer you reported in Question 18?

Question 20 (5 points, manual)

Include the complete code to minimize the ten-dimensional Rastrigin function with dual_annealing. Make sure your code is in CoCalc and has been run there.

Question 21 (4 points, manual)

Write a brief comparison of the four methods used to minimize the 10D Rastrigin problem. Which worked best in terms of finding the global minimum? Which was most efficient? Which was easiest to use? Note: pay attention to scientific notation. 5.01900952e-09 is the same as 0.00000000501900952. You very likely have some numbers which are effectively zero in your results.

Hints

Note: it's expected that Bayesian optimization does poorly here. Bayesian optimization is designed to optimize functions which
are very expensive to evaluate, so that we can only afford a few evaluations.
Bayesian optimization would be very competitive if all of the approaches were only allowed to evaluate the objective function a small number of times.