Path: blob/master/Week 2/Programming Assignment-1/ex1_multi.m
%% Machine Learning Online Class
% Exercise 1: Linear regression with multiple variables
%
% Instructions
% ------------
%
% This file contains code that helps you get started on the
% linear regression exercise.
%
% You will need to complete the following functions in this
% exercise:
%
%    warmUpExercise.m
%    plotData.m
%    gradientDescent.m
%    computeCost.m
%    gradientDescentMulti.m
%    computeCostMulti.m
%    featureNormalize.m
%    normalEqn.m
%
% For this part of the exercise, you will need to change some
% parts of the code below for various experiments (e.g., changing
% learning rates).
%

%% Initialization

%% ================ Part 1: Feature Normalization ================

%% Clear and Close Figures
clear ; close all; clc

fprintf('Loading data ...\n');

%% Load Data
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Print out some data points
fprintf('First 10 examples from the dataset: \n');
fprintf(' x = [%.0f %.0f], y = %.0f \n', [X(1:10,:) y(1:10,:)]');

fprintf('Program paused. Press enter to continue.\n');
pause;

% Scale features and set them to zero mean
fprintf('Normalizing Features ...\n');

[X mu sigma] = featureNormalize(X);

% Add intercept term to X
X = [ones(m, 1) X];


%% ================ Part 2: Gradient Descent ================

% ====================== YOUR CODE HERE ======================
% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCost and gradientDescent already work with
%               this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.
%

fprintf('Running gradient descent ...\n');

% Choose some alpha value
alpha = 0.021;
num_iters = 400;

% Init Theta and Run Gradient Descent
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);

% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
% Recall that the first column of X is all-ones. Thus, it does
% not need to be normalized.

% Scale the query's features with the training-set mu and sigma
temp = [1 1650 3];
temp(1,2) = (temp(1,2) - mu(1,1)) / sigma(1,1);
temp(1,3) = (temp(1,3) - mu(1,2)) / sigma(1,2);

price = temp * theta;

% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 3: Normal Equations ================

fprintf('Solving with normal equations...\n');

% ====================== YOUR CODE HERE ======================
% Instructions: The following code computes the closed form
%               solution for linear regression using the normal
%               equations. You should complete the code in
%               normalEqn.m
%
%               After doing so, you should complete this code
%               to predict the price of a 1650 sq-ft, 3 br house.
%

%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);

% Display normal equation's result
fprintf('Theta computed from the normal equations: \n');
fprintf(' %f \n', theta);
fprintf('\n');


% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
% No feature scaling is needed here: the normal equation was fit
% on the raw (unscaled) features.
temp = [1 1650 3];
price = temp * theta;

% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using normal equations):\n $%f\n'], price);