Path: blob/master/Week 3/Programming Assignment - 2/machine-learning-ex2/ex2/ex2.m
%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the logistic
%  regression exercise. You will need to complete the following functions
%  in this exercise:
%
%     sigmoid.m
%     costFunction.m
%     predict.m
%     costFunctionReg.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% Load Data
%  The first two columns contain the exam scores and the third column
%  contains the label.

data = load('ex2data1.txt');
X = data(:, [1, 2]); y = data(:, 3);

%% ==================== Part 1: Plotting ====================
%  We start the exercise by first plotting the data to understand
%  the problem we are working with.

fprintf(['Plotting data with + indicating (y = 1) examples and o ' ...
         'indicating (y = 0) examples.\n']);

plotData(X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;


%% ============ Part 2: Compute Cost and Gradient ============
%  In this part of the exercise, you will implement the cost and gradient
%  for logistic regression.
%  You need to complete the code in costFunction.m

% Set up the data matrix appropriately, and add ones for the intercept term
[m, n] = size(X);

% Add intercept term to x and X_test
X = [ones(m, 1) X];

% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);

% Compute and display initial cost and gradient
[cost, grad] = costFunction(initial_theta, X, y);

fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Expected cost (approx): 0.693\n');
fprintf('Gradient at initial theta (zeros): \n');
fprintf(' %f \n', grad);
fprintf('Expected gradients (approx):\n -0.1000\n -12.0092\n -11.2628\n');

% Compute and display cost and gradient with non-zero theta
test_theta = [-24; 0.2; 0.2];
[cost, grad] = costFunction(test_theta, X, y);

fprintf('\nCost at test theta: %f\n', cost);
fprintf('Expected cost (approx): 0.218\n');
fprintf('Gradient at test theta: \n');
fprintf(' %f \n', grad);
fprintf('Expected gradients (approx):\n 0.043\n 2.566\n 2.647\n');

fprintf('\nProgram paused. Press enter to continue.\n');
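
% For orientation only: one common vectorized way to write the body of
% costFunction.m (a sketch, not the graded solution — your own file must
% implement this, and it assumes sigmoid.m is already working):
%
%   h    = sigmoid(X * theta);                                % m x 1 predictions
%   J    = (1 / m) * (-y' * log(h) - (1 - y)' * log(1 - h));  % logistic cost
%   grad = (1 / m) * (X' * (h - y));                          % (n+1) x 1 gradient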
pause;


%% ============= Part 3: Optimizing using fminunc =============
%  In this exercise, you will use a built-in function (fminunc) to find the
%  optimal parameters theta.

% Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Run fminunc to obtain the optimal theta
% This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

% Print theta to screen
fprintf('Cost at theta found by fminunc: %f\n', cost);
fprintf('Expected cost (approx): 0.203\n');
fprintf('theta: \n');
fprintf(' %f \n', theta);
fprintf('Expected theta (approx):\n');
fprintf(' -25.161\n 0.206\n 0.201\n');

% Plot Boundary
plotDecisionBoundary(theta, X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============== Part 4: Predict and Accuracies ==============
%  After learning the parameters, you'd like to use them to predict the
%  outcomes on unseen data.
%  In this part, you will use the logistic regression model
%  to predict the probability that a student with score 45 on exam 1 and
%  score 85 on exam 2 will be admitted.
%
%  Furthermore, you will compute the training and test set accuracies of
%  our model.
%
%  Your task is to complete the code in predict.m

% Predict probability for a student with score 45 on exam 1
% and score 85 on exam 2

prob = sigmoid([1 45 85] * theta);
fprintf(['For a student with scores 45 and 85, we predict an admission ' ...
         'probability of %f\n'], prob);
fprintf('Expected value: 0.775 +/- 0.002\n\n');

% Compute accuracy on our training set
p = predict(theta, X);

fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);
fprintf('Expected accuracy (approx): 89.0\n');
fprintf('\n');
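
% For orientation only: hedged sketches of the two helper functions this
% script calls. The graded versions belong in sigmoid.m and predict.m;
% these show one common vectorized form, not the official solutions:
%
%   function g = sigmoid(z)
%     % Element-wise logistic function; works on scalars, vectors, matrices
%     g = 1 ./ (1 + exp(-z));
%   end
%
%   function p = predict(theta, X)
%     % Predict label 1 when the estimated probability is at least 0.5
%     p = sigmoid(X * theta) >= 0.5;
%   end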