GitHub Repository: hackassin/Coursera-Machine-Learning
Path: blob/master/Week 3/Programming Assignment - 2/machine-learning-ex2/ex2/ex2_reg.m
%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the second part
%  of the exercise which covers regularization with logistic regression.
%
%  You will need to complete the following functions in this exercise:
%
%     sigmoid.m (a minimal reference sketch follows this header)
%     costFunction.m
%     predict.m
%     costFunctionReg.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

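% For reference, sigmoid.m admits a one-line vectorized implementation.
% This is a minimal sketch, not the graded solution; the actual file is
% one you complete yourself as part of the assignment:
%
%   function g = sigmoid(z)
%     % Works elementwise on scalars, vectors, and matrices
%     g = 1 ./ (1 + exp(-z));
%   end
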
%% Initialization
clear ; close all; clc

%% Load Data
% The first two columns contain the X values and the third column
% contains the label (y).

data = load('ex2data2.txt');
X = data(:, [1, 2]); y = data(:, 3);

plotData(X, y);

% Put some labels
hold on;

% Labels and Legend
xlabel('Microchip Test 1')
ylabel('Microchip Test 2')

% Specified in plot order
legend('y = 1', 'y = 0')
hold off;

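% For reference, the plotData helper called above displays the two classes
% with different markers. A minimal sketch, assuming the usual 'k+' / 'ko'
% marker convention from this exercise (the real plotData.m ships with the
% assignment):
%
%   function plotData(X, y)
%     pos = find(y == 1); neg = find(y == 0);
%     figure; hold on;
%     plot(X(pos, 1), X(pos, 2), 'k+', 'LineWidth', 2, 'MarkerSize', 7);
%     plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', ...
%          'MarkerSize', 7);
%     hold off;
%   end
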
%% =========== Part 1: Regularized Logistic Regression ============
% In this part, you are given a dataset with data points that are not
% linearly separable. However, you would still like to use logistic
% regression to classify the data points.
%
% To do so, you introduce more features to use -- in particular, you add
% polynomial features to the data matrix (similar to polynomial
% regression).
%

% Add Polynomial Features

% Note that mapFeature also adds a column of ones for us, so the intercept
% term is handled
X = mapFeature(X(:,1), X(:,2));
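% For reference, a sketch of the polynomial feature mapping that mapFeature
% performs, assuming the degree-6 mapping described in the exercise handout
% (mapFeature.m itself is provided with the assignment):
%
%   function out = mapFeature(X1, X2)
%     degree = 6;
%     out = ones(size(X1(:,1)));                 % bias (intercept) column
%     for i = 1:degree
%       for j = 0:i
%         out(:, end+1) = (X1.^(i-j)) .* (X2.^j); % every term x1^(i-j)*x2^j
%       end
%     end
%   end
%
% This maps the two input features into a 28-dimensional feature vector.
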
% Initialize fitting parameters
initial_theta = zeros(size(X, 2), 1);

% Set regularization parameter lambda to 1
lambda = 1;

% Compute and display initial cost and gradient for regularized logistic
% regression
[cost, grad] = costFunctionReg(initial_theta, X, y, lambda);

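% As a reference for the call above, the regularized cost and gradient are
%
%   J(theta) = (1/m) * sum( -y .* log(h) - (1 - y) .* log(1 - h) )
%              + (lambda / (2*m)) * sum( theta(2:end).^2 )
%
% where h = sigmoid(X * theta) and the bias term theta(1) is not
% regularized. A minimal vectorized sketch of costFunctionReg.m
% (hypothetical; the actual file is completed as part of the assignment):
%
%   function [J, grad] = costFunctionReg(theta, X, y, lambda)
%     m = length(y);
%     h = sigmoid(X * theta);
%     reg = theta; reg(1) = 0;                 % exclude bias from penalty
%     J = (1/m) * (-y' * log(h) - (1 - y)' * log(1 - h)) ...
%         + (lambda / (2*m)) * (reg' * reg);
%     grad = (1/m) * (X' * (h - y)) + (lambda / m) * reg;
%   end
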
fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Expected cost (approx): 0.693\n');
fprintf('Gradient at initial theta (zeros) - first five values only:\n');
fprintf(' %f \n', grad(1:5));
fprintf('Expected gradients (approx) - first five values only:\n');
fprintf(' 0.0085\n 0.0188\n 0.0001\n 0.0503\n 0.0115\n');

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

% Compute and display cost and gradient
% with all-ones theta and lambda = 10
test_theta = ones(size(X,2),1);
[cost, grad] = costFunctionReg(test_theta, X, y, 10);

fprintf('\nCost at test theta (with lambda = 10): %f\n', cost);
fprintf('Expected cost (approx): 3.16\n');
fprintf('Gradient at test theta - first five values only:\n');
fprintf(' %f \n', grad(1:5));
fprintf('Expected gradients (approx) - first five values only:\n');
fprintf(' 0.3460\n 0.1614\n 0.1948\n 0.2269\n 0.0922\n');

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============= Part 2: Regularization and Accuracies =============
% Optional Exercise:
% In this part, you will get to try different values of lambda and
% see how regularization affects the decision boundary.
%
% Try the following values of lambda (0, 1, 10, 100).
%
% How does the decision boundary change when you vary lambda? How does
% the training set accuracy vary?
%

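% One way to run the experiment suggested above is to sweep several values
% of lambda and compare the resulting boundaries. A hypothetical sketch,
% not part of the required assignment flow:
%
%   for lambda = [0 1 10 100]
%     opts = optimset('GradObj', 'on', 'MaxIter', 400);
%     th   = fminunc(@(t)(costFunctionReg(t, X, y, lambda)), ...
%                    zeros(size(X, 2), 1), opts);
%     plotDecisionBoundary(th, X, y);
%     title(sprintf('lambda = %g', lambda));
%   end
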
% Initialize fitting parameters
initial_theta = zeros(size(X, 2), 1);

% Set regularization parameter lambda to 1 (you should vary this)
lambda = 1;

% Set Options
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Optimize
[theta, J, exit_flag] = ...
	fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);

% Plot Boundary
plotDecisionBoundary(theta, X, y);
hold on;
title(sprintf('lambda = %g', lambda))

% Labels and Legend
xlabel('Microchip Test 1')
ylabel('Microchip Test 2')

legend('y = 1', 'y = 0', 'Decision boundary')
hold off;

% Compute accuracy on our training set
p = predict(theta, X);

fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);
fprintf('Expected accuracy (with lambda = 1): 83.1 (approx)\n');
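
% For reference, the predict helper used above can be a one-liner
% (a minimal sketch; the actual predict.m is completed as part of the
% assignment):
%
%   function p = predict(theta, X)
%     % Classify as positive when the predicted probability >= 0.5
%     p = sigmoid(X * theta) >= 0.5;
%   end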