GitHub Repository: hackassin/Coursera-Machine-Learning
Path: blob/master/Week 4/Programming Assignment - 3/machine-learning-ex3/ex3/oneVsAll.m
function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta
%corresponds to the classifier for label i
%   [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
%   logistic regression classifiers and returns each of these classifiers
%   in a matrix all_theta, where the i-th row of all_theta corresponds
%   to the classifier for label i

% Some useful variables
m = size(X, 1);
n = size(X, 2);

% You need to return the following variables correctly
all_theta = zeros(num_labels, n + 1);

% Add ones to the X data matrix
X = [ones(m, 1) X];

% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the following code to train num_labels
%               logistic regression classifiers with regularization
%               parameter lambda.
%
% Hint: theta(:) will return a column vector.
%
% Hint: You can use y == c to obtain a vector of 1's and 0's that tell you
%       whether the ground truth is true/false for this class.
%
% Note: For this assignment, we recommend using fmincg to optimize the cost
%       function. It is okay to use a for-loop (for c = 1:num_labels) to
%       loop over the different classes.
%
%       fmincg works similarly to fminunc, but is more efficient when we
%       are dealing with a large number of parameters.
%
% Example Code for fmincg:
%
%     % Set Initial theta
%     initial_theta = zeros(n + 1, 1);
%
%     % Set options for fmincg
%     options = optimset('GradObj', 'on', 'MaxIter', 50);
%
%     % Run fmincg to obtain the optimal theta
%     % This function will return theta and the cost
%     [theta] = ...
%         fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
%                initial_theta, options);
%
% Optimizer settings: the cost function supplies its own gradient
% ('GradObj', 'on'), and each classifier is trained for at most 50 iterations.
options = optimset('GradObj', 'on', 'MaxIter', 50);
initial_theta = zeros(n + 1, 1);

% Train one regularized logistic regression classifier per class label.
% (y == k) turns the multi-class label vector into a binary 0/1 target for class k.
for k = 1:num_labels
  theta = fmincg(@(t)(lrCostFunction(t, X, (y == k), lambda)), initial_theta, options);
  all_theta(k, :) = theta';   % store the learned parameters as row k of all_theta
  %printf("all_theta(%f, :)", k);
  %disp(all_theta(k, :));
end

% =========================================================================

end
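
The loop above relies on lrCostFunction returning both the regularized logistic regression cost and its gradient, which is what the 'GradObj', 'on' option tells fmincg to expect. The assignment ships its own lrCostFunction.m elsewhere in this repository; the following is only a minimal sketch of what such a function computes, assuming a sigmoid helper is on the path:

function [J, grad] = lrCostFunction(theta, X, y, lambda)
  % Sketch of a regularized logistic regression cost and gradient
  % (illustrative only, not the repository's lrCostFunction.m).
  % The bias term theta(1) is excluded from regularization.
  m = length(y);
  h = sigmoid(X * theta);
  reg_theta = [0; theta(2:end)];
  J = (1 / m) * (-y' * log(h) - (1 - y)' * log(1 - h)) ...
      + (lambda / (2 * m)) * (reg_theta' * reg_theta);
  grad = (1 / m) * (X' * (h - y)) + (lambda / m) * reg_theta;
end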
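
For context, a rough sketch of how oneVsAll is typically driven from the ex3 exercise script. The data file name, num_labels = 10, and lambda = 0.1 below follow the standard ex3 setup and are assumptions, not part of this file:

% Sketch only: assumes ex3data1.mat provides X (one example per row) and
% y (labels 1..10, with digit 0 stored as label 10), and that fmincg.m,
% lrCostFunction.m and sigmoid.m are on the Octave path.
load('ex3data1.mat');
num_labels = 10;
lambda = 0.1;
all_theta = oneVsAll(X, y, num_labels, lambda);   % num_labels x (n+1)

% Predict by picking the classifier with the highest score for each example.
[~, pred] = max(sigmoid([ones(size(X, 1), 1) X] * all_theta'), [], 2);
fprintf('Training set accuracy: %f\n', mean(double(pred == y)) * 100);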