GitHub Repository: hackassin/Coursera-Machine-Learning
Path: blob/master/Week 2/Programming Assignment-1/gradientDescentMulti.m
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   theta = GRADIENTDESCENTMULTI(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %
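    % For reference (a restatement, not part of the original template), the
    % gradient step implemented here is the standard multivariate update
    %     theta := theta - (alpha/m) * X' * (X*theta - y)
    % carried out either element-by-element (the loop below) or as a single
    % vectorized matrix operation (the commented-out alternative further down).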
    % Prediction error for the current theta; named "err" to avoid shadowing
    % the built-in error() function
    err = (X * theta) - y;

    % Equivalent vectorized update (earlier attempt, kept as reference):
    %   novaX = err' * X;
    %   theta = theta - ((alpha/m) * novaX');

    % Update each parameter; err is fixed within this iteration, so all
    % entries of theta are updated simultaneously
    for i = 1:size(theta, 1)
        theta(i) = theta(i) - (alpha/m) * sum(err .* X(:, i));
    end
    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);

    % Print the cost at each iteration (Octave's printf); it should decrease
    % steadily if the learning rate alpha is well chosen
    printf("Compute Cost\n");
    disp(J_history(iter));
end
end
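
% --- Usage sketch (not part of the assignment template) -----------------
% A minimal check of this function on synthetic data, assuming the course's
% computeCostMulti(X, y, theta) is on the path; the variable names below
% (X_demo, y_demo, theta0, etc.) are illustrative only:
%
%   m_demo = 50;
%   X_demo = [ones(m_demo, 1), randn(m_demo, 2)];   % intercept + 2 features
%   y_demo = X_demo * [1; 2; 3] + 0.1 * randn(m_demo, 1);
%   theta0 = zeros(3, 1);
%   [theta_hat, J_hist] = gradientDescentMulti(X_demo, y_demo, theta0, 0.1, 400);
%   plot(1:numel(J_hist), J_hist);                  % cost should decrease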