Path: blob/master/Week 2/Programming Assignment-1/gradientDescentMulti.m
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)1%GRADIENTDESCENTMULTI Performs gradient descent to learn theta2% theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, num_iters) updates theta by3% taking num_iters gradient steps with learning rate alpha45% Initialize some useful values6m = length(y); % number of training examples7J_history = zeros(num_iters, 1);89for iter = 1:num_iters1011% ====================== YOUR CODE HERE ======================12% Instructions: Perform a single gradient step on the parameter vector13% theta.14%15% Hint: While debugging, it can be useful to print out the values16% of the cost function (computeCostMulti) and gradient here.17%18error = (X*theta)-y;1920%temp = X * theta;21%error = temp - y;22%novaX = error' * X;23%theta = theta - ((alpha/m) * novaX');2425for i=1:size(theta,1),26temp(i) = theta(i) - (alpha/m)*sum(error.*(X(:,i)));27theta(i,:)=temp(i);28end;29% ============================================================3031% Save the cost J in every iteration32J_history(iter) = computeCostMulti(X, y, theta);3334printf("Compute Cost\n");35disp(J_history(iter));36end3738end394041
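
A minimal usage sketch of this function, not part of the original file. It assumes the standard layout of this assignment: ex1data2.txt, featureNormalize.m, and computeCostMulti.m are taken to be available alongside this script, and the alpha and num_iters values are illustrative only.

% --- Usage sketch (assumed file/helper names from the assignment) ---
data = load('ex1data2.txt');            % assumed data file: features in cols 1-2, target in col 3
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

[X, mu, sigma] = featureNormalize(X);   % assumed helper that scales each feature
X = [ones(m, 1) X];                     % prepend the intercept column of ones

alpha = 0.01;                           % learning rate (illustrative)
num_iters = 400;                        % number of gradient steps (illustrative)
theta = zeros(3, 1);                    % initial parameters

[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);

% Plot the recorded cost to check that it decreases monotonically
plot(1:num_iters, J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');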