Path: blob/master/Week 2/Programming Assignment-1/gradientDescent.m
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    % Batch update rule: theta_j := theta_j - (alpha/m) * sum(error .* X(:,j))
    predictions = X * theta;      % hypothesis values for all m examples
    error = predictions - y;      % residuals
    temp1 = theta(1) - (alpha/m) * sum(error .* X(:,1));
    temp2 = theta(2) - (alpha/m) * sum(error .* X(:,2));
    theta = [temp1; temp2];       % simultaneous update of both parameters

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);
    % printf('Compute Cost: J_history(iter %d) = %f\n', iter, J_history(iter));

end

predict3 = [1, 6] * theta;
fprintf('For population = 60,000, we predict a profit of %f\n', ...
    predict3*10000);
predict4 = [1, 8] * theta;
fprintf('For population = 80,000, we predict a profit of %f\n', ...
    predict4*10000);

end
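For reference, a minimal driver sketch showing how this function is typically called. The data file ex1data1.txt, the computeCost helper, and the alpha/num_iters values are assumptions carried over from the standard assignment setup, not part of this file:

% Minimal usage sketch (assumes ex1data1.txt and computeCost.m are on the path)
data = load('ex1data1.txt');     % col 1: population in 10,000s; col 2: profit in $10,000s
m = size(data, 1);               % number of training examples
X = [ones(m, 1), data(:, 1)];    % prepend a column of ones for the intercept term
y = data(:, 2);
theta = zeros(2, 1);             % initialize fitting parameters to zero
alpha = 0.01;                    % learning rate (assumed assignment default)
num_iters = 1500;                % number of gradient steps (assumed assignment default)
[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);
fprintf('Theta found by gradient descent: %f %f\n', theta(1), theta(2));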