Path: blob/master/Week 6/Programming Assignment - 5/machine-learning-ex5/ex5/linearRegCostFunction.m
function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
%LINEARREGCOSTFUNCTION Compute cost and gradient for regularized linear
%regression with multiple variables
%   [J, grad] = LINEARREGCOSTFUNCTION(X, y, theta, lambda) computes the
%   cost of using theta as the parameter for linear regression to fit the
%   data points in X and y. Returns the cost in J and the gradient in grad.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost and gradient of regularized linear
%               regression for a particular choice of theta.
%
%               You should set J to the cost and grad to the gradient.
%

% Computing the cost. The bias term theta(1) is excluded from
% regularization, so zero it out in a copy of theta used only for
% the regularization terms.
thetaReg = theta;
thetaReg(1) = 0;

J = (1/(2*m)) * sum(((X*theta) - y).^2) + (lambda/(2*m)) * sum(thetaReg.^2);

% Computing the gradient (vectorized over all parameters)
grad = (1/m) * (X' * ((X*theta) - y)) + (lambda/m) * thetaReg;

% =========================================================================

grad = grad(:);

end
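As a sanity check, the function can be exercised from the Octave prompt on a tiny hand-checked dataset. The values below are illustrative and not part of the assignment; X is chosen so that theta fits the data exactly, making the unregularized cost and gradient zero by construction.

```matlab
% Three training examples with a bias column of ones.
X = [1 1; 1 2; 1 3];
y = [1; 2; 3];
theta = [0; 1];            % X*theta equals y exactly

% lambda = 0: no regularization, perfect fit.
[J, grad] = linearRegCostFunction(X, y, theta, 0)
% J = 0, grad = [0; 0]

% lambda = 1: only theta(2) is penalized, since theta(1) is the bias.
% J = (1/(2*3))*1 = 0.16667, grad(2) = (1/3)*1 = 0.33333
[J, grad] = linearRegCostFunction(X, y, theta, 1)
% J = 0.16667, grad = [0; 0.33333]
```

Note that with a perfect fit the data term vanishes, so the lambda = 1 case isolates exactly the contribution of the regularization terms to both J and grad.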