GitHub Repository: hackassin/Coursera-Machine-Learning
Path: blob/master/Week 6/Programming Assignment - 5/machine-learning-ex5/ex5/linearRegCostFunction.m
function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
%LINEARREGCOSTFUNCTION Compute cost and gradient for regularized linear
%regression with multiple variables
%   [J, grad] = LINEARREGCOSTFUNCTION(X, y, theta, lambda) computes the
%   cost of using theta as the parameter for linear regression to fit the
%   data points in X and y. Returns the cost in J and the gradient in grad.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost and gradient of regularized linear
%               regression for a particular choice of theta.
%
%               You should set J to the cost and grad to the gradient.
%

% Compute the regularized cost. The bias term theta(1) is excluded from
% regularization, so work with a copy whose first entry is zeroed out.
thetaReg = theta;
thetaReg(1) = 0;

J = (1/(2*m)) * sum(((X*theta) - y).^2) + (lambda/(2*m)) * sum(thetaReg.^2);

% Compute the gradient; the same zeroed copy keeps the bias term
% unpenalized while regularizing the remaining parameters.
grad = (1/m) * (X'*((X*theta) - y)) + (lambda/m) * thetaReg;
% =========================================================================

grad = grad(:);

end
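For readers working outside Octave, the same cost and gradient formulas can be sketched in NumPy. This is a minimal port, not part of the course's starter code; the function name `linear_reg_cost_function` and the toy data below are my own choices for illustration.

```python
import numpy as np

def linear_reg_cost_function(X, y, theta, lam):
    """Regularized linear regression cost and gradient (NumPy sketch).

    Mirrors the Octave version above: the bias parameter theta[0] is
    excluded from the regularization term in both J and grad.
    """
    m = len(y)
    h = X @ theta                     # hypothesis: predictions for all examples
    theta_reg = theta.copy()
    theta_reg[0] = 0.0                # do not regularize the bias term

    J = (1.0 / (2 * m)) * np.sum((h - y) ** 2) \
        + (lam / (2 * m)) * np.sum(theta_reg ** 2)
    grad = (1.0 / m) * (X.T @ (h - y)) + (lam / m) * theta_reg
    return J, grad

# Toy check: X already includes the column of ones, and theta = [0, 1]
# fits y = x exactly, so the unregularized cost is 0.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 1.0, 2.0])
theta = np.array([0.0, 1.0])
J, grad = linear_reg_cost_function(X, y, theta, lam=1.0)
```

With `lam = 1.0` only the regularization term contributes: J = (1/(2·3))·1² = 1/6, and the gradient is [0, 1/3], since the residuals are zero and only the penalty on theta[1] remains.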