GitHub Repository: hackassin/Coursera-Machine-Learning
Path: blob/master/Week 3/Programming Assignment - 2/machine-learning-ex2/ex2/costFunction.m
function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%

h = sigmoid(X * theta);                                   % hypothesis: predicted probabilities
J = (-1 / m) * sum(y .* log(h) + (1 - y) .* log(1 - h));  % cross-entropy (log-loss) cost
grad = (1 / m) * (X' * (h - y));                          % gradient of the cost w.r.t. theta

% =============================================================

end
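For context, below is a minimal usage sketch that is not part of the original file. It assumes the exercise's sigmoid.m helper is on the Octave/MATLAB path and uses a tiny made-up dataset; the fminunc call mirrors how the ex2 driver script typically optimizes the cost and gradient returned by this function.

% Tiny synthetic dataset: 3 examples, an intercept column plus one feature.
X = [1 0.5; 1 1.5; 1 3.0];
y = [0; 0; 1];
initial_theta = zeros(size(X, 2), 1);

% Evaluate cost and gradient at the initial point.
[J, grad] = costFunction(initial_theta, X, y);
fprintf('Cost at initial theta: %f\n', J);
fprintf('Gradient at initial theta:\n');
fprintf(' %f\n', grad);

% Optionally, minimize the cost with fminunc, supplying the gradient.
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, cost] = fminunc(@(t) costFunction(t, X, y), initial_theta, options);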