GitHub Repository: hackassin/Coursera-Machine-Learning
Path: blob/master/Week 9/Programming Assignment - 8/ex8/computeNumericalGradient.m
function numgrad = computeNumericalGradient(J, theta)
%COMPUTENUMERICALGRADIENT Computes the gradient using "finite differences"
%and gives us a numerical estimate of the gradient.
%   numgrad = COMPUTENUMERICALGRADIENT(J, theta) computes the numerical
%   gradient of the function J around theta. Calling y = J(theta) should
%   return the function value at theta.

% Notes: The following code implements numerical gradient checking and
% returns the numerical gradient. It sets numgrad(i) to (a numerical
% approximation of) the partial derivative of J with respect to the
% i-th input argument, evaluated at theta. (I.e., numgrad(i) should
% be (approximately) the partial derivative of J with respect
% to theta(i).)

numgrad = zeros(size(theta));
perturb = zeros(size(theta));
e = 1e-4;
for p = 1:numel(theta)
    % Set perturbation vector: e in the p-th entry, zero elsewhere
    perturb(p) = e;
    loss1 = J(theta - perturb);
    loss2 = J(theta + perturb);
    % Compute numerical gradient via the central difference quotient
    numgrad(p) = (loss2 - loss1) / (2*e);
    perturb(p) = 0;
end

end
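For readers working outside Octave/MATLAB, the same central-difference scheme, numgrad(p) ≈ (J(theta + e·e_p) − J(theta − e·e_p)) / (2e), can be sketched in Python/NumPy. The function and variable names below are my own illustration, not part of the assignment; the usage check at the bottom compares the estimate against the known analytic gradient of J(theta) = thetaᵀtheta, which is 2·theta.

```python
import numpy as np

def compute_numerical_gradient(J, theta, e=1e-4):
    """Central-difference estimate of the gradient of J at theta."""
    numgrad = np.zeros_like(theta)
    perturb = np.zeros_like(theta)
    for p in range(theta.size):
        # Perturb only the p-th coordinate by e
        perturb.flat[p] = e
        loss1 = J(theta - perturb)
        loss2 = J(theta + perturb)
        # Central difference quotient for the p-th partial derivative
        numgrad.flat[p] = (loss2 - loss1) / (2 * e)
        perturb.flat[p] = 0
    return numgrad

# Usage: for J(theta) = theta' * theta the analytic gradient is 2 * theta,
# so the numerical estimate should match it to high precision.
theta = np.array([1.0, -2.0, 3.0])
numgrad = compute_numerical_gradient(lambda t: t @ t, theta)
```

Because the central difference has O(e²) truncation error, the mismatch against the analytic gradient should be tiny for smooth J; in the assignment the same idea is used to validate backpropagation gradients.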