Path: blob/master/Week 5/Programming Assignment - 4/machine-learning-ex4/ex4/checkNNGradients.m
function checkNNGradients(lambda)
%CHECKNNGRADIENTS Creates a small neural network to check the
%backpropagation gradients
%   CHECKNNGRADIENTS(lambda) Creates a small neural network to check the
%   backpropagation gradients; it outputs the analytical gradients
%   produced by your backprop code and the numerical gradients (computed
%   using computeNumericalGradient). These two gradient computations should
%   result in very similar values.
%

if ~exist('lambda', 'var') || isempty(lambda)
    lambda = 0;
end

input_layer_size = 3;
hidden_layer_size = 5;
num_labels = 3;
m = 5;

% We generate some 'random' test data
Theta1 = debugInitializeWeights(hidden_layer_size, input_layer_size);
Theta2 = debugInitializeWeights(num_labels, hidden_layer_size);
% Reusing debugInitializeWeights to generate X
X = debugInitializeWeights(m, input_layer_size - 1);
y = 1 + mod(1:m, num_labels)';

% Unroll parameters
nn_params = [Theta1(:) ; Theta2(:)];

% Shorthand for the cost function
costFunc = @(p) nnCostFunction(p, input_layer_size, hidden_layer_size, ...
                               num_labels, X, y, lambda);

[cost, grad] = costFunc(nn_params);
numgrad = computeNumericalGradient(costFunc, nn_params);

% Visually examine the two gradient computations. The two columns
% you get should be very similar.
disp([numgrad grad]);
fprintf(['The above two columns you get should be very similar.\n' ...
         '(Left-Your Numerical Gradient, Right-Analytical Gradient)\n\n']);

% Evaluate the norm of the difference between the two solutions.
% If you have a correct implementation, and assuming you used EPSILON = 0.0001
% in computeNumericalGradient.m, then diff below should be less than 1e-9
diff = norm(numgrad - grad) / norm(numgrad + grad);

fprintf(['If your backpropagation implementation is correct, then \n' ...
         'the relative difference will be small (less than 1e-9). \n' ...
         '\nRelative Difference: %g\n'], diff);

end
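% For reference, below is a minimal sketch of the central-difference scheme
% that computeNumericalGradient is assumed to implement (with EPSILON = 0.0001,
% as the comment above presumes): each partial derivative is approximated as
% (J(theta + e) - J(theta - e)) / (2*e). The subfunction name
% numericalGradientSketch is hypothetical; it is not part of the assignment
% files and is never called from checkNNGradients.
function numgrad = numericalGradientSketch(J, theta)
numgrad = zeros(size(theta));
perturb = zeros(size(theta));
e = 1e-4;
for p = 1:numel(theta)
    % Perturb one parameter at a time and take the symmetric difference
    perturb(p) = e;
    loss1 = J(theta - perturb);
    loss2 = J(theta + perturb);
    numgrad(p) = (loss2 - loss1) / (2*e);
    perturb(p) = 0;
end
end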