GitHub Repository: hackassin/Coursera-Machine-Learning
Path: blob/master/Week 6/Programming Assignment - 5/machine-learning-ex5/ex5/validationCurve.m
function [lambda_vec, error_train, error_val] = ...
    validationCurve(X, y, Xval, yval)
%VALIDATIONCURVE Generate the train and validation errors needed to
%plot a validation curve that we can use to select lambda
%   [lambda_vec, error_train, error_val] = ...
%       VALIDATIONCURVE(X, y, Xval, yval) returns the train
%       and validation errors (in error_train, error_val)
%       for different values of lambda. You are given the training set (X,
%       y) and validation set (Xval, yval).
%

% Selected values of lambda (you should not change this)
lambda_vec = [0 0.001 0.003 0.01 0.03 0.1 0.3 1 3 10]';

% You need to return these variables correctly.
error_train = zeros(length(lambda_vec), 1);
error_val = zeros(length(lambda_vec), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Fill in this function to return training errors in
%               error_train and the validation errors in error_val. The
%               vector lambda_vec contains the different lambda parameters
%               to use for each calculation of the errors, i.e.,
%               error_train(i) and error_val(i) should give
%               you the errors obtained after training with
%               lambda = lambda_vec(i)
%
% Note: You can loop over lambda_vec with the following:
%
%       for i = 1:length(lambda_vec)
%           lambda = lambda_vec(i);
%           % Compute train / val errors when training linear
%           % regression with regularization parameter lambda
%           % You should store the result in error_train(i)
%           % and error_val(i)
%           ....
%
%       end
%
%

for i = 1:length(lambda_vec)
    lambda = lambda_vec(i);
    % Train with the current regularization strength, then measure both
    % errors with lambda = 0: regularization belongs in training, not in
    % the error metric itself.
    theta = trainLinearReg(X, y, lambda);
    error_train(i) = linearRegCostFunction(X, y, theta, 0);
    error_val(i) = linearRegCostFunction(Xval, yval, theta, 0);
end

% =========================================================================

end
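
For context, here is a minimal usage sketch showing how this function is typically driven and plotted, mirroring the flow of the exercise's ex5 script. The variable names X_poly, X_poly_val, y, and yval (the prepared training and validation sets) are assumptions taken from the surrounding exercise, not part of this file.

% Compute the two error curves over the candidate lambdas.
[lambda_vec, error_train, error_val] = ...
    validationCurve(X_poly, y, X_poly_val, yval);

% Plot both curves against lambda; the best lambda is the one where
% the cross-validation error is lowest.
plot(lambda_vec, error_train, lambda_vec, error_val);
legend('Train', 'Cross Validation');
xlabel('lambda');
ylabel('Error');

% Report the lambda that minimizes the validation error.
[~, idx] = min(error_val);
fprintf('lambda with lowest validation error: %f\n', lambda_vec(idx));

Because both errors are evaluated with lambda = 0, the training error rises as lambda grows while the validation error typically falls and then rises, and reading off its minimum is how the exercise selects the regularization parameter.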