GitHub Repository: hackassin/Coursera-Machine-Learning
Path: blob/master/Week 2/Programming Assignment-1/ex1_multi.m

%% Machine Learning Online Class
%  Exercise 1: Linear regression with multiple variables
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  linear regression exercise.
%
%  You will need to complete the following functions in this
%  exercise:
%
%     warmUpExercise.m
%     plotData.m
%     gradientDescent.m
%     computeCost.m
%     gradientDescentMulti.m
%     computeCostMulti.m
%     featureNormalize.m
%     normalEqn.m
%
%  For this part of the exercise, you will need to change some
%  parts of the code below for various experiments (e.g., changing
%  learning rates).
%

%% Initialization

%% ================ Part 1: Feature Normalization ================

%% Clear and Close Figures
clear; close all; clc

fprintf('Loading data ...\n');

%% Load Data
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Print out some data points
fprintf('First 10 examples from the dataset: \n');
fprintf(' x = [%.0f %.0f], y = %.0f \n', [X(1:10,:) y(1:10,:)]');

fprintf('Program paused. Press enter to continue.\n');
pause;

% Scale features and set them to zero mean
fprintf('Normalizing Features ...\n');

[X, mu, sigma] = featureNormalize(X);
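
% A minimal sketch of what featureNormalize.m is expected to return
% (the graded implementation lives in its own file; this is only an
% illustration): each column is shifted to zero mean and scaled to
% unit standard deviation, and mu and sigma are kept so the same
% transform can be applied to new inputs at prediction time.
%
%   function [X_norm, mu, sigma] = featureNormalize(X)
%     mu     = mean(X);                 % 1 x n vector of column means
%     sigma  = std(X);                  % 1 x n vector of column std devs
%     X_norm = (X - repmat(mu, size(X, 1), 1)) ./ repmat(sigma, size(X, 1), 1);
%   end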

% Add intercept term to X
X = [ones(m, 1) X];


%% ================ Part 2: Gradient Descent ================

% ====================== YOUR CODE HERE ======================
% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCostMulti and gradientDescentMulti already work
%               with this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.
%

fprintf('Running gradient descent ...\n');
% X = [ones(m, 1), data(:,1)]; % leftover from the single-variable version; the intercept column is already added above

% Choose some alpha value
alpha = 0.021;
num_iters = 400;

% Init Theta and Run Gradient Descent
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);
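
% A hedged sketch of the two graded functions this call depends on
% (their real implementations live in gradientDescentMulti.m and
% computeCostMulti.m; shown here only for reference). The vectorized
% update is theta := theta - (alpha/m) * X' * (X*theta - y), and the
% cost is J = (1/(2*m)) * sum((X*theta - y).^2):
%
%   function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%     m = length(y);
%     J_history = zeros(num_iters, 1);
%     for iter = 1:num_iters
%       theta = theta - (alpha / m) * (X' * (X * theta - y));
%       J_history(iter) = computeCostMulti(X, y, theta);
%     end
%   end
%
%   function J = computeCostMulti(X, y, theta)
%     J = sum((X * theta - y) .^ 2) / (2 * length(y));
%   end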

% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');
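
% A hedged sketch (not required by the assignment) of comparing several
% learning rates on one figure, following the 'hold on' hint above; the
% alpha values below are illustrative, not prescribed:
%
%   alphas = [0.3, 0.1, 0.03, 0.01];
%   figure; hold on;
%   for k = 1:numel(alphas)
%     [~, J_k] = gradientDescentMulti(X, y, zeros(3, 1), alphas(k), 50);
%     plot(1:numel(J_k), J_k, 'LineWidth', 2);
%   end
%   xlabel('Number of iterations'); ylabel('Cost J');
%   legend('\alpha = 0.3', '\alpha = 0.1', '\alpha = 0.03', '\alpha = 0.01');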

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
% Recall that the first column of X is all-ones. Thus, it does
% not need to be normalized.
price = 0; % You should change this

% Feature scaling: normalize the query point with the training-set
% mu and sigma before applying theta
temp = [1 1650 3];
temp(1,2) = (temp(1,2) - mu(1,1)) / sigma(1,1);
temp(1,3) = (temp(1,3) - mu(1,2)) / sigma(1,2);

price = temp * theta;
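
% Equivalent vectorized form of the scaling above (an alternative
% sketch, not the author's code): normalize the raw query with the
% stored mu and sigma, then prepend the intercept term:
%   price = [1, ([1650 3] - mu) ./ sigma] * theta;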

% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 3: Normal Equations ================

fprintf('Solving with normal equations...\n');

% ====================== YOUR CODE HERE ======================
% Instructions: The following code computes the closed form
%               solution for linear regression using the normal
%               equations. You should complete the code in
%               normalEqn.m
%
%               After doing so, you should complete this code
%               to predict the price of a 1650 sq-ft, 3 br house.
%

%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);
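
% A minimal sketch of normalEqn.m (the graded implementation lives in
% its own file): the closed-form solution theta = pinv(X' * X) * X' * y,
% using pinv so a singular X' * X still yields a usable answer.
%
%   function theta = normalEqn(X, y)
%     theta = pinv(X' * X) * X' * y;
%   end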

% Display normal equation's result
fprintf('Theta computed from the normal equations: \n');
fprintf(' %f \n', theta);
fprintf('\n');


% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
price = 0; % You should change this

% No feature scaling here: the normal equation was solved on the raw
% (unnormalized) features, so the query also uses raw values
temp = [1 1650 3];
price = temp * theta;

% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using normal equations):\n $%f\n'], price);