GitHub Repository: hackassin/Coursera-Machine-Learning
Path: blob/master/Week 9/Programming Assignment - 8/ex8/ex8.m
%% Machine Learning Online Class
% Exercise 8 | Anomaly Detection and Collaborative Filtering
%
% Instructions
% ------------
%
% This file contains code that helps you get started on the
% exercise. You will need to complete the following functions:
%
%    estimateGaussian.m
%    selectThreshold.m
%    cofiCostFunc.m
%
% For this exercise, you will not need to change any code in this file,
% or any other files other than those mentioned above.
%

%% Initialization
clear; close all; clc

%% ================== Part 1: Load Example Dataset ===================
% We start this exercise by using a small dataset that is easy to
% visualize.
%
% Our example case consists of 2 network server statistics across
% several machines: the latency and throughput of each machine.
% This exercise will help us find possibly faulty (or very fast) machines.
%

fprintf('Visualizing example dataset for outlier detection.\n\n');

% The following command loads the dataset. You should now have the
% variables X, Xval, yval in your environment
load('ex8data1.mat');

% Visualize the example dataset
plot(X(:, 1), X(:, 2), 'bx');
axis([0 30 0 30]);
xlabel('Latency (ms)');
ylabel('Throughput (mb/s)');

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================== Part 2: Estimate the dataset statistics ===================
% For this exercise, we assume a Gaussian distribution for the dataset.
%
% We first estimate the parameters of our assumed Gaussian distribution,
% then compute the probabilities for each of the points, and then visualize
% both the overall distribution and where each of the points falls in
% terms of that distribution.
%
fprintf('Visualizing Gaussian fit.\n\n');

% Estimate mu and sigma2
[mu sigma2] = estimateGaussian(X);
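
% For reference, a possible sketch of what estimateGaussian.m might compute
% (an assumption, not the original file): the per-feature mean and the
% 1/m-normalized variance, returned as column vectors:
%
%   function [mu sigma2] = estimateGaussian(X)
%       m = size(X, 1);
%       mu = (1 / m) * sum(X, 1)';                                % mean of each feature
%       sigma2 = (1 / m) * sum(bsxfun(@minus, X, mu') .^ 2, 1)';  % variance of each feature
%   end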

% Returns the density of the multivariate normal at each data point (row)
% of X
p = multivariateGaussian(X, mu, sigma2);
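
% multivariateGaussian is provided with the exercise. For each row x of X it
% evaluates the density
%   p(x) = (2*pi)^(-k/2) * det(Sigma)^(-1/2) * exp(-(x - mu)' * inv(Sigma) * (x - mu) / 2)
% where a vector sigma2 is treated as the diagonal of the covariance matrix.
% A hedged sketch of one possible vectorized implementation:
%
%   function p = multivariateGaussian(X, mu, Sigma2)
%       k = length(mu);
%       if (isvector(Sigma2))
%           Sigma2 = diag(Sigma2);              % variances -> diagonal covariance
%       end
%       Xc = bsxfun(@minus, X, mu(:)');         % center each example
%       p = (2 * pi)^(-k / 2) * det(Sigma2)^(-0.5) * ...
%           exp(-0.5 * sum((Xc * pinv(Sigma2)) .* Xc, 2));
%   end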

% Visualize the fit
visualizeFit(X, mu, sigma2);
xlabel('Latency (ms)');
ylabel('Throughput (mb/s)');
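
% visualizeFit is also provided with the exercise. A sketch of the idea,
% assuming it evaluates the fitted density on a grid and overlays contour
% lines on the scatter plot (grid range and contour levels are guesses):
%
%   [X1, X2] = meshgrid(0:.5:35);
%   Z = multivariateGaussian([X1(:) X2(:)], mu, sigma2);
%   Z = reshape(Z, size(X1));
%   plot(X(:, 1), X(:, 2), 'bx'); hold on;
%   contour(X1, X2, Z, 10 .^ (-20:3:0)');
%   hold off;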

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================== Part 3: Find Outliers ===================
% Now you will find a good epsilon threshold using the cross-validation set's
% probabilities under the estimated Gaussian distribution.
%

pval = multivariateGaussian(Xval, mu, sigma2);

[epsilon F1] = selectThreshold(yval, pval);
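
% selectThreshold.m is the other function you implement for this part. A
% possible sketch (not the original file): sweep candidate thresholds between
% min(pval) and max(pval), flag pval < epsilon as anomalies, and keep the
% epsilon that gives the best F1 score against the labels yval:
%
%   function [bestEpsilon bestF1] = selectThreshold(yval, pval)
%       bestEpsilon = 0;
%       bestF1 = 0;
%       stepsize = (max(pval) - min(pval)) / 1000;
%       for epsilon = min(pval):stepsize:max(pval)
%           predictions = (pval < epsilon);
%           tp = sum((predictions == 1) & (yval == 1));  % true positives
%           fp = sum((predictions == 1) & (yval == 0));  % false positives
%           fn = sum((predictions == 0) & (yval == 1));  % false negatives
%           prec = tp / (tp + fp);
%           rec = tp / (tp + fn);
%           F1 = (2 * prec * rec) / (prec + rec);
%           if (F1 > bestF1)
%               bestF1 = F1;
%               bestEpsilon = epsilon;
%           end
%       end
%   end
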
fprintf('Best epsilon found using cross-validation: %e\n', epsilon);
fprintf('Best F1 on Cross Validation Set: %f\n', F1);
fprintf(' (you should see a value epsilon of about 8.99e-05)\n');
fprintf(' (you should see a Best F1 value of 0.875000)\n\n');

% Find the outliers in the training set and plot them
outliers = find(p < epsilon);

% Draw a red circle around those outliers
hold on
plot(X(outliers, 1), X(outliers, 2), 'ro', 'LineWidth', 2, 'MarkerSize', 10);
hold off

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================== Part 4: Multidimensional Outliers ===================
% We will now use the code from the previous part and apply it to a
% harder problem in which more features describe each datapoint and only
% some features indicate whether a point is an outlier.
%

% Loads the second dataset. You should now have the
% variables X, Xval, yval in your environment
load('ex8data2.mat');

% Apply the same steps to the larger dataset
[mu sigma2] = estimateGaussian(X);

% Training set
p = multivariateGaussian(X, mu, sigma2);

% Cross-validation set
pval = multivariateGaussian(Xval, mu, sigma2);

% Find the best threshold
[epsilon F1] = selectThreshold(yval, pval);

fprintf('Best epsilon found using cross-validation: %e\n', epsilon);
fprintf('Best F1 on Cross Validation Set: %f\n', F1);
fprintf(' (you should see a value epsilon of about 1.38e-18)\n');
fprintf(' (you should see a Best F1 value of 0.615385)\n');
fprintf('# Outliers found: %d\n\n', sum(p < epsilon));