<!DOCTYPE html>
<html>
<head>
<title>
Support Vector Machine
</title>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
<meta name="viewport" content="width=device-width, minimum-scale=1.0, initial-scale=1, user-scalable=yes">
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/4.5.0/css/bootstrap.min.css">
<script src="https://polyfill.io/v3/polyfill.min.js?features=es6"></script>
<script id="MathJax-script" async src="https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js"></script>
<link rel="stylesheet" href="style.css">
<link href="https://code.jquery.com/ui/1.10.4/themes/ui-lightness/jquery-ui.css" rel="stylesheet">
<script src="https://code.jquery.com/jquery-1.10.2.js"></script>
<script src="https://code.jquery.com/ui/1.10.4/jquery-ui.js"></script>
<script src="http://maxcdn.bootstrapcdn.com/bootstrap/4.5.0/js/bootstrap.min.js"></script>
<style>
</style>
</head>
<body style="overflow-x: hidden; width: auto;">
<div style="background-color: #ffffff;">
<section class="title-area">
<div class="jumbotron jumbotron-fluid py-4" style="text-align: center; color: rgb(230,230,250); background-color: #512DA8; margin-bottom: 0em;">
<h2 style="color: #FFFF99;">
Support Vector Machines
</h2>
<h6>
The Mr. Perfect of Machine Learning Classifiers
</h6>
</div>
</section>
<section>
<p class="section-head" style="text-align: center;">
Introduction
</p>
<div class="container" style="width: 100%;">
<div class="row" style="margin: auto;">
<h3 style="margin: auto;">What are Support Vector Machines?</h3>
<div style="padding-top: 20px;">
Support Vector Machines (SVMs) are a powerful class of supervised learning models, with associated learning algorithms, that analyze data for classification and regression analysis. They are among the most robust prediction methods, being grounded in
the statistical learning framework (VC theory). At its core, an SVM is a non-probabilistic binary linear classifier, but it can also perform efficient non-linear classification through the Kernel Trick.
</div><br/>
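<!--
  A minimal sketch, for readers of this source file, of how a trained SVM uses the Kernel Trick at
  prediction time: the class of a new point depends only on kernel evaluations against the support
  vectors. The names (supportVectors, alphas, labels, bias, kernel) are illustrative assumptions,
  not identifiers from this repository's index.js.

  // Linear kernel: the plain dot product of two feature vectors.
  function dotKernel(x, z) {
    let s = 0;
    for (let k = 0; k < x.length; k++) s += x[k] * z[k];
    return s;
  }

  // f(x) = sign( sum_i alpha_i * y_i * K(x_i, x) + b )
  function predict(x, supportVectors, alphas, labels, bias, kernel) {
    let f = bias;
    for (let i = 0; i < supportVectors.length; i++) {
      f += alphas[i] * labels[i] * kernel(supportVectors[i], x);
    }
    return f >= 0 ? 1 : -1;
  }
-->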
<h3 style="margin: auto; padding-top: 20px;">What are the advantages of Support Vector Machines?</h3>
<div style="padding-top: 20px;">
Support Vector Machines provide the best possible decision boundary between two classes: they create the widest 'street' possible between them. As a result, SVMs work really well when there is a clear margin of separation between the classes.
Thanks to the Kernel Trick, they are also highly efficient in higher dimensions, reach the optimum faster, and can therefore be effective even when the number of dimensions is greater
than the number of samples.
<br/>
</div>
</div>
</div>
<div id="container" style="margin-bottom: 1em; margin-top: 3em;">
<div id="img">
<img src="images/class.png" style=" margin: auto;
width: 70%;;">
</div>
<div id="text">
<div id="MLC">
<p>
Any other Machine Learning Classifier's Decision Boundary
</p>
</div>
<div id="SVM">
<p>
Support Vector Machine Decision Boundary
</p>
</div>
</div>
</div>
</section>
<section style="margin-top: 0em; margin-bottom: 3em;">
<p class="section-head">
Playground
</p>
<div id="khelobc">
<div id="canvas_box">
<div>
<h2 style="margin-left: 4.75em;font-size: x-large; text-align: left;"><a data-toggle="collapse" href="#list1" id="collapseInst">Show Instructions:</a></h2>
<div id="list1" class="collapse">
<ul>
<li class="shortcuts">Click for adding a yellow point.</li>
<li class="shortcuts">Shift + Click for adding a blue point.</li>
<li class="shortcuts">Press c for clearing all the points you added.</li>
<li class="shortcuts">Press R to switch to the Gaussian kernel.</li>
<li class="shortcuts">Press P to switch to the Polynomial kernel. </li>
<li class="shortcuts">Press S to switch to the Sigmoid kernel.</li>
<li class="shortcuts">Press L to switch back to the Linear kernel.</li>
</ul>
</div>
</div>
<canvas id="NPGcanvas" width="500" height="500">Browser not supported for Canvas. Get a real browser.</canvas><br /><br />
<div id="slider_box">
<div id="slider1_box">
<div id="slider1"></div>
<span id="creport">C = 10.0</span>
</div>
<div id="slider2_box">
<div id="rider"></div>
<span id="report"></span>
</div>
<div id="slider3_box">
<div id="provider"></div>
<span id="preport"></span>
</div>
<div>
<div id="slider2" style="display: none;"></div>
<span id="sigreport" style="display: none; text-align: center;">Gaussian Kernel sigma = 1.0</span>
</div>
<div>
<div id="slider3" style="display: none;"></div>
<span id="degreport" style="display: none;text-align: center;">Polynomial Kernel degree = 3</span>
</div>
<div>
<div id="slider4" style="display: none;"></div>
<span id="areport" style="display: none;text-align: center;">Polynomial Kernel a = 1.0</span>
</div>
<div>
<div id="slider5" style="display: none;"></div>
<span id="alpreport" style="display: none;text-align: center;">Sigmoid Kernel alpha = 0.32</span>
</div>
<div>
<div id="slider6" style="display: none;"></div>
<span id="csigreport" style="display: none;text-align: center;">Sigmoid Kernel c = 0.05</span>
</div>
</div>
</div>
<div id="optsdiv">
<div class="col-md-11">
<div class="card border-dark">
<div class="card-header" style="background-color: #512DA8; text-align: center; vertical-align: middle;">
<h2 style="color: white;">Information about the Model</h2>
</div>
<div class="card-body">
<div>
<h2 style="margin-left: 1em;font-size: large; text-align: left;">The SVM algorithm is trying to plot a decision boundary separating the Yellow points from the Blue Points. The SVM tries to create the widest possible decision boundary that can be made between the two classes. The
SVM implemented here uses the <a href="http://cs229.stanford.edu/materials/smo.pdf" style="color: #035aa6;">SMO algorithm</a> to find the decision boundary. </br><a data-toggle="collapse" href="#list2" id="collapseInfo"
style="color: #035aa6;">Show More..</a></h2>
<div id="list2" class="collapse">
<div id="linear_info" style="display: block;">
<p style="text-align: left;margin-left: 1em; font-size: large; margin-top: 1em;">You have opted Linear Kernel to find the Decision boundary. A Linear Kernel is not a kernel per se, but the simple SVM. As evident, from the name, it finds a linear decision boundary between the classes, and
works well when the classes are linearly separable. However since the points here, are not linearly separable, we cannot use it to find the perfect decisiion boundary. What we can do though, is to allow
the SVM to perform some misclassifications in order to achieve the best 'possible' decision boundary. Here comes the parameter C. This parameter is a universal parameter for all kernels, allowing them to
perform a few misclassifications in order to complete the task.
</p>
</br>
<h2 style="text-align: left;margin-left: 1em; font-size: larger; margin-top: 1em;">Kernel Equation: <b style="font-size:x-large; font-weight: 400;"><i style="font-family:'Times New Roman', Times, serif;">K</i>\((\vec{x_i}, \vec{x_j})\) = \((\vec{x_i} \cdot \vec{x_j})\)</b></h2>
</div>
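<!--
  A minimal sketch of the linear kernel described above, in plain JavaScript; the name linearKernel
  is illustrative and not taken from index.js. Note that C does not appear in the kernel itself: it
  only bounds the multipliers (0 <= alpha_i <= C) during training.

  // K(x_i, x_j) = x_i . x_j
  function linearKernel(xi, xj) {
    let s = 0;
    for (let k = 0; k < xi.length; k++) s += xi[k] * xj[k];
    return s;
  }
-->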
<div id="poly_info" style="display: none;">
<p style="text-align: left;margin-left: 1em; font-size: large; margin-top: 1em;">You have opted Polynomial Kernel to find the Decision boundary. Polynomial Kernels are used in specific cases, when you know the degree of the decision boundary, because in those cases the polynomial kernel
exploits its advantage of being finite-dimensioned over the likes of the Gaussian RBF kernel. We use two parameters apart from the universal parameter C which is used for allowing misclassifications, and
for what is called Soft-Margin Classification, that are 'a' and degree. The degree is used to control the dimension of the space where the kernel computes this dot product. 'a' on the other hand is merely
a constant to produce a mixture of terms in that space. A higher degree tends to overfit the data, while a higher 'a', doesn't contribute much to the bias-variance tradeoff. Often, the polynomial kernel
is used with a=1.
</p>
</br>
<h2 style="text-align: left;margin-left: 1em; font-size: larger; margin-top: 1em;">Kernel Equation: <b style="font-size:x-large; font-weight: 400;"><i style="font-family:'Times New Roman', Times, serif;">K</i>\((\vec{x_i}, \vec{x_j})\) = \((\vec{x_i} \cdot \vec{x_j}+a)^{d}\)</b></h2>
</div>
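<!--
  A minimal sketch of the polynomial kernel described above; the name polyKernel is illustrative,
  not taken from index.js.

  // K(x_i, x_j) = (x_i . x_j + a)^d, where 'a' mixes in lower-order terms and 'd' is the degree.
  function polyKernel(xi, xj, a, d) {
    let s = 0;
    for (let k = 0; k < xi.length; k++) s += xi[k] * xj[k];
    return Math.pow(s + a, d);
  }
-->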
<div id="rbf_info" style="display: none;">
<p style="text-align: left;margin-left: 1em; font-size: large; margin-top: 1em;">You have opted Gaussian(RBF) Kernel to find the Decision boundary. A Gaussian Kernel is a special case of the RBF kernel. It maps the features into a very high dimensional space, and as a result, can produce
highly non-linear boundaries. Due to this fact, it is the most popular kernel in use. To control this nature of this kernel, we have two parameters, C and sigma. C is the usual universal parameter for allowing
misclassifications, and for what is called Soft-Margin Classification. However, sigma is a parameter specific to Gaussian Kernel. It is used to basically control the variance of the higher order projection
from the original point. That is, if sigma is higher, a point will influence and occupy a larger portion around itself. Thus, increasing sigma makes the decision boundary more flexible and smooth. As a result,
the SVM does a bit of misclassifications as well. However, this is not a bad thing. Gaussian Kernels, being as powerful as they are, have a very high tendency of overfitting the data. Hence, increasing sigma
tends to reduce overfitting.
</p>
</br>
<h2 style="text-align: left;margin-left: 1em; font-size: larger; margin-top: 1em;">Kernel Equation: <b style="font-size:x-large; font-weight: 400;"><i style="font-family:'Times New Roman', Times, serif;">K</i>\((\vec{x_i}, \vec{x_j})\) = \(e^{-\frac{||\vec{x_i} - \vec{x_j}||^2}{2\sigma^{2}}}\)</b></h2>
</div>
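<!--
  A minimal sketch of the Gaussian (RBF) kernel described above; the name rbfKernel is illustrative,
  not taken from index.js. A larger sigma spreads each point's influence wider, which smooths the
  resulting boundary.

  // K(x_i, x_j) = exp( -||x_i - x_j||^2 / (2 * sigma^2) )
  function rbfKernel(xi, xj, sigma) {
    let sq = 0;
    for (let k = 0; k < xi.length; k++) {
      const d = xi[k] - xj[k];
      sq += d * d;
    }
    return Math.exp(-sq / (2 * sigma * sigma));
  }
-->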
<div id="sigmoid_info" style="display: none;">
<p style="text-align: left;margin-left: 1em; font-size: large; margin-top: 1em;">You have opted for the Sigmoid Kernel to find the Decision boundary. It is interesting to note that a SVM model using a sigmoid kernel function is equivalent to a two-layer, perceptron neural network. This kernel
is quite popular for support vector machines due to its origin from neural network theory. This Kernel is controlled using two parameters apart from the universal parameter C which is used for allowing misclassifications,
and for what is called Soft-Margin Classification, that are 'c-sigma' and alpha. c is just a parameter added to the dot product to regulate the sum and doesn't contribute much to the bias-variance tradeoff.
However, \(\alpha\) on the other hand, is a scaling parameter, and increases the area of influence of each point. It is not as good as the RBF kernel though.
</p>
</br>
<h2 style="text-align: left;margin-left: 1em; font-size: larger; margin-top: 1em;">Kernel Equation: <b style="font-size:x-large; font-weight: 400;"><i style="font-family:'Times New Roman', Times, serif;">K</i>\((\vec{x_i}, \vec{x_j})\) = \(tanh(\alpha\vec{x_i} \cdot \vec{x_j} + c)\)</b></h2>
</div>
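<!--
  A minimal sketch of the sigmoid kernel described above; the name sigmoidKernel is illustrative,
  not taken from index.js.

  // K(x_i, x_j) = tanh(alpha * (x_i . x_j) + c)
  function sigmoidKernel(xi, xj, alpha, c) {
    let s = 0;
    for (let k = 0; k < xi.length; k++) s += xi[k] * xj[k];
    return Math.tanh(alpha * s + c);
  }
-->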
</div>
<p style="text-align: left;margin-left: 1em; font-size: large; margin-top: 2em; font-weight: 400;">You can see the <a data-toggle="collapse" href="#stats" style="color: #035aa6;">Training statistics</a> below.</p>
<div id="stats" class="collapse">
<ul>
<li style="list-style-type:square;text-align: left;margin-left: 1em; font-size: large;" id="acc"></li>
<li style="list-style-type:square;text-align: left;margin-left: 1em; font-size: large;" id="covg"></li>
<li style="list-style-type:square;text-align: left;margin-left: 1em; font-size: large;" id="supp"></li>
<li style="list-style-type:square;text-align: left;margin-left: 1em; font-size: large;" id="kern"></li>
<li style="list-style-type:square;text-align: left;margin-left: 1em; font-size: large;" id="c"></li>
<li style="list-style-type:square;text-align: left;margin-left: 1em; font-size: large; display: none;" id="sig"></li>
<li style="list-style-type:square;text-align: left;margin-left: 1em; font-size: large;display: none" id="a"></li>
<li style="list-style-type:square;text-align: left;margin-left: 1em; font-size: large;display: none" id="deg"></li>
<li style="list-style-type:square;text-align: left;margin-left: 1em; font-size: large;display: none" id="alp"></li>
<li style="list-style-type:square;text-align: left;margin-left: 1em; font-size: large;display: none" id="csig"></li>
</ul>
</div>
</div>
</div>
</div>
</div>
</div>
</section>
<section>
<p class="section-head" style="margin-bottom: 0em; height: 1.5em;"></p>
</section>
<div class="jumbotron jumbotron-fluid py-4" style="text-align: center; color: rgb(230,230,250); background-color: #512DA8; margin-bottom: 0em;">
<h2 style="color: #FFFF99;">
DSG IIT-R
</h2>
<h6>
Aaryan Garg | Abhinav Saini | Vivek Kumar
</h6>
</div>
</div>
<script src="index.js"></script>
</body>
</html>