$`w \cdot x_i + b \geq 1 \quad \text{for } y_i = +1`$

$`w \cdot x_i + b \leq -1 \quad \text{for } y_i = -1`$
</div>

We add this constraint to limit the degrees of freedom: many separating hyperplanes can solve the classification problem, and this extra requirement singles one out. It is what we call the ‘margin’. The margin is defined as the perpendicular distance between the decision boundary and the closest of the data points. When we set the objective to “maximize the margin”, we end up with a unique solution. That solution is determined by a small subset of the points, called the support vectors.
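The two inequalities above, one per class label $`y_i \in \{+1, -1\}`$, are commonly folded into a single constraint (a standard compact form, added here for reference):

$`y_i (w \cdot x_i + b) \geq 1`$

With this scaling, the distance from the boundary to the closest point works out to $`1 / \lVert w \rVert`$, so maximizing the margin is equivalent to minimizing $`\lVert w \rVert`$.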
<img src="uploads/1363053e315b3f87492ed55dbdbfeee2/svm_2.png" height="300">
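As a minimal sketch of the idea above (assuming scikit-learn is available; the tiny 2-D dataset is made up for illustration), we can fit a linear SVM with a large `C` to approximate a hard margin, then read off the support vectors and the margin from the learned `w`:

```python
import numpy as np
from sklearn.svm import SVC

# Toy linearly separable data: three points per class (illustrative only).
X = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 2.0],      # class +1
              [-1.0, -1.0], [-2.0, -1.5], [-1.5, -2.0]])  # class -1
y = np.array([1, 1, 1, -1, -1, -1])

clf = SVC(kernel="linear", C=1e6)  # very large C ~ hard-margin SVM
clf.fit(X, y)

w = clf.coef_[0]          # normal vector of the decision boundary
b = clf.intercept_[0]
margin = 1.0 / np.linalg.norm(w)  # distance from boundary to closest point

print("support vectors:\n", clf.support_vectors_)
print("margin:", margin)
```

Only the points lying on the margin show up in `clf.support_vectors_`; removing any of the other points would leave the solution unchanged.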