SVM is now one of the most widely applied techniques in supervised machine learning.
In order to understand how it works, let's go back to its origins and look at a binary classification task (reds and blues) in a 2D data space that is linearly separable. If we try with paper and pen, we see that many separating lines exist, and the million-dollar question is which line (a hyperplane in the N-dimensional case) separates the two classes best.
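To make the "many lines exist" point concrete, here is a minimal sketch (with made-up example points and hand-picked line coefficients, not taken from the figure) showing that several different lines w·x + b = 0 all separate the same two point clouds:

```python
import numpy as np

# Hypothetical linearly separable 2D point clouds ("reds" and "blues").
reds  = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 2.0]])
blues = np.array([[4.0, 4.0], [5.0, 3.5], [4.5, 5.0]])

def separates(w, b):
    """True if the line w·x + b = 0 puts reds and blues on opposite sides."""
    return (reds @ w + b < 0).all() and (blues @ w + b > 0).all()

# Three different (w, b) pairs -- all of them separate the data,
# which is exactly why we need a criterion for the "best" one.
candidates = [
    (np.array([1.0, 1.0]), -5.5),
    (np.array([1.0, 0.5]), -4.5),
    (np.array([0.5, 1.0]), -4.5),
]
print([separates(w, b) for w, b in candidates])  # [True, True, True]
```

SVM's answer to this question, as the next sections show, is to pick the line with the largest margin to the closest points.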
<img src="others/images/svm_1.png" width="300" height="300">
For mathematical convenience, let's move into the number domain rather than sticking to colors, and say that we are trying to separate positive numbers from negative numbers. In this case, the decision boundary corresponds to the locations of the zeros along this special line:
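This idea can be sketched in a tiny 1D example (the weight and bias values here are illustrative assumptions): the decision function is f(x) = w·x + b, the predicted class is its sign, and the boundary is exactly where f(x) = 0.

```python
import numpy as np

# Illustrative 1D decision function f(x) = w*x + b.
# With w = 1, b = 0, positives map to +1 and negatives to -1.
w, b = 1.0, 0.0

def classify(x):
    # The sign of the decision function gives the class label.
    return np.sign(w * x + b)

xs = np.array([-3.0, -0.5, 0.5, 2.0])
print(classify(xs))   # [-1. -1.  1.  1.]
print(w * 0.0 + b)    # f(0) = 0: the decision boundary sits at the zero
```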