The optimisation problem in the support vector machine methods is hard to solve for large sample sizes.
It often fails when there is substantial class overlap in the feature space.
It lacks a statistical foundation.
Takeaways
There are three types of support vector machine methods:
The maximal margin classifier applies when the data are perfectly separated by a hyperplane,
the support vector classifier (also called the soft margin classifier) applies when the data are not perfectly separable by a hyperplane but still have a linear decision boundary, and
support vector machines apply when the data have nonlinear decision boundaries.
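The middle case above, the soft margin classifier, can be illustrated with a minimal sketch. The slides' worked examples use R's svm function; what follows is instead a hypothetical pure-Python version that fits a linear soft-margin classifier by sub-gradient descent on the hinge-loss objective (0.5·||w||² + C·Σ max(0, 1 − yᵢ(w·xᵢ + b))), on toy 2D data invented here for illustration:

```python
# Soft-margin (support vector) classifier sketch: sub-gradient descent
# on the hinge loss. Toy data and all parameter values are illustrative.

# Two well-separated 2D clusters: class +1 near (2, 2), class -1 near (-2, -2).
X = [(2.0, 2.0), (2.5, 1.5), (1.5, 2.5),
     (-2.0, -2.0), (-2.5, -1.5), (-1.5, -2.5)]
y = [1, 1, 1, -1, -1, -1]

def train_svc(X, y, C=1.0, lr=0.01, epochs=2000):
    """Minimise 0.5*||w||^2 + C * sum_i max(0, 1 - y_i*(w.x_i + b))."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), yi in zip(X, y):
            margin = yi * (w[0] * x1 + w[1] * x2 + b)
            if margin < 1:
                # Point is inside the margin (or misclassified):
                # hinge term is active, push the boundary away from it.
                w[0] += lr * (C * yi * x1 - w[0])
                w[1] += lr * (C * yi * x2 - w[1])
                b += lr * C * yi
            else:
                # Point is outside the margin: only the regulariser acts.
                w[0] -= lr * w[0]
                w[1] -= lr * w[1]
    return w, b

w, b = train_svc(X, y)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else -1 for x1, x2 in X]
print(preds)
```

The tuning parameter C plays the role described on the earlier slide: larger C penalises margin violations more heavily, giving a narrower margin. Replacing the inner products here with a kernel function would give the full support vector machine for nonlinear boundaries.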