# SVM Kernels

Kernel methods (often called the "kernel trick") let Support Vector Machines operate in a high-dimensional feature space without ever computing the coordinates of the data in that space explicitly; instead, they only compute inner products between pairs of samples. This is a simplified definition, but it captures the essence of how kernel methods are used: a linear hyperplane in the implicit feature space can act as a non-linear decision boundary in the original input space, separating data both accurately and efficiently.

There are 4 main built-in kernels in Scikit-learn’s Support Vector Machine implementation. These are:

• Linear: Kernel to go with when the data is linearly separable
• Polynomial: Kernel to go with when the decision boundary is polynomial
• RBF: Kernel to go with when the data is more complex and non-linear
• Sigmoid: Kernel to go with when more complicated tasks need to be tackled, similar to neural network applications.

Generally speaking, these kernels tend to run faster the simpler they are, so it is fair to say the linear kernel will usually be the fastest among the four. But there are, of course, many other important factors that can affect the performance of a Support Vector Machine implementation.

## Linear Kernel Usage

The linear kernel is simply the dot product of two samples: K(x, xj) = xᵀxj. It is the fastest option and a good default when the number of features is large relative to the number of samples (as in text classification), or when the classes are roughly linearly separable.
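As a minimal sketch, a linear-kernel SVC can be fit on a toy two-cluster dataset (the `make_blobs` data here is just an illustrative assumption):

```python
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Two roughly linearly separable clusters (toy data for illustration)
X, y = make_blobs(n_samples=100, centers=2, random_state=0)

clf = SVC(kernel="linear")  # plain dot-product kernel
clf.fit(X, y)
print(clf.score(X, y))
```

On clearly separated clusters like these, the training accuracy should be at or near 1.0.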

## Polynomial Kernel Usage

The polynomial kernel, K(x, xj) = (γ xᵀxj + r)ᵈ, raises the (shifted) dot product to a power d. In Scikit-learn, d, γ, and r are controlled by the `degree`, `gamma`, and `coef0` parameters. It is useful when the decision boundary is curved and can be captured by polynomial interactions of the input features.
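A short sketch on synthetic data with a quadratic boundary (the data-generating rule below is an assumption chosen so that a degree-2 polynomial kernel fits it naturally):

```python
import numpy as np
from sklearn.svm import SVC

# Toy data: label is 1 when x1^2 > x2, i.e. the true boundary is quadratic
rng = np.random.RandomState(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] ** 2 > X[:, 1]).astype(int)

# degree sets the polynomial order; coef0 adds the constant term r
clf = SVC(kernel="poly", degree=2, coef0=1.0)
clf.fit(X, y)
print(clf.score(X, y))
```

Because the true boundary is exactly quadratic, the degree-2 feature space makes the classes linearly separable there, so training accuracy should be high.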

## RBF Kernel Usage

The RBF (Radial Basis Function, or Gaussian) kernel, K(x, xj) = exp(−γ‖x − xj‖²), measures similarity by distance and can model highly non-linear boundaries. It is Scikit-learn's default kernel for `SVC`; the `gamma` parameter controls how far the influence of a single training sample reaches.
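A minimal sketch on concentric circles, a classic case that no linear or low-degree polynomial boundary handles well but the RBF kernel does (the `make_circles` data is an illustrative assumption):

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric rings: not linearly separable in the input space
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

clf = SVC(kernel="rbf", gamma="scale")  # gamma controls the kernel width
clf.fit(X, y)
print(clf.score(X, y))
```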

## Sigmoid Kernel Usage

The sigmoid kernel originates from neural network applications, and it can be used with Support Vector Machines, although it is not a particularly popular or practical choice.

It is mostly associated with neural networks and logistic regression. The kernel function resembles a two-layer perceptron, where tanh works as the activation function for the neurons.
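A brief sketch of fitting an SVC with the sigmoid kernel (the `make_classification` dataset is an illustrative assumption; accuracy with this kernel is often mediocre, which matches its limited practical use):

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=4, random_state=0)

# coef0 maps to the constant c in tanh(alpha * x.T @ xj + c)
clf = SVC(kernel="sigmoid", gamma="scale", coef0=0.0)
clf.fit(X, y)
print(clf.score(X, y))
```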

It can be written as:

Sigmoid Kernel Function:
K(x, xj) = tanh(α xᵀxj + c)

where α is a scale parameter (`gamma` in Scikit-learn) and c is an offset (`coef0`).
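The formula can be checked numerically against Scikit-learn's own pairwise implementation (the sample matrix and parameter values below are arbitrary):

```python
import numpy as np
from sklearn.metrics.pairwise import sigmoid_kernel

rng = np.random.RandomState(0)
X = rng.randn(5, 3)  # 5 samples, 3 features

gamma, coef0 = 0.5, 1.0

# Manual computation: tanh(alpha * x.T @ xj + c) for every pair of samples
K_manual = np.tanh(gamma * X @ X.T + coef0)

# Scikit-learn's built-in version of the same kernel
K_sklearn = sigmoid_kernel(X, X, gamma=gamma, coef0=coef0)

print(np.allclose(K_manual, K_sklearn))  # True
```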

## Summary

Different kernels suit different data problems. Applications range from the simple linear kernel to more complex kernel methods that create multi-dimensional hyperplanes to serve as decision boundaries and separate complex data relations.

Kernels are both a pro and a con of Support Vector Machines. Their advantage is the flexibility and customization they offer for tackling many different problems in different ways. Their disadvantage is that they can complicate practical solutions and increase the use of computational resources.

Kernel selection can be a very important aspect of a Support Vector Machine implementation. In general it is highly dependent on the problem at hand: you will want a kernel that matches the linearity (or non-linearity) of the dataset.

A custom kernel example from the Scikit-learn documentation:

https://scikit-learn.org/stable/auto_examples/svm/plot_custom_kernel.html
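The linked example passes a Python callable as the kernel. A condensed sketch of the same idea, using a weighted linear kernel on the first two Iris features (the weight matrix M is illustrative):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X = X[:, :2]  # keep only the first two features, as in the linked example

def my_kernel(A, B):
    """Custom weighted linear kernel: K(a, b) = a @ M @ b.T with M = diag(2, 1)."""
    M = np.array([[2.0, 0.0],
                  [0.0, 1.0]])
    return A @ M @ B.T

# SVC accepts any callable returning the Gram matrix between two sample sets
clf = SVC(kernel=my_kernel)
clf.fit(X, y)
print(clf.score(X, y))
```

The callable must return the Gram matrix of shape `(n_samples_A, n_samples_B)`; Scikit-learn calls it both at fit time and at predict time.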
