Kernel Methods for Dynamical Systems
Kernel methods approach the problem by mapping the data into a high-dimensional
feature space, where each coordinate corresponds to one feature of the data
items, transforming the data into a set of points in a Euclidean space. In that
space, a variety of methods can be used to find relations in the data. Since the
mapping can be quite general (it need not be linear, for example), the relations
found this way are correspondingly general. Crucially, the mapping never has to
be computed explicitly: the methods only require inner products between mapped
points, and these can be evaluated directly through a kernel function. This is
known as the kernel trick.
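As a minimal illustration of the kernel trick (a sketch in Python/NumPy, not part of the tool itself): for a degree-2 polynomial kernel the feature map can be written out explicitly, and the kernel value equals the inner product of the explicitly mapped points. The function names `poly_kernel` and `phi` are illustrative.

```python
import numpy as np

def poly_kernel(x, y, c=1.0):
    """Degree-2 polynomial kernel: k(x, y) = (x . y + c)^2."""
    return (np.dot(x, y) + c) ** 2

def phi(x, c=1.0):
    """Explicit feature map of the degree-2 polynomial kernel for 2-D inputs."""
    x1, x2 = x
    return np.array([x1**2, x2**2,
                     np.sqrt(2.0) * x1 * x2,
                     np.sqrt(2.0 * c) * x1,
                     np.sqrt(2.0 * c) * x2,
                     c])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# The kernel evaluates the inner product in feature space without ever
# constructing phi(x) explicitly -- this is the "kernel trick".
print(poly_kernel(x, y))        # (1*3 + 2*0.5 + 1)^2 = 25.0
print(np.dot(phi(x), phi(y)))   # same value (up to rounding), via the explicit map
```

For the RBF kernel the corresponding feature space is infinite-dimensional, so the implicit evaluation is not just convenient but necessary.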
Kernel Parameters:
- Kernel Type (the three kernel functions are sketched after this list):
  - Linear: linear kernel
  - Polynomial: polynomial kernel
  - RBF: radial basis function (Gaussian) kernel
- Kernel Width: inverse variance of the kernel function; determines the radius of influence of each sample (RBF and Poly only)
- Degree: degree of the polynomial (Poly only)
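For reference, the three kernel functions can be sketched as follows (Python/NumPy; `gamma` stands in for the inverse-variance Kernel Width parameter and `degree` for Degree; the constant offset `coef0` is an assumption borrowed from libsvm-style polynomial kernels and may not match the tool's exact parameterization):

```python
import numpy as np

def linear_kernel(x, y):
    # k(x, y) = x . y
    return np.dot(x, y)

def polynomial_kernel(x, y, gamma=1.0, degree=3, coef0=1.0):
    # k(x, y) = (gamma * x . y + coef0)^degree
    # gamma plays the role of Kernel Width, degree of Degree.
    return (gamma * np.dot(x, y) + coef0) ** degree

def rbf_kernel(x, y, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2)
    # A larger gamma (smaller width) gives each sample a tighter radius of influence.
    return np.exp(-gamma * np.sum((x - y) ** 2))
```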
Methods:
- eps-SVR: Support Vector Regression with an epsilon-insensitive tube (see the sketches after this list)
  - C: penalty parameter
  - eps: width of the epsilon-tube
- nu-SVR: nu-parameterized Support Vector Regression
  - C: penalty parameter
  - nu: upper bound on the fraction of errors and lower bound on the fraction of support vectors (eps is computed automatically)
- SOGP: Sparse Online Gaussian Process regression
  - Capacity: maximum number of basis vectors kept (0: automatic, -1: no sparsification, i.e. standard Gaussian Process regression)
  - Noise: assumed observation noise level
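The regressors above are implemented inside the tool itself. As a rough, non-authoritative illustration of how the eps-SVR and nu-SVR parameters behave, here is a comparable setup with scikit-learn (an external library, used here only for the sketch); `C`, `epsilon`, `nu`, and `gamma` correspond to the parameters listed above:

```python
import numpy as np
from sklearn.svm import SVR, NuSVR

# Toy 1-D regression data: a noisy sine wave.
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 100).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(100)

# eps-SVR: C is the penalty parameter, epsilon the width of the
# insensitive tube around the regression function.
eps_svr = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma=1.0)
eps_svr.fit(X, y)

# nu-SVR: nu bounds the fraction of errors / support vectors,
# and the epsilon-tube width is found automatically.
nu_svr = NuSVR(kernel="rbf", C=10.0, nu=0.3, gamma=1.0)
nu_svr.fit(X, y)

print("eps-SVR support vectors:", len(eps_svr.support_))
print("nu-SVR  support vectors:", len(nu_svr.support_))
```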
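SOGP itself is not available in common Python libraries, but with Capacity set to -1 it reduces to standard Gaussian Process regression, which can be sketched with scikit-learn's GaussianProcessRegressor; the `alpha` argument is used here to play a role loosely similar to the Noise parameter above (this mapping is an assumption):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 50).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(50)

# Standard (non-sparse) GP regression, i.e. the Capacity = -1 case.
# alpha adds the assumed observation noise variance to the kernel diagonal,
# loosely corresponding to the Noise parameter above.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.1 ** 2)
gp.fit(X, y)

mean, std = gp.predict(X, return_std=True)
print("max predictive std:", std.max())
```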