Kernel Methods for Classification
    
    Kernel methods approach the problem by mapping the data into a
    high-dimensional feature space, where each coordinate corresponds to
    one feature of the data items, turning the data into a set of points
    in a Euclidean space. In that space, a variety of methods can be used
    to find relations in the data. Since the mapping can be quite general
    (it need not be linear, for example), the relations found in this way
    are accordingly very general. Because the feature space is only ever
    accessed through inner products, which the kernel function computes
    directly from the original data, the mapping never has to be carried
    out explicitly; this is known as the kernel trick.
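
    As a minimal sketch of the idea, the degree-2 polynomial kernel
    k(x, z) = (x . z)^2 computes the same inner product as an explicit
    map phi into three dimensions (the feature map and the toy vectors
    below are illustrative, not part of the tool):

        import numpy as np

        # Explicit feature map into 3 dimensions:
        # phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2).
        def phi(x):
            return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

        # Kernel function: the same inner product, computed implicitly.
        def k(x, z):
            return np.dot(x, z) ** 2

        x = np.array([1.0, 2.0])
        z = np.array([3.0, 0.5])
        print(np.dot(phi(x), phi(z)))  # explicit mapping: 16.0
        print(k(x, z))                 # kernel trick:     16.0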
    
    More information is available on Wikipedia.
    
    The model information displays support vectors as black or white
    circles. When both black and white circles are shown, black circles
    correspond to SVs that fall inside or beyond the SVM margin, while
    white circles mark SVs lying exactly on the margin (see the sketch
    below).
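
    This distinction can be checked numerically. The following minimal
    sketch uses scikit-learn's SVC rather than the tool itself, with
    made-up toy data: in a trained C-SVM, support vectors with alpha < C
    lie exactly on the margin, while those with alpha = C fall inside or
    beyond it.

        import numpy as np
        from sklearn.svm import SVC

        X = np.array([[0, 0], [1, 0], [2, 0], [1, 1], [2, 2], [0.9, 1.1]])
        y = np.array([0, 0, 0, 1, 1, 1])

        clf = SVC(kernel='rbf', C=1.0, gamma=0.5).fit(X, y)

        # dual_coef_ holds y_i * alpha_i for each support vector.
        alphas = np.abs(clf.dual_coef_[0])
        on_margin = alphas < clf.C - 1e-9
        print("SVs on the margin:       ", clf.support_[on_margin])
        print("SVs inside/beyond margin:", clf.support_[~on_margin])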
    
    The ROC curve is generated by varying the threshold on the output
    function of the SVM/RVM classifier (by default this threshold is set
    to 0, so the classification function reduces to sign(x)).
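
    A minimal sketch of this construction, assuming scores holds the
    classifier's raw output values and labels the true classes in
    {-1, +1} (both arrays below are illustrative):

        import numpy as np

        def roc_points(scores, labels):
            # Sweep the threshold over all observed scores, from high to
            # low, and collect (false-positive rate, true-positive rate).
            points = []
            for t in sorted(scores, reverse=True):
                pred = np.where(scores >= t, 1, -1)
                tpr = np.sum((pred == 1) & (labels == 1)) / np.sum(labels == 1)
                fpr = np.sum((pred == 1) & (labels == -1)) / np.sum(labels == -1)
                points.append((fpr, tpr))
            return points

        scores = np.array([1.2, 0.3, -0.4, 0.8, -1.5, 0.1])
        labels = np.array([1, 1, -1, 1, -1, -1])
        print(roc_points(scores, labels))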
    
    Kernel Parameters:
    
      - Kernel Type (each kernel is sketched in code after this list):
        - Linear: linear kernel
        - Polynomial: polynomial kernel
        - RBF: radial basis function (Gaussian) kernel
      - Kernel Width: inverse variance of the kernel function; determines
        the radius of influence of each sample (RBF and Polynomial)
      - Degree: degree of the polynomial (Polynomial)
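
    The three kernels can be sketched as follows, assuming the common
    parameterization in which gamma plays the role of the Kernel Width
    (an inverse variance) and d the Degree; the exact polynomial form
    (offset and scaling) used by the tool is an assumption here:

        import numpy as np

        def linear_kernel(x, z):
            return np.dot(x, z)

        # Assumed form; the tool's offset/scaling may differ.
        def polynomial_kernel(x, z, gamma=1.0, d=3):
            return (gamma * np.dot(x, z) + 1.0) ** d

        def rbf_kernel(x, z, gamma=1.0):
            # Larger gamma (smaller width) means a smaller radius of
            # influence around each sample.
            return np.exp(-gamma * np.sum((x - z) ** 2))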

    Methods:

      - C-SVM: Support Vector Machine
        - C: penalty parameter (nu is computed automatically)
      - nu-SVM: Support Vector Machine
        - nu: fraction of support vectors to be kept (C is computed
          automatically)
      - RVM: Relevance Vector Machine
        - eps: convergence threshold
      - Pegasos: Primal Estimated sub-GrAdient SOlver for SVM (sketched
        in code below)
        - lambda: regularization parameter, trading margin width against
          training error
        - maxSV: maximum number of support vectors allowed
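
    The core Pegasos update can be sketched as follows for a linear SVM,
    assuming labels in {-1, +1}; the function and its defaults are
    illustrative, and the optional projection step of the published
    algorithm (as well as the tool's maxSV cap) is omitted:

        import numpy as np

        def pegasos(X, y, lam=0.1, n_iters=1000, seed=0):
            rng = np.random.default_rng(seed)
            w = np.zeros(X.shape[1])
            for t in range(1, n_iters + 1):
                i = rng.integers(len(X))       # pick one sample at random
                eta = 1.0 / (lam * t)          # decreasing step size
                margin_violated = y[i] * np.dot(w, X[i]) < 1
                w = (1 - eta * lam) * w        # regularization shrinkage
                if margin_violated:            # hinge-loss sub-gradient
                    w = w + eta * y[i] * X[i]
            return w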