orngSVM provides access to Orange's Support Vector Machine functionality.
Important! On some datasets this learner can perform very badly. SVMs are known to be very sensitive to the choice of parameters. If you are having problems with the
learner's accuracy, try scaling the data and using different parameters, or choose an easier approach and use the SVMLearnerEasy
class, which does this automatically.
SVMLearner
SVMLearner is a function that constructs an SVMLearner object and optionally trains it on the provided examples.
Arguments
- svm_type
- Defines the type of SVM (can be SVMLearner.C_SVC, SVMLearner.Nu_SVC (default), SVMLearner.OneClass, SVMLearner.Epsilon_SVR, SVMLearner.Nu_SVR)
- kernel_type
- Defines the type of a kernel to use for learning (can be SVMLearner.RBF (default), SVMLearner.Linear, SVMLearner.Polynomial, SVMLearner.Sigmoid, SVMLearner.Custom)
- degree
- Kernel parameter (Polynomial) (default 3)
- gamma
- Kernel parameter (Polynomial/RBF/Sigmoid) (default 1/number_of_examples)
- coef0
- Kernel parameter (Polynomial/Sigmoid) (default 0)
- kernelFunc
- Function that will be called if kernel_type is SVMLearner.Custom. It must accept two orange.Example arguments and return a float.
- C
- C parameter for C_SVC, Epsilon_SVR, Nu_SVR
- nu
- Nu parameter for Nu_SVC, Nu_SVR and OneClass (default 0.5)
- p
- Epsilon in the loss function for Epsilon_SVR
- cache_size
- Cache memory size in MB (default 100)
- eps
- Tolerance of termination criterion (default 0.001)
- shrinking
- Determines whether to use shrinking heuristics (default True)
- probability
- Determines whether a probability model should be built (default False)
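To illustrate the contract a custom kernelFunc must satisfy, here is a minimal sketch using plain lists of floats as stand-ins for orange.Example objects (the function names and the stand-in data are hypothetical, not part of the Orange API):

```python
import math

# A custom kernel must accept two examples and return a float.
# Plain lists of floats stand in for orange.Example objects here.
def linear_kernel(example1, example2):
    # Dot product of the two examples' feature values.
    return float(sum(a * b for a, b in zip(example1, example2)))

def rbf_kernel(example1, example2, gamma=0.5):
    # Gaussian RBF over the squared Euclidean distance.
    dist2 = sum((a - b) ** 2 for a, b in zip(example1, example2))
    return math.exp(-gamma * dist2)
```

Any callable with this signature can be assigned to the learner's kernelFunc attribute when kernel_type is SVMLearner.Custom.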
SVMLearnerSparse
Same as SVMLearner,
except that it learns from the examples' meta attributes. Note that meta attributes do not need to be registered with the dataset domain or be present in all examples.
Use this if you are using large sparse datasets.
SVMLearnerEasy
Same as above, except that it automatically scales the data and performs parameter optimization using parameter_selection,
similar to the easy.py script
in the libSVM package. Use this if SVMLearner
performs badly.
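The scaling step that this easy.py-style preprocessing performs can be sketched as follows (a simplified illustration, not the actual implementation; the real class also optimizes parameters such as C and gamma):

```python
def scale_features(rows):
    # Linearly rescale each feature column to the [0, 1] range,
    # as easy.py-style preprocessing does before training.
    # rows: list of examples, each a list of floats (one per feature).
    cols = list(zip(*rows))
    scaled_cols = []
    for col in cols:
        lo, hi = min(col), max(col)
        span = hi - lo
        if span == 0:
            # Constant feature: map everything to 0.
            scaled_cols.append([0.0] * len(col))
        else:
            scaled_cols.append([(v - lo) / span for v in col])
    return [list(r) for r in zip(*scaled_cols)]
```

Scaling matters for SVMs because features with large numeric ranges would otherwise dominate the kernel computation.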
SVMLearnerSparseEasy
Same as SVMLearnerEasy,
except that it learns from the examples' meta attributes. Note that meta attributes do not need to be registered with the dataset domain or be present in all examples.
Use this if you are using large sparse datasets (and have absolutely no respect for the fourth dimension, commonly known as time).
getLinearSVMWeights
Returns a list of weights of the class-vs-class classifiers of a linear multiclass SVM classifier. The list is ordered 1vs2, 1vs3, ..., 1vsN, 2vs3, ...
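The pair ordering of that list can be enumerated explicitly (a small illustrative helper, not part of orngSVM):

```python
def class_pair_order(n_classes):
    # Enumerate class-vs-class pairs in the order the weight list uses:
    # 1vs2, 1vs3, ..., 1vsN, 2vs3, ... (1-based class indices).
    return [(i, j) for i in range(1, n_classes + 1)
                   for j in range(i + 1, n_classes + 1)]
```

For N classes this yields N*(N-1)/2 pairs, one weight vector per pairwise classifier.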
KernelWrapper (DualKernelWrapper)
KernelWrapper (DualKernelWrapper) is an abstract wrapper class that takes one (two) kernel function(s) as initialization parameter(s)
and uses them to compute a new kernel function. The available kernel wrappers are RBFKernelWrapper, PolyKernelWrapper, AdditionKernelWrapper and MultiplicationKernelWrapper.
Methods
- __call__(example1, example2)
- Computes the kernel function for the two examples
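The wrapper pattern can be sketched in plain Python (the class and function names below are illustrative stand-ins, not the actual orngSVM classes): a wrapper stores one or two kernel functions and is itself callable on two examples.

```python
class OneKernelWrapper(object):
    # Stores a single wrapped kernel function K1.
    def __init__(self, wrapped):
        self.wrapped = wrapped

class TwoKernelWrapper(object):
    # Stores two wrapped kernel functions K1 and K2.
    def __init__(self, wrapped1, wrapped2):
        self.wrapped1 = wrapped1
        self.wrapped2 = wrapped2

class PolyWrap(OneKernelWrapper):
    # K(x, y) = K1(x, y) ^ degree
    def __init__(self, wrapped, degree=2):
        OneKernelWrapper.__init__(self, wrapped)
        self.degree = degree

    def __call__(self, example1, example2):
        return self.wrapped(example1, example2) ** self.degree

class AddWrap(TwoKernelWrapper):
    # K(x, y) = K1(x, y) + K2(x, y)
    def __call__(self, example1, example2):
        return self.wrapped1(example1, example2) + self.wrapped2(example1, example2)

def dot(a, b):
    # A simple base kernel used for demonstration.
    return float(sum(x * y for x, y in zip(a, b)))
```

Because the wrapper is callable with the (example1, example2) signature, the resulting object can itself be wrapped again or used wherever a kernel function is expected.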
RBFKernelWrapper
Takes one kernel function (K1) in initialization and uses it to compute a new kernel function: K(x, y)=exp(-gamma*K1(x, y)^2)
Attributes
- gamma
- gamma to use in the kernel function
PolyKernelWrapper
Takes one kernel function (K1) in initialization and uses it to compute a new kernel function: K(x,y)=K1(x,y)^degree
Attributes
- degree
- degree to use in the kernel function
AdditionKernelWrapper
Takes two kernel functions (K1 and K2) in initialization and uses them to compute a new kernel function: K(x,y)=K1(x,y)+K2(x,y)
MultiplicationKernelWrapper
Takes two kernel functions (K1 and K2) in initialization and uses them to compute a new kernel function: K(x,y)=K1(x,y)*K2(x,y)
CompositeKernelWrapper
Takes two kernel functions (K1 and K2) in initialization and uses them to compute a new kernel function: K(x, y)=lambda*K1(x, y)+(1-lambda)*K2(x, y)
Attributes
- _lambda
- lambda to use in the kernel function
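The composite combination can be sketched as follows (a hypothetical stand-in class, not the actual orngSVM implementation):

```python
class CompositeWrapSketch(object):
    # Blend two kernels with a mixing weight lambda in [0, 1]:
    # K(x, y) = lambda*K1(x, y) + (1 - lambda)*K2(x, y)
    def __init__(self, k1, k2, _lambda=0.5):
        self.k1 = k1
        self.k2 = k2
        self._lambda = _lambda

    def __call__(self, example1, example2):
        lam = self._lambda
        return lam * self.k1(example1, example2) + (1.0 - lam) * self.k2(example1, example2)
```

A convex combination of two kernels is again a valid kernel, which is why the mixing weight is constrained to lie between 0 and 1.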
SparseLinKernel
A linear kernel function over the examples' meta attributes (which must be floats). The meta attributes need not be present in all examples.
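The idea of a sparse linear kernel over meta attributes can be sketched with dicts standing in for the examples' meta-attribute mappings (an illustrative sketch, not the actual orngSVM implementation):

```python
def sparse_lin_kernel(meta1, meta2):
    # Dot product over the attributes the two examples share;
    # attributes missing from either example contribute nothing,
    # so the sum only runs over the (small) shared key set.
    shared = set(meta1) & set(meta2)
    return float(sum(meta1[k] * meta2[k] for k in shared))
```

This is why sparse datasets can be handled efficiently: the cost of one kernel evaluation depends on the number of attributes the two examples actually have, not on the full attribute space.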
Examples
import orange, orngSVM
data=orange.ExampleTable("iris.tab")
l1=orngSVM.SVMLearner()
l1.kernelFunc=orngSVM.RBFKernelWrapper(orange.ExamplesDistanceConstructor_Euclidean(data), gamma=0.5)
l1.kernel_type=orange.SVMLearner.Custom
l1.probability=True
c1=l1(data)
l1.name="SVM - RBF(Euclidean)"
l2=orngSVM.SVMLearner()
l2.kernelFunc=orngSVM.RBFKernelWrapper(orange.ExamplesDistanceConstructor_Hamming(data), gamma=0.5)
l2.kernel_type=orange.SVMLearner.Custom
l2.probability=True
c2=l2(data)
l2.name="SVM - RBF(Hamming)"
l3=orngSVM.SVMLearner()
l3.kernelFunc=orngSVM.CompositeKernelWrapper(orngSVM.RBFKernelWrapper(orange.ExamplesDistanceConstructor_Euclidean(data), gamma=0.5),orngSVM.RBFKernelWrapper(orange.ExamplesDistanceConstructor_Hamming(data), gamma=0.5), l=0.5)
l3.kernel_type=orange.SVMLearner.Custom
l3.probability=True
c3=l3(data)
l3.name="SVM - Composite"
import orngTest, orngStat
tests=orngTest.crossValidation([l1, l2, l3], data, folds=5)
[ca1, ca2, ca3]=orngStat.CA(tests)
print l1.name, "CA:", ca1
print l2.name, "CA:", ca2
print l3.name, "CA:", ca3
LinearLearner
A wrapper around orange.LinearLearner with the default solver_type set to L2Loss_SVM_Dual (the default in orange.LinearLearner is L2_LR).