Efficient Additive Kernels via Explicit Feature Maps
A. Vedaldi and A. Zisserman

Type: Paper
Bibtex:
@ARTICLE{6136519,
author={A. Vedaldi and A. Zisserman},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
title={Efficient Additive Kernels via Explicit Feature Maps},
year={2012},
volume={34},
number={3},
pages={480-492},
abstract={Large scale nonlinear support vector machines (SVMs) can be approximated by linear ones using a suitable feature map. The linear SVMs are in general much faster to learn and evaluate (test) than the original nonlinear SVMs. This work introduces explicit feature maps for the additive class of kernels, such as the intersection, Hellinger's, and χ2 kernels, commonly used in computer vision, and enables their use in large scale problems. In particular, we: 1) provide explicit feature maps for all additive homogeneous kernels along with closed form expression for all common kernels; 2) derive corresponding approximate finite-dimensional feature maps based on a spectral analysis; and 3) quantify the error of the approximation, showing that the error is independent of the data dimension and decays exponentially fast with the approximation order for selected kernels such as χ2. We demonstrate that the approximations have indistinguishable performance from the full kernels yet greatly reduce the train/test times of SVMs. We also compare with two other approximation methods: Nystrom's approximation of Perronnin et al. [1], which is data dependent, and the explicit map of Maji and Berg [2] for the intersection kernel, which, as in the case of our approximations, is data independent. The approximations are evaluated on a number of standard data sets, including Caltech-101 [3], Daimler-Chrysler pedestrians [4], and INRIA pedestrians [5].},
keywords={approximation theory;computer vision;data handling;feature extraction;learning (artificial intelligence);spectral analysis;support vector machines;Nystrom approximation;additive homogeneous kernels;approximate finite-dimensional feature maps;approximation error;computer vision;data dependency;explicit feature maps;exponential decay;large scale nonlinear support vector machines;linear SVM;spectral analysis;Additives;Approximation methods;Histograms;Kernel;Measurement;Support vector machines;Training;Kernel methods;feature map;large scale learning;object detection;object recognition},
doi={10.1109/TPAMI.2011.153},
ISSN={0162-8828},
month={March}
}
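As a rough illustration of the abstract's main idea, the sketch below builds a finite-dimensional explicit feature map for the χ2 kernel k(x, y) = 2xy/(x + y) by sampling the kernel's spectrum (signature κ(λ) = sech(πλ)) at frequencies jL, following the general homogeneous-kernel construction the paper describes. This is not the authors' code; the approximation order `n` and sampling step `L` are arbitrary choices here, and the function names are mine.

```python
import numpy as np

def chi2_feature_map(x, n=2, L=0.5):
    """Approximate (2n+1)-dimensional feature map for the chi-squared
    kernel k(x, y) = 2xy/(x+y), x >= 0, built by sampling the kernel
    signature kappa(lambda) = sech(pi*lambda) at frequencies j*L."""
    x = np.asarray(x, dtype=float)
    kappa = lambda lam: 1.0 / np.cosh(np.pi * lam)
    logx = np.log(np.maximum(x, 1e-12))        # guard against log(0)
    feats = [np.sqrt(x * L * kappa(0.0))]      # zero-frequency component
    for j in range(1, n + 1):
        w = np.sqrt(2.0 * x * L * kappa(j * L))
        feats.append(w * np.cos(j * L * logx)) # cosine component at j*L
        feats.append(w * np.sin(j * L * logx)) # sine component at j*L
    return np.stack(feats, axis=-1)

def chi2_kernel(x, y):
    """Exact chi-squared kernel for scalars."""
    return 2.0 * x * y / (x + y)

# The inner product of the maps approximates the exact kernel value.
x, y = 0.3, 0.7
approx = float(chi2_feature_map(np.array([x])) @ chi2_feature_map(np.array([y])).T)
exact = chi2_kernel(x, y)
```

With n = 2 and L = 0.5 the 5-dimensional map already agrees with the exact kernel to within about 1% on this example, consistent with the abstract's claim that the error decays exponentially with the approximation order for the χ2 kernel.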
