Gaussian Quadrature for Kernel Features.

Tri Dao; Christopher De Sa; Christopher Ré

Abstract

Kernel methods have recently attracted resurgent interest, showing performance competitive with deep neural networks in tasks such as speech recognition. The random Fourier features map is a technique commonly used to scale up kernel machines, but employing the randomized feature map means that O(ε^-2) samples are required to achieve an approximation error of at most ε. We investigate some alternative schemes for constructing feature maps that are deterministic, rather than random, by approximating the kernel in the frequency domain using Gaussian quadrature. We show that deterministic feature maps can be constructed, for any γ > 0, to achieve error ε with O(e^γ + ε^(-1/γ)) samples as ε goes to 0. Our method works particularly well with sparse ANOVA kernels, which are inspired by the convolutional layer of CNNs. We validate our methods on datasets in different domains, such as MNIST and TIMIT, showing that deterministic features are faster to generate and achieve accuracy comparable to state-of-the-art kernel methods based on random Fourier features.
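For context, the random Fourier features baseline the abstract contrasts with can be sketched as follows. This is a minimal illustration, not the paper's deterministic quadrature method: a Gaussian kernel k(x, y) = exp(-||x - y||²/2) is approximated by sampling frequencies from its Fourier transform (a standard normal), so the feature dot product concentrates around the kernel value at the O(ε^-2) sampling rate the abstract mentions. The dimensions and sample count below are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
d, D = 5, 2000  # input dimension, number of random features (D scales like eps^-2)

W = rng.standard_normal((d, D))        # frequencies w ~ N(0, I), the Gaussian kernel's spectrum
b = rng.uniform(0, 2 * np.pi, size=D)  # random phases

def rff(x):
    """Map x to D cosine features so that rff(x) @ rff(y) ≈ k(x, y)."""
    return np.sqrt(2.0 / D) * np.cos(x @ W + b)

x = rng.standard_normal(d)
y = rng.standard_normal(d)
exact = np.exp(-np.linalg.norm(x - y) ** 2 / 2)
approx = rff(x) @ rff(y)
print(abs(exact - approx))  # Monte Carlo error, shrinking roughly like 1/sqrt(D)
```

The paper's contribution is to replace the random frequencies W with deterministic Gaussian-quadrature nodes (with weights) in this same frequency-domain construction, improving the sample complexity as stated above.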

Year:  2017        PMID: 29398882      PMCID: PMC5791159     

Source DB:  PubMed          Journal:  Adv Neural Inf Process Syst        ISSN: 1049-5258


  1 in total

1.  Low-Precision Random Fourier Features for Memory-Constrained Kernel Approximation.

Authors:  Jian Zhang; Avner May; Tri Dao; Christopher Ré
Journal:  Proc Mach Learn Res       Date:  2019-04
