Ryan Mohr1, Maria Fonoberova1, Zlatko Drmač2, Iva Manojlović1, Igor Mezić1,3.
Abstract
Hierarchical support vector regression (HSVR) models a function from data as a linear combination of SVR models at a range of scales, starting at a coarse scale and moving to finer scales as the hierarchy deepens. The original formulation of HSVR gave no rule for choosing the depth of the model. In this paper, we observe a phase transition in the training error across a number of models: the error remains relatively constant as layers are added until a critical scale is passed, at which point the training error drops close to zero and remains nearly constant as further layers are added. We introduce a method to predict this critical scale a priori, based on the support of either the Fourier transform of the data or the Dynamic Mode Decomposition (DMD) spectrum. This allows the required number of layers to be determined before training any models.
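A minimal sketch of the two ideas in the abstract, not the authors' implementation: (1) bound the finest scale needed using the support of the data's Fourier transform, and (2) train a hierarchy of RBF-kernel SVR layers, each fitting the residual of the layers above it at a finer scale. The scale schedule (halving per layer), the amplitude threshold, and the SVR hyperparameters (`C`, `epsilon`) are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVR

# Toy signal with a low- and a high-frequency component on [0, 1).
n = 256
x = np.linspace(0.0, 1.0, n, endpoint=False).reshape(-1, 1)
y = np.sin(2 * np.pi * x).ravel() + 0.3 * np.sin(2 * np.pi * 8 * x).ravel()

# (1) Highest frequency with non-negligible Fourier amplitude bounds the
#     finest scale the hierarchy must reach (heuristic threshold assumed).
amps = np.abs(np.fft.rfft(y)) / n
f_max = int(np.max(np.nonzero(amps > 0.01 * amps.max())[0]))  # cycles/unit
finest_scale = 1.0 / (2.0 * f_max)

# Halve the scale each layer until the finest scale is reached; the list
# length is the predicted model depth, fixed before any training.
scales, s = [], 1.0
while s > finest_scale:
    scales.append(s)
    s /= 2.0
scales.append(finest_scale)

# (2) Each layer is an RBF-kernel SVR fit to the residual of the layers
#     above it; gamma = 1 / (2 s^2) ties the kernel width to the scale.
residual = y.copy()
errors = []
for s in scales:
    layer = SVR(kernel="rbf", gamma=1.0 / (2.0 * s**2), C=100.0, epsilon=1e-3)
    layer.fit(x, residual)
    residual = residual - layer.predict(x)
    errors.append(float(np.sqrt(np.mean(residual**2))))

print("max frequency:", f_max)
print("predicted depth:", len(scales))
print("training RMSE by layer:", [round(e, 4) for e in errors])
```

On this toy signal the training RMSE stays relatively large while only coarse layers are present and collapses once the scale schedule passes the half-period of the highest frequency, which is the phase-transition behavior the abstract describes.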
Keywords: dynamic mode decomposition; Fourier transform; Koopman operator; support vector regression
Year: 2020 PMID: 33383907 PMCID: PMC7824529 DOI: 10.3390/e23010037
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524