| Literature DB >> 20556846 |
Jochen Einbeck, Ludger Evers, Benedict Powell.
Abstract
We consider principal curves and surfaces in the context of multivariate regression modelling. For predictor spaces featuring complex dependency patterns between the involved variables, the intrinsic dimensionality of the data tends to be very small due to the high redundancy induced by the dependencies. In situations of this type, it is useful to approximate the high-dimensional predictor space through a low-dimensional manifold (i.e., a curve or a surface), and use the projections onto the manifold as compressed predictors in the regression problem. In the case that the intrinsic dimensionality of the predictor space equals one, we use the local principal curve algorithm for the compression step. We provide a novel algorithm which extends this idea to local principal surfaces, thus covering cases of an intrinsic dimensionality equal to two, and which in principle extends to manifolds of arbitrary dimension. We motivate and apply the novel techniques using astrophysical and oceanographic data examples.
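The compress-then-regress idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's local principal curve algorithm: it uses the leading principal component as a linear stand-in for the fitted manifold, on simulated data where the predictors lie near a one-dimensional subspace and the response depends only on the intrinsic coordinate. All variable names and the simulated data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated predictors near a 1-D manifold embedded in 3-D space,
# with a response driven by the intrinsic coordinate t (all hypothetical).
t = rng.uniform(0, 1, 200)                            # intrinsic coordinate
X = np.column_stack([t, 2 * t, -t]) + rng.normal(0, 0.05, (200, 3))
y = np.sin(2 * np.pi * t) + rng.normal(0, 0.1, 200)

# Compression step: project predictors onto the leading principal direction.
# (A linear stand-in for the local principal curve, which would also
# accommodate genuinely curved manifolds.)
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
s = Xc @ Vt[0]                                        # compressed 1-D predictor

# Regression step: fit the response on the single compressed predictor.
coef = np.polyfit(s, y, deg=5)
resid = y - np.polyval(coef, s)
print(round(1 - resid.var() / y.var(), 2))            # explained-variance ratio
```

The point of the compression is that the subsequent regression is a one-dimensional problem, regardless of the ambient dimension of the predictor space.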
Year: 2010 PMID: 20556846 DOI: 10.1142/S0129065710002346
Source DB: PubMed Journal: Int J Neural Syst ISSN: 0129-0657 Impact factor: 5.866