Stochastic Mutual Information Gradient Estimation for Dimensionality Reduction Networks.

Ozan Özdenizci; Deniz Erdoğmuş

Abstract

Feature ranking and selection is a widely used approach in various applications of supervised dimensionality reduction in discriminative machine learning. Nevertheless, there exists significant evidence that feature ranking and selection algorithms, regardless of the criterion used, can lead to sub-optimal solutions for class separability. In that regard, we introduce emerging information-theoretic feature transformation protocols as an end-to-end neural network training approach. We present a dimensionality reduction network (MMINet) training procedure based on a stochastic estimate of the mutual information gradient. The network projects high-dimensional features onto an output feature space where the lower-dimensional representations carry maximum mutual information with their associated class labels. Furthermore, we formulate the training objective so that it can be estimated non-parametrically, with no distributional assumptions. We experimentally evaluate our method on high-dimensional biological data sets, and relate it to conventional feature selection algorithms, which form a special case of our approach.
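The idea described in the abstract, learning a projection by stochastic gradient ascent on a non-parametric mutual-information estimate, can be sketched as follows. This is a minimal illustration, not the authors' MMINet: it assumes a linear one-dimensional projection `w`, a Parzen-window (Gaussian kernel density) estimate of I(z; y), and a finite-difference gradient on minibatches in place of the paper's analytic stochastic gradient estimator.

```python
import numpy as np

def kde_mi(z, y, h=0.3):
    """Non-parametric (Parzen-window) estimate of I(z; y) in nats,
    for a 1-D projection z and discrete class labels y."""
    z = np.asarray(z, float)

    def log_kde(points, centers):
        # Log of a Gaussian kernel density estimate at `points`.
        d = (points[:, None] - centers[None, :]) / h
        dens = np.mean(np.exp(-0.5 * d ** 2), axis=1) / (h * np.sqrt(2 * np.pi))
        return np.log(dens + 1e-12)

    mi = 0.0
    for c in np.unique(y):
        zc = z[y == c]
        p_c = len(zc) / len(z)
        # I(z; y) = sum_c p(c) * E_{z|c}[log p(z|c) - log p(z)]
        mi += p_c * np.mean(log_kde(zc, zc) - log_kde(zc, z))
    return mi

def train_projection(X, y, steps=200, lr=0.5, eps=1e-4, seed=0):
    """Learn a unit-norm linear projection w by stochastic gradient
    ascent on the KDE mutual-information estimate (numerical gradient)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(steps):
        # Minibatch makes the gradient estimate stochastic.
        idx = rng.choice(len(X), size=min(64, len(X)), replace=False)
        Xb, yb = X[idx], y[idx]
        grad = np.zeros_like(w)
        for j in range(len(w)):  # central finite differences per coordinate
            e = np.zeros_like(w)
            e[j] = eps
            grad[j] = (kde_mi(Xb @ (w + e), yb)
                       - kde_mi(Xb @ (w - e), yb)) / (2 * eps)
        w += lr * grad
        w /= np.linalg.norm(w)  # keep the projection on the unit sphere
    return w
```

On synthetic data where only one feature carries class information, the learned `w` concentrates its weight on that feature, mirroring the special case noted in the abstract in which the framework reduces to feature selection.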

Keywords:  MMINet; dimensionality reduction; feature projection; information theoretic learning; mutual information; neural networks; stochastic gradient estimation

Year:  2021        PMID: 34262223      PMCID: PMC8274569          DOI: 10.1016/j.ins.2021.04.066

Source DB:  PubMed          Journal:  Inf Sci (N Y)        ISSN: 0020-0255            Impact factor:   8.233


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.