Literature DB >> 25530672

A novel feature selection method and its application.

Bing Li, Tommy W S Chow, Di Huang.

Abstract

In this paper, a novel feature selection method based on rough sets and mutual information is proposed. The dependency of each feature guides the selection, and mutual information is used to discard features that do not significantly increase dependency, so the dependency of the subset found by our method reaches its maximum with a small number of features. Because our method evaluates both definitive relevance and uncertain relevance through a combined selection criterion of dependency and a class-based distance metric, the resulting feature subset is more relevant than those found by other rough-set-based methods, and the subset is a near-optimal solution. To verify this contribution, eight different classification applications are employed. Our method is also applied to a real Alzheimer's disease dataset, where it finds a feature subset on which classification accuracy reaches 81.3%. These results verify the contribution of our method.
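The abstract describes greedy selection driven by a relevance criterion with a mutual-information check that discards features adding little new information. A minimal sketch of that style of selection, in the spirit of the MIFS/mRMR criteria cited in references 4 and 6 below (it does not reproduce the paper's rough-set dependency measure or class-based distance metric; all function and variable names are illustrative):

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Empirical mutual information (in bits) between two discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def greedy_select(features, labels, k, beta=1.0):
    """Forward selection: at each step pick the feature maximizing
    relevance I(f; y) minus beta times its mean mutual information with
    already-selected features (a redundancy penalty)."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        def score(name):
            relevance = mutual_information(features[name], labels)
            redundancy = (sum(mutual_information(features[name], features[s])
                              for s in selected) / len(selected)
                          if selected else 0.0)
            return relevance - beta * redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

For example, with `features = {'f1': labels, 'f2': labels, 'f3': noise}` the first feature chosen is `f1` (maximal relevance), after which the identical copy `f2` is penalized for redundancy, mirroring the abstract's goal of maximal dependency with few features.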

Keywords:  Alzheimer's disease; class-based distance metric; feature selection; mutual information; rough sets

Year:  2013        PMID: 25530672      PMCID: PMC4269276          DOI: 10.1007/s10844-013-0243-x

Source DB:  PubMed          Journal:  J Intell Inf Syst        ISSN: 0925-9902            Impact factor:   1.888


References: 7 in total

1.  2012 Alzheimer's disease facts and figures.

Authors: 
Journal:  Alzheimers Dement       Date:  2012       Impact factor: 21.566

2.  Discriminative semi-supervised feature selection via manifold regularization.

Authors:  Zenglin Xu; Irwin King; Michael Rung-Tsong Lyu; Rong Jin
Journal:  IEEE Trans Neural Netw       Date:  2010-06-21

3.  Estimating optimal feature subsets using efficient estimation of high-dimensional mutual information.

Authors:  Tommy W S Chow; D Huang
Journal:  IEEE Trans Neural Netw       Date:  2005-01

4.  Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy.

Authors:  Hanchuan Peng; Fuhui Long; Chris Ding
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2005-08       Impact factor: 6.226

5.  Input feature selection for classification problems.

Authors:  N Kwak; Chong-Ho Choi
Journal:  IEEE Trans Neural Netw       Date:  2002

6.  Using mutual information for selecting features in supervised neural net learning.

Authors:  R Battiti
Journal:  IEEE Trans Neural Netw       Date:  1994

7.  Normalized mutual information feature selection.

Authors:  Pablo A Estévez; Michel Tesmer; Claudio A Perez; Jacek M Zurada
Journal:  IEEE Trans Neural Netw       Date:  2009-01-13
