Mutual information based input feature selection for classification problems

Shuang Cang*, Hongnian Yu

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    48 Citations (Scopus)

    Abstract

    The feature elimination process aims to reduce the size of the input feature set while retaining the class discriminatory information needed for classification. This paper investigates feature selection approaches for classification problems and proposes a new feature selection algorithm based on the mutual information (MI) concept from information theory. Instead of measuring the MI between a single input feature and the class, the proposed algorithm calculates the MI between combinations of input features and the class, and it handles both continuous-valued and discrete-valued features. Three experimental tests are conducted to evaluate the proposed algorithm. Comparison studies with previously published classification algorithms indicate that the proposed algorithm is robust, stable and efficient.
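    The sketch below illustrates the general idea described in the abstract, not the authors' exact algorithm: a greedy forward search that scores each candidate subset by the MI between the joint distribution of the selected features and the class, rather than scoring features individually. Continuous features are handled here by equal-width discretization; the function names (greedy_mi_selection, joint_mi) and the parameters k and n_bins are illustrative assumptions, not taken from the paper.

```python
import numpy as np


def discretize(X, n_bins=10):
    """Map each column of X to integer bin indices using equal-width bins."""
    Xd = np.empty(X.shape, dtype=int)
    for j in range(X.shape[1]):
        col = X[:, j]
        edges = np.linspace(col.min(), col.max(), n_bins + 1)[1:-1]
        Xd[:, j] = np.digitize(col, edges)
    return Xd


def joint_mi(Xd_subset, y):
    """Empirical mutual information I(S; C) between a discrete feature subset S and class C."""
    # Encode each row of the subset as a single joint symbol.
    _, joint_keys = np.unique(Xd_subset, axis=0, return_inverse=True)
    joint_keys = joint_keys.ravel()
    _, class_ids = np.unique(y, return_inverse=True)
    mi = 0.0
    for s in np.unique(joint_keys):
        mask_s = joint_keys == s
        p_s = mask_s.mean()
        for c in np.unique(class_ids):
            p_sc = np.mean(mask_s & (class_ids == c))
            if p_sc > 0:
                p_c = np.mean(class_ids == c)
                mi += p_sc * np.log2(p_sc / (p_s * p_c))
    return mi


def greedy_mi_selection(X, y, k=5, n_bins=10):
    """Greedily select k features maximizing I(selected subset; class)."""
    Xd = discretize(np.asarray(X, dtype=float), n_bins)
    selected, remaining = [], list(range(Xd.shape[1]))
    while remaining and len(selected) < k:
        scores = [joint_mi(Xd[:, selected + [j]], y) for j in remaining]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected


# Example usage on synthetic data (purely illustrative):
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)
print(greedy_mi_selection(X, y, k=3))
```

    Note that the empirical joint MI estimate saturates as the subset grows relative to the sample size, which is why the paper's treatment of MI for feature combinations (and for continuous-valued features) matters in practice; this sketch makes no attempt to reproduce that estimation procedure.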

    Original language: English
    Pages (from-to): 691-698
    Number of pages: 8
    Journal: Decision Support Systems
    Volume: 54
    Issue number: 1
    Early online date: 24 Aug 2012
    DOIs
    Publication status: Published - 1 Dec 2012

    Keywords

    • Classification
    • Feature ranking
    • Mutual information
    • Optimal feature set
