Efficient Approximate Solutions to Mutual Information Based Global Feature Selection

@inproceedings{Venkateswara2015EfficientAS,
  title={Efficient Approximate Solutions to Mutual Information Based Global Feature Selection},
  author={Hemanth Venkateswara and Prasanth Lade and Binbin Lin and Jieping Ye and Sethuraman Panchanathan},
  booktitle={2015 IEEE International Conference on Data Mining},
  year={2015},
  pages={1009--1014}
}
Mutual Information (MI) is often used for feature selection when developing classifier models. Estimating the MI for a subset of features is often intractable. We demonstrate that, under the assumption of conditional independence, the MI between a subset of features can be expressed as the Conditional Mutual Information (CMI) between pairs of features. However, selecting features with the highest CMI turns out to be a hard combinatorial problem. In this work, we have applied two unique global methods…
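The pairwise MI/CMI view in the abstract underlies classic greedy selection criteria such as mRMR. As a rough illustration of that pairwise idea (this is not the paper's global method, and the function names and toy data below are my own), a minimal max-relevance/min-redundancy sketch over discrete features:

```python
import numpy as np

def mutual_information(x, y):
    """Empirical MI (in nats) between two discrete variables."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                px = np.mean(x == xv)
                py = np.mean(y == yv)
                mi += pxy * np.log(pxy / (px * py))
    return mi

def greedy_mi_selection(X, y, k):
    """Greedy mRMR-style selection: at each step, pick the feature that
    maximizes MI with the label minus its average MI with the features
    already selected (relevance minus redundancy)."""
    n_features = X.shape[1]
    relevance = np.array(
        [mutual_information(X[:, j], y) for j in range(n_features)]
    )
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            redundancy = np.mean(
                [mutual_information(X[:, j], X[:, s]) for s in selected]
            )
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```

On a toy dataset where feature 1 is an exact copy of feature 0, the redundancy term steers the second pick toward an independent informative feature instead of the duplicate. The paper's point is that this greedy heuristic is only a local strategy; making the pairwise criterion global is the hard combinatorial problem it addresses.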
