Conditional Likelihood Maximisation: A Unifying Framework for Information Theoretic Feature Selection

We present a unifying framework for information theoretic feature selection, bringing almost two decades of research on heuristic filter criteria under a single theoretical interpretation. This is in response to the question: “what are the implicit statistical assumptions of feature selection criteria based on mutual information?”. To answer this, we adopt a different strategy from the usual one in the feature selection literature—instead of trying to define a criterion, we derive one, directly from a clearly specified objective function: the conditional likelihood of the training labels. While many hand-designed heuristic criteria try to optimise a definition of feature ‘relevancy’ and ‘redundancy’, our approach leads to a probabilistic framework which naturally incorporates these concepts. As a result we can unify the numerous criteria published over the last two decades, and show them to be low-order approximations to the exact (but intractable) optimisation problem. T...
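As a rough illustration of the family of criteria the framework unifies, the sketch below implements a generic greedy forward selection whose score has the low-order form J(X_k) = I(X_k;Y) − β Σ_j I(X_k;X_j) + γ Σ_j I(X_k;X_j|Y); fixed settings of β and γ recover simple criteria (e.g. β = γ = 0 gives plain mutual-information ranking), while other published criteria correspond to particular, sometimes adaptive, weightings. The function names and plug-in entropy estimates are illustrative assumptions, not code from the paper.

```python
# Minimal sketch (assumed implementation, not the authors' code) of greedy
# forward feature selection with a generic low-order information criterion.
# Assumes small, discrete feature and label alphabets.

import numpy as np
from collections import Counter


def entropy(xs):
    """Empirical plug-in entropy (in nats) of a discrete sample."""
    counts = np.array(list(Counter(xs).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))


def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from counts."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))


def conditional_mutual_information(x, y, z):
    """I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)."""
    return (entropy(list(zip(x, z))) + entropy(list(zip(y, z)))
            - entropy(list(zip(x, y, z))) - entropy(z))


def greedy_select(X, y, k, beta=1.0, gamma=1.0):
    """Select k features by greedily maximising
    J(X_f) = I(X_f;Y) - beta * sum_s I(X_f;X_s) + gamma * sum_s I(X_f;X_s|Y)
    over the already-selected set S (illustrative fixed beta/gamma)."""
    selected, remaining = [], set(range(X.shape[1]))
    while len(selected) < k and remaining:
        def score(f):
            relevancy = mutual_information(X[:, f], y)
            redundancy = sum(mutual_information(X[:, f], X[:, s]) for s in selected)
            cond_red = sum(conditional_mutual_information(X[:, f], X[:, s], y)
                           for s in selected)
            return relevancy - beta * redundancy + gamma * cond_red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.integers(0, 3, size=(200, 8))
    y = (X[:, 0] + X[:, 3]) % 2          # labels depend only on features 0 and 3
    print(greedy_select(X, y, k=3))
```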
Type: Journal
Year: 2012
Where: JMLR
Authors: Gavin Brown, Adam Pocock, Ming-Jie Zhao, Mikel Luján