TY - GEN
T1 - Information-theoretic feature selection for classification
AU - Joshi, Alok A.
AU - James, Scott M.
AU - Meckl, Peter H.
AU - King, Galen B.
AU - Jennings, Kristofer
PY - 2007
Y1 - 2007
AB - Feature selection has always been an important aspect of statistical model identification and pattern classification. In this paper we introduce a novel information-theoretic index called the compensated quality factor (CQF), which selects the important features from a large amount of irrelevant data. The proposed index performs an exhaustive combinatorial search of the input space and selects the feature subset that maximizes the information criterion conditioned on the decision rules defined by the compensated quality factor. The effectiveness of the proposed CQF-based algorithm was tested against the results of Mallows' Cp criterion, the Akaike information criterion (AIC), and the Bayesian information criterion (BIC) using post-liver-operation survival data [1] (continuous variables) and NIST sonoluminescent light intensity data [2] (categorical variables). Due to computational time and memory constraints, the CQF-based feature selector is only recommended for an input space of dimension p < 20. The problem of higher-dimensional input spaces (20 < p < 50) was addressed by proposing an information-theoretic stepwise selection procedure. Though this procedure does not guarantee a globally optimal solution, its computational time and memory requirements are reduced drastically compared with the exhaustive combinatorial search. Using diesel engine data for fault detection (43 variables, 8 classes, 30,000 records), the performance of the information-theoretic selection technique was tested by comparing the misclassification rates before and after dimension reduction using various classifiers.
UR - http://www.scopus.com/inward/record.url?scp=46449100115&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=46449100115&partnerID=8YFLogxK
U2 - 10.1109/ACC.2007.4282270
DO - 10.1109/ACC.2007.4282270
M3 - Conference contribution
AN - SCOPUS:46449100115
SN - 1424409888
SN - 9781424409884
T3 - Proceedings of the American Control Conference
SP - 2000
EP - 2005
BT - Proceedings of the 2007 American Control Conference, ACC
T2 - 2007 American Control Conference, ACC
Y2 - 9 July 2007 through 13 July 2007
ER -