TY - GEN
T1 - Information-theoretic sensor subset selection
T2 - 2006 ASME International Mechanical Engineering Congress and Exposition, IMECE2006
AU - Joshi, Alok A.
AU - Meckl, Peter H.
AU - King, Galen B.
AU - Jennings, Kristofer
PY - 2006/1/1
Y1 - 2006/1/1
N2 - In this paper, a stepwise information-theoretic feature selector is designed and implemented to reduce the dimension of a data set without losing pertinent information. The effectiveness of the proposed feature selector is demonstrated by selecting features from forty-three variables monitored on a set of heavy-duty diesel engines and then using this feature space for classification of faults in these engines. Using a cross-validation technique, the effects of various classification methods (linear regression, quadratic discriminants, probabilistic neural networks, and support vector machines) and feature selection methods (regression subset selection, RV-based selection by simulated annealing, and information-theoretic selection) are compared on the basis of percentage misclassification. The information-theoretic feature selector combined with the probabilistic neural network achieved an average classification accuracy of 90%, which was the best performance of any combination of classifiers and feature selectors under consideration.
AB - In this paper, a stepwise information-theoretic feature selector is designed and implemented to reduce the dimension of a data set without losing pertinent information. The effectiveness of the proposed feature selector is demonstrated by selecting features from forty-three variables monitored on a set of heavy-duty diesel engines and then using this feature space for classification of faults in these engines. Using a cross-validation technique, the effects of various classification methods (linear regression, quadratic discriminants, probabilistic neural networks, and support vector machines) and feature selection methods (regression subset selection, RV-based selection by simulated annealing, and information-theoretic selection) are compared on the basis of percentage misclassification. The information-theoretic feature selector combined with the probabilistic neural network achieved an average classification accuracy of 90%, which was the best performance of any combination of classifiers and feature selectors under consideration.
UR - http://www.scopus.com/inward/record.url?scp=84920634147&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84920634147&partnerID=8YFLogxK
U2 - 10.1115/IMECE2006-15903
DO - 10.1115/IMECE2006-15903
M3 - Conference contribution
AN - SCOPUS:84920634147
SN - 0791837904
SN - 9780791837900
T3 - American Society of Mechanical Engineers, Manufacturing Engineering Division, MED
BT - Proceedings of 2006 ASME International Mechanical Engineering Congress and Exposition, IMECE2006 - Manufacturing
PB - American Society of Mechanical Engineers (ASME)
Y2 - 5 November 2006 through 10 November 2006
ER -