Feature selection has always been an important aspect of statistical model identification and pattern classification. In this paper we introduce a novel information-theoretic index, the compensated quality factor (CQF), which selects the important features from a large amount of irrelevant data. The proposed method performs an exhaustive combinatorial search of the input space and selects the feature subset that maximizes the information criterion conditioned on the decision rules defined by the compensated quality factor. The effectiveness of the proposed CQF-based algorithm was tested against the results of Mallows' Cp criterion, the Akaike information criterion (AIC), and the Bayesian information criterion (BIC) using post-liver-operation survival data (continuous variables) and NIST sonoluminescent light intensity data (categorical variables). Due to computational time and memory constraints, the CQF-based feature selector is recommended only for input spaces of dimension p < 20. For higher-dimensional input spaces (20 < p < 50), we propose an information-theoretic stepwise selection procedure. Although this procedure does not guarantee a globally optimal solution, its computational time and memory requirements are drastically lower than those of the exhaustive combinatorial search. Using diesel engine fault detection data (43 variables, 8 classes, 30,000 records), the performance of the information-theoretic selection technique was tested by comparing the misclassification rates of various classifiers before and after dimension reduction.
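The trade-off between the exhaustive combinatorial search and the stepwise alternative can be illustrated with a minimal sketch. The `score` callable below is a placeholder standing in for the CQF-based information criterion (which is not defined in this abstract); the function names are illustrative, not the paper's implementation.

```python
from itertools import combinations

def exhaustive_select(features, score, k):
    """Evaluate all C(p, k) subsets of size k and return the best one.

    Cost grows combinatorially with p, which is why this strategy is
    only practical for small input spaces (the paper suggests p < 20).
    """
    return max(combinations(features, k), key=score)

def stepwise_select(features, score, k):
    """Greedy forward selection: add one feature at a time.

    Only O(p * k) subset evaluations, but with no guarantee of a
    globally optimal subset, matching the trade-off described above.
    """
    selected, remaining = [], list(features)
    while len(selected) < k:
        # Pick the feature that most improves the criterion when added.
        best = max(remaining, key=lambda f: score(tuple(selected + [f])))
        selected.append(best)
        remaining.remove(best)
    return tuple(selected)
```

For a simple additive score the two procedures agree; for criteria with feature interactions, the greedy search can miss the global optimum, which is the price paid for its drastically reduced cost.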