Reduced Data Sets and Entropy-Based Discretization

Issue Date
2019-10-28
Author
Grzymala-Busse, Jerzy W.
Hippe, Zdzislaw S.
Mroczek, Teresa
Publisher
MDPI
Type
Article
Article Version
Scholarly/refereed, publisher version
Rights
© 2019 by the authors. Licensee MDPI, Basel, Switzerland.
Abstract
Results of experiments on numerical data sets discretized using two methods, global versions of Equal Frequency per Interval and Equal Interval Width, are presented. Globalization of both methods is based on entropy. For the discretized data sets, left and right reducts were computed. For each discretized data set, and for the two data sets based, respectively, on left and right reducts, we applied ten-fold cross validation using the C4.5 decision tree generation system. Our main objective was to compare the quality of all three types of data sets in terms of the error rate. Additionally, we compared the complexity of the generated decision trees. We show that reduction of data sets may only increase the error rate and that decision trees generated from reduced data sets are not simpler than decision trees generated from non-reduced data sets.
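
For readers unfamiliar with the two local discretization methods named in the abstract, the following Python sketch illustrates how cut points are typically chosen under Equal Interval Width and Equal Frequency per Interval. It is a minimal illustration only: the function names and the choice of k intervals are assumptions for this sketch, and it does not implement the entropy-based globalization described in the article.

```python
import numpy as np

def equal_interval_width(values, k):
    """Return k-1 interior cut points splitting the attribute's range into
    k intervals of equal width (illustrative, not the paper's global method)."""
    lo, hi = float(min(values)), float(max(values))
    width = (hi - lo) / k
    return [lo + i * width for i in range(1, k)]

def equal_frequency_per_interval(values, k):
    """Return cut points chosen so that each interval holds roughly the same
    number of cases (duplicates collapse when values repeat)."""
    ordered = sorted(values)
    n = len(ordered)
    cuts = []
    for i in range(1, k):
        idx = min(round(i * n / k), n - 1)
        cuts.append(ordered[idx])
    return sorted(set(cuts))

def discretize(value, cut_points):
    """Map a numerical value to the index of the interval it falls into."""
    return sum(value >= c for c in cut_points)

if __name__ == "__main__":
    temps = [95.4, 96.1, 96.3, 97.0, 97.5, 98.2, 99.0, 99.6, 100.4, 101.2]
    print(equal_interval_width(temps, 3))           # two cut points of equal width
    print(equal_frequency_per_interval(temps, 3))   # two cut points balancing counts
    print(discretize(98.2, equal_interval_width(temps, 3)))
```

In the article, both methods are applied globally, with entropy used to decide which attribute to discretize next; the sketch above shows only the per-attribute cut-point selection.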
Description
This work is licensed under a Creative Commons Attribution 4.0 International License.
Citation
Grzymala-Busse, J.W.; Hippe, Z.S.; Mroczek, T. Reduced Data Sets and Entropy-Based Discretization. Entropy 2019, 21, 1051.