Computational Intelligence and Mathematics for Tackling Complex Problems by László T. Kóczy & Jesús Medina-Moreno & Eloísa Ramírez-Poussa & Alexander Šostak


Author: László T. Kóczy & Jesús Medina-Moreno & Eloísa Ramírez-Poussa & Alexander Šostak
Language: eng
Format: epub
ISBN: 9783030160241
Publisher: Springer International Publishing


accuracy mean on testing set: 75% (standard deviation: 3.95%).

The above results, obtained with no attribute selection, were compared with classification results using attributes selected by Hellwig's algorithm [9]. Hellwig's algorithm was run on the whole dataset; instead of 22 attributes, we obtained the subset {8, 13, 17, 18}. Next, using only these four attributes, we repeated the same classification process as before, i.e., a tree with no pruning, a 67%:33% train/test split, and cross-validation (10 experiments of 10-fold cross-validation, giving 100 trees). The results are: accuracy mean on the training set: 79.2%, standard deviation on the training set: 1.78%; accuracy mean on the testing set: 79.3%, standard deviation on the testing set: 4.16%.
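Hellwig's algorithm scores each candidate attribute subset by its integral information capacity: an attribute's class correlation is rewarded, while correlation with the other selected attributes is penalised. The following sketch illustrates this selection criterion on hypothetical toy correlations (the chapter's dataset and its 22 attributes are not reproduced here; the numbers below are invented for illustration).

```python
# Sketch of Hellwig's information-capacity attribute selection.
# For a subset S, each attribute j contributes
#     h_j = r0j**2 / (1 + sum of |rij| over the other i in S),
# where r0j is the correlation of attribute j with the class and rij is the
# inter-attribute correlation; the subset maximising H(S) = sum of h_j wins.
from itertools import combinations

def capacity(subset, r0, r):
    """Integral information capacity H(S) of an attribute subset."""
    total = 0.0
    for j in subset:
        penalty = sum(abs(r[i][j]) for i in subset if i != j)
        total += r0[j] ** 2 / (1.0 + penalty)
    return total

def hellwig_select(r0, r):
    """Exhaustively search all subsets; return the best one and its capacity."""
    n = len(r0)
    best, best_h = (), -1.0
    for k in range(1, n + 1):
        for s in combinations(range(n), k):
            h = capacity(s, r0, r)
            if h > best_h:
                best, best_h = s, h
    return best, best_h

# Toy correlations for 3 attributes: attribute 2 is strongly class-correlated
# but nearly collinear with attribute 1, so the criterion avoids keeping both.
r0 = [0.8, 0.7, 0.75]
r = [[1.0, 0.1, 0.2],
     [0.1, 1.0, 0.95],
     [0.2, 0.95, 1.0]]
subset, h = hellwig_select(r0, r)
print(subset, round(h, 3))  # → (0, 1) 1.027
```

Note that the exhaustive search over all 2^n − 1 subsets is only feasible for small attribute counts; for 22 attributes a greedy or heuristic variant would be used in practice.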

We can see that the accuracy mean on the testing set is higher with the selected attributes (79.3%) than with all 22 attributes (75%).

We then repeated the experiment, this time running Hellwig's algorithm 100 times, once on each training set arising during cross-validation, i.e., attribute subsets were selected anew for every fold. The classifier results obtained were the same as above.
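Selecting attributes inside each fold, rather than once on the whole dataset, keeps the test fold unseen by the selection step and thus avoids information leakage. A minimal sketch of this loop structure (the per-fold selector and classifier are hypothetical stand-ins, not the chapter's implementation):

```python
# Sketch of per-fold attribute selection inside 10-fold cross-validation.
import random

def k_fold_indices(n, k, seed=0):
    """Yield (train, test) index lists for k-fold cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

n_samples, k = 100, 10
selections = []
for train, test in k_fold_indices(n_samples, k):
    # 1) run Hellwig's algorithm on the training fold only (stand-in below),
    # 2) fit the unpruned tree on the selected attributes,
    # 3) score the tree on the held-out test fold.
    selected = {8, 13, 17, 18}          # placeholder for the per-fold result
    selections.append(selected)
    assert not set(train) & set(test)   # folds must be disjoint

print(len(selections))                  # one selection per fold → 10
```

Repeating this loop ten times with different shuffles corresponds to the 10 experiments of 10-fold cross-validation, i.e., the 100 trees mentioned above.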

We have also used WEKA (http://www.cs.waikato.ac.nz/ml/weka/) to evaluate the attribute selection by comparing results obtained with different well-known classification algorithms. The following algorithms were tested:

– J48 – an implementation of Quinlan's crisp C4.5 tree [29];
– LMT (Logistic Model Tree) – a hybrid tree with logistic models at the leaves [30];
– NBTree – a hybrid decision tree with Bayes classifiers at the leaves;


