Automatic Design of Decision-Tree Induction Algorithms

By Rodrigo C. Barros, André C. P. L. F. de Carvalho, Alex A. Freitas

Offers a detailed study of the major design components that constitute a top-down decision-tree induction algorithm, including aspects such as split criteria, stopping criteria, pruning, and the strategies for dealing with missing values. Whereas the approach still employed nowadays is to use a 'generic' decision-tree induction algorithm regardless of the data, the authors argue for the benefits that a bias-fitting strategy could bring to decision-tree induction, in which the ultimate goal is the automatic generation of a decision-tree induction algorithm tailored to the application domain of interest. To that end, they discuss how one can effectively discover the most suitable set of components of decision-tree induction algorithms to handle a wide variety of applications through the paradigm of evolutionary computation, following the emergence of a novel field called hyper-heuristics.

"Automatic Design of Decision-Tree Induction Algorithms" will be highly useful for machine learning and evolutionary computation students and researchers alike.



Best algorithms books

Parallel Algorithms for Irregular Problems: State of the Art

Efficient parallel solutions have been found to many problems. Some of them can be obtained automatically from sequential programs, using compilers. However, there is a large class of problems - irregular problems - that lack efficient solutions. IRREGULAR '94 - a workshop and summer school organized in Geneva - addressed the problems associated with the derivation of efficient solutions to irregular problems.

Algorithms and Computation: 21st International Symposium, ISAAC 2010, Jeju, Korea, December 15-17, 2010, Proceedings, Part II

This book constitutes the refereed proceedings of the 21st International Symposium on Algorithms and Computation, ISAAC 2010, held in Jeju, South Korea, in December 2010. The 77 revised full papers presented were carefully reviewed and selected from 182 submissions for inclusion in the book. This volume covers topics such as approximation algorithms; complexity; data structures and algorithms; combinatorial optimization; graph algorithms; computational geometry; graph coloring; fixed parameter tractability; optimization; online algorithms; and scheduling.

Algorithms and Architectures for Parallel Processing: 15th International Conference, ICA3PP 2015, Zhangjiajie, China, November 18-20, 2015, Proceedings, Part II

This four-volume set, LNCS 9528, 9529, 9530 and 9531, constitutes the refereed proceedings of the 15th International Conference on Algorithms and Architectures for Parallel Processing, ICA3PP 2015, held in Zhangjiajie, China, in November 2015. The 219 revised full papers presented together with 77 workshop papers in these four volumes were carefully reviewed and selected from 807 submissions (602 full papers and 205 workshop papers).

Extra resources for Automatic Design of Decision-Tree Induction Algorithms (Springer Briefs in Computer Science)

Sample text

Assign instance x_j to the partition with the greatest number of instances that belong to the same class as x_j. Formally, if x_j is labeled as y_l, we assign x_j to arg max_{v_m} [N_{v_m, y_l}] [65]. • Create a surrogate split for each split in the original tree, based on a different attribute [12]. For instance, a split over attribute a_i will have a surrogate split over attribute a_j, given that a_j is the attribute that most resembles the original split according to a resemblance measure (42), where the original split over attribute a_i is divided into two partitions, d_1(a_i) and d_2(a_i), and the alternative split over a_j is divided into d_1(a_j) and d_2(a_j).
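The class-majority assignment above (route an instance labeled y_l to arg max_{v_m} N_{v_m, y_l}) can be sketched as follows. This is a minimal illustration, not code from the book; the partition representation and function name are assumptions.

```python
def assign_missing_instance(partitions, instance_label):
    """Assign an instance whose split attribute is missing to the partition
    that holds the most training instances of the instance's own class.

    partitions: list of partitions, each a list of (x, y) pairs already
                routed by the split.
    instance_label: the class label y_l of the instance being assigned.
    Returns the index of the chosen partition (arg max over N[v_m, y_l])."""
    best_idx, best_count = 0, -1
    for idx, part in enumerate(partitions):
        # N[v_m, y_l]: number of instances in partition v_m with class y_l
        count = sum(1 for _, y in part if y == instance_label)
        if count > best_count:
            best_idx, best_count = idx, count
    return best_idx

# Toy example: two partitions produced by some split.
parts = [
    [((1.0,), "a"), ((2.0,), "b")],
    [((3.0,), "a"), ((4.0,), "a")],
]
print(assign_missing_instance(parts, "a"))  # → 1 (more "a" instances there)
```

An instance labeled "b" would instead go to partition 0, the only partition containing a "b" instance.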

L. Hyafil, R. Rivest, Constructing optimal binary decision trees is NP-complete. Inf. Process. Lett. 5(1), 15–17 (1976)
51. A. Ittner, Non-linear decision trees, in 13th International Conference on Machine Learning, pp. 1–6 (1996)
52. B. , A new criterion in selection and discretization of attributes for the generation of decision trees. IEEE Trans. Pattern Anal. Mach. Intell. 19(2), 1371–1375 (1997)
53. G. Kalkanis, The application of confidence interval error analysis to the design of decision tree classifiers.

It uses a pruning set (a part of the training set) to evaluate the goodness of a given subtree of T. The idea is to evaluate each non-terminal node t ∈ ζ_T with regard to the classification error in the pruning set. If that error decreases when we replace the subtree T(t) by a leaf node, then T(t) is pruned. Quinlan imposes a constraint: a node t cannot be pruned if it contains a subtree that yields a lower classification error in the pruning set. The practical consequence of this constraint is that REP should be performed in a bottom-up fashion.
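Reduced-error pruning as described here can be sketched as a bottom-up recursion: each subtree's children are pruned first, which automatically enforces Quinlan's constraint that a node is never pruned while it still contains a better subtree. The dict-based tree representation below is a minimal assumption for illustration, not the book's data structures.

```python
from collections import Counter

def predict(node, x):
    """Classify x with a binary tree of dict nodes."""
    if node["leaf"]:
        return node["label"]
    branch = "left" if x[node["attr"]] <= node["threshold"] else "right"
    return predict(node[branch], x)

def errors(node, pruning_set):
    """Misclassifications of `node` on the (x, y) pairs of the pruning set."""
    return sum(1 for x, y in pruning_set if predict(node, x) != y)

def majority_label(pruning_set, default):
    labels = [y for _, y in pruning_set]
    return Counter(labels).most_common(1)[0][0] if labels else default

def rep(node, pruning_set):
    """Reduced-error pruning: prune children first (bottom-up), then replace
    the subtree T(t) by a leaf if that does not worsen pruning-set error."""
    if node["leaf"]:
        return node
    left_set = [(x, y) for x, y in pruning_set
                if x[node["attr"]] <= node["threshold"]]
    right_set = [(x, y) for x, y in pruning_set
                 if x[node["attr"]] > node["threshold"]]
    node["left"] = rep(node["left"], left_set)
    node["right"] = rep(node["right"], right_set)
    leaf = {"leaf": True, "label": majority_label(pruning_set, None)}
    if errors(leaf, pruning_set) <= errors(node, pruning_set):
        return leaf  # the leaf does at least as well: prune T(t)
    return node

# Toy example: a split whose right branch only misclassifies the pruning set.
tree = {"leaf": False, "attr": 0, "threshold": 0.5,
        "left": {"leaf": True, "label": "a"},
        "right": {"leaf": True, "label": "b"}}
pruning = [((0.2,), "a"), ((0.8,), "a")]
pruned = rep(tree, pruning)
print(pruned)  # → {'leaf': True, 'label': 'a'}
```

Because the pruning set here is entirely class "a", collapsing the split to a single "a" leaf removes the one error made by the right branch, so the subtree is replaced.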
