c++ - Decision trees / stumps with AdaBoost -


I have started learning about decision trees and AdaBoost, trying them out in OpenCV, and I have a few questions.

Boosted decision trees

I understand that when we use AdaBoost with decision trees, we continuously fit decision trees to reweighted versions of the training data. Classification is then done by a weighted majority vote.
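The weighted majority vote mentioned above can be sketched in plain C++ (a minimal illustration, not OpenCV's implementation; the function name is my own):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// The final AdaBoost decision: each weak learner contributes its
// prediction (+1 or -1) scaled by its vote weight alpha; the sign of
// the weighted sum is the predicted class.
int weightedMajorityVote(const std::vector<int>& predictions,
                         const std::vector<double>& alphas) {
    double score = 0.0;
    for (std::size_t i = 0; i < predictions.size(); ++i)
        score += alphas[i] * predictions[i];
    return score >= 0 ? +1 : -1;
}
```

Note that a single confident learner (large alpha) can outvote several weak ones, which is exactly why "majority" here is weighted rather than a plain headcount.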

Can we instead use bootstrapping when training the decision trees for AdaBoost? I.e., can we select subsets of our dataset and train a tree on each subset before feeding the classifiers to AdaBoost?

Boosted decision stumps

Do we use the same technique for decision stumps? Or can we instead create one stump per feature? I.e., if we have 2 classes and 10 features, do we create a total of 10 decision stumps, one for each feature, before feeding the classifiers to AdaBoost?

AdaBoost does not train classifiers on different subsets; it adjusts the weights of the dataset elements depending on the performance the ensemble has reached so far. A detailed description may be found here.
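That reweighting step can be sketched as follows (a minimal sketch of the standard discrete AdaBoost update, assuming labels and predictions in {+1, -1}; the function names are my own):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Vote weight for a weak learner with weighted error eps (0 < eps < 0.5):
// the smaller the error, the larger the learner's say in the final vote.
double alphaFor(double eps) {
    return 0.5 * std::log((1.0 - eps) / eps);
}

// Update sample weights in place. Correctly classified samples are
// down-weighted, mistakes are up-weighted, then everything is
// renormalised so the weights again sum to 1.
void updateWeights(std::vector<double>& w,
                   const std::vector<int>& y,      // true labels, +1/-1
                   const std::vector<int>& pred,   // stump predictions
                   double alpha) {
    double sum = 0.0;
    for (std::size_t i = 0; i < w.size(); ++i) {
        w[i] *= std::exp(-alpha * y[i] * pred[i]);
        sum += w[i];
    }
    for (double& wi : w) wi /= sum;
}
```

After the update, the samples the current learner got wrong carry more weight, so the next learner is pushed to concentrate on them.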

Yes, you can use the same technique to train decision stumps. The algorithm is approximately the following:

  1. Train a decision stump on the initial dataset with no weights (which is the same as every element having weight = 1).
  2. Update the weights of the elements using the formula from the AdaBoost algorithm: the weights of correctly classified elements should become smaller, the weights of incorrectly classified ones larger.
  3. Train a decision stump using the current weights. That is, minimize not the number of mistakes the stump makes, but the sum of the weights of its mistakes.
  4. If the desired quality is not achieved, go to step 2.
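The steps above can be sketched end to end for one-dimensional data (a minimal standalone sketch, not OpenCV's implementation; in OpenCV itself boosting is available via cv::ml::Boost). The key point is in trainStump: it minimises the sum of the weights of its mistakes, not the raw mistake count:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// A decision stump on a single feature: threshold + polarity,
// plus the vote weight alpha assigned to it by AdaBoost.
struct Stump {
    double thresh = 0.0;
    int polarity = 1;   // +1: predict +1 when x >= thresh
    double alpha = 0.0;
    int predict(double x) const {
        return (x >= thresh) ? polarity : -polarity;
    }
};

// Exhaustively pick the threshold/polarity with minimal *weighted* error
// (step 3 above; with uniform weights this is also step 1).
Stump trainStump(const std::vector<double>& x,
                 const std::vector<int>& y,
                 const std::vector<double>& w) {
    Stump best;
    double bestErr = std::numeric_limits<double>::max();
    for (double cand : x) {
        for (int pol : {+1, -1}) {
            double err = 0.0;
            for (std::size_t i = 0; i < x.size(); ++i) {
                int pred = (x[i] >= cand) ? pol : -pol;
                if (pred != y[i]) err += w[i]; // sum of weights of mistakes
            }
            if (err < bestErr) { bestErr = err; best = {cand, pol, 0.0}; }
        }
    }
    best.alpha = 0.5 * std::log((1.0 - bestErr) / std::max(bestErr, 1e-12));
    return best;
}

// Steps 1-4: start from uniform weights, repeatedly fit a stump and
// re-weight the data so the next stump focuses on the current mistakes.
std::vector<Stump> adaboost(const std::vector<double>& x,
                            const std::vector<int>& y,
                            int rounds) {
    std::vector<double> w(x.size(), 1.0 / x.size()); // step 1: uniform
    std::vector<Stump> ensemble;
    for (int t = 0; t < rounds; ++t) {
        Stump s = trainStump(x, y, w);
        double sum = 0.0;
        for (std::size_t i = 0; i < x.size(); ++i) {   // step 2: re-weight
            w[i] *= std::exp(-s.alpha * y[i] * s.predict(x[i]));
            sum += w[i];
        }
        for (double& wi : w) wi /= sum;
        ensemble.push_back(s);
    }
    return ensemble;
}

// Final classifier: weighted majority vote of the stumps.
int classify(const std::vector<Stump>& ensemble, double x) {
    double score = 0.0;
    for (const Stump& s : ensemble) score += s.alpha * s.predict(x);
    return score >= 0 ? +1 : -1;
}
```

Stopping after a fixed number of rounds stands in for the "desired quality" check in step 4; in practice you would monitor training or validation error instead.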
