Let us take an example of a simple binary tree to understand decision trees. Consider that you want to find out whether a girl is eligible for a beauty pageant contest like Miss India.

A binary tree for "Eligibility for Miss India Beauty Pageant":

The decision nodes are the ones where the data splits, and the leaf nodes represent the outcomes, classifications, or decisions of the event. The decision node first asks whether the girl is a resident of India. If yes, is her age between 18 and 25 years? If yes, she is eligible, else not. If no, does she have valid certificates? If yes, she is eligible, else not. This was a simple yes-or-no type of problem.

Decision Tree Classification

Decision trees are classified into two main types:

1. Classification trees are the simple yes-or-no type of trees. This is similar to the example above, where the outcome took values like 'eligible' or 'not eligible.' The decision variable here is categorical, e.g., a letter like ABC.

2. Regression trees, in which the outcome or decision variable is continuous, i.e., a numeric value rather than a category.
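The eligibility tree described above can be sketched as a small Python function. This is only an illustrative sketch: the function name, argument names, and the exact branching on "valid certificates" are assumptions made to match the narrative, not part of any real pageant's rules.

```python
def is_eligible(resident_of_india: bool, age: int, has_valid_certificates: bool) -> bool:
    """Walk the binary decision tree for pageant eligibility (illustrative only)."""
    if resident_of_india:
        # Decision node: is her age between 18 and 25?
        return 18 <= age <= 25
    else:
        # Decision node: does she have valid certificates?
        return has_valid_certificates

# Each call traces one root-to-leaf path through the tree.
print(is_eligible(True, 21, False))   # resident, age in range -> eligible
print(is_eligible(True, 30, False))   # resident, age out of range -> not eligible
print(is_eligible(False, 30, True))   # non-resident with certificates -> eligible
```

Every `if`/`else` here corresponds to a decision node, and every `return` to a leaf node, which is exactly how a trained classification tree makes a prediction: a sequence of yes-or-no tests ending at an outcome.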