
Chapter 6

Decision Tree for Regression

In the last two chapters we studied the Linear Regression and Polynomial Regression algorithms. Those algorithms are based on the principle of iterative error correction using the gradient descent algorithm. In this chapter we are going to study another extremely powerful machine learning algorithm, one based not on gradients but on splitting criteria such as entropy (for classification trees) and variance reduction (for regression trees).
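For reference, the two standard splitting criteria can be written as follows. These are the usual textbook definitions; the notation (node S, class proportions p_i, targets y_j) is a common convention introduced here for illustration:

```latex
% Entropy of a node S with class proportions p_1, ..., p_k (classification trees):
H(S) = -\sum_{i=1}^{k} p_i \log_2 p_i

% Variance of the targets y_j in node S, with node mean \bar{y} (regression trees):
\mathrm{Var}(S) = \frac{1}{|S|} \sum_{j \in S} \left( y_j - \bar{y} \right)^2
```

A split is chosen so that it reduces the measure as much as possible, i.e., so that the child nodes are purer than the parent node.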

The principle behind the working of a decision tree is very simple. Each internal node of the tree tests one feature of the dataset. At each node a decision is made about which branch to follow, depending on the value of that feature for the sample in question. The process continues until a leaf node is reached; the leaf node contains the final decision.
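To make the traversal concrete, here is a minimal sketch in Python. The Node class, the predict function, and the tiny hand-built tree are hypothetical illustrations (no training happens here), not part of any library:

```python
class Node:
    """One node of a decision tree; a leaf if value is set."""
    def __init__(self, feature=None, threshold=None,
                 left=None, right=None, value=None):
        self.feature = feature      # index of the feature tested at this node
        self.threshold = threshold  # split point for that feature
        self.left = left            # branch taken when feature value <= threshold
        self.right = right          # branch taken when feature value > threshold
        self.value = value          # final decision stored at a leaf


def predict(node, sample):
    """Walk from the root to a leaf, choosing a branch at each node."""
    if node.value is not None:      # a leaf holds the final decision
        return node.value
    if sample[node.feature] <= node.threshold:
        return predict(node.left, sample)
    return predict(node.right, sample)


# A hand-built tree: first test feature 0, then (on the right branch) feature 1.
tree = Node(feature=0, threshold=30,
            left=Node(value="reject"),
            right=Node(feature=1, threshold=50000,
                       left=Node(value="review"),
                       right=Node(value="approve")))

print(predict(tree, [45, 80000]))   # feature 0 > 30, feature 1 > 50000 -> "approve"
```

Note that the same feature may be tested at more than one node; the tree is organized around tests, not around a one-to-one mapping of features to nodes.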

This explanation might seem daunting at first, but we have been using decision trees all our lives. Suppose a bank has to decide whether a loan should be given to a particular customer. The bank has customer data, including age, gender, and salary, on which to base this decision.

The bank may define criteria consisting of a set of rules that determine whether the loan will be awarded. These rules can look like this.