By Danielle Costa Nakano

Machine Learning Algorithm - Decision Tree

Updated: Nov 9, 2019

Description: The best way to understand how a decision tree works is to play Jezzball, a classic game from Microsoft (image below). Essentially, you have a room with bouncing balls, and you need to build walls so that the maximum area gets cleared without enclosing any balls. Every time you split the room with a wall, you are trying to create two different populations within the same room. Decision trees work in a very similar fashion, repeatedly dividing a population into groups that are as different from each other as possible.


Surprisingly, it works for both categorical and continuous dependent variables. The algorithm splits the population into two or more homogeneous sets, choosing the split on the most significant attributes (independent variables) so that the resulting groups are as distinct from one another as possible.
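To make this concrete, here is a minimal sketch of fitting a decision tree to a categorical target; it uses scikit-learn and its iris dataset, which are my own choices and not named in the post. For a continuous dependent variable you would swap in DecisionTreeRegressor.

```python
# A minimal sketch (assumes scikit-learn; the post names no library).
# DecisionTreeClassifier handles a categorical target; DecisionTreeRegressor a continuous one.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)            # toy dataset: 4 features, 3 classes
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X_train, y_train)                   # each internal node is one "wall" in the Jezzball sense

print(tree.score(X_test, y_test))            # accuracy on held-out data
print(export_text(tree))                     # the learned splits, one rule per line
```

Printing the tree with export_text shows exactly which attribute and threshold were chosen at each split, which is the "most significant attribute" idea in practice.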


Algorithm: To decide where to split the population into distinct groups, decision trees use various criteria such as Gini impurity, information gain (based on entropy), and chi-square.
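As a rough illustration of these criteria (the functions below are my own sketch, not code from the post): Gini impurity and entropy both measure how mixed a group is, and information gain is the drop in entropy achieved by a candidate split.

```python
# Sketch of two common split criteria and the information gain of a split.
import numpy as np

def gini(labels):
    # Gini impurity: 1 - sum(p_i^2); 0 for a pure group, 0.5 for a 50/50 binary mix.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def entropy(labels):
    # Shannon entropy in bits; 0 for a pure group, 1 for a 50/50 binary mix.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(parent, left, right):
    # Reduction in entropy from splitting `parent` into `left` and `right`.
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted

parent = np.array([0, 0, 0, 0, 1, 1, 1, 1])    # perfectly mixed group
left, right = parent[:4], parent[4:]            # a split that separates the classes

print(gini(parent), gini(left))                 # 0.5 -> 0.0
print(information_gain(parent, left, right))    # 1.0 bit: a perfect split
```

The tree-growing procedure evaluates candidate splits with one of these criteria and keeps the split that leaves the child groups purest.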
