
Decision Tree (Machine Learning Algorithm)

  • Writer: Danielle Costa Nakano
  • Apr 10, 2019
  • 1 min read

Updated: Dec 14, 2024

Description: The best way to understand how a decision tree works is to play Jezzball, a classic Microsoft game (image below). You have a room with moving balls, and you build walls so that the largest possible area is cleared of balls. Every time you split the room with a wall, you are trying to create two different populations within the same room. Decision trees work in a very similar fashion, dividing a population into groups that are as different from each other as possible.


[Image: Jezzball game board]

Surprisingly, it works for both categorical and continuous dependent variables. In this algorithm, we split the population into two or more homogeneous sets, based on the most significant attributes (independent variables), so that the resulting groups are as distinct from each other as possible.
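
To make this concrete, here is a minimal sketch (my own illustration, not from the original post) using scikit-learn and the Iris dataset: a DecisionTreeClassifier for a categorical dependent variable and a DecisionTreeRegressor for a continuous one.

    # Minimal sketch: decision trees for a categorical and a continuous target.
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

    X, y = load_iris(return_X_y=True)

    # Categorical dependent variable: predict the species from the measurements.
    clf = DecisionTreeClassifier(max_depth=3, random_state=0)
    clf.fit(X, y)
    print("classification accuracy (training data):", clf.score(X, y))

    # Continuous dependent variable: predict petal width from the other columns.
    X_reg, y_reg = X[:, :3], X[:, 3]
    reg = DecisionTreeRegressor(max_depth=3, random_state=0)
    reg.fit(X_reg, y_reg)
    print("regression R^2 (training data):", reg.score(X_reg, y_reg))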


Algorithm: To split the population into heterogeneous groups (groups as different from each other as possible), the algorithm uses splitting criteria such as Gini impurity, information gain, chi-square, and entropy.
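
For intuition, the sketch below (again my own, with made-up labels) computes two of these measures, Gini impurity and entropy, plus the information gain of a candidate split; the tree picks the split whose child groups reduce impurity the most.

    import numpy as np

    def gini(labels):
        # Gini impurity: chance of mislabelling a randomly drawn item.
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return 1.0 - np.sum(p ** 2)

    def entropy(labels):
        # Shannon entropy in bits: 0 for a pure group, 1 for a 50/50 mix of two classes.
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    parent = np.array([0, 0, 0, 1, 1, 1])                   # mixed group
    left, right = np.array([0, 0, 0]), np.array([1, 1, 1])  # candidate split

    # Information gain = parent impurity minus the size-weighted child impurity.
    weighted_children = (len(left) * entropy(left) + len(right) * entropy(right)) / len(parent)
    print("gini(parent):", gini(parent))                             # 0.5
    print("information gain:", entropy(parent) - weighted_children)  # 1.0 bit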
