I learned the important terms of the Decision Tree (a quick sketch of each follows below):
1. Entropy: a measure of impurity/disorder/randomness.
2. Gini Impurity: a measure of how often a randomly chosen element from the set (S) would be incorrectly labeled if it were labeled according to the class distribution of the set.
3. Information Gain: the reduction in the Entropy/impurity of the target variable after a split.
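A minimal Python sketch of how these three measures can be computed (my own illustration, not code from the course; the function names and the example split are assumptions):

```python
# Minimal sketch of the three decision-tree measures for a list of labels.
from collections import Counter
from math import log2

def entropy(labels):
    """Entropy H(S): sum over classes of -p * log2(p)."""
    n = len(labels)
    return sum(-(c / n) * log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini Impurity G(S) = 1 - sum(p^2): the chance a randomly chosen
    element is mislabeled when labeled by the class distribution of S."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

def information_gain(parent, left, right):
    """Reduction in entropy from splitting `parent` into `left` and `right`."""
    n = len(parent)
    child = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - child

parent = ["yes"] * 5 + ["no"] * 5            # maximally mixed node
left   = ["yes"] * 4 + ["no"]                # mostly "yes"
right  = ["no"] * 4 + ["yes"]                # mostly "no"
print(entropy(parent))                       # 1.0
print(gini(parent))                          # 0.5
print(information_gain(parent, left, right)) # ~0.278
```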
Entropy and Gini Impurity serve the same purpose and are used interchangeably depending on the characteristics of the features. Entropy is computationally more expensive but can yield the best results; Gini Impurity is cheaper to compute but may give slightly less refined splits.
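To make that comparison concrete, here is a small sketch (my own, assuming a binary class split): both measures are zero for a pure node and peak at a 50/50 mix, but Gini is a simple polynomial while Entropy needs a logarithm, which is where the extra computation comes from.

```python
from math import log2

def entropy_binary(p):
    """Binary entropy of a node with class proportion p; requires log2."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * log2(p) + (1 - p) * log2(1 - p))

def gini_binary(p):
    """Binary Gini impurity; 1 - p^2 - (1-p)^2 simplifies to 2p(1-p)."""
    return 2 * p * (1 - p)

# Both curves vanish at p = 0 or 1 and peak at p = 0.5, so in practice
# they usually rank candidate splits the same way.
for p in (0.0, 0.1, 0.3, 0.5):
    print(f"p={p:.1f}  entropy={entropy_binary(p):.3f}  gini={gini_binary(p):.3f}")
```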
I learned in this video about Entropy, Gini Impurity, and Information Gain.
I learned in this lecture about Entropy, Gini Impurity, and Information Gain.
AOA, I also learned about impurity (a mixture of different things):
1. Entropy: a measure of randomness, disorder, or impurity; it quantifies the impurity or uncertainty.
2. Gini Impurity: used in finding the best splits and creating decision boundaries that separate the different classes as cleanly as possible.
3. Information Gain: the reduction in Entropy.
May Allah grant you a long life of health and well-being, and bless you with the good of both worlds. Ameen.
In this lecture, I learned the three main terms of the Decision Tree: (1) Entropy, (2) Gini Impurity, and (3) Information Gain. Further, at a leaf node, Entropy becomes minimal and Information Gain is at its maximum.
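A tiny sketch of that last point (my own example, not from the lecture): at a pure leaf, where every sample shares the same class, Entropy drops to its minimum of 0.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Entropy H(S): sum over classes of -p * log2(p)."""
    n = len(labels)
    return sum(-(c / n) * log2(c / n) for c in Counter(labels).values())

print(entropy(["yes", "no", "yes", "no"]))    # 1.0: maximally impure node
print(entropy(["yes", "yes", "yes", "yes"]))  # 0.0: pure leaf, minimal entropy
```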