Join the conversation
![](https://codanics.com/wp-content/uploads/2024/04/IMG_20240410_175137-1-scaled.jpg)
Done
![](https://codanics.com/wp-content/uploads/2024/04/IMG_16954467347071342.jpg)
Done
![](https://codanics.com/wp-content/uploads/2023/10/e24e3cc0-f9cd-49cc-9e5e-e7fea619bd42.jpg)
In this video, I learned about Entropy, Gini Impurity, and Information Gain.
![](https://codanics.com/wp-content/uploads/2024/05/My_profile_pic.jpg)
In this lecture, I learned about Entropy, Gini Impurity, and Information Gain.
![](https://codanics.com/wp-content/uploads/2023/10/9dd24f5a-b137-440a-baba-1855925152a0.jpg)
AOA, I also learned about impurities (mixtures of different things):

1. Entropy: a measure of randomness, disorder, or impurity; it quantifies the impurity or uncertainty in a node.
2. Gini Impurity: used in finding the best splits and creating decision boundaries that separate different classes as cleanly as possible.
3. Information Gain: the reduction in entropy after a split.

May ALLAH PAK grant you a long life of health and well-being, and bless you with the good of both worlds, Ameen.
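The three definitions above can be sketched as a small example. This is an illustrative snippet, not code from the lecture; the function names, the binary "yes"/"no" labels, and the example split are all assumptions chosen for the demo:

```python
import math
from collections import Counter

def entropy(labels):
    """Entropy: measure of randomness/impurity of a node, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini impurity: chance a randomly drawn sample is mislabeled
    if labeled according to the node's class distribution."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

def information_gain(parent, left, right):
    """Reduction in entropy after splitting parent into left/right."""
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted

# A perfectly mixed parent node and one candidate split (hypothetical data)
parent = ["yes"] * 5 + ["no"] * 5
left, right = ["yes"] * 4 + ["no"], ["yes"] + ["no"] * 4

print(round(entropy(parent), 3))                        # 1.0 for a 50/50 node
print(round(gini(parent), 3))                           # 0.5 for a 50/50 node
print(round(information_gain(parent, left, right), 3))  # 0.278
```

A cleaner split would push the child entropies toward zero and the information gain toward the parent's entropy, which is why the tree-building algorithm picks the split with the highest gain.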
![](https://codanics.com/wp-content/uploads/2024/02/2.jpg)
In this lecture, I learned the three main terminologies of a decision tree: (1) Entropy, (2) Gini Impurity, and (3) Information Gain. Further, at a pure leaf node, entropy is minimal and information gain is maximal.
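The point about leaf nodes can be checked with a short sketch. The helper below is illustrative (not from the lecture), and the "spam"/"ham" labels are a hypothetical example:

```python
import math
from collections import Counter

def entropy(labels):
    """Entropy in bits: 0 for a pure node, 1 for a 50/50 binary node."""
    n = len(labels)
    # abs() folds IEEE -0.0 to 0.0 when the node is pure
    return abs(-sum((c / n) * math.log2(c / n) for c in Counter(labels).values()))

mixed_root = ["spam"] * 3 + ["ham"] * 3   # impure node before splitting
pure_leaf = ["spam"] * 3                  # pure leaf after a perfect split

print(entropy(mixed_root))  # 1.0: maximum uncertainty
print(entropy(pure_leaf))   # 0.0: entropy is minimal at a pure leaf
```

Since information gain is the parent's entropy minus the weighted entropy of the children, a split that produces pure leaves (child entropy 0) achieves the maximum possible gain.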