Join the conversation
![](https://codanics.com/wp-content/uploads/2024/04/IMG_16954467347071342.jpg.webp)
https://www.kaggle.com/muhammadrameez242 , this is my Kaggle ID. There you will find the complete assignments, with the codes explained in simple Roman Urdu.
Reply
![](https://codanics.com/wp-content/uploads/2024/04/IMG_20240410_175137-1-scaled.jpg.webp)
Done
Reply
![](https://codanics.com/wp-content/uploads/2024/02/2.jpg.webp)
Very detailed understanding of evaluation metrics for classification techniques and formula usage in these metrics.
Reply
![](https://codanics.com/wp-content/uploads/2024/05/My_profile_pic.jpg)
In this lecture I learned about Accuracy, Precision, Recall, the F1 score, and the confusion matrix.
Reply
![](https://codanics.com/wp-content/uploads/2023/10/e24e3cc0-f9cd-49cc-9e5e-e7fea619bd42.jpg.webp)
In this video I learned about Accuracy, Precision, Recall, and F1, along with the confusion matrix.
Reply
![](https://codanics.com/wp-content/uploads/2023/10/9dd24f5a-b137-440a-baba-1855925152a0.jpg.webp)
AOA. In this lecture, I learned how evaluation metrics help us measure model performance. We know that there are several types of machine learning algorithms, and I also learned about the classification metrics:

1- Accuracy
Description: Proportion of correctly predicted observations to the total observations.
Pros: Simple and intuitive.
Cons: Can be misleading on imbalanced datasets.
Example: For 100 predictions with 90 correct predictions, accuracy is 90%.

2- Precision
Description: Proportion of correctly predicted positive observations to the total predicted positive observations.
Pros: It focuses on the relevancy of results.
Cons: Does not consider true negative results.
Example: For 30 true positive predictions out of 40 total positive predictions, precision is 75%.

3- Recall (Sensitivity)
Description: Proportion of correctly predicted positive observations to all observations in the actual positive class.
Pros: It is useful in cases where False negatives are costly.
Cons: Can lead to ignoring the true negatives.
Example: For 30 true positive predictions out of 40 actual positive instances, recall is 75%.
4- F1 Score
Description: Harmonic mean of Precision and Recall.
Pros: Balances Precision and Recall.
Cons: May not be a good measure when there is an imbalance between Precision and Recall.
Example: With Precision = 75% and Recall = 75%, F1 Score = 2*(0.75*0.75)/(0.75+0.75) = 75%.

5- Area Under the ROC Curve (AUC-ROC)
Description: Measures the ability of a classifier to distinguish between classes.
Pros: Effective for binary classification problems.
Cons: Less informative for multi-class problems.
Example: If the AUC is 0.90, there is a 90% chance that the model will rank a randomly chosen positive instance above a randomly chosen negative one.

6- Confusion Matrix
Description: A table showing actual vs. predicted values.
Pros: Provides a detailed breakdown of correct and incorrect classifications.
Cons: More complex to interpret.
Example: A matrix showing TP, FN, FP, and TN.

May ALLAH PAK grant you a long life of health and well-being, bless you with the good of both worlds, and grant your respected father the highest place in Jannah, Ameen.
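The formulas in the list above can be sketched in a few lines of plain Python. This is a minimal example, not part of the lecture: the TP, FP, and FN counts come from the precision/recall examples (30 true positives out of 40), while the TN count of 50 is an assumed value added here just to complete a 100-prediction confusion matrix.

```python
# Assumed confusion-matrix counts: TP/FP/FN from the examples above,
# TN is a made-up value to bring the total to 100 predictions.
TP, FP, FN, TN = 30, 10, 10, 50

# Accuracy: correct predictions over all predictions.
accuracy = (TP + TN) / (TP + TN + FP + FN)

# Precision: true positives over all predicted positives.
precision = TP / (TP + FP)

# Recall: true positives over all actual positives.
recall = TP / (TP + FN)

# F1: harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

print(f"Accuracy:  {accuracy:.2f}")   # 0.80
print(f"Precision: {precision:.2f}")  # 0.75
print(f"Recall:    {recall:.2f}")     # 0.75
print(f"F1 score:  {f1:.2f}")         # 0.75
```

Note that precision and recall are both 75% here, matching the worked examples, while accuracy (80%) differs from the separate 90/100 example because it depends on the assumed TN count.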
Reply