Course Content
How and Why to Register
Dear students, to register for the six-month AI and Data Science Mentorship Program, click this link and fill out the form given there: https://shorturl.at/fuMX6
Day-17: Complete EDA on Google Play Store Apps
Day-25: Quiz Time, Data Visualization-4
Day-27: Data Scaling/Normalization/Standardization and Encoding
Day-30: NumPy (Part-3)
Day-31: NumPy (Part-4)
Day-32a: NumPy (Part-5)
Day-32b: Data Preprocessing / Data Wrangling
Day-37: Algebra in Data Science
Day-56: Statistics for Data Science (Part-5)
Day-69: Machine Learning (Part-3)
Day-75: Machine Learning (Part-9)
Day-81: Machine Learning (Part-15): Evaluation Metrics
Day-82: Machine Learning (Part-16): Metrics for Classification
Day-85: Machine Learning (Part-19)
Day-89: Machine Learning (Part-23)
Day-91: Machine Learning (Part-25)
Day-93: Machine Learning (Part-27)
Day-117: Deep Learning (Part-14): Complete CNN Project
Day-119: Deep Learning (Part-16): Natural Language Processing (NLP)
Day-121: Time Series Analysis (Part-1)
Day-123: Time Series Analysis (Part-3)
Day-128: Time Series Analysis (Part-8): Complete Project
Day-129: Git & GitHub Crash Course
Day-131: Improving Machine/Deep Learning Model Performance
Day-133: Transfer Learning and Pre-trained Models (Part-2)
Day-134: Transfer Learning and Pre-trained Models (Part-3)
Day-137: Generative AI (Part-3)
Day-139: Generative AI (Part-5): TensorBoard
Day-145: Streamlit for web app development and deployment (Part-1)
Day-146: Streamlit for web app development and deployment (Part-2)
Day-147: Streamlit for web app development and deployment (Part-3)
Day-148: Streamlit for web app development and deployment (Part-4)
Day-149: Streamlit for web app development and deployment (Part-5)
Day-150: Streamlit for web app development and deployment (Part-6)
Day-151: Streamlit for web app development and deployment (Part-7)
Day-152: Streamlit for web app development and deployment (Part-8)
Day-153: Streamlit for web app development and deployment (Part-9)
Day-154: Streamlit for web app development and deployment (Part-10)
Day-155: Streamlit for web app development and deployment (Part-11)
Day-156: Streamlit for web app development and deployment (Part-12)
Day-157: Streamlit for web app development and deployment (Part-13)
How to Earn Using Data Science and AI Skills
Day-160: Flask for web app development (Part-3)
Day-161: Flask for web app development (Part-4)
Day-162: Flask for web app development (Part-5)
Day-163: Flask for web app development (Part-6)
Day-164: Flask for web app development (Part-7)
Day-165: Flask for web app deployment (Part-8)
Day-167: FastAPI (Part-2)
Day-168: FastAPI (Part-3)
Day-169: FastAPI (Part-4)
Day-170: FastAPI (Part-5)
Day-171: FastAPI (Part-6)
Day-174: FastAPI (Part-9)
Six months of AI and Data Science Mentorship Program
    Join the conversation
    Zohaib Zeeshan 3 weeks ago
    You can use df.sample to look at a random selection of rows, which shows you different values of the column.
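    A quick pandas illustration of this (the toy DataFrame is invented for the example): df.sample draws a random subset of rows, while unique() is what actually lists each distinct value of a column.

    ```python
    import pandas as pd

    # Hypothetical toy DataFrame for illustration
    df = pd.DataFrame({"category": ["low", "medium", "high", "low", "medium"]})

    # df.sample draws random rows (a random subset, not every value)
    print(df.sample(3))

    # To list every distinct value of a column, use unique()
    print(df["category"].unique())  # ['low' 'medium' 'high']
    ```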
    Muhammad_Faizan 3 weeks ago
    Assignment (GPT response, but very helpful):

    ### Q1: Different Types of Feature Encoding Techniques

    Feature encoding is the process of converting categorical data into numerical data so that machine learning algorithms can process it. Here are different types of feature encoding techniques:

    1. **Label Encoding**
    2. **One-Hot Encoding**
    3. **Binary Encoding**
    4. **Ordinal Encoding**
    5. **Frequency Encoding**
    6. **Target Encoding**
    7. **Hash Encoding**
    8. **Leave-One-Out Encoding**

    **Most important and well-known ones:**

    1. **Label Encoding:** converts each unique category to a numerical value. Simple and easy to implement; used for ordinal data where there is an inherent order.
    2. **One-Hot Encoding:** converts categories into binary columns. No ordinal relationship is assumed; suitable for nominal data.
    3. **Binary Encoding:** reduces dimensionality compared to one-hot encoding. Each category is converted into binary and then split into columns.
    4. **Ordinal Encoding:** assigns numerical values based on order. Used for ordinal data with a clear ranking.
    5. **Frequency Encoding:** encodes categories based on the frequency of their occurrence. Useful for dealing with high-cardinality features.
    6. **Target Encoding:** encodes categories based on the mean of the target variable. Can introduce leakage, so it needs careful handling.

    ### Q2: Which Feature Encoding Technique to Use, and When

    1. **Label Encoding:** use when you have ordinal data with a meaningful order (e.g., ratings, ranks). Example: ['low', 'medium', 'high'] → [0, 1, 2]
    2. **One-Hot Encoding:** use when you have nominal data without an inherent order. Example: ['red', 'blue', 'green'] → [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    3. **Binary Encoding:** use when you have high-cardinality categorical features and want to reduce dimensionality. Example: 'cat' → 001, 'dog' → 010, 'mouse' → 011
    4. **Ordinal Encoding:** use when there is a clear, meaningful order in the categories. Example: ['first', 'second', 'third'] → [1, 2, 3]
    5. **Frequency Encoding:** use when dealing with high-cardinality features and the frequency information is useful. Example: ['apple', 'banana', 'apple', 'apple', 'banana'] → [3, 2, 3, 3, 2]
    6. **Target Encoding:** use when you want to capture the relationship between a categorical feature and the target variable (especially in regression tasks). Example: encoding 'city' based on the average house price in that city.
    7. **Hash Encoding:** use when you need to handle very high cardinality and want a fixed-size encoding. Example: using a hash function to map categories to a fixed number of columns.
    8. **Leave-One-Out Encoding:** use when you want to mitigate target leakage in target encoding by excluding the current row when calculating the mean. Example: for each category, calculate the mean of the target variable excluding the current instance.

    Choosing the right encoding technique depends on the nature of your data and the specific requirements of your machine learning model.
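    To make the two most common techniques above concrete, here is a minimal pandas sketch (the DataFrame and its 'size'/'color' columns are invented for illustration):

    ```python
    import pandas as pd

    # Toy data: 'size' is ordinal, 'color' is nominal
    df = pd.DataFrame({
        "size": ["low", "high", "medium", "low"],
        "color": ["red", "blue", "green", "red"],
    })

    # Ordinal/label encoding: map categories to integers that respect the order
    size_order = {"low": 0, "medium": 1, "high": 2}
    df["size_encoded"] = df["size"].map(size_order)

    # One-hot encoding: one binary column per color, no order assumed
    df = pd.get_dummies(df, columns=["color"], prefix="color")

    print(df)
    ```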
    yousuf jawwad 2 months ago
    1. Ordinal Encoding: categorical variables with an inherent order or ranking. Example: ["Low", "Medium", "High"] could be encoded as [1, 2, 3].
    2. One-Hot Encoding: nominal categorical variables with no inherent order. Example: ["Red", "Blue", "Green"] could be encoded as three separate binary columns: Red (1, 0, 0), Blue (0, 1, 0), Green (0, 0, 1).
    3. Binary Encoding: high-cardinality nominal categorical variables. Example: "Category 15" could be converted to binary and then split into separate columns.
    4. Label Encoding: categorical variables with a meaningful ordinal relationship. Example: ["First", "Second", "Third"] could be encoded as [1, 2, 3].
    5. Count Encoding: when the frequency of occurrences of a category is relevant. Example: a category that appears 10 times in the dataset would be encoded as 10.
    6. Target Encoding / Mean Encoding: when the relationship between the categorical variable and the target variable is important. Example: encoding categories based on the mean of the target variable for each category.
    7. Frequency Encoding: when the relative frequency of categories is relevant. Example: a category appearing 5% of the time would be encoded as 0.05.
    8. Feature Hashing: dealing with high-cardinality categorical features to reduce dimensionality. Example: hashing each category into a fixed number of columns.
    9. Embedding Layers: categorical variables in neural networks. Example: mapping each category to a dense vector representation within the network.
    10. Entity Embeddings of Categorical Variables: learning dense representations of categorical variables in deep learning scenarios. Example: similar to embedding layers, used to capture relationships between categories in a low-dimensional space.

    In brief: ordinal and label encoding suit ordered categories; one-hot suits unordered ones; binary encoding and feature hashing handle high cardinality; count and frequency encoding capture how often categories occur; target/mean encoding ties categories to the target variable; and embedding layers and entity embeddings learn dense representations in deep learning. These encoding methods transform categorical data into numerical formats suitable for machine learning models.
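    A small pandas sketch of count, frequency, and target (mean) encoding from the list above (the 'city'/'price' data is invented; note that in practice target encoding should be fit on training data only, to avoid leakage):

    ```python
    import pandas as pd

    df = pd.DataFrame({
        "city":  ["lahore", "karachi", "lahore", "lahore", "karachi"],
        "price": [100, 80, 120, 110, 90],
    })

    # Count encoding: raw number of occurrences of each category
    df["city_count"] = df["city"].map(df["city"].value_counts())

    # Frequency encoding: relative frequency instead of the raw count
    df["city_freq"] = df["city"].map(df["city"].value_counts(normalize=True))

    # Target (mean) encoding: mean of the target variable per category
    df["city_target"] = df["city"].map(df.groupby("city")["price"].mean())

    print(df)
    ```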
    Muhammad Rameez 2 months ago
    Done
    Rana Anjum Sharif 2 months ago
    Done
    Anila Gulzar Toor 6 months ago
    1. Label Encoding: assigns a unique label to each category; used for ordinal data where the order matters.
    2. One-Hot Encoding: creates binary columns for each category, indicating presence or absence. Best for nominal data, and works well when the number of categories is not too high.
    3. Ordinal Encoding: assigns numerical values based on the order. Useful for ordinal data when there is a clear order among categories.
    4. Binary Encoding: converts categories into binary code. Efficient when dealing with high-cardinality categorical features.
    5. Frequency Encoding: uses the frequency of each category as its representation; works when categories with higher frequencies might carry more significance.
    6. Target Encoding: replaces a categorical value with the mean of the target variable for that category. Useful when we want to incorporate target-variable information into the encoding; effective for improving model performance, especially in classification tasks.
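    For reference, scikit-learn ships ready-made transformers for the first three items; a minimal sketch (the example values are invented, and the sparse_output argument requires scikit-learn 1.2 or newer):

    ```python
    import numpy as np
    from sklearn.preprocessing import LabelEncoder, OneHotEncoder, OrdinalEncoder

    # Label encoding: for a single column (typically the target)
    le = LabelEncoder()
    print(le.fit_transform(["low", "high", "medium", "low"]))  # [1 0 2 1]

    # Ordinal encoding: for feature columns, with an explicit category order
    X = np.array([["low"], ["high"], ["medium"]])
    oe = OrdinalEncoder(categories=[["low", "medium", "high"]])
    print(oe.fit_transform(X))  # [[0.] [2.] [1.]]

    # One-hot encoding: one binary column per category
    ohe = OneHotEncoder(sparse_output=False, handle_unknown="ignore")
    print(ohe.fit_transform(X))
    ```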
    Zayan Ahmad Ghous 7 months ago
    You can use df.sample(5) to take five different data points from the data.
    Mahboob Ul Hassan 7 months ago
    Assignment: Types of feature encoding:

    1. Ordinal Encoding
    2. One-Hot Encoding
    3. Binary Encoding
    4. Label Encoding
    5. Count Encoding
    6. Target Encoding (Mean Encoding)
    7. Frequency Encoding
    8. Feature Hashing
    9. Embedding Layers
    10. Entity Embeddings of Categorical Variables

    When to use each:
    1. Ordinal Encoding is used for categorical variables that have an inherent order or ranking.
    2. One-Hot Encoding is used for nominal categorical variables, i.e., categories with no inherent order.
    3. Binary Encoding is used with high-cardinality nominal categorical variables.
    4. Label Encoding is used when the ordinal relationship between categories is known and meaningful.
    5. Count Encoding is used when the frequency of occurrences of a category is relevant information.
    6. Target Encoding (Mean Encoding) is used when the relationship between the categorical variable and the target variable is important.
    7. Frequency Encoding is used when the frequency of categories is relevant.
    8. Feature Hashing is used when dealing with high-cardinality categorical features, to reduce dimensionality.
    9. Embedding Layers are used when working with categorical variables in neural networks.
    10. Entity Embeddings of Categorical Variables are useful in deep learning scenarios for learning dense representations of categorical variables.
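    For the feature-hashing item, a minimal scikit-learn sketch (n_features=8 is an arbitrary choice for illustration):

    ```python
    from sklearn.feature_extraction import FeatureHasher

    # Hash each category string into a fixed number of output columns
    hasher = FeatureHasher(n_features=8, input_type="string")

    # With input_type="string", each sample is an iterable of strings
    X = hasher.transform([["cat"], ["dog"], ["mouse"]])
    print(X.toarray().shape)  # (3, 8), regardless of how many categories exist
    ```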
    Sibtain Ali 7 months ago
    I have done this video with 100% practice. Assignment:

    Q1. How many types of feature encoding are there?
    Feature encoding is a crucial step in preparing data for machine learning models. Common types include:
    1. Ordinal Encoding
    2. One-Hot Encoding
    3. Binary Encoding
    4. Label Encoding
    5. Count Encoding
    6. Target Encoding (Mean Encoding)
    7. Frequency Encoding
    8. Feature Hashing
    9. Embedding Layers
    10. Entity Embeddings of Categorical Variables

    Q2. When to use which type of feature encoding?
    1. Ordinal Encoding: use when the categorical variable has an inherent order or ranking.
    2. One-Hot Encoding: suitable for nominal categorical variables (categories with no inherent order).
    3. Binary Encoding: use when dealing with high-cardinality nominal categorical variables.
    4. Label Encoding: suitable when the ordinal relationship between categories is known and meaningful.
    5. Count Encoding: use when the frequency of occurrences of a category is relevant information.
    6. Target Encoding (Mean Encoding): use when the relationship between the categorical variable and the target variable is important.
    7. Frequency Encoding: similar to count encoding; use when the relative frequency of categories is relevant.
    8. Feature Hashing: useful when dealing with high-cardinality categorical features, to reduce dimensionality.
    9. Embedding Layers: in deep learning, use embedding layers when working with categorical variables in neural networks.
    10. Entity Embeddings of Categorical Variables: similar to embedding layers; useful in deep learning scenarios for learning dense representations of categorical variables.
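    For the embedding-layer item, a minimal Keras sketch (the vocabulary size of 10 and output dimension of 4 are arbitrary, and TensorFlow is assumed to be installed):

    ```python
    import numpy as np
    import tensorflow as tf

    # Map integer-encoded category ids (0..9) to dense 4-dimensional vectors
    embedding = tf.keras.layers.Embedding(input_dim=10, output_dim=4)

    # Three integer-encoded category ids
    ids = np.array([[1], [7], [3]])
    vectors = embedding(ids)
    print(vectors.shape)  # (3, 1, 4): one 4-d vector per id
    ```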
    tayyab Ali 7 months ago
    I have done this lecture with 100% practice.
    tayyab Ali 7 months ago
    Assignment:

    Q1. How many types of feature encoding are there?
    Feature encoding is a crucial step in preparing data for machine learning models. The main types are:
    1. Ordinal Encoding
    2. One-Hot Encoding
    3. Binary Encoding
    4. Label Encoding
    5. Count Encoding
    6. Target Encoding (Mean Encoding)
    7. Frequency Encoding
    8. Feature Hashing
    9. Embedding Layers
    10. Entity Embeddings of Categorical Variables

    Q2. When to use which type of feature encoding?
    1. Ordinal Encoding: use when the categorical variable has an inherent order or ranking.
    2. One-Hot Encoding: suitable for nominal categorical variables (categories with no inherent order).
    3. Binary Encoding: use when dealing with high-cardinality nominal categorical variables.
    4. Label Encoding: suitable when the ordinal relationship between categories is known and meaningful.
    5. Count Encoding: use when the frequency of occurrences of a category is relevant information.
    6. Target Encoding (Mean Encoding): use when the relationship between the categorical variable and the target variable is important.
    7. Frequency Encoding: similar to count encoding; use when the relative frequency of categories is relevant.
    8. Feature Hashing: useful when dealing with high-cardinality categorical features, to reduce dimensionality.
    9. Embedding Layers: in deep learning, use embedding layers when working with categorical variables in neural networks.
    10. Entity Embeddings of Categorical Variables: similar to embedding layers; useful in deep learning scenarios for learning dense representations of categorical variables.