Course Content
How and Why to Register
To register for the six-month AI and Data Science Mentorship Program, click this link and fill out the form there: https://shorturl.at/fuMX6
Day-17: Complete EDA on Google PlayStore Apps
Day-25: Quiz Time, Data Visualization-4
Day-27: Data Scaling/Normalization/standardization and Encoding
Day-30: NumPy (Part-3)
Day-31: NumPy (Part-4)
Day-32a: NumPy (Part-5)
Day-32b: Data Preprocessing / Data Wrangling
Day-37: Algebra in Data Science
Day-56: Statistics for Data Science (Part-5)
Day-69: Machine Learning (Part-3)
Day-75: Machine Learning (Part-9)
Day-81: Machine Learning (Part-15)-Evaluation Metrics
Day-82: Machine Learning (Part-16)-Metrics for Classification
Day-85: Machine Learning (Part-19)
Day-89: Machine Learning (Part-23)
Day-91: Machine Learning (Part-25)
Day-93: Machine Learning (Part-27)
Day-117: Deep Learning (Part-14)-Complete CNN Project
Day-119: Deep Learning (Part-16)-Natural Language Processing (NLP)
Day-121: Time Series Analysis (Part-1)
Day-123: Time Series Analysis (Part-3)
Day-128: Time Series Analysis (Part-8): Complete Project
Day-129: git & GitHub Crash Course
Day-131: Improving Machine/Deep Learning Model’s Performance
Day-133: Transfer Learning and Pre-trained Models (Part-2)
Day-134: Transfer Learning and Pre-trained Models (Part-3)
Day-137: Generative AI (Part-3)
Day-139: Generative AI (Part-5)-Tensorboard
Day-145: Streamlit for webapp development and deployment (Part-1)
Day-146: Streamlit for webapp development and deployment (Part-2)
Day-147: Streamlit for webapp development and deployment (Part-3)
Day-148: Streamlit for webapp development and deployment (Part-4)
Day-149: Streamlit for webapp development and deployment (Part-5)
Day-150: Streamlit for webapp development and deployment (Part-6)
Day-151: Streamlit for webapp development and deployment (Part-7)
Day-152: Streamlit for webapp development and deployment (Part-8)
Day-153: Streamlit for webapp development and deployment (Part-9)
Day-154: Streamlit for webapp development and deployment (Part-10)
Day-155: Streamlit for webapp development and deployment (Part-11)
Day-156: Streamlit for webapp development and deployment (Part-12)
Day-157: Streamlit for webapp development and deployment (Part-13)
How to Earn using Data Science and AI skills
Day-160: Flask for web app development (Part-3)
Day-161: Flask for web app development (Part-4)
Day-162: Flask for web app development (Part-5)
Day-163: Flask for web app development (Part-6)
Day-164: Flask for web app development (Part-7)
Day-165: Flask for web app deployment (Part-8)
Day-167: FastAPI (Part-2)
Day-168: FastAPI (Part-3)
Day-169: FastAPI (Part-4)
Day-170: FastAPI (Part-5)
Day-171: FastAPI (Part-6)
Day-174: FastAPI (Part-9)
Six-Month AI and Data Science Mentorship Program
    Join the conversation
    yousuf jawwad 4 weeks ago
    1. Ordinal Encoding. Use case: categorical variables with an inherent order or ranking. Example: ["Low", "Medium", "High"] could be encoded as [1, 2, 3].
    2. One-Hot Encoding. Use case: nominal categorical variables with no inherent order. Example: ["Red", "Blue", "Green"] could be encoded as three separate binary columns: Red (1, 0, 0), Blue (0, 1, 0), Green (0, 0, 1).
    3. Binary Encoding. Use case: high-cardinality nominal categorical variables. Example: "Category 15" could be converted to binary and then split into separate columns.
    4. Label Encoding. Use case: categorical variables with a meaningful ordinal relationship. Example: ["First", "Second", "Third"] could be encoded as [1, 2, 3].
    5. Count Encoding. Use case: when the frequency of occurrences of a category is relevant. Example: a category that appears 10 times in the dataset would be encoded as 10.
    6. Target Encoding / Mean Encoding. Use case: when the relationship between the categorical variable and the target variable is important. Example: encoding categories based on the mean of the target variable for each category.
    7. Frequency Encoding. Use case: when the frequency of categories is relevant. Example: a category appearing 5% of the time would be encoded as 0.05.
    8. Feature Hashing. Use case: dealing with high-cardinality categorical features to reduce dimensionality. Example: hashing each category into a fixed number of columns.
    9. Embedding Layers. Use case: embedding layers in neural networks for categorical variables. Example: mapping each category to a dense vector representation within the network.
    10. Entity Embeddings of Categorical Variables. Use case: learning dense representations of categorical variables in deep learning scenarios. Example: similar to embedding layers, used to capture relationships between categories in a low-dimensional space.
    Brief descriptions:
    A. Ordinal Encoding: used for categorical variables with an inherent order or ranking.
    B. One-Hot Encoding: used for nominal categorical variables without inherent order.
    C. Binary Encoding: used with high-cardinality nominal categorical variables.
    D. Label Encoding: used when the ordinal relationship between categories is known and meaningful.
    E. Count Encoding: used when the frequency of occurrences of a category is relevant information.
    F. Target Encoding / Mean Encoding: used when the relationship between the categorical variable and the target variable is important.
    G. Frequency Encoding: used when the frequency of categories is relevant.
    H. Feature Hashing: used when dealing with high-cardinality categorical features to reduce dimensionality.
    I. Embedding Layers: used in neural networks for categorical variables.
    J. Entity Embeddings of Categorical Variables: useful in deep learning scenarios for learning dense representations of categorical variables.
    These encoding methods transform categorical data into numerical formats suitable for machine learning models.
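    The first two techniques in the comment above can be shown in a minimal pandas sketch; the DataFrame, column names, and category values here are all hypothetical:

```python
import pandas as pd

# Toy frame (hypothetical data) with one ordered and one unordered column.
df = pd.DataFrame({
    "size": ["Low", "Medium", "High", "Low"],   # ordinal categories
    "color": ["Red", "Blue", "Green", "Red"],   # nominal categories
})

# Ordinal encoding: map each ordered level to an integer that preserves rank.
order = {"Low": 1, "Medium": 2, "High": 3}
df["size_encoded"] = df["size"].map(order)

# One-hot encoding: one binary column per nominal category.
df = pd.get_dummies(df, columns=["color"], prefix="color")
print(df.columns.tolist())
```

    The explicit mapping keeps Low < Medium < High as 1 < 2 < 3, while get_dummies adds one indicator column per color.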
    Muhammad Rameez 4 weeks ago
    Done
    Rana Anjum Sharif 1 month ago
    Done
    Anila Gulzar Toor 5 months ago
    1. Label Encoding: assigns a unique label to each category; used for ordinal data where the order matters.
    2. One-Hot Encoding: creates binary columns for each category, indicating presence or absence. Best for nominal data, and works well when the number of categories is not too high.
    3. Ordinal Encoding: assigns numerical values based on the order; useful for ordinal data when there is a clear order among categories.
    4. Binary Encoding: converts categories into binary code; efficient when dealing with high-cardinality categorical features.
    5. Frequency Encoding: uses the frequency of each category as its representation; works when categories with higher frequencies might carry more significance.
    6. Target Encoding: replaces a categorical value with the mean of the target variable for that category; useful for incorporating target-variable information into the encoding, and effective for improving model performance, especially in classification tasks.
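    Target (mean) encoding and frequency encoding from the list above fit in a small pandas sketch; the city/churn data is entirely hypothetical:

```python
import pandas as pd

# Hypothetical data: a categorical column and a binary target.
df = pd.DataFrame({
    "city": ["A", "B", "A", "C", "B", "A"],
    "churn": [1, 0, 0, 1, 0, 1],
})

# Target encoding: replace each category with the mean of the target for it.
df["city_target"] = df.groupby("city")["churn"].transform("mean")

# Frequency encoding: replace each category with its relative frequency.
df["city_freq"] = df["city"].map(df["city"].value_counts(normalize=True))
print(df)
```

    Note that naive target encoding can leak the target into the features; in practice it is usually computed on training folds only.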
    Zayan Ahmad Ghous 5 months ago
    You can use df.sample(5) to draw a random set of rows from the data.
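    A minimal runnable sketch (the DataFrame is made up; random_state is optional but makes the draw reproducible):

```python
import pandas as pd

# Hypothetical frame with 100 rows.
df = pd.DataFrame({"x": range(100)})

# Draw 5 random rows without replacement; fixing the seed reproduces the draw.
sample = df.sample(5, random_state=42)
print(sample)
```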
    Mahboob Ul Hassan 6 months ago
    Assignment: types of feature encoding:
    1. Ordinal Encoding
    2. One-Hot Encoding
    3. Binary Encoding
    4. Label Encoding
    5. Count Encoding
    6. Target Encoding or Mean Encoding
    7. Frequency Encoding
    8. Feature Hashing
    9. Embedding Layers
    10. Entity Embeddings of Categorical Variables
    A. Ordinal Encoding is used for categorical variables which have an inherent order or ranking.
    B. One-Hot Encoding is used for nominal categorical variables, i.e. categories with no inherent order.
    C. Binary Encoding is used with high-cardinality nominal categorical variables.
    D. Label Encoding is used when the ordinal relationship between categories is known and meaningful.
    E. Count Encoding is used when the frequency of occurrences of a category is relevant information.
    F. Target Encoding / Mean Encoding is used when the relationship between the categorical variable and the target variable is important.
    G. Frequency Encoding is used when the frequency of categories is relevant.
    H. Feature Hashing is used when dealing with high-cardinality categorical features to reduce dimensionality.
    I. Embedding Layers are used when working with categorical variables in neural networks.
    J. Entity Embeddings of Categorical Variables are useful in deep learning scenarios for learning dense representations of categorical variables.
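    The embedding-layer idea mentioned in the comment above can be sketched with PyTorch (assuming PyTorch is installed; the vocabulary size of 1000 and the dimension of 8 are arbitrary choices):

```python
import torch
import torch.nn as nn

# Embedding layer: maps each of 1000 possible category ids to a learnable
# 8-dimensional dense vector (sizes are arbitrary for this sketch).
emb = nn.Embedding(num_embeddings=1000, embedding_dim=8)

# Integer-encoded category ids (hypothetical), looked up as dense vectors.
ids = torch.tensor([3, 17, 999])
vectors = emb(ids)
print(vectors.shape)
```

    The vectors start random and are trained jointly with the rest of the network, which is what lets embeddings capture relationships between categories.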
    Sibtain Ali 6 months ago
    I have done this video with 100% practice. Assignment:
    Q1. How many types of feature encoding are there?
    Feature encoding is a crucial step in preparing data for machine learning models. Common types include:
    1. Ordinal Encoding
    2. One-Hot Encoding
    3. Binary Encoding
    4. Label Encoding
    5. Count Encoding
    6. Target Encoding (Mean Encoding)
    7. Frequency Encoding
    8. Feature Hashing
    9. Embedding Layers
    10. Entity Embeddings of Categorical Variables
    Q2. When to use which type of feature encoding?
    Ordinal Encoding: use when the categorical variable has an inherent order or ranking.
    One-Hot Encoding: suitable for nominal categorical variables (categories with no inherent order).
    Binary Encoding: when dealing with high-cardinality nominal categorical variables.
    Label Encoding: suitable when the ordinal relationship between categories is known and meaningful.
    Count Encoding: when the frequency of occurrences of a category is relevant information.
    Target Encoding (Mean Encoding): when the relationship between the categorical variable and the target variable is important.
    Frequency Encoding: similar to count encoding; use when the frequency of categories is relevant.
    Feature Hashing: useful when dealing with high-cardinality categorical features to reduce dimensionality.
    Embedding Layers: in deep learning, use embedding layers when working with categorical variables in neural networks.
    Entity Embeddings of Categorical Variables: similar to embedding layers; useful in deep learning scenarios for learning dense representations of categorical variables.
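    Binary encoding, mentioned in the answer above, can be sketched without extra dependencies (libraries such as category_encoders implement it directly; the category names here are hypothetical):

```python
import pandas as pd

# Hypothetical high-cardinality categorical column.
s = pd.Series(["cat_0", "cat_5", "cat_3", "cat_6"])

# Step 1: label-encode (factorize assigns integers in order of appearance).
codes, _ = pd.factorize(s)

# Step 2: split each integer's bits into separate binary columns.
n_bits = int(max(codes)).bit_length()
bits = pd.DataFrame({f"bit_{b}": (codes >> b) & 1 for b in range(n_bits)})
print(bits)
```

    With 4 distinct categories only 2 bit-columns are needed, which is why binary encoding is compact for high-cardinality features.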
    tayyab Ali 6 months ago
    I have done this lecture with 100% practice.
    tayyab Ali 6 months ago
    Assignment:
    Q1. How many types of feature encoding are there?
    Feature encoding is a crucial step in preparing data for machine learning models. Common types include:
    1. Ordinal Encoding
    2. One-Hot Encoding
    3. Binary Encoding
    4. Label Encoding
    5. Count Encoding
    6. Target Encoding (Mean Encoding)
    7. Frequency Encoding
    8. Feature Hashing
    9. Embedding Layers
    10. Entity Embeddings of Categorical Variables
    Q2. When to use which type of feature encoding?
    Ordinal Encoding: use when the categorical variable has an inherent order or ranking.
    One-Hot Encoding: suitable for nominal categorical variables (categories with no inherent order).
    Binary Encoding: when dealing with high-cardinality nominal categorical variables.
    Label Encoding: suitable when the ordinal relationship between categories is known and meaningful.
    Count Encoding: when the frequency of occurrences of a category is relevant information.
    Target Encoding (Mean Encoding): when the relationship between the categorical variable and the target variable is important.
    Frequency Encoding: similar to count encoding; use when the frequency of categories is relevant.
    Feature Hashing: useful when dealing with high-cardinality categorical features to reduce dimensionality.
    Embedding Layers: in deep learning, use embedding layers when working with categorical variables in neural networks.
    Entity Embeddings of Categorical Variables: similar to embedding layers; useful in deep learning scenarios for learning dense representations of categorical variables.
    Javed Ali 6 months ago
    Assignment of Day-73 (13-Dec-2023, ML day-6): How many types of feature encoding are there, and when to use which type of feature encoding? A list of common feature encoding techniques and their uses:
    1. One-Hot Encoding: suitable for algorithms that can handle high-dimensional input.
    2. Label Encoding: use when preserving the relative order of the categories is important.
    3. Ordinal Encoding: used when there is an ordinal relationship between categories.
    4. Binary Encoding: useful for reducing the dimensionality of high-cardinality categorical variables.
    5. Count Encoding: replaces each category with the count of occurrences of that category in the dataset.
    6. Target Encoding: use when the relationship between a categorical variable and the target variable is important.
    7. Feature Hashing: suitable for high-dimensional categorical variables with high cardinality.
    8. Embedding: captures semantic relationships between words or entities in a continuous vector space.
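    Feature hashing from the list above can be sketched with scikit-learn's FeatureHasher (assuming scikit-learn is installed; the choice of 8 output columns and the category names are arbitrary):

```python
from sklearn.feature_extraction import FeatureHasher

# Hash each category string into a fixed number of columns (8 here).
hasher = FeatureHasher(n_features=8, input_type="string")

# Each sample is a list of category strings (hypothetical names).
X = hasher.transform([["category_15"], ["category_9001"]])
print(X.shape)
```

    The output width stays at 8 no matter how many distinct categories appear, which is the point of the technique; occasional hash collisions are the trade-off.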
    Javed Ali 6 months ago
    AOA, in this lecture I learned about the ML algorithm of linear regression and about the encoding types in Python, which are: 1. Label Encoding, 2. One-Hot Encoding, 3. Ordinal Encoding, 4. Binary Encoding. May Allah grant you a long life of health and well-being, and bless you with the good of both worlds, Ameen.