Course Content
How and Why to Register
To register for the six-month AI and Data Science Mentorship Program, click this link and fill out the form given there: https://shorturl.at/fuMX6
Day-17: Complete EDA on Google PlayStore Apps
Day-25: Quiz Time, Data Visualization-4
Day-27: Data Scaling/Normalization/standardization and Encoding
Day-30: NumPy (Part-3)
Day-31: NumPy (Part-4)
Day-32a: NumPy (Part-5)
Day-32b: Data Preprocessing / Data Wrangling
Day-37: Algebra in Data Science
Day-56: Statistics for Data Science (Part-5)
Day-69: Machine Learning (Part-3)
Day-75: Machine Learning (Part-9)
Day-81: Machine Learning (Part-15)-Evaluation Metrics
Day-82: Machine Learning (Part-16)-Metrics for Classification
Day-85: Machine Learning (Part-19)
Day-89: Machine Learning (Part-23)
Day-91: Machine Learning (Part-25)
Day-93: Machine Learning (Part-27)
Day-117: Deep Learning (Part-14)-Complete CNN Project
Day-119: Deep Learning (Part-16)-Natural Language Processing (NLP)
Day-121: Time Series Analysis (Part-1)
Day-123: Time Series Analysis (Part-3)
Day-128: Time Series Analysis (Part-8): Complete Project
Day-129: git & GitHub Crash Course
Day-131: Improving Machine/Deep Learning Model’s Performance
Day-133: Transfer Learning and Pre-trained Models (Part-2)
Day-134: Transfer Learning and Pre-trained Models (Part-3)
Day-137: Generative AI (Part-3)
Day-139: Generative AI (Part-5)-Tensorboard
Day-145: Streamlit for webapp development and deployment (Part-1)
Day-146: Streamlit for webapp development and deployment (Part-2)
Day-147: Streamlit for webapp development and deployment (Part-3)
Day-148: Streamlit for webapp development and deployment (Part-4)
Day-149: Streamlit for webapp development and deployment (Part-5)
Day-150: Streamlit for webapp development and deployment (Part-6)
Day-151: Streamlit for webapp development and deployment (Part-7)
Day-152: Streamlit for webapp development and deployment (Part-8)
Day-153: Streamlit for webapp development and deployment (Part-9)
Day-154: Streamlit for webapp development and deployment (Part-10)
Day-155: Streamlit for webapp development and deployment (Part-11)
Day-156: Streamlit for webapp development and deployment (Part-12)
Day-157: Streamlit for webapp development and deployment (Part-13)
How to Earn using Data Science and AI skills
Day-160: Flask for web app development (Part-3)
Day-161: Flask for web app development (Part-4)
Day-162: Flask for web app development (Part-5)
Day-163: Flask for web app development (Part-6)
Day-164: Flask for web app development (Part-7)
Day-165: Flask for web app deployment (Part-8)
Day-167: FastAPI (Part-2)
Day-168: FastAPI (Part-3)
Day-169: FastAPI (Part-4)
Day-170: FastAPI (Part-5)
Day-171: FastAPI (Part-6)
Day-174: FastAPI (Part-9)
Six months of AI and Data Science Mentorship Program
    Join the conversation
    Muhammad Asif Iqbal 3 weeks ago
    I have learned about the multi-layer perceptron, and I now understand forward propagation and backward propagation.
    Muhammad_Faizan 2 months ago
    I learned about multi-layer perceptrons and their key points:
    1. Forward propagation: input --> hidden --> output.
    2. Backpropagation: input <-- hidden <-- output (on the way back, it adjusts the weights of each layer).
    3. Learning rate: controls how much the weights are adjusted during backward propagation.
    4. Activation function: used to introduce non-linear properties.
    Muhammad Rameez 2 months ago
    The style of teaching is impressive.
    faaiq ahmed 2 months ago
    A multilayer perceptron (MLP) is an ANN with hidden layers, which lets it handle complex, non-linear functions. It works as follows:
    1. Forward propagation: signals move from the input layer to the output layer; activation functions are applied in the hidden and output layers.
    2. Backward propagation: signals move from the output layer back to the input layer, and the errors are used to update the weights of the neurons to improve accuracy.
    3. Learning rate: controls how much the weights are updated during backpropagation.
    4. Activation functions: help convert linear patterns into probability-like outputs, e.g., Sigmoid, ReLU, tanh.
    Rana Anjum Sharif 3 months ago
    Done
    Shahid Umar 8 months ago
    The main discussion in this lecture is the key points of the multi-layer perceptron, i.e., forward propagation, backward propagation, learning rate, and activation function.
    tayyab Ali 9 months ago
    In this lecture I learned about multi-layer perceptrons (MLPs):
    1. Input layer: receives the input signal, which is the data.
    2. Hidden layer: one or more hidden layers.
    3. Output layer: produces the final output.
    I also learned how it works; the key points are:
    1. Forward propagation: signals travel from the input layer forward to the output (the activation function is used in the hidden and output layers only).
    2. Backpropagation: the weights of the neurons are updated through the backpropagation method.
    3. Learning rate: controls how much the weights are adjusted during backpropagation.
    Sibtain Ali 9 months ago
    In this video I learned about multi-layer perceptrons (MLPs):
    1. Input layer: receives the input signal, which is the data.
    2. Hidden layer: one or more hidden layers.
    3. Output layer: produces the final output.
    I also learned how it works; the key points are:
    1. Forward propagation: signals travel from the input layer forward to the output (the activation function is used in the hidden and output layers only).
    2. Backpropagation: the weights of the neurons are updated through the backpropagation method.
    3. Learning rate: controls how much the weights are adjusted during backpropagation.
    Javed Ali 9 months ago
    AOA, in this lecture I learned about the multi-layer perceptron (MLP), a type of neural network that consists of more than one layer of neurons. Unlike a single-layer perceptron, which can only learn linearly separable patterns, a multi-layer perceptron can learn more complex, non-linear functions. This makes it a fundamental model in the fields of deep learning and neural networks. I also learned the structure of multi-layer perceptrons:
    1. Input layer: receives the input signal, which is the data.
    2. Hidden layer: one or more hidden layers.
    3. Output layer: produces the final output.
    And I learned the key points of how it works:
    1. Forward propagation: signals travel from the input layer forward to the output (the activation function is used in the hidden and output layers only).
    2. Backpropagation: the weights of the neurons are updated through the backpropagation method.
    3. Learning rate: controls how much the weights are adjusted during backpropagation.
    4. Activation function (by mistake, this was numbered three in the lecture): functions like Sigmoid, ReLU, or Tanh allow the MLP to learn more complex patterns.
    May Allah grant you a long life of health and well-being, bless you in both worlds, and grant your respected father Jannah, Ameen.
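    Several comments above summarize the same MLP key points: forward propagation, backpropagation, learning rate, and activation function. As a minimal sketch of those ideas, the NumPy snippet below trains a tiny MLP on XOR; the 2-2-1 layer sizes, the random seed, the step count, and the learning rate are illustrative assumptions, not values from the lecture.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        # Activation function: introduces the non-linearity that lets an
        # MLP learn patterns a single-layer perceptron cannot.
        return 1.0 / (1.0 + np.exp(-x))

    # XOR: a classic non-linearly-separable problem solvable by an MLP
    # with one hidden layer.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = rng.normal(size=(2, 2))   # input -> hidden weights
    b1 = np.zeros((1, 2))
    W2 = rng.normal(size=(2, 1))   # hidden -> output weights
    b2 = np.zeros((1, 1))

    lr = 0.5  # learning rate: how much the weights move per update

    def forward(X):
        # Forward propagation: input -> hidden -> output, with the
        # activation applied in the hidden and output layers.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        return h, out

    _, out0 = forward(X)
    loss_before = float(np.mean((out0 - y) ** 2))

    for _ in range(5000):
        h, out = forward(X)
        # Backpropagation: the error flows output -> hidden, and each
        # layer's weights are adjusted in proportion to the learning rate.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * (h.T @ d_out)
        b2 -= lr * d_out.sum(axis=0, keepdims=True)
        W1 -= lr * (X.T @ d_h)
        b1 -= lr * d_h.sum(axis=0, keepdims=True)

    _, out1 = forward(X)
    loss_after = float(np.mean((out1 - y) ** 2))
    print(loss_before, loss_after)  # the loss should drop as the weights learn
    ```

    Lowering `lr` makes the weight updates smaller and training slower but more stable; raising it too far can make the loss oscillate, which is exactly the trade-off the learning-rate key point describes.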