Join the conversation
I learned about the multi-layer perceptron, and I now understand forward propagation and backward propagation.
I learned about multi-layer perceptrons and their key points:
1. Forward propagation: Input --> Hidden --> Output
2. Back propagation: Input <-- Hidden <-- Output (on the way back, it adjusts the weights of each layer)
3. Learning rate: controls how much the weights are adjusted during backward propagation
4. Activation function: used to introduce non-linear properties.
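The forward-propagation step above (Input --> Hidden --> Output) can be sketched in a few lines. This is a minimal sketch, assuming NumPy, a sigmoid activation, and a tiny 2-3-1 network whose weights and input values are made up purely for illustration:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the (0, 1) range.
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical weights for a 2-3-1 network (illustrative values only).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))   # input -> hidden weights
b1 = np.zeros(3)               # hidden biases
W2 = rng.normal(size=(1, 3))   # hidden -> output weights
b2 = np.zeros(1)               # output bias

x = np.array([0.5, -1.2])      # one input sample

# Forward propagation: Input --> Hidden --> Output
hidden = sigmoid(W1 @ x + b1)  # activation applied at the hidden layer
output = sigmoid(W2 @ hidden + b2)  # and again at the output layer
print(output)                  # a single value between 0 and 1
```

Back propagation would then run in the opposite direction, using the error at `output` to adjust `W2`, `b2`, `W1`, and `b1`.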
The style of teaching is impressive.
A multilayer perceptron (MLP) is an ANN with hidden layers, which lets it handle complex, non-linear functions. It works as follows:
Forward propagation: signals move from the input layer to the output layer, and activation functions are applied in the hidden and output layers.
Backward propagation: the error signal moves from the output layer back to the input layer, and the weights of the neurons are updated to improve accuracy.
Learning rate: controls how much the weights are updated during back propagation.
Activation functions: such as sigmoid, ReLU, and tanh, turn linear combinations into non-linear (e.g. probability-like) outputs.
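The three activation functions named above are one-liners. This small sketch, assuming NumPy, just evaluates each of them on a few sample values to show their ranges:

```python
import numpy as np

def sigmoid(z):
    # Maps any real number into (0, 1) -- probability-like outputs.
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Keeps positive values, clips negatives to 0.
    return np.maximum(0.0, z)

def tanh(z):
    # Maps any real number into (-1, 1).
    return np.tanh(z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # values in (0, 1); sigmoid(0) is exactly 0.5
print(relu(z))     # negatives become 0, positives pass through
print(tanh(z))     # values in (-1, 1)
```

All three are non-linear, which is what lets stacked layers represent functions a single linear layer cannot.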
Done
The main discussion in this lecture covers the key points of the multi-layer perceptron, i.e., forward propagation, backward propagation, the learning rate, and activation functions.
I learned in this lecture about multi-layer perceptrons (MLPs):
1. Input layer: receives the input signal, which is the data
2. Hidden layer: one or more hidden layers
3. Output layer: produces the final output
I also learned how it works. The key points are:
1. Forward propagation: signals travel from the input layer forward to the output; the activation function is used in the hidden and output layers only.
2. Back propagation: the weights of the neurons are updated through the backpropagation method.
3. Learning rate: controls how much the weights are adjusted during backpropagation.
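The learning-rate point above boils down to the update rule new_w = w - lr * gradient. This toy sketch uses a made-up weight and gradient (purely illustrative values) to show how the step size changes the adjustment:

```python
# The weight update behind the learning rate: new_w = w - lr * gradient.
# The weight and gradient values here are assumed, purely for illustration.
w = 0.8       # a single weight
grad = 0.5    # gradient of the loss with respect to w (as if from backprop)

for lr in (0.01, 0.1, 1.0):
    # A small learning rate nudges the weight slightly;
    # a large one moves it a lot (and can overshoot the minimum).
    print(f"lr={lr}: w becomes {w - lr * grad}")
```

Picking the learning rate is a trade-off: too small and training is slow, too large and the updates can oscillate or diverge.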
AOA, I learned in this lecture about the multi-layer perceptron (MLP), a type of neural network that consists of more than one layer of neurons. Unlike a single-layer perceptron, which can only learn linearly separable patterns, a multi-layer perceptron can learn more complex, non-linear functions. This makes it a fundamental model in the fields of deep learning and neural networks. I also learned the structure of multi-layer perceptrons:
1. Input layer: receives the input signal, which is the data
2. Hidden layer: one or more hidden layers
3. Output layer: produces the final output
And I learned how it works. The key points are:
1. Forward propagation: signals travel from the input layer forward to the output; the activation function is used in the hidden and output layers only.
2. Back propagation: the weights of the neurons are updated through the backpropagation method.
3. Learning rate: controls how much the weights are adjusted during back propagation.
4. Activation function (by mistake, this was numbered three in the lecture): functions like sigmoid, ReLU, or tanh allow the MLP to learn more complex patterns.
May Allah grant you a long life of health and well-being, give you the good of both worlds, and grant your respected father the highest place in Jannah, Ameen.
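The four key points above can be put together in one minimal training sketch. This assumes NumPy, a 2-4-1 network, sigmoid activations, and a mean-squared-error loss; the layer sizes, learning rate, and iteration count are all illustrative choices, and XOR is used because it is the classic non-linearly-separable problem a single-layer perceptron cannot learn:

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR: not linearly separable, so it needs a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse(pred):
    return float(np.mean((pred - y) ** 2))

# 2-4-1 network; sizes and random initialization are illustrative.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
lr = 0.5  # learning rate: scales every weight adjustment

loss_before = mse(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2))

for _ in range(5000):
    # 1. Forward propagation: Input --> Hidden --> Output
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # 2. Back propagation: error gradients flow Output --> Hidden --> Input
    d_out = (out - y) * out * (1 - out)   # error at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # error pushed back to the hidden layer

    # 3. Learning rate scales each weight adjustment
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

loss_after = mse(out)
print(loss_before, "->", loss_after)  # the loss shrinks as training proceeds
```

The `out * (1 - out)` and `h * (1 - h)` factors are the derivative of the sigmoid (point 4), which is what lets the error be propagated through the non-linear layers.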